AI’s Shocking Carbon Footprint

How Techniques First Invented in the 1920s Are Changing That

Introduction

What do the melting glaciers of Greenland have in common with artificial intelligence? On the surface, not much. But if you dig just a bit deeper, the connection is scarier than you might imagine. It turns out that data is not only the metaphorical new oil; its carbon impact may be just as bad.

There is no other way to put it: AI is simply everywhere these days. Whether it is being used in autonomous vehicles, in diagnosing various kinds of cancers, or in doing more mundane things like prioritizing which bills to pay first, the use cases of AI are well documented and growing fast. Speaking of the mundane, there is even a toothbrush powered by AI (not sure why the world needs such a device, but I am sure it is an incredible toothbrush nonetheless!).

More Emissions Than 5 Cars

What most people don’t realize, however, is the tremendous carbon impact that AI has.

An article published recently in the MIT Technology Review puts the impact of training a single large AI model at “626,000 pounds of carbon dioxide equivalent i.e. nearly five times the lifetime emissions of the average American car (and that includes the manufacture of the car itself)”. Granted, this was for a seminal model in natural language processing (a 213M-parameter Transformer trained with neural architecture search), but even very simple models ended up with a carbon impact much larger than that of a flight from San Francisco to New York City. And since there is never just one model that gets trained, you can imagine the cumulative energy impact of such training exercises.

Data Centers Are Energy Hogs

If all the energy consumed by a data center were from renewable sources, we wouldn’t need to worry much about the carbon impact of AI models. However, according to the U.S. Department of Energy, renewables make up only 17% of the energy mix, and most data centers don’t run 100% on renewable energy (except Google, which went 100% renewable in 2017).

The other option is to alter the way we train AI models, i.e., make the process much less compute-intensive. This is a huge area of opportunity, and it is where companies like Latent AI are driving innovation.

Quantization

This brings me to the almost 100-year-old mathematical technique that can help relieve the carbon impact of AI.

Quantization has been around since the days of Niels Bohr and Max Planck, mostly referenced in the context of energy waves and quantum mechanics. The earliest references to the term go back to the early 1920s. Only in the past few decades, however, with the emergence of digital signal processing, has the technique come into common use.

Essentially, quantization is an umbrella term for quantitative methods that convert input values from a large set to output values in a much smaller set. While sampling, for example, picks only a few items from a given set of values, quantization keeps every data point but, say, rounds it off from 15 decimal places down to only two.
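To make the idea concrete, here is a minimal sketch in plain NumPy (an illustrative example, not any particular vendor’s implementation): every value is kept, but each one is snapped to the nearest point on a much coarser grid.

```python
import numpy as np

def quantize_uniform(values: np.ndarray, step: float) -> np.ndarray:
    """Snap every value to the nearest multiple of `step` (no values are dropped)."""
    return np.round(values / step) * step

# Full-precision inputs with many decimal places...
signal = np.array([0.123456789012345, 2.718281828459045, -1.414213562373095])

# ...quantized onto a grid of 0.01, i.e. roughly two decimal places.
print(quantize_uniform(signal, step=0.01))   # [ 0.12  2.72 -1.41]
```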

When you do such an exercise in deep learning, some information is lost, but with the right training the loss in accuracy can be managed. So one can convert from 32-bit floating point to 8-bit fixed-point integers, which reduces the amount of data that needs to be moved on and off the chip. This significantly reduces the compute horsepower, the memory, and ultimately the energy needed to run the model.
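To illustrate that 32-bit to 8-bit conversion, here is a simplified sketch in plain NumPy. It is not Latent AI’s actual tooling (which also involves calibration and retraining to recover accuracy); it simply shows the basic scale-and-round step and the 4x memory saving it buys.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto the signed 8-bit range [-127, 127] using one scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights, to inspect the rounding error."""
    return q.astype(np.float32) * scale

weights = np.random.randn(256, 256).astype(np.float32)   # a toy "layer" of weights
q, scale = quantize_int8(weights)

print(weights.nbytes)   # 262144 bytes at 32-bit float
print(q.nbytes)         # 65536 bytes at 8-bit integer, a 4x reduction
print(np.abs(weights - dequantize(q, scale)).max())   # worst-case rounding error stays small
```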

The challenge is to do this well enough to retain 85%-90% of the model’s accuracy while delivering on the computational efficiency. And that is where Latent AI excels.

Enabling Adaptive AI™ for a Smarter Edge

Latent AI develops core technologies and tools to enable AI at the edge by optimizing deep neural networks to perform efficiently in resource-constrained environments. Their solutions optimize AI models for compute, memory, and power consumption while supporting seamless integration with leading AI/ML infrastructure and frameworks. Latent AI’s technology is about training neural networks with the target device in mind and compiling the trained model to run as an executable object in the target environment. Their tools build models and provide compilers targeting any hardware, be it Intel x86, ARM, DSPs, or microcontrollers.

Conclusion

The average American consumes about 55 lbs of beef annually, which translates to about 1,500 lbs of CO2 produced per year, roughly the same impact per passenger as a single airplane flight from San Francisco to Chicago. It may also be worth noting that according to some estimates, a typical Internet search uses as much energy as illuminating a 60-watt bulb for 17 seconds. Given that Google alone processes over 2 trillion searches annually, that’s a lot of lightbulbs on all the time.
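For the curious, here is a quick back-of-the-envelope check of that search figure, using the same assumed numbers quoted above (60 watts for 17 seconds per search and roughly 2 trillion searches a year).

```python
# Same assumed figures as in the text: 60 W for 17 seconds per search, ~2 trillion searches a year.
bulb_watts = 60
seconds_per_search = 17
searches_per_year = 2e12

joules_per_search = bulb_watts * seconds_per_search      # 1,020 joules per search
kwh_per_search = joules_per_search / 3.6e6               # ~0.00028 kWh
annual_kwh = kwh_per_search * searches_per_year          # ~5.7e8 kWh
print(f"{annual_kwh:,.0f} kWh per year")                 # roughly 567 million kWh
```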

Not to mention that it leads to some very interesting conundrums: does a meat eater who is rarely online have a lower carbon footprint than a vegan hipster who is glued to his devices?

On a global basis, power generation remains the single biggest contributor to greenhouse gas emissions. As the world evolves to become more automated, data-driven, and AI-led, the carbon impact of this new paradigm will be non-trivial. Which is why what Latent AI is doing becomes even more critical.

By Manju Bansal, Advisor, Latent AI

Photo Credits:  Adobe Stock
