The high computing power required to train AI models inevitably entails high energy consumption. On the other hand, using these models can eliminate a great deal of unnecessary work. So what about the carbon footprint?
Cryptocurrencies are a hot topic right now; new reports appear every day, almost every hour - and more and more of them deal with how energy-intensive mining and trading are. As of early February 2021, the consumption was estimated at about 117 terawatt hours per year - roughly the combined energy consumption of Austria and Switzerland.
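For a sense of scale, the quoted 117 TWh per year can be converted into an average continuous power draw. A rough sketch using only the article's figure; the 8,760-hour year is the only added assumption:

```python
# Convert the quoted annual consumption into an average continuous power
# draw - a sanity check on the scale, using only the article's figure.

ANNUAL_TWH = 117       # estimated consumption, early February 2021
HOURS_PER_YEAR = 8760  # 365 days x 24 hours

avg_power_gw = ANNUAL_TWH * 1e12 / HOURS_PER_YEAR / 1e9
print(f"average draw: ~{avg_power_gw:.1f} GW")  # ~13.4 GW
```

In other words, the quoted consumption corresponds to a continuous draw of roughly 13 gigawatts - on the order of a dozen large power plants running around the clock.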
Admittedly, the comparison is somewhat skewed if you look only at the raw numbers: most of this electricity is generated in countries such as China, Canada, Russia or Iran, where it is cheaper. Nevertheless, an estimated 70 million tonnes of CO₂ are emitted for it every year. To give some examples: a single Bitcoin transaction causes about 300 kg of CO₂ on average - as much as a return flight from Berlin to Munich, or roughly 750,000 Visa card transactions ...
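The two equivalences above imply a striking per-transaction footprint for conventional payments. A back-of-envelope check, using only the article's own numbers:

```python
# Implied CO₂ per Visa transaction, derived from the article's two figures
# (these are the article's estimates, not independent measurements).

BTC_TX_CO2_KG = 300.0         # avg CO₂ per Bitcoin transaction
VISA_TX_EQUIVALENT = 750_000  # Visa transactions with the same footprint

visa_tx_co2_g = BTC_TX_CO2_KG * 1000 / VISA_TX_EQUIVALENT
print(f"~{visa_tx_co2_g:.1f} g CO₂ per Visa transaction")  # ~0.4 g
```

So one Bitcoin transaction carries the footprint of hundreds of thousands of card payments - the ratio the article draws on.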
Several estimates exist for the energy consumed in training an AI model, so that side is at least roughly quantifiable. The emissions from training an NLP model, for example, have been estimated to be roughly equivalent to the lifetime CO₂ emissions of five passenger cars, including their manufacture; other estimates are lower. Either way, AI means a lot of computing power and thus high energy consumption. What this calculation leaves out, however, is the simultaneous reduction of CO₂ emissions - because that is difficult to capture in figures.
This is easy to illustrate with NLP, Natural Language Processing: a trained NLP model can capture and analyse vast amounts of data and documents in a fraction of a second - a job that would otherwise have to be done by hand. That manual work also uses computers, and employees commute to do it, sometimes even travelling across the country to work in archives at different locations. CO₂ is emitted in the process, too, though it is hard to quantify. AI avoids these emissions. The same goes for intelligent chatbots that support employees and take work off their hands so they can concentrate on their core competencies. A different view of energy efficiency.
What's more, AI can be used specifically to reduce CO₂ emissions. Google, for example, uses an AI model to shift tasks in its data centres to times of day and locations where the energy consumed comes from renewable sources; the goal is 100% green power. AI also helps to exploit energy-saving potential in real estate management, with forecasts of plant and building behaviour combined with meteorological data.
So while cryptocurrencies are virtual and volatile, they consume a great deal of energy for a single purpose - to serve as a means of payment (or an investment) - and in doing so release far more CO₂ than existing means of payment. AI solutions, on the other hand, are tangible and enable real savings, even if their CO₂ footprint is difficult to calculate. Moreover, AI is being used in environmental technology to achieve better, greener goals.
It is worth looking at AI not only in terms of the negative - in this case, the high energy consumption - but also the simultaneous positive effect: it avoids a lot of unnecessary work, and it enables optimized energy management in many areas of the economy. Exactly how this adds up, I don't want to say here - I simply can't. But hopefully there will soon be models that can.