You can’t imagine how much energy artificial intelligence consumes. ARM does some math

Generative models and artificial intelligence applications are destined to increase electricity consumption exponentially, especially as these tools become increasingly widespread and rooted in many sectors of industry and the working world in general.

Rene Haas, CEO of ARM, joins the chorus of those warning against AI's insatiable hunger for electricity. The adoption, at all levels, of modern generative artificial intelligence models cannot ignore the issue of environmental sustainability.

How much energy do artificial intelligence and generative models consume?

The International Energy Agency (IEA) published an interesting report in January 2024 showing that every single request sent to ChatGPT consumes approximately 2.9 watt hours, roughly 10 times the energy required for a standard search on Google's engine. By "standard search," the IEA means the traditional sending of a query to the Mountain View company's search engine, without the involvement of any tool based on artificial intelligence.

If Google itself replaced the algorithms its search engine has used until now with queries based on generative models, its energy consumption could jump from the current 1 terawatt hour a year to 11 terawatt hours. For reference, 2.9 watt hours is enough to keep a 60-watt bulb lit for just under three minutes.
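The per-query figures above are easy to verify with a little arithmetic. A minimal sketch in Python (the 2.9 Wh, 10x, and 60 W values come from the article; the per-search figure and the bulb time are derived from them):

```python
# Energy figures reported by the IEA (January 2024).
CHATGPT_WH_PER_QUERY = 2.9   # watt hours per ChatGPT request
GOOGLE_FACTOR = 10           # a ChatGPT query ~ 10x a standard Google search

# A standard Google search therefore consumes about 0.29 Wh.
google_wh_per_query = CHATGPT_WH_PER_QUERY / GOOGLE_FACTOR

# How long would 2.9 Wh keep a 60 W bulb lit?
BULB_WATTS = 60
minutes_lit = CHATGPT_WH_PER_QUERY / BULB_WATTS * 60  # hours -> minutes

print(f"{google_wh_per_query:.2f} Wh per standard search")
print(f"{minutes_lit:.1f} minutes of light from a 60 W bulb")
```

The result, about 2.9 minutes, matches the "just under three minutes" quoted above.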

Furthermore, the IEA notes that the transition to even more powerful models, designed to support the functioning of future AGIs (Artificial General Intelligence), could further complicate the picture in terms of energy impact.

Other numbers are merciless: according to estimates by Factorial Funds, OpenAI's new Sora model would need the equivalent computing power of at least one NVidia H100 GPU for an hour in order to generate a 5-minute video.

Training xAI's (Elon Musk) generative model Grok 3 would have required 100,000 NVidia H100 cards, with a single H100 consuming up to 3,740 kilowatt hours per year.
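Combining those two figures gives a sense of the scale involved. A back-of-the-envelope sketch using only the article's numbers (100,000 cards, 3,740 kWh per card per year):

```python
# Reported figures: Grok 3 training cluster size and per-card consumption.
H100_CARDS = 100_000
KWH_PER_H100_YEAR = 3_740

# Total annual consumption of such a cluster, in kWh and TWh.
total_kwh = H100_CARDS * KWH_PER_H100_YEAR  # 374,000,000 kWh
total_twh = total_kwh / 1e9                 # 1 TWh = 1e9 kWh

print(f"{total_kwh:,} kWh per year (~{total_twh} TWh)")
```

That is roughly 0.374 TWh per year for the cards alone, comparable to the 1 TWh per year the IEA attributes to all of Google's current search traffic.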

A step change is needed to reduce AI energy consumption as demand increases

Haas observes that without intervention by legislators and without significant improvements in energy efficiency, the trend now under way will be impossible to sustain.

Also according to the IEA, the USA generated a total of 4,240 terawatt hours of electricity in 2022, 22% of which came from renewables, while overall annual consumption stands at 3,900 terawatt hours. In Europe, production (in 2022) stands at 2,800 terawatt hours, with 1,400 terawatt hours coming from renewables.

At this rate, Haas says, AI solutions could consume as much as 25% of the energy produced by 2030. A decisive change of direction is clearly necessary.
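Haas's 25% figure can be set against the IEA production numbers quoted above. A rough sketch, with the assumption (mine, not the article's) that 2022 US production is used as the baseline, since 2030 output will differ:

```python
# IEA figures for the USA, 2022.
US_PRODUCTION_TWH = 4_240
US_RENEWABLES_SHARE = 0.22

# Haas's projection for AI's share of produced energy by 2030.
AI_SHARE_2030 = 0.25

ai_twh = US_PRODUCTION_TWH * AI_SHARE_2030                # 1,060 TWh
renewables_twh = US_PRODUCTION_TWH * US_RENEWABLES_SHARE  # ~933 TWh

# On these numbers, AI alone would consume more than all
# US renewable generation delivered in 2022.
print(f"AI: {ai_twh:.0f} TWh vs renewables: {renewables_twh:.0f} TWh")
```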

The most logical step is, of course, to increase renewable energy production: this virtuous circle must be promoted at the supranational level, by individual countries, and within each company.

These efforts, however, will not be enough: as Oracle explains, developing sustainable AI means abandoning the race toward ever-larger generative models and focusing instead on more compact solutions that are effective for specific needs.

The inference phase is computationally much less demanding than model training. On the other hand, the numbers add up when many instances of the same model are used simultaneously in multiple organizations.

Inference is the process by which the model draws conclusions or makes decisions based on the input data provided. In data science and artificial intelligence, inference is the final phase, in which an already trained model is put to use.

Opening image credit: iStock.com – BlackJack3D
