AI applications are also supposed to help protect the climate in the future, but researchers warn of a sharp increase in the energy consumption of AI data centres. “AI tools consume a lot of electricity, and the trend is rising,” Ralf Herbrich, Managing Director of the Hasso Plattner Institute (HPI) in Potsdam and Head of its Department of Artificial Intelligence and Sustainability, told the German news agency dpa. Even training a single AI model is an energy-intensive process involving complex computations.
Data scientist Alex de Vries from Amsterdam compares the energy consumption of an AI-powered search engine with that of entire countries. Scientists and Internet companies are working to improve the environmental footprint of artificial intelligence.
The topic of artificial intelligence has received significant attention, not least through the ChatGPT text bot from California startup OpenAI. Artificial intelligence also controls safety technology in cars, makes heating systems more efficient, and is found in healthcare and business.
Energy consumption could rise to 30 percent
“Data centers now account for 4 to 5 percent of global energy consumption,” Herbrich said. “If we add the use of digital devices such as laptops and smartphones, the figure reaches 8 percent. There are estimates that consumption will rise to 30 percent in the next few years.”
The AI expert draws a comparison with an oven: to train an AI model, processors on hundreds of graphics cards, each consuming about 1,000 watts, run for several weeks. “1,000 watts is equivalent to an oven.”
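Herbrich's figures allow a rough back-of-envelope estimate. The concrete card count and training duration below are illustrative assumptions standing in for the article's "hundreds of graphics cards" and "several weeks", not reported numbers:

```python
# Rough training-energy estimate from the figures in the article:
# graphics cards drawing ~1,000 W each, running for several weeks.
# The values for CARDS and WEEKS are assumptions chosen only to
# illustrate the order of magnitude.

CARDS = 300              # "hundreds of graphics cards" (assumed)
WATTS_PER_CARD = 1_000   # ~1,000 W each, "equivalent to an oven"
WEEKS = 4                # "several weeks" (assumed)

hours = WEEKS * 7 * 24
energy_mwh = CARDS * WATTS_PER_CARD * hours / 1e6  # watt-hours -> MWh

print(f"{energy_mwh:.0f} MWh")  # about 200 MWh under these assumptions
```

Even under these modest assumptions, a single training run consumes as much electricity as dozens of households use in a year.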
According to Herbrich, research aims to ensure that the calculations can be performed with fewer parameters, and thus less energy, while keeping the loss in prediction accuracy to a minimum. Technology companies have also pushed research into energy savings for artificial intelligence. However, it will take a few years to develop solutions.
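The article does not name a specific method, but magnitude pruning is one common way to reduce a model's parameter count while trying to change its predictions as little as possible. The sketch below illustrates only the general idea:

```python
import numpy as np

# Illustrative sketch of magnitude pruning: weights close to zero are
# dropped, shrinking the model while aiming to preserve its accuracy.
# This is an example of the general parameter-reduction idea, not the
# specific technique researched at HPI.

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))  # toy weight matrix

def prune(w: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Zero out all but the largest-magnitude fraction of weights."""
    k = int(w.size * keep_fraction)
    threshold = np.sort(np.abs(w), axis=None)[-k]  # k-th largest magnitude
    return np.where(np.abs(w) >= threshold, w, 0.0)

pruned = prune(weights, keep_fraction=0.5)
print(np.count_nonzero(pruned), "of", weights.size, "weights remain")
```

In practice, pruned models are usually fine-tuned afterwards to recover most of the lost accuracy.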
Higher costs due to additional AI servers required
Researcher de Vries, who recently published a commentary in the journal Joule, points out that it is not just AI training that consumes a large amount of energy. Power is also required every time a tool generates text or an image. “Running ChatGPT, for example, could cost 564 MWh of electricity per day,” says de Vries. However, the future energy consumption of AI is difficult to predict.
De Vries estimates that Google currently processes up to 9 billion searches per day. According to his calculations, if AI were used in every Google search, about 29.2 terawatt-hours of electricity would be needed annually – equivalent to Ireland’s annual electricity consumption. However, de Vries describes this as an extreme scenario that will not materialize in the short term, citing the higher costs of the additional AI servers required and bottlenecks in the supply chain. For comparison: according to the Federal Network Agency, electricity consumption in Germany was about 484 terawatt-hours in 2022.
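The figures in the two paragraphs above can be cross-checked with simple arithmetic. The input numbers are taken from the article; the conversions are mine:

```python
# Cross-check of the figures quoted above: 564 MWh per day for ChatGPT,
# and 9 billion daily Google searches against 29.2 TWh per year.

# ChatGPT: 564 MWh/day converted to TWh/year
chatgpt_twh_per_year = 564 * 365 / 1e6
print(f"ChatGPT: {chatgpt_twh_per_year:.2f} TWh/year")

# AI-in-every-search scenario: implied energy cost per search
SEARCHES_PER_DAY = 9e9
ANNUAL_TWH = 29.2
wh_per_search = ANNUAL_TWH * 1e12 / (SEARCHES_PER_DAY * 365)
print(f"Per search: {wh_per_search:.1f} Wh")

# Scale of the scenario relative to Germany's 2022 consumption (484 TWh)
print(f"Share of German consumption: {ANNUAL_TWH / 484:.0%}")
```

The implied cost of roughly 9 Wh per AI-assisted search is an order of magnitude above a conventional search, which is why de Vries flags the scenario as extreme.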
Internet company Google, which launched its Bard chatbot this year, said upon request that, according to studies and its own experiments, the energy needed to operate the technology is growing much more slowly than many forecasts predicted. Google used tried-and-tested methods to significantly reduce the power consumption of model training. The company also points out that it uses artificial intelligence to protect the climate, citing, for example, “fuel-efficient route planning” in Google Maps and river flood forecasting.
The Hasso Plattner Institute is organizing the “Clean IT” conference in Potsdam on October 25-26, where representatives from science, business and politics will discuss artificial intelligence and the fight against climate change.