
AI’s energy demands are out of control. Welcome to the era of hyperconsumption on the Internet

At this moment, generative artificial intelligence is impossible to ignore online. An AI-generated summary may appear, unbidden, at the top of the results every time you run a Google search. You may be prompted to try Meta’s AI tool while browsing Facebook. And that ever-present sparkle emoji continues to haunt my dreams.

This rush to add AI to as many online interactions as possible dates back to OpenAI’s groundbreaking launch of ChatGPT in late 2022. Silicon Valley soon became obsessed with generative AI, and nearly two years later, AI tools powered by large language models permeate the online user experience.

An unfortunate side effect of this proliferation is that the computing processes required to run generative AI systems are far more resource-intensive than those behind traditional online services. This has ushered in the Internet’s era of hyperconsumption, a period defined by the spread of a new kind of computing that demands excessive amounts of electricity and water to build and operate.

“On the back end, the algorithms that need to run for any generative AI model are fundamentally very, very different from the traditional Google search or email,” says Sajjad Moazeni, a computer engineering researcher at the University of Washington. “For basic services, those were very lightweight in terms of the amount of data that had to go back and forth between the processors.” By comparison, Moazeni estimates that generative AI applications require 100 to 1,000 times more computational resources.

The technology’s energy requirements for training and deployment are no longer generative AI’s dirty secret: over the past year, one expert after another has predicted surging power demand at the data centers where companies build AI applications. Almost as if on cue, Google recently stopped describing itself as carbon neutral, and Microsoft may be trampling its own sustainability goals underfoot in the ongoing race to build the biggest and best AI tools.

“The carbon footprint and energy consumption will be linear to the amount of computing you do, because basically these data centers are powered proportionally to the amount of computing they do,” says Junchen Jiang, a networked-systems researcher at the University of Chicago. The bigger the AI model, the more computation it requires, and these cutting-edge models are becoming absolutely gigantic.
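Putting those two claims together: if a generative AI query needs roughly 100 to 1,000 times the computation of a traditional request, and energy use scales linearly with computation, the per-query energy follows the same multiplier. Here is a minimal back-of-envelope sketch of that reasoning in relative units only; no real power figures are assumed, and the 100x–1,000x range is the estimate cited above, not a measurement:

```python
# Back-of-envelope sketch: energy assumed proportional to computation.
# All values are in relative units; nothing here is a measured figure.

BASELINE_COMPUTE = 1.0  # a traditional search- or email-style request


def relative_energy(compute_multiplier: float, baseline: float = BASELINE_COMPUTE) -> float:
    """Relative energy use, assuming energy scales linearly with compute."""
    return baseline * compute_multiplier


for multiplier in (100, 1_000):
    print(f"Generative AI query at {multiplier}x the compute -> "
          f"{relative_energy(multiplier):,.0f}x the energy of a baseline query")
```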

Although Google’s total energy consumption doubled between 2019 and 2023, Corina Standiford, a company spokesperson, said in an email that it would not be fair to claim Google’s energy consumption skyrocketed during the AI race. “Reducing emissions from our suppliers is a big challenge, as they account for 75 percent of our carbon footprint,” she says. Among the suppliers Google points to are manufacturers of servers, networking equipment and other technical infrastructure for data centers — an energy-intensive process required to create the physical parts for cutting-edge AI models.
