
Artificial Intelligence Brings Microsoft’s ‘Green Moonshot’ to Earth in West London


If you want evidence of Microsoft’s progress toward its environmental goal of “reaching the moon,” then look closer to Earth: at a construction site on an industrial estate west of London.

The company’s Park Royal data center is part of its commitment to drive the expansion of artificial intelligence (AI), but that ambition clashes with its goal of being carbon negative by 2030.

Microsoft says the center will run entirely on renewable energy. However, building data centers and the servers that fill them means the company’s Scope 3 emissions – such as the CO2 related to the materials in its buildings and the electricity people consume when using products like Xbox – are more than 30% above their 2020 level. As a result, the company is missing its overall emissions target by roughly the same margin.

This week, Microsoft co-founder Bill Gates claimed that AI would help combat climate change, because big tech companies are “really willing” to pay extra to use clean sources of electricity so they can “say they’re using green energy.”

In the short term, AI has proven problematic for Microsoft’s green goals. Brad Smith, Microsoft’s outspoken president, once called its carbon ambitions a “moonshot.” In May, stretching that metaphor to breaking point, he conceded that because of its AI strategy, “the moon has moved.” The company plans to spend £2.5bn over the next three years to grow its AI data center infrastructure in the UK, and this year has announced new data center projects around the world, including in the US, Japan, Spain and Germany.

Training and operating the AI models that underpin products like OpenAI’s ChatGPT and Google’s Gemini use a lot of electricity to power and cool the associated hardware, and additional carbon is generated when manufacturing and transporting the related equipment.

“It’s a technology that is increasing energy consumption,” says Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies.

The International Energy Agency estimates that the total electricity consumption of data centers could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026 – equivalent to the energy demand of Japan. AI will result in data centers using 4.5% of global energy generation by 2030, according to calculations by the research firm SemiAnalysis.


This means that, amid concerns about the impact of AI on jobs and the longevity of humanity, the environment is also at stake. Last week, the International Monetary Fund said governments should consider carbon taxes to capture the environmental cost of AI – whether as a blanket carbon tax that captures server emissions within its scope, or through other methods such as a specific levy on the CO2 generated by that equipment.

All the big tech companies involved in AI – Meta, Google, Amazon and Microsoft – are looking for renewable energy resources to meet their climate goals. In January, Amazon, the world’s largest corporate buyer of renewable energy, announced that it had bought more than half the output of an offshore wind farm in Scotland, while Microsoft said in May it was backing $10bn (£7.9bn) in renewable energy projects. Google aims to have its data centers run entirely on carbon-free energy by 2030.

A Microsoft spokesperson said: “We remain steadfast in our commitment to meeting our climate goals.”

Microsoft co-founder Bill Gates, who left the company in 2020 but retains a stake through the Gates Foundation Trust, has argued that AI can directly help combat climate change. The additional demand for electricity would be more than offset by new investments in green generation, he said on Thursday.

A recent report backed by the UK government agreed that the “carbon intensity of the energy source is a key variable” for calculating AI-related emissions, although it added that “an important part of AI training globally still depends on high-carbon sources, such as coal or natural gas.” The water needed to cool servers is also a problem: one study estimates that AI could account for up to 6.6 billion cubic meters of water use by 2027 – almost two-thirds of the annual consumption of England.

De Vries argues that the quest for sustainable computing power is putting pressure on demand for renewable energy, which would lead to fossil fuels being taken up by other sectors of the global economy.

“Increased energy consumption means we don’t have enough renewables to fuel that increase,” he says.

Server rooms in a data center consume a lot of energy. Photograph: i3D_VR/Getty Images/iStockphoto

NexGen Cloud, a UK company that offers sustainable cloud computing – an industry that relies on data centers to deliver IT services such as data storage and computing power over the internet – says renewable energy for AI-related computing is available to data centers that avoid cities and are located next to hydroelectric or geothermal sources.

Youlian Tzanev, co-founder of NexGen Cloud, says: “The industry norm has been to build around economic centers rather than renewable energy sources.”

This makes it more difficult for any AI-focused tech company to meet its carbon emissions goals. Amazon, the world’s largest cloud computing provider, aims to reach net zero emissions – removing as much carbon as it emits – by 2040, and to match its global electricity consumption with 100% renewable energy by 2025. Google and Meta pursue the same net zero goal by 2030. OpenAI, the developer of ChatGPT, uses Microsoft data centers to train and operate its products.

There are two key ways that large language models (the technology behind chatbots like ChatGPT or Gemini) consume energy. The first is the training phase, where a model receives a large amount of data curated from the internet and beyond, and builds a statistical understanding of the language itself, ultimately allowing it to generate compelling answers to queries.

The upfront energy cost of training an AI is astronomical, preventing smaller companies (and even smaller governments) from competing in the space without a spare $100 million to invest in a training trial. But it pales in comparison to the cost of actually running the resulting models, a process known as “inference.” According to analyst Brent Thill of the investment firm Jefferies, 90% of the energy cost of AI is in that inference phase — the electricity used when people ask an AI system to answer factual queries, summarize a piece of text or write an academic essay.
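Jefferies’ 90% figure implies a simple back-of-envelope split of any AI energy budget between the two phases. A minimal sketch (the 100 TWh budget below is purely illustrative, not a figure from this article):

```python
# Split a hypothetical AI energy budget using the Jefferies estimate
# that ~90% of AI energy goes to inference rather than training.
INFERENCE_SHARE = 0.90  # analyst estimate cited in the text

def split_energy(total_twh: float) -> dict:
    """Return the inference/training split for a total energy budget in TWh."""
    inference = total_twh * INFERENCE_SHARE
    training = total_twh - inference
    return {"inference_twh": inference, "training_twh": training}

# Illustrative budget of 100 TWh: 90 TWh inference, 10 TWh training.
print(split_energy(100.0))
```

The point of the arithmetic is that even a colossal one-off training bill is dwarfed, over a model’s lifetime, by the steady drip of everyday queries.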

The electricity used for training and inference is funneled through a vast and growing digital infrastructure. Data centers are filled with servers built for the specific part of the AI workload they handle. A single training server might have a central processing unit (CPU) barely more powerful than your own computer, paired with dozens of specialized graphics processing units (GPUs) or tensor processing units (TPUs) – microchips designed to rapidly churn through the massive volumes of simple calculations that AI models are made of.

If you use a chatbot, as you watch it spit out answers word by word, a powerful GPU is using about a quarter of the power needed to boil a kettle. All of this is hosted in a data center, either owned by the AI provider itself or by a third party – in which case it might be called “the cloud,” a fancy name for someone else’s computer.

SemiAnalysis estimates that if generative AI were built into every Google search, this could translate into annual energy consumption of 29.2 TWh – comparable to what Ireland consumes in a year – although the financial cost to the tech company would be prohibitive. That has led to speculation that the search company could start charging for some AI tools.
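The scale of that estimate can be sanity-checked with simple arithmetic. A hedged sketch (the searches-per-day figure is a commonly cited rough estimate, not from this article): spreading 29.2 TWh across a year of Google-scale query volume gives the implied energy cost per AI-assisted search.

```python
# Back-of-envelope: per-query energy implied by the 29.2 TWh/year estimate.
# Assumption (not from the article): roughly 9 billion Google searches per day.
ANNUAL_ENERGY_TWH = 29.2
SEARCHES_PER_DAY = 9e9

annual_searches = SEARCHES_PER_DAY * 365
energy_wh_per_search = ANNUAL_ENERGY_TWH * 1e12 / annual_searches  # 1 TWh = 1e12 Wh
print(f"{energy_wh_per_search:.1f} Wh per AI-assisted search")
```

Under those assumptions each query lands at roughly 9 Wh – an order of magnitude more than a conventional web search is usually reckoned to cost, which is why the economics look prohibitive.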

But some argue that the right way to assess AI is not by the energy it consumes, but by the energy its new tools can save. A provocative paper published earlier this year in the peer-reviewed Nature journal Scientific Reports argued that the carbon emissions from writing and illustrating are lower for AI than for humans.

Artificial intelligence systems emit “between 130 and 1,500 times” less carbon dioxide per page of text generated compared to human writers, the University of California, Irvine researchers estimated, and up to 2,900 times less per image.

Of course, what is left unsaid is what those human writers and illustrators would do instead: retraining and redeploying them in another field, such as green jobs, could be another moonshot.
