The artificial intelligence (AI) boom has driven tech share prices to new heights but at a significant environmental cost.

Google recently acknowledged that its data centres, crucial for AI infrastructure, have contributed to a 48% increase in its greenhouse gas emissions since 2019. The company cited “significant uncertainty” regarding its goal of achieving net-zero emissions by 2030 due to the unpredictable environmental impact of AI.

Similarly, Microsoft, a major backer of ChatGPT developer OpenAI, has expressed doubts about meeting its 2030 net-zero target due to its AI initiatives. This raises the question: can the tech industry mitigate AI’s environmental impact, or will it prioritize AI supremacy despite the environmental cost?

Data centres, essential for training and operating AI models such as Google’s Gemini and OpenAI’s GPT-4, house the sophisticated computing equipment that processes vast amounts of data. Running them requires substantial electricity, which generates CO2 depending on how that electricity is produced, on top of the “embedded” CO2 emitted in manufacturing and transporting the equipment.

The International Energy Agency (IEA) predicts that total electricity consumption by data centres could double from 2022 levels to 1,000 terawatt-hours (TWh) by 2026, roughly equivalent to Japan’s annual electricity demand.

Research firm SemiAnalysis estimates that AI will account for 4.5% of global energy consumption by 2030. A separate study estimates AI could use up to 6.6 billion cubic metres of water by 2027, nearly two-thirds of England’s annual consumption.

A UK government-backed report on AI safety highlighted that the carbon intensity of the energy used by tech firms is crucial in determining AI’s environmental cost. The report noted that a significant portion of AI model training still relies on fossil fuel-powered energy.

In response, tech firms are securing renewable energy contracts to meet their environmental goals. For instance, Amazon is the world’s largest corporate purchaser of renewable energy. However, some experts argue that this can push other energy users toward fossil fuels due to insufficient clean energy supply.

“Energy consumption is not just growing, but Google is also struggling to meet this increased demand from sustainable energy sources,” says Alex de Vries, the founder of Digiconomist, a website monitoring the environmental impact of new technologies.

Governments around the world aim to triple renewable energy capacity by the end of the decade to cut fossil fuel use in line with climate targets. However, the IEA warns that, even after record growth in global renewable capacity in 2023, current government plans would only double renewable energy by 2030.

To meet AI’s energy demands, tech companies might need to invest heavily in new renewable energy projects. Onshore projects such as wind and solar farms can be built relatively quickly, in as little as six months. However, slow planning processes in many developed countries and long waits to connect new projects to the power grid can stretch that timeline. Offshore wind farms and hydropower schemes face similar hurdles, on top of construction times of two to five years.

This raises concerns about whether renewable energy can keep pace with AI expansion. Major tech companies have already secured supply deals with a third of US nuclear power plants to provide low-carbon electricity to their data centres. Without investment in new power sources, such deals could simply divert low-carbon electricity away from other users, increasing overall fossil fuel consumption.

Normally, rising energy costs would force industries to economise. However, tech giants may choose to absorb these costs, spending billions of dollars to maintain their competitive edge. The most expensive data centres in the AI sector are those used to train “frontier” AI models such as GPT-4 and Claude 3.5, the most advanced systems yet built. Leading developers, including OpenAI, Anthropic (maker of Claude) and Google (maker of Gemini), are locked in a “winner takes all” competition in which each must either outspend the others or risk falling behind.

The race for Artificial General Intelligence (AGI), AI systems capable of performing any human task, intensifies this spending. Companies might invest hundreds of billions of dollars in a single training run if it could lead to a technological monopoly that, as OpenAI suggests, could “elevate humanity.”

Despite breakthroughs that make AI training more efficient, such as DeepMind’s Chinchilla project, which showed how to train models with less computing power, overall electricity use has not fallen. Instead, the same amount of energy is poured into developing more advanced AI systems. This is an example of the Jevons paradox, in which efficiency gains end up increasing resource consumption because lower costs open up new, previously unviable applications.

The AI industry’s growth and its environmental implications present a complex challenge. While technological advancements promise immense benefits, balancing these with sustainable practices is crucial to mitigate the sector’s growing environmental footprint.

 


 
