A study by Alex De Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, warns that the AI industry could be consuming as much energy as a country the size of the Netherlands by 2027.

The Impact Of AI 

De Vries is the founder of Digiconomist, a research company focusing on the unintended consequences of digital trends, and his previous research has examined the environmental impact of emerging technologies (e.g., blockchain). His warning is based on the assumption that certain key parameters remain unchanged.

For example, assuming that the rate of growth of AI continues, that AI chips remain available, that servers run continuously at maximum output, and that chip designer Nvidia keeps supplying 95 per cent of the AI sector’s processors, De Vries calculates that by 2027 AI computers could be consuming between 85 and 134 terawatt-hours (TWh) of electricity each year.
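
The article doesn’t reproduce the underlying arithmetic, but a minimal sketch in Python shows the kind of back-of-envelope calculation involved. The server-shipment and per-server power figures below are illustrative assumptions chosen because they reproduce the published 85-134 TWh range; they are not stated in the article itself.

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours: servers assumed to run non-stop

# Assumed inputs (illustrative, not from the article): by 2027 Nvidia ships
# around 1.5 million AI servers a year, each drawing roughly 6.5 kW at the
# low end and 10.2 kW at the high end, all running at maximum output.
SERVERS = 1_500_000
LOW_KW, HIGH_KW = 6.5, 10.2

def annual_twh(server_count: int, kw_per_server: float) -> float:
    """Annual electricity use of a server fleet, in terawatt-hours."""
    kwh = server_count * kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9  # 1 TWh = 1 billion kWh

print(f"Low estimate:  {annual_twh(SERVERS, LOW_KW):.1f} TWh")   # ~85.4
print(f"High estimate: {annual_twh(SERVERS, HIGH_KW):.1f} TWh")  # ~134.0
```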

The Same Amount Of Energy Used By A Small Country 

This figure roughly equates to the amount of electricity used annually by a small country such as the Netherlands, and to around half a per cent of total global electricity consumption. The research didn’t include the energy required for cooling (e.g. using water).

Why? 

The large language models (LLMs) that power popular AI chatbots like ChatGPT and Google Bard require huge data centres of specialist computers with high energy and cooling requirements. For example, whereas a standard data centre computer rack draws 4 kilowatts (kW) of power (the same as a family house), an AI rack draws 20 times that (80 kW), and a single data centre may contain thousands of AI racks.
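
To put those rack figures in perspective, here’s a quick sketch using the article’s 80 kW-per-rack figure and a hypothetical facility of 2,000 AI racks (the rack count is an assumption for illustration only):

```python
AI_RACK_KW = 80            # per-rack draw from the article (20x a 4 kW rack)
HOURS_PER_YEAR = 365 * 24  # assume the racks run continuously

racks = 2_000              # hypothetical facility size, for illustration

demand_mw = racks * AI_RACK_KW / 1_000                  # kW -> MW
annual_gwh = racks * AI_RACK_KW * HOURS_PER_YEAR / 1e6  # kWh -> GWh

print(f"Instantaneous demand: {demand_mw:.0f} MW")     # 160 MW
print(f"Annual consumption:   {annual_gwh:,.0f} GWh")  # ~1,402 GWh
```

Using the article’s own 4 kW family-house comparison, that hypothetical facility would draw as much power as 40,000 houses.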

Other reasons why large AI systems require so much energy include:

  • The scale of the models. For example, larger models with billions of parameters require more computations. 
  • The vast amounts of training data processed increase energy usage.
  • The hardware (powerful GPU or TPU clusters) is energy intensive.
  • The multiple iterations of training and tuning use more energy, as does fine-tuning, i.e. the additional training on specific tasks or datasets.
  • Popular services host multiple instances of the model in various geographical locations (model redundancy), which increases energy consumption.
  • Server overhead (infrastructure support), like cooling and networking, uses energy.
  • Millions of user interactions accumulate energy costs, even if individual costs are low (the inference volume) – see the sketch after this list.
  • Despite optimisation techniques, initial training and large model sizes are energy-intensive, as are the frequent updates, i.e., the regular training of new models to stay state-of-the-art.
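
As a rough illustration of the inference-volume point above, the sketch below shows how small per-query costs add up. Both inputs are assumptions chosen for illustration (a few watt-hours per chatbot query is a commonly cited ballpark); neither figure appears in the article.

```python
# Assumed inputs, for illustration only (not from the article).
WH_PER_QUERY = 3               # ballpark energy per chatbot query, in Wh
QUERIES_PER_DAY = 200_000_000  # hypothetical daily query volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000              # MWh -> GWh

print(f"Daily:  {daily_mwh:,.0f} MWh")   # 600 MWh per day
print(f"Annual: {annual_gwh:,.0f} GWh")  # ~219 GWh per year
```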

Huge Water Requirements Too – Which Also Require Energy

Data centres typically require vast quantities of water for cooling, a situation that’s being exacerbated by the growth of AI. To give an idea of how much water, back in 2019, before the widescale availability of generative AI, it was reported (via public records and online legal filings) that Google requested (and was granted) more than 2.3 billion gallons of water for data centres in three different US states. Also, a legal filing showed that in Red Oak, just south of Dallas, Google may have needed as much as 1.46 billion gallons of water a year for its data centre by 2021. This led to Google, Microsoft, and Facebook pledging ‘water stewardship’ targets to replenish more water than they consume.

Microsoft, which is investing heavily in AI development, revealed that its water consumption had jumped by 34 per cent between 2021 and 2022, to 6.4 million cubic metres, equivalent to around 2,500 Olympic swimming pools.
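
That pool comparison is easy to check. An Olympic pool measures roughly 50 m x 25 m x 2 m, or about 2,500 cubic metres (the pool dimensions are standard figures, not from the article):

```python
POOL_M3 = 50 * 25 * 2     # one Olympic pool: ~2,500 cubic metres
MICROSOFT_M3 = 6_400_000  # Microsoft's 2022 water use, from the article

print(f"{MICROSOFT_M3 / POOL_M3:,.0f} pools")  # 2,560 -> "around 2,500"
```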

Energy is required to operate such vast water-cooling systems, and recent ideas for supplying adequate power to data centre racks and cooling systems have even included directly connecting a data centre to its own 2.5-gigawatt nuclear power station (Cumulus Data, a subsidiary of Talen Energy).

Google In The Spotlight

The recent research by Alex De Vries also highlighted how much energy a company like Google (which already has the Bard chatbot and Duet, its answer to Copilot) would need if it alone switched its whole search business to AI. The research concluded that in this scenario Google, already a huge data centre operator, would need 29.3 terawatt-hours per year, equivalent to the electricity consumption of Ireland!
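
The article doesn’t show the working behind the 29.3 TWh figure, but a back-of-envelope calculation with assumed inputs lands in the same range. The search volume and per-search energy below are illustrative assumptions, not figures from the research itself:

```python
# Assumed inputs, for illustration only (not from the article).
SEARCHES_PER_DAY = 9_000_000_000  # rough global daily Google search volume
WH_PER_AI_SEARCH = 9              # assumed energy per AI-assisted search

annual_twh = SEARCHES_PER_DAY * WH_PER_AI_SEARCH * 365 / 1e12  # Wh -> TWh
print(f"{annual_twh:.1f} TWh per year")  # ~29.6 TWh, the same order as 29.3
```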

What Does This Mean For Your Organisation? 

Data centres are not just a significant source of greenhouse gas emissions; they also typically require large amounts of energy for computing, cooling, and network operations. With the increasing use of AI, this energy requirement has been rising dramatically and only looks set to grow.

AI, therefore, stands out as both an incredible opportunity and a significant challenge. Although businesses are only just getting to grips with the many benefits that the relatively new tool of generative AI has given them, the environmental impact of AI is also becoming increasingly evident. Major players like Google and Microsoft are already feeling the pressure, leading them to adopt eco-friendly initiatives. For organisations planning to further integrate AI, it may be crucial to consider its environmental implications and move towards sustainable practices. 

It’s not all doom and gloom though, because while the energy demands of AI are high, there are emerging solutions that may offer hope. Investment in alternative energy sources such as nuclear fusion, although still at a very early stage of development (it is only just able to generate slightly more power than it uses), could redefine how we power our technology in the future. Additionally, the idea of nuclear-powered data centres, like those proposed by Cumulus Data, suggests a future where technology can be both powerful and environmentally friendly.

Efficiency is also a key consideration. As we continue to develop and deploy AI, there’s a growing emphasis on optimising energy use. Innovations in cooling technology, server virtualisation, and dynamic power management are making strides in ensuring that AI operations are as green as they can be, although they don’t yet tackle the sheer scale of the energy requirement challenge.

Despite the challenges, however, there are significant opportunities too. The energy needs of AI have opened the door to economic growth, and companies that can offer reliable, low-carbon energy solutions stand to benefit, potentially unlocking significant cost savings.

Interestingly, AI itself might be part of the solution. Its potential to speed up research or optimise energy use positions AI as a tool that can help, rather than hinder, the journey towards a more sustainable future. 

It’s clear, therefore, that as we lean more into an AI-driven world, it’s crucial for organisations to strike a balance. Embracing the benefits of AI, while being mindful of its impact, will be essential. Adopting proactive strategies, investing in green technologies, and leveraging AI’s problem-solving capabilities will be key for businesses moving forward.

If you would like to discuss your technology requirements please:
