Global power shortages mean data centers could struggle to shoulder the burden of energy-intensive generative AI demands in 2024

Power and storage requirements for data centers are growing exponentially and creating problems for the industry, according to JLL’s data centers global outlook report for 2024.

The increased enterprise focus on generative AI requires a huge amount of power, which in turn is exacerbating a “scarcity of data center colocation supply” caused by regional power limitations.

Generative AI is expected to be a major factor in rising global electricity consumption in the coming years, which the European Commission estimates will increase by 60% by 2030.

All told, the rising electricity consumption of high-performance computing (HPC) and AI workloads is putting pressure on the energy infrastructure of data centers, according to JLL.

Increasingly, data center operators are having to meet power requirements ranging from 300 megawatts (MW) to 500MW, causing a drastic shift in site selection criteria and day-to-day management.

With the generative AI market set to grow to $1.3 trillion over the next 10 years according to Bloomberg Intelligence, the JLL report suggests that now is the time for companies to realize that AI-specialized data centers “look different than conventional facilities.”

“AI, especially with the newfound fondness for deep learning (read generative AI), is a compute-hungry beast,” chief AI architect at UST Adnan Masood told ITPro.

This is “not just an uptick,” Masood added, but instead represents a growing demand that “dwarfs traditional data centers”.

Generative AI requires more densely clustered and performance-intensive infrastructure than traditional data centers. This, in turn, creates more heat and requires more sophisticated cooling techniques. 

“High computational power equals a lot of heat, and managing this heat is crucial,” Masood said. 

“Efficient, advanced cooling systems are essential to prevent hardware damage (think on-chip cooling) and maintain performance, but these systems are neither simple nor cheap,” he added.

Data centers are going to face serious power issues

The key takeaway from JLL is that data centers are going to be under increasing levels of strain in years to come.

Irish electricity company EirGrid has estimated that electricity demand from data centers could more than double to 30% of all consumption in 2028, while data center electricity usage in Denmark is forecast to grow from 1% to 15% of total consumption by 2030. 

To make matters worse, global energy infrastructure is in poor condition. One-third of Europe’s grid is reportedly over 40 years old and in desperate need of around $641 billion (€584 billion) in modernization investment.

JLL predicts that total data generation will double in the next five years, requiring a corresponding growth in data center storage capacity from 10.1 zettabytes (ZB) in 2023 to 21ZB in 2027.

Generative AI will inevitably drive this growth in data, and the way in which this volume of data is managed, accessed, and stored will affect decisions about data center development, according to Niklas Lindqvist, general manager for Nordics at Onnec.

“Data centers must invest in robust power infrastructure and devise efficient cooling solutions to prevent overheating and combat relentless energy consumption,” he said.

“While air cooling still has a role to play, it often falls short in cooling AI hardware efficiently. Often, liquid cooling is the preferred option for high-performance chips, offering superior cooling and potential cost benefits.”

Lindqvist added that cabling, often referred to as the “forgotten child” of data center design, is still being habitually overlooked by data center operators, which in turn has a negative impact on efficiency. 

“As with power connections and cooling systems, cabling is embedded infrastructure. These are all built into the structure of a data center complex,” he said. “This means systems can be extortionately expensive to replace, if not outright economically impossible – leading to huge problems down the line.”

How the global data industry can secure itself against power and storage problems 

AI training and tuning workloads are not typically latency-sensitive, giving site developers more freedom in terms of location, JLL noted.

As such, prospective AI-specialized data center developers can take advantage of this flexibility to pick sites that support sustainable power, as in the case of green data centers, or free cooling.

Masood echoed Lindqvist’s comments on alternative solutions, highlighting liquid immersion cooling in particular, which involves the complete submersion of servers in non-conductive liquids. He also pointed to AI-optimized processors designed to fulfill high computing requirements.

What the industry should avoid, however, is load shedding: the practice of selectively shutting down power in areas to avoid complete outages.

The JLL report critiques this method as short-sighted, making the point that data centers rely on uninterruptible power supply (UPS) systems to operate critical equipment.

George Fitzmaurice
Staff Writer

George Fitzmaurice is a staff writer at ITPro, ChannelPro, and CloudPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.