Can we disrupt the data center designs?

Distributed approaches or new power sources could enable data centers to grow with minimal harm to the environment

A data center technician overseeing a rack submerged in an immersion cooling tank.
(Image credit: Getty Images)

It's no secret that demand for data processing and storage is exploding. Even before AI, between 2010 and 2025 the amount of data we created and stored grew from around 2 to 181 zettabytes – much of it stored and processed in data centers.

Of course, generative AI has massively increased data center demand. Last year, Deloitte predicted data centers would consume 4% of global electricity by 2030. According to the US Department of Energy, that figure is already 4-5% in the US.

Water use, too, is a major concern for data centers. Large data centers can consume as much water for cooling as a town of 10,000-50,000 people, according to the Environmental and Energy Study Institute. It's clear something must be done to avoid putting additional strain on beleaguered electricity grids and water-stressed regions.

What’s being done now to address data center environmental impact and what are our options for the future?

The data centers of tomorrow

Data centers face steep power constraints. In theory, if you could completely clean up the power supply for data centers, you could dramatically reduce their overall emissions.

Solar, wind and hydro power have all been used to power and cool data centers here on Earth, with renewables offering an increasingly cost-effective route to green data centers.

Some have suggested that space-based data centers might be the next step – and several serious plans for orbiting data centers are afoot, including one between an Australian compute module manufacturer and an Indian space-based infrastructure company. Nvidia has even released computing platforms intended for orbital data centers.

But despite the pace of advancement in the space industry, we're a long way from orbital computing replacing ground-based server farms. The extra load on power grids and the environmental impact need action now.

Despite a long history of political and public mistrust, some are seriously considering nuclear. In 2024, Microsoft signed a 20-year power purchase agreement (PPA) with energy provider Constellation, to power data centers with nuclear energy from a reopened Three Mile Island plant. The same year, AWS announced a $650 million plan to acquire a data center next to the 2.5GW Susquehanna nuclear power station in Pennsylvania, in a deal with Talen Energy. The original plan was to power a 480MW data center, though this was later scaled back to just the initial 300MW on site.

While nuclear doesn't reduce total power or cooling loads, it has the potential to ease the demand on the local grid. Small modular reactors could achieve the same, at a lower cost and in a more scalable form factor, beating some of the most prevalent criticisms of nuclear power.

Most of the efforts towards reducing energy and water consumption are centered around optimizing current data center components. A Google data center in a former paper mill in Hamina, Finland, includes tunnels to the sea – pumps bring ocean water in to cool components, then carry the heat back out to sea. Another small data center sits in the Loire River in Nantes, France, with cooling provided by the natural flow of the water. Future plans include an offshore model that does the same on the ocean swell.

Adnan Masood, chief AI architect at technology business consultancy UST, calls such systems “early proof that chillers [fan, freshwater and pump systems] aren't destiny”.

A body called The Ocean Sewage Alliance, which forges partnerships with industry to solve environmental problems related to the seas, says one possible solution is to use reclaimed or treated wastewater instead of potable or drinking water sources. Spokesperson Larissa Balzer says the adoption of reclaimed water for cooling is a “growing trend among major tech companies”.

Small wins

In trying to save power, water and waste in data processing, it's worth remembering that not all data is created, stored or accessed equally. There's a whole class of information that can get by with lower power inputs, higher latency and intermittent connectivity.

A study published in the January-March 2025 issue of IEEE Pervasive Computing talked about the possibility of taking discarded and retired smartphones and wiring them together to form little data centers. With processors and storage already on board, the promise is to kill two birds with one stone – reduce the breakneck demand for first generation data center equipment and do something about the growing e-waste problem.

Amit Chadha, CEO and managing director of L&T Technology Services, agrees about what he calls 'micro server farms'. "They can take on lighter but essential workloads like IoT data aggregation, local caching or microservices,” he says.

“They won't replace GPU-heavy AI clusters, but they can free up capacity in data centers for more intensive applications."

On a much larger scale this process already takes place with retired supercomputer parts, which are repurposed for less intensive workloads after they're deemed outdated.

Ezra Hodge, who works in AI and Data Centers at EMA Partners, likewise imagines idle smartphone chipsets in a globally distributed, low-cost parallel processing grid. "For emerging markets like education, or early-stage AI startups, it provides a way to access massive horsepower without the overhead," he says.

Former electric vehicle (EV) batteries may also be used as power sources in the near future. Reduced range and power, along with safety concerns, mean EV batteries are retired from cars by the time they drop to about 75% of their original capacity – but data processing has far lower power input needs, particularly for off-peak workloads.

Batteries can be clean replacements for diesel generators in backup power, and they can be taken offline and recharged according to workload schedules to take up the slack. This year, battery recycling provider Redwood Materials used second-life storage for a Nevada data center, the second largest battery-powered grid in North America.

And with the amount of e-waste in the world, the sky's the limit. Retired laptops and gaming consoles have been suggested as the building blocks of micro-clouds for small organizations like schools. Displays from old phones or tablets can become monitoring dashboards for server racks. Microphones, accelerometers/gyros and cameras can become monitoring stations that warn operators about server room sounds or vibrations, watching for fan or motor failures.

When do we want it?

Space-based data centers, nuclear power and mass e-waste recycling programs will all take time, and finding better ways of handling data is a problem for today. What can we do with what we have now?

First is the need to handle data more efficiently with the equipment we're already using. Just one approach is separating compute from storage.

"'Helper' components like SmartNICs can move, filter, encrypt or prepare data while main processors, the 'brains' doing queries, analytics or AI, can focus on the heavy lifting," says Tobie Morgan Hitchcock, CEO and co-founder at AI-native database firm SurrealDB.

If you decouple your systems, Hitchcock says the storage layer can deliver smaller, cleaner, more focused information to operate on. "It lets you scale the storage and compute components separately based on how much each really needs."
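A minimal sketch of the idea in Python (hypothetical names and data, not SurrealDB's API): the storage layer filters and projects records where they live, so the compute layer only receives the small, clean slice it actually needs.

```python
# Hypothetical illustration of decoupling storage from compute.
# The storage layer does the moving, filtering and preparing;
# the compute layer does the analytics on a narrow slice.

RECORDS = [
    {"id": 1, "region": "eu", "watts": 420, "log": "..." * 1000},
    {"id": 2, "region": "us", "watts": 515, "log": "..." * 1000},
    {"id": 3, "region": "eu", "watts": 388, "log": "..." * 1000},
]

def storage_layer(region, fields):
    """Push filtering and projection down to where the data lives."""
    for rec in RECORDS:
        if rec["region"] == region:
            yield {f: rec[f] for f in fields}  # ship only requested fields

def compute_layer(rows):
    """The 'brains': analytics over the clean slice it received."""
    rows = list(rows)
    return sum(r["watts"] for r in rows) / len(rows)

avg = compute_layer(storage_layer("eu", ["id", "watts"]))
print(avg)  # 404.0 – mean wattage of the two EU records
```

Because the two layers only meet at that narrow interface, each can be scaled – or powered down – independently of the other.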

In some ways, this is similar to edge computing. While edge doesn't separate compute and storage, it does a lot of data preparation at the source where it's collected and stored – sorting, indexing, cleaning and packaging it neatly before sending it off to the data center for action.
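As a rough illustration (a hypothetical example, not any particular edge platform), an edge node might drop bad sensor readings where they're collected and send upstream only a compact summary:

```python
# Hypothetical edge-node sketch: clean and aggregate readings at the
# source, so the data center receives one small record instead of
# every raw sample.

def summarize(raw_samples):
    """Drop invalid readings, then package a tiny summary for upstream."""
    clean = [s for s in raw_samples if s is not None and 0 <= s <= 100]
    return {
        "count": len(clean),
        "min": min(clean),
        "max": max(clean),
        "mean": round(sum(clean) / len(clean), 2),
    }

# Six raw temperature readings (two sensor glitches) shrink to one record.
samples = [20.0, 21.5, None, 19.5, 150.0, 22.0]
print(summarize(samples))  # {'count': 4, 'min': 19.5, 'max': 22.0, 'mean': 20.75}
```

Only the four-field summary crosses the network, which is the point: less data shipped means less central compute, storage and cooling.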

"Embedding compact, high-efficiency compute modules into existing environments like factories or offices doesn't only help reduce latency but effectively turns everyday spaces into mini data centers, alleviating pressure on central facilities," says Chadha.

Reducing the demand on data centers might also remind you of a longstanding principle to reduce the amount of data we keep and store. "What's being overlooked in the discussion is the fact that companies are storing data they don't understand or don't need," says Maggie Laird, president of data management software provider Pentaho.

She says enterprises are storing exabytes of stuff spread across clouds, apps, endpoints and shadow IT, much of it ungoverned and unused and leading to ROT (redundant, obsolete, or trivial) data. "Companies are spending millions storing data they can't use, undermining the very AI initiatives meant to unlock value, while bloated storage needs are driving up energy costs and straining water access where data centers are built."

As Masood puts it: "Not all data deserves a 24/7 heartbeat. Archive deep, run hot where it matters."

Drew Turney
Freelance journalist

Drew Turney is a freelance journalist who has been working in the industry for more than 25 years. He has written on a range of topics including technology, film, science, and publishing.

At ITPro, Drew has written on the topics of smart manufacturing, cyber security certifications, computing degrees, data analytics, and mixed reality technologies. 

Since 1995, Drew has written for publications including MacWorld, PCMag, io9, Variety, Empire, GQ, and the Daily Telegraph. In all, he has contributed to more than 150 titles. He is an experienced interviewer, features writer, and media reviewer with a strong background in scientific knowledge.