A 24-megawatt data center is the pilot project for a new approach to AI infrastructure. Credit: Shanghai Hailanyun Technology.
The engines of modern artificial intelligence are running hot — and they run on data centers. These sprawling, power-hungry factories of the digital age are devouring staggering amounts of electricity and water. The electricity keeps them alive; the water keeps them cool.
Now, AI’s appetite is growing so fast that data centers are starting to compete with local communities for these resources, especially water. China, however, is betting on a bold solution: moving its data centers to the wettest place on Earth — the ocean.
China already has two large underwater data centers. The first, off the coast of Hainan, launched in 2022 and is now in full commercial use. A second, $226 million project off Shanghai recently went online and is powered partly by offshore wind. Both rely on the ocean’s naturally cold water for cooling.
The Cloud Under the Ocean
Data centers are dedicated buildings packed with thousands of powerful computers called servers. Servers have become critical infrastructure, storing, processing, and distributing data for everything from email and streaming to training AI. They were already booming because of the explosive growth of data, and AI sent things into overdrive. The massive computational demands of AI have created a voracious, nonstop appetite for processing power, forcing companies to build these data factories at an unprecedented rate.
These powerful computers generate enormous amounts of waste heat, and getting rid of it is expensive: cooling alone accounts for a whopping 40% of a data center’s total electricity consumption, which is already very high.
Data centers consume between 2% and 3% of the world’s electricity, according to the International Energy Agency. AI is expected to increase this consumption by 165% by 2030. Simply put, they already use a lot of energy, and they will soon use far more.
Much of that cooling energy is spent chilling water, which is then evaporated or sprayed into the air. The idea behind the underwater center is to use the ocean’s frigid, stable water as a massive, free “heatsink.”
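To put those percentages together, here is a back-of-envelope sketch in Python. The article supplies the 2–3% share and the 40% cooling figure; the global electricity total and the fraction of cooling energy that seawater could eliminate are illustrative assumptions, not reported numbers.

```python
# Back-of-envelope sketch of the cooling-energy argument. The data-center
# share and cooling share come from the article; everything else is an
# assumption for illustration, not a reported figure.

WORLD_ELECTRICITY_TWH = 30_000   # assumed global annual demand, order of magnitude
DC_SHARE = 0.025                 # data centers: ~2-3% of world electricity (article)
COOLING_SHARE = 0.40             # cooling: ~40% of a data center's draw (article)
OCEAN_COOLING_CUT = 0.90         # assumed fraction of cooling energy a seawater heatsink removes

dc_twh = WORLD_ELECTRICITY_TWH * DC_SHARE
cooling_twh = dc_twh * COOLING_SHARE
saved_twh = cooling_twh * OCEAN_COOLING_CUT

print(f"Data centers:            ~{dc_twh:,.0f} TWh/yr")
print(f"Spent on cooling:        ~{cooling_twh:,.0f} TWh/yr")
print(f"Potential ocean savings: ~{saved_twh:,.0f} TWh/yr")
```

Under these assumptions, the savings land in the hundreds of terawatt-hours per year, roughly the annual electricity use of a mid-sized country.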
“We put the entire data cabin in the deep sea because seawater can help cool down the temperature,” Pu Ding, project manager at Shenzhen HiCloud Data Centre Technology, told the Chinese media outlet Financial News. “Compared to land-based data centres, data centres under the sea can reduce energy consumption needed for cooling, helping to lower operational costs.”
A depiction of the submerged data center. Image credits: Xinhua.
The engineering is, in principle, fairly straightforward. The system pumps cold seawater through a radiator on the back of the server racks, absorbing the heat and carrying it away. For power, the system can be connected to a nearby offshore wind farm, as is the case with the Shanghai facility. The result, on paper, is a standalone data center with a near-zero footprint that requires no freshwater and no grid connection. Of course, putting it all together is easier said than done, and there’s another important problem.
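The cooling loop can be sized with the standard heat balance Q = ṁ · c_p · ΔT, where ṁ is the coolant mass flow, c_p its specific heat, and ΔT how much the water warms on the way through. Here is a minimal sketch for the 24-megawatt Shanghai pod; the 24 MW figure is from the article, while the allowed temperature rise is an assumption for illustration.

```python
# Rough sizing of the seawater loop for a 24 MW heat load, using the
# heat balance Q = m_dot * c_p * dT. The allowed temperature rise is an
# assumed value for illustration.

Q_WATTS = 24e6          # heat load: the Shanghai facility's 24 MW rating
CP_SEAWATER = 4000.0    # specific heat of seawater, ~4.0 kJ/(kg*K), in J/(kg*K)
RHO_SEAWATER = 1025.0   # density of seawater, kg/m^3
DELTA_T = 8.0           # assumed allowed coolant temperature rise, K

mass_flow = Q_WATTS / (CP_SEAWATER * DELTA_T)   # kg/s of seawater required
vol_flow = mass_flow / RHO_SEAWATER             # m^3/s

print(f"Seawater mass flow: {mass_flow:,.0f} kg/s")
print(f"Volumetric flow:    {vol_flow:.2f} m^3/s")
```

That works out to roughly 750 kilograms of seawater per second, about three-quarters of a cubic meter: a large but entirely feasible pumping job, and the ocean replenishes the cold supply for free.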
What Happens If Something Breaks?
That’s the billion-dollar question. Obviously, you can’t just send in a guy with a screwdriver to fix things. In fact, you can’t repair on-site at all. The entire system is built on two core principles: extreme reliability and modularity.
The ideal scenario, of course, is that nothing breaks. That’s not as crazy as it sounds. In big, well-designed data centers, most server failures are caused by humans, humidity, or dust. These capsules are sealed and filled with inert nitrogen rather than reactive oxygen, which stops corrosion in its tracks. A 2020 Microsoft study found that submerged data centers are eight times more reliable than their land-based equivalents.
But if something does go wrong, China is opting for a “swap, don’t fix” approach.
The data center isn’t one giant structure; it’s a series of massive, 1,400-ton “cabin-pods” chained together on the seabed. Each cabin contains 24 server racks capable of hosting up to 500 servers. So, if something does break, the affected cabin can be hauled back to shore, the faulty module swapped for a working one, and the whole pod sunk again. The idea is that the massive savings on energy will outweigh the risk and the logistical cost of these swaps.
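How often would those swaps actually happen? Here is a rough sketch combining the article’s numbers (500 servers per cabin, the planned 100-cabin network, and Microsoft’s eightfold reliability finding) with two assumptions: a land-side annual server failure rate and a threshold for when a cabin is worth hauling up.

```python
# Illustrative "swap, don't fix" arithmetic. Cabin capacity, network size,
# and the 8x reliability factor come from the article; the land-side
# failure rate and haul-up threshold are assumptions for illustration.

SERVERS_PER_CABIN = 500        # article: up to 500 servers per cabin
NETWORK_CABINS = 100           # article: Hainan's planned 100-cabin network
LAND_FAIL_RATE = 0.02          # assumed: ~2% of land servers fail per year
RELIABILITY_FACTOR = 8         # Microsoft study: 8x fewer failures underwater
DEAD_FRACTION_LIMIT = 0.05     # assumed: haul a cabin up once 5% of its servers die

sea_fail_rate = LAND_FAIL_RATE / RELIABILITY_FACTOR
failures_per_cabin_yr = SERVERS_PER_CABIN * sea_fail_rate
years_to_haul = DEAD_FRACTION_LIMIT / sea_fail_rate

print(f"Expected failures per cabin: {failures_per_cabin_yr:.2f} per year")
print(f"Across {NETWORK_CABINS} cabins: {failures_per_cabin_yr * NETWORK_CABINS:.0f} per year")
print(f"Years before a cabin hits the haul-up threshold: ~{years_to_haul:.0f}")
```

Under these assumptions, a cabin loses only one or two servers a year and wouldn’t need to surface for decades; dead hardware simply stays dark until a pod is worth retrieving.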
China Racing (or Gambling?) Ahead
This idea isn’t new. Microsoft pioneered the technology with Project Natick, launched in 2014, and submerged an experimental data center off Scotland’s coast in 2018. However, as of 2024, Microsoft said it is no longer pursuing the technology. The company never really explained why it stopped, but the decision was likely driven by the high logistical cost of deploying and repairing data centers at sea at scale.
China thinks it has that covered. Or rather, it’s willing to take the gamble. The Hainan project aims for a network of 100 cabins, and the $226 million Shanghai facility, launched earlier this year, is a 24-megawatt prototype for planned 500-megawatt versions.
China is making a massive wager. It is betting that the complex engineering challenge of servicing these deep-sea pods is a far easier problem to solve than the global crisis of energy and water scarcity that AI is creating. While the West, having run the numbers, stepped back, China is diving in. The rest of the world is now watching to see if this gamble will cool a dangerously hot planet or become a very expensive, high-tech folly at the bottom of the sea.