Ever think about where your emails actually live? Most people imagine a giant, beige warehouse in the middle of a desert, humming with thousands of fans trying desperately to keep silicon from melting. But for a few years, a small chunk of the internet lived at the bottom of the ocean. Specifically, it lived off the coast of Scotland’s Orkney Islands.
Microsoft called it Project Natick.
It sounds like a Bond villain’s lair. Or maybe a very expensive way to drown a computer. But the reality of Microsoft's undersea datacenter experiment is actually way more practical, and honestly a bit weirder, than you’d expect. This wasn't just some PR stunt to look "green." It was a genuine attempt to solve the two biggest headaches in cloud computing: keeping servers cool and getting data to people faster.
The Problem With Land-Based Servers
Land is a pain. If you want to build a massive datacenter on solid ground, you have to deal with real estate prices, local zoning laws, and the nightmare of cooling. Air is a terrible conductor of heat. We spend millions of dollars and billions of gallons of water just trying to keep servers from throttling.
Then there's the "edge" problem. About half the world’s population lives within 200 kilometers of the ocean. If your datacenter is stuck in a rural cornfield in Iowa, but your users are in New York or Tokyo, those bits of data have a long way to travel. Latency kills.
Microsoft’s Ben Cutler, who led Project Natick, basically looked at the map and realized the ocean is a giant, built-in heat sink sitting right next to where everyone lives. Why not use it?
864 Servers in a Giant Pressure Cooker
In 2018, they sank a 40-foot-long container into the sea off Scotland’s Northern Isles. It was packed with 12 racks holding 864 servers in total. This wasn't just a waterproof box. It was a sophisticated, nitrogen-filled environment.
Why Nitrogen?
Oxygen is the enemy of electronics. On land, the air we breathe causes corrosion. Dust gets into everything. People walk around, bump into racks, and accidentally unplug things. By sealing the undersea datacenter in a steel hull and replacing the air inside with dry nitrogen, Microsoft created an incredibly stable environment. No oxygen meant no corrosion. No people meant no "fat-finger" errors.
The results were kind of shocking. When they hauled the thing back up in 2020, they found that the servers inside had a failure rate of only one-eighth of what they see on land. Imagine that. Your computer lasts eight times longer just because you threw it in the ocean and stopped touching it.
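To make that ratio concrete, here's a rough, hypothetical sketch. The land-side annualized failure rate below is an assumption for illustration only; the reported result is just the one-eighth ratio.

```python
# Rough illustration of what "one-eighth the failure rate" means for a pod of
# 864 servers. The land-side annualized failure rate (AFR) is an illustrative
# assumption, not a figure from Microsoft; only the 1/8 ratio comes from Natick.
# Expected failures ~= fleet size * AFR * years (a crude linear approximation).

SERVERS = 864
LAND_AFR = 0.05            # assume ~5% of land servers fail per year
SEA_AFR = LAND_AFR / 8     # the reported one-eighth ratio
YEARS = 2                  # roughly the length of the 2018-2020 deployment

expected_land_failures = SERVERS * LAND_AFR * YEARS
expected_sea_failures = SERVERS * SEA_AFR * YEARS

print(f"Expected failures on land over {YEARS} years: ~{expected_land_failures:.0f}")
print(f"Expected failures under the sea:              ~{expected_sea_failures:.0f}")
```

The exact inputs don't matter much; the point is that at one-eighth the failure rate, a sealed pod of 864 machines loses only a handful of servers over a multi-year deployment.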
It's All About the Heat Exchange
Water is much better at moving heat than air. Project Natick used a heat exchange system similar to what you’d find on a submarine. Cold seawater was piped through the back of the server racks and then pumped back out.
Because the ocean is consistently cold at those depths, the cooling was essentially "free." This makes the Power Usage Effectiveness (PUE) of an underwater pod significantly better than traditional builds. Most land-based centers struggle to hit a PUE of 1.1 or 1.2. The undersea pod ran much closer to the theoretical floor of 1.0, the kind of number that makes engineers drool.
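For context, PUE is simply total facility power divided by the power the IT gear itself draws, so 1.0 is the unreachable ideal. Here's a minimal sketch with made-up numbers (none of these are Natick's or any provider's published figures) showing why slashing cooling overhead moves the needle so much:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to the servers themselves.
# All figures below are hypothetical examples for comparison.

def pue(it_kw: float, overhead_kw: float) -> float:
    """Total facility power divided by the power the IT gear actually uses."""
    return (it_kw + overhead_kw) / it_kw

# A typical land datacenter: chillers, fans, and pumps add a lot of overhead.
print(f"Land, heavy air cooling: PUE = {pue(it_kw=1000, overhead_kw=400):.2f}")
# A well-run hyperscale site.
print(f"Land, best-in-class:     PUE = {pue(it_kw=1000, overhead_kw=120):.2f}")
# A sealed pod where seawater heat exchange needs little more than pumps.
print(f"Sealed undersea pod:     PUE = {pue(it_kw=1000, overhead_kw=80):.2f}")
```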
The Orkney Advantage
Microsoft didn't just pick Scotland because it's pretty. The Orkney Islands have a power grid that is 100% supplied by renewable energy—mostly wind and tidal.
One of the coolest (literally) things about this experiment was that the datacenter was powered, in part, by the very water it sat in. Its shore cable tied it into a grid fed by wind turbines along with the tidal and wave energy converters being tested in those same waters. This proved that you could deploy "lights out" datacenters in remote coastal areas without needing to drag a massive coal-powered grid along with you. It was a closed-loop system of sorts.
What Most People Get Wrong About Underwater Clouds
Some folks worried that these pods would boil the ocean. They pictured a ring of dead fish surrounding a glowing orange tube.
In reality, the thermal impact was negligible. The water coming out of the heat exchanger was only a few degrees warmer than the intake, and it dissipated almost instantly into the vastness of the Atlantic. When the team recovered the pod, it was covered in sea anemones and small fish. It had basically become an artificial reef.
Another misconception? That we’re all going to be "cloud diving" soon.
Building these things is expensive. While the failure rate is low, the "repair cost" is astronomical. You can't just send a technician down with a screwdriver if a hard drive fails. You have to wait for the entire pod’s lifecycle to end—usually about five years—and then replace the whole thing. It requires a complete shift in how we think about hardware maintenance. You don't fix it; you just over-provision and let it run until it dies.
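Here's a rough sketch of what that "over-provision and let it die" math looks like. The failure rate is an illustrative assumption carried over from the earlier sketch, not a published Natick figure.

```python
# Sketch of the "no repairs" capacity math: if you can't swap failed servers
# for the pod's whole five-year life, you deploy extra capacity up front and
# let the population shrink. The annualized failure rate (AFR) is hypothetical.

import math

REQUIRED_SERVERS = 800      # capacity you still need at end of life
SEA_AFR = 0.00625           # e.g. one-eighth of an assumed 5% land AFR
LIFETIME_YEARS = 5

# Probability a single server survives the whole deployment.
survival = (1 - SEA_AFR) ** LIFETIME_YEARS

# Deploy enough servers that the expected survivors still meet the requirement.
deploy = math.ceil(REQUIRED_SERVERS / survival)

print(f"Per-server survival over {LIFETIME_YEARS} years: {survival:.3f}")
print(f"Servers to deploy up front: {deploy}")
```

With undersea-level reliability the buffer is tiny; plug in a typical land-style failure rate and the same formula demands a much larger cushion of spare machines.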
The Long-Term Play for Microsoft
Project Natick is currently in a "research concluded" phase. You won't see a fleet of thousands of pods being dropped this year. But the lessons learned are already being baked into Azure’s land-based architecture.
Microsoft is now using the nitrogen-sealing techniques and specialized cooling liquids (immersion cooling) in their traditional datacenters. They realized that if the "sealed environment" worked under the sea, it could work in a warehouse in Virginia too.
Actionable Insights for the Tech-Curious
If you’re watching where Microsoft's undersea datacenter work goes next, here is what you should actually be paying attention to:
- Look at Immersion Cooling: The future of your home PC or office server might not be fans. It’ll be "two-phase immersion cooling," where chips are dunked in non-conductive liquid. Microsoft is already testing this at scale.
- Latency Over Capacity: We are moving toward a "distributed" internet. Instead of five massive datacenters globally, we’re looking at thousands of small "edge" nodes. The undersea experiment proved that these nodes can exist in places we previously thought impossible.
- Sustainability Metrics: If you're a business owner, look for "PUE" (Power Usage Effectiveness) ratings from your cloud provider. Lower is better. Natick proved we can get close to the theoretical limit of 1.0.
- Hardware Longevity: The fact that an oxygen-free, hands-off environment cut failures to one-eighth of land rates suggests that our current "open-air" server rooms are surprisingly hostile to hardware. Expect to see more "sealed" hardware in the enterprise space.
The ocean might not be full of glowing Microsoft pods just yet, but the way we build the internet changed the second that container hit the water in Scotland. It proved that the cloud doesn't have to be a thirsty, land-hungry beast. Sometimes, the best way to move forward is to take a dive.
Key Takeaways for Decision Makers
- Edge Computing is Viable: If you can run a server 117 feet underwater, you can run it anywhere. This opens up massive opportunities for localized data processing in coastal cities.
- Reliability through Isolation: Reducing human touchpoints and environmental contaminants (oxygen, dust) is the single most effective way to extend hardware life.
- Renewable Integration: Small-scale datacenters can be paired directly with local renewable sources like offshore wind, bypassing the inefficiencies of the national grid.
Next time your Netflix stream loads instantly, just remember: there’s a decent chance the tech making that happen was perfected while being nibbled on by crabs in the North Sea.