The Future of Data Centers Might Be Underwater

Microsoft’s recent experiment shows that this is not just feasible, but also practical

Earlier this summer, Microsoft made an unusual call to marine specialists, asking for their help retrieving a shipping-container-sized capsule from the seafloor off Scotland’s Orkney Islands. After an intricate performance of ropes and winches attached to a gantry crane, out came a data center covered in algae, barnacles, and sea anemones. The Microsoft team was “pretty impressed with how clean it was.” While most electronics go kaput at the slightest water spillage, this one thrived in it.

Marine specialists retrieve the data center from the seafloor off the Orkney Islands. Source: Microsoft

This bizarre endeavor is the brainchild of Microsoft’s Project Natick team, which is studying the feasibility and practicality of underwater data centers. The team sank the recently recovered data center in spring 2018 for a two-year-long experiment. And this was not the first time they had done so.

What is Project Natick?

In most companies, outlandish ideas proposed by employees are rarely given heed, but Microsoft is not most companies. During ThinkWeek, an event that encourages employees to share out-of-the-box ideas, Sean James and some of his colleagues circulated an internal white paper outlining the concept of underwater data centers. James had previously served aboard submarines in the U.S. Navy and knew a thing or two about underwater structures. His idea stemmed from this experience, and he hoped it might help tech companies handle the explosive growth in demand for data centers in an environmentally sustainable way.

Microsoft jumped on board, and a small team was formed in August 2014 to pursue the concept. Within a year, the team developed a Phase 1 prototype that it submerged in the Pacific Ocean off California’s coast. The capsule was fitted with hundreds of sensors measuring humidity, pressure, temperature, motion, and other variables, helping the team better understand how a data center behaves in this novel environment.

Project Natick Phase 1 being deployed. Source: Microsoft

The data center not only held up but proved more successful than expected. The team extended the trial and used the capsule to process real-world commercial data from Microsoft’s Azure cloud service. This proof of concept encouraged the team to pursue the idea further with a data center four times as large. The Phase 2 trial, Northern Isles, which concluded earlier this summer, consisted of 12 racks containing 864 servers and 27.6 petabytes of storage. That is as powerful as thousands of high-end consumer PCs, with enough storage for about 5 million movies.
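A quick sanity check shows where the “5 million movies” figure comes from: 27.6 petabytes spread over 5 million films works out to roughly 5.5 GB per movie, a plausible size for an HD film. A minimal sketch of the arithmetic (the per-movie size is implied, not a figure Microsoft published):

```python
# Sanity-check the Northern Isles storage figure.
# Assumption: decimal units, 1 PB = 1,000,000 GB.
PETABYTE_GB = 1_000_000

total_storage_gb = 27.6 * PETABYTE_GB  # 27.6 PB across 12 racks
movies = 5_000_000                     # claimed movie capacity

gb_per_movie = total_storage_gb / movies
print(gb_per_movie)  # -> 5.52 GB per movie
```

At about 5.5 GB per title, the claim is consistent with typical HD movie file sizes.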

Microsoft reported earlier in September that the experiment was a resounding success and “the concept of underwater data centers is feasible, as well as logistically, environmentally and economically practical.”

The Northern Isles data center (Project Natick Phase 2) going through system checks before its deployment. Source: Microsoft

Why underwater?

These experiments are intriguing, but what benefits does going underwater provide that staying on land doesn’t? It turns out the list is pretty long.

Proximity to users

Nearly half the human population lives within 200 km of the ocean, yet data centers are often located in nondescript warehouses far from this population, where land and electricity are cheap. By placing data centers in bodies of water near major population centers, companies can deliver content more efficiently and quickly than ever before.

If a data center is located 200 km from the user, the expected round-trip latency (the time data takes to travel from source to destination and back) is about 2 milliseconds. Currently, the median latency in the US is between 12 and 37 ms. With the fast-growing need for low-latency internet to enable cutting-edge AR/VR games and self-driving cars, data centers closer to users are a promising solution. Closer data centers also better handle the increasing demand for edge computing created by the rapidly growing Internet of Things ecosystem.
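The 2 ms figure falls straight out of the physics of optical fiber, where light travels at roughly two-thirds the speed it does in a vacuum, or about 200 km per millisecond. A minimal back-of-the-envelope sketch (propagation delay only; real-world routing and processing add more):

```python
# Propagation-only round-trip latency over optical fiber.
# Assumption: light in fiber travels at ~2/3 c (refractive index ~1.5),
# i.e. about 200 km per millisecond; switching delays are ignored.
C_FIBER_KM_PER_MS = 200.0

def round_trip_latency_ms(distance_km: float) -> float:
    """Round-trip time for a given one-way distance, in milliseconds."""
    return 2 * distance_km / C_FIBER_KM_PER_MS

print(round_trip_latency_ms(200))  # 200 km away -> 2.0 ms round trip
```

Real connections are slower because data rarely takes a straight-line path, which is why measured median latencies sit well above this floor.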

Efficient cooling

Underwater data centers also provide an inherent cooling benefit. If your laptop warms your lap when you edit a video or open a few dozen browser tabs, imagine how much more heat a data center produces. To keep the servers functioning efficiently, this heat is removed by giant air-conditioning systems that consume about as much power as the servers themselves, racking up thousands of dollars in electricity bills every year.
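Data center efficiency is commonly expressed as Power Usage Effectiveness (PUE): total facility power divided by the power used by the IT equipment itself. If cooling draws as much power as the servers, PUE is at least 2. A minimal sketch, using purely illustrative kilowatt figures rather than Project Natick’s actual numbers:

```python
# PUE = (IT power + overhead such as cooling) / IT power.
# The kW values below are hypothetical, for illustration only.
def pue(it_power_kw: float, overhead_kw: float) -> float:
    """Power Usage Effectiveness of a facility."""
    return (it_power_kw + overhead_kw) / it_power_kw

print(pue(500.0, 500.0))  # cooling equals the IT load -> 2.0
print(pue(500.0, 35.0))   # near-free seawater cooling -> 1.07
```

The closer PUE gets to 1.0, the less power is wasted on anything other than computation, which is the appeal of essentially free seawater cooling.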

Many massive data centers use water cooling rather than air cooling because water has a specific heat capacity about four times that of air. Companies try to locate their data centers in regions with a cool climate and a nearby water source. Putting data centers underwater is water cooling in the most literal sense of the term. When placed deep enough (a few hundred meters would suffice), the surrounding water holds at around 15 degrees Celsius (about 59 degrees Fahrenheit) and stays relatively stable all year long. The data center then uses the same heat-exchange process used to cool submarines: seawater is pumped through radiators on the back of the server racks and back out into the ocean. This simple process not only keeps the servers cool; it costs next to nothing.
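The "four times" figure matters because coolant flow scales inversely with specific heat: the heat Q carried away equals mass flow × specific heat × temperature rise. A minimal sketch of that sizing calculation (the 240 kW heat load and 5 K allowed temperature rise are illustrative assumptions, not Project Natick's numbers):

```python
# How much coolant must flow to carry away a given heat load?
# Q = m * c_p * dT  ->  m = Q / (c_p * dT)
C_WATER = 4186.0  # specific heat of water, J/(kg*K)
C_AIR = 1005.0    # specific heat of air,   J/(kg*K)

def coolant_mass_flow(heat_w: float, c_p: float, delta_t_k: float) -> float:
    """Mass flow in kg/s needed to absorb heat_w watts with a delta_t_k rise."""
    return heat_w / (c_p * delta_t_k)

heat_load_w = 240_000.0  # hypothetical heat output of the racks
print(coolant_mass_flow(heat_load_w, C_WATER, 5.0))  # ~11.5 kg/s of seawater
print(coolant_mass_flow(heat_load_w, C_AIR, 5.0))    # ~47.8 kg/s of air
```

Roughly four times less water than air by mass moves the same heat, which is why pumping seawater through rack-back radiators is such a cheap cooling strategy.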

Sinking the Northern Isles data center. Source: Microsoft

Sustainability

Underwater data centers become even more attractive when it comes to sustainability. Their power consumption is already much lower than that of land-based centers, since they do not require expensive cooling systems. Additionally, they can operate “lights-out” because no people are inside.

On top of this, Microsoft envisions a model where the data centers are fully powered by green energy. In the Phase 2 deployment, the data center was connected to the Orkney power grid in Scotland, which produces surplus renewable energy from a combination of offshore wave and tidal energy and onshore wind and solar energy. In the future, the team hopes to power the data centers with green energy from generators attached to the vessel, or from nearby offshore generators with no connection to the grid, making them fully self-sufficient.

The sustainability factor becomes even more important when considering deployment near developing countries where electricity production is not reliable or sufficient and where there is water scarcity. These data centers will not put pressure on the limited resources available there.

The data centers themselves are made to be fully recyclable, including the vessel, heat exchangers, servers, and all other components. While the servers have an anticipated lifespan of five years before they require upgrades, the vessel has a target lifespan of 20 years before it’s recycled.

Rapid provisioning

Apart from latency and energy benefits, these data centers also allow for rapid provisioning according to market demand. Since they are standard shipping-container-sized modules, it’s relatively easy to mass-produce them, fit them with servers, and ship them wherever demand is surging.

The data center deployed off the Orkney Islands was manufactured in France by Naval Group, a well-known marine equipment manufacturer, and shipped on a flatbed truck to Scotland, where it was attached to its base and deployed on the seabed. All of this was done with standard equipment, reassuring us that the infrastructure and logistics for large-scale production and transportation already exist. The Project Natick team estimates that a new underwater data center can be deployed at scale, from start to finish, within 90 days.

Meanwhile, a land-based brick-and-mortar data center of the same capacity might take years to build. It is also far more expensive to construct buildings, employ maintenance staff, carry out landscaping, and provide round-the-clock security for the premises. Companies also have to guess at future demand when acquiring land and constructing buildings, whereas underwater data centers can be deployed as and when demand arises.

More reliable

One surprising finding from the Phase 2 trial off the Orkney Islands was that the underwater servers had one-eighth the failure rate of a land-based control group. This suggests that these systems are not only more sustainable and better performing but also more reliable. The Natick team believes the nitrogen atmosphere inside the vessel, which is less corrosive than oxygen, the absence of people jostling the equipment, and the relatively stable temperature are the primary reasons for the improved reliability. The team hopes to transfer these findings to Microsoft’s land-based data centers by building nitrogen-filled server chambers and reducing staff numbers in those centers.

The Caveats

One of the biggest questions is what to do when a component breaks down. Unless you see scuba divers listed on Microsoft’s careers page, no one is going underwater to repair a server or replace a faulty part. These servers need to be highly fault-tolerant and designed to operate without human intervention for at least five years. Microsoft says this is how it designs its land-based systems as well, so there is less cause for concern than one might think. If a component fails, it is simply taken offline, and the rest of the devices continue to function normally.

There is also the pertinent issue of biofouling. Since these data centers are likely to sit in the photic zone, marine organisms like barnacles and algae will attach to them and hinder the heat-exchange process. Microsoft hopes to sidestep this issue with antifouling materials and coatings, but it admits this remains an active research area.

Some are skeptical of the cost benefits these data centers promise. Mark Monroe, president at Energetic Consulting, argues that land-based modular data centers can be built just as quickly, and that building watertight vessels and placing them underwater costs much more per square foot than the land-based alternatives.

There is also an environmental concern about how the heat these data centers produce might affect the aquatic life around them. Monroe argues that while one vessel might not do much harm, many such sunken vessels could disrupt the oceanic ecosystem. But Microsoft maintains that this model will not affect the aquatic ecosystem, and that the water around the vessel will be a “few thousandths of a degree warmer at most.” On a lighter note, one enthusiastic swimming pool builder reached out to the Project Natick team asking whether underwater data centers could help heat the swimming pools he installs. As outlandish as that business model might sound, IBM has a data center outside Zurich that does exactly that!

How important is this for the future?

Microsoft’s next step is to figure out how to submerge multiple underwater data centers and network them to perform the same scale of work as Microsoft’s land-based centers. If they work this out, it has important implications for the future of data centers, the cloud, and the internet. I’ll leave you with an engaging thought from Ben Cutler, director of Project Natick: “Most of the data centers that we’ll ever build we haven’t built yet.”

Written by

freelance technology writer | sarveshmathi.com
