Microsoft Takes a Datacenter Underwater

The company claims that underwater datacenters offer inherent advantages such as natural cooling, low latency and, in the future, even the use of hydrokinetic energy.

Earlier this week, Microsoft took the tech world by surprise by announcing that it was working on underwater datacenters and had already operated one for four months last year, from August to November 2015.

 

Project Natick, as the initiative is called, involved Microsoft operating an experimental prototype vessel on the seafloor, approximately one kilometer off the Pacific coast of the United States. The vessel was christened the Leona Philpot, after a popular Xbox game character.

 

Microsoft describes the initiative as part of its ongoing quest for “cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable.”

 

“Project Natick is focused on a cloud future that can help better serve customers in areas which are near large bodies of water (where nearly 50% of society resides),” says the company.

 

Microsoft claims that rapid provisioning (the ability to deploy a datacenter from start to finish in 90 days) and rapid response to market demand, natural disasters and special events are major advantages of underwater datacenters.

 

Low latency is another advantage the company says underwater datacenters could provide. “Half of the world’s population lives within 200 km of the ocean; so placing datacenters offshore increases the proximity of the datacenter to the population, dramatically reducing latency and providing better responsiveness,” it explains.
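To see the scale of that proximity claim, consider propagation delay alone. The sketch below estimates round-trip time over optical fiber; the distances and the two-thirds-of-c fiber speed are illustrative assumptions, not figures from Microsoft.

```python
# Back-of-the-envelope round-trip propagation delay over optical fiber.
# The distances are illustrative assumptions, not Project Natick figures.

SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3            # signals travel at roughly 2/3 c in fiber

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation time in milliseconds over fiber."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# A coastal user reaching an offshore datacenter ~200 km away versus an
# inland facility ~2,000 km away:
print(f"offshore: {rtt_ms(200):.1f} ms")    # about 2 ms
print(f"inland:   {rtt_ms(2000):.1f} ms")   # about 20 ms
```

Real-world latency adds routing, queuing and processing delays on top of propagation, but the proportional gap is what the proximity argument rests on.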

 

Since people cannot be stationed under water, the Microsoft team moved to the idea of a “lights out” operation: a very simple, compact and completely self-sustaining enclosure to house the datacenter. The team chose a round container, as it is the best shape for resisting pressure, says an article on the Microsoft News site.
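As a back-of-the-envelope check on the shape argument, a cylindrical hull carries external water pressure as uniform compressive membrane (hoop) stress rather than bending flat panels. The depth, radius and wall thickness below are hypothetical, not Natick specifications.

```python
# Rough look at why a round hull suits external water pressure.
# All numbers are illustrative assumptions, not Project Natick specs.

RHO_SEAWATER = 1025.0   # kg/m^3
G = 9.81                # m/s^2

def hydrostatic_pressure_pa(depth_m: float) -> float:
    """Gauge water pressure at a given depth."""
    return RHO_SEAWATER * G * depth_m

def hoop_stress_pa(pressure_pa: float, radius_m: float, wall_m: float) -> float:
    """Thin-wall hoop stress magnitude: sigma = p * r / t."""
    return pressure_pa * radius_m / wall_m

p = hydrostatic_pressure_pa(10.0)                      # hypothetical 10 m depth
sigma = hoop_stress_pa(p, radius_m=1.0, wall_m=0.02)   # hypothetical hull
print(f"pressure: {p / 1e3:.0f} kPa, hoop stress: {sigma / 1e6:.1f} MPa")
```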

 

“This initial test vessel wouldn’t be too far off-shore, so they could hook into an existing electrical grid, but being in the water raised an entirely new possibility: using the hydrokinetic energy from waves or tides for computing power. This could make datacenters work independently of existing energy sources, located closer to coastal cities, powered by renewable ocean energy,” the article speculates.

 

Cooling is another major aspect of datacenter design and a significant cost today. “The cold environment of the deep seas automatically makes datacenters less costly and more energy efficient,” claims the article.
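One way to quantify that claim is power usage effectiveness (PUE), the ratio of total facility power to IT power. The PUE values and load below are assumed for the sake of example; they are not Project Natick measurements.

```python
# Illustrative cooling-overhead comparison using PUE (total facility
# power divided by IT power). The PUE values and the IT load are
# assumptions for this example, not Project Natick measurements.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Power spent on cooling and other non-IT overhead."""
    return it_load_kw * (pue - 1)

IT_LOAD_KW = 500.0  # hypothetical IT load

for label, pue in [("typical air-cooled facility", 1.7),
                   ("seawater-cooled vessel", 1.1)]:
    print(f"{label}: PUE {pue} -> {overhead_kw(IT_LOAD_KW, pue):.0f} kW overhead")
```

Under these assumed numbers, free cooling from cold seawater would cut non-IT overhead from 350 kW to 50 kW for the same workload.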

 

Once the vessel was submerged last August, the researchers monitored the container from their offices in Building 99 on Microsoft’s Redmond campus. Using cameras and other sensors, they recorded data such as temperature, humidity, the amount of power the system was drawing and even the speed of the current.
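The article does not describe the monitoring software, but conceptually it amounts to a periodic telemetry loop along these lines. The sensor names, value ranges and polling interval are hypothetical stand-ins, not Microsoft’s actual tooling.

```python
import random
import time
from datetime import datetime, timezone

# Minimal sketch of a remote telemetry loop. Sensor names, ranges and
# the polling interval are illustrative assumptions; real readings
# would come from hardware drivers, not random numbers.
SENSOR_RANGES = {
    "temperature_c": (10.0, 20.0),    # internal air temperature
    "humidity_pct": (20.0, 40.0),
    "power_kw": (5.0, 30.0),
    "current_m_per_s": (0.0, 1.5),    # seawater current past the hull
}

def poll_once() -> dict:
    """Simulate one round of sensor readings (random stand-ins)."""
    reading = {"timestamp": datetime.now(timezone.utc).isoformat()}
    for name, (lo, hi) in SENSOR_RANGES.items():
        reading[name] = round(random.uniform(lo, hi), 2)
    return reading

if __name__ == "__main__":
    for _ in range(3):        # a real monitor would loop indefinitely
        print(poll_once())
        time.sleep(1)
```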

 

A diver would go down once a month to check on the vessel, but otherwise the team was able to stay constantly connected to it remotely, even after they observed a small tsunami wave pass.

 

“The team is still analyzing data from the experiment, but so far, the results are promising,” says the Microsoft News article.

 

The project website explains that a Natick datacenter deployment is intended to last up to five years, the anticipated lifespan of the computers it contains. After each five-year deployment cycle, the datacenter would be retrieved, reloaded with new computers and redeployed. The target lifespan of a Natick datacenter is at least 20 years, that is, at least four such deployment cycles, after which it is designed to be retrieved and recycled.

 

Right now, it is an experiment that shows promise, but there is some distance to cover before it can serve as a regular, commercial datacenter model.

 

Stay tuned. 
