Is offshore data the next major growth opportunity for maritime?

This month, global headlines were made off northern Scotland as Green Marine’s 55-metre gantry barge, GM700, pulled into the calm waters of Stromness Harbour in Orkney. Hanging in slings from its double A-frame crane was the final piece of a project five years in the making, the results of which could launch a new boom for the maritime industry.

The cargo was a complete subsea data centre, dubbed “Northern Isles”. Built for phase two of Project Natick, the airtight 12.2m x 2.8m pressure cylinder contained 864 standard Microsoft data centre servers: the “brains of the internet”, as such hardware is colloquially known. The unit’s payload is comparable in processing power to several thousand high-end PCs, with enough storage for nearly 5 million full-length Hollywood films.

Deployed in June 2018 at a depth of 117ft, half a mile offshore, the project saw Microsoft collaborate with French marine engineering firm Naval Group and the European Marine Energy Centre (EMEC). The “Northern Isles” was stationed at Billia Croo, EMEC’s test site for wave energy converters. With tidal currents regularly reaching 9mph and waves of up to 60ft in stormy conditions, it was an ideal proving ground for a subsea data centre. Powered by local green energy production, the cylinder would sit on the seafloor, untouched and untampered by human hands, cooled by the frigid waters of the North Sea.

Though this phase-two research programme ran for just two years, the uniquely designed unit could operate without human intervention for five or more years. Because the cylinder was filled with dry nitrogen, a semi-inert gas, there was little evidence of corrosion, a common problem for land-based servers. The cold ocean water provided ambient cooling and external heat dissipation for the heat exchangers, a job that typically accounts for 30-55% of a data centre’s power consumption. Temperature is critical to the operation of such hardware: as ambient temperature rises, so do power consumption and the rate of equipment failure.
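To see what that cooling share means in practice, it can be translated into Power Usage Effectiveness (PUE), the industry’s standard ratio of total facility power to IT power. The sketch below is a simplification that treats cooling as the only overhead (real facilities also lose power to lighting, power distribution and so on), and the 100kW server load is purely illustrative:

```python
# Rough sketch: converting a cooling share of total facility power
# into Power Usage Effectiveness (PUE = total power / IT power).
# Assumes cooling is the only non-IT overhead -- an illustrative
# simplification; the IT load figure is hypothetical.

IT_LOAD_KW = 100.0  # hypothetical server (IT) load

for cooling_share in (0.30, 0.55):  # the 30-55% range quoted above
    total_kw = IT_LOAD_KW / (1.0 - cooling_share)
    pue = total_kw / IT_LOAD_KW
    print(f"cooling = {cooling_share:.0%} of total -> PUE ~ {pue:.2f}")
```

By this measure, a facility at the top of that range draws more than twice the power its servers actually use, which is exactly the overhead that free seawater cooling attacks.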

Overall, the project was a resounding success. The unit performed exceptionally well, with only 8 of the 864 installed servers failing. Natick project manager Ben Cutler is quoted: “Our failure rate in the water is one-eighth of what we see on land”. While the cooling and the nitrogen atmosphere certainly played a significant role, removing human involvement from the equation also eliminates many of the variables that routine maintenance introduces.
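The quoted figures are easy to sanity-check. A quick back-of-the-envelope calculation, using only the numbers reported above:

```python
# Back-of-the-envelope check on the Natick failure figures quoted above.
servers_total = 864
servers_failed = 8

subsea_rate = servers_failed / servers_total   # fraction failed at sea
implied_land_rate = subsea_rate * 8            # "one-eighth of what we see on land"

print(f"subsea failure rate:     {subsea_rate:.2%}")        # ~0.93%
print(f"implied land-based rate: {implied_land_rate:.2%}")  # ~7.41%
```

In other words, fewer than one server in a hundred failed over the trial, against an implied land-based rate of roughly seven in a hundred.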

But why? Why would Microsoft pour money into such an outlandish research and development project? Well, the project’s slogan lays it out: “50% of us live near the coast. Why doesn’t our data?” As the world’s use of technology increases, so will the demand for data services, and at no small cost.

According to Statista, there are expected to be over 7.2 million data centres worldwide by 2021. Though that total is gradually declining from years prior, the number of hyperscale data centres is increasing. With a useful lifespan of around twenty years, the construction and maintenance of any such facility comes with an enormous price tag, as it must be built to maintain optimum operating conditions for the electronic equipment, to say nothing of the connectivity infrastructure linking it to users.

Regarding energy use, it’s staggering to think that data centres accounted for roughly 1% of the world’s electricity consumption in 2018. Fortunately, advances in processing technology have mitigated what would otherwise have been a dramatic surge in demand: data centres consumed about the same proportion of electricity back in 2010, even though the workloads they handle have multiplied many times over since. Because global electricity use itself has grown over the same period, holding a flat share still means a steady creep upward in absolute consumption. To put it in context, watching 30 minutes of Netflix could equate to the energy used for a leisurely car drive of between 1/8th of a mile and four miles, depending of course, on the device and vehicle.
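The “flat share of a growing pie” argument can be sketched with round numbers. The electricity totals below are illustrative assumptions chosen only to show the shape of the calculation, not sourced statistics:

```python
# Illustrative sketch of the "flat share" argument: if data centres hold
# roughly a 1% share while total electricity use grows, their absolute
# consumption still creeps upward. The yearly totals are rough round
# numbers assumed for illustration, not sourced figures.

share = 0.01                   # ~1% of global electricity (the article's figure)
electricity_2010_twh = 21_000  # assumed global electricity use, 2010
electricity_2018_twh = 23_000  # assumed global electricity use, 2018

dc_2010 = share * electricity_2010_twh
dc_2018 = share * electricity_2018_twh
print(f"data centre use: {dc_2010:.0f} TWh (2010) -> {dc_2018:.0f} TWh (2018)")
```

Under those assumptions, a constant 1% share still translates into tens of terawatt-hours of additional annual demand, which is why efficiency gains inside the facilities matter so much.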

So what are the implications of Microsoft’s Natick project? To start, we should understand that this is not the company’s first go-around at subsea data servers, nor is it the first attempt to place a data centre in an energy-strategic location. 

Microsoft’s plunge into the water began in 2014, when several of its data centre specialists produced a research paper seeking to rethink how things had traditionally been done. By the summer of 2015, Project Natick was well underway with phase one. The first subsea prototype, “Leona Philpot”, was submerged off the coast of California for a 105-day trial, during which the unit successfully ran online cloud computing services. Encouraged by the success of that first trial, Microsoft doubled down and began phase two in 2018, the results of which we’ve just now observed.

Seemingly bizarre locations for data centres are not a new concept. Prime real estate for the hardware is where it’s cool and energy is abundant and cheap. For example, Green Mountain Data Center of Norway is carved deep into a mountain, powered by a hydro dam and chilled by cold fjord waters. At the same time, other northern countries, such as Finland, Iceland, Canada and Sweden, offer similar frigid facilities in close proximity to natural energy sources such as geothermal.

Although Microsoft admits the impracticality of deploying large subsea hyperscale data centres, units such as the “Northern Isles” serve as prototypes for broader commercialization of the technology. Still in its research phase, the concept could someday be offered to customers on a rapid-provisioning basis: from initial order to deployment, phase two of Project Natick took a mere 90 days, a timeline marvel in the world of data centre construction. As offshore wind farms become more commonplace, so will readily available access to coastal energy. Perhaps the most significant aspect of the technology, though, is reduced latency: the closer a user is to the server, the faster and more reliable the connection, and with technological demand increasing exponentially, there are few other solutions. If the multinational corporate giant Microsoft is behind it, then the money is sure to follow.

In the mid-2010s, the maritime sector was thriving amidst the offshore oil boom. Today, we are at the cusp of a booming global wind industry, which has enormous potential to power the technologies emerging from Project Natick. Indeed, if data is the new oil, then perhaps a data centre boom is in store for the maritime industry. As the slogan goes, “50% of us live near the coast. Why doesn’t our data?”