Microsoft envisions ultra-modular data centers

The software giant is experimenting with efficient designs that can be ramped up quickly when it needs capacity

In the years to come, Microsoft's data centers may not be huge buildings tightly packed with server racks, but rather rows of small, stand-alone IT units spread across acres and acres of cool, cheap land.

At the DatacenterDynamics conference in New York on Wednesday, Microsoft data center general manager Kevin Timmons outlined some prototype work his unit is doing to design its next generation of data centers, in collaboration with Microsoft Research.

His vision is radically different from most of what the company already has in place.

The company is field-testing something Timmons calls IT PACs, or IT preassembled components, which are small, self-contained units that are assembled off-site and can be linked together to build out an entire data center.

Microsoft, he said, is facing the same challenges as most data center operators. It needs the ability to ramp up capacity in short order, but would like to avoid the massive up-front costs and long lead times required to build out traditional data centers. Given this set of conditions, Microsoft's goal for building its next set of data centers is "ultra-modularity," Timmons said.

Instead of paying US$400 million or more up front to build a data center, Microsoft would prefer to purchase some land, build a sub-station and then populate the acreage with modular units of servers as demand grows.

"We want to view our data centers as more of a traditional manufacturing supply chain, instead of monolithic builds," he said. "It won't all be built on-site in one shot."

By going with this approach, Microsoft can cut the time it takes to ramp up new server capability in half, as well as reduce the costs of building out new data centers, Timmons predicted. "You don't have to commit to a $400 million data center and hope that demand shows up," he said.

Over the past few years Microsoft has been moving toward more modular designs, shifting from purchasing individual servers, to racks of servers, to, most recently, entire containers filled with servers. Microsoft built out its past two data centers, located outside Chicago and Dublin, partly from containers.

The new design takes this modularity concept even further.

The IT PACs are "not really containers in a traditional sense," Timmons said. "They are really integrated air-handling and IT units."

The units themselves could hold anywhere from one to 10,000 servers. The idea is that when the software giant requires more resources, it can have one of these IT PACs shipped to the site and "plugged into the spine," which supplies the power and network connectivity to the data center.

Microsoft has built two proof-of-concept models so far. Its next data center, which the company will announce in a few months, will use some form of these IT PACs, Timmons said.

The units will be assembled entirely from commercially available components. A single person should be able to build a unit within four working days. The servers will be stacked in rows, sandwiched between air intake and output vents.

For cooling, ambient air can be sucked in one side, run through the servers and exhausted out the other, with some of the air recirculated to even the overall temperature of the unit. No mechanical cooling units will be used. Networking and power buses will run over the tops of the servers.

The construction materials rely heavily on steel and aluminium, both easily recyclable. The water requirements can be met by a single hose with residential levels of water pressure, he said.

The development team considered different sizes of containers, Timmons said, keeping an eye toward making the units easily shippable. They settled on a size that could contain 1,200 to 2,100 servers and draw between 400 and 600 kilowatts.
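From the quoted ranges, a rough per-server power figure can be derived. The arithmetic below is mine, not Microsoft's, and simply bounds the draw implied by the article's numbers:

```python
# Back-of-envelope per-server draw from the unit ranges quoted in
# the article: 1,200-2,100 servers per unit, 400-600 kW per unit.
servers_low, servers_high = 1200, 2100
kw_low, kw_high = 400, 600

# Upper bound: most power spread over the fewest servers.
w_per_server_high = kw_high * 1000 / servers_low   # 500.0 W
# Lower bound: least power spread over the most servers.
w_per_server_low = kw_low * 1000 / servers_high    # ~190.5 W

print(f"{w_per_server_low:.0f}-{w_per_server_high:.0f} W per server")
```

Both bounds sit in the range typical of commodity rack servers of the period, consistent with the units being built from commercially available components.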

The units can be placed inside a large building or, when equipped with outer protective panels, sit out in the open.

One of the chief requirements of the IT PACs, he admitted, is that they be sited where the ambient temperature is mild enough to provide sufficient cooling. Because the units are highly portable, this should not be a problem, he said.

"If we're doing our job right in site selection, square footage will be cheap for me. I want to find a place with lots of room to expand. I don't want to worry about a watts-per-square-foot problem. I'd like to worry about having enough acreage," he said. "We're doing a good job in site selection when we don't have to squeeze in 500 watts per square foot."

Because the IT PACs make minimal use of mechanical cooling, Timmons estimated their PUE ratio would be 1.26 to 1.35, depending on the outside conditions. PUE, or power usage effectiveness, compares the overall power supplied to a data center against the amount that actually reaches IT equipment.

A typical data center PUE is around 2.1, according to industry estimates.
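PUE is simply the ratio of total facility power to IT equipment power, so the estimates above translate directly into avoided overhead. A minimal sketch of the comparison, using the article's figures and an illustrative 500 kW IT load (the load value is my assumption, chosen to match the quoted unit sizes):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total power supplied to the
    facility divided by the power that reaches IT equipment."""
    return total_facility_kw / it_equipment_kw

# A 500 kW IT load at the typical PUE of 2.1 implies ~1,050 kW
# drawn in total; at the IT PAC estimate of 1.30, only ~650 kW.
it_load_kw = 500.0
typical_total = it_load_kw * 2.1    # 1050.0 kW
it_pac_total = it_load_kw * 1.30    # 650.0 kW
overhead_saved = typical_total - it_pac_total  # 400.0 kW

print(pue(typical_total, it_load_kw), overhead_saved)
```

On these assumptions, a single fully loaded unit would avoid roughly 400 kW of cooling and distribution overhead relative to a conventional facility.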

Timmons said he hasn't fully decided whether, if the IT PACs are ultimately pushed into production, Microsoft will build them itself or contract the work out. It would probably be a mix of the two, he predicted. "I know how much it costs to build one of these now," he said.

Joab Jackson

IDG News Service