Facebook's 'green' data center design to have ripple effect
- 28 April, 2011 20:39
Facebook's innovative new data center design -- believed to be one of the world's most energy-efficient facilities of its kind -- will have a significant influence on corporate data center build-outs over the next several years, experts say.
Facebook's new Prineville, Ore., data center features an outside air-cooled building, energy-efficient power supplies, battery back-up and custom servers.
In an unusual move, Facebook revealed the details of its data center design and created the Open Compute Project, an open source community, to help improve it. Facebook has partnered with Advanced Micro Devices, Intel, Quanta and others on the Open Compute Project, and is also working with Dell, HP, Rackspace, Skype and Zynga on new designs.
BACKGROUND: Facebook shares its data center secrets
Facebook's decision to open source its data center is "groundbreaking in terms of having the potential to make the whole industry more energy efficient," says Mark Monroe, executive director of The Green Grid, an industry group dedicated to improving the energy efficiency of data centers. Facebook recently joined The Green Grid as a contributing member.
"For folks like Google, Facebook, Microsoft and eBay, their data center is their factory, and they are focused on making transactions happen at the lowest cost possible," Monroe says, adding that when these companies share their data center secrets it "really moves the whole industry forward."
"It would be hard to overstate the significance of Facebook's Open Compute Project," says Randy Smith, director of real estate with hosting company Rackspace. "It's long been mysterious and mystified in how large, Web-scale companies achieve the efficiency that they achieve. For Facebook to demystify this ... will have a ripple effect.''
Smith says most companies are not seeking to differentiate themselves in terms of their data center design, and yet there are efficiencies to be gained by copying companies like Facebook that are leaders in this area. "The more that efficiencies can be pushed out to companies that wouldn't otherwise enjoy them, the better," he adds.
Facebook says its data center design is 38% more energy efficient than conventional designs. That's why companies planning to build data centers in the next three years are likely to borrow some of Facebook's ideas, experts say.
Facebook's data center design is rated at a Power Usage Effectiveness (PUE) of 1.07 -- one of the lowest design PUEs ever reported. PUE is the ratio of a facility's total power draw to the power consumed by its IT equipment, so the closer the number is to 1.0, the better.
"That's a fantastic number," Monroe says. "It'll be interesting to see if they are able to achieve it once they've populated the whole data center and measured the performance over a yearlong period."
Among the innovations in the Facebook data center design expected to be copied is the use of outside air cooling -- which is already popular in Europe -- instead of air conditioning. This eliminates the need not only for chillers but also for all of the ductwork required by central air conditioning.
The Facebook data center "seems to be more in concert with nature," Smith says. "What people will find with buildings like this is that they are much more efficient, they have a simpler design, and they have more of a passive type of operation. You can stand in a hot aisle and have a conversation without raising your voice because there's not this massive roar of air being pushed into the building."
One aspect of the Facebook design that may gain popularity is running the data center at temperatures as high as 81 degrees Fahrenheit instead of the 68 degrees commonly used today. Another innovation that may prove popular is the elimination of a centralized uninterruptible power supply.
Greg Huff, CTO of Industry Standard Servers and Software at HP, says customers can buy components -- such as power supplies, rack-level power distribution units and connectors -- that are similar to Facebook's design right now.
"We've got power supplies, plugs, line cards, PDUs and a UPS specifically for this type of application all for sale now," Huff says, adding that HP's offerings were developed in parallel with Facebook's efforts. "Some of this we had under development four-plus years ago, but we didn't think we could drive it into the market. We got engaged with Facebook about a year and a half ago, and then we've rapidly accelerated our development."
HP also has a technical services arm -- dubbed HP Mission Critical Services -- that helps enterprises locate and design new data centers that can take advantage of innovations such as outside air cooling, which isn't possible in all geographic locations.
"We will absolutely help [customers] squeeze out every dollar of wasted energy," Huff says. "Our development, from the chassis, power supply, motherboard and UPS, are all aligned with the philosophy of lower cost and cooling."
Huff anticipates demand for these energy-efficient offerings from enterprises that have a "green agenda, primarily European-based companies," he says. "They need a low-cost build-out and low operations costs and the agility to deal with spikes in demand."
Among the vertical industries likely to adopt Facebook's approach to data center design are Internet content providers, financial services firms and cloud computing service providers.
Internet content companies "deploy tens of thousands of servers around the country and around the world. They should be interested," Monroe says. "They are on continuous build cycles."
Rackspace's Smith says Facebook's design will "be an accelerant for us" in terms of the company's ongoing efforts to improve the energy efficiency of its data centers. He expects Rackspace and other companies to start taking advantage of these innovations in new data center build-outs immediately.
Companies likely to benefit from Facebook's data center design include "people with this need to have massively flexible compute online in an instant and have an application that's fairly homogenous," Smith says. "Gaming, financials and media are some of the verticals where people need to scale so quickly."
Huff points out that Facebook's data center and server designs were optimized for Facebook's own application, so they may not be suitable for a typical enterprise customer, which runs hundreds of diverse applications in its data centers.
Facebook "bought at a large-enough scale that it is reasonable for them to do mass customization to get exactly what's right for them," Huff says. "It's very significant that they are making an effort to share their design, but time will say what the real impact, the true significance will be."
Read more about data centers in Network World's Data Center section.