Worries about the reliability of our electricity supply and the danger of power cuts this winter are a wake-up call for datacentre operators to put their houses in order.
Meridian Energy’s Keith Turner told Parliament’s Commerce Committee last month that he believes there will be enough generation and demand-control to cover any unexpected failures. But “it is a very fine margin — finer than I have ever seen it in my career”.
Vector chief executive Simon McKenzie said the national system was on the edge, while Transpower chief executive Patrick Strange indicated that the Pole One power cable in Cook Strait might be reactivated.
For datacentres, adequate UPS protection is a must because unprotected computers can be damaged when power comes back on. It’s also important they have sufficient supplies of diesel — at least a day’s worth — and systems and procedures to maintain the integrity of fuel supply.
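The "at least a day's worth" of diesel is straightforward back-of-envelope arithmetic: fuel burn scales with load and run time. The sketch below is illustrative only; the 500 kW load, 0.3 litres-per-kWh burn rate and 25% safety margin are assumed figures, not vendor data.

```python
# Back-of-envelope diesel sizing for standby generation.
# All numeric figures here are illustrative assumptions, not vendor data.

def diesel_required(load_kw: float, hours: float,
                    litres_per_kwh: float = 0.3,
                    safety_margin: float = 1.25) -> float:
    """Litres of diesel needed to carry `load_kw` for `hours`, with headroom."""
    return load_kw * hours * litres_per_kwh * safety_margin

# A hypothetical 500 kW datacentre load held for 24 hours:
print(round(diesel_required(500, 24)), "litres")  # prints "4500 litres"
```

The safety margin matters for exactly the reason the Unisys incident illustrates: tank contents on paper are not tank contents you can burn.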
During an Auckland power outage in 2006, Unisys’ Penrose datacentre went down in the middle of a major upgrade because the standby generator’s diesel had become contaminated with water.

As datacentres proliferate and server counts creep ever upward, the cost of power, and how best to manage it, becomes a pressing issue.
Cooling looms as one of the main issues for such utility computing, but fierce promotion by cooling suppliers could lead some IT managers down the path to unsuitable technologies and eventual meltdown, according to Revera managing director Roger Cockayne.
“Cooling technology is the current hot IT item, and suppliers are doing a good job of promoting it. But it’s being over-complicated,” he says. “The cooling issue has only ever been about putting cool air in front of cabinets, so they can draw and drag it across systems at a speed allowing the air to absorb heat.
“The rest has been over-complicated by the suppliers. There are countless solutions, and suppliers are talking about thermal dynamics, ‘hot air, the issue’, volume of air… everything except the main issue: correctly positioning cold air.”
At a panel discussion hosted in December by the US Mass Technology Leadership Council, speakers repeatedly said datacentre managers were missing opportunities to use less power (Computerworld, January 21). Air conditioning was singled out.
Cockayne says if you put cold air in front of a computer there’s a good chance it will stay cool and the rest takes care of itself.
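Cockayne's point reduces to simple sensible-heat physics: the air moving past a cabinet must carry away its heat load, so Q = rho * V * cp * dT, rearranged for the required airflow V. The sketch below assumes standard air properties; the 5 kW cabinet and 10 K temperature rise are hypothetical figures chosen for illustration.

```python
# Minimal sketch of the cold-air-in-front-of-the-cabinet arithmetic.
# Sensible heat: Q = rho * V * cp * dT  =>  V = Q / (rho * cp * dT)
# Air properties are standard approximations; the load is illustrative.

RHO_AIR = 1.2    # kg/m^3, air density at roughly 20 C
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to absorb a sensible heat load."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

# A hypothetical 5 kW cabinet with a 10 K inlet-to-outlet temperature rise:
flow = airflow_m3s(5000, 10)
print(f"{flow:.2f} m^3/s (~{flow * 2119:.0f} CFM)")
```

The takeaway matches Cockayne's argument: for a fixed heat load, the only levers are how much cold air you deliver and how much temperature rise you allow, which is why positioning the cold air correctly dominates everything else.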
In-line cooling, where chilled coolant is delivered via capillary tubes to air handlers situated in hot spots, is one example of how computer rooms can come unstuck, he says.
“It sounds brilliant, and probably works, until an earthquake breaks the capillary tube carrying coolant. Then you melt down.
“That’s one reason why we backed away from it. Cooling must be practical for the environment. New Zealand needs reasonably commoditised solutions able to withstand variable conditions. It’s important to appraise all aspects — not just a system’s ability to deliver air. Will it continue to work while we’re shaking?”
He says Revera made cold air a single-minded design obsession. “We believe the Type R design has hit the right spot, and it has been validated by similar designs now coming out of the States. The next issue is taking advantage of hot air, rather than treating it as a problem: deliver it to a heat exchanger or a facility that requires hot air.”