The incredible shrinking datacentre

As the efficiency and density of datacentres increase, smaller and even mobile datacentres are emerging

Driven by efficiency gains in technology, including virtualisation, high-performance CPUs and high-density storage, datacentres are getting smaller, says Roger Cockayne, co-founder of Revera.

“Small is beautiful in IT these days. Mainframes and the large halls of computer space are a thing of the past,” he says.

Revera has datacentres in Auckland, Wellington and Christchurch. Its customers include Livestock Improvement, Quotable Value, DTZ, MAF, Minter Ellison Rudd Watts and House of Travel.

By building 1 MVA (megavolt-ampere) datacentre “pods”, Revera can run an efficient operation, scaling up only when it needs more pods. There is a big financial difference between investing in 1 MVA and 2 MVA, as the gear for 2 MVA is so much more expensive, says Cockayne.

“In one room that we built in 2003, we had about 400 operating systems and a few terabytes of data. In the same size room now, we can get about 30,000 operating systems and a petabyte of data. Floor space is the last frontier that has been beaten.”

When a local council recently outsourced to Revera, it cut its space requirement from 25 cabinets to half a cabinet.

Colin Price, manager of datacentre and storage at Unisys Australia and New Zealand, agrees.

“The days of the big warehouse datacentres are gone,” says Price. “Today, it is all about power-dense, compact, modular datacentres.”

Yesterday’s space requirements have given way to today’s power-density requirements, he says.

Unisys has invested heavily over the past three years in a modular system for its datacentres, so that it can react more quickly to changing market demands. In its Kapiti datacentre, Unisys has installed a floating wall system, enabling the company to cool and use space only as it is required. The Auckland datacentre has a six-step approach to growth, with the ability to double in space.

The two datacentres are treated as virtual datacentres, says Price, meaning that resilience is in the service, rather than in a particular location.

Some of the company’s datacentre customers are AMI, Lumley Insurance, the Ministry of Health and Inland Revenue.

Unisys is measuring its PUE (power usage effectiveness) and is looking to install power-measuring equipment to better drive efficiency in the datacentre.

“This is a particularly green initiative that will allow us to improve our efficiency percentage. By measuring and understanding the flows in our Auckland datacentre we have achieved a three percent saving on cooling,” says Price.
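
PUE is simply the ratio of the total power a facility draws to the power that actually reaches the IT equipment, with 1.0 as the ideal. As a rough, illustrative sketch (the figures below are assumptions, not Unisys’s measurements), a three percent saving on cooling nudges the ratio down:

```python
# Illustrative only: PUE (power usage effectiveness) is total facility power
# divided by the power delivered to IT equipment. The figures are made up.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Return the power usage effectiveness ratio (1.0 is the ideal)."""
    return total_facility_kw / it_load_kw

it_load = 500.0        # kW drawn by servers, storage and network gear
cooling = 250.0        # kW for chillers, CRAC units and pumps
overhead = 50.0        # kW for UPS losses, lighting, etc.

before = pue(it_load + cooling + overhead, it_load)
after = pue(it_load + cooling * 0.97 + overhead, it_load)  # 3% cooling saving

print(f"PUE before: {before:.2f}, after: {after:.2f}")
```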

The Kapiti centre uses what is known as “free cooling”: for nine to ten months of the year the chilled water can be cooled with outside air. To cope with earthquakes, the facility is equipped with iso-bases, isolation bearings that decouple the equipment from ground motion. After the shaking stops, the bearings smoothly return to their original position, says Price. “The iso-bases are rated to take 7.5 on the Richter scale and not fall over.”

With the Auckland datacentre having a six-step approach to growth, “we can take as many of those steps as we need. With a bigger client, we might need to take three or four steps at once,” says Price.

It’s a fast process to scale up power for those customers who need it. Unisys has more than doubled the power going into the facility. It also has the ability to double in space, but hasn’t done so yet, as power is the number-one requirement.

Datacom had only just completed its new datacentre in Christchurch, which opened shortly after the earthquake, says the company’s COO Steve Matheson. The remote monitoring system in the centre captured video during the quake, he says.

“The cabling, which was hanging at the time, was moving and shaking all over the place, but the seismically braced racks were dead solid,” he says.

Datacom also has datacentres in Auckland and Wellington. Some of its customers are Telecom, NZ Post, ASB, NZ Customs Service, Ministry of Justice, Fletcher Building, Southern Cross, Placemakers and Waikato District Health Board.

Power issues

The big issue all datacentre providers now face is power density: the ability to support increasingly energy-intensive racks, says Matheson. Energy efficiency is another trend.

“People are concerned about how much power is being used and how much they are going to have to pay for it,” he says.

Shorter cable paths and more efficient generators and UPSs will help, but more radical efficiency gains require some form of non-conventional cooling technology that uses the environment to provide the cooling, he says. Datacom’s Orbit facility in Auckland uses outside air as a cooling medium.

“When we are operating in that mode, the conventional air conditioning in the datacentre doesn’t even run,” he says. “This works in Auckland, because the climate and humidity is appropriate for this, but it doesn’t work everywhere.”

Maxnet has been continually upgrading its two datacentres in Auckland and Christchurch for the past five years, says Maxnet CTO, Derek Gaeth. A recent improvement is the replacement of its older UPSs with brand new UPS gear, which the organisation expects will save $70,000 to $100,000 per year in power. Its customers include APN Online, M&C Saatchi, Yarrows and Trustpower.

The green trend is definitely here to stay, according to Gaeth.

“For us it’s a two-pronged approach – green is efficient and saves money,” he says.

UPS efficiency, for example, has gone from 80 percent to more than 94 percent, giving quite significant savings, he adds.
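
As a rough sketch of where those savings come from (the load and power price below are assumptions, not Maxnet’s figures, and the knock-on reduction in cooling load is ignored), a jump from 80 to 94 percent UPS efficiency cuts the energy lost in conversion sharply:

```python
# Back-of-the-envelope arithmetic only; the load and tariff are assumptions.

def annual_ups_loss_kwh(it_load_kw: float, efficiency: float) -> float:
    """Energy lost in the UPS over a year for a constant IT load."""
    input_kw = it_load_kw / efficiency
    return (input_kw - it_load_kw) * 24 * 365

load_kw = 200.0                 # assumed constant IT load
old_loss = annual_ups_loss_kwh(load_kw, 0.80)
new_loss = annual_ups_loss_kwh(load_kw, 0.94)
price_per_kwh = 0.15            # assumed NZ$ per kWh

saving = (old_loss - new_loss) * price_per_kwh
print(f"Approximate annual saving: ${saving:,.0f}")
```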

Maxnet is now working to increase density in its facilities. Traditionally, customers bought a full rack of hosting and needed that full rack to hold their equipment. Now, thanks to virtualisation, the same customers need only half a rack.

“We are actually encouraging our full rack customers to go to half a rack, so we can double density in our datacentres and become more efficient,” says Gaeth.

Shipping-container solution

Microsoft has recently developed its G4 (Generation 4) datacentre, a shipping container with up to 2500 servers inside.

“Literally, all you need is a concrete block and power to plug it in. You don’t even need water,” says Bradley Burrows, product manager of Windows Server & Tools, Microsoft New Zealand. “So you can have a datacentre in a container up and running pretty fast.”

These mobile facilities are all about using the natural environment to run – they use outside air to cool the systems and solar cells for power.

“It takes one guy to build an entire container with 2500 servers, and have it up and running in four days,” says Burrows. “It is just phenomenal.”

It is a huge shift from the older datacentres where you had to build a massive piece of infrastructure, he says.

Datacentre management tools have also developed immensely, says Burrows. Hotmail, for example, which has more than 200,000 servers, has one engineer for every 35,000 servers today.

“You wouldn’t be able to do that without having really good, scalable and remote tools for the job,” he says.

Microsoft is taking on the cloud space with its Azure offering. But the software giant has no datacentres locally.

“That could be in the future,” says Steve Haddock, business group lead of Server and Tools at Microsoft New Zealand. “Our main datacentres are in Singapore, the US and Ireland.”

Microsoft has 15 datacentres around the world, adds Burrows. With Azure, it doesn’t matter where in the world the servers are, because a lot of the services it provides are web-based, he says. But if a provider is running full infrastructure for a customer, including desktop and virtual infrastructure, it needs to be local, he says. An organisation could completely outsource its infrastructure to a local hosting company, which could back-end some of its services into Azure, he adds.

Safe storage

But how secure is cloud computing? And is it safer to have your data hosted here rather than overseas? Datacentres are becoming hubs for cloud computing, which raises the bar of what is required in datacentres, says Datacom’s Matheson.

“Customers expect cloud services to be very reliable,” he says.

In the past six months, Datacom has deployed cloud nodes in all of its facilities.

But outages do happen, even though redundancy is built into the environment. In October, Datacom had an interruption to its cloud service, caused by simultaneous hardware failures: the primary device failed and the back-up device also failed.

“In this particular case, it is difficult to see how the environment could have been made more resilient,” says Matheson.

Maxnet also recently had an outage, due to an alleged fault by a vendor.

“It was very unfortunate,” says Gaeth. “The design we run in our virtual environment is the highest redundancy design you can possibly build,” he says.

The design was jointly created by Cisco, VMware and Dell.

Maxnet has started a full audit of the incident and has also improved its ability to see trends and catch errors like this one earlier, says Maxnet CEO John Hanna.

None of the datacentre providers spoken to feel threatened by overseas players such as Amazon and Rackspace that offer cost-effective, scalable services.

For local banks and government agencies, there are potentially performance, sovereignty and security issues with using overseas providers, says Matheson.

“And if something goes wrong, who do customers call?” he asks.

Maxnet has actually won customers over from the likes of Rackspace, Gaeth claims. Some of Maxnet’s customers run 24-hour, critical applications out of the datacentres and need a high-availability provider, he says. The SLAs that some of the international hosting companies provide are also relatively low, Hanna suggests.

EMC works with its customers to design next-generation datacentres, says EMC country manager Arron Patterson. Customers include Gen-i, Datacom, ANZ, ASB, Vodafone and Telecom. Power and cooling top the list of its customers’ issues, he says.

“A lot of the datacentre space was allocated five to 10 years ago,” says Patterson. “With the amount of compute capability that is going into some of these facilities, it is getting very dense.”

These datacentres are using a lot more power and producing a lot more heat than they were ever designed to cope with, he says. This has led to customers looking for new datacentre space and more efficient ways of doing things.

Converged infrastructure

Hewlett-Packard has done a lot of work in the area of converged infrastructure, which unifies individual stacks of storage, servers and networking into a single virtualised environment, says Trevor Armstrong, country manager of enterprise servers, storage and networking at HP New Zealand. A properly structured datacentre helps businesses focus on innovation and driving growth, he says.

Customers that are looking to use services from a datacentre provider should do their research first. Check the reliability and security of the datacentre and the reputation and stability of the vendor, says Datacom’s Matheson.

“Datacentres that are built on top of public carparks and datacentres in multi-tenanted buildings – they are not really that secure,” he says.

Revera’s Cockayne believes there are a number of companies currently in the market that are making claims they are not ready for.

“If a company says it has, for example, disaster recovery tested, make sure you see the evidence of that,” he says. “A lot of people have been caught short by the trend to bring things to datacentres. When the cloud thing took off, everybody jumped on the bandwagon and said: ‘We are cloud’, when in fact they were not.”

“If someone says they provide a 99.99 percent SLA, check that their infrastructure can actually deliver that,” says Maxnet’s Gaeth. “Make sure that the provider meets your needs today, but also your future growth needs.”
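
As a back-of-the-envelope guide to what an uptime figure like that actually permits (generic arithmetic, not any particular provider’s terms), the allowed downtime shrinks quickly as the nines are added:

```python
# Convert an uptime SLA percentage into allowed downtime per year.

def allowed_downtime_minutes(sla_percent: float, days: float = 365.0) -> float:
    """Minutes of downtime per period that still satisfy the SLA."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - sla_percent / 100)

for sla in (99.0, 99.9, 99.99):
    mins = allowed_downtime_minutes(sla)
    print(f"{sla}% uptime allows about {mins:.0f} minutes of downtime a year")
```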

The Tier System for datacentres

The US-based Uptime Institute rates and certifies facilities against its Tier Classification System, where Tier 4 is the highest and Tier 1 the lowest.

Most datacentres in New Zealand are Tier 1 or 2, but there are some Tier 3 datacentres, for example Datacom’s Auckland datacentre; Maxnet’s datacentres; and Revera’s and Gen-i’s facilities.

In a Tier 3 datacentre, all components of the facility can be maintained concurrently without disrupting the service.

To achieve Tier 4, a datacentre must meet the requirements of Tier 3 while still maintaining redundancy even when a component is out of service. So, if a datacentre needs two generators, it must have three.
