Private clouds gaining traction among CIOs

The benefits are becoming more widely recognised

Private clouds — where companies use their own infrastructure and provision virtualised services to end users via automated tools — are gaining traction among IT leaders who want to deliver advanced services at lower cost.

However, as with any new approach to computing, private clouds today fall short on manageability, and some users worry about the risk of vendor lock-in, particularly with virtualisation and other tools that make cloud computing possible. Further, the fuzzy nature of just what private cloud computing means could slow the adoption of internal cloud setups.

That lack of definition doesn't bother Geir Ramleth, CIO at Bechtel, one of the world's largest engineering and construction firms. In fact, the lack of a precise definition is a good thing, because looking at the private cloud too narrowly would "limit what it can do for us," he says. "You're talking philosophy here."

Alan Boehme, senior vice president and head of IT strategy at ING Financial Services in San Francisco, adds that a private cloud differs from old ways of thinking about systems architecture. "It's not just servers, storage or networks; it's every component," he argues.

That rallying cry is evolving into a real market. According to Gartner, enterprise spending on public and private cloud infrastructure services will total US$3.2 billion this year, up 28 percent from $2.5 billion in 2008. Spending in the public sphere accounts for the vast majority of those dollars. However, the market researcher expects that by 2012 IT shops will spend more than half of their cloud dollars on private cloud services.

Saving time and money

Ramleth has heard the cloud computing rallying cry and has seen dramatic results from Bechtel's private cloud platform, a standards-based setup that features virtualisation technology and automated provisioning. In 2005, more than 2000 IT employees staffed approximately 20 datacentres, where server utilisation hovered at two to three percent. Today, a much leaner Bechtel IT department, numbering 1100 employees, operates just three datacentres, where server utilisation averages 60 to 70 percent.

At Bechtel, 44,000 employees across the globe have access to 230 applications. The IT department has already shifted about 60 percent of those applications into the company's private cloud. The rest will be moved to the cloud by the beginning of 2010, says Ramleth.

Such a "transformation", as Ramleth calls it, takes years. CIOs need to move carefully, he says, "because you don't want to move the sins of the past into new datacentres".

Before 2005, Bechtel had an IT-centric attitude about delivering services to users, Ramleth says. It had no set standards and provisioned resources manually. Now the company embraces a collaborative model of computing, one built on strict standards and guidelines that permit policy-driven access to provisioned resources.

For example, Bechtel standardised on Hewlett-Packard dual- and quad-core BladeSystem servers. And because services are separated from the servers and other infrastructure elements that could change as hardware evolves, existing or future applications and services can easily run on new servers, storage systems and networks.

Ramleth also shifted IT's security standards from topology-specific to policy-oriented ones. And he has standardised the way IT prices its services to users. Where pricing used to vary based on application and user location, he says, it's now a flat per-user fee worldwide.

Adopting a collaborative model is the philosophical shift CIOs need to make in order for their cloud initiatives to be successful, says Ramleth.

Ultimately, evolving to a hybrid public-private cloud scenario will even allow him to eliminate capacity planning from his IT responsibilities, he says. CIOs should build private clouds for normal workloads and then buy into the public cloud for peaks, he argues.
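
A rough way to picture that split is a placement policy that keeps routine demand on owned capacity and overflows only the peaks to a public provider. The sketch below is a hypothetical illustration; the capacity figure, threshold and function names are assumptions, not any vendor's actual tooling.

```python
# Hypothetical sketch of the "private for normal load, public for peaks"
# placement policy described above. The capacity figure, threshold and
# names are illustrative assumptions, not any vendor's actual product.

PRIVATE_CAPACITY = 1000   # virtual-machine slots in the internal cloud
BURST_THRESHOLD = 0.85    # send work to the public cloud above 85% utilisation


def place_workload(vms_requested: int, vms_in_use: int) -> str:
    """Return where a new workload should run under a simple burst policy."""
    projected = (vms_in_use + vms_requested) / PRIVATE_CAPACITY
    if projected <= BURST_THRESHOLD:
        return "private"  # normal workload stays on owned infrastructure
    return "public"       # peak demand overflows to a public provider


print(place_workload(vms_requested=50, vms_in_use=700))  # -> private
print(place_workload(vms_requested=50, vms_in_use=900))  # -> public
```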

ING's Boehme agrees. He predicts a policy-based, hybrid cloud approach is IT's future. "It doesn't matter where the assets and applications are running," he contends.

The private cloud at SAS

On the face of it, a private cloud appears much like other in-house datacentres, where applications run on machines that get plugged into outlets. As Cheryl Doninger, research and development director of enterprise computing infrastructure at the SAS Institute in Cary, North Carolina, says, "We license the software, and we own the hardware."

But that's where the similarity to old-line computing ends, she adds. At SAS, users access a self-service portal to reserve resources on the company's private cloud, much like customers of Amazon's EC2 do with that public cloud service. For automatic provisioning, SAS uses a tool called the Infrastructure Sharing Facility (ISF) from Platform Computing in Markham, Ontario.

With Platform's ISF tool, SAS builds policy-based provisioning templates that help prevent chaos in the cloud, Doninger says. For example, testers in SAS's quality assurance department can reserve bare-metal machines for a few weeks, or engineers in the field can snag predefined servers, storage and software for a couple of days. ISF prevents scheduling conflicts and releases systems back to the available pool after they are no longer in use.
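
To make that idea concrete, here is a minimal, hypothetical sketch of policy-based reservation logic of the kind Doninger describes: templates cap how much and for how long a group can reserve, a conflict check keeps the pool from being over-committed, and expired reservations are released back automatically. It is a generic illustration, not Platform Computing's actual ISF interface.

```python
# Generic sketch of template-driven reservations with conflict checks and
# automatic release back to the pool. Names, sizes and policy limits are
# illustrative assumptions, not Platform Computing's ISF API.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class Template:
    name: str
    machines: int
    max_days: int          # policy limit on how long a reservation may run


@dataclass
class Reservation:
    template: Template
    start: date
    end: date


POOL_SIZE = 40             # bare-metal machines available to share
TEMPLATES = {
    "qa_bare_metal": Template("qa_bare_metal", machines=4, max_days=21),
    "field_demo":    Template("field_demo",    machines=2, max_days=3),
}
reservations: list[Reservation] = []


def machines_in_use(day: date) -> int:
    """Count machines tied up by reservations active on a given day."""
    return sum(r.template.machines for r in reservations
               if r.start <= day <= r.end)


def reserve(template_name: str, start: date, days: int) -> Reservation:
    """Grant a reservation only if policy and pool capacity allow it."""
    tpl = TEMPLATES[template_name]
    if days > tpl.max_days:
        raise ValueError(f"{template_name} is limited to {tpl.max_days} days")
    end = start + timedelta(days=days - 1)
    day = start
    while day <= end:      # scheduling-conflict check across the whole window
        if machines_in_use(day) + tpl.machines > POOL_SIZE:
            raise RuntimeError(f"pool exhausted on {day}")
        day += timedelta(days=1)
    booking = Reservation(tpl, start, end)
    reservations.append(booking)
    return booking


def release_expired(today: date) -> None:
    """Return finished reservations to the available pool automatically."""
    reservations[:] = [r for r in reservations if r.end >= today]
```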

While some IT managers are comfortable with automating the entire resource reservation and provisioning process, others are more cautious. Sinochem Group is one example. Through a translator, Jinsong Peng, IT general manager at the Beijing-based industrial conglomerate, described how the company carried out a full SAP upgrade for its 200 subsidiaries by using a private cloud during the testing and deployment cycles.

"In the traditional model, we would have needed to replicate multiple systems throughout the development and testing processes," he says.

However, because Peng's team could share private cloud resources, Sinochem needed to add only 10 percent more capacity to the company's IBM AIX server infrastructure. As is the case at SAS, testers at Sinochem requested systems in various configurations, which IBM's CloudBurst service management tools dynamically allocated. But Sinochem instituted policies that also routed each resource request through a system administrator for approval, Peng says.
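
Peng's approval step amounts to a queue sitting in front of the provisioning engine. The short sketch below illustrates that human-in-the-middle pattern; the function names and request format are assumptions made for illustration, not IBM's CloudBurst interface.

```python
# Hypothetical sketch of approval-gated provisioning: requests queue for an
# administrator instead of being provisioned immediately. Names and fields
# are illustrative assumptions, not IBM CloudBurst's actual interface.
from dataclasses import dataclass


@dataclass
class ResourceRequest:
    requester: str
    config: dict            # e.g. {"cpus": 8, "memory_gb": 32, "os": "AIX"}
    approved: bool = False


pending: list[ResourceRequest] = []


def submit(requester: str, config: dict) -> ResourceRequest:
    """A tester requests a system; nothing is provisioned yet."""
    req = ResourceRequest(requester, config)
    pending.append(req)
    return req


def approve_and_provision(req: ResourceRequest) -> None:
    """An administrator approves; only then is capacity allocated."""
    req.approved = True
    pending.remove(req)
    allocate(req.config)    # hand off to the provisioning layer (stub below)


def allocate(config: dict) -> None:
    print(f"provisioning {config}")   # placeholder for the real allocation step
```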

Even with that conservative, human-in-the-middle approach, the automated provisioning process helped Sinochem complete its full SAP upgrade in seven months, which is remarkable, given that projects like that can take years to complete.

As a result of the success of the cloud-based approach at Sinochem, Peng says, cloud technology remains a permanent part of the company's IT testing infrastructure.

Grey linings in the hybrid cloud

At ING, which has a few private cloud pilot projects in the works, a seamless hybrid setup is years away, Boehme says. In part, that's because of the limits of today's management tools, he adds.

"You want a single pane of glass for management," which doesn't exist today for hybrid platforms, he says. For example, if IT pushed workloads into public cloud computing services today, in-house administrators would not be able to manage the workloads in those external datacentres. Those workloads would run wherever the service provider's policies deemed appropriate. Cross-cloud policy management doesn't exist.

Paul Burns, an analyst at Neovise, a Colorado-based research firm specialising in cloud computing, echoes Boehme's view. Private clouds will dominate inside corporate IT for the foreseeable future because of classic business concerns about governance, security and, most of all, control, he says.

"Although there is a trend toward invisible infrastructure, people still want to touch it today," Burns says.

For good reason, too, Burns adds. He points to a common problem in most datacentres: performance degradation of an application. If that application were running in a hybrid cloud environment, today's monitoring and management tools would not be able to diagnose the problem's source.

Beyond concerns about manageability, Boehme raises the spectre of a move to cloud computing leading to vendor lock-in. For the most part, CIOs abhor vendor lock-in. Reliance on a single vendor can be costly and can keep a company from making necessary infrastructure changes. But Boehme worries that IT is risking "hypervisor lock-in", because of the lack of deep interoperability between the various virtual machine managers on the market.

For example, if one hypervisor leapfrogged the others in performance, security or features, a company using a different hypervisor might not be able to take advantage of the new functionality because the cost of switching could prove prohibitive. And in the event of a merger or acquisition, integrating the two companies' operations could prove difficult if each used an incompatible hypervisor.

But the benefits of cloud computing far outweigh the limitations, many enterprise IT managers are finding. The biggest factor, of course, is money. Most CIOs must shrink their budgets. PricewaterhouseCoopers estimates that 30 years ago IT consumed one percent of a company's revenue, but by 2007 that figure had skyrocketed to six percent and was on track to reach 10 percent by 2010. Then the recession hit and stopped spending growth in its tracks.

Still, IT has to continue delivering services to the business — but it has to do so while spending much less money. A private cloud lets IT get immediate dividends through self-service, automated provisioning and improved system utilisation — all of which will have a big impact on IT operations costs, which chew up as much as 70 percent of a CIO's budget, according to PwC.

Vinod Baya, a director of the Centre for Technology and Innovation at PwC, observes, "IT has done great automating everyone but itself".

Now, he says, is the right time to start.
