They might use different buzzwords — adaptive, organic, on-demand — but just about every management, systems, server and storage vendor is focused on the same thing: bringing enterprise companies to data centre nirvana.
In this glorious state, generally known as utility computing, hardware would never be over-provisioned, problems would be solved proactively rather than reactively, and rolling out new applications, provisioning servers and supporting users would require minimal labour. Utility computing products run the gamut from server and storage virtualization, vulnerability assessment, middleware integration and business process modeling to the umbrella of automated management. And vendors pitching the idea are as varied as IBM, Microsoft and Veritas Software.
But utility computing products are still immature, with no suite from a single vendor capable of delivering on the overall promise. Industry watchers don't expect to see data centres running entirely on the utility computing model for at least three to five years, and some say seven to 10 years is a more realistic expectation.
That leaves users to toy around with early products that provide some utility computing functions while closely watching developments and wondering if everything eventually will gel, as vendors promise it will. For example, at WeightWatchers.com, IT is automating server management and provisioning, but isn't adopting a utility computing model outright, says Mark McNamara, IT director at the New York company. "All the utility computing hype from the big hardware vendors is focused on virtualization. But the biggest challenge is managing the variety, complexity and frequency of change inherent to a multivendor infrastructure," he says.
Toward that end, WeightWatchers.com six months ago rolled out BladeLogic's Operations Manager software to track configurations and changes across about 300 Compaq/Hewlett-Packard servers in three data centres. "We need the ability to make any type of change across multiple platforms (by using administrators with) various skill levels, and we need a smooth migration path to get there," he says.
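The kind of cross-platform change tracking McNamara describes boils down to recording a baseline for every server and flagging anything that drifts from it. The sketch below illustrates that idea in Python; the server names, configuration fields and functions are hypothetical, not BladeLogic's actual API.

```python
# Hypothetical sketch of multi-server configuration tracking, in the
# spirit of tools like BladeLogic Operations Manager. All names and
# fields here are illustrative, not a real product interface.
import hashlib
import json

def config_fingerprint(config: dict) -> str:
    """Hash a server's configuration so any change is easy to detect."""
    canonical = json.dumps(config, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def detect_drift(baseline: dict, current: dict) -> list:
    """Return the servers whose configuration no longer matches the
    recorded baseline fingerprint."""
    drifted = []
    for server, config in current.items():
        if config_fingerprint(config) != baseline.get(server):
            drifted.append(server)
    return drifted

# Record a baseline across a small fleet, then change one server.
fleet = {
    "web-01": {"os": "hp-ux", "patch_level": 12},
    "web-02": {"os": "hp-ux", "patch_level": 12},
}
baseline = {name: config_fingerprint(cfg) for name, cfg in fleet.items()}
fleet["web-02"]["patch_level"] = 13  # an unplanned change
print(detect_drift(baseline, fleet))  # → ['web-02']
```

Hashing a canonical serialization, rather than comparing raw files, is what lets one console watch hundreds of servers cheaply, regardless of platform.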
Four tenets of utility computing
McNamara touches on utility computing's must-haves. Vendors, analysts and corporate IT managers mostly agree that utility computing requires management and other technologies that centralise, integrate, virtualize and automate the IT infrastructure and the applications running on it. For enterprise data centre managers, this means rethinking the traditional approach to data centre management.
Adopting a utility computing model would require corporate IT departments to eliminate the network and systems management silos and standardise on integrated, automated management tools. They would need to bring in technologies that can provide centralized management, seamless integration across platforms, virtualised pools of network, server and storage resources, and intelligent automation that eliminates the need for hands-on management.
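The virtualization and automation tenets above amount to treating servers as a shared pool that applications draw from on demand rather than own outright. This is a minimal sketch of that idea, assuming a single central pool; the class and policy are hypothetical, not any vendor's product.

```python
# Minimal sketch of a virtualised server pool with automated
# provisioning, illustrating the centralise/virtualize/automate
# tenets. All names and policies here are hypothetical.
class ServerPool:
    """A central pool that hands out capacity on demand instead of
    tying hardware to one application, avoiding over-provisioning."""

    def __init__(self, total_servers: int):
        self.free = list(range(total_servers))
        self.allocations = {}  # app name -> list of server ids

    def provision(self, app: str, count: int) -> list:
        """Automatically assign free servers to an application."""
        if count > len(self.free):
            raise RuntimeError("pool exhausted: add capacity")
        servers = [self.free.pop() for _ in range(count)]
        self.allocations.setdefault(app, []).extend(servers)
        return servers

    def release(self, app: str) -> None:
        """Return an application's servers to the shared pool."""
        self.free.extend(self.allocations.pop(app, []))

pool = ServerPool(total_servers=10)
pool.provision("billing", 4)
pool.provision("web", 3)
pool.release("billing")  # capacity flows back for the next workload
print(len(pool.free))    # → 7
```

The point of the pool abstraction is that capacity released by one application is immediately available to another, which is exactly the efficiency gain the utility model promises.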
Considering all that utility computing entails, it is no surprise that systems management heavyweights HP and IBM offer the most comprehensive packages. Big Blue, which has beaten the utility computing drum since announcing its eLiza computing initiative in 2001, offers on-demand features in its Tivoli management software and WebSphere business integration middleware, across its database and server lines, and through its outsourcing arm, IBM Global Services.
Gartner and Forrester Research put HP in the lead in the race for utility computing, with its Adaptive Enterprise strategy, Utility Data Center services and OpenView management software products.
HP and IBM, and fellow server vendors Sun Microsystems (through its N1 program) and Microsoft (with its Dynamic Systems Initiative), address utility computing through hardware and software.
They contrast with companies such as BMC Software, Computer Associates International and Micromuse, which take software-only approaches. Each vendor says it can manage an application based on its use of network, server and storage resources, and provide automated features and real-time business service impact analysis.
But the utility computing vision wasn't doable until newcomers such as BladeLogic and Opsware emerged about two years ago with automated data centre provisioning. In providing automated server and device provisioning, and automated application rollback features, these companies filled a gap left by management products that didn't address change and configuration management across corporate data centres. That made the newcomers ripe for acquisition by bigger vendors racing to provide all-encompassing utility computing lines. IBM, Sun and Veritas, for example, all snatched up start-ups with automation technology.
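Automated rollback, one of the gaps those start-ups filled, rests on a simple mechanism: keep every application's deployment history so a bad rollout can be reverted without manual intervention. The sketch below illustrates the idea; the class, application name and version labels are hypothetical.

```python
# Hypothetical sketch of automated application rollback, the kind of
# change-management gap the provisioning start-ups filled. Names and
# versions here are illustrative only.
class DeploymentHistory:
    """Track each application's deployed versions so a failed rollout
    can be reverted automatically."""

    def __init__(self):
        self.history = {}  # app -> list of deployed versions, oldest first

    def deploy(self, app: str, version: str) -> None:
        self.history.setdefault(app, []).append(version)

    def rollback(self, app: str) -> str:
        """Discard the current version and return the previous one."""
        versions = self.history.get(app, [])
        if len(versions) < 2:
            raise RuntimeError(f"no earlier version of {app} to roll back to")
        versions.pop()  # drop the failed rollout
        return versions[-1]

deploys = DeploymentHistory()
deploys.deploy("storefront", "1.0")
deploys.deploy("storefront", "1.1")    # suppose this release misbehaves
print(deploys.rollback("storefront"))  # → 1.0
```

Because the history is recorded at deploy time, rollback needs no operator judgment at 3 a.m., which is what makes it an automation feature rather than a runbook step.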
The implementation challenge
Data centre managers who want to benefit from the incremental integration, virtualization and automation features of utility computing should expect to have to work at it.
Such has been the case at EFW, a full-service electronics supplier using CA's Unicenter software to manage an on-demand network. EFW has had to document processes, build applications in customer-friendly formats and deploy some hardware, says Harry Butler, support centre manager at the Fort Worth, Texas, company. Butler chose the on-demand route when faced with managing the infrastructure of a fast-growing company (120% growth in the past three years) with a staff of the same size.
"On-demand computing is a challenge to set up, but it's very rewarding," Butler says. "It lets you provide more support with less resources and still maintain a high customer satisfaction level."