The temptation is to say that this is just another computing pipe dream. But utility computing’s boosters describe it differently: it’s literally a dream based on pipes. As pipes – or telecommunications links – have fallen in price, the cost of pouring computing resources down them has fallen too – to the extent that one local utility computing company says it can be cheaper for a user at one end of the country to farm out its computer infrastructure to a provider at the other end than to run its own systems.
Infrastructure, it points out, is not a source of competitive advantage. But maintaining a labour force to manage the infrastructure is a significant cost, which can be eliminated by paying a computing utility to provide processing capability.
I’ve been hearing the same theme from a variety of angles. Veritas, the storage software specialist, approaches utility computing from the data backup point of view. It has released software that lets very large organisations keep tabs on storage use on a departmental basis, so that CIOs can charge departments back for the resources they actually consume. The application to utility computing is direct: the same mechanism provides a basis for billing for storage use.
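The chargeback idea is simple enough to sketch in a few lines. The snippet below is a minimal illustration, not Veritas’s actual software: the department names and the per-gigabyte rate are invented for the example.

```python
# Hypothetical departmental storage chargeback: meter gigabytes consumed
# per department and bill each at a flat rate. All figures are invented.

RATE_PER_GB = 0.50  # assumed monthly charge per gigabyte, for illustration

usage_gb = {
    "finance": 120.0,
    "engineering": 480.5,
    "marketing": 75.0,
}

def chargeback(usage, rate):
    """Return per-department bills and the total charged."""
    bills = {dept: round(gb * rate, 2) for dept, gb in usage.items()}
    return bills, round(sum(bills.values()), 2)

bills, total = chargeback(usage_gb, RATE_PER_GB)
for dept, amount in sorted(bills.items()):
    print(f"{dept}: ${amount:.2f}")
print(f"total: ${total:.2f}")
```

A real metering system would of course sample actual disk consumption and apply tiered rates, but the billing step reduces to this kind of usage-times-rate calculation.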
The key words in the Veritas pitch seemed to be “very large” organisations, of the kind that are relatively rare in New Zealand. This suggests to me that the utility computing talked about by US vendors for US audiences is a different phenomenon from that described by our local self-styled computing utility.
Analyst IDC has a definition that seems to allow for both. According to IDC, utility computing is the ability to seamlessly access computing capacity on demand across geography, application and operating system. Private utilities refer to the ability to provide utility computing capability within an enterprise or to a closely-knit group of enterprises, while public utilities refer to the ability to provide computing capacity to any customer at any time independent of customer affiliation. The technology framework that will enable utility computing encompasses modular technology, provisioning, throughput computing, partitioning, virtualisation, management and integration.
Citrix, best known for its Metaframe thin-client software, seems slightly more in tune with local conditions than Veritas when it talks about connecting desktops to a hodgepodge of back-end systems to provide on-demand computing. Citrix Australia and New Zealand boss Gary O’Brien notes that utility computing is the latest in a succession of schemes for delivering computing, from bureaus to client-server architectures to LANs, and that each new wave has failed to sweep all else away. That means back-end systems have become progressively more complicated and messy.
Our local computing utility is mindful of the same historical progression. If utility computing sounds remarkably like the bureau-based services of old, the difference is that this time Citrix gives it a graphical interface. And it is a viable option for small organisations.
The disconcerting question is what it will do to IT departments.