The IT department's days are numbered, due to a shift to utility computing. So predicts Nicholas Carr in his new book, The Big Switch: Rewiring the World from Edison to Google.
Carr is best known for a provocative article, "IT Doesn't Matter", published in the Harvard Business Review in 2003. The article asserts that IT investments don't provide companies with strategic advantages, because when one company adopts a new technology, its competitors quickly do the same.
The article made Carr the sworn enemy of many hardware and software vendors, as well as of CIOs and other IT professionals.
With his new book, Carr is likely to engender even more wrath among CIOs and other IT pros.
"In the long run, the IT department is unlikely to survive, at least not in its familiar form," Carr writes. "It will have little left to do once the bulk of business computing shifts out of private datacentres and into the cloud. Business units and even individual employees will be able to control the processing of information directly, without the need for legions of technical people."
Carr's rationale is that utility computing companies will replace corporate IT departments in the same way that electricity utilities replaced company-run power plants in the early 1900s.
Carr explains that factory owners originally operated their own power plants. But as electric utilities became more reliable and offered better economies of scale, companies stopped running their own electric generators and instead outsourced that critical function to electricity utilities.
Carr predicts that the same shift will happen with utility computing. He admits that utility computing companies need to make improvements in security, reliability and efficiency, but he argues that the internet, combined with computer hardware and software that has become commoditised, will enable the utility computing model to replace today's client/server model.
"It has always been understood that, in theory, computing power, like electric power, could be provided over a grid from large-scale utilities — and that such centralised dynamos would be able to operate much more efficiently and flexibly than scattered, private datacentres," Carr writes.
He cites several drivers for the move to utility computing. One is that computers, storage systems, networking gear and most widely used applications have become commodities.
He says even IT professionals are indistinguishable from one company to the next. "Most perform routine maintenance chores — exactly the same tasks that their counterparts in other companies carry out," he says.
Carr points out that most datacentres have excess capacity, with utilisation ranging from 25% to 50%. Another driver to utility computing is the huge amount of electricity consumed by datacentres, which can use 100 times more energy than other commercial office buildings.
"The replication of tens of thousands of independent datacentres, all using similar hardware, running similar software, and employing similar kinds of workers, has imposed severe economic penalties on the economy," he writes. "It has led to the overbuilding of IT assets in every sector of the economy, dampening the productivity gains that can spring from computer automation."
Carr embraces Google as the leader in utility computing. He says it runs the largest and most sophisticated datacentres on the planet, and is using them to provide services such as Google Apps that compete directly with traditional client/server software from vendors such as Microsoft.
"If companies can rely on central stations like Google's to fulfill all or most of their computing requirements, they'll be able to slash the money they spend on their own hardware and software — and all the dollars saved are ones that would have gone into the coffers of Microsoft and the other tech giants," Carr says.
Other IT companies that Carr highlights in the book for their innovative approaches to utility computing are: Salesforce.com, which provides CRM software as a service; Amazon, which offers utility computing services called Simple Storage Service (S3) and Elastic Compute Cloud (EC2) built on its excess capacity; Savvis, which is a leader in automating the deployment of IT;
and 3Tera, which sells a software program called AppLogic that automates the creation and management of complex corporate systems.
Carr points out that many leading software and hardware companies — Microsoft, Oracle, SAP, IBM, HP, Sun and EMC — are adapting their client/server products to the utility age.
"Some of the old-line companies will succeed in making the switch to the new model of computing; others will fail," Carr writes. "But all of them would be wise to study the examples of General Electric and Westinghouse. A hundred years ago, both these companies were making a lot of money selling electricity-production components and systems to individual companies. That business disappeared as big utilities took over electricity supply. But GE and Westinghouse were able to reinvent themselves."
Carr offers a grimmer future for IT professionals. He envisions a utility computing era where "managing an entire corporate computing operation would require just one person sitting at a PC and issuing simple commands over the internet to a distant utility."
He refers not only to the demise of the PC, which he says will be a museum piece in 20 years, but also to the demise of the software programmer, whose time, he argues, has come to an end.
Carr gives several examples of successful internet companies, including YouTube, Craigslist, Skype and Plenty of Fish, that run their operations with minimal IT staff. YouTube had just 60 employees when it was bought by Google in 2006 for US$1.65 billion (NZ$2.09 billion). Craigslist has a staff of 22 to run a website with billions of pages of content, while internet telephony vendor Skype supports 53 million customers with only 200 employees. Meanwhile, internet dating site Plenty of Fish is a one-man shop.
"Given the economic advantages of online firms — advantages that will grow as the maturation of utility computing drives the costs of data processing and communication even lower — traditional firms may have no choice but to refashion their own businesses along similar lines, firing many millions of employees in the process," Carr says.
IT professionals aren't the only ones facing demise in Carr's view. He saves his most dire predictions for the fate of journalists.
"As user-generated content continues to be commercialised, it seems likely that the largest threat posed by social production won't be to big corporations but to individual professionals — to the journalists, editors, photographers, researchers, analysts, librarians and other information workers who can be replaced by... people not on the payroll."
Carr's argument about the future of utility computing is logical and well written. He offers a solid comparison between the evolution of electricity utilities in the early 1900s and the development of utility computing.
Carr's later chapters — about the future of artificial intelligence and the many downsides of the internet — seem less integral to his utility computing argument. And his discussion of Google's vision of a direct link between the brain and the internet seems far-fetched.
Nonetheless, The Big Switch is a recommended read for any up-and-coming IT professional looking to make a career out of providing computing services. If Carr's predictions come true, strong technical skills will still be valued by service providers.
*A rebuttal of Carr's views by IT professionals will appear on this site next week