I’ve never considered myself a technology Luddite but now I’m not so sure. My self-doubt started as I looked back over 10 years of computing.
One of the defining points of this past decade was the so-called paradigm shift to client-server computing. The term “client-server” started to infiltrate Computerworld’s pages around 1990, not long after open systems had hit the big time. As often happens with something new and a bit conceptual, a lot of the early stories were about definitions. Just what was “client-server”?
The answers vendors gave make amusing reading today. They look more self-serving than client-serving. It seemed like every supplier was suddenly doing client-server, often with the same products they’d been selling the year before. It was all in the packaging, initially at least.
There was a change, too, in the companies and issues that came to dominate the headlines. Prior to that shift, a pronouncement from Bill Gates would have been buried deep inside the news pages of Computerworld; afterwards, it led them.
So, mired though we were in recession, it felt like a period of real optimism had begun. Here at last was the long-awaited way forward which made sense of it all. It gave the stamp of corporate respectability to the troublesome PC. Just as importantly, it offered a glimmer of a future to the shaky old guard which had been struggling to find a new identity. Their business was now about servers and services and they could co-exist with the PC rather than compete with it.
Whether it was a case of vendors pushing or users pulling, the IT world seemed enamoured of client-server. Users hesitantly, slowly, started to plan for it in their futures. And the trickle, as they say, became a flood.
We’ve been living through this golden age for a few years now, and 10 years of history gives us a “before” and “after” picture.
And that’s where my problem started.
I had these two pictures in my head but I found myself thinking that the “after” picture didn’t look much better than the “before”. What really important things are we doing today with client-server that we weren’t doing before? How much extra productivity have we delivered? And how much more cost-effective is it? After all, it’s worth reminding ourselves that part of the early euphoria of microprocessor technology was that it would save real money over the old systems.
I believe now that we’ve been deceived and the spoiler in this idyllic picture has been the true cost of PC ownership.
It’s an insidious cost that’s often remained buried but it’s increasingly coming to the fore. And it goes beyond the hardware and software to the enormous labour costs, both overt and hidden. It has meant we’ve shifted costs, even added them, rather than replaced them, and it’s robbed users of any real gains that might have been delivered by the microprocessor revolution.
As if that’s not bad enough, we’re seeing new applications that are more and more marginal (the easy ones were picked off long ago), so it threatens to become downright unproductive.
A few months back, I read a report in which an MIS manager lamented the fact he’d been forced to throw out hundreds of cheap, simple terminals and replace them with PCs. The reason was that development of his old software package had been phased out in favour of a slick client-server model.
Most of his users, on the factory floor, in the stores, at the customer services desk, in the accounts department, didn’t need PCs and were positively handicapped by having to learn them. The new system was no doubt a great benefit to a handful of people in terms of flexible reporting and set-up. More importantly, it looked great in the demo, so when the crunch came and one system had to go, the software developer clearly wasn’t going to drop it.
In this drive to “flexibility, power and user-friendliness” that the new paradigm claims to offer, we’ve lost something along the way.
I suspect the transition to the current crop of office applications is similar. We’ve just witnessed how slowly corporate New Zealand has taken to the next generation of Windows technology. The real cost is huge when you factor in hardware, training and support (including the informal support networks of “office experts” that inevitably pop up). But the productivity gains aren’t so obvious.
While the eventual move is inevitable, I’m sure many users are upgrading because--like the manufacturer in the story above--they see the end of their “old” systems rather than genuine, and cost-justifiable, productivity gains from the new generation. Typically, too, where new features are useful to a handful of users, the gains are lost by the need to upgrade all users so those few can take advantage of them.
As a PC user (indeed, enthusiast), I’m getting less and less incremental benefit from each upgrade. As a CEO, I’m seeing a diminishing justification and am much more aware of the real costs of those upgrades. I’m becoming a technology Luddite.
I’d get a bigger productivity boost from a slightly smarter email package than I would from more features in my top-of-the-line word processor or spreadsheet. Most of my staff would too. Ten years of history suggests to me that, in reality, that point was probably reached some time ago.
So when the costs start to outweigh the benefits, when expensive upgrades are forced by the technology itself rather than by genuine productivity gains, and when we find ourselves getting pushed into marginal applications for which PCs aren’t quite suited, it’s time to sit back and take stock. Sounds like time for another paradigm shift.
I’m not predicting the death of the PC. That would be like predicting the death of the mainframe (and yes, we’ve run a few of those predictions over the last 10 years). It hasn’t happened. It won’t happen. The PC has a long and successful life ahead of it. But many new applications are about presentation and communication, not processing. They need usability and a cost of ownership that’s closer to a TV and telephone than a computer. They need different shapes, sizes, features, functions.
A PC on every desk and a PC in every home won’t happen. And that’s good. Let PCs do only what they do well. We’ve reached a point where users--whether in the office or the home--are making too many compromises, living with too many shortcomings, seeing too little return in exchange for the high real costs.
I now understand how James Gosling felt when he despaired at the direction the computing world was taking. He and his small team at Sun went bush for a couple of years to change it, and the product they eventually came up with, Java, is on the verge of shaking the very foundations of computing.
It promises choice, variety, easy communication, the chance to build the right tool for each job. And as computers drive into every corner of our work and private lives, those qualities are essential.
Interestingly, Java is subject to many of the same criticisms that were levelled against the PC years ago. It’s slow, under-powered, limited. But good technology has a habit of maturing and each improvement opens a few more doors. If there’s one thing a look back shows us, it’s that big changes take time to happen.
I’ve seen the past and it’s still a half-met promise. The PC has got us off to a start, but the promise of client-server, with its plethora of intelligent communicating terminals, won’t be delivered by a one-size-fits-all window into this world.
We’re in for another rocky ride, but maybe this one will deliver, to businesses and wired homes alike, the other half of the client-server promise: the real benefits, not just the costs.