The potentially costly answer means that almost any all-encompassing bureaucratic rule such as "We upgrade a third of our PCs every year" is a dereliction of management responsibility.
Such upgrade policies act as arbiters of conduct, settling the question in advance, and obviously make life easier for management. As for the manufacturers, I'm sure they'd love to convince even more of us that this type of policy is the best long-term strategy, especially since their strategic goals depend on us buying into the never-ending upgrade spiral. It should be equally obvious that such policies impose unnecessary financial burdens on the organisations paying the bills.
Fact: Technology, from hammers to CRM systems, doesn't lose capability over time. The notion that old PCs become obsolete is a myth of convenience, motivated by a desire to acquire the latest shiny thing.
The first PC I bought in the late 1980s (a North Star Horizon, a Z80A machine running CP/M and N*DOS, with 64KB of memory) still does what I bought it for. If I still had a desire to run a play-by-mail game, this machine could handle the task. True, a modern fire-breathing machine could do it better, at a fraction of the cost, but the old machine still works. It hasn't changed; my expectations of what is possible have evolved.
The issue of upgrades boils down to a single question: does the person using the technology need more functionality than they already have access to?
If they do, then upgrade what they're using ... provided you can afford it and the benefit outweighs the cost. With respect to personal computers, most of us, though not all, don't need more functionality or power -- we already have far more than we need on our desks.
This "upgrade" we're discussing does not necessarily refer to either hardware or software. It might refer to an upgrade to our "wetware", (or "training" if my reference was too oblique). Knowing how to use an application properly does far more to increase productivity than adding another gigahertz of raw computing power.
It all reduces to this question of need: do they "need" more functionality? Upgrading a third of your equipment each year just because a policy says so seems, to me at least, a waste of valuable resources.
The meaning of the word "need" will change slightly depending on the technology we're discussing, but with respect to PCs it's fairly uncomplicated. Is the machine, regardless of when it was bought, capable of handling the work we're currently demanding of it in a productive, cost-effective manner?
Of course, it takes time to ask that question of each of several hundred or several thousand PC users in an organisation, which is why we originally created annual replacement strategies.
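In fact, the question lends itself to automation. Here's a minimal sketch in Python of a need-based check; the utilisation fields, thresholds and sample data are all hypothetical, invented purely to illustrate the principle:

    # Hypothetical sketch: flag machines for replacement based on measured
    # demand rather than purchase date. All names and thresholds are invented.
    from dataclasses import dataclass

    @dataclass
    class Machine:
        user: str
        peak_cpu_pct: float   # highest sustained CPU utilisation observed
        peak_mem_pct: float   # highest sustained memory utilisation observed
        tasks_blocked: int    # tasks the user couldn't complete on this machine

    def needs_upgrade(m: Machine) -> bool:
        # A machine "needs" an upgrade only when current work exceeds it,
        # regardless of how old it is.
        return m.tasks_blocked > 0 or m.peak_cpu_pct > 90 or m.peak_mem_pct > 90

    fleet = [
        Machine("alice", peak_cpu_pct=35.0, peak_mem_pct=60.0, tasks_blocked=0),
        Machine("bob", peak_cpu_pct=95.0, peak_mem_pct=88.0, tasks_blocked=2),
    ]
    for m in fleet:
        print(m.user, "upgrade" if needs_upgrade(m) else "keep")

The particular thresholds don't matter; what matters is that the trigger is measured demand, not the calendar.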
The main objection to management policies of this type is that they quickly become accepted practice. They are rarely re-evaluated because they hide so well behind the cover of "We've always done it that way".
At some point -- for most users, I suspect, several years ago -- the technology becomes capable of handling 99% of the tasks we can throw at it. Upgrades become unnecessary, and regular maintenance becomes the new problem to solve. Referring back to the opening paragraph ... when did you last upgrade a hammer?
Once upon a time, replacement strategies like the ones we've been discussing were necessary. The technology available to us, both hardware and software, was under-featured, under-powered and overpriced. That's no longer true. I use a lot of different applications, from word processors to databases, from browsers to online surveys, and nothing I do even comes close to taxing the capability of my machine or the applications I bought years ago.
Except, of course, the latest games -- but that's a topic for a future article.