We have self-policing languages like Java, which automatically handles garbage collection, guards against buffer overruns and more. Combine these features with Java's exception handling and you end up with the equivalent of a runtime debugger. All programmers have to do to exploit this capability is make it a habit to catch exceptions and report the errors. Mix in a beta program to expose those errors and you're likely to end up with a program that is relatively bug-free and, just as important, resistant to malicious attacks.
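To make the habit concrete, here is a minimal sketch in Java of catching an exception and reporting it rather than letting the program die. The class name, method and fallback value are illustrative assumptions; a real application would route these reports back to the developers rather than just logging them locally.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class ErrorReporting {
    private static final Logger LOG = Logger.getLogger("app");

    // Parse a port number from user input; on bad input, report the
    // error with context and fall back to a default instead of crashing.
    static int parsePort(String value) {
        try {
            return Integer.parseInt(value);
        } catch (NumberFormatException e) {
            // The report captures what went wrong and with what data.
            LOG.log(Level.WARNING, "Bad port value: " + value, e);
            return 8080; // hypothetical default for this sketch
        }
    }

    public static void main(String[] args) {
        System.out.println(parsePort("80"));   // valid input
        System.out.println(parsePort("oops")); // logged, falls back
    }
}
```

The point is not the fallback value but the catch-and-report discipline: every exception that reaches a boundary like this becomes a bug report instead of a crash.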
None of this is rocket science, and none of it is a particularly expensive addition to the development cycle.
If you don't happen to like Java, that's no excuse. There are plenty of other languages and runtime environments with similar features.
If you have no choice but to use error-prone languages such as C or C++, there are plenty of development tools for these languages that will sniff out buffer overruns, pointer errors and other common programming mistakes behind most application bugs. And, of course, give your applications to the most brain-dead users you can find in order to test, test and test your applications again.
That brings us to the one remaining obstacle to stable client software, the unpleasant problem nobody likes to address. I'll give you a tip on how to track it down. Sit down at a Nintendo GameCube or a Sony PlayStation 2 and play some games from start to finish. Then do the same on a PC. Chances are, you finished the console games without encountering any quirks, bugs or game crashes. At most, you might have been able to exploit a programming bug to cheat at the games.
In sharp contrast, you probably encountered your first problem with the PC games when the installer complained that your version of DirectX was out of date. (DirectX is the Microsoft graphics API designed mostly for PC games.)
Assuming you had enough CPU horsepower and memory to make the game enjoyable once it was installed, the game probably crashed at least once, if not several times, before you were done.
Console games are more stable because a game console is a highly predictable platform with a stable API. If you can find any differences between the hardware or software in two PlayStations or GameCubes, the differences will be subtle and unlikely to affect the way a program behaves.
Pick any two PCs, however, and they are likely to have radically different display cards and drivers, different DirectX APIs or different versions of the operating system. They probably won't even have the same chip sets on the motherboard.
Replace the PC with a console, that is, a network appliance or network computer, and you create a predictable platform for software developers, which should result in much more stable software, not to mention more secure software. Network computing fizzled for a number of reasons the first time around, some good, some bad. For one thing, once Larry Ellison's low price tag was imprinted on everyone's brain, there was no way to build a network computer fast enough to run Java well, or to sell one at a profit. One very bad reason network computing failed is that we have such an irrational love affair with the PC that we tolerate its unstable and insecure design.
I think it's been long enough since the network computer's initial failure that we can revive and rethink the concept. There may be no other way to recover some of that $US60 billion we lose on bugs each year.