In the last 15 years computing has moved from being a preserve of specialists and large corporations to become an everyday tool for the small businessperson. This new market has had a massive impact on the technology, notably in the area of graphical user interfaces.
The Macintosh, which pioneered the idea of the graphical interface for the mass market, was two years old in 1986. Against a still considerable amount of scepticism, we tracked the evolution of the GUI through improved versions of the Mac and the then even newer parallel stream of Windows development.
That first version of Windows was underpowered and nothing like the popular environment it eventually became. Microsoft Windows 2 made rather more impact on its arrival in 1987. The first really popular version was Windows 3.0, released in 1990. It benefited from the improved graphics by then available on PCs, and from the 80386 processor, which allowed Windows to run two or more applications at the same time -- "true multitasking" on a PC.
This history of course glosses over the very first desktop GUI, Xerox PARC's Star in 1981, which occasioned this writer to write a prescient article (for another publication) headlined “The mouse crawls up the icon – is this the future of computing?”
If we were, Desert Island Discs-like, asked to point to the one development that has had the most effect on the use of computing over the past 15 years, most of us would almost without question point to the internet and email. Yes, it was around (just), even in New Zealand, in 1986, but chiefly as a tool for university researchers, mostly in the computer disciplines, to seek out and exchange source documents and research findings.
Computerworld covered the internet in that phase, reporting the development of the embryonic network and most of its succeeding stages. In that, we can say, we were ahead of some mainstream “futurists” like Lyall Watson in Supertrends (1987), where it does not even rate a mention.
The bare “net” still had very basic text-driven presentation and functions, notably in search, with clunky tools like Gopher and WAIS. The arrival of the World Wide Web in the early 1990s changed the whole picture and gave internet resources the “legs” to take them out into the world to businesses filled with comparative computer novices.
Once the internet and the web were well-established we predicted positive outcomes of the use of similar technology in-house and among partners in the form of intranets and extranets.
On the back of the web came the whole apparatus of e-commerce, and again Computerworld was there, predicting great things of it. We never quite went over the top (well, maybe once or twice) in predicting a dot-com revolution which would make entrepreneurs disgustingly rich; we retained a moderate and cautious attitude and helped our readers through the mire of different solutions as choices multiplied.
The distribution of work between the desktop, departmental processors and remote “big iron” (mainframes) has oscillated back and forth, from the days of dumb mainframe-attached terminals through the whole client-server phase, and back, in some measure to a simpler desktop, increasingly with a web-like interface.
We saw great promise in client-server and it has well repaid our, the industry’s and business’s faith in it. It has evolved in style from a simple two-level architecture to at least three (database, business rules, presentation).
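The three-level split described above can be sketched in miniature. This is a hypothetical illustration, not any particular product's design: the point is simply that data access, business rules and presentation live in separate layers, so each can change without disturbing the others.

```python
# A minimal sketch of three-tier client-server layering (all names hypothetical).

# --- Database tier: raw data access only, no policy ---
ACCOUNTS = {"A-100": 250.0, "A-200": 40.0}  # stand-in for a real database

def fetch_balance(account_id):
    """Return the stored balance for an account."""
    return ACCOUNTS[account_id]

# --- Business-rules tier: policy lives here, not in the UI or the database ---
OVERDRAFT_LIMIT = -100.0

def can_withdraw(account_id, amount):
    """Apply the overdraft rule to a requested withdrawal."""
    return fetch_balance(account_id) - amount >= OVERDRAFT_LIMIT

# --- Presentation tier: formats results for the user, holds no policy ---
def render_decision(account_id, amount):
    ok = can_withdraw(account_id, amount)
    verdict = "approved" if ok else "declined"
    return f"Withdraw {amount:.2f} from {account_id}: {verdict}"

print(render_decision("A-100", 300.0))  # within the overdraft limit
print(render_decision("A-200", 200.0))  # would breach it
```

In the two-tier style the overdraft rule would typically sit either in the client program or in the database itself; pulling it into a middle layer is what the three-tier refinement added.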
With the incursion of web-based systems, the complex client-server system has suffered some erosion, but it has a notable place in the evolution of IT, albeit perhaps a place in history.
The most memorable strike against the client-server framework was Oracle boss Larry Ellison’s network computer initiative.
Computerworld correctly foresaw the rise of the ultra-portable information device, the PDA, particularly the progressively more capable Palm models, now indispensable to many on-the-road businesspeople and IT workers.
At the ground level of network technology, Computerworld saw the potential for increases in the capacity of ethernet, originally devised in the 1970s, to bring 100Mbit/s and potentially gigabit speeds to the local area network. In 1989, we interviewed a 3Com man who predicted speeds which then seemed incredible.
Of course we also covered the rise of the LAN itself as a major tool of business, tracking developments both in the base-level technology and in software concepts like the virtual LAN, which extended its capability.
We saw the early glimmerings of storage area networks (SANs) too, and picked them as a potential remedy for the woes of having to continually pile more and more disk onto separate servers while predicting where the demand would come next. SANs have since shared the limelight with NAS (network-attached storage), but the basic principle of sharing storage which is independent of any particular server has become an important thread in the development of present-day IT shops.
In 1990 we talked to Alcatel about a technology that used existing copper phone connections for high-speed communications. The technique was called asymmetric digital subscriber line (ADSL). We pointed out its advantages in the face of sceptics as influential as Jack Matthews of Saturn (later TelstraSaturn) -- which still had its money on fibre-coax. Perhaps ADSL has not quite become the barnstormer we expected it to be in New Zealand, but it has established a comfortable niche as an alternative.
The building of data warehouses and the related discipline of data mining was another trend we accurately identified, with early studies of how the technique was being employed to draw useful information on trends out of daunting piles of data. Come to think of it, we could do with just those techniques in writing this article – but deadlines are looming and there’s no time to set a warehouse up.
On the mainstream database side and in program development, we covered the rise of object technology and the component approach to programming.
In the early 90s Computerworld talked to Sun about an obscure language called Java, developed to push small program routines over the internet so they could be executed on the desktop. We covered the subsequent trend as the applet was joined by the servlet. Java, Microsoft notwithstanding, has been a significant thread in the rise of the networked computer paradigm.
Over the years we commented on several trends in printer technology, covering in particular the emergence of laser printers. Though they seemed hideously expensive and slow at first appearance, they have since become a staple of the business market -- though the inkjet still rules in the SOHO market.
Anyone caring to predict what we might see hooked to computer systems over the next 15 years might recall IBM chairman Thomas Watson's reputed comment in 1943 that there was a world market for maybe five computers, or even a BBC television current affairs programme of the late 1950s, called Tonight. The compere interviewed a physicist who explained how light and electrons could be made to interact in strange ways inside ruby crystals and produce a concentrated light beam with all its wavelengths in step. This "laser" phenomenon was an interesting discovery, he said. “Of course, we have no idea whether it will be of any practical use.”