At the recent Vista rollout, Steve Ballmer reportedly remarked that the internet “wasn’t fully developed” in 1995. That sparked an interesting dialogue among many of the Internet Engineering Task Force luminaries, who had researched and developed the internet in the 1980s and early 1990s. Their point: since most of the key specs were in place by 1995, Ballmer’s comment was inaccurate.
Specifically, by 1995 TCP/IP, routing protocols, network address translation and congestion management had already been devised, and even HTTP already existed. (Believe it or not, even variants of VoIP were up and running in test labs at that early date.) Of course, there have been major incremental advances since then: MPLS, firewalls, and numerous enhancements and optimisations to enable security and the like.
But that raises some interesting questions: what does it mean for a technology to be developed? In other words, where does research stop and development begin?
Some people would say that research involves innovation, while development is just incremental improvement. But I don’t think that captures the whole story. Take the iPod: Apple didn’t invent MP3 players, or even the audio encoding algorithms used by the iPod. (Interesting factoid: the audio compression used in the iPod was actually co-developed at AT&T.)
But that’s not to say that the iPod lacks innovation. What Apple pioneered were things like the colourful plastic case, the circular multifunction scroll wheel, and most of all the iTunes framework for ripping and organising CDs. All of these were highly imaginative creations that jump-started the market for digital audio and video.

The key point here is that research and development are different things — and both involve innovation. I’d say that research involves “fundamental” innovation, which applies to a broad range of industries: the invention of the microprocessor and the internal combustion engine, or the discovery of DNA. Cars, cup holders and graphical user interfaces are all development innovations — creative ideas by clever engineers that make those broad advances more useful to the people who need them.
By that standard, Ballmer’s correct. Although the fundamental research of the internet was already largely complete by 1995, the development had barely begun. Internet applications were just beginning to emerge (the browser was still brand-new), and ideas like YouTube, AJAX and peer-to-peer file distribution were far off in the future.
But it’s worth keeping in mind that all these developments relied on the fundamental research innovation that came before. Without research breakthroughs, the engineering innovators would have nothing to innovate from. I had an MP3 player before I had an iPod — but without a microprocessor, not only is there no MP3 player, there’s no computer (or these days, no phone, car or microwave oven).
The problem is that research innovations are less sexy and less visible than their developmental counterparts — the invention of TCP/IP didn’t turn heads, but the Netscape browser sure did.
So my hat’s off to the folks whose research innovations created the internet — and to the folks who are off creating the next generation of research ideas and innovations.
Keep up the good work!