I’m thinking of how the Mono Project has diligently worked on open-sourcing Microsoft’s .NET Framework, so that you can now code in C# and Visual Basic and run apps written in those languages on platforms other than Windows. (Mono works on Linux, Solaris and FreeBSD, as well as Windows.)
Looking elsewhere, the Apache Software Foundation is an umbrella organisation for several up-and-coming technologies, such as web services and their adjunct standards: WSIF, WSIL, XML and SOAP.
The Mozilla project is home to several emerging internet technologies that we’ll see more of in the future -- all open source.
Linux kernel hacker and Red Hat employee Alan Cox points out that open source is the leader in areas such as cluster computing -- think of the large render farms used by the movie industry, or the number-crunching clusters used in fields such as oil exploration.
In many ways, the open source movement is a bridge between yesterday’s and tomorrow’s technologies. Just look at how the open source Darwin forms the base of Apple’s Mac OS X: a mature operating system beneath some very new and radical ideas in GUI development.
What’s good about open source is that it provides an easy and low-cost entry to a large number of emerging technologies. By sharing and contributing, you can be part of a technological vision rather than merely following it (which, with proprietary technology, you can only do if you have the money).
In fact, if an organisation wants its latest and greatest technology to become a de facto standard, the quickest way to gain acceptance is to open-source it, and be prepared to take part in the greater community of developers. But there are some problems here that need to be solved.
When it comes to hardware, the picture is a little different. Open source software is often used to extend the usable life of older hardware -- like installing a Linux distribution on a Pentium PC and using it as a network appliance.
However, hardware designers and manufacturers can be wary of releasing the exact specifications for their devices, as they worry about cloners taking advantage of this and releasing cheap knock-offs. If your company’s just spent millions of dollars creating a new technology, you tend to be suspicious of anyone wanting to peek at the details.
Open source developers cannot, for obvious reasons, sign non-disclosure agreements, so often there’s a lag between new hardware being released and open source drivers becoming available for it.
This has led to companies like Nvidia releasing Linux and FreeBSD drivers for its products with the crucial parts shipped only in binary form, without source code. While this enables users of some open source operating systems to run Nvidia’s cards to their full potential, it annoys some of the more politicised members of the movement (hence the "tainted" message from the Linux kernel that pops up if you load the Nvidia kernel driver).
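That "tainted" state is more than a message: the kernel records it in a bitmask that userspace can read from /proc/sys/kernel/tainted, with bit 0 set once a proprietary (non-GPL-compatible) module such as Nvidia's is loaded. A minimal sketch of decoding that flag -- the helper function name here is my own, not a standard API:

```python
def is_proprietary_tainted(taint_mask: int) -> bool:
    """Return True if bit 0 of the kernel taint bitmask is set.

    The Linux kernel sets this bit to record that a proprietary
    (non-GPL-compatible) module has been loaded.
    """
    return bool(taint_mask & 1)

# On a running Linux system the current mask can be read like so:
#   with open("/proc/sys/kernel/tainted") as f:
#       mask = int(f.read())
#   print(is_proprietary_tainted(mask))
```

A mask of 0 means an untainted kernel; any value with bit 0 set (1, 513, and so on) indicates a proprietary module has been loaded at some point since boot.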
On a practical level, not having access to the Nvidia source code makes it more difficult for companies like Red Hat to support their users, because they are not told the intimate details of the driver and how it interacts with the Linux kernel.
The topic of whether or not binary modules and drivers are acceptable regularly leads to long and intense flame wars on the Linux kernel mailing list and elsewhere, and the opposing sides are still far apart. The purists suggest boycotting hardware that doesn’t come with open source drivers. The pragmatists say that such a move won’t have any effect until Linux and open source software hit a critical mass in terms of market share -- which will be difficult to achieve without broad support for new hardware.