Bacteria-based storage systems can preserve data for thousands of years and even protect it from a nuclear explosion. Atoms can hold 250 terabits of data per square inch of surface area. Organic thin-film structures can survive more than 20,000 write-read-rewrite cycles.
It sounds like science fiction, but it's not far from being science fact: research from two prominent institutions indicates that it's possible to store digital data in the genome of a living organism and retrieve it hundreds or even thousands of years later, after the organism has reproduced its genetic material through hundreds of generations.
“Consider [that] a millilitre of liquid can contain up to a billion bacteria, and you can see that the potential capacity of bacteria-based memory is enormous,” Pak Wong, Pacific Northwest National Laboratory (PNNL) lead scientist, noted in a 2003 paper.
In the paper, Wong and a group of PNNL researchers described an experiment three years earlier in which they stored about 100 base pairs of digital information (roughly one encoded English sentence) in one bacterium.
Earlier this year, scientists at Japan’s Keio University Institute for Advanced Biosciences reported similar results in their research, claiming that they successfully encoded “e= mc2 1905!” (Einstein’s mass-energy equation and the year he published it) on the common soil bacterium Bacillus subtilis. (This was reported in Computerworld last month.)
According to the scientists, DNA-based data can also be passed down through generations, making it suitable for long-term preservation of large data files.
One of the challenges faced by Wong’s group was providing a safe haven for DNA molecules, which are easily destroyed in any open environment, whether by people or by other hazards.
Mindful of DNA’s fragility, the PNNL scientists provided a living host for the DNA that tolerates the addition of artificial gene sequences and survives extreme environmental conditions. It was essential that the host with the embedded information be able to grow and multiply, says Wong.
Perhaps the biggest challenge faced by the researchers was retrieving embedded messages. “The retrieval of the information stored in a bacterium remains a wet-laboratory process that requires a certain amount of time and effort to accomplish. It took us about two hours in 2000 to complete the information extraction process,” says Wong, adding that it will take decades to develop data-retrieval techniques similar to those of today’s commercial IT systems.
Most of the potential applications for DNA-based data storage relate to the core missions of the US Department of Energy (DoE), which funded all of Wong’s work. Other security-related applications include information-hiding and data steganography — the hiding of data inside other data — for commercial products, as well as those related to national security. One possible application is using DNA-based storage to keep copies of data that may be destroyed in a nuclear explosion, especially if internet infrastructure is taken out by the blast.
Speaking of things nuclear, the power of the atom could also be used for good and harnessed as a storage medium.
According to University of Wisconsin professor Franz Himpsel, in 1959 American physicist Richard Feynman gave a visionary talk titled “There’s Plenty of Room at the Bottom”, in which he asked whether it would be possible to shrink devices all the way down to the atomic level. At the time, he predicted that all printed information accumulated over the centuries since the Gutenberg Bible could someday be stored in a cube of material 1/200 of an inch wide, barely visible to the naked eye. Feynman believed that the ultimate storage medium would store a bit in a single atom, with a few atomic spaces between bits to prevent them from coupling.
In 2002, a two-dimensional version of Feynman’s atomic memory was created by depositing a small amount of gold on a silicon surface, which triggered the formation of self-assembled atomic tracks. The result looks similar to a CD-ROM, but the scale is nanometres rather than micrometres; because each dimension shrinks a thousandfold, the storage density, based on storing one bit of data on a single atom, is a million times higher.
“The minimum empty area required around each bit is five by four atoms, four atoms from one track to the next,” Himpsel notes. “Feynman’s 1959 suggestion of spacing the bits five atoms apart was right on the mark.”
Unlike bacteria-based storage, atomic storage is easy to access. Reading the memory consists of a simple line scan with a scanning tunneling microscope along the self-assembled tracks. There is no need to search in two dimensions for the location of a bit. The signal is highly predictable, since all the atoms have the same shape and sit on well-defined lattice sites.
Writing data, however, is more difficult — and time-consuming. Even though the storage density is 250 terabits per square inch, the data rate is extremely low. As the size of a bit shrinks, less energy can be extracted from it during readout. Therefore, a longer integration time is required for obtaining an acceptable signal-to-noise level. Even the theoretical limit of the data rate with the best possible readout electronics is still far lower than what hard disks achieve today. It is so slow that it would take about a million years to write a square centimetre of data.
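The quoted density can be sanity-checked from the bit geometry Himpsel describes. Assuming an atomic spacing of about 0.384 nm (a typical value for atom rows on a silicon surface; the article does not state the spacing), 20 atomic sites per bit gives a figure of the same order as 250 terabits per square inch:

```python
# Back-of-envelope check of the ~250 Tbit per square inch figure.
# The 0.384 nm atom spacing is an assumed typical value for atom
# rows on a silicon surface, not a number quoted in the article.

ATOM_SPACING_NM = 0.384          # assumed atomic spacing (nm)
ATOMS_PER_BIT = 5 * 4            # five-by-four atomic sites per bit

area_per_bit_nm2 = ATOMS_PER_BIT * ATOM_SPACING_NM ** 2
inch_in_nm = 2.54e7              # 1 inch = 2.54e7 nm
bits_per_sq_inch = inch_in_nm ** 2 / area_per_bit_nm2

print(f"{bits_per_sq_inch / 1e12:.0f} Tbit per square inch")
# prints about 219 Tbit per square inch
```

The result, roughly 220 terabits per square inch, agrees with the quoted 250 to within the uncertainty of the assumed atomic spacing.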
That’s fine with Himpsel, whose work in this area has been funded by the US National Science Foundation. The idea is “to go so far out that you reach the real limits that nature gives us for density of storing data”, he says. “It’s so far out that it’s not practical, and it’s not intended to be practical.”
Himpsel has also compared the silicon atom memory to that of DNA. In so doing, he found that DNA needs 32 atoms to store one bit, which is comparable to the area of 20 atoms around each bit at the silicon surface.