When the premises of Auckland car-customisation company James Odell were gutted by fire, the firm was left with a wrecked laptop that contained all the company's vital information.
The laptop was melted and burnt, as well as being water-damaged from the fire hoses, and would have looked to most people as if it were beyond repair (pictures, page 19).
But the company was able to retrieve its valuable data using ShadowProtect software and virtualisation technology, and re-opened from new premises just days later.
James Odell contracted Softsource to retrieve the data and also to rebuild the company’s IT infrastructure. This now includes new servers, PCs and a laptop.
Every day now, says James Odell managing director John Forrest, the company backs up its data to a flash key and the laptop is taken home; a server also allows remote access to the data.
However, there is more to back-up and recovery than computer forensics and a little flash key.
A whole variety of technologies are now available, including back-up over the internet, although disks and tapes still remain the most common means of data storage. Virtualisation is also beginning to have an impact.
Softsource director John Harrop says the landscape has shifted dramatically. Previously, back-up consisted of tape, disaster recovery sites were cold, and archiving was confined to paper-based records. But, today, back-up and archiving technologies complement each other.
“With the increase in WAN speeds, de-duplication technologies and dramatic cost reductions in the telecomms sector, [this is] opening up smarter and more efficient ways of doing things. Companies such as ours are able to offer services such as managed back-up and archiving services to mid-market New Zealand. This frees up our customers’ information management departments from the mundane, and allows them to focus on the things that will bring their business wealth and success,” says Harrop.
However, suppliers do need to understand the business objectives and the legal requirements facing their customers when suggesting a suitable back-up/archiving/DR and compliance technology.
Tape is still the mainstay of such systems but, as the importance of the information stored on it changes with age, the platform it is stored on may have to change too. This is called tiered storage, says Harrop. It means, for example, that data which needs to be instantly available, like that stored on an executive’s laptop, may need a back-up-over-the-wire service, while older data can be kept on tapes stored offsite.
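Harrop's tiered-storage idea can be sketched as a simple age-based placement policy. The tier names and age thresholds below are illustrative assumptions for the sketch, not anything Softsource has described:

```python
from datetime import datetime, timedelta

# Illustrative tiers, ordered from most to least immediate.
# The age thresholds are assumptions, not real policy numbers.
TIERS = [
    (timedelta(days=30), "online disk (backed up over the wire)"),
    (timedelta(days=365), "nearline disk"),
]
DEFAULT_TIER = "offsite tape"

def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick a storage tier based on how recently the data was used."""
    age = now - last_accessed
    for max_age, tier in TIERS:
        if age <= max_age:
            return tier
    return DEFAULT_TIER
```

In a real deployment the thresholds would come from the retention and compliance policies the article mentions, not from fixed constants.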
Unisys, however, sees the debate as largely disk versus tape. Tape automation is cheap and stable, with tapes having a shelf-life of 30 years. But tapes can be perceived as slow, difficult to manage, hard to secure and complex to administer. Disk offers effective multi-threading, random access, and, frequently, raw speed, too. But disks are expensive, hard to “make portable” in the absence of high bandwidth, and easy to erase or accidentally delete.
Geoff Dickson, Unisys’ sales and services, systems and technologies vice-president for Asia-Pacific, says companies should use both disks and tapes, by using a disk-to-disk-to-tape (D2D2T) architecture.
Virtual tape libraries (VTLs) provide a good intermediate step, and a means of improving back-up and recovery times in legacy back-up environments that do not support a D2D2T model. But Dickson says VTLs should be considered a short-term solution.
“The back-up solution will need to provide support for both disk and tape [to be used] as back-up storage media. Doing so will provide for more flexible back-up solutions in terms of fault tolerance, load balancing, more efficient storage utilisation and centrally managed data movement policies,” he says.
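The D2D2T architecture Dickson describes can be sketched as a staging pipeline, with each back-up copy moving from primary disk to back-up disk and finally to tape. The stage names and the one-step-at-a-time movement are assumptions for illustration only:

```python
# Stages of a disk-to-disk-to-tape (D2D2T) pipeline, in order.
# A copy starts on fast primary disk and ages out towards tape.
STAGES = ["primary disk", "back-up disk", "tape"]

def age_out(location: str) -> str:
    """Move a back-up copy one stage closer to tape; tape is final."""
    i = STAGES.index(location)
    return STAGES[min(i + 1, len(STAGES) - 1)]
```

The point of the middle disk stage is exactly what Thurlow describes later in the article: restores come straight from disk while older copies sit on offsite tape.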
Key new technologies
Key technologies to improve back-up and recovery solutions include data de-duplication and data archiving.
Data de-duplication technologies can reduce the space consumed by back-ups up to ten times compared with traditional methods. They can similarly reduce the network traffic associated with back-up by up to 90%, so reducing back-up and recovery windows considerably.
“Data de-duplication can, generally, operate at file and block level. File de-duplication eliminates duplicate files, but is not a very efficient means of de-duplication. Block de-duplication looks within a file and saves unique iterations of each block,” Dickson explains.
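Block-level de-duplication of the kind Dickson describes can be sketched in a few lines. Fixed-size blocks and SHA-256 hashing are simplifying assumptions here; commercial products typically use variable-length chunking:

```python
import hashlib

BLOCK_SIZE = 4096  # fixed block size, for simplicity

def dedupe(data: bytes, store: dict) -> list:
    """Split data into blocks, keep only unique blocks in `store`,
    and return the list of block hashes (the file's 'recipe')."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # store the block only if unseen
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its recipe."""
    return b"".join(store[d] for d in recipe)
```

A file of three blocks, two of them identical, costs only two blocks of storage plus a short recipe, which is where the space and network savings quoted above come from.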
Typically in larger firms, 80% of stored files won’t have been accessed for six months or more. Thus, data archiving, including email, can reduce the online storage requirements of email and other servers. Such firms need policies to decide what to archive, as well as the tools to operate archiving.
Dickson says de-duplication has excited much customer interest. Organisations also need to plan back-up with file archiving in mind, to reduce the amount of data that needs to be backed up.
Symantec sees disk as the first choice for initial storage, before migration to tape for the data’s retention period. Virtualisation also helps.
“Remote locations can be given the same attention to detail for recovery as the datacentre, by using the technologies of disk-based back-up/recovery, virtual-machine support, enterprise o/s application protection; [as well as] leveraging the open storage solutions, for ease of use, and [employing] granular recovery technology from a single management console,” says ANZ systems engineering director Paul Lancaster.
Businesses have varying levels of data stored both inside and outside the business, based on various tiers of storage. Software tools are thus essential to communicate between them, he says.
“The ability to have disk as first point of call of recovery is key, and the ability to change the variations of storage units and locations is another major aspect,” Lancaster says.
“Architecture, design and deployment of storage solutions are key to any back-up design, and the pros and cons of using one over the other falls back purely on the tiering of mission-critical applications with the associated data, along with the disaster recovery plan.
“Recently, there has been a stronger move towards disk-based storage technologies. However, vendors, distributors and partners are also providing storage devices, as different business markets require different storage solutions and technologies,” he adds.
Lancaster also points to the growing use of data de-duplication to help businesses reduce their storage and bandwidth needs.
Tape technology improving
Auckland-based DR specialist Plan-B says it uses tape for 98% of its customers, and recovers more than 1,000 customer servers in scheduled “testing and proving” exercises.
Director Symon Thurlow says many customers use a block of SAN space or attached storage configured as a fast RAID array, with the initial back-up of each server written to disk.
“This provides the benefits of a very fast back up, reducing the back-up window… [Data] is subsequently written to tape to be sent offsite, outside the time constraints imposed by the ever decreasing back up window. This approach also offers the benefits of immediate restore, without having to retrieve a tape from our vault,” says Thurlow.
While many users say disk stores data faster, Thurlow says improvements in tape technology mean this is not always the case. Tapes are also more robust than fragile disks.
However, online back-up is also creating a buzz in the corporate market because of its fast back-up capability, helped by disk storage, which allows the simultaneous recovery of multiple servers. The technology also allows for back-ups several times a day, although communication costs can be a factor here.
Nonetheless, Thurlow expects online back-up to become the predominant form of back-up over the next few years, helped by businesses becoming increasingly willing to accept software-as-a-service. Virtualisation also allows servers to be replicated cheaply, letting Plan-B customers access its remote servers at very low cost.
Revera managing director Roger Cockayne says compression technology is advancing — be it data compressed by software or compressed on the hardware itself. Such high compression rates allow users to store a lot more data online.
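The software side of the compression Cockayne mentions can be illustrated with a standard-library codec. zlib here is a stand-in assumption, not what Revera actually uses, and the savings depend entirely on how repetitive the data is:

```python
import zlib

def compress_for_the_wire(payload: bytes) -> bytes:
    """Software-side compression before data is sent over the line.
    (zlib stands in for whatever codec a provider actually uses.)"""
    return zlib.compress(payload, level=9)

def decompress_at_far_end(packed: bytes) -> bytes:
    """Lossless recovery of the original data at the other end."""
    return zlib.decompress(packed)
```

Business data with lots of repeated records shrinks dramatically; already-compressed or random data barely shrinks at all, which is why hardware compression at the storage end is applied as a second pass.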
Businesses are now typically archiving their data, but are not placing enough emphasis on speed of restoration, which is usually the more important issue, says Cockayne.
Some large corporates “triangulate their data”, keeping a primary copy at the primary production site, with extra copies maintained at a DR site and a third remote site. But maintaining networks and storage in triplicate is not cheap.
Cockayne sees disaster recovery as data recovery. Servers can be replaced, but if data is lost it’s gone forever.
“Data is more important than servers. The server-side has been made much easier with virtualisation — additional machine units can be created instantly on utility systems. However, without data or network connectivity they’re not much good,” he says.
“To diminish the risk, and better liquefy your data, the management approach must embrace a cloud concept, where datacentres — computing capacity — are interlinked, and access to one means access to all of them. That means your data can be spread geographically, for better protection and availability, and your virtual servers can go with them.
“Virtual servers are data — they’re just files. So, a virtual server, with its operating systems and settings, simply becomes a data-set that can move around with the data, which means it can be cranked up from anywhere — just so long as you have network access to the cloud,” says Cockayne.
Naturally, he also advises firms to let “the experts” manage such back-up and recovery.
“The amount of bandwidth required to get data over the internet is reducing, thanks to new technology at each end. Data can be compressed with software before it’s sent over the line, and again with hardware.
“But the best network is the internet. With the internet, if you can’t find a way through one route, it will try alternative routes. And that’s a massive opportunity. The cloud will eventually include networks. The only hold-up is security,” he says.