Data storage policies found wanting

Email deletion and shifting data from one format to another are ad-hoc decisions at many organisations

Despite advances in tiered storage technology, many IT managers say they have no idea of the value of their companies’ data and can’t manage it in any automated way.

Several speakers at the Storage Decisions 2005 conference in Las Vegas earlier this month confirmed this view.

Laura Fucci, chief technology officer at the MGM Mirage hotel and casino chain, says her department has implemented a tiered storage infrastructure for its 180TB of data. However, the company is still trying to better manage its storage, she says.

“One problem we have is we don’t have a storage [management] policy. We’re going to tackle that next year,” Fucci says. The company also plans to develop a policy for handling sensitive data such as credit card numbers. “The lawyers are compiling that information now,” she says.

MGM Mirage is in the process of implementing Symantec’s Veritas Enterprise Vault software to archive email and plans to do the same thing for the company’s file systems next year.

“Information just keeps growing. Our demand for storage keeps growing. I’m not sure if we’re ever ahead of the storage problem but we’re going to do something to keep up,” she says.

Of roughly 250 users polled by conference organisers, 51% say they have no way of determining the cost of storing data over time and 47% say they have a tiered storage model and some idea of storage costs but no way to automatically migrate data between tiers. Only 7% say they can definitely determine the value of their data.

Gary Schwimmer, a datacentre operations manager at US defence contractor Northrop Grumman, says his company has developed a data retention policy that involves tagging data using SGML (Standard Generalised Markup Language) to determine what to move and when to move it.

However, he says moving data from one tier to another is still done by hand, triggered by alerts from an email notification system developed in-house.
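As an illustration only — the article does not describe Northrop Grumman's actual schema or tooling — a retention tag of this kind might look something like the following sketch, which assumes a hypothetical XML-style layout and a simple check for data that has passed its retention window:

```python
# Hypothetical sketch, not Northrop Grumman's actual schema or process:
# each dataset carries an SGML/XML-style retention tag, and a script flags
# anything past its retention window as a candidate for migration or deletion.
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta

SAMPLE = """
<dataset name="q3-test-results">
  <retention tier="archive" created="2005-01-15" keep-years="10"/>
</dataset>
"""

def review(record_xml, today=None):
    today = today or datetime.utcnow()
    root = ET.fromstring(record_xml)
    ret = root.find("retention")
    created = datetime.strptime(ret.get("created"), "%Y-%m-%d")
    keep = timedelta(days=365 * int(ret.get("keep-years")))
    if today - created > keep:
        return f"{root.get('name')}: past retention - candidate for deletion"
    return f"{root.get('name')}: retain on tier '{ret.get('tier')}'"

print(review(SAMPLE.strip()))
```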

Another big issue identified by the speakers is finding ways to ensure data is deleted at the end of its useful life. While some say they delete everything after a set time, others say their data often sits in external storage vaults, incurring ongoing fees and requiring periodic migration to newer tape technology.

Schwimmer says Northrop Grumman’s data deletion policy requires that everything go after ten years. “[However,] we’re struggling like everyone else. The big part is convincing people it’s going to [require] an investment to make things change,” he says.

Richard Scannell, a consultant at GlassHouse Technologies, says IT managers can’t afford not to begin deleting data. Even if the capacity of new storage systems doubles every 18 months, it will never be enough to keep up with data growth, he says. Statistics show that up to 74% of all data storage costs can be attributed to maintenance and administration of existing storage, he says.

Craig Taylor, associate director of open systems at the Chicago Mercantile Exchange, says his group is working to determine how to classify data so migration policies can be created. Taylor’s group has built an elaborate storage infrastructure with five tiers of data storage that include EMC’s Symmetrix arrays, secondary disk storage systems from Copan Systems and tape libraries from StorageTek.

Even so, says Taylor, “do we have any physical deletion policy? No.”
