Surviving the deluge

Storage, rather than being simply an extension of servers and applications, is becoming a resource in its own right and needs to be managed with this value in mind.

In March, Computerworld’s US sister publication InfoWorld polled 500 readers to gain an insight into storage trends and troubles and how IT managers plan to counter these challenges. Considering the deluge of data IT bosses are trying to manage, it’s small wonder that 73% of respondents consider data storage needs a high or business-critical priority.

A wealth of business intelligence, analysis and data mining tools has made previously useless data useful. In response, companies are stockpiling huge quantities of data from application suites such as ERP and CRM to extract valuable information about customers, products and operations. Then there is the internet, which brings a world of data for employees to import and store, much of it, such as streaming video and audio, hogging terabytes of space.

Survey respondents report that of their combined data, 45% of storage is allocated to host databases, 21% is dedicated to user files, 15% is used for email messages and attachments, 6% goes to web content, while other application data accounts for the remaining 13%.

The traditional approach to deploying and managing storage devices has become impractical and, in most cases, unsustainable. Simply piling more capacity on top of the old is no longer the answer — if it ever was. Different types of data need different storage methods tailored to the unique business requirements of each company.

The InfoWorld survey revealed that organisations are implementing different architectures in order to achieve this outcome. It found that 52% of respondents have deployed a file-based NAS (network-attached storage) solution, whereas 25% have deployed block-oriented SANs (storage area networks). To this multiple-answer question, 52% of readers also report that they continue to use DAS (direct-attached storage).

When it came to the most important factors in choosing a storage solution, the majority of respondents emphasised efficiency, leveraging existing resources and controlling costs.

Seeing through the obstacles

Choosing the right architecture and mix requires an accurate assessment of the data and an open-minded understanding of its business relevance. Right now there are an unprecedented number of tools and technologies available.

Although SCSI remains the dominant interface for storage devices, complementary transport protocols such as iSCSI (internet SCSI) are challenging the more expensive FC (fibre channel) as a way to make storage devices conveniently available over networks.

With a multitude of devices spanning the typical enterprise, data access can be difficult and backups are typically less than seamless, so taking full advantage of storage capacity is often impossible.

For the better part of two years, vendors have been throwing around the term “virtualisation”. Like most new ideas, this concept is little understood and probably ahead of its time.

The basic concept is that accessing data should be transparent to anyone authorised to use that data, regardless of where it’s stored on the network. This means end users should not have to know anything about storage devices or the format of the data that resides on these devices.

The only way to accomplish this today is to acquire and deploy a handful of expensive, proprietary software products.

IDC analyst Graham Penn, who specialises in storage, says virtualisation is a necessary but insufficient step in enabling improved storage management. It provides a view of the total storage resource as “one great big hard disk” whatever its physical location. This involves an abstraction layer so that the application and the administrator don’t have to worry about where the data is stored.
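The abstraction layer Penn describes can be illustrated with a short sketch. This is not any vendor’s actual implementation; all class, device and volume names below are invented for the example. The point is simply that applications address data by logical name, while a mapping table, which the virtualisation layer maintains, resolves each name to a physical location that can change without the application noticing.

```python
# Illustrative sketch of a virtualisation abstraction layer: callers
# address data by logical name only, while a mapping table resolves
# each name to its physical device and path. All names are hypothetical.

class VirtualStoragePool:
    def __init__(self):
        # logical name -> (physical device, path on that device)
        self._map = {}

    def place(self, logical_name, device, path):
        """Record where a logical object physically lives."""
        self._map[logical_name] = (device, path)

    def locate(self, logical_name):
        """Resolve a logical name to its current physical location."""
        return self._map[logical_name]

    def migrate(self, logical_name, new_device, new_path):
        """Move data between devices without changing its logical name."""
        self._map[logical_name] = (new_device, new_path)


pool = VirtualStoragePool()
pool.place("invoices-2002", "array-A", "/lun0/invoices")
pool.migrate("invoices-2002", "array-B", "/lun3/invoices")
# Applications keep asking for "invoices-2002"; only the mapping changed.
print(pool.locate("invoices-2002"))
```

In this toy model, the administrator’s migration to a new array is invisible to the application, which is exactly the “one great big hard disk” view Penn describes.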

However, virtualisation by itself does not provide a complete answer. It is also the subject of considerable controversy in its own right (in-band, out-of-band, at the server, in the network, in the storage system and so on). EMC refuses to use the term as its Symmetrix product does all of this internally.

“I would be very concerned to portray storage virtualisation as a panacea,” says Penn. “It is merely a technique or an enabler.”

Penn maintains that effective storage management has to include three entities:

  • the management of storage devices and systems
  • the management of the data
  • the management of users.

It also must work with the applications involved.

Each of these has many elements which are addressed to varying degrees by “point products”, he says, from Veritas, Legato, CA and BakBone Software or by “global products” such as EMC’s AutoIS, HDS’ TrueNorth, Veritas’ SanPoint or IBM’s forthcoming Tivoli StorageTank product.

One worrying issue that has been raised by some analysts and users is just what happens if something goes wrong with virtualised data and the “pointers” to it get out of synch. It’s a complication vendors will have to seek to address, as important data may prove unrecoverable.
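One way vendors could guard against this, sketched below purely as an illustration rather than as any product’s actual mechanism, is to store a checksum alongside each pointer so that a stale or corrupted mapping is detected on read instead of silently returning the wrong data. The store and table structures here are hypothetical.

```python
import hashlib

# Hypothetical sketch: keep a checksum with each virtualisation
# "pointer" so an out-of-sync mapping raises an error on read
# rather than silently returning wrong or missing data.

physical_store = {}   # (device, path) -> bytes
pointer_table = {}    # logical name -> (device, path, checksum)

def write(name, device, path, data):
    physical_store[(device, path)] = data
    pointer_table[name] = (device, path, hashlib.sha256(data).hexdigest())

def read(name):
    device, path, checksum = pointer_table[name]
    data = physical_store.get((device, path))
    if data is None or hashlib.sha256(data).hexdigest() != checksum:
        raise IOError(f"pointer for {name!r} is out of sync with the data")
    return data

write("payroll", "array-A", "/lun1/payroll", b"salary records")
assert read("payroll") == b"salary records"

# Simulate the failure analysts worry about: the data moves (or is
# lost) but the pointer is never updated.
del physical_store[("array-A", "/lun1/payroll")]
try:
    read("payroll")
except IOError:
    print("stale pointer detected")
```

Detection, of course, is not recovery; as the analysts note, without a sound copy of the data elsewhere the loss may still be permanent.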

Part of the way

Penn says apart from a few, typically larger, corporates, most are just starting the journey to better storage management. Many firms still have most of their data on directly attached internal or directly attached external storage linked to a single server.

Until they break this server/storage linkage and move to a networked storage infrastructure they won’t be able to gain the benefits of forthcoming (and partially evolved) storage management software. “The logical cannot really be achieved until the network is in place,” Penn says.

Even then, with a network storage infrastructure there is no guarantee the promised benefits will be realised.

“More work needs to be done in each organisation taking into account its particular business objectives and operating characteristics,” says Penn. This means the real value in the future will be in storage services rather than in commodity storage hardware or even in storage software alone.

As an organisation changes and grows the infrastructure and operating methods will also need to be reviewed and modified.

“The whole lot is complex with many, many layers. Hence the difficulty in getting the organisation — read CEO, board of directors — to recognise and address the problem.”

Meanwhile, companies like Microsoft are moving into the storage market and pundits expect to see approaches to virtualisation that leverage web services and XML.

XML can sit on top of a storage solution, while a web service can provide the infrastructure to point to and call that data. This means that cache memory and storage devices could be aware of each other, and memory could summon data across the network to wherever the user needs it. Odds are you will see storage vendors such as Network Appliance and EMC working with Microsoft to create this architecture, though it is not yet known how other companies will respond. Sun and IBM are also strong in web services, but use Java development tools.
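A rough sketch of the idea: an XML descriptor “sits on top of” the stored data, recording where it lives, and a resolver (in practice a web service) reads the descriptor to locate the bytes. No such schema exists yet; the element and attribute names below are invented for the example.

```python
import xml.etree.ElementTree as ET

# Rough illustration of XML-over-storage: a descriptor records where an
# object physically lives, and a resolver reads it to find the data.
# Element and attribute names are hypothetical.

descriptor = """\
<storedObject name="q3-sales-report">
  <location device="nas-filer-2" path="/vol/reports/q3.pdf"/>
  <format mimeType="application/pdf"/>
</storedObject>
"""

def resolve(xml_text):
    """Parse the descriptor and return the (device, path) it points to."""
    root = ET.fromstring(xml_text)
    loc = root.find("location")
    return loc.get("device"), loc.get("path")

device, path = resolve(descriptor)
print(device, path)
```

Because the descriptor is plain XML, any web-services stack, whether built on Microsoft’s tools or on Java, could produce and consume it, which is what makes the approach attractive as a vendor-neutral layer.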
