Virtualization: a strategic tool to beat storage inefficiency. (Storage Networking).
Until now, most IT people have responded to storage shortages with a tactical fix. They'd simply go out and buy more storage. But today's costs and complexity make such tactics a business blunder. It's clear that data center managers need to do something differently.
The time has come to take a strategic approach to storage--one that analyzes and invests in the architectures and technologies that will proactively meet a business's ever-growing storage needs. And one tool to use in that mission is virtualization. It simplifies management, uses resources more efficiently, and ultimately helps a business reduce costs.
The Virtual Strategy Setup
Three factors are leading businesses inexorably toward the need for a strategic approach to storage that will likely draw on the simplification powers of virtualization.
First, storage is growing faster than disk is getting cheaper, and IT budgets are flat or shrinking. Until the first half of 2001, storage was growing at around 100% every year. The economic slowdown and the aftermath of Sept. 11 have reduced this number to somewhere around 60% to 80% growth. However, there is every sign that we may return to 100% annual growth rates very soon. Why is this happening?
Over the last 10 years, we have seen rapid growth in the value of information to the business. In an increasingly commoditized and globalized business world, information has become the key to competitiveness. At the same time, the nature of information has changed. No longer is a string of text and numbers good enough. Information is just as likely to be transmitted as video or voice or complex graphics.
In addition, our most common means of communicating in the business world--email--contains much more data in its attachments than in the email itself. Every business in the world is being compelled to build ebusiness infrastructure, from supply-chain management to customer relationship management and everything in between.
All of these are adding tremendously to storage growth within our businesses. But while growth is hovering around 70% a year, storage efficiencies are only improving at a rate of about 32% a year. IT managers find themselves forced to spend more on storage, and with flat or shrinking budgets, that's money they may not have. At some point, we'll all need to face the facts: We can no longer afford to throw more disk at our storage problems.
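To see why the gap between those two rates matters, a back-of-the-envelope calculation helps. This sketch simply compounds the article's figures (roughly 70% annual demand growth against roughly 32% annual efficiency improvement); the function name and structure are illustrative.

```python
# Illustrative arithmetic using the article's figures: capacity demand grows
# ~70% per year while cost-efficiency improves only ~32% per year, so the
# spend needed just to keep pace still compounds year over year.

def budget_multiplier(years, demand_growth=0.70, efficiency_gain=0.32):
    """Relative storage spend required after `years`, all else being equal."""
    return ((1 + demand_growth) / (1 + efficiency_gain)) ** years

# After three years, spend must roughly double just to stand still:
print(round(budget_multiplier(3), 2))  # -> 2.14
```

Even with efficiency gains, spend more than doubles in three years on these assumptions, which is the arithmetic behind "we can no longer afford to throw more disk at our storage problems."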
There are people costs to worry about, as well. Human resources expense is the second of those three factors compelling a strategic approach to storage. Storage has grown much faster than management productivity has. Moreover, even with the dot-com meltdown, storage managers are in scarce supply. And when you find them, they're expensive.
Plus, there's a third problem weighing on storage managers today: complexity. Storage is gaining so much in mass and complexity that it's exceeding the ability of humans to manage it error-free.
If IT can't sustain storage growth, a company may not be able to compete. Which brings us to virtualization and how it fits in a company's storage strategy. Virtualization is an enabling technology--perhaps even the foundation technology--of simpler, more manageable and less expensive storage solutions.
Simplify, Simplify, Simplify
Echoing the words of Henry David Thoreau, it's time to "simplify, simplify, simplify" in the world of data storage. You can do it using virtualization at multiple levels within your infrastructure.
Storage virtualization presents a view of storage that is at once straightforward and familiar, without any need for the system to know what physical storage devices actually exist. At the most basic and most common level, virtualization is being implemented within the context of storage area networks (SANs), networks that contain servers, storage devices, and the network topology. SANs allow many servers to share access to many storage devices, which means you can use those devices more efficiently and therefore need fewer of them.
Virtualization in SANs provides a complete separation of the server environment and the storage environment. It is this separation that builds the foundation for storage management outside of the traditional model of management existing within the application server. Here's the sequence of events:
* Define a logical or virtual view of the storage to the servers.
* Link the logical view that the servers have to one or more physical devices.
When the servers wish to access their storage devices, they simply engage in a dialogue with the storage they believe exists, and the virtualization platform dynamically converts the "language" the server is speaking to the language of the real physical devices allocated.
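The two steps above--defining a logical view, then linking it to physical devices--amount to an address-translation table. Here is a minimal sketch of that idea in Python; the class and names are hypothetical, and real SAN virtualization does this at the block level in fabric switches, appliances, or array firmware.

```python
# Sketch of a virtualization layer mapping a server's logical view of storage
# onto one or more physical devices. Names and structure are illustrative,
# not a real product API.

class VirtualVolume:
    def __init__(self, name):
        self.name = name
        self.extents = []          # ordered (device, start_block, length) tuples

    def map_extent(self, device, start, length):
        """Link part of the logical volume to a range on a physical device."""
        self.extents.append((device, start, length))

    def resolve(self, logical_block):
        """Translate a logical block address to (physical device, block)."""
        offset = logical_block
        for device, start, length in self.extents:
            if offset < length:
                return device, start + offset
            offset -= length
        raise IndexError("logical block beyond volume")

# The server addresses only 'vol1'; the physical layout underneath can change
# (migrations, new arrays) without the server's view changing at all.
vol = VirtualVolume("vol1")
vol.map_extent("array-A", start=1000, length=100)
vol.map_extent("array-B", start=0, length=100)
print(vol.resolve(150))   # -> ('array-B', 50)
```

The key property is that `resolve` is the only place where physical reality appears: remap the extents and the server's "dialogue" with its storage is untouched.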
The topmost value of virtualization at the SAN level is this: If and when the real physical storage world changes--and it always does--the logical view from the servers doesn't. No change in the real world should impact the servers and the applications that run on them. You'll simplify storage management, significantly cutting your system downtime and management costs.
The discussion above talks about storage virtualization in the context of storage networks. There are really three places where storage virtualization can take place--in the server, in the "fabric" of the storage network, or in the storage devices themselves. There are pros and cons for the implementation of virtualization in all three places. Here's where your strategic muscles get some exercise.
The Strategic Approach
Many storage challenges face today's IT manager. We need to improve the fundamental cost structure of our storage resources, enhance manageability, provide an appropriate level of data protection and be able to scale while protecting our storage investments. We've already seen how virtualization at the SAN level can simplify complexity and improve manageability. To evaluate how virtualization at the device level can help your company, you should examine your data's lifecycle curve.
Early in data's life is when it is most active, when it is contributing most to revenue generation and when it is of most value. That's when it makes strategic good sense to apply extraordinary measures to protect your data and ensure its accessibility.
As data ages IT managers should begin asking themselves, "Can I reduce the cost of protecting and leveraging this information?"
It helps to classify data into several categories:
Critical Data: Includes the systems and applications used in key business processes or data that must be preserved for any legal reasons.
Vital Data: Information a company can function without for short periods of time.
Sensitive Information: Data used in everyday operations but for which there are alternate sources available in case of loss, making restoration relatively easy.
Non-Critical Data: Can be reconstructed with minimal cost. Generally it's data that's already been copied or has low security requirements.
Once you classify data, determine how quickly and in what hierarchy systems should be restored. You could aim for: immediate recovery, same-day recovery, 24-hour recovery, or longer.
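One way to make that classification actionable is to encode it as policy data. The sketch below pairs the article's four categories with its recovery tiers and sorts datasets into restoration order; the field names and structure are illustrative, not drawn from any product.

```python
# Illustrative encoding of the data classification and recovery hierarchy
# described above. Field names are hypothetical.

RECOVERY_POLICY = {
    "critical":     {"recovery": "immediate", "mirrored": True},
    "vital":        {"recovery": "same-day",  "mirrored": True},
    "sensitive":    {"recovery": "24-hour",   "mirrored": False},
    "non-critical": {"recovery": "longer",    "mirrored": False},
}

def restore_order(categories):
    """Sort dataset categories so the most critical tiers are restored first."""
    rank = {"critical": 0, "vital": 1, "sensitive": 2, "non-critical": 3}
    return sorted(categories, key=rank.__getitem__)

print(restore_order(["sensitive", "critical", "non-critical", "vital"]))
# -> ['critical', 'vital', 'sensitive', 'non-critical']
```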
The point of analyzing your data lifecycle continuum is to divvy up investments strategically. At every point along the curve, there are opportunities to reduce cost.
For instance, you can cut costs by migrating data from disk to tape storage and back again. You'll find virtual products that can do that process for you automatically based on your own user-defined policies.
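A user-defined migration policy of that kind is easy to picture as a simple rule: data untouched beyond some threshold moves from disk to tape. This sketch assumes a single age-based rule and hypothetical names; real products layer many such policies.

```python
# Sketch of a user-defined migration policy: datasets idle longer than a
# threshold are placed on tape, active ones stay on disk. Illustrative only.

from datetime import datetime, timedelta

def tier_for(last_access, now, disk_to_tape_days=90):
    """Return the storage tier a dataset should live on under this policy."""
    age = now - last_access
    return "tape" if age > timedelta(days=disk_to_tape_days) else "disk"

now = datetime(2002, 11, 1)
print(tier_for(datetime(2002, 10, 1), now))  # recently read -> 'disk'
print(tier_for(datetime(2002, 1, 1), now))   # cold data     -> 'tape'
```

The virtualization layer makes this transparent: the application still sees the same volume whether the bits currently live on disk or tape.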
For data that falls in your critical or vital categories, ask yourself, "Why do I need to mirror enterprise-class disk to enterprise-class disk? Why can't I use ATA-class disk? Or virtual disk?"
Virtual disk never uses disk space until data is actually written to the disk, so it allows you to use every bit of the disk purchased. You won't be wasting that 30% of your storage investment on disk that's allocated but unused as you would with traditional disk.
Virtual disk also has pointers for every volume and every dataset, which allows for unique point-in-time snapshots of your data, thereby letting you make virtual "copies" instantly without consuming disk space.
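Those two properties--space consumed only on write, and snapshots as pointer copies--can be shown in a few lines. This is a toy model under simplifying assumptions (a dict standing in for a block map), not how any particular array implements it.

```python
# Toy model of the virtual-disk properties described above: blocks consume
# space only when written (thin provisioning), and a snapshot copies pointers,
# not block contents. Illustrative only.

class VirtualDisk:
    def __init__(self):
        self.blocks = {}                 # block number -> data, only if written

    def write(self, block, data):
        self.blocks[block] = data

    def snapshot(self):
        return dict(self.blocks)         # copies the pointer map, not the data

    def used(self):
        return len(self.blocks)          # space consumed = blocks written

disk = VirtualDisk()
disk.write(0, "boot")
disk.write(7, "data")
snap = disk.snapshot()                   # instant point-in-time "copy"
disk.write(7, "data v2")                 # later writes don't disturb the snapshot
print(disk.used(), snap[7])              # -> 2 data
```

However large the volume appears to the server, `used()` reflects only what was actually written, and the snapshot was taken without consuming additional block storage.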
In the tape world, virtualization at the device level can free up library slots and increase performance. Without virtualization, tape automation rarely touches more than half of a large data center's cartridges, so tape management remains a labor-intensive chore. Virtualization enables 100% automation without adding hardware or physical drives.
Clearly categorizing data and planning storage buys strategically can help IT pros deal with today's growing storage needs and corporate budget woes. For maximum efficiency, storage virtualization is an enabling technology that provides a simpler, less complex and more stable environment upon which to build.
Ultimately, the IT industry must build software that provides automated policy-based management systems. The sheer scale and complexity of storage will dictate that we reduce or eliminate human intervention. Storage virtualization will be an important element in meeting this coming challenge.
Rob Nieboer is director of business management, growth markets, at StorageTek (Louisville, Colo.)
Publication: Computer Technology Review
Article Type: Industry Overview
Date: Nov 1, 2002