
Storage challenging business creativity: hard choices ahead for management and government compliance.

Given the mounting storage management challenges faced by businesses today, some of the long-standing beliefs about storage management are being questioned. In some cases, the new viewpoint is the opposite of conventional thinking. Here are a few examples of the inverse thinking some businesses are considering as their management challenges escalate.


Question 1: Is it better to manage storage or just add more hardware?

Managing storage has become far more complicated than simply making sure enough capacity is available to meet demand. Disk and tape technologies from multiple vendors, a variety of switching devices, network management, SAN, NAS, and DAS architectures, acceptable backup/recovery and high-availability schemes, and the SLAs of key applications together put the cost and complexity of complete and effective storage management out of reach for most non-mainframe businesses.

Throughout the mid-1990s, the easiest way to manage storage subsystems and data was simply to add more storage. This straightforward strategy worked well for many small and medium-sized businesses for years. As the gap between storage growth and the number of storage administrators widened and put business applications at risk, the strategy began to fall out of favor, and by the late 1990s storage management had become a stated mission, or mantra, for most IT organizations.

Today, data is growing faster than the deployment of storage management tools, and the supply of trained people to manage storage is not keeping pace with demand. As a result, a great deal of valuable business data is not effectively managed, meaning that backup, recovery, performance and capacity are dealt with reactively, if at all. By 2007, it is expected that the average (non-mainframe) storage administrator will be able to manage about 15 terabytes of storage while the amount of data to be managed will exceed 50 terabytes.

Storage management can be simplified by adding NAS for certain applications and SANs for others, adding virtualization software, unifying block and file storage systems, and undertaking consolidation efforts. Overall, these actions have not kept up with the 50-70% annual growth rates of the digital storage pool, and they are costly in terms of people and processes and expensive to implement given the tough economic landscape.
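
To see how quickly these numbers diverge, a rough projection can be made from the article's own figures (50-70% annual data growth and roughly 15 terabytes manageable per administrator by 2007). The minimal sketch below assumes a hypothetical 10-terabyte pool in 2003 and a 60% growth rate; both starting values are illustrative assumptions only.

```python
# Rough sketch of the widening management gap: data growing ~60% per year
# against ~15 TB manageable per administrator. The 10 TB starting pool is
# an assumed figure for illustration.

def project_gap(start_tb=10.0, growth_rate=0.60, tb_per_admin=15.0, years=4):
    data = start_tb
    for year in range(2003, 2003 + years + 1):
        admins_needed = data / tb_per_admin
        print(f"{year}: {data:6.1f} TB of data -> ~{admins_needed:4.1f} "
              f"administrators needed at {tb_per_admin:.0f} TB each")
        data *= 1 + growth_rate

project_gap()
```

At that growth rate the assumed pool passes 50 terabytes by 2007, consistent with the projection above, while the administrative headcount required to manage it grows more than sixfold.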

Possibly the most significant and long-awaited technological solution will arrive in the form of storage management functions and intelligence that move away from the server and into the storage subsystem or network fabric. This concept enables more of a "black box" proactive approach to storage management, moving away from labor-intensive, reactive, host-server-based approaches. Though in various stages of development, the intelligent fabric is not here yet, and until it arrives businesses are, out of necessity, considering other options.

One option gaining more consideration in some areas, rather surprisingly, is not to manage storage at all. Why invest all that money, time and energy to deal with such complexity? After all, if storage management costs have become so much higher than declining hardware expenses, why not return to the mid-1990s philosophy of just adding more hardware? Is this a sustainable or viable solution given that the resources and infrastructure to effectively manage storage and data are falling further behind the demand curve?

Bottom-line: Storage management is falling behind storage growth. With the management gap widening, management expenses growing, and hardware prices falling 30-40% annually, businesses are, for the near future, reconsidering simply adding more hardware as the less-expensive management strategy.
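
A back-of-the-envelope calculation shows why the "just add hardware" argument has regained some appeal. The sketch below uses a 60% annual capacity growth rate and a 35% annual price decline (both within the ranges cited above) together with hypothetical starting figures for capacity and price; under those assumptions the annual hardware spend stays nearly flat even as the pool balloons.

```python
# "Just add more hardware" under the article's cited trends: capacity needs
# growing ~60% per year while hardware prices fall ~35% per year. Starting
# capacity and price per terabyte are assumed values for illustration.

capacity_tb = 10.0          # assumed current capacity (terabytes)
price_per_tb = 10_000.0     # assumed current price (dollars per terabyte)

for year in range(2003, 2008):
    added_tb = capacity_tb * 0.60            # new capacity bought this year
    spend = added_tb * price_per_tb
    print(f"{year}: buy {added_tb:5.1f} TB for about ${spend:,.0f}")
    capacity_tb += added_tb
    price_per_tb *= 1 - 0.35                 # prices keep falling ~35% per year
```

The catch, of course, is that this spend covers raw capacity only; none of it buys backup, recovery, or any of the other management functions discussed above.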

Question 2: Is compliance with regulatory agencies worth the expense?

Increasing regulatory pressure to comply with federal mandates covering e-mail, medical, insurance, legal, financial and government classified data is quickly forcing many businesses to examine potential weak points in their long-term storage systems. New applications and a variety of legal and business requirements are driving many businesses to re-examine or create their archival policies. One of the most visible examples of the increasingly critical value of archival data lies with the HIPAA (Health Insurance Portability and Accountability Act) requirements. Not only does HIPAA require health providers to preserve data for a yet-to-be-determined period, but the failure to protect critical patient data carries penalties that presently range up to, and in some cases exceed, $25,000 per violation.

The threat of fines and other penalties for non-compliance is prompting storage administrators to examine the growing amount of archival data that would have to be kept indefinitely for future reference. For example, the PACS (Picture Archiving and Communication System) application that captures and stores radiology information and other medical images is a primary component of the HIPAA requirement. Data used to be retained for one year, then three years, then seven years; now infinite retention seems inevitable for some applications. Some health care businesses are talking about retaining digital records for the patient's lifetime plus seven years, which could exceed 100 years in some cases. Given today's legal, economic and political climate, the value of archival data changes as it ages rather than simply declining. This partial list of regulations is becoming increasingly important to the storage administrator's strategy:

* The Sarbanes-Oxley Act requires that companies define rules against falsification of records and e-mail, with retention and deletion guidelines

* HIPAA for medical images and records

* Brokerage business: SEC Rules 17a-3 and 17a-4

* Telecommunications: Title 47, Part 42

* Banking: OCC and FDIC regulations

* Defense: DOD 5015.2 regulation

* Numerous others under review

Approximately 40 million American citizens have decided that health insurance is so expensive that they no longer carry it, rationalizing that it's cheaper to pay the bills directly. Could similar thinking be at work here? Some businesses describe the added expense of compliance as forcing all businesses to undergo a financial root canal when only a few have bad teeth. Just 30% of the 136 chief financial officers of large global companies surveyed by PricewaterhouseCoopers in June 2003 had a favorable opinion of the Sarbanes-Oxley Act, which mandates that companies track just about everything. Keeping track of everything equates to a major storage, infrastructure, and management investment. The key storage-related question has become: Can businesses afford the extra infrastructure, people, processes and solutions to implement this strategy, sometimes referred to as data lifecycle management? [Note: Lifecycle data management is more than compliance.] Can today's businesses afford the distraction from their core business efforts required to implement lifecycle management? Or is it cheaper to pay the non-compliance penalties whenever they might occur?
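
The "comply or take our chances" question lends itself to a simple expected-value framing. The sketch below uses the $25,000-per-violation figure from the HIPAA discussion earlier; the compliance-program cost, audit probability, and violation count are purely hypothetical assumptions for illustration.

```python
# Expected-value framing of the compliance question. Only the $25,000 fine
# figure comes from the article; every other number is an assumption.

def expected_penalty(prob_of_audit, expected_violations, fine_per_violation=25_000):
    """Expected cost of non-compliance: chance of being caught times fines."""
    return prob_of_audit * expected_violations * fine_per_violation

compliance_cost = 2_000_000          # assumed cost of a full compliance program
risk = expected_penalty(prob_of_audit=0.10, expected_violations=200)

print(f"Expected penalty exposure: ${risk:,.0f}")
print(f"Compliance program cost:   ${compliance_cost:,.0f}")
print("Cheaper to comply" if compliance_cost < risk
      else "Cheaper, on paper, to risk the fines")
```

Such a calculation ignores reputational damage and legal exposure beyond the fines themselves, which is exactly what makes the gamble risky.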

Bottom-line: Full regulatory compliance brings with it significant added expenses that not every business can afford. Though risky, some businesses will decide to take their chances and not fully comply.

Question 3: Is it worth it to move data from one level of the storage hierarchy to another?

The storage industry has traditionally used a pyramid or triangle to depict a hierarchy of products that spans all available levels of price, performance and capacity. The economic value of optimizing the storage hierarchy, getting the right data in the right place at the right time, has been an important storage management philosophy that has provided significant cost savings for years. Broad ranges exist within the hierarchy for all three parameters, and some unfilled access and cost gaps stand out, such as the random-access Nearline storage segment. This gap may finally be filled by tape libraries using embedded disk arrays and the emerging MAID storage category.

Magnetic disk holds an estimated 95% or more of the world's mission-critical data, and its ultra-high reliability and availability are the key selection factors over all other devices for that data. Disk subsystems have clearly defined the high-performance and high-capacity levels of the storage hierarchy, with price per megabyte and access density being the major differences between levels. Nearline defines the level of storage between online disk and farline, or shelved, storage. In addition to being the backup medium of choice, Nearline storage is the primary repository for the world's digital data, holding over 80% of digitally stored data, archives, and fixed content. Automated tape is less expensive on a per-gigabyte basis than disk, normally ranging from 1/15th to 1/20th the price per gigabyte of disk subsystems in Unix, Linux, and Win2K environments. Optimizing the hierarchy means that data must travel from disk to tape and, in many cases, back again for re-reference. This adds a significant I/O load to a server and its associated data paths, potentially impacting the service levels of other key applications.
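
The cost leverage in the hierarchy is easy to illustrate. The sketch below prices tape at 1/15th the per-gigabyte cost of disk (the conservative end of the range cited above); the pool size, disk cost, and share of rarely re-referenced data are assumed figures.

```python
# Illustrative savings from tiering a pool between disk and automated tape,
# with tape at ~1/15th of disk cost per GB. All other inputs are assumptions.

pool_gb = 50_000                           # assumed 50 TB pool
disk_cost_per_gb = 10.0                    # assumed disk subsystem cost per GB
tape_cost_per_gb = disk_cost_per_gb / 15   # tape at the low end of the cited range

all_disk_cost = pool_gb * disk_cost_per_gb

moved_fraction = 0.70                      # assume 70% of the pool is rarely re-referenced
tiered_cost = (pool_gb * (1 - moved_fraction) * disk_cost_per_gb
               + pool_gb * moved_fraction * tape_cost_per_gb)

print(f"All disk: ${all_disk_cost:,.0f}")
print(f"Tiered:   ${tiered_cost:,.0f} (saving {1 - tiered_cost / all_disk_cost:.0%})")
print(f"Data migrated through servers: {pool_gb * moved_fraction:,.0f} GB, plus re-references")
```

The last line hints at the I/O tax discussed below: every gigabyte that migrates down the hierarchy, and back up on re-reference, currently passes through a server and its data paths.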

Managing and exploiting the hierarchy to its maximum benefit is an increasing challenge. As storage grows, the payoff for implementing an effective storage hierarchy becomes enormous. The robust hierarchy of storage devices that exists today, in conjunction with upcoming intelligent storage management software and SANs with outboard data-movement capabilities, will finally enable the TCO of storage to be significantly reduced from current levels.

Bottom-line: Hardware expenses can be significantly reduced by optimizing the storage hierarchy. The system overhead associated with moving large amounts of data between levels of the hierarchy and in and out of servers adds to the total cost of optimizing the hierarchy. As storage pools get bigger, can this I/O tax become unaffordable? This question will be asked more and more until the intelligent fabric arrives and enables outboard data movement between the disk and tape levels of the hierarchy to occur without the server absorbing the I/O overhead.
