
Storage Tiering for Data Management

The numbers can seem daunting. IDC predicts that the world will generate 2.7 trillion gigabytes of data in 2012, up 50% from 2011. McKinsey & Co. predicts that the amount of data generated will grow 40% each year through 2020.

Financial services companies are contributing to this rapid pace of data growth. McKinsey reports that financial institutions with more than 1,000 employees lead all industry sectors in data volume, storing 5,800 terabytes on average. The New York Stock Exchange alone creates one terabyte of data per day. Financial data is generated continuously, and organizations must determine how to store this information in a manner that is accessible yet secure and compliant.

As this type of critical business data continues to grow, Information Technology (IT) managers at financial firms require both faster access and more efficient storage infrastructures. In many cases, though, storage investments driven by data growth must compete with other enterprise-critical expenditures. Data is the industry's most critical and valuable asset, yet many organizations do not have an optimized, secure storage strategy in place that can grow with their needs.

Many financial institutions deal with the data deluge by buying ever more primary storage. While this approach keeps data close at hand, it is excessively expensive and makes finding specific data difficult, like looking for a needle in a haystack. A better solution is an approach already proven in large enterprises: storage tiering, in which data moves over time from higher-performance storage to lower-cost or longer-life storage, based on its current value to the organization. This keeps data readily available when it is most needed, while storing it cost-effectively and securely for use down the line.

Storage tiering is not a new concept, but it may be new to financial services organizations that are increasingly challenged by the volume of data they are creating. By learning about tiered storage, and by building data archiving, backup and other data management solutions on top of a tiered storage infrastructure, financial institutions can reduce content storage costs, quickly transfer large data sets and safeguard valuable data. The tiered approach uses higher-cost storage for business-critical data, lower-cost storage for nearline access, and removable tape, disk or cloud storage for low-access data.
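As a rough illustration, the placement logic behind a tiered approach can be expressed as a simple policy based on data age and criticality. The tier names and thresholds below are hypothetical and are not drawn from any particular product; real policies would also weigh regulatory retention rules and access patterns.

```python
from datetime import datetime, timedelta

# Hypothetical tiers and age thresholds, ordered from most to least accessible.
TIER_THRESHOLDS = [
    ("online",   timedelta(days=30)),   # active, frequently accessed data
    ("nearline", timedelta(days=365)),  # occasionally accessed data
    ("offline",  timedelta.max),        # archived data on removable media or cloud
]

def assign_tier(last_accessed: datetime, business_critical: bool) -> str:
    """Pick a storage tier from data age and criticality (illustrative only)."""
    if business_critical:
        return "online"                 # critical data stays on primary storage
    age = datetime.now() - last_accessed
    for tier, max_age in TIER_THRESHOLDS:
        if age <= max_age:
            return tier
    return "offline"

# Example: a report last touched 90 days ago lands on nearline storage.
print(assign_tier(datetime.now() - timedelta(days=90), business_critical=False))
```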

The online storage tier is where a company’s active, primary production data resides. High-cost, online storage is usually reserved for data that is most frequently accessed and of highest importance, yet many companies still use the same online storage technologies for backup. This approach is expensive and susceptible to failure.

The nearline storage tier represents an intermediary between online storage, which enables rapid data access, and offline storage, which is more affordable but requires more time for data access. To reduce storage costs without dramatically reducing speed of data access, companies can replicate data to a nearline data protection device. They also can set policies for types of data that should be stored online or nearline. This tier of data access can provide a copy of data in the event that online data is compromised.  
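One way to picture such a nearline policy is a scheduled job that copies recently changed files from primary storage to a nearline target. The sketch below assumes hypothetical mount points and a 30-day replication window; in practice a data protection appliance or replication software would handle this.

```python
import shutil
import time
from pathlib import Path

# Hypothetical mount points for the production volume and the nearline device.
PRIMARY = Path("/mnt/primary")
NEARLINE = Path("/mnt/nearline")
WINDOW_SECONDS = 30 * 24 * 3600  # replicate files modified in the last 30 days

def replicate_recent_files() -> int:
    """Copy recently modified files from primary to nearline storage."""
    copied = 0
    cutoff = time.time() - WINDOW_SECONDS
    for src in PRIMARY.rglob("*"):
        if src.is_file() and src.stat().st_mtime >= cutoff:
            dest = NEARLINE / src.relative_to(PRIMARY)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves timestamps for later auditing
            copied += 1
    return copied

if __name__ == "__main__":
    print(f"Replicated {replicate_recent_files()} files to nearline storage")
```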

The offline storage tier maintains copies of archived data at a physically remote location. The best data management solutions enable multiple ways to do this, including removable media that can be transported to another site. RDX is a good example of a removable media format that delivers cost-efficient backup, recovery and archiving. RDX offers the advantages of disk-to-disk storage, including high performance and a low failure rate, plus the removability and portability of tape. As removable storage, RDX supports offsite storage for disaster recovery and offers a practical way to seed cloud storage. Newer secure versions of RDX media also feature encryption and cryptographic erase capabilities.

Financial institutions need a system that provides not only an onsite copy of data for fast restoration, but also an offsite copy in case a disaster damages the primary storage location. As confidence in cloud providers continues to increase, more companies will use cloud storage for offsite backup. The main benefits of cloud storage are threefold. First, the cloud provides true disaster recovery and business continuity: it adds critical offsite storage to ensure that a business’ most important asset is accessible in the event of a disaster. Second, cloud providers offer pay-as-you-go options, which let businesses account for storage as an operational expense rather than a capital expense. Finally, cloud storage is highly scalable, and additional capacity can be provisioned when needed. However, scalability is not just about the size or amount of storage; the speed of access and throughput must also scale.
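As a minimal sketch of the offsite piece, the snippet below pushes a backup archive to an S3-compatible object store using boto3. The bucket name and archive path are assumptions for illustration, credentials are assumed to come from the environment, and any cloud provider's object storage API could stand in.

```python
import boto3
from pathlib import Path

# Hypothetical bucket and archive path; replace with real values.
BUCKET = "example-offsite-backups"
ARCHIVE = Path("/backups/nightly.tar.gz")

def upload_offsite(archive: Path, bucket: str) -> None:
    """Send a backup archive to object storage for offsite disaster recovery."""
    s3 = boto3.client("s3")
    # Server-side encryption keeps the offsite copy protected at rest.
    s3.upload_file(
        str(archive),
        bucket,
        archive.name,
        ExtraArgs={"ServerSideEncryption": "AES256"},
    )

if __name__ == "__main__":
    upload_offsite(ARCHIVE, BUCKET)
```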

A four-tier, RDX-based storage approach offers financial institutions scalability and agility as the amount of data being stored increases exponentially. Companies need to look for appliances that support higher-capacity media cartridges or the easy addition of new appliances or arrays. It can seem daunting for the finance industry to overhaul its storage infrastructure while adhering to regulatory constraints and compliance efforts. However, implementing a four-tiered storage strategy is the best approach for cost-effective optimization and data backup.

Mr. Rosenberger is global product management director, Scalable Storage Business, for Oakdale, Minn.-based Imation. He can be reached at [email protected]