To take full advantage of big data, enterprises must adopt a holistic approach and transform their view of storage from a ‘data destination’ to a ‘data platform.’ As a platform for big data, and not just a destination for data, an enterprise storage solution must deliver cost-effective scale and capacity; eliminate data migration by growing without bound; bridge legacy storage silos; provide global accessibility of data; and protect and maintain the availability of data.

1. Deliver cost-effective scale and capacity
To minimize cost, a big data storage platform must take a ‘scale-out’ approach, achieving scale by pooling industry-standard commodity servers and storage devices. This delivers low costs today and lets the platform benefit from increased buying power as hardware gets better, faster and cheaper over time. An effective big data storage system must also scale in the performance dimension, so that applications see no degradation in performance as the volume of data in the system grows.
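To make the idea concrete, here is a minimal Python sketch of scale-out pooling; the names (StorageNode, ScaleOutPool) are illustrative, not an actual product API. Capacity grows additively as commodity nodes join the pool, and striping an object’s chunks across nodes lets aggregate throughput grow with the node count as well.

```python
# Minimal scale-out sketch: hypothetical names, not a real API.
import zlib

class StorageNode:
    """One commodity server contributing its disks to the shared pool."""
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb

class ScaleOutPool:
    def __init__(self, nodes):
        self.nodes = list(nodes)

    @property
    def capacity_gb(self):
        # Usable capacity is simply the sum of the nodes' disks:
        # adding a node grows the pool with no forklift upgrade.
        return sum(n.capacity_gb for n in self.nodes)

    def place(self, key, chunk_index):
        # Spread consecutive chunks of one object round-robin across
        # nodes, so a large read or write is served by many servers
        # in parallel rather than bottlenecking on one.
        start = zlib.crc32(key.encode())
        return self.nodes[(start + chunk_index) % len(self.nodes)]

pool = ScaleOutPool([StorageNode(f"node{i}", 4000) for i in range(4)])
print(pool.capacity_gb)  # 16000 GB pooled from four commodity boxes
print([pool.place("video.mp4", i).name for i in range(4)])
```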

2. Eliminate data migration
With enterprise data stores now approaching petabyte sizes, wholesale data migration is no longer logistically or financially feasible. A big data platform must eliminate the need for periodic data migration altogether by growing in place, without bound.
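One common technique for growing without wholesale migration is consistent hashing. The sketch below (hypothetical names, not Red Hat Storage internals) shows that adding a fifth node to a four-node cluster relocates only about a fifth of the objects, where naive modulo placement would relocate roughly four fifths.

```python
# Consistent hashing sketch: growing the cluster moves ~1/(N+1) of
# the data, not almost all of it. Names are illustrative.
import bisect, hashlib

def h(s):
    return int(hashlib.md5(s.encode()).hexdigest(), 16)

class ConsistentRing:
    def __init__(self, nodes, vnodes=100):
        # Each node owns many small arcs of the hash ring, so load
        # stays even and growth disturbs only neighboring arcs.
        self.ring = sorted((h(f"{n}#{v}"), n) for n in nodes for v in range(vnodes))
        self.keys = [k for k, _ in self.ring]

    def owner(self, key):
        i = bisect.bisect(self.keys, h(key)) % len(self.keys)
        return self.ring[i][1]

objects = [f"obj-{i}" for i in range(10000)]
before = ConsistentRing([f"node{i}" for i in range(4)])
after = ConsistentRing([f"node{i}" for i in range(5)])   # grow by one node

moved = sum(before.owner(o) != after.owner(o) for o in objects)
print(f"objects relocated: {moved / len(objects):.0%}")  # ~20%, i.e. ~1/5
# A naive hash(key) % N placement would relocate ~80% of objects when
# N goes from 4 to 5 -- wholesale migration in disguise.
```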

3. Bridge legacy storage silos
To fully exploit the opportunities of big data, companies must be able to access and use all of their data without ad-hoc interventions. A big data storage platform must therefore bridge the legacy storage silos where that data lives, rather than simply add yet another storage solution to the mix.
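As a rough illustration of bridging rather than replacing, the hypothetical adapter layer below presents one namespace over an existing NFS mount and an object store; the class and method names are assumptions for the sketch, not a real API.

```python
# Sketch of a bridging layer over legacy silos (hypothetical names).
from abc import ABC, abstractmethod

class SiloAdapter(ABC):
    """Common interface the platform exposes over each legacy silo."""
    @abstractmethod
    def read(self, path: str) -> bytes: ...

class NfsSiloAdapter(SiloAdapter):
    def __init__(self, mount_point: str):
        self.mount_point = mount_point
    def read(self, path: str) -> bytes:
        with open(f"{self.mount_point}/{path}", "rb") as f:
            return f.read()

class ObjectStoreAdapter(SiloAdapter):
    def __init__(self, client, bucket: str):
        self.client, self.bucket = client, bucket   # hypothetical client
    def read(self, path: str) -> bytes:
        return self.client.get(self.bucket, path)   # hypothetical API

class UnifiedNamespace:
    """Routes 'silo://path' names to whichever system holds the data."""
    def __init__(self):
        self.silos = {}
    def mount(self, prefix: str, adapter: SiloAdapter):
        self.silos[prefix] = adapter
    def read(self, uri: str) -> bytes:
        prefix, _, path = uri.partition("://")
        return self.silos[prefix].read(path)

ns = UnifiedNamespace()
ns.mount("nas", NfsSiloAdapter("/mnt/legacy-nas"))
# ns.read("nas://finance/q3.csv")  # one naming scheme, any silo
```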

4. Provide global accessibility of data
A centralized approach to data management is no longer workable in the age of big data: data volumes are too large, WAN bandwidth is too limited, and the consequences of a single point of failure are too costly. A big data storage platform must manage data distributed across the global enterprise as a single, unified pool.
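A simplified sketch of that unified pool, with illustrative site names and latency figures: the catalog is global, replicas are placed per site, and each client reads from the nearest copy rather than hauling data back from a central data center.

```python
# One global namespace over distributed data (illustrative values).
LATENCY_MS = {  # round-trip times between sites, for the sketch only
    "us-east": {"us-east": 1, "eu-west": 80, "apac": 200},
    "eu-west": {"us-east": 80, "eu-west": 1, "apac": 160},
    "apac":    {"us-east": 200, "eu-west": 160, "apac": 1},
}

class GlobalCatalog:
    """One logical pool: paths are global, replicas are local."""
    def __init__(self):
        self.replicas = {}  # path -> sites holding a copy

    def publish(self, path, sites):
        self.replicas[path] = set(sites)

    def nearest(self, path, client_site):
        # Serve from the closest copy, so the WAN never becomes the
        # bottleneck and no single site is a point of failure.
        return min(self.replicas[path],
                   key=lambda s: LATENCY_MS[client_site][s])

catalog = GlobalCatalog()
catalog.publish("/logs/2012-06.tar", ["us-east", "apac"])
print(catalog.nearest("/logs/2012-06.tar", "eu-west"))  # -> us-east
```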

5. Protect and maintain the availability of data
Rather than protecting against failure with proprietary, enterprise-grade hardware, a big data storage platform must assume that hardware failure is inevitable and deliver data availability and integrity through intelligent software.
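The sketch below illustrates this software-level approach under simple assumptions (three-way replication, in-memory ‘nodes’): writes land on several commodity nodes, and a read succeeds as long as any replica survives.

```python
# Software-defined availability sketch: replicate on write, route
# around dead nodes on read. Replica count and the node API are
# assumptions for illustration, not product behavior.
import random

class Node:
    def __init__(self, name):
        self.name, self.blobs, self.alive = name, {}, True
    def put(self, key, data):
        if not self.alive:
            raise IOError(f"{self.name} is down")
        self.blobs[key] = data
    def get(self, key):
        if not self.alive or key not in self.blobs:
            raise IOError(f"{self.name} cannot serve {key}")
        return self.blobs[key]

class ReplicatedStore:
    def __init__(self, nodes, replicas=3):
        self.nodes, self.replicas = nodes, replicas
    def put(self, key, data):
        # Write every object to several independent commodity nodes.
        for node in random.sample(self.nodes, self.replicas):
            node.put(key, data)
    def get(self, key):
        # Any surviving replica satisfies the read: failure is
        # expected, and the software routes around it.
        for node in self.nodes:
            try:
                return node.get(key)
            except IOError:
                continue
        raise IOError(f"all replicas of {key} unavailable")

store = ReplicatedStore([Node(f"n{i}") for i in range(6)])
store.put("report.csv", b"q3 numbers")
random.choice(store.nodes).alive = False   # simulate a dead server
print(store.get("report.csv"))             # still served
```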

We expand on these must-haves for big data storage in a newly published white paper; for a deeper look at each of these areas, download it here. To learn more about Red Hat Storage, visit www.redhat.com/liberate.

