How Storage Is Changing In The Age Of Big Data

Data is now the most valuable currency in any enterprise. An organization that does not gather and use its data is highly unlikely to stay on the path to business success. The prolific growth of big data and analytics within the enterprise only cements this truth further.

By 2020 we are looking at approximately 1.7 megabytes of new information created every second for every human being on the planet; the data universe will grow from 4.4 zettabytes today to 44 zettabytes (44 trillion gigabytes)!

Admittedly, a good deal of that data is held on consumer devices or in cloud services, but the figures signal the pace of growth unmistakably.

As data expands, the universe of business opportunities expands too. However, to maximize the opportunities data offers, enterprises have had to re-examine their storage capabilities. After all, over the past few years data itself has changed phenomenally. The concept of flat data tables is antiquated. Applications are changing: once-prominent legacy applications are on a gradual path to redundancy and have paved the way for microservices. The relationships between data items are becoming tighter and demand greater granularity and sophistication.

“The revamped storage architecture gives us four times the performance for a fifth of the cost, and with complete flexibility” – Eddie Satterly, IAG

Traditionally, storage architecture was built to manage data and to scale up as that data grew. This scaling demanded additional capacity, which meant adding more hardware.

The problem was that these were controller-based systems, so the enterprise was left with a great deal of storage sprawl and a very siloed environment. While this worked for a while, as the volume of data kept increasing, in both growth and use, enterprises realized that they needed storage consolidation within a scalable storage infrastructure. The hour of Hybrid IT storage was clearly upon us.

Let us have a look at how data storage has evolved over the past few years:

Software Defined Storage

Software Defined Storage (SDS) has evolved with the growth of Big Data and has become more intelligent. SDS allows big data streams to be clustered, which enables faster data retrieval, supports voluminous data processing, and produces faster, more relevant search results. This intelligent SDS (I-SDS) gives the enterprise the advantage of speed without compromising on accuracy.

Along with active data, enterprises also hold large volumes of passive data: data that does not need to be readily available, which makes it the perfect candidate for cold storage. Enterprises can free up disk space by moving backlog information that is rarely accessed into cold storage, while keeping hotter, frequently used data on Flash or SDS.
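
As a concrete sketch of that hot/cold split, one common implementation is an age-based lifecycle rule on a cloud object store. The snippet below uses Amazon S3's lifecycle API via boto3; the bucket name, prefix, and 90-day threshold are illustrative assumptions, not from the article, and running it requires configured AWS credentials.

# cold_tiering.py - pushing passive data to cold storage with an
# S3 lifecycle rule. Bucket name, prefix, and threshold are assumed.
import boto3

s3 = boto3.client("s3")

# Ninety days after creation, objects under "backlog/" transition to
# the Glacier (cold) storage class; data outside that prefix stays
# on the standard, faster tier.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-enterprise-data",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-passive-data",
            "Filter": {"Prefix": "backlog/"},
            "Status": "Enabled",
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
        }]
    },
)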

We have also seen enterprises veer towards distributed storage frameworks such as Apache Hadoop, whose HDFS file system not only lets organizations store massive files but, through parallel computation, also lets them analyze that data.
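
To make the parallel-computation point concrete, here is a minimal sketch of the classic word-count job written for Hadoop Streaming, which lets plain Python scripts act as the map and reduce steps. The file names and HDFS paths are illustrative, and the location of the streaming jar varies by installation.

# mapper.py - emits "word<TAB>1" for every word read from stdin;
# Hadoop runs many copies of this script in parallel across the cluster.
import sys

for line in sys.stdin:
    for word in line.split():
        print(word + "\t1")

# reducer.py - Hadoop sorts mapper output by key, so identical words
# arrive consecutively; sum the counts for each word.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").rsplit("\t", 1)
    if word != current_word:
        if current_word is not None:
            print(current_word + "\t" + str(current_count))
        current_word, current_count = word, 0
    current_count += int(count)
if current_word is not None:
    print(current_word + "\t" + str(current_count))

A typical invocation (paths are placeholders) would be: hadoop jar hadoop-streaming.jar -input /data/books -output /data/wordcounts -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py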

Flash Storage

Flash storage is another technology that has reinvented itself with the rise of big data. Today, Flash offers enterprises greater storage density with a smaller footprint and at lower cost. High-performance Flash is fast becoming the enterprise workhorse for primary data storage and for the applications that consume that data. Flash is now also being considered for secondary storage, owing to its greater density and falling costs. Instead of building new data centers, enterprises can squeeze more capacity into the same space while keeping the performance advantage. Given the rise of OTT media, media and entertainment companies have been leaning towards Flash because of its ability to respond to sudden spikes in access requests. Enterprises are also considering Flash for backup and data protection, since it lets them add data and respond to user search requests almost instantaneously.

The Cloud

The Cloud promised to ease some of the enterprise's data storage woes: it could hold vast volumes of data, it was scalable, and it promised agility. However, cost and security remained big concerns. The private cloud promised security, but carried high costs. The public cloud was less expensive, but was not the best place to house sensitive data. Enter the hybrid cloud, which combined cloud storage with on-premises storage. The hybrid cloud gave organizations the flexibility to choose the type of storage that each application's security and accessibility demands called for, easing worries about compliance, security, and latency. Added to this, the hybrid cloud also offered cost efficiencies.
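
In practice, a hybrid placement decision often reduces to a simple routing rule per data set. The sketch below is purely illustrative; the class, thresholds, and backend names are assumptions, not anything prescribed by the article.

# placement.py - toy routing rule for a hybrid cloud setup.
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    sensitive: bool          # e.g. regulated or customer data
    latency_critical: bool   # needs consistently fast access
    size_tb: float

def choose_backend(ds: DataSet) -> str:
    # Keep sensitive or latency-critical data on-premises for control
    # and proximity; push bulk, non-sensitive data to cheaper cloud tiers.
    if ds.sensitive or ds.latency_critical:
        return "on-premises"
    if ds.size_tb > 100:
        return "public-cloud-archive"
    return "public-cloud-standard"

if __name__ == "__main__":
    for ds in (DataSet("payments", True, True, 2.0),
               DataSet("clickstream", False, False, 350.0),
               DataSet("marketing-assets", False, False, 5.0)):
        print(ds.name, "->", choose_backend(ds))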

Clearly, Big Data has also led a storage revolution. However, when making storage choices, it has become increasingly clear that no single storage option leads to storage nirvana. What makes far more sense is an appropriate mix of storage technologies, using Flash or SSDs for data that demands fast access and HDD-based storage for data that does not demand high performance, within a tiered architecture that pairs fast but expensive storage with Tier 1 applications and slower but cheaper storage with Tier 2 applications.

Finally, enterprises have also recognized that moving data around is an expensive and time-consuming process. Keeping data close to the applications that use it, or building a distributed processing system that pre-processes portions of the data near the point where it is generated, therefore goes a long way toward a robust data storage system.
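
A minimal sketch of that pre-processing idea follows; the sensor readings and the summary format are assumptions made for illustration. Instead of shipping every raw reading to central storage, an edge node collapses each window of readings into one compact summary record first.

# edge_preprocess.py - toy example of pre-processing data near its source.
from statistics import mean

def summarize(readings, window=60):
    # Collapse each window of raw readings into one summary record,
    # shrinking what must cross the network to central storage.
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "count": len(chunk),
            "min": min(chunk),
            "max": max(chunk),
            "avg": round(mean(chunk), 2),
        })
    return summaries

if __name__ == "__main__":
    raw = [20.0 + (i % 7) * 0.5 for i in range(180)]  # 180 raw readings
    print(summarize(raw))  # 3 summaries instead of 180 raw values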
