5 trends that will drive big data in 2016

Big data, the cloud, mobile technology and IT security continued to grab the lion’s share of technology headlines in 2015, and IT and data leaders can expect more of the same in 2016.

That is the prediction of MapR CEO and co-founder John Schroeder, who sees an acceleration in big data deployments and has crystallized his view of market trends into five major predictions for 2016.

Converged Approaches Become Mainstream

“For the last few decades, the accepted best practice has been to keep operational and analytic systems separate, in order to prevent analytic workloads from disrupting operational processing,” Schroeder explains.

HTAP (Hybrid Transaction/Analytical Processing) was coined in early 2014 by Gartner to describe a new generation of data platforms that can perform both online transaction processing (OLTP) and online analytical processing (OLAP) without requiring data duplication.

“In 2016, we will see converged approaches become mainstream as leading companies reap the benefits of combining production workloads with analytics to adjust quickly to changing customer preferences, competitive pressures, and business conditions. This convergence speeds the ‘data to action’ cycle for organizations and removes the time lag between analytics and business impact,” Schroeder says.
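To make the HTAP idea concrete, here is a minimal toy sketch in Python: a single SQLite database stands in for a converged platform, accepting transactional writes and serving an analytical aggregate from the same live store, with no duplicated copy in between. SQLite is only an illustration of the pattern, not an HTAP system, and the table and column names are invented for the example.

```python
import sqlite3

# Toy illustration only: one SQLite database stands in for a converged
# (HTAP-style) platform, so transactional writes and analytical reads hit
# the same store with no ETL copy in between. Real HTAP systems add
# concurrency, scale-out, and hybrid row/column storage that SQLite lacks.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# OLTP side: small, frequent transactional writes.
with conn:  # the context manager commits the transaction
    conn.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("east", 120.0), ("west", 75.5), ("east", 42.25)],
    )

# OLAP side: an analytical aggregate over the same live data,
# with no duplication into a separate warehouse.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"
):
    print(region, total)
```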

Pendulum Swing from Centralized to Distributed

Tech cycles have swung back and forth from centralized to distributed workloads, Schroeder says.

“Big data solutions initially focused on centralized data lakes that reduced data duplication, simplified management and supported a variety of applications, including customer 360 analysis. However, in 2016, large organizations will increasingly move to distributed processing for big data to address the challenges of managing multiple devices, multiple data centers, multiple global use cases and changing overseas data security rules (such as Safe Harbor),” Schroeder says.

“The continued growth of the Internet of Things (IoT), cheap IoT sensors, fast networks, and edge processing will further dictate the deployment of distributed processing frameworks,” Schroeder adds.
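As a hedged illustration of that edge-processing pattern, the sketch below (plain Python, with all names invented for the example) summarizes raw sensor readings at an edge site so that only compact rollups travel to the central cluster, rather than every raw reading.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sketch of edge-side pre-aggregation: each site summarizes
# its own sensor readings locally and forwards only compact rollups,
# rather than streaming raw data to a central data center. The function
# and field names are illustrative, not part of any specific product.

def rollup(readings):
    """Reduce raw (sensor_id, value) readings to one summary per sensor."""
    by_sensor = defaultdict(list)
    for sensor_id, value in readings:
        by_sensor[sensor_id].append(value)
    return {
        sensor_id: {"count": len(vals), "mean": mean(vals), "max": max(vals)}
        for sensor_id, vals in by_sensor.items()
    }

# Raw readings captured at one edge site; only the rollup is sent onward.
site_readings = [("temp-1", 21.4), ("temp-1", 21.9), ("temp-2", 19.7)]
print(rollup(site_readings))  # compact summary shipped to the central cluster
```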

Storage Becomes an Abundant Resource

“Next-generation, software-based storage technology is enabling multi-temperature (fast and dense) solutions,” Schroeder explains. “Flash memory is a key technology that will enable new designs for products in the consumer, computer and enterprise markets. Consumer demand for flash will continue to drive down its cost, and flash deployments in big data will begin. The optimal solution will combine flash and disk to support both fast and dense configurations.”

In 2016, Schroeder says this new generation of software-based storage that enables multi-temperature solutions will proliferate. “Organizations will not have to choose between fast and dense—they will be able to get both,” he says.
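One way to picture a multi-temperature policy is a simple placement rule that routes recently touched data to flash and colder data to dense disk. The sketch below is a minimal illustration under assumed parameters; the tier names and the one-day hot window are inventions for the example, not MapR's design.

```python
import time

# Hypothetical sketch of a multi-temperature placement policy: recently
# accessed ("hot") objects go to flash, everything else to dense disk.
# The threshold and tier names are illustrative assumptions.

HOT_WINDOW_SECONDS = 24 * 3600  # assume data touched in the last day is hot

def choose_tier(last_access_ts, now=None):
    """Return the storage tier for an object based on access recency."""
    now = now if now is not None else time.time()
    return "flash" if (now - last_access_ts) <= HOT_WINDOW_SECONDS else "disk"

now = time.time()
print(choose_tier(now - 600, now))        # touched 10 minutes ago -> flash
print(choose_tier(now - 7 * 86400, now))  # touched a week ago -> disk
```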

Increased Focus on Fundamental Value

In 2016, IT and data pros will focus less on the latest and greatest “shiny object” software downloads, as Schroeder termed them, and more on proven technologies that provide fundamental business value.

“New community innovations will continue to garner attention, but in 2016, companies will increasingly recognize the attraction of software that results in business impact, rather than focusing on raw big data technologies,” Schroeder says.

Markets Experience a ‘Flight to Quality’

Finally, ending on a somewhat promotional note, Schroeder says: “In terms of big data technology companies, investors and organizations will turn away from volatile companies that have frequently pivoted in their business models. Instead, they will focus on more secure options: companies that have both a proven business model and technology innovations that enable improved business outcomes and operational efficiencies.”

Source: http://linkis.com/thoughtsoncloud.com/HCXMV
