New technologies abound and are constantly emerging in the big data world. Today, a greater variety of data than ever before is being collected, stored, and analyzed by more users. Smart machines are spurring immediate action with sensors streaming data across the Internet of Things. Big data processing is being done on Hadoop, Apache Spark, and other frameworks with data often stored in open source and NoSQL database platforms in addition to established relational database systems. And, supporting all that diversity of choice for handling data is the cloud, which is often seen as the antidote to growing IT complexity.
In terms of big data storage and processing, for example, the Hadoop market was valued at $7.69 billion in 2016 and is expected to reach approximately $87.14 billion by 2022, according to Zion Market Research. And, a 2016 survey by the Taneja Group found that more than half of all respondents, 54%, are actively using Spark, with 64% intending to increase usage over the next 12 months. Many Spark users are also leveraging storage technologies beyond Hadoop, including NoSQL as well as open source and proprietary SQL data stores, confirms Databricks’ 2016 Apache Spark Survey.
In addition, in-memory technologies are being tapped to deliver insights to decision makers more rapidly. According to a Unisphere-IOUG survey, a majority of respondents, 52%, saw in-memory technology as critical to their future endeavors. More than a quarter, 28%, were implementing in-memory technologies, while 23% were piloting these solutions.
Cognitive computing and IoT are also poised to shake up the big data market.
Though still in the early days, investment in AI is growing, driven largely by tech giants. In 2016, companies directed $26-$39 billion to AI initiatives, with large tech firms accounting for $20-$30 billion of that spending, according to the authors of a McKinsey Global Institute report. These firms focused 90% of their investment on R&D and deployment and 10% on AI acquisitions. Not surprisingly, McKinsey found AI adoption is greatest in sectors that are already strong digital adopters, such as high tech/telecom, automotive/assembly, and financial services. Among companies at various stages of IoT adoption or evaluation, the opportunity to introduce or increase new business revenue sources is the leading motivation, followed by the ability to increase product and customer knowledge, according to a joint Unisphere Research-Radiant Advisors survey.
Underlying all of these new technologies is cloud computing, which is opening up possibilities that were unthinkable when all systems and data had to be deployed in the physical data center. Today, on-premises deployments as well as public and private cloud approaches are being embraced. Indeed, a hybrid mix of cloud and physical deployments is widely expected to persist for some time, though the proportions may vary by company and industry. According to an IBM study, companies identified as “leaders” are using hybrid cloud to jumpstart next-generation initiatives. In addition, eight in 10 leaders believe hybrid cloud helps reduce the amount of shadow IT, and 85% believe hybrid cloud drives more collaboration between IT and lines of business.
The challenge is evaluating new and disruptive technologies and determining when and where they may prove useful. Against the backdrop of this rapidly evolving big data scene, this year Big Data Quarterly presents the newest “Big Data 50,” an annual list of forward-thinking companies that are working to expand what’s possible in terms of collecting, storing, and deriving value from data.
We encourage you to explore these solution providers by visiting their websites. You can also gain insight into trends in how data is being managed and consumed by accessing Unisphere Research’s survey reports at www.unisphereresearch.com as well as an extensive library of best practices reports and white papers at www.dbta.com/DBTA-Downloads/WhitePapers.