Defining and Preparing for the Future of Analytics


Modern data architectures, built on ecosystems of integration, scalability, and real-time data, continue to reshape the ever-evolving world of analytics. The drive to invest in complex workloads, including AI and machine learning (ML), is transforming how data must be managed, rendering traditional data architectures unfit for the task.

Experts joined DBTA's special roundtable webinar, "The Future of Analytics: Cloud Data Warehouses, Data Lakehouses, and More," to discuss how better, faster insights can be achieved through an assortment of modern systems that emphasize flexibility, scalability, and agility.

Optionality, spanning human skill, data, and technology, is a fundamental shaper of the analytical future, according to Patrick Masi-Phelps, sales engineer, partnerships, at Dataiku. Yet this sort of agility is challenged by enterprises' desire for end-to-end operational transparency centralized within a single place. In other words, the freedom to choose among data sources, technologies, and the skills people need to access data is difficult to reconcile with the need for a transparent, single source of truth.

To address this problem, Dataiku “tries to empower all employees to use data and AI…to connect to data no matter what it is…integrate with all technologies…and govern the full lifecycle,” explained Masi-Phelps.

Each of these components serves to future-proof analytics, from supporting different types of data, whether structured, unstructured, or semi-structured, to integrating with major technologies as they continue to arise. Paired with an end-to-end operational view of the full process, from data to AI, Dataiku ensures that flexibility, along with a secure, scalable framework, remains possible for the analytics space.

Simplicity, scale, and governance are crucial pieces of the future of analytics puzzle, noted Scott Teal, product marketing at Snowflake.

As a major player in the data warehousing market, Snowflake continues to expand its platform as one that “helps organizations in lots of industries address their silos and complexity challenges, and we’re all about making data and AI easy, efficient, and trustworthy,” said Teal.

Beyond this overarching mission, Snowflake's appeal lies in a unified data and AI strategy, made easy to use through fully managed infrastructure. From a single platform, enterprises can build out whatever architectural pattern fits their needs, including a data warehouse, data lake, data lakehouse, or a data mesh/fabric.

Nate Stewart, CEO, Materialize, emphasized that acting confidently on fast-changing data is a defining feature of a successful, modern data system. Stewart posed a pertinent question: How can you make trustworthy, transformed data available throughout your systems and teams while it's still fresh?

Enterprises can move closer to where their data lives, yet this slows down queries; they can adopt a data warehouse for faster queries, yet the data becomes stale; they can implement a stream processor, but this introduces complexity and talent bottlenecks.

Materialize's real-time data integration platform aims to solve this ever-present challenge, enabling organizations to transform, deliver, and act on fast-changing data using only SQL. Instead of executing complex queries on read, Materialize does small amounts of work as writes come in, shifting data transformation from a reactive to a proactive approach and increasing efficiency and cost savings.
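To make that write-time model concrete, consider a minimal sketch along the following lines. It is not from the webinar; it assumes a hypothetical "orders" source and placeholder connection details. Because Materialize speaks the PostgreSQL wire protocol, a standard Postgres driver such as psycopg2 can connect to it.

```python
# A minimal sketch, assuming a hypothetical "orders" source already exists in
# Materialize. Connection details are placeholders (6875 is Materialize's
# conventional default port for self-hosted deployments).
import psycopg2

conn = psycopg2.connect("postgresql://user:password@localhost:6875/materialize")
conn.autocommit = True

with conn.cursor() as cur:
    # Materialize maintains this view incrementally as writes arrive, doing a
    # small amount of work per update instead of recomputing on every read.
    cur.execute("""
        CREATE MATERIALIZED VIEW order_totals AS
        SELECT customer_id, SUM(amount) AS total_spent
        FROM orders
        GROUP BY customer_id
    """)

    # Reads return fresh, already-transformed results.
    cur.execute("SELECT customer_id, total_spent FROM order_totals LIMIT 5")
    for customer_id, total_spent in cur.fetchall():
        print(customer_id, total_spent)
```

The pattern's appeal is that the aggregation is kept up to date continuously as new rows arrive, so the final SELECT returns fresh results without re-running the full query on read.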

Capable of integrating into enterprises’ existing tech stacks, Materialize enables the assembly and modification of trustworthy, live data products in minutes, without requiring special talent, according to Stewart.

Modern analytics challenges are born out of the fact that enterprises have data in many different systems, as Ravi Krishnan, head of sales engineering at Nexla, pointed out. With data sprawled across SaaS apps, on-prem apps, third-party apps, analytical systems, and more, integration becomes a complex, expansive problem. There are simply too many integration styles and tools, said Krishnan, and this introduces significant friction and a lack of interoperability into the tech stack.

Krishnan explained that the solution lies within a unified, collaborative approach to infrastructure management, where consumers are met where they are.

Nexla’s enterprise data and AI integration platform—powered by metadata intelligence—enables organizations to create virtual data products from any data source, in any data system, which can be used by a variety of personas, including analysts, data scientists, developers, and business users. This is all centralized under the umbrella of AI readiness, ensuring that GenAI projects can succeed with converged integration and a governed, controlled environment.

Following their presentations, the speakers engaged in a roundtable discussion to further explore the nuances of the future of analytics.

For the full, in-depth webinar, you can view an archived version here.