Powering any successful business is a robust data management strategy, where data quality and data governance reign supreme. Among DBTA subscribers, 60% reported that they have submitted budgets to modernize their data management initiatives, driven by promises of enhanced operational efficiency and decision making.
In Achieving Trusted Data: Data Quality, Data Governance and MDM Trends, one of DBTA’s latest webinars, data management experts examined current best practices and key solutions for modernizing data management, as well as the potential opportunities and present challenges, both amplified by advanced analytics and AI.
Abhi S. Visuvasam, field CTO, Reltio, centered his portion of the conversation on “where we see data unification—what MDM can do for you—in 2025 and beyond.” Currently, CDOs are facing ongoing challenges around data volume, variety, and variability, while encountering new obstacles propelled by enterprise and agentic AI.
Visuvasam identified five main themes shaping data trust, which Reltio’s unification platform is currently addressing:
- The “golden record” has evolved, and will continue to evolve, into a multi-modal trusted profile: Enterprise AI will be powered by trusted, unified profiles consisting of behavioral, factual, unstructured, and derived attribute data, as well as each profile’s relationships (a minimal sketch of such a profile follows this list).
- Emergence of data unification and data delivery as two related but distinct needs: Reltio’s always-on, real-time data delivery serves unified data in milliseconds, with a 99.99% availability guarantee.
- Still, a lot of manual effort goes into data management: With agentic intelligence, enterprises can reduce the toil of data management tenfold.
- Data continues to migrate to the cloud: Building deeper product integrations across the cloud ecosystem helps drive data trust.
- Data teams are under pressure to show return on investment (ROI): Reltio’s platform works out of the box, reducing the effort required to deploy and operate it and the need for support during initial installation and ongoing operation.
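To make the multi-modal profile idea concrete, here is a minimal, hypothetical sketch of what such a trusted profile might look like as a data structure. The field names and attribute groupings are illustrative assumptions only, not Reltio’s actual data model or API.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Tuple

@dataclass
class TrustedProfile:
    """Hypothetical multi-modal trusted profile (illustrative only)."""
    profile_id: str
    factual: Dict[str, Any] = field(default_factory=dict)        # verified facts, e.g., name, country
    behavioral: Dict[str, Any] = field(default_factory=dict)     # interaction data, e.g., purchase history
    unstructured: List[str] = field(default_factory=list)        # notes, emails, call transcripts
    derived: Dict[str, Any] = field(default_factory=dict)        # computed attributes, e.g., scores
    relationships: List[Tuple[str, str]] = field(default_factory=list)  # (relation, other profile_id)

profile = TrustedProfile(
    profile_id="C-1001",
    factual={"name": "Jane Doe", "country": "US"},
    behavioral={"last_purchase": "2025-03-14"},
    unstructured=["Called support about billing on 2025-04-02."],
    derived={"lifetime_value": 4200.0},
    relationships=[("household_member_of", "C-1002")],
)
```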
Dimitri Garder, director, special projects, Melissa, emphasized that what differentiates Melissa’s approach to data quality is its attention to referential data. By leveraging systems of record, Melissa can ensure data quality at an extremely granular level, driving its data quality verification, normalization, and orchestration capabilities.
“The way we [Melissa] like to think about data quality in the context of identity verification is that data quality really serves as the foundation for all of the other downstream processing that needs to occur to achieve that high trust in the data,” said Garder. “It starts with the verification of the individual attributes, and those stages build on one another...the foundation of this is based on verification of data against known, proven, high-quality referential datasets.”
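In code, that staged, referential approach might look something like the sketch below, where each stage checks an attribute against a trusted reference set and a record is trusted only if every stage passes. The reference sets, field names, and checks here are hypothetical and greatly simplified; they are not Melissa’s actual datasets or API.

```python
# Assumed, toy reference datasets standing in for high-quality referential data.
KNOWN_POSTAL_CODES = {"10001": "New York", "94105": "San Francisco"}
KNOWN_EMAIL_DOMAINS = {"example.com", "acme.com"}

def verify_postal_code(record: dict) -> bool:
    """Stage 1: verify the postal code against a trusted reference dataset."""
    return record.get("postal_code") in KNOWN_POSTAL_CODES

def verify_email(record: dict) -> bool:
    """Stage 2: verify the email domain against a reference set of known domains."""
    _, _, domain = record.get("email", "").partition("@")
    return domain in KNOWN_EMAIL_DOMAINS

def verify_record(record: dict) -> bool:
    """Stages build on one another: the record is trusted only if every stage passes."""
    return all(check(record) for check in (verify_postal_code, verify_email))

print(verify_record({"postal_code": "10001", "email": "jane@example.com"}))  # True
```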
Applying this methodology to a case study, Garder walked webinar viewers through a typical customer pain point: customer interactions are fragmented across many IT silos, it is difficult to obtain a clear view of the customer, and, as a result, customer experience suffers.
The solution, according to Garder, implements real-time cleansing, matching, and merging capabilities in an MDM to build out a system of record against which all data is validated. By providing these capabilities in an MDM-agnostic fashion, Melissa enables enterprises to develop 360-degree customer views built on trusted data, which ultimately drives data unification and positive customer experiences.
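As a rough illustration of that cleanse, match, and merge flow, the minimal sketch below assembles a single golden record from fragmented silo records. The field names, matching rules, and survivorship logic are assumptions for illustration, not the behavior of any particular product.

```python
def cleanse(record: dict) -> dict:
    """Normalize casing and whitespace so records can be compared reliably."""
    return {k: v.strip().lower() if isinstance(v, str) else v for k, v in record.items()}

def match(a: dict, b: dict) -> bool:
    """Treat two records as the same customer if emails agree, or name plus postal code agree."""
    return a.get("email") == b.get("email") or (
        a.get("name") == b.get("name") and a.get("postal_code") == b.get("postal_code")
    )

def merge(records: list) -> dict:
    """Merge matched records, letting the most recently updated non-empty values survive."""
    golden = {}
    for rec in sorted(records, key=lambda r: r.get("updated", "")):
        golden.update({k: v for k, v in rec.items() if v})
    return golden

crm = cleanse({"name": "Jane Doe ", "email": "JANE@EXAMPLE.COM", "updated": "2025-01-01"})
billing = cleanse({"name": "jane doe", "email": "jane@example.com", "phone": "555-0100", "updated": "2025-03-01"})
if match(crm, billing):
    print(merge([crm, billing]))  # one unified customer record built from both silos
```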
Meeting organizations where they are in their data quality journeys is a key factor in enabling those enterprises to deliver positive business outcomes. Danny Sandwell, technology strategist, erwin by Quest, highlighted erwin by Quest’s framework for determining where businesses stand on data quality and helping them deliver trusted, AI-ready data, which includes the following steps (a brief sketch of the flow appears after the list):
- Model: Design data architecture.
- Catalog: Search and find data easily.
- Curate: Enrich with business context.
- Govern: Apply business rules and policies.
- Observe: Raise data visibility and integrate data quality.
- Score: Automate data value scoring.
- Shop: Make trusted, governed data widely accessible.
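One way to picture the framework is as successive stages that each enrich a data asset’s metadata. The sketch below is a hypothetical, heavily simplified rendering of that flow, with invented field values; it is not erwin by Quest’s actual product workflow or API.

```python
# Start with a bare data asset and pass it through the seven steps in order.
asset = {"name": "customer_orders", "source": "warehouse.sales.orders"}

def model(a):   return {**a, "schema": ["order_id", "customer_id", "amount"]}      # design data architecture
def catalog(a): return {**a, "searchable": True}                                    # make it easy to find
def curate(a):  return {**a, "business_context": "Orders used for revenue reporting"}
def govern(a):  return {**a, "policies": ["PII masked", "retain 7 years"]}
def observe(a): return {**a, "quality_checks": ["no null order_id", "amount >= 0"]}
def score(a):   return {**a, "value_score": 0.92}                                   # automated value scoring
def shop(a):    return {**a, "published_to_marketplace": True}                      # make it widely accessible

for step in (model, catalog, curate, govern, observe, score, shop):
    asset = step(asset)
print(asset)
```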
AI model governance adheres to the same seven steps, noted Sandwell, as data is the fuel that powers AI. Without trusted, quality data, AI initiatives are doomed to fall victim to the same data challenges that have plagued organizations for years.
A crucial component of erwin by Quest’s seven-step process is creating prescriptive, reliable, and explainable data products. Data products are purpose-built containers of curated and governed data assets aligned to a business need or use case, explained Sandwell. These data products must be accessible, available, and discoverable, demonstrating clear business value and lineage of ownership throughout their lives.
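To ground the idea, here is a minimal, hypothetical representation of a data product as a container with a use case, an owner, its curated assets, and its lineage. The schema is an assumption for illustration, not a specific product’s metadata model.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Hypothetical data product: governed assets packaged around one business use case."""
    name: str
    use_case: str                                  # the business need the product serves
    owner: str                                     # accountable owner across the product's life
    assets: list = field(default_factory=list)     # curated, governed datasets it packages
    lineage: list = field(default_factory=list)    # upstream sources feeding the product
    discoverable: bool = True                      # published so consumers can find and use it

churn_product = DataProduct(
    name="customer_churn_features",
    use_case="Predict and reduce customer churn",
    owner="marketing-analytics",
    assets=["crm_contacts", "billing_history"],
    lineage=["warehouse.crm.contacts", "warehouse.billing.invoices"],
)
print(churn_product)
```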
This is only a snippet of the full Achieving Trusted Data: Data Quality, Data Governance and MDM Trends webinar. For more detailed explanations, a Q&A, and more, you can view an archived version of the webinar here.