Enterprise functions are inherently diverse: a successful enterprise is built from many teams, tools, and systems, each with its own area of expertise. That necessary variety creates gaps in data handling, where disconnected operations can cause confusion and slow productivity. Enter the semantic layer: the business glue that connects BI tools, users, and data through a common language shared across the enterprise.
DBTA held a webinar, “How a Semantic Layer Makes Data Mesh Work at Scale,” with Elif Tutuk, VP of product at AtScale, to investigate how a semantic layer serves as the indispensable backbone of the modern data stack, best leveraged when paired with scalable, business-ready systems and AI/ML-augmented analytics.
Tutuk defined the semantic layer as a business representation of corporate data that helps end users access data autonomously, mapping complex data into familiar business terms (e.g., product, consumer, revenue). In other words, the semantic layer enables data to speak the language of the business, making data business-ready from its inception.
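As a rough illustration of that mapping (a hypothetical Python sketch, not AtScale’s actual modeling syntax), a semantic model might bind business terms such as product and revenue to physical warehouse columns, so a request phrased in business language is translated into SQL against the raw schema:

```python
# Minimal, hypothetical sketch of a semantic layer mapping business terms
# to physical warehouse columns. Table and column names are illustrative only.

SEMANTIC_MODEL = {
    "dimensions": {
        "product":  "raw_sales.prod_sku_cd",      # business term -> physical column
        "consumer": "raw_sales.cust_id",
    },
    "measures": {
        "revenue": "SUM(raw_sales.net_amt_usd)",  # business metric -> aggregation
    },
}

def query(measure: str, by: str) -> str:
    """Translate a business-language request into SQL against the raw schema."""
    dim = SEMANTIC_MODEL["dimensions"][by]
    agg = SEMANTIC_MODEL["measures"][measure]
    table = dim.split(".")[0]
    return (
        f"SELECT {dim} AS {by}, {agg} AS {measure}\n"
        f"FROM {table}\n"
        f"GROUP BY {dim}"
    )

if __name__ == "__main__":
    # An analyst asks for "revenue by product"; the layer handles the physical mapping.
    print(query("revenue", by="product"))
```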
Yet Tutuk also proposed that it was time for a semantic layer that could adapt to the modernized enterprise, necessitating a “rethinking” of the semantic layer to meet scaled, real-time, multi-dimensional demands.
Data mesh, referred to by Tutuk as “the new kid in town,” is a practice of building a decentralized analytics architecture in which business domains are responsible for their data, giving ownership to the group that is closest to, and best understands, that data. A successful data mesh adapts with agility, provides governance, and remains highly performant while conserving cloud costs.
But what does this have to do with the semantic layer? The semantic layer is woven into the data mesh, acting as a repository of common models and definitions that is accessible in familiar business terms. The data mesh, in turn, is composed of these semantic objects, improving access for the various teams that consume them.
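To sketch how such semantic objects could be composed and shared across domains (again a hypothetical illustration, not the webinar’s implementation), imagine a common repository that domain teams publish their definitions into, so every consumer resolves a business term to the same agreed-upon meaning:

```python
# Hypothetical sketch: a shared repository of semantic objects that domain
# teams publish into, so every domain reuses the same definitions.
from dataclasses import dataclass, field

@dataclass
class SemanticObject:
    name: str          # business term, e.g. "revenue"
    owner_domain: str  # the domain team closest to the data
    definition: str    # the agreed-upon expression or description

@dataclass
class SemanticRepository:
    objects: dict = field(default_factory=dict)

    def publish(self, obj: SemanticObject) -> None:
        # Domains own their objects but publish them under common governance.
        if obj.name in self.objects:
            raise ValueError(f"'{obj.name}' is already defined by "
                             f"{self.objects[obj.name].owner_domain}")
        self.objects[obj.name] = obj

    def lookup(self, name: str) -> SemanticObject:
        # Any team resolves a business term to one shared definition.
        return self.objects[name]

repo = SemanticRepository()
repo.publish(SemanticObject("revenue", "sales", "SUM(net_amount_usd)"))
repo.publish(SemanticObject("consumer", "marketing", "Distinct registered customer"))
print(repo.lookup("revenue").definition)  # every domain sees the same metric
```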
The AtScale Semantic Platform joins data integration tools with the semantic layer for better control over governance, definitions, and pipelines, while also enabling decentralized data products. AtScale supports dimensional-analysis best practices, accommodates different modeling personas, and increases composability for model productivity. With centralized governance, the platform delivers high performance while maintaining semantic layer integrity, enforcing access control, and keeping metrics, dimensions, and models consistent. Whether a business user prefers Excel for financial analysis or Python for data science, AtScale lets them work in their tool of choice against the same semantic layer.
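The value of that consistency can be pictured with one more hypothetical sketch (not AtScale’s API): whichever client a user works in, the request is resolved through the same governed metric definitions and access rules:

```python
# Hypothetical sketch of centralized governance over one semantic layer:
# every client tool routes requests through the same model, so metric
# definitions and access rules stay consistent. Names are illustrative.

GOVERNED_METRICS = {"revenue": "SUM(net_amount_usd)"}   # single source of truth
ACCESS_POLICY = {"finance": {"revenue"}, "marketing": set()}

def resolve(metric: str, role: str, client: str) -> str:
    """Resolve a metric for any client (Excel, Python, a BI tool) identically."""
    if metric not in ACCESS_POLICY.get(role, set()):
        raise PermissionError(f"role '{role}' may not read '{metric}'")
    # The client tool changes, but the governed definition does not.
    return (f"-- requested via {client}\n"
            f"SELECT {GOVERNED_METRICS[metric]} AS {metric} FROM sales")

print(resolve("revenue", role="finance", client="Excel pivot table"))
print(resolve("revenue", role="finance", client="Python notebook"))
```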
Tutuk further explained how AtScale works, including its deployment, semantic engine, and TPC-DS 10TB benchmarks illustrating the query improvements AtScale delivers on cloud data warehouses such as Snowflake and BigQuery. To learn more about AtScale’s unique technology, as well as view a demo of the platform, you can view an archived version of the webinar here.