The proliferation of microservices architectures reflects the needs of modern business. Faster time-to-market, stack flexibility, and better scalability and resilience, each a key prerequisite for AI and edge use cases, are a few of the infrastructural advantages achievable through microservices. However, microservices architectures are as complex to manage and perfect as they are beneficial to adopt.
In DBTA’s latest webinar, Building a Microservices Architecture for the Edge and AI Era, experts discussed key technologies and best practices crucial for implementing a microservices architecture capable of supporting AI and edge use cases.
Managing many small, distributed systems, an inherent quality of a microservices architecture, introduces significant infrastructure complexity. Josh Arenberg, field engineer, Materialize, emphasized that the key is to simplify these systems by coordinating and consistently joining data across microservice databases without breaking encapsulation.
Arenberg explained that while microservices architectures have proven effective over the years, “there’s no good way to support complex queries on data that spans multiple microservices without breaking the encapsulation that gives the architecture all its benefit.” Data silos are the inherent, biggest challenge of microservice databases: issues arise whenever a query requires information from more than one microservice.
Materialize is a real-time data integration platform that enables users to transform, deliver, and act on fast-changing data using just SQL. The platform enables OLAP queries on OLTP data without compromising the trustworthiness of results, no matter where the data resides, and it adapts to the team composition and budget enterprises have today. For microservices architectures, Materialize keeps queries fast, consistent, and fresh while leaving encapsulation intact, helping tie microservices together.
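To make the idea concrete, here is a minimal sketch, in Go, of what a cross-service join of this kind might look like. It assumes a Materialize instance reachable over the Postgres wire protocol on its default port, and sources named orders and customers that have already been created from each service’s database; the connection string, table names, and columns are all hypothetical.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // Materialize speaks the Postgres wire protocol
)

func main() {
	// Hypothetical connection string; a real deployment has its own host and credentials.
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:6875/materialize?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// A materialized view joining data owned by two different services
	// (hypothetical "orders" and "customers" sources). Each service keeps
	// its own database; Materialize maintains the joined result incrementally.
	if _, err := db.Exec(`
		CREATE MATERIALIZED VIEW order_totals AS
		SELECT c.customer_id, c.name, SUM(o.amount) AS total_spent
		FROM orders o
		JOIN customers c ON c.customer_id = o.customer_id
		GROUP BY c.customer_id, c.name`); err != nil {
		log.Fatal(err)
	}

	// Query the view like any Postgres table; results stay fresh as the
	// underlying service databases change.
	rows, err := db.Query("SELECT name, total_spent FROM order_totals ORDER BY total_spent DESC LIMIT 5")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var name string
		var total float64
		if err := rows.Scan(&name, &total); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s: %.2f\n", name, total)
	}
}
```

An application that needs the joined result simply queries the view; each service’s own database remains the system of record, so encapsulation is preserved.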
Jean-Noël Moyne, field CTO, Synadia, centered his discussion on NATS, the open source connectivity and messaging system optimized for simplicity, portability, and adaptability. Capable of simplifying the software stack by reducing reliance on systems like Kafka, Redis, Amazon S3, and more, NATS is compelling for microservices architectures because it offers:
- Distributed queuing (with geo-affinity)
- Request/reply, request-many, and work-queue streaming patterns (illustrated in the sketch after this list)
- A subject hierarchy with wildcards that simplifies observability
- No need for DNS, port numbers, load balancers, proxies, or service discovery
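As a minimal sketch of those patterns, the following Go snippet uses the nats.go client and assumes a NATS server running locally at the default URL; the subject names, queue group, and payloads are hypothetical.

```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/nats-io/nats.go"
)

func main() {
	// Connect to a locally running NATS server (default URL, assumed here).
	nc, err := nats.Connect(nats.DefaultURL)
	if err != nil {
		log.Fatal(err)
	}
	defer nc.Close()

	// A service instance subscribes on a hierarchical subject as part of a
	// queue group; additional instances in the same group share the load
	// (distributed queuing) with no load balancer or service discovery.
	if _, err := nc.QueueSubscribe("orders.lookup", "order-service", func(m *nats.Msg) {
		m.Respond([]byte("order " + string(m.Data) + ": shipped"))
	}); err != nil {
		log.Fatal(err)
	}

	// A wildcard subscription on the subject hierarchy observes every
	// message published under "orders.>", which simplifies observability.
	if _, err := nc.Subscribe("orders.>", func(m *nats.Msg) {
		log.Printf("observed %s: %s", m.Subject, string(m.Data))
	}); err != nil {
		log.Fatal(err)
	}

	// Another microservice issues a request and waits for a single reply.
	reply, err := nc.Request("orders.lookup", []byte("42"), 2*time.Second)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(reply.Data))

	time.Sleep(100 * time.Millisecond) // let the async observer handler print
}
```

Because the requester names only a subject, not a host, the same code runs unchanged whether the responder lives in the same cluster or at a remote edge site.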
At the edge, NATS thrives, according to Moyne, because it has no external dependencies, can run everywhere, and ships as a single 15MB static Go binary. For AI, that same footprint makes NATS easy and efficient to deploy for running inference anywhere, dynamically balancing velocity and value.
Abhilash Mula, senior manager, product management, data integration, Informatica, explained that “the shift from monolithic architectures to microservices is not happening in isolation. It’s being driven by cloud-native technologies that provide the flexibility, scalability, and resilience needed for modern, AI-driven workloads.”
Cloud computing, then, is the foundation for microservices architectures, according to Mula. Like the cloud, microservices architectures suffer from challenges associated with data silos and fragmentation. Building a modern data architecture is fundamental to powering microservices: integrating various data sources and preparing the data for action addresses the ever-present issue of data silos. For microservices, AI, and edge, a modern data architecture should provide:
- Decentralized data access and processing
- Clean, integrated data
- Support for real-time, batch, and event-driven architectures
Informatica’s unified data management platform supports all of these modern data architectures, explained Mula. Powered by the Intelligent Data Management Cloud (IDMC), the platform enables seamless data integration and engineering while ensuring data quality, observability, governance, privacy, and security, all of which are crucial for AI-driven decision-making, noted Mula.
This is only a snippet of DBTA’s Building a Microservices Architecture for the Edge and AI Era webinar. For the full broadcast, featuring in-depth explanations, a roundtable discussion, and more, you can view an archived version of the webinar here.