Confluent, Inc., the event streaming pioneer, announced its Premium Connector for Oracle Change Data Capture (CDC) Source, a bridge that connects one of the most common and critical sources of enterprise data to Apache Kafka.
It is now possible to identify and capture data that has been added to, updated, or removed from Oracle databases and make those events available in real time across a business.
Confluent aims to create a central nervous system for organizations to harness all data in motion, but that is only possible if every environment is interconnected.
The Oracle CDC Source connector, the first in a series of Premium Connectors, makes integrations with complex, high-value systems seamless, reliable, and cost-effective, establishing a continuously flowing stream of data that powers the business.
Oracle is well known for its relational database systems, which are used across many enterprises to store data for critical applications and drive operational decision-making.
“There’s an urgent need for organizations to stream, store, and process data across all areas of a business to unlock real-time applications and operations,” said Ganesh Srinivasan, chief product and engineering officer, Confluent. “Our new Premium Connectors represent a big step toward making event streaming a central nervous system for any enterprise. With the ability to tie historical, enterprise data together with real-time events, companies can react and respond to everything happening in the business, quickly and with the right context.”
The Premium Connector for Oracle CDC enables development teams to securely capture changes happening in Oracle databases and store them as events in Kafka topics.
Paired with Confluent’s complete event streaming platform, ksqlDB, and sink connectors for modern data systems, the connector enables key use cases like data synchronization, real-time analytics, and data warehouse modernization that power new and improved Customer 360, fraud detection, machine learning, and more.
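To illustrate the downstream side of this pattern, the sketch below shows how an application might consume the change events the connector writes to a Kafka topic, using the confluent-kafka Python client. The topic name, broker address, and consumer group are assumptions chosen for the example, not values from the announcement.

```python
from confluent_kafka import Consumer

# Assumed broker address and consumer group for this sketch.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "oracle-cdc-demo",
    "auto.offset.reset": "earliest",
})

# Hypothetical per-table change event topic produced by the CDC connector.
consumer.subscribe(["ORCL.MYSCHEMA.ORDERS"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for the next change event
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Keys and values arrive as raw bytes; a real application would apply
        # a deserializer matching the connector's configured output format.
        print(msg.key(), msg.value())
finally:
    consumer.close()
```

In practice, these events would more often be routed onward through ksqlDB queries or sink connectors, as described above, rather than consumed directly.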
Organizations can jump-start technical use cases by leveraging the pre-built connector’s out-of-the-box enterprise features and functionality (sketched in the configuration example after this list), such as:
- Redo log topics
- Snapshots
- Table change event topics
- Record keys
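A minimal sketch of how these features might be wired up is shown below: it registers an Oracle CDC source connector through the standard Kafka Connect REST API using Python. The connector class and property names (for example, the redo log topic, per-table change event topic template, record-key template, and snapshot start mode) are assumptions based on how Confluent documents this connector and may differ by version; the connection details are placeholders.

```python
import json
import requests

# Kafka Connect REST endpoint (assumed to run locally on the default port 8083).
CONNECT_URL = "http://localhost:8083/connectors"

# Illustrative configuration only: property names below are assumptions and
# should be checked against the connector documentation for your version.
connector = {
    "name": "oracle-cdc-orders",
    "config": {
        "connector.class": "io.confluent.connect.oracle.cdc.OracleCdcSourceConnector",
        "oracle.server": "oracle-host",          # placeholder connection details
        "oracle.port": "1521",
        "oracle.sid": "ORCL",
        "oracle.username": "CDC_USER",
        "oracle.password": "********",
        "table.inclusion.regex": "ORCL\\.MYSCHEMA\\.(ORDERS|CUSTOMERS)",
        "start.from": "snapshot",                # take an initial snapshot, then stream changes
        "redo.log.topic.name": "oracle-redo-log",  # raw redo log records
        "table.topic.name.template": "${databaseName}.${schemaName}.${tableName}",  # table change event topics
        "key.template": "${primaryKeyStructOrValue}",  # record keys derived from primary keys
        "tasks.max": "1",
    },
}

# Register the connector with the Connect cluster and print its reply.
resp = requests.post(CONNECT_URL, json=connector, timeout=30)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```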
To advance its mission, Confluent is introducing a new line of connectors that specifically targets mission-critical enterprise systems that are notoriously complex to integrate with Kafka environments.
Confluent’s pre-built, expert-certified Premium Connectors free up engineers’ time, lower data integration costs, and accelerate time to market for real-time use cases and applications.
For more information about this news, visit www.confluent.io.