Apache Kafka, a low-latency, highly scalable, distributed data streaming platform, has ushered in a new era of real-time data integration, processing, and analytics. With Kafka, enterprises can address advanced analytics use cases and extract more value from more data. Production database transactions provide a rich vein of data to drive these use cases.
However, architects and DBAs struggle with the scripting and complexity of publishing database transactions to Kafka and other streaming environments. Skilled programmers must configure each data producer and data type conversion individually and by hand, and they often cannot easily propagate source metadata and schema changes.
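To illustrate the hand-written conversion work described above, the sketch below shows the kind of per-column type mapping a programmer must write before database rows can be published as Kafka message values. It is a minimal, illustrative example in Python using only the standard library; the function names and the JSON message shape are assumptions for illustration, not Attunity's or Kafka's API.

```python
import json
from base64 import b64encode
from datetime import date, datetime
from decimal import Decimal

def encode_value(v):
    """Map common relational column types to JSON-safe values."""
    if isinstance(v, Decimal):
        return str(v)  # keep exact precision; a float would round
    if isinstance(v, (datetime, date)):
        return v.isoformat()
    if isinstance(v, (bytes, bytearray)):
        return b64encode(bytes(v)).decode("ascii")
    return v

def row_to_message(table, row):
    """Serialize one captured transaction row as a Kafka message value."""
    payload = {"table": table,
               "data": {k: encode_value(v) for k, v in row.items()}}
    return json.dumps(payload, separators=(",", ":"))

row = {"id": 42,
       "amount": Decimal("19.99"),
       "ts": datetime(2024, 1, 15, 9, 30)}
msg = row_to_message("orders", row)
# msg is now a JSON string ready to hand to a Kafka producer,
# e.g. producer.produce("orders-topic", value=msg)
```

Conversions like these must be repeated, and kept current, for every source table and every column type change, which is the maintenance burden the whitepaper addresses.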
Attunity Replicate provides a simple, real-time and universal solution for converting production databases into live data streams.
Read this whitepaper to understand:
- Motivations for data streaming
- Key architectural components of Kafka
- The role of Attunity Replicate in streaming environments
- Methods for automated configuration, one-to-many publication, automatic data type mapping, and simpler metadata integration
- Best practices based on two enterprise case studies