Organizations with mainframes are bringing mainframe data into Hadoop to analyze it alongside other important data sources in a cost-effective, scalable manner. However, as soon as the data lake is populated, it quickly becomes out of date, because mainframe applications continue to insert, update, and delete data.
DMX Change Data Capture (CDC) continuously keeps Hadoop data in sync with changes made on the mainframe, so the most current data is always available in the data lake for analytics.
- Work with the most up-to-date information for downstream analytics
- Meet short SLAs with simple, fast synchronization of mainframe data to Hadoop
- Save time and resources with graphical design and dynamic optimization
- Maintain optimal performance and preserve expensive CPU resources on the mainframe
- Reliably restart from the point of interruption if connectivity is lost on either side