Fabric Sync is a mechanism that synchronizes data between data sources and Fabric via extraction and transformation processes executed on an LU Instance (LUI). Fabric Sync can be performed in two modes:
Stream Sync is a Fabric module that enables proactive synchronization of Fabric with source systems by processing only the changes, without re-synchronizing the entire instance for every change in the source.
The Stream Sync job runs on a Fabric server and receives Insert, Update and Delete transactions from the source system via a pre-defined PubSub interface. The job identifies which Instance ID is impacted by each change and updates the relevant tables of the Fabric DB accordingly.
For example, Oracle GoldenGate publishes messages describing the data updates that occur in the source Oracle DB tables to Kafka. Stream Sync listens to Kafka and saves the changes in internal tables in order to process them and update the impacted instances in Fabric.
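The change-routing step described above can be sketched as follows. This is an illustrative example only: the message layout (an `op_type` flag plus before/after row images) and the `instance_key` mapping are assumptions, not the actual GoldenGate payload format or Fabric's internal API.

```python
import json

# Hypothetical GoldenGate-style CDC message, for illustration only; the
# real payload format depends on the GoldenGate handler configuration.
raw_message = json.dumps({
    "table": "CRM.CUSTOMER_ORDERS",
    "op_type": "U",  # I = insert, U = update, D = delete
    "after": {"CUSTOMER_ID": "1042", "ORDER_ID": "77", "STATUS": "SHIPPED"},
})

def route_change(message: str, instance_key: str = "CUSTOMER_ID") -> dict:
    """Parse a CDC message and resolve the impacted Instance ID.

    `instance_key` names the column that maps a source row to an LUI;
    it is an assumed configuration value for this sketch.
    """
    change = json.loads(message)
    # For deletes only a "before" image would exist; prefer "after" when present.
    row = change.get("after") or change.get("before", {})
    return {
        "instance_id": row[instance_key],
        "table": change["table"],
        "operation": change["op_type"],
        "row": row,
    }

change = route_change(raw_message)
print(change["instance_id"], change["operation"])  # → 1042 U
```

In practice the resolved Instance ID would drive an update of the relevant Fabric DB tables; here the function simply returns the routing decision so the logic can be inspected.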
Stream Sync uses a sophisticated algorithm that can identify whether received data updates are incomplete or are missing logical relationships due to out-of-order arrival. Invalid or incomplete data is handled differently from valid and complete data.
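One common way to handle out-of-order arrival is to set incomplete changes aside until their logical dependency appears. The sketch below illustrates that general pattern with an assumed parent/child relationship; the class and method names are hypothetical and do not reflect Fabric's internal implementation.

```python
from collections import defaultdict

class ChangeBuffer:
    """Minimal sketch of out-of-order handling: a child change is treated
    as incomplete until its parent record has been seen. Illustrative only."""

    def __init__(self):
        self.seen_parents = set()
        self.deferred = defaultdict(list)  # parent_id -> pending child changes
        self.applied = []                  # changes accepted for processing

    def on_parent(self, parent_id):
        self.seen_parents.add(parent_id)
        # Release any children that arrived before their parent.
        for child in self.deferred.pop(parent_id, []):
            self.applied.append(child)

    def on_child(self, parent_id, change):
        if parent_id in self.seen_parents:
            self.applied.append(change)              # complete: apply now
        else:
            self.deferred[parent_id].append(change)  # incomplete: set aside

buf = ChangeBuffer()
buf.on_child("ORDER-77", {"line": 1})  # arrives before its parent
buf.on_parent("ORDER-77")              # parent arrives; child is released
```

After both events, the deferred child has been moved to `applied`, mirroring the idea that incomplete data follows a different path until it becomes complete.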
Starting with V8.2, two solutions are provided for synchronizing the Fabric DB with source system changes:
Both solutions use the same resources, such as:
The articles in this section describe the Stream Sync solution.