The Fabric CDC process aggregates LUI data updates in a MicroDB and publishes CDC message(s) with the committed changes to the CDC consumer(s).
The following diagram describes the CDC process:
A transaction on an LUI may involve several updates on some of its LU tables. Each update (write) in the LUI's MicroDB SQLite file activates a Fabric trigger that passes the change to the CDC process. The CDC process publishes a message to Kafka for each INSERT, UPDATE or DELETE event in the MicroDB. Each message contains the LUI (IID), the event type, the old and new values of each CDC column, the LU table's PK columns and a transaction ID.
If the transaction is committed, the CDC process sends a Commit message. If the transaction is interrupted, rolled back or fails, the CDC process sends a Rollback message.
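The exact payload layout is version-dependent; the following is only an illustrative sketch of a single UPDATE event message, in which the property names (other than trxId), the CUSTOMER table and all values are assumptions:

{
  "iid": "CUST_1001",
  "table": "CUSTOMER",
  "eventType": "UPDATE",
  "pkColumns": { "CUSTOMER_ID": "1001" },
  "oldValues": { "STATUS": "ACTIVE" },
  "newValues": { "STATUS": "SUSPENDED" },
  "trxId": "trx-20240101-0001"
}

The Commit or Rollback message for the same transaction then lets the consumer decide whether to apply or discard the change.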
Each transaction can generate multiple CDC messages. For example, if an LUI sync inserts five records into an LU table, a separate CDC message is generated for each insert.
All CDC messages initiated by a given transaction have the same value in their trxId property.
The CDC process publishes transaction messages to the Kafka CDC consumer topics for each INSERT, UPDATE or DELETE activity. The partition key is the LUI (IID), so all CDC messages of a given instance are written to the same partition and consumed in order.
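To inspect a CDC topic and its keys directly, the standard Kafka console consumer can be used; the topic name and broker address below are placeholders to be replaced with the values of your deployment:

kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic <cdc_topic> --property print.key=true

With print.key=true, the key printed before each record is the LUI (IID).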
Each CDC message has its own value in the msgNo property. The msgCount property of each CDC message holds the total number of CDC messages initiated by the transaction for a given CDC consumer.
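For example, in the five-insert transaction mentioned above, the messages sent to a given consumer would carry correlation properties along these lines (values are illustrative, assuming msgNo numbers the messages sequentially within the transaction):

trxId=trx-20240101-0001, msgNo=1, msgCount=5
trxId=trx-20240101-0001, msgNo=2, msgCount=5
trxId=trx-20240101-0001, msgNo=3, msgCount=5
trxId=trx-20240101-0001, msgNo=4, msgCount=5
trxId=trx-20240101-0001, msgNo=5, msgCount=5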
Fabric has a built-in integration with Elasticsearch. The CDC_TRANSACTION_CONSUMER jobs start automatically when an LU with search indexes is deployed. The job's UID is search. The CDC consumer job consumes the messages in the Kafka search topic and creates search indexes in Elasticsearch.
Refer to the Fabric Search documentation for more information about Fabric Search capabilities.
The DEBUG_CDC_JOB Fabric job can be run as a CDC consumer to debug a CDC topic: it consumes the CDC messages of the given topic and writes them to the log file.
Example:
startjob DEBUG_CDC_JOB name='DEBUG_CDC_JOB' ARGS='{"topic":"Tableau", "group_id": "tableau"}';