The TDM library provides sets of generic flows that allow you to create a standard TDM implementation in just a few minutes. Once a standard implementation is in place, its flows can be edited and tailored to your project's needs.
Before beginning to create Broadway flows, define the tables that are filtered out by the Broadway template, which generates the delete and load flows. The library includes settings for the following filtered auxiliary tables:
This setting is implemented using the TDMFilterOutTargetTables Actor. To filter additional tables, open the TDMFilterOutTargetTables Actor and edit its table object. The lu_name column should be populated as follows:
The generator_filterout column must be checked (set to true) if a data generation flow should not be generated for the table.
Add these tables to the TDMFilterOutTargetTables Actor to prevent the creation of load/delete flows for them; these tables are already loaded and deleted by the child LUs.
After the Actor has been updated, refresh the project by clicking the button at the top of the Project Tree. This applies the changes in the TDMFilterOutTargetTables Actor and deploys the LU.
When populating a target database, sequences are required. Therefore, setting and initiating sequences is mandatory when implementing TDM.
If the k2masking keyspace does not exist in the DB interface defined for caching the masked values, create it using either the masking-create-cache-table.flow from the Broadway examples library or the create_masking_cache_table.sql from the TDM Library.
Note: The k2masking keyspace can also be created by the deploy.flow of the TDM LU.
Take the following steps in order to create the sequences for your TDM implementation:
A. The TDM library includes a TDMSeqList Actor that holds a list of sequences. Populate the Actor's table object with the relevant information for your TDM implementation as follows:
SEQUENCE_NAME - the sequence name must be identical to the DB's sequence name if the next value is taken from the DB.
CACHE_DB_NAME - populate this setting with DB_CASSANDRA, the interface where the Sequence Cache tables are stored.
SEQUENCE_REDIS_OR_DB - indicates whether the next value is taken from Redis, memory, or the target DB interface. Populate this setting with one of the following:
INITIATE_VALUE_OR_FLOW - set an initial value for the sequence or populate the name of an inner flow to apply logic when getting the initial value. For example, you can set the initial value from the max value of the target table. The initial value is only relevant when getting the next value from FabricRedis, IN-MEMORY, or from a newly created DB sequence. Otherwise, the next value is taken from the existing DB sequence.
Note: If the target DB does not have a sequence, or the target DB is not Oracle, DB2, or PostgreSQL, you can populate the Target DB interface name with TDM. The sequence will automatically be created in the TDM DB. Define an init flow to set the initial value for the newly created sequence. It is recommended to use the "IF NULL" functions (for example, COALESCE in PostgreSQL and NVL in Oracle) when getting the initial value for the newly created sequence. For example: Select COALESCE(max(activity_id),0) + 100000 as init_activity_id from activity;
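As a hedged illustration of such an init query, assuming the new sequence should start above the highest ID already used in an activity table (table and column names are examples only):

```sql
-- PostgreSQL: fall back to 0 when the table is empty, then add an offset
SELECT COALESCE(MAX(activity_id), 0) + 100000 AS init_activity_id FROM activity;

-- Oracle: the same logic using NVL
SELECT NVL(MAX(activity_id), 0) + 100000 AS init_activity_id FROM activity;
```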
Click here for more information about the sequence Actors.
An example of the TDMSeqList Actor:
An example of an inner flow for getting the initial sequence value:
The table values are used by the createSeqFlowsOnlyFromTemplates flow that generates the Sequences' Actors.
After the Actor has been updated, refresh the project by clicking the button at the top of the Project Tree. This applies the changes in the TDMSeqList Actor and deploys the TDM LU.
B. Run createSeqFlowsOnlyFromTemplates.flow from the Shared Objects ScriptsforTemplates folder. The flow has 2 Inner Flows that first create a Broadway flow for each sequence in the TDMSeqList Actor and then create an Actor from each flow. The generated sequence flows invoke the MaskingSequence Actor to get the new sequence value and populate the source and target IDs in the TDM_SEQ_MAPPING table under the k2masking keyspace.
Note: This flow should run once per TDM implementation and not per each LU, as the sequences are used across several LUs in the TDM project. The sequences' flows and Actors are created under Shared Objects, enabling several LUs to use a sequence Actor.
The TDMSeqSrc2TrgMapping table maps between the generated sequence Actors and the target tables' columns. A sequence Actor can be mapped into multiple tables and LUs.
See the example below:
This table serves 2 purposes:
This table was added in TDM 7.3 to automatically add the sequence Actors to the load flows. Populate the TDMSeqSrc2TrgMapping table to map the generated sequence Actors to the target tables' columns. A sequence Actor can be mapped to a different table and a different LU.
TDM 8.0 uses this table to add the sequence Actors to the data generation flow that generates synthetic data for the LU table.
Click here for more information about the synthetic data generation implementation.
Fabric supports sending a category parameter to the masking Actors. This capability enables you to create your own function or Broadway flow in order to generate a new ID using the MaskingLuFunction or MaskingInnerFlow Actors in the sequence Actor. It works as follows:
Click here for more information about customizing the replace sequence logic.
In this step you will run the generic createFlowsFromTemplates.flow from the Shared Objects Broadway folder in order to create the delete and load flows under the LU. The flow gets the following input parameters:
LU name
Target Interface
Target Schema
Override Existing Flows - when set to true, the flow deletes and recreates existing load and delete flows. When set to false, the flow skips existing load and delete flows and creates new flows only, if needed. The default value is false.
Note: If the target table name is not identical to the related LU table name, you must populate the mapping of the LU table name to the target table name in TDMTargetTablesNames Actor (imported from the TDM Library) and redeploy the LU to the debug server before running the createFlowsFromTemplates flow.
The createFlowsFromTemplates.flow executes the inner flows that are listed below (A-D). These inner flows generate the load and delete flows in the input LU. The LU source table names must be identical to the table names in the target environment in order to generate the load and delete flows with the correct table names.
Note that the input LU must be deployed to the Fabric debug server before running the createFlowsFromTemplates.flow.
A. Create a LOAD flow per table
The load flows are generated by the createLoadTableFlows.flow, which receives the following input parameters: Logical Unit name, target interface, and target schema. It then retrieves the list of tables from the LU schema and creates a separate Broadway flow for each table to load its data into the related target table in the target DB. The name of each newly created flow is load_[Table Name].flow, e.g. load_Customer.flow. The tables defined in Step 1 are filtered out and the flow is not created for them.
The sequence Actors are added automatically to the load flows based on the TDMSeqSrc2TrgMapping table.
Additionally, the createFlowsFromTemplates.flow adds the setTargetEntityId_Actor to the load flow of the main target table in order to populate the TARGET_ENTITY_ID key by the target entity ID. For example, add the setTargetEntityId_Actor to load_cases flow and send the target case ID as an input parameter to the Actor:
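Conceptually, each generated load flow reads the table's rows from the LU and writes them to the matching target table. Below is a minimal SQL sketch of that mapping, with illustrative table and column names; the actual generated flow uses Broadway DB Actors together with the sequence and masking logic described in this article:

```sql
-- Conceptual equivalent of load_Customer.flow (illustrative names only):
-- read the rows extracted into the LU's CUSTOMER table
-- and load them into the corresponding table in the target schema
INSERT INTO target_schema.CUSTOMER (customer_id, first_name, last_name)
SELECT customer_id, first_name, last_name
FROM CUSTOMER;
```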
B. Create the main LOAD flow
The main load flow is generated by the createLoadAllTablesFlow.flow that receives the following input parameter: Logical Unit name. It then creates the LoadAllTables.flow Broadway flow. The purpose of this flow is to invoke all load flows based on the LU Schema's execution order.
C. Create a DELETE flow per table
The delete flows are created by the createDeleteTableFlows.flow that receives the Logical Unit name, target interface, and target schema and retrieves the list of tables from the LU Schema. It then creates a Broadway flow for each table to delete its data from the target DB. The name of each newly created flow is delete_[Table Name].flow, e.g. delete_CUSTOMER.flow. The tables defined in Step 1 are filtered out and the flow is not created for them.
The following updates must be performed manually:
SELECT CUSTOMER_ID, ACTIVITY_ID FROM TAR_ACTIVITY;
Populate the keys input argument of the DbDelete Actor. These should correlate with the table's keys.
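As a hedged illustration of how the keys drive the delete, assuming TAR_ACTIVITY is keyed by CUSTOMER_ID and ACTIVITY_ID (as in the SELECT above), the DbDelete Actor would issue a statement of the following form:

```sql
-- Illustrative only: delete each target row by the keys
-- returned from the preceding SELECT (CUSTOMER_ID, ACTIVITY_ID)
DELETE FROM TAR_ACTIVITY
WHERE CUSTOMER_ID = ? AND ACTIVITY_ID = ?;
```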
D. Create the main DELETE flow
The main delete flow is created by the createDeleteAllTablesFlow.flow that receives the Logical Unit name and creates an envelope DeleteAllTables.flow Broadway flow. The purpose of this flow is to invoke all DELETE flows in the opposite order of the population order, considering the target DB's foreign keys.
You can run each one of the load flows in debug mode. Normally, when running a task, the InitiateTDMLoad_Actor gets the task's attributes and sets the execution parameters accordingly. When running a load flow in debug mode without executing a TDM task, the InitiateTDMLoad_Actor sets the execution parameters based on the TDM Globals.
Once all LOAD and DELETE flows are ready, create an orchestrator. The purpose of the TDMOrchestrator.flow is to encapsulate all Broadway flows of the TDM task into a single flow. It includes the invocation of all steps such as:
The TDMOrchestrator.flow should be created from the Logical Unit's Broadway folder; it is built for each Logical Unit in the TDM project. Deploy the Logical Unit to the debug server and then create the Orchestrator flow using a template as shown in the figure below:
The TDMReserveOrchestrator runs the reserve only tasks. Import the flow from the TDM Library into the Shared Objects and redeploy the TDM LU.
TDM systems often handle sensitive data. To comply with data privacy laws and regulations, Fabric enables masking sensitive fields, such as SSN, credit card numbers and email addresses, before they are loaded either into Fabric or into the target database.
In order to mask a sensitive field - prior to loading it into Fabric - create a Broadway population flow for the table that contains this field and add 1 or more Masking Actors.
If the masked field is used as an input argument linked to another LU table, add the masking population that masks the fields in all LU tables to the last executed LU table, so that the original value is available when populating the LU tables.
To mask a sensitive field as part of a load to the Target DB, add a masking Actor to the relevant load_[Table Name].flow. The TDM infrastructure controls masking enablement or disablement based on the settings of the global variables.
There are 3 possible scenarios for handling masking:
Notes:
From TDM 7.3 and onwards, the task that clones an entity creates only 1 LUI instance for all clones. Therefore, you must add masking on both processes (LUI Sync and load flows) in order to get different data in the masked fields on each clone.
TDM 8.0 added the root_iid to the caching key, in order to maintain the referential integrity on PII fields across different LUs of the task’s BE.
For example, CRM and Billing LUs keep the Customer's data. The customer name needs to be identical in both LUs for a given customer. Setting the root_iid with the customer ID enables keeping the referential integrity between the CRM and Billing LUs. It is recommended to set the useInstanceId input argument of the masking Actors to true to keep the PII fields' referential integrity within the Business Entity LUs.
Click here to learn how to use masking Actors.
Click here to learn how the TDM task execution process builds the entity list.
The entity list of the full entity subset can be generated either by using an SQL query on the source DB or by running a Broadway flow. A Broadway flow is needed when running an extract on a non-JDBC data source.
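For example, a minimal sketch of such an entity-list query, assuming the Business Entity's root table and ID column are named customer and customer_id (names are illustrative):

```sql
-- Illustrative entity-list query on the source DB:
-- return the root entity IDs to be processed by the task
SELECT customer_id
FROM customer;
```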
Create a Broadway flow under the related root LU or the Shared Objects. It is recommended to locate the Broadway flow under the Shared Objects to enable running the flow on several root LUs of a given Business Entity. The Broadway flow must include the following stages:
Populate the Broadway flow in the trnMigrateList translation.
Redeploy the related LUs and the TDM LU.
The TDM library provides a list of Broadway Actors and flows to support generating an entity list by a project Broadway flow. The project Broadway flow gets the entity list and calls the TDM library Actors to insert them into a dedicated Cassandra table in k2view_tdm keyspace. A separate Cassandra entity table is created on each LU and it has the following naming convention: [LU_NAME]_entity_list.
The TDM task execution process runs the batch process on the entities in the Cassandra table that are part of the current task execution, i.e., that have the current task execution ID.
Click here for more information about TDM implementation on non JDBC Data Source.
You can build 1 or more Broadway flows to get a list of entities for a task execution. These Broadway flows are executed by the TDM task execution process in order to build the entity list for the task. The project Broadway flow needs to select the entity list and call the TDM library Actors in order to insert the entities into a dedicated Cassandra table in the k2view_tdm keyspace. A separate Cassandra entity table is created on each LU and has the following naming convention: [LU_NAME]_entity_list.
The TDM task execution process runs the batch process on the entities in the Cassandra table that belong to the current task execution (have the current task execution ID).
The Custom Logic Broadway flow can be created in either the Shared Objects or a given LU.
The Custom Logic Broadway flow always has 2 external input parameters and gets their values from the task execution process:
TDM supports the creation of additional external parameters in the flow, enabling the user to send the values of these parameters in the TDM task; e.g., you can add an external parameter named customer_status to the flow. The flow selects the customers for the task based on the input customer_status parameter. This way you can filter the selected customers by their status and still use the same flow to select them.
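For instance, a hedged sketch of the query such a flow might run, with customer_status passed as a bind variable (table and column names are illustrative):

```sql
-- Illustrative query: select the task's entities filtered by the
-- customer_status external parameter, bound at runtime
SELECT customer_id
FROM customer
WHERE status = ?;
```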
Notes:
The input parameter name must not contain spaces or double quotes.
TDM 8.0 added an integration of Broadway editors into the TDM portal for populating either the data generation parameters or the Custom Logic parameters in the task's tabs. This integration enables the user to select a valid value from a list, set dates, and set distributed parameters.
Click here for more information about the TDM integration with the Broadway editors and the implementation instructions for them.
Sending multiple values in a single parameter - you can define a String input parameter that receives a list of values, e.g., "CA,NY", and split it into an array in the flow. The values must be delimited by the delimiter that is set in the split Actor in the Broadway flow; see the sketch after the SELECT query examples below.
You can get an input SELECT statement with binding parameters. The parameters' values can be either sent into a separate input parameter or added to the SELECT statement.
Examples of input SELECT queries:
SQLQuery:
select distinct cust.customer_id from customer cust, activity act, cases cs where cust.customer_id = act.customer_id and act.activity_id = cs.activity_id and cs.status = ? and cs.case_type = ?
SQLParams:
Open,Billing Issue
SQLQuery:
Select Distinct act.customer_id From activity act, cases ca Where act.activity_id = ca.activity_id And ca.status <> 'Closed' And ca.case_type in ('Device Issue', 'Billing Issue');
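Returning to the multi-value parameter note above, here is a minimal sketch, assuming the input String is "CA,NY" and a customer table with a customer_state column (illustrative names), of the query built after the split Actor breaks the String into an array:

```sql
-- Illustrative query: the values split from the "CA,NY" input parameter
-- are applied in the IN clause
SELECT DISTINCT cust.customer_id
FROM customer cust
WHERE cust.customer_state IN ('CA', 'NY');
```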
Stage 1:
Stages 2-4: Loop on the selected entities - set a Transaction in the loop in order to have 1 commit for all iterations:
Stage 2: Convert the selected entity ID - returned by the Actor of Stage 1 - to a String using the ToString Actor.
Stage 3: Call the CheckReserveAndLoadToEntityList TDM Broadway flow (imported from the TDM Library):
Stage 4: Call the CheckAndStopLoop TDM Actor (imported from the TDM Library). Set NUM_OF_ENTITIES to be an external input parameter to get its value from the task execution process. The Actor checks the number of entities inserted into the Cassandra table and stops the loop when the custom flow reaches the task's number of entities.
Example:
The task needs to get 5 entities, while the SELECT statement returns 20 entities. The first 2 selected entities are reserved for another user. The 3rd, 4th, 5th, 6th and 7th entities are available and are populated in the Cassandra table; the entities' loop then stops.
Below are examples of a Custom Logic flow:
Example 1 - get the Contract status as an input parameter and build the SELECT statement accordingly:
Example 2 - get an input String of States, separated by a comma. Split the input String into an array and send it to the SQL query:
Example of the input US states:
Example 3 - get an input SELECT statement with parameters for the SELECT statement:
Add the LU name and Custom Logic flow name to the CustomLogicFlows constTable TDM Actor (imported from the TDM Library).
See example:
Redeploy the Web-Services.