Switching between Dataflows in a Workflow

 

Centerprise’s workflow engine allows you to dynamically switch between the dataflows that run in a workflow.  This adds an extra degree of flexibility and customization: your dataflow file paths no longer have to be hard-coded, and the same RunDataflow object can run a different dataflow depending on your workflow logic.  This is done by parameterizing the RunDataflow task.

Normally, the RunDataflow task allows you to select a single dataflow to run at that point in the workflow by entering the dataflow’s file path in the task properties.  However, you may need to run a different dataflow depending on, for example, the data received.

You can achieve this by mapping the _JobFilePath input parameter of the RunDataflow task to a source field containing the path of the dataflow you want to run.

When the _JobFilePath parameter is mapped, the mapped value takes precedence over any dataflow path specified in the Run Dataflow object properties.
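Conceptually, the path resolution behaves like the following sketch.  Python is used purely for illustration; the function and variable names here are hypothetical and are not part of Centerprise.

```python
# Hypothetical illustration only -- not Centerprise API.
# Shows the precedence rule: a mapped _JobFilePath value wins over the
# path configured in the Run Dataflow object properties.
def resolve_dataflow_path(mapped_job_file_path, configured_path):
    if mapped_job_file_path:        # _JobFilePath was mapped at run time
        return mapped_job_file_path
    return configured_path          # fall back to the task properties path
```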

Let’s take a look at two examples showing how you can use this feature in your workflows.

 

Example 1: Switching between dataflows depending on the data received.

The workflow shown in the figure below uses the _JobFilePath parameter to dynamically switch between dataflows at run time.  The _JobFilePath parameter is highlighted with a purple oval.

In this scenario, the workflow is scheduled to run whenever a file is dropped in a watch folder.  The workflow receives the path of the dropped file through the FileName and DroppedFilePath parameters.  These parameters, along with a third parameter, DataFlowRootPath, are fed into an expression object named GetTableLoadDataflow.  This expression object processes and concatenates the three input parameters and returns, in its $Output field, the complete file path of the dataflow to run.  This value is then mapped to the _JobFilePath parameter of the RunDataflow task LoadFileToTempTable.

This configuration allows the LoadFileToTempTable task to run the correct target dataflow, using the file path supplied through the _JobFilePath parameter.

[Figure a1.png: Workflow using the _JobFilePath parameter to switch dataflows at run time]
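To make the example concrete, here is a minimal sketch of the kind of string logic the GetTableLoadDataflow expression might implement.  The naming convention assumed here (one .Df dataflow named after each dropped file, located under DataFlowRootPath) is an illustration only, and the DroppedFilePath parameter is omitted for brevity; your own expression logic will differ.

```python
import os

# Hypothetical illustration of the GetTableLoadDataflow expression logic.
# Assumption: each dropped file maps to a dataflow named after it under
# DataFlowRootPath. The real expression depends on your own conventions.
def get_table_load_dataflow(dataflow_root_path, file_name):
    base_name, _ = os.path.splitext(file_name)   # "Orders.csv" -> "Orders"
    return os.path.join(dataflow_root_path, base_name + ".Df")

# A file "Orders.csv" dropped in the watch folder would select
# C:\Dataflows\Orders.Df as the value mapped to _JobFilePath.
print(get_table_load_dataflow(r"C:\Dataflows", "Orders.csv"))
```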

Example 2: Running a collection of dataflows in a single workflow.

In this example, the workflow runs a collection of dataflows using a single RunDataflow task, RunInventoryFlow.  This is made possible by mapping the _JobFilePath parameter in the RunDataflow task to an iterator source.  In our example, FlowInventoryTable is a database source object assigned as the Iterator (or Loop); iterator objects connect to the task they iterate through using a double green line.  FlowInventoryTable reads the table storing the flow information for the dataflows in the collection and outputs the FlowID value, which is fed to the GetFlowPath expression.  This expression returns the complete file path of the flow for the given flow ID.  The file path is then fed to the _JobFilePath parameter of the RunInventoryFlow task.  This configuration makes it possible to run several dataflows one after another using a single RunInventoryFlow object.

[Figure a3.png: Workflow running a collection of dataflows through an iterator and a single RunDataflow task]
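The run-time behavior of this iterator pattern can be sketched as follows.  This is hypothetical Python with stand-ins for the FlowInventoryTable source, the GetFlowPath expression, and the RunInventoryFlow task; the one-.Df-file-per-FlowID naming convention is an assumption.

```python
# Hypothetical illustration of the iterator pattern -- not Centerprise API.
def get_flow_path(flow_root, flow_id):
    # Stand-in for the GetFlowPath expression; assumes one .Df file per FlowID.
    return f"{flow_root}\\{flow_id}.Df"

def run_dataflow(job_file_path):
    # Stand-in for the RunInventoryFlow task with _JobFilePath mapped.
    print(f"Running dataflow: {job_file_path}")

# Stand-in for the rows returned by the FlowInventoryTable source.
flow_inventory_rows = [{"FlowID": "LoadCustomers"}, {"FlowID": "LoadOrders"}]

# The iterator drives one RunDataflow execution per row, one after another.
for row in flow_inventory_rows:
    run_dataflow(get_flow_path(r"C:\Dataflows", row["FlowID"]))
```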
