Business Partner Integration with Centerprise Webinar Q&A

Q: Can we do data correction as part of a flow if we encounter bad data coming from a partner?

A: Yes. For instance, say we have a dataflow that routes data and sends bad records to a correct-data subflow.

[Screenshot: Screen_Shot_2014-01-17_at_1.27.36_PM.png]

The subflow uses a single expression to map the fields that need to be corrected. Expressions are useful not only for transforming data during extraction and manipulation to get it into your canonical interface; you can also use them to fix data according to business logic you have set up for whenever you encounter these kinds of data anomalies.

[Screenshot: Screen_Shot_2014-01-17_at_1.27.50_PM.png]
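To make the idea concrete, here is a minimal Python sketch (not Centerprise expression syntax) of the kind of correction logic such an expression might encode; the field names and rules are hypothetical.

    # Hypothetical correction rules, roughly the kind of business logic a
    # Centerprise expression in a correct-data subflow might apply.
    def correct_record(record):
        fixed = dict(record)
        # Anomaly 1: the partner sends state names instead of two-letter codes.
        state_codes = {"california": "CA", "texas": "TX"}
        state = fixed.get("State", "").strip()
        if len(state) != 2:
            fixed["State"] = state_codes.get(state.lower(), state)
        # Anomaly 2: negative quantities arrive as parenthesized strings.
        qty = fixed.get("Quantity", "")
        if qty.startswith("(") and qty.endswith(")"):
            fixed["Quantity"] = "-" + qty[1:-1]
        return fixed

    print(correct_record({"State": "California", "Quantity": "(5)"}))
    # {'State': 'CA', 'Quantity': '-5'}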

 

Q: Do you have any raw data filter for files, something similar to the WHERE clause in SQL queries?

A: Yes. There are two ways to filter your data. If you look at the options for a delimited file source, you will see a raw text filter.

[Screenshot: Screen_Shot_2014-01-17_at_1.31.19_PM.png]

You can use the raw text filter, or you can use the Filter transformation. The Filter transformation filters out records that have already been read, using Astera Rules Language expressions. The raw text filter prevents a line of text from being read in the first place.

[Screenshot: filter_transformation.png]

For example, if a header repeats throughout the document, you don't want to process each repeated header as an erroneous record. In that case you would not use the Filter transformation; you would use the raw text filter to read exactly the lines you are looking for.
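As a rough illustration of the difference, here is a small Python sketch; the header text and amount rule are hypothetical, and in Centerprise both filters are configured rather than coded.

    import csv
    import io

    # Simulated raw file contents with a header that repeats mid-file.
    RAW_LINES = [
        "OrderId,Amount",
        "1001,250.00",
        "OrderId,Amount",   # repeated header: the raw text filter should drop this
        "1002,4900.50",
    ]

    # Raw text filter: stop a line from ever being read as a record.
    def raw_text_filter(lines, header="OrderId,Amount"):
        return [lines[0]] + [ln for ln in lines[1:] if ln != header]

    # Filter transformation: records have already been read; keep only those
    # that satisfy a rule (a stand-in for an Astera Rules Language expression).
    def filter_transformation(records, min_amount=1000.0):
        return [r for r in records if float(r["Amount"]) >= min_amount]

    records = list(csv.DictReader(io.StringIO("\n".join(raw_text_filter(RAW_LINES)))))
    print(filter_transformation(records))
    # [{'OrderId': '1002', 'Amount': '4900.50'}]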

Q: Can a subflow interface represent multi-level data?

A: Yes. To give a subflow interface a hierarchical structure, you can create a new subflow and copy and paste the input layout into it. Once you have your multi-level interface, you can map to it.

[Screenshot: Screen_Shot_2014-01-17_at_1.36.38_PM.png]
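For illustration only, here is a Python sketch of the kind of multi-level (parent/child) layout such an interface represents; the entity and field names are hypothetical.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical multi-level layout: an order header with nested line items,
    # the kind of parent/child hierarchy a subflow interface can expose.
    @dataclass
    class OrderLine:
        sku: str
        quantity: int

    @dataclass
    class Order:
        order_id: str
        partner: str
        lines: List[OrderLine] = field(default_factory=list)

    order = Order("PO-1001", "Acme", [OrderLine("SKU-1", 3), OrderLine("SKU-2", 1)])
    print(order.order_id, [line.sku for line in order.lines])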

Q: Does Centerprise support secure FTP?

A: Yes. If you look at the download file options, you will see several FTP options, including secure FTP.

[Screenshot: Screen_Shot_2014-01-17_at_1.37.59_PM.png]
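As a point of comparison outside Centerprise, here is a minimal Python sketch that downloads a file over FTPS (explicit TLS) with the standard library; the host, credentials, and file name are placeholders, and Centerprise configures the equivalent in the download file options rather than in code.

    from ftplib import FTP_TLS

    # Download a file over FTPS using explicit TLS.
    ftps = FTP_TLS("ftp.example.com")
    ftps.login("username", "password")
    ftps.prot_p()                        # encrypt the data channel as well
    with open("orders.csv", "wb") as fh:
        ftps.retrbinary("RETR orders.csv", fh.write)
    ftps.quit()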

Q: How do you recommend organizing mappings to our vendors?

A: That is entirely up to you, but I recommend designing your project so that when you acquire a new vendor you can simply create a new node for it in the project. Keep as much information common as possible, with the goal of making each vendor's set of data files as small as possible so that you reuse the common structures. I would have a folder per vendor, or at least a naming convention for the dataflows, and I would make the flows as generic as possible so they can be used in a dynamic dataflow situation and redundancy is reduced as much as possible.

Q: Is there any object for bookmarking newly added files into an FTP server?

A: For that you will want to use a component in Centerprise called File Entries Source. It lists the files in your FTP folder, showing each file's size, when it was created, and who the owner is.

[Screenshot: Screen_Shot_2014-01-17_at_1.45.09_PM.png]

The idea is that you get that data set and filter it based on a date, for instance the last time you processed the information. The next time the flow runs, the filter returns a different set of files from that FTP list.
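Here is a rough Python sketch of that idea, listing files on an FTP server and keeping only those modified after a stored watermark; it assumes the server supports the MLSD command, and the host, credentials, and folder are placeholders.

    from datetime import datetime, timezone
    from ftplib import FTP

    # Keep only files modified after a watermark, e.g. the last run time.
    def new_files_since(host, user, password, folder, last_run):
        ftp = FTP(host)
        ftp.login(user, password)
        ftp.cwd(folder)
        fresh = []
        for name, facts in ftp.mlsd():
            if facts.get("type") != "file":
                continue
            stamp = facts["modify"].split(".")[0]          # YYYYMMDDHHMMSS
            modified = datetime.strptime(stamp, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)
            if modified > last_run:
                fresh.append((name, modified))
        ftp.quit()
        return fresh

    # e.g. new_files_since("ftp.example.com", "user", "pass", "/incoming",
    #                      datetime(2014, 1, 17, tzinfo=timezone.utc))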

You can then use that FTP list as a driver in a workflow, which will allow you to remove all the files you've just processed. This brings me to a point I skipped over in the webinar: for this type of situation, we recommend setting up some sort of file processing system for handling these files. Step 1 might be to download the file, Step 2 to move it into a bin to be processed, Step 3 to process the data, and when that step is done a final step might move the file to a "completed" bin or similar.

[Screenshot: Screen_Shot_2014-01-17_at_1.47.06_PM.png]

Following that pattern will save you when something goes wrong with the data and you need to go back and restart from a certain point, which Centerprise allows you to do. If you have used this pattern, you can easily get back to that point rather than hunting for what happened to your data or files, and you can recover cleanly from flow processing errors.
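As an illustration of the staged-bin pattern described above (not of Centerprise itself), here is a minimal Python sketch that moves files through downloaded, processing, and completed folders; the folder names and the processing step are hypothetical.

    import shutil
    from pathlib import Path

    # Staged bins for file processing; folder names are placeholders.
    DOWNLOADED = Path("downloaded")
    PROCESSING = Path("processing")
    COMPLETED = Path("completed")

    def process_file(path: Path) -> None:
        print(f"processing {path.name}")   # stand-in for the real dataflow

    def run_once() -> None:
        for bin_dir in (DOWNLOADED, PROCESSING, COMPLETED):
            bin_dir.mkdir(exist_ok=True)
        # Step 2: move downloaded files into the processing bin.
        for f in DOWNLOADED.glob("*"):
            shutil.move(str(f), str(PROCESSING / f.name))
        # Step 3: process each file, then move it to the completed bin so a
        # restart can tell exactly which files are still outstanding.
        for f in list(PROCESSING.glob("*")):
            process_file(f)
            shutil.move(str(f), str(COMPLETED / f.name))

    run_once()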

 

 

 
