SuccessFactors: Goodbye Provisioning Jobs Part 1 | Using Integration Center (IC) for synchronizing user data from Foundation to LMS via SF – User Connector
This blog post explains how to set up an Integration Center flow that creates an output file used to synchronize user data from Foundation to the Learning Management System. On the LMS side, the SF User Connector imports this file.
Many thanks to the following blog posts, which prepared the ground for this scenario but did not fully answer some questions about using IC with LMS-specific connectors:
- Integration Center – A viable and flexible alternative for LMS Connectors: https://blogs.sap.com/2017/07/02/integration-center-a-viable-and-flexible-alternative-for-lms-connectors/
- Exporting Benefits data using Integration Center EDI Format: https://blogs.sap.com/2018/03/25/exporting-benefits-from-employee-center-using-integration-center-edi-format/
In most scenarios a standard provisioning job is used to create a comma-separated output file, which is then imported by the SF User Connector.
There are scenarios where an Integration Center flow offers functionality beyond that of the provisioning job:
- Delta Handling
- Filtering according to dynamic criteria
- Multiple export jobs
- Admin/customer-controlled jobs for synchronizing FO to LMS
- Custom adaptations of field values (e.g. a user who is inactive in Foundation but active in LMS)
My goal is to explain how to build this flow.
Please note that there may be technical differences depending on the business configuration of your system; custom adaptation should be possible after reading this blog.
- Go to Integration Center
- Create a new schedulable flow
Source type: SuccessFactors
Destination type: SFTP
Format: EDI/Stacked Delimited -> necessary for the double headline
Line delimiter: CR/LF (Carriage Return / Line Feed) -> delimits the lines correctly from each other
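To make these format settings concrete, here is a minimal Python sketch of the file shape the flow should produce: two headline rows, comma-separated columns wrapped in double quotes, and CR/LF line endings. The field names are illustrative assumptions, not the actual connector field list.

```python
# Sketch of the output file shape: double headline, quoted values, CR/LF.
# FIELDS is a hypothetical subset; the real connector file has many more columns.
FIELDS = ["USERID", "LASTNAME", "FIRSTNAME", "STATUS"]

def build_file(users):
    lines = [",".join(FIELDS),                      # headline 1: technical names
             ",".join(f.title() for f in FIELDS)]   # headline 2: display labels
    for u in users:
        # Content rows: every value wrapped in double quotes.
        lines.append(",".join('"%s"' % u[f] for f in FIELDS))
    return "\r\n".join(lines) + "\r\n"              # CR/LF delimits the lines

users = [{"USERID": "jdoe", "LASTNAME": "Doe", "FIRSTNAME": "Jane", "STATUS": "active"}]
print(build_file(users))
```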
- Now, the magic 🙂 … it was quite difficult to understand how to build and use this EDI file format. In the end you have to insert a new segment, select the segment (row 1, column 1) and click “Insert Sibling”, which creates row 1, column 2. Continue this for all your fields in headline 1, headline 2 and the content line. In my productive file each line has 42 fields, so inserting them, filling in the labels and so on took quite some time.
- I used calculated fields in row 3 to wrap every column value in double quotes. There were also some specialties and difficulties in calculating the correct values for Manager, HR, the Hire Date and the Company Exit Date.
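The quoting part of those calculated fields boils down to the following logic (a plain Python sketch, not IC expression syntax): wrap each value in double quotes and double any embedded quotes, so that a comma inside a value cannot break the column structure.

```python
def wrap(value):
    """Wrap a column value in double quotes, doubling embedded quotes
    so a comma or quote inside the value cannot break the CSV line."""
    return '"' + str(value).replace('"', '""') + '"'

print(wrap("Smith, Jr."))  # the comma stays inside one quoted column
```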
- Sorting should be adapted to match the provisioning job (active users are imported first, followed by all inactive users).
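The required order can be sketched as a simple two-level sort (the STATUS values are assumptions; use whatever status values your flow actually exports): active users first, then inactive ones.

```python
users = [
    {"USERID": "a1", "STATUS": "inactive"},
    {"USERID": "b2", "STATUS": "active"},
    {"USERID": "c3", "STATUS": "active"},
]

# False sorts before True, so active users come first;
# the user ID is a secondary key for a stable, reproducible order.
ordered = sorted(users, key=lambda u: (u["STATUS"] != "active", u["USERID"]))
print([u["USERID"] for u in ordered])  # → ['b2', 'c3', 'a1']
```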
- Now you are good to go. Define your SFTP destination as contracted with SAP and set your file name (e.g. user_data_xxx.txt).
- In the filter section you can apply filtering according to your requirements (e.g. delta handling).
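One way to think about delta handling is filtering on a last-modified timestamp and exporting only records changed since the previous run. The sketch below uses the hypothetical field name lastModifiedDateTime and a stored last-run timestamp as assumptions; in IC this would be expressed as a filter condition, not code.

```python
from datetime import datetime

records = [
    {"USERID": "a1", "lastModifiedDateTime": datetime(2023, 1, 10)},
    {"USERID": "b2", "lastModifiedDateTime": datetime(2023, 3, 5)},
]

last_run = datetime(2023, 2, 1)  # timestamp of the previous successful export

# The delta: only records modified after the last run are exported.
delta = [r for r in records if r["lastModifiedDateTime"] > last_run]
print([r["USERID"] for r in delta])  # → ['b2']
```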
It took quite some time to build this flow with all its specialties: the double headline and the double-quoted, comma-separated values. In the end the effort turns out to be useful for admins: full control over the job in terms of scheduling and adapting it to your requirements.
Important information on scheduling this flow: it turned out that scheduling it (e.g. as a nightly run) creates a job in Provisioning. If two identical flows exist, this job fails with the message that another Provisioning job for the SF User Connector already exists. In this case, remove all of these jobs (delete them, not just unschedule) and re-schedule the flow in Integration Center.