SuccessFactors: Goodbye Provisioning Jobs Part 1 | Using Integration Center (IC) for synchronizing user data from Foundation to LMS via SF – User Connector
Overview
This blog post explains how to set up an Integration Center flow that creates the output file used to synchronize user data from Foundation to the Learning Management System. On the LMS side, the SF User Connector imports this file.
Many thanks to the following blog posts, which laid the groundwork for this scenario but did not fully answer some questions about using IC with LMS-specific connectors:
- Integration Center – A viable and flexible alternative for LMS Connectors: https://blogs.sap.com/2017/07/02/integration-center-a-viable-and-flexible-alternative-for-lms-connectors/
- Exporting Benefits data using Integration Center EDI Format: https://blogs.sap.com/2018/03/25/exporting-benefits-from-employee-center-using-integration-center-edi-format/
Scenario
In most scenarios a standard provisioning job is used to create a comma-separated output file which is imported by the SF User Connector.
There are scenarios, however, where an Integration Center flow exceeds the functionality of the provisioning job:
- Delta Handling
- Filtering according to dynamic criteria
- Multiple export jobs
- Admin/customer-controlled jobs for synchronizing FO to LMS
- Custom adaptations of fields (e.g. inactive in Foundation but active in LMS)
Goal
My goal is to explain how to build this flow.
Please note that details may differ depending on the business configuration of your system; custom adaptation should be possible after reading this blog.
Realization
- Go to Integration Center
- Create a new schedulable flow
- Scheduled
- Source Type: SF
- Destination Type: SFTP
- Format: EDI/Stacked Delimited → necessary for the double headline
- Options:
  - Comma Separated
  - CR/LN → Carriage Return / New Line → to delimit lines correctly from each other
- Now, the magic 🙂 … it was quite difficult to understand how to build and use this EDI file format. In the end you have to insert a new segment, select the segment (row 1, column 1) and click "Insert Sibling" (which creates row 1, column 2). Continue this for all your fields in headline 1, headline 2 and the content line. In my productive file each line has 42 fields, and it took quite some time to insert them, fill in the labels and so on.
- I used calculated fields in row 3 to manage the double quotes used as a wrapper around every value in each column. There were also some specialties and difficulties in calculating the correct values for Manager, HR, the Hire Date and the Company Exit Date.
- Sorting should match the provisioning job (active users are imported first, then all inactive users)
- Now you are good to go. Define your SFTP destination as contracted with SAP and set your file name (e.g. user_data_xxx.txt).
- In the filter section you can filter according to your requirements (e.g. delta handling)
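The file layout the steps above produce can be sketched in code. The following is a minimal, hypothetical Python sketch: two headline rows, quoted comma-separated values, CR/LF line endings, and active users sorted before inactive ones. The field names and the reduced column set are illustrative assumptions only; the real template in my flow has 42 fields per line.

```python
# Hypothetical sketch of the SF User Connector input file built in IC:
# double headline, quoted comma-separated values, CR/LF-delimited lines,
# active users first. Field names here are invented for illustration.

def quote(value):
    """Wrap a field in double quotes, as the row-3 calculated fields do."""
    return '"' + str(value) + '"'

def build_user_file(users, fields):
    header = ",".join(fields)
    # active users first, then inactive, mirroring the provisioning job sort
    ordered = sorted(users, key=lambda u: u["STATUS"] != "Active")
    rows = [",".join(quote(u[f]) for f in fields) for u in ordered]
    # double headline, then content rows, delimited by CR/LF
    return "\r\n".join([header, header] + rows) + "\r\n"

users = [
    {"STUD_ID": "1002", "LNAME": "Doe",   "STATUS": "Inactive"},
    {"STUD_ID": "1001", "LNAME": "Smith", "STATUS": "Active"},
]
print(build_user_file(users, ["STUD_ID", "LNAME", "STATUS"]))
```

This is only a model of the output; in Integration Center itself the equivalent structure is configured via the EDI segment rows, not written in code.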
Conclusion
It took quite some time to build this flow, including all the specialties like the double headline and the double-quote wrapping of comma-separated values. In the end this effort pays off for admins: full control of the job in terms of scheduling and adapting it to your requirements.
Important information on scheduling this flow: it turned out that scheduling it (e.g. as a nightly run) creates a job in Provisioning. If two identical flows exist, this job fails, telling you that there is already another Provisioning job for the SF User Connector. In this case, remove all jobs (delete, not just unschedule) and re-schedule the job in Integration Center.
Cheers
Christian
Thanks for sharing Christian!
In our case, additional filters under Advanced Filters were needed to include inactive employees. Do inactive employees work OK in your case?
Hi Vasily, thanks for your comment. My priority was to build a flow that does exactly the same as the provisioning job.
The answer is yes. Inactive handling works (even if a user is inactive in FO and does not yet exist in LMS, or exists as inactive in both). In Filtering you can use "In Range" with the value "active, inactive", or keep it blank, because you want everything from FO to LMS no matter whether active or inactive.
Hi Former Member, Thanks for the mention to my blog and excellent job in describing the detailed connector configuration steps.
Hi Christian!
Very interesting and useful article, thank you for your experience!
I created the same integration, but since the 20th of February I can't upload the data to the LMS. The SF User Connector usually gives the status "FAIL - FAILURE". Do you know what could cause this?
Hi Anastasia, I don’t think that this is a general issue. Did you check the logs? Normally in User Connector – SF you can enter an email address to which the logs (in your case an error log) are sent. FAIL may also occur if your file is empty ... is any data being sent via Integration Center?
BR, Christian
Yes, I’ve checked the log, but it’s empty. I have worked with LMS connectors for a few years and I’ve never seen this status. My file looks fine. I usually download the file from SFTP and then check it from the "show archived input files" menu.
However, sometimes my data is loaded, sometimes not. If I just delete a few rows using Notepad++ (so it can’t be the formatting of the original file), then the file uploads correctly! Also, when I download the preview file from IC and put it on the SFTP manually, it loads too.
Regards, Anastasia
I would like to help but honestly don't know how without even having an error message.
I think you need to narrow down the error by finding the one data set that causes the issue - maybe a user without a userId, or something similar? Something critically mandatory for LMS that is missing from the file.
Maybe you should contact SAP support on this.
BR
Christian
I tried to find any data corruption, but to no avail.
Thank you for your advice. I'll try to contact support.
Regards,
Anastasia
Hi guys,
I am on a project now, about to implement Integration Center for LMS: we will replace the Provisioning job and trigger from IC instead, to get the separation of the ID and description into different LMS fields: organization, job code, etc.
I am worried about the impact where certain fields in the platform are by design still required as concatenated id+description. So if IC is in place for Learning only and the Talent modules still rely on the UDF, the experience in reporting and admin tasks between, for example, the PMGM and LMS or CDP modules might get a bit tedious. Does anyone have experience with this?
Thanks for sharing this
One question: how did you manage the end of line for each row (!##!)?
Thanks
Hi Zakaria,
For the SF-User Connector you don't need to finish with !##!, only a ".
But for one of the other LMS connectors where you do, you would simply use the calculation:
e.g. Field Value = [userId from User] Prepend Text ' " ' Concatenate/Append ' " '
This is the final field calculation that we use in our SF-User connector.
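For illustration, the Prepend/Append calculation above amounts to wrapping the raw field value in double quotes. A minimal sketch (the userId value is invented for the example):

```python
# Sketch of the Integration Center calculation:
# Field Value = [userId from User] Prepend '"' Concatenate/Append '"'
def wrap_in_quotes(user_id):
    """Prepend and append a double quote to the raw field value."""
    return '"' + user_id + '"'

print(wrap_in_quotes("jsmith1"))  # prints "jsmith1" including the quotes
```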
If you have any further queries regarding this then please let us know.
Regards,
Nathan H.
Hi Christian,
thank you for the detailed explanation.
As I previously had experience only with the normal User Connector, I have a probably simple question about this specific IC job for the SF User Connector.
Why are the three header rows necessary, and what logic should be used when creating them?
Thanks,
Daria