SuccessFactors: Goodbye Provisioning Jobs Part 1 | Using Integration Center (IC) for synchronizing user data from Foundation to LMS via SF – User Connector


This blog post explains how to set up an Integration Center flow that creates an output file used to synchronize user data from Foundation to the Learning Management System. On the LMS side we use the SF – User Connector.

Many thanks to the earlier blog posts that prepared the ground for this scenario, but did not fully answer some questions about using IC with the LMS-specific connectors in depth:



In most scenarios a standard provisioning job is used to create a comma-separated output file which is imported by the SF User Connector.

There are, however, scenarios where an Integration Center flow exceeds the functionality of the provisioning job:

  • Delta handling
  • Filtering according to dynamic criteria
  • Multiple export jobs
  • Admin/customer-controlled jobs for synchronizing FO to LMS
  • Custom adaptations of field values (e.g. inactive in Foundation but active in LMS)



My goal is to explain how to build this flow.


Please note that there may be technical differences depending on the business configuration of your system; custom adaptation should be possible after reading this blog.



  1. Go to Integration Center
  2. Create a new schedulable flow
    Source Type: SF
    Destination Type: SFTP
    Format: EDI/Stacked Delimited -> necessary for the double header line
  3. Options
    Comma separated
    CR/LF -> Carriage Return / Line Feed -> to delimit lines correctly from each other
  4. Now, the magic 🙂 … it was quite difficult to understand how to build and use this EDI file format. In the end, you have to insert a new segment, select the segment (row 1, column 1) and click “Insert Sibling” (which creates row 1, column 2). You have to continue this for all your fields in header line 1, header line 2, and the content line. In my productive file each line has 42 fields, and it took quite some time to insert them, fill in the labels, and so on.
  5. I used calculated fields in row 3 to manage the double quotes used as a wrapper around every value per column. There were also some specialties and difficulties in calculating the correct values for Manager, HR, the Hire Date, and the Company Exit Date.
  6. Sorting should be set up as in the provisioning job (so that active users are imported first, then all inactive users).
  7. Now you are good to go. Define your sFTP destination as contracted with SAP and set your file name (e.g. user_data_xxx.txt).
  8. In the filter section you can add filtering according to your requirements (e.g. delta handling).
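The file these steps produce can be hard to picture from the EDI segment editor alone. As a sanity check, here is a minimal Python sketch of what the output mechanics amount to: two header lines, every value wrapped in double quotes, CR/LF line endings, and active users sorted before inactive ones. Field names and sample records are made up for illustration; the real file has 42 fields and is, of course, generated by Integration Center, not by a script.

```python
import csv
import io

# Hypothetical field names and labels; the productive flow maps 42 fields.
FIELDS = ["STATUS", "USERID", "LASTNAME", "MANAGER"]
LABELS = ["Status", "User ID", "Last Name", "Manager"]  # second header line

# Made-up sample records for illustration only.
records = [
    {"STATUS": "inactive", "USERID": "u2", "LASTNAME": "Doe", "MANAGER": "u1"},
    {"STATUS": "active", "USERID": "u1", "LASTNAME": "Smith", "MANAGER": "NO_MANAGER"},
]

# Active users first, then inactive (mirrors the provisioning-job sort order).
records.sort(key=lambda r: 0 if r["STATUS"] == "active" else 1)

buf = io.StringIO()
# QUOTE_ALL wraps every value in double quotes; lineterminator gives CR/LF.
writer = csv.writer(buf, quoting=csv.QUOTE_ALL, lineterminator="\r\n")
writer.writerow(FIELDS)   # header line 1: technical field names
writer.writerow(LABELS)   # header line 2: human-readable labels
for rec in records:
    writer.writerow(rec[f] for f in FIELDS)

print(buf.getvalue())
```

Comparing a file generated this way against the "show archived input files" view in LMS is a quick way to verify that your IC flow produces the expected structure.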



It took quite some time to build this flow with all its specialties, like the double header line and the double-quote wrapping of comma-separated values. In the end, this effort turns out to be useful for admins: full control of the job in terms of scheduling and adapting it to your requirements.

Important information on scheduling this flow: it turned out that when you schedule this flow (e.g. for a nightly run), a job is created in Provisioning. If, however, two of the same flows exist, this job fails, telling you that there is another provisioning job for the SF User Connector. In that case, remove all jobs (delete them, do not just unschedule them) and re-schedule the job in Integration Center.







  • Thanks for sharing Christian!

    In our case, additional filters in Advanced Filters were needed to include inactive employees. Do inactive employees work OK in your case?


    • Hi Vasily, thanks for your comment. My priority was to build a flow that does exactly the same as the provisioning job.

      The answer is yes. Inactive handling works (even if a user is inactive in FO and does not yet exist in LMS, as well as when the user exists as inactive in both). In the filter you can use “In Range” with the value “active, inactive”, or leave it blank, because you want “everything” from FO to LMS, no matter whether active or inactive.

  • Hi Christian!

    Very interesting and useful article, thank you for your experience!

    I created the same integration, but since the 20th of February I can’t upload the data to the LMS. The User Connector – SF usually gives the status “FAIL – FAILURE”. Do you know why that might be?

    • Hi Anastasia, I don’t think that this is a general issue. Did you check the logs? Normally in User Connector – SF you can enter an email address where logs (in your case an error log) will be sent. FAIL may also occur if your file is empty … do you have data being sent via Integration Center?
      BR, Christian

      • Yes, I’ve checked the log, but it’s empty. I have worked with LMS connectors for a few years and I’ve never seen this status. My file looks good enough. I usually download the file from the sFTP server and then check it from the “show archived input files” menu.

        However, sometimes my data is loaded, sometimes not. If I just delete a few rows using Notepad++ (so it can’t be the formatting of the original file), the file uploads correctly! Also, when I download the preview file from IC and put it on the sFTP server manually, it is loaded too.

        Regards, Anastasia

        • I would like to help but honestly don’t know how without even having an error message.

          I think it is necessary to narrow down your error by finding the one data set which causes the issue – maybe a user without a userId or something? Something critical and mandatory for LMS which is not given in the file.

          Maybe you should contact SAP support on this.