This blog post explains how to set up an Integration Center flow for importing a user data file, in case a file-based approach is used to import users into the foundation.
Honest words first: it works. You can import the file-based CSV via IC. But it turned out that the provisioning job offers some additional conveniences which, in the end, convinced me to stay with the provisioning job instead of using IC.
This does not apply to Part 1 of my blog series, where I fully use an IC flow for synchronizing FO to LMS.
In most use cases where the file-based user data approach is chosen, a provisioning job is scheduled to import this data from sFTP.
In my scenario there was a use case in which I did not want to enhance the SAP standard export report that generates the file, because SAP had promised to add some new GDPR-related features to it. So I temporarily repurposed a field and used my IC flow to move its value into the correct field in SF.
I could not find any other use case where IC might be better than the provisioning job (except when you have a clear need to apply a calculation rule to one or more fields before importing the file).
The goal of this blog post is to show the difficulties you have to deal with when building this flow with IC.
Please note that there may be technical differences depending on the business configuration of your system; custom adaptation should be possible after reading this blog.
During realization it turned out that a scheduled inbound sFTP to SuccessFactors integration does not support filtering, sorting, or other options such as how to handle header lines.
The first challenge is handling the double header line. I found no other option than creating a dummy user which is inactivated every time the integration is triggered and which thus absorbs the second header line (otherwise shown as a value when importing a sample file).
- Create a new inbound flow, sFTP to SuccessFactors as CSV
- Import a sample file, ideally containing both active and inactive users
- Now we have to take care of the double header line by applying a calculation rule to each of the columns … yes: each of the columns.
You need to check whether the currently processed row contains the second header line or a real value you want to import into the system. If it contains header text, leave most of the columns empty except for the user ID (in my case dummy999) and the status (in my case inactive). If it is a real value, return it.
For Manager, Matrix Manager and HR, apply a default value (NO_MANAGER or NO_HR) via the "more options" settings of the calculation.
For date fields, apply the date format MM/dd/yyyy via the "more options" settings of the calculation.
- Set up your schedule and your sFTP connection and you are good to go.
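The per-column calculation described above can be sketched in plain code. The following Python snippet is only an illustration of what the rules effectively do (the column names, the `HIREDATE` field, the assumed ISO source date format, and the function name are my own assumptions for this sketch, not IC syntax):

```python
from datetime import datetime

DUMMY_USER_ID = "dummy999"  # dummy user that absorbs the second header row

def calculate_column(row_value, column_name, is_header_row):
    """Mimics the per-column calculation rule described above."""
    if is_header_row:
        # Second header line: map it onto the inactive dummy user,
        # keep every other column empty.
        if column_name == "USERID":
            return DUMMY_USER_ID
        if column_name == "STATUS":
            return "inactive"
        return ""
    # Real data row: apply the per-field defaults and formats.
    if column_name in ("MANAGER", "MATRIX_MANAGER") and not row_value:
        return "NO_MANAGER"
    if column_name == "HR" and not row_value:
        return "NO_HR"
    if column_name == "HIREDATE" and row_value:
        # reformat an assumed ISO source date into MM/dd/yyyy
        return datetime.strptime(row_value, "%Y-%m-%d").strftime("%m/%d/%Y")
    return row_value
```

For example, `calculate_column("", "MANAGER", False)` returns the `NO_MANAGER` default, while a header row yields the dummy user in the USERID column and empty strings everywhere else.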
It seems quite easy to build this flow, but it takes some time to set up a calculation with all options (dates, special fields) for each of the values you want to import.
The dummy user is also not what a professional solution should look like (imho), but it is the only way to deal with the double header line.
Also, a lot of warnings and messages were raised where you would have to start adapting the whole business configuration just to match the IC flow.
You would also have to enhance your calculations with messages/warnings if you want to do it right … which raises the question of how much time to invest.
Another thing: if your values contain a COMMA, you need to replace it in advance, or IC will treat it as a separator … there is no option for this, and again you would have to adapt everything just to match your IC flow.
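Since IC offers no option for this, the comma replacement has to happen before the file lands on the sFTP server. A minimal pre-processing sketch in Python (the function name, file paths, and the semicolon replacement character are my own choices for illustration):

```python
import csv

def strip_commas(infile, outfile, replacement=";"):
    """Replace commas inside field values so IC does not
    misread them as column separators."""
    with open(infile, newline="", encoding="utf-8") as src, \
         open(outfile, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        for row in reader:
            writer.writerow([value.replace(",", replacement) for value in row])
```

A quoted value such as `"Manager, Sales"` would come out as `Manager; Sales`, which IC can then consume as a single column.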
If you have a use case where you must do something during this flow that is not possible while creating the file, the flow is your solution. But this is where I stopped and stayed with the provisioning job.
PS: I think HCI/SCI (whatever name SAP currently uses for SAP Cloud Integration) for importing the sFTP file, or directly using standard or custom integrations, would be my favorite choice for this scenario.