Ariba and SAP (SRM and ECC) integration
I recently worked on an Ariba integration project and would like to share my experience and findings. We implemented the below Ariba modules:
- Ariba Sourcing Professional – Upstream
- Ariba Supplier Information and Performance Management – Upstream
- Ariba Contract Management Professional – Downstream
- Ariba Savings and Pipeline Tracker
In our scope of work, contracts were created in Ariba and only a read-only copy was passed to SRM. SRM was used for PO creation, and invoicing was then done in ECC. This unique landscape made the integration a bit challenging.
The standard integration works well with the ECC system but not with SRM, so some of the extracted data lacked SRM-specific checks such as:
- User attributes: currency and company code specific checks
- User roles and authorizations in SRM
- Product category vs. GL mapping
- Authorization to create specific document types based on user attributes
- User-specific purchasing group information
Since the data is sent only from the ECC system and not from SRM, the business had to do away with the SRM-specific checks when raising a contract in Ariba. Once you have decided to start the Ariba implementation, make sure you follow the below strategy for master data extraction.
You need to ask, or know the answers to, the below questions:
- What data is needed in Ariba?
- Can that data be extracted using the standard extractors?
- Which data sets will be a one-time load vs. an incremental load?
- What will the frequency of the incremental load be?
The way the standard integration works is that the data is downloaded as a CSV file from the ECC system. The file is placed on a shared folder or drive and then either uploaded manually to Ariba or picked up from the shared folder and loaded into Ariba directly.
Basis and security may need to create some external OS commands via SM69 and add them to the existing roles so that you are able to extract the files.
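External OS commands defined in SM69 are normally executed from ABAP through the standard function module SXPG_COMMAND_EXECUTE, which is where the authorizations behind those roles are checked. Below is a minimal sketch of such a call; the command name ZARIBA_MOVE and the parameter string are placeholders I made up for illustration, and the exception list is abbreviated:

* Execute a custom SM69 external command (hypothetical name) on the application server
DATA: lt_protocol TYPE STANDARD TABLE OF btcxpm,
      lv_status   TYPE c LENGTH 1,
      lv_exitcode TYPE i.

CALL FUNCTION 'SXPG_COMMAND_EXECUTE'
  EXPORTING
    commandname           = 'ZARIBA_MOVE'               "hypothetical SM69 command
    additional_parameters = '/tmp/ariba/masterdata.csv' "placeholder argument
  IMPORTING
    status                = lv_status
    exitcode              = lv_exitcode
  TABLES
    exec_protocol         = lt_protocol
  EXCEPTIONS
    no_permission         = 1
    command_not_found     = 2
    security_risk         = 3
    OTHERS                = 4.
IF sy-subrc <> 0.
  WRITE: / 'External command failed, sy-subrc =', sy-subrc.
ENDIF.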
The master data extraction program we used was /ARBA/MASTER_DATA_EXPORT. Since a couple of changes were needed, I copied it as a Z program.
There is a piece of code that sends the file to the local folder and then removes it. Since I wanted to see what data was being extracted, I commented out that code, which can be found at the below PERFORM statement:
PERFORM remove_directory USING gv_filepath.
After you execute the program with that code commented out, the extracted CSV file remains in your drive folder.
Adding custom filters to the extracted data
The data extracted above is all the data in the system. The standard program /ARBA/MASTER_DATA_EXPORT calls different function modules (for users, GL accounts, vendors, and so on) and extracts the data.
SAP has provided a provision for adding customer-specific filters; for example, you may want to extract only specific company codes or only a specific employee group. The custom logic for the data being sent can be maintained in the /ARBA/TVARV table. To understand which values the program looks for, I read the code: in the standard program there is a FORM validations, which contains specific filters for each of the data sets, for example /ARBA/SL_PARTNER_TYPE.
The filter criteria can be seen in the main program /ARBA/MASTER_DATA_EXPORT.
The FM /ARBA/PREFILTER reads this data from table /ARBA/TVARV.
Sometimes the standard program /ARBA/MASTER_DATA_EXPORT calls an FM like /ARBA/VENDOR_ONLY_EXPORT, which contains its own pre-filter value.
Another FM that is called to filter the data based on user parameters is /ARBA/PARAMETER.
This also reads the same table, /ARBA/TVARV.
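To make the pre-filter mechanism concrete, here is a minimal sketch (my own illustration, not the standard code) of how entries maintained in /ARBA/TVARV can be turned into a range and applied to a selection. It assumes /ARBA/TVARV follows the same selection-variable layout as the standard TVARVC table (NAME, SIGN, OPTI, LOW, HIGH); verify the actual field names in SE11 before reusing it:

* Build a range from the /ARBA/COMPANY_CODE_EXPORT rows and restrict the
* company codes that are extracted (an empty range means no restriction).
DATA: lt_tvarv TYPE STANDARD TABLE OF /arba/tvarv,
      ls_tvarv TYPE /arba/tvarv,
      lr_bukrs TYPE RANGE OF bukrs,
      ls_bukrs LIKE LINE OF lr_bukrs,
      lt_t001  TYPE STANDARD TABLE OF t001.

SELECT * FROM /arba/tvarv INTO TABLE lt_tvarv
  WHERE name = '/ARBA/COMPANY_CODE_EXPORT'.

LOOP AT lt_tvarv INTO ls_tvarv.
  ls_bukrs-sign   = ls_tvarv-sign.   "e.g. I
  ls_bukrs-option = ls_tvarv-opti.   "e.g. EQ
  ls_bukrs-low    = ls_tvarv-low.
  ls_bukrs-high   = ls_tvarv-high.
  APPEND ls_bukrs TO lr_bukrs.
ENDLOOP.

SELECT * FROM t001 INTO TABLE lt_t001
  WHERE bukrs IN lr_bukrs.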
You can talk to the Ariba SPOC, and they can help you understand all the parameters. I referred to the link to find the list of all parameters; alternatively, if you are a bit technical and can manage to read the program and the different FMs it calls, you can see in the code which parameters the program checks for. I then maintained these values in the table.
Below are some of the values I maintained in the /ARBA/TVARV table:
Parameter | Value / Usage
/ARBA/CLASS | Value CL11111
/ARBA/COMPANY_CODE_EXPORT | BUKRS: only the company codes you want to extract, if not all
/ARBA/EXTERNAL_SID | Maintain the client system ID, e.g. SD1CLNT110
/ARBA/GENERAL_LEDGER_EXPORT | Specific GL series that need to be included/excluded
/ARBA/MATERIAL_CHARACT | ATINN
/ARBA/MATERIAL_ONLY_AML | MATNR
/ARBA/RL_PARTNER_TYPE | PARVW
/ARBA/RL_VENDOR_ADDRESS |
/ARBA/SL_PARTNER_TYPE | PARVW
/ARBA/SPLIT_NOMREC | I EQ 40000: to split files if the number of records is more than 40K
/ARBA/TEMP_DIRECTORY | Directory where the extracted files should be placed
/ARBA/VENDOR_ONLY_EXPORT | KTOKK: specific vendor account group
/ARBA/ZTIME | Time-zone-specific settings
BALANCE_SHEET_ACCOUNTS | Include balance sheet accounts in the extraction; by default they are not extracted
This is not an exhaustive list, and the SAP Ariba contact point for your project should let you know the specific parameters you can use to build custom extraction logic.
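For parameters that take several values, one row is maintained per value, and the rows are distinguished by the Number column. As an illustration (assuming the TVARVC-style Sign/Option/Low columns and two sample company codes), restricting the extract to two company codes would look like this:

Name | Number | Sign | Option | Low
/ARBA/COMPANY_CODE_EXPORT | 0 | I | EQ | 1458
/ARBA/COMPANY_CODE_EXPORT | 1 | I | EQ | 1637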
I will be creating a separate blog on the other learnings and the technical object details I used in the Ariba implementation project.
Thanks for reading, and let me know what you think of the blog.
Do read my other blog on Ariba certification here:
Regards
Vinita
Nice blog, Vinita.
We are also working on similar ground, filtering only the required data to be exported via TVARV.
Thanks a lot for the effective blog.
I'm trying to send data from SAP ECC to Ariba via /ARBA/MASTER_DATA_EXPORT. The file is getting extracted, but it shows the error "The specified path is invalid".
I'm trying to send sourcing data with the help of direct connectivity, without middleware.
Ashish,
Please check if the entry maintained in /ARBA/TEMP_DIRECTORY is valid or not.
Br,
Manoj
Thanks, Manoj, for the reply.
As you said, I had given the entry as ".\Masterdata" in /ARBA/TEMP_DIRECTORY, and I also created the ERP system name in the Ariba portal.
I have a question about the realm ID: is the realm ID in the URL of the Ariba portal the same one I have to use in SOAMANAGER, or some other one? And what about the computer name of the access URL?
Hi Ashish
You have to use the same Realm ID even on SOAMANAGER.
Kind regards,
Sabelo.
Hi
I have maintained the following in the /ARBA/TVARV table for the /ARBA/COMPANY_CODE_EXPORT parameter. I want the master data program to filter on 2 company codes, but it's not doing that; am I missing something?
Please see the attached image for more details:
Can you attach the error which you are getting?
Hi,
You need to enter two rows where the first row is EQ 1458 and the second is EQ 1637. The first row should have Number = 0 and the second row should have Number = 1.
Best regards,
Kristoffer
Hi Kristoffer
It works; I followed what you said and it's now working, thank you.
Thanks,
Sabelo.
I am not getting an error, but it's not filtering; it's bringing back all the company codes, even the ones I don't need. I only need it to return company codes 1458 and 1637.
See the attached images (Results from tcode - AL11):
I think it accepts only single entries. Can you try it with 2 different entries instead of a comma? Also, you can filter in the BAdI as well: for /ARBA/MASTER_DATA_EXPORT there is a method related to company code extraction, as far as I remember (I do not have access to the system at the moment).
Hi Vinita
The option of using methods for filtering also works; I used it for complex filtering.
Please see the name of the method for filtering company codes:
method /ARBA/IF_EXP_MASTER_DATA~PUBLISH_COMPANYCODE.
Thanks,
Sabelo.
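For anyone reading this later, a minimal sketch of such a method-based filter (the method name is from the comment above, but the changing parameter CT_COMPCODE and its BUKRS field are placeholder names, since the real signature is not shown here; check /ARBA/IF_EXP_MASTER_DATA in SE24 before implementing):

METHOD /arba/if_exp_master_data~publish_companycode.
* Keep only the company codes that should go to Ariba
* (parameter and field names are assumptions, not the real signature).
  DELETE ct_compcode WHERE bukrs <> '1458' AND bukrs <> '1637'.
ENDMETHOD.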
Hi
I am getting an error whenever I try to execute the CurrencyConversion extract under Cross-Application Configuration Data. Am I supposed to add something in table /ARBA/CURRENCY_CONVERT_EXPORT? Please see the attached images for more details:
Thanks in advance,
Sabelo.