Technical Articles
Mackenzie Moylan

Extracting SAP Ariba Reporting API Data using SAP Integration Suite

In this installment of using SAP Ariba with BTP, I'll change topics and focus on how to extract SAP Ariba analytical data with SAP Integration Suite. Before we dive into the more technical components, let's look at why this is a valuable use case for SAP Ariba customers.

Currently, SAP Ariba customers can leverage pre-built content on SAP Analytics Cloud to gain greater visibility and insight into their spend data. Why is this important? It allows customers to improve savings, drive compliance, and increase operational efficiency. For more information on this topic, please check the linked blogs below:

How Enterprise Analytics for Procurement Improves the Bottom Line

Let’s Play: Make winning procurement moves with SAP Business Technology Platform and Enterprise Analytics

The benefit of using SAP Analytics Cloud is its pre-built content for displaying stories and insights. But how do we extract this data from SAP Ariba? That's where our data extractor comes into play.

The SAP Ariba Analytical Data Extractor

What we've done with Integration Suite is construct an iFlow that extracts the Analytical Reporting API data and transforms it from JSON into zipped XML files for consumption. The files can then be fed into a data storage solution, such as SAP Data Warehouse Cloud. In this blog post, we'll focus on extracting a single fact table, InvoiceLineItemFact.
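To make the transformation step concrete, here is a minimal sketch of repackaging a JSON page of records as XML inside a zip archive. The payload shape (`Records`, the field names) is illustrative only, not the exact Ariba response schema, and this stands in for what the iFlow's mapping steps do internally.

```python
import io
import json
import zipfile
import xml.etree.ElementTree as ET

def json_page_to_xml(page_json: str, root_tag: str = "InvoiceLineItemFact") -> bytes:
    """Convert one JSON page of records into a flat XML document."""
    records = json.loads(page_json).get("Records", [])
    root = ET.Element(root_tag)
    for record in records:
        row = ET.SubElement(root, "Record")
        for field, value in record.items():
            ET.SubElement(row, field).text = str(value)
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

def pack_zip(xml_payloads: list) -> bytes:
    """Bundle the XML pages into a single in-memory zip file."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        for i, payload in enumerate(xml_payloads):
            archive.writestr(f"page_{i}.xml", payload)
    return buffer.getvalue()

# Hypothetical single-record page for illustration
sample = json.dumps({"Records": [{"InvoiceId": "INV-1", "Amount": 120.5}]})
zip_bytes = pack_zip([json_page_to_xml(sample)])
```

The resulting bytes are exactly the kind of zip artifact the last iFlow hands off to a downstream storage tool.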

What is SAP Integration Suite?

SAP Integration Suite is SAP's integration platform-as-a-service (iPaaS) on the Business Technology Platform. Among other capabilities, it provides Cloud Integration (formerly CPI), which lets you design, deploy, and monitor integration flows (iFlows) between SAP and third-party systems.

Creating SAP Ariba Analytical Reporting API

To get this enabled, you will first need to create an SAP Ariba Analytical Reporting API application in the developer portal, if you don't already have one.

  1. Log on to the SAP Ariba developer portal, creating an account first if you need to. Then follow the video below to create an API application:
  2. Once that is done, copy the OAuth Client ID, OAuth Secret, and API key. You will need them later.

Using SAP Integration Suite

Now you'll need to log in to your BTP cockpit and access Integration Suite. From there, click on Design, Develop, and Operate Integration Scenarios:

From here, go to the Design tab on the top left, indicated by a pencil icon. You will need to create five iFlows to perform the whole data extraction process:

  1. Establish a connection between Ariba and CPI.
  2. CPI submits the view template request to Ariba.
  3. CPI checks whether pagination is needed.
  4. CPI retrieves the information from Ariba.
  5. CPI converts the data into a format that can be sent to a database solution.
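The steps above follow the asynchronous job lifecycle, which can be sketched in plain Python as a driver loop. The field names (`jobId`, `status`, `files`) are assumptions modeled on the asynchronous Analytical Reporting API pattern, not taken verbatim from the iFlows; the actual HTTP calls are injected as callables so the control flow stands on its own.

```python
import time

def run_extraction(submit, check, fetch, poll_seconds: float = 0.0) -> list:
    """submit() starts a job; check(job_id) returns a status dict;
    fetch(job_id, file_name) downloads one result file."""
    job_id = submit()["jobId"]           # step 2: submit the view template request
    while True:
        status = check(job_id)           # steps 3-4: poll until the job completes
        if status["status"] == "completed":
            break
        time.sleep(poll_seconds)
    # steps 4-5: retrieve every result file for conversion downstream
    return [fetch(job_id, name) for name in status["files"]]
```

In the real iFlows, `submit`, `check`, and `fetch` correspond to HTTPS request steps against the three Ariba endpoints, with the loop implemented as a looping process call.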

A collection of iFlows will be available shortly that you can simply download and import into your CPI tenant. This will be delivered via a Discovery Center mission.

Before we get connected, you will need to create two security credentials, designated by the eye icon on the left.

Security Materials hold the credentials that allow data to flow from the source system. You will need to create a User Credentials artifact (for the access token) and an OAuth2 Client Credentials artifact to call the data.

These credentials come from the API you created in the developer portal: you'll need the API key, OAuth Client ID, and OAuth Secret. Once the artifacts have been created, click Deploy to save them.
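Under the hood, the OAuth2 Client Credentials artifact performs a standard OAuth 2.0 client-credentials exchange. Here is a sketch of the request it builds; the token URL shown is a placeholder assumption, so use the OAuth server URL displayed for your API in the developer portal.

```python
import base64

# Assumed placeholder; take the real token URL from your developer portal
TOKEN_URL = "https://example.ariba.com/v2/oauth/token"

def build_token_request(client_id: str, client_secret: str) -> dict:
    """Build the headers and form body for a client-credentials token call."""
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": TOKEN_URL,
        "headers": {
            "Authorization": f"Basic {basic}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "body": "grant_type=client_credentials",
    }
```

The access token in the response is what the User Credentials artifact then carries on subsequent API calls.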

Importing and Running the Job

Once those have been created, we will need to download and import our iFlows. One benefit of SAP Integration Suite is that prebuilt iFlows are relatively user friendly to work with. We are currently working on making these accessible for customers, but for educational purposes the screenshots below show how simple it is to download and upload prebuilt iFlows.

To upload an iFlow, go to the Design tab and open the package you've built. Then click Edit and add an integration flow for each individual flow you've downloaded; if you want to upload an entire script collection, you can do that as well.

Once that has been imported, you will need to add your company-specific information in the configuration section. In the Query field, add your SAP Ariba realm name. If you have a parent-child site configuration, use the parent realm, since that's where the reporting data lives.

For the next section, you'll want to add your API key, the API URL, and the date from which it will query data.


You'll want to repeat these steps for the other iFlows associated with this fact table. Once that is done, save and deploy. Once the jobs have run successfully, you'll have a zip file containing the API data in XML format that can be used by reporting and data storage tools.
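To illustrate how those configuration values (realm, API key, query date) end up in the request, here is a sketch of assembling the view template call. The path segments and filter names (`views`, `filters`, `updatedDateFrom`) are assumptions modeled on Analytical Reporting API conventions and should be checked against your API's documentation.

```python
import json
from urllib.parse import urlencode

def build_view_request(base_url: str, realm: str, api_key: str,
                       template: str, date_from: str) -> dict:
    """Assemble URL, headers, and body for one view template request."""
    query = urlencode({"realm": realm})
    return {
        "url": f"{base_url}/views/{template}?{query}",
        "headers": {"apiKey": api_key, "Content-Type": "application/json"},
        "body": json.dumps({"filters": {"updatedDateFrom": date_from}}),
    }
```

In the iFlow, these same three values are externalized parameters, which is why each copy only needs its configuration section edited rather than the flow itself.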

We will have a Discovery Center mission aligned with this use case coming this summer.

There will also be a second part to this blog coming out this summer. It will focus on how to send this data to SAP Data Warehouse Cloud so it can be used with pre-built content on SAP Analytics Cloud via an automated end-to-end process.

Assigned Tags

      Mustafa Bensan

      Hi Mackenzie,

      This solution partially addresses a long-time gap in the SAP analytics product suite: the need to load Ariba data into SAP Analytics Cloud (SAC), SAP Data Warehouse Cloud (DWC), and SAP BW/4HANA, which is a common request from customers.  To this day, none of SAC, DWC, or SAP BW/4HANA has an out-of-the-box connector for Ariba.  I think it is reasonable for SAP customers to expect that such data source connectivity is at least provided natively for SAP's own products, without the need for an additional integration layer such as SAP Integration Suite at additional cost.

      That being said, the SAP Integration Suite example in this blog is a good start, yet only a partial solution to the problem.  What would be of real value is an end-to-end Integration Suite iFlow that also takes the output ZIP file and transfers the data to SAC, DWC, or SAP BW/4HANA.  In the end, this is a common problem for SAP customers, so they shouldn't each have to reinvent the wheel by figuring out how to automate the upload of the Ariba ZIP file data to SAC, DWC, or SAP BW/4HANA.



      Mackenzie Moylan
      Blog Post Author

      Hi Mustafa,

      We've actually developed an end-to-end process for taking the data from Ariba and sending it to DWC using SAP Integration Suite. I would recommend staying tuned; we will provide more thorough content on that process this summer. The current focus of this article is the data extraction piece using SAP Integration Suite.



      Mustafa Bensan

      Hi Mac,

      That's great to hear.  I will certainly stay tuned and look forward to the more comprehensive content when ready.



      Bhargava Krishna Talasila

      Hi Mackenzie,

      Thank you for sharing the information.

      We are working on a similar use case where we consume the Operational Reporting for Sourcing APIs and post the data to an SAP BW application.

      Please see below the issues we are facing with both the synchronous and asynchronous Ariba Open APIs, and please share your thoughts.

      Synchronous APIs

      Initially we implemented synchronous iFlows but got stuck on the issue below.

      Due to an Ariba synchronous API limitation, the processing time is around 18 to 20 minutes, depending on the number of records and pages to be fetched from Ariba.

      It is blocking our daily process with HTTP 429 Too Many Requests errors, as the same API is being used by other applications to fetch data from Ariba.


      API limitations

      For example:

      We are expecting 4,800 records, which the API itself splits into 96 pages. The API allows only 6 calls per minute, so it needs roughly 17 minutes to transfer the whole set.
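As a quick sanity check on the arithmetic above, a throttled client can compute its minimum transfer time directly; 96 calls at 6 per minute is 16 minutes of pure pacing, which lines up with the roughly 17 minutes reported once request latency is included. The 50-records-per-page size here is inferred from the figures above, not a documented constant.

```python
import math

def transfer_minutes(records: int, page_size: int = 50,
                     calls_per_minute: int = 6) -> float:
    """Minimum wall-clock minutes to page through `records` at the rate cap."""
    pages = math.ceil(records / page_size)
    return pages / calls_per_minute

# transfer_minutes(4800) -> 16.0
```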

      We approached SAP about this issue and they suggested using the asynchronous APIs.

      Asynchronous APIs

      SAP suggested implementing the iFlow with the asynchronous APIs, which makes the process more complex because of their design. We must use three APIs (job submission, job status, file request) to fetch the data asynchronously as a zip file, then unzip it to extract the data and perform the necessary validations and transformations.

      How to create a client application using the asynchronous Analytical reporting API | SAP Help Portal


      Best Regards

      Bhargava Krishna Talasila

      Mackenzie Moylan
      Blog Post Author

      Hi Bhargava,

      Those limitations and that advice are correct from an SAP Ariba perspective. You will need to use the asynchronous Analytical Reporting API to fetch larger sets of data. It is more complex than the synchronous API, but that is the advice I would give anyone too.

      We're using the asynchronous method for our CPI Analytical Reporting API extractor too. The Operational Reporting for Procurement and Sourcing APIs are synchronous and are typically used in narrower use cases, i.e., the specific transactional documents/processes that you'd like to see in real time.

      Hope this helps!


      Bhargava Krishna Talasila

      Hi Mac,

      Thank you for your response and information. It is very helpful.


      Bhargava Krishna Talasila

      Jaime Elias Cunha Jr.

      Hi Mac

      Is there any iFlow that I can download from GitHub and upload to CPI to accelerate this implementation?

      best regards



      Aman Dwivedi

      Hi Mac,

      Great work. Can you please share the iFlow package file or update it on GitHub? The current one does not seem to be working.