ajmaradiaga
Developer Advocate
In this blog post, I will share my findings from extracting data and developing a near real-time reporting solution for SAP Ariba using the available SAP Ariba APIs. I also include a brief comparison between the Analytical Reporting and Operational Reporting APIs.

Unfortunately, there is no mechanism, e.g. webhooks, for an external application/listener to be notified when a particular object changes in SAP Ariba, for example when a Sourcing Request is created/updated/deleted. This means that we need to poll the available APIs to achieve near real-time reporting of SAP Ariba data. For this, you will need to use a combination of the Operational Reporting APIs and other object-specific APIs.
If you are unfamiliar with the SAP Ariba APIs and the SAP Ariba Developer Portal and don't know where to start, I recommend visiting this blog post first - SAP Ariba developer portal – How to create applications and consume the SAP Ariba APIs

Given that we are interested in near real-time reporting, we will mainly be using the synchronous Operational Reporting API. That said, there are scenarios where the asynchronous API will be more suitable, e.g. migration or consolidation of data, so choose wisely when deciding which API best fits your needs.
The examples below cover the Operational Reporting APIs for Sourcing, but the principles mentioned apply to the Procurement Operational Reporting APIs as well.
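To make this more concrete, below is a minimal sketch (in Python, using the requests library) of what a single call to the synchronous Operational Reporting API for Sourcing could look like. The base URL, view template name, realm and response field names are illustrative assumptions; how to create an application, obtain the OAuth token and find the exact endpoint for your realm is covered in the blog post linked above.

import requests

# All values below are illustrative assumptions - check the application details
# in the SAP Ariba developer portal for the exact host, path and credentials.
BASE_URL = "https://openapi.ariba.com/api/sourcing-reporting-details/v1/prod"
VIEW_TEMPLATE = "MyTaskView"           # hypothetical view template name
REALM = "MyRealm-T"                    # hypothetical realm
API_KEY = "<application key>"          # from the developer portal
ACCESS_TOKEN = "<OAuth access token>"  # obtained via the OAuth client credentials flow

response = requests.get(
    f"{BASE_URL}/views/{VIEW_TEMPLATE}",
    params={"realm": REALM},
    headers={"apiKey": API_KEY, "Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()

# The synchronous API returns the matching documents as JSON; the exact
# response structure ("Records" here) should be validated against your realm.
for record in response.json().get("Records", []):
    print(record)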

Can I use the SAP Ariba Analytical Reporting APIs?
Short answer: no. SAP Ariba provides reporting out of the box, but that reporting is based on analytical data. This analytical data is accessible via the SAP Ariba Analytical Reporting API, but there is a delay of at least 6 hours between data being created/updated and it becoming available in the Analytical Reporting API. Below are some differences between the Analytical and Operational Reporting APIs:

|  | Analytical Reporting API | Operational Reporting API |
| --- | --- | --- |
| Data availability | Delay of at least 6 hours | Real-time |
| Document types | Composed of Facts and Dimensions, exactly the same as those that are part of reporting within SAP Ariba | Simple tables |
| Relationships | The metadata provides the relationships between document types, e.g. a Fact table has relationships with Dimensions | No relationships between document types specified in the metadata |
| Richness of data | Includes far more data than its equivalent in the Operational Reporting API | Not as many tables (and fields) exposed as in the Analytical Reporting API |
| Filtering | You are able to filter on almost any field that is selectable | Not many fields available for filtering, generally TimeCreated and TimeUpdated; additional filtering will likely need to happen in the extraction program |

If you want to know how to easily visualise the metadata of the Analytical Reporting API, check out this blog 👉 https://blogs.sap.com/2020/10/28/generating-entity-relationship-diagrams-from-the-sap-ariba-analytic.... It shows you how to generate entity-relationship diagrams from the metadata.

Now that we are familiar with some of the differences between the Analytical and Operational Reporting APIs, let's get back to the Operational Reporting APIs and their use for near real-time reporting.

Tips and Tricks



  • Get very familiar with the fields of each of the document types you are interested in. Validate your assumptions and what the API returns against what you see in the SAP Ariba UI. Catching a data-related bug in a report is much harder and more costly than noticing it during development.

  • If the data is not available in the Operational Reporting API, you might be able to use a subset of the data returned by the reporting API to call another API. For example, the Operational Reporting API does not expose Sourcing Requests as a document type, but the Task document type returns tasks for Sourcing Requests. Also, the ProjectAuditInfo document type returns information when a Sourcing Request is updated. We can use the Sourcing Request ID to call another API, e.g. the External Approval API, and retrieve the details of a Sourcing Request.

  • Supplier data is not available in the Operational Reporting API. For this, use the Supplier Data API with Pagination. You can filter by updated date to retrieve suppliers updated within a specific time frame.

  • Master data can be used to enrich the data returned by the Operational Reporting API. For this, you can use the Master Data Retrieval API.

  • It is possible to define view templates with time created and time updated as filters, as shown in the filterExpressions below. Sometimes you might need to extract data based on when it was created, and other times it is better to extract data based on when it was last updated.


"filterExpressions": [
{
"name": "createdDateTo",
"field": "TimeCreated",
"op": "<=",
"defaultValue": null
},
{
"name": "createdDateFrom",
"field": "TimeCreated",
"op": ">",
"defaultValue": null
},
{
"name": "updatedDateTo",
"field": "TimeUpdated",
"op": "<=",
"defaultValue": null
},
{
"name": "updatedDateFrom",
"field": "TimeUpdated",
"op": ">",
"defaultValue": null
}
]
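
Once a view template with these filterExpressions is in place, the extraction program can poll the synchronous API at a fixed interval and pass the updatedDateFrom/updatedDateTo values for the window since the last run. Below is a minimal sketch of such a polling loop; as before, the host, path, view template name, realm, the way the filter values are passed and the response structure are assumptions to validate against the API documentation for your realm.

import json
import time
from datetime import datetime, timedelta, timezone

import requests

# Illustrative values, as in the earlier sketch
BASE_URL = "https://openapi.ariba.com/api/sourcing-reporting-details/v1/prod"
VIEW_TEMPLATE = "MyTaskView"
REALM = "MyRealm-T"
API_KEY = "<application key>"
ACCESS_TOKEN = "<OAuth access token>"

POLL_INTERVAL = timedelta(minutes=15)  # how often to poll the synchronous API


def fetch_updated_documents(updated_from: datetime, updated_to: datetime) -> list:
    """Fetch documents updated within [updated_from, updated_to)."""
    # The filter names must match the filterExpressions defined in the view template
    filters = {
        "updatedDateFrom": updated_from.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "updatedDateTo": updated_to.strftime("%Y-%m-%dT%H:%M:%SZ"),
    }
    response = requests.get(
        f"{BASE_URL}/views/{VIEW_TEMPLATE}",
        params={"realm": REALM, "filters": json.dumps(filters)},
        headers={"apiKey": API_KEY, "Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    response.raise_for_status()
    return response.json().get("Records", [])


window_start = datetime.now(timezone.utc) - POLL_INTERVAL
while True:
    window_end = datetime.now(timezone.utc)
    for record in fetch_updated_documents(window_start, window_end):
        pass  # push each record to the target system / data warehouse here
    window_start = window_end
    time.sleep(POLL_INTERVAL.total_seconds())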

Things to be aware of



  • Although it is possible to define a view template in the Operational Reporting API with a document type of the Analytical Reporting API, do not expect your requests to magically return real-time data 😃. This is a bug in the API.

  • Validate the structure of the underlying objects returned by a document type. Some document types return data for different objects, e.g. the Task document type in the Operational Reporting API can return tasks for Sourcing Requests, Sourcing Projects, and Contract Workspaces. The structure of a task for a Sourcing Request, a Sourcing Project, or a Contract Workspace might vary slightly, as some of these might contain additional data.
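
For instance, when flattening Task records into a reporting table, it is safer to treat attributes that only some parent object types carry as optional rather than assuming every record looks the same. A rough sketch, where the field names are purely hypothetical and need to be validated against the metadata of your realm:

def to_task_row(record: dict) -> dict:
    """Map a Task record to a flat row, tolerating fields not every object type carries."""
    return {
        # Hypothetical field names - validate them against the document type metadata
        "task_id": record.get("TaskId"),
        "title": record.get("Title"),
        "status": record.get("Status"),
        # Only present for some parent objects, hence the explicit default
        "parent_document": (record.get("ParentDocument") or {}).get("InternalId"),
    }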


Recommendations



  • Given that data will be replicated and that issues can occur during replication, there should be a mechanism to verify that the data has been transferred in full. For this, you can use the asynchronous APIs to extract the data for a given day and use it to check that all the data is in the target system/data warehouse.

  • Define view templates with only the data (select fields) that you require. This will reduce the load on the API, you will retrieve data faster and, if using the asynchronous API, the size of the files returned will be more manageable.

  • Use a custom schedule to extract data if possible. Depending on your business requirements, you might not require real-time data 24/7. A custom schedule can be used to define the periods of the day when near real-time data is required, e.g. running every 15 minutes during core business hours, and when a delay is acceptable, e.g. running every hour outside of core business hours. This will reduce the number of API calls and, to an extent, help you work within the API rate limits (see the sketch after this list).

  • The different APIs have different rate limits; make sure that your scheduling and developments take this into consideration.

  • The asynchronous API is your friend. It can be used for the initial load and to ensure data integrity.
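
As a small illustration of the custom-schedule recommendation above, the polling interval can simply be derived from the time of day. The core-hours window below (08:00-18:00 UTC) is an assumption; adjust it to your business requirements and time zone:

from datetime import datetime, timedelta, timezone

CORE_BUSINESS_HOURS = range(8, 18)  # assumed core business hours, in UTC


def polling_interval(now: datetime) -> timedelta:
    """Poll every 15 minutes during core business hours, every hour otherwise."""
    if now.hour in CORE_BUSINESS_HOURS:
        return timedelta(minutes=15)
    return timedelta(hours=1)


# Example: decide how long to wait before the next extraction run
print(polling_interval(datetime.now(timezone.utc)))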


I hope the information shared in this blog post makes you aware of some of the benefits and limitations of reporting on near real-time SAP Ariba data. This blog post might be updated over time, as there may be additional tips/tricks/info that I find along the way that is worth sharing.

Note: Special thanks to my colleague mustafagilic for reading an initial draft of this blog post and providing valuable feedback 👍