SAP Ariba Analytical Reporting API: Part 1 – Overview of the Analytical Reporting Data Model
In this post, I will cover the first part of a three-part series on the SAP Ariba Analytical Reporting API.
Understanding the basic data model of the analytical reporting database, the Reporting API structures, and their limitations will ensure a smooth implementation of any data extraction project that uses this API.
Below is a summary of each part:
- SAP Ariba Analytical Reporting Data Model
I will briefly cover the Reporting Dimension and Reporting Fact classes as database objects, hierarchies, references, and the ASM Data Load.
- SAP Ariba Reporting API Structure
I will cover the synchronous and asynchronous methods of data extraction, getting metadata, using OOTB and custom ViewTemplates, filtering data, using the available parameters, and reading the rate limit.
- Best practice: SAP Ariba Analytical Reporting API
I will discuss typical use cases of the API along with Q&A, FAQs, and tips.
Part 1: SAP Ariba Analytical Reporting Data Model
The analytical reporting database consists of two classes:
Reporting facts store the basic transactions you want to investigate in a report, such as invoices, purchase orders, contracts, or events. Reporting facts include all aspects of a transaction – IDs, dates, amounts, suppliers, commodities, departments, cost centers, regions, etc. Each reporting fact stores data for one type of transaction, including a number of pre-defined calculated fields called measures. Measures contain numerical data values such as the number of lines in invoices, the lead bid amounts from events, or the count of contract workspaces.
Reporting dimensions store the reference information for data that is shared across reporting facts, such as commodities, suppliers, departments or cost centers, users, etc. Reporting dimension data is typically updated less frequently, by an administrator at your company, and it does not come from transactions.
From this point, I will refer to the Reporting Fact as simply fact and Reporting Dimension as dimension.
Utilizing facts and dimensions, various activities can be examined. The fact returns information about individual transactions; the dimensions return the related reference data. For example, a report on the invoice fact could display the Company Name field from the Supplier dimension, the City field from the Region dimension, the Commodity Type field from the Commodity dimension, and the Department field from the Department dimension.
Many facts share the same dimension data. For example, if an event and a contract existed between your company and XYZ Company, both the event and contract facts use the Supplier dimension to store detailed information about XYZ Company.
A dimension can store data at different levels, organized in a top-down structure called a hierarchy. A hierarchy progresses from general to specific information. For example, the Region dimension has a top level of broad regions (USA, Asia, Europe, Middle East, etc.) and lower levels such as individual countries or states (UK, France, Belgium, etc.).
A dimension can have more than one hierarchy. For example, the Time dimension can be divided into two hierarchies: Calendar and Fiscal. Each of these hierarchies can contain multiple levels: the Calendar hierarchy could contain levels for calendar year, calendar quarter, and so on, while the Fiscal hierarchy could contain fiscal year, fiscal quarter, and so on.
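As a rough illustration, a dimension with two hierarchies could be modeled as named, ordered lists of levels. The level names below are invented for this sketch and are not the actual Ariba schema names:

```python
# Sketch of a Time dimension with two hierarchies (Calendar and Fiscal).
# Level names are illustrative, not the real Ariba schema field names.
time_dimension_hierarchies = {
    "Calendar": ["Calendar Year", "Calendar Quarter", "Calendar Month", "Day"],
    "Fiscal": ["Fiscal Year", "Fiscal Quarter", "Fiscal Month", "Day"],
}

def level_of(hierarchy: str, level_name: str) -> int:
    """Return the depth (0 = most general) of a level within a hierarchy."""
    return time_dimension_hierarchies[hierarchy].index(level_name)

print(level_of("Fiscal", "Fiscal Quarter"))  # -> 1
```

The key point is that each hierarchy is an independent ordering over the same dimension, so the same day can roll up to both a calendar quarter and a fiscal quarter.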
Facts and dimensions are linked using lookup keys. All dimension data associated with a lookup key is available to any fact record that uses that key. For example, the company name, address, city, and zip code from the Supplier dimension are available for any fact record that includes the supplier lookup key.
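To make the lookup-key relationship concrete, here is a toy sketch of a fact record resolving its supplier data through a key. All field names and values are made up for illustration; the real field names are exposed via the Reporting API metadata:

```python
# Toy illustration of a fact record resolving dimension data via a lookup key.
# Field names and values are invented; they are not the actual Ariba schema.
supplier_dimension = {
    "SUP-001": {"CompanyName": "XYZ Company", "City": "Brussels", "ZipCode": "1000"},
}

invoice_fact = {"InvoiceId": "INV-42", "Amount": 1250.0, "SupplierKey": "SUP-001"}

# Resolving the lookup key makes all associated dimension fields available:
supplier = supplier_dimension[invoice_fact["SupplierKey"]]
print(supplier["CompanyName"])  # -> XYZ Company
```

Because many facts can carry the same key, one dimension record serves them all, which is exactly why supplier details are stored once in the dimension rather than repeated on every transaction.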
Please refer to Reporting fact reference for more information on facts and dimensions.
⚡ Check out this blog post by Antonio Maradiaga on creating an entity relationship diagram using the Analytical Reporting API metadata.
Now that facts and dimensions are explained, we'll briefly discuss the process that loads data from the transactional database to the reporting (analytical) database.
ASM (Ariba Spend Management) Data Load
Please note that the details of custom data load operations are out of scope for this post, as the SAP Ariba Analytical Reporting API only consumes existing data; it does not populate data into SAP Ariba. For details regarding data loading tools and procedures, please refer to the SAP Ariba Reporting Data Load and Administration Guide.
ASM Data Load is a scheduled data load from the transactional database to the analytical database. Each ASM data load starts 6 hours after the previous load completes, for both dedicated and shared schemas. Since subsequent loads are triggered by the completion of the previous one, the duration of the load process determines the total cadence of the ASM data loads.
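The cadence arithmetic can be sketched in a few lines. The 6-hour gap is from the documented schedule; the load duration below is an invented example value:

```python
from datetime import datetime, timedelta

# Each ASM data load starts 6 hours after the previous load *completes*,
# so the effective cadence is 6 hours plus the load duration.
def next_load_start(previous_completion: datetime) -> datetime:
    return previous_completion + timedelta(hours=6)

prev_start = datetime(2023, 5, 1, 8, 0)
load_duration = timedelta(minutes=45)  # illustrative duration
start = next_load_start(prev_start + load_duration)
print(start)  # -> 2023-05-01 14:45:00
```

In other words, a 45-minute load produces a 6-hour-45-minute gap between load starts, which is why longer loads stretch out the overall refresh frequency.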
Now that we have covered the data load schedule, please remember that data coming from the SAP Ariba Analytical Reporting API is NOT live, up-to-the-minute data.
The status, details, and contents of automated ASM data load operations are available from the Administrator dashboard under Reporting Manager > Schema Manager.
While we are discussing the ASM Data Load, please pay attention to these important date fields in relation to the data load:
- TimeCreated – Time when the record is actually created in the analytical database.
- LoadCreateTime – Time when the data load task responsible for creating the record was initiated. LoadCreateTime is either the same as or earlier than TimeCreated: LoadCreateTime marks when the data load was initiated, while TimeCreated marks when the individual record was created.
For example, if an ASM data load runs for 15 minutes, the first record will have a TimeCreated close to the beginning of the data load, and the last record will have a TimeCreated close to the end of it.
- TimeUpdated – Time when the record is actually updated in the analytical database.
- LoadUpdateTime – Time when the data load task responsible for updating the record was initiated.
The same rule applies between TimeUpdated and LoadUpdateTime as between TimeCreated and LoadCreateTime.
Please remember that all of these time fields relate to the ASM data load. NONE of them has anything to do with dates or times from the transactional data, including the LastModified date.
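A common use of these fields is incremental (delta) extraction: selecting only records the ASM load has touched since your last run. Here is a minimal sketch using TimeUpdated; the record values are made up, and remember this is the analytical load time, not the transactional LastModified date:

```python
from datetime import datetime

# Sketch of an incremental-extraction check using the ASM load time fields.
# These timestamps reflect the analytical data load, NOT the transactional
# LastModified date. Record IDs and values below are invented.
def updated_since(record: dict, last_run: datetime) -> bool:
    return record["TimeUpdated"] > last_run

records = [
    {"Id": "CW-1", "TimeUpdated": datetime(2023, 5, 1, 14, 50)},
    {"Id": "CW-2", "TimeUpdated": datetime(2023, 5, 1, 8, 10)},
]
last_run = datetime(2023, 5, 1, 12, 0)
delta = [r["Id"] for r in records if updated_since(r, last_run)]
print(delta)  # -> ['CW-1']
```

Because loads run on a roughly 6-hour cadence, a delta window keyed on these fields will naturally pick up records in batches rather than continuously.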
Next in Part 2, we will review the structure of SAP Ariba Reporting API.
I was creating a view with the document type DocumentFact. The response showed that, for contracts, LoadCreateTime differed from the creation time by varying amounts: in one example the difference between load time and create time was 8 days, and in another it was 13 hours.
My question is: what could cause such a delay in the load time, and do you know the maximum time a load could take after creation, in my case for a contract workspace?
Thank you and kind regards,
The value of "Createdtime" is the timestamp when the workspace object is created in the Ariba UI. The "LoadTimeCreated" is the timestamp when the workspace or document is pulled into the reporting database. These timestamps will always differ, but the 8-day difference may mean that the document in question was not in a "reportable state" within the Documents folder tab until the date the reporting database shows as the "TimeCreated" timestamp.