Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
zili_zhou
Advisor

Contents


Motivation

High Level Architecture

Design

An example of how delta data works

Working with SAC Data Export Service delta API

  1. Enable SAC Data Export Service

  2. Check your provider ID

  3. Configuration in Postman

  4. A test listing all the providers in SAC

  5. Create a subscription to get the delta from SAC models

  6. Use a fixed API Endpoint to get the initial and delta


 

Design considerations for moving the data (Transfer the carrots)

  1. Implementations required in your OData API consumers in a nutshell

  2. Two important tables to be created at the OData API consumers

  3. Verification that all records are transferred

  4. Retention policy in subscriptions

  5. Possibility of losing delta by calling the externalID

  6. Helpful ABAP resources for implementations

  7. Helpful Integration Suite resources for implementations


 

Data transformation for individual logic (Cook your own carrot soup/muffin/pancake)

Learn more

 


Motivation


After the release of the delta function in the SAC Data Export Service OData API, many requests have come from S/4HANA customers (on-premise or private cloud) asking how delta data can be transferred from SAC to S/4HANA: for example, replicating only the delta part to the ACDOCP table after planning data is changed, or transferring planning data to a budget control system. This blog explains a possible way to deal with the delta logic and gives customers a foundation to implement their own solutions.

As the logic is implemented in ABAP or another corresponding language, a HANA DB is not required. The general method this blog explains also applies to:

  • BW, ERP and any other ABAP systems that support OAuth and HTTP requests.

  • Other programming languages (Python, Java) or integration tools (Integration Suite) used to replicate data into different systems (SuccessFactors, Snowflake).


High Level Architecture


Below is the high-level architecture. It is important to separate data staging and data transformation into two steps. This makes maintenance and troubleshooting much easier in exceptional cases, for example when the network fails or a program ends with an exception. We will see later how the design ensures data integrity.


We will focus on step 1 in this blog. Data staging could be done via

1) Custom coding such as ABAP on the application level, or using Integration Suite in BTP. We do not need to deal with the SAP SDI Cloud Data Integration specific logic; we just treat it as OData and HTTP requests. In our analogy, the carrots are taken as "food" to be transferred.

Or 2) Using SAP HANA SDI (Smart Data Integration) to create replication tasks. Here SAP already uses the Cloud Data Integration adapter, which implements a lot of the coding for you. So the approach of Realtime (delta) Data Replication from SAP Analytics Cloud to BW/4HANA could also be used in S/4HANA. In this way, the carrots are taken as "food that does not need to be frozen and is not that fragile during transport": you do not need to write your own coding to move the data.

This blog elaborates on way 1), which requires neither a DP Agent installation nor access on the DB level. The reason way 2) SDI is not elaborated here: many S/4HANA customers have expressed concerns about the security of accessing the HANA DB directly. If you see this differently, please feel free to leave your comments.

We will use a rabbit-moving-carrots analogy to explain how it works. Staging tables are like a storage warehouse where raw materials such as carrots are stored and verified for correctness. Target tables are like the kitchen where the raw materials are processed into finished products such as carrot soup, pancakes, and muffins.



Design


ABAP systems are used as the example in the following elaborations. Postman is used as the tool to test all the APIs.



An example of how delta data works


Here is the initial value of a data model in SAC before a subscription is created:

Year* | PostPeriod* | CompanyCode* | Amount | Currency
2023  | 1           | 001          | 2000   | USD
2023  | 1           | 002          | 4000   | USD
2024  | 2           | 001          | 6000   | USD

*Year, PostPeriod and CompanyCode are the primary keys in the SAC data foundation.

When a subscription to SAC is created from the consumer side, the first delta link at SAC will be generated.

 
{
  "@odata.count": "3",
  "value": [
    {
      "@odata.context": "#FactData/$entity",
      "@odata.id": "FactData(Year='2023',PostPeriod='1',CompanyCode='001')",
      "Year": "2023",
      "PostPeriod": "1",
      "CompanyCode": "001",
      "Amount": 2000,
      "Currency": "USD"
    },
    {
      "@odata.context": "#FactData/$entity",
      "@odata.id": "FactData(Year='2023',PostPeriod='1',CompanyCode='002')",
      "Year": "2023",
      "PostPeriod": "1",
      "CompanyCode": "002",
      "Amount": 4000,
      "Currency": "USD"
    },
    {
      "@odata.context": "#FactData/$entity",
      "@odata.id": "FactData(Year='2024',PostPeriod='2',CompanyCode='001')",
      "Year": "2024",
      "PostPeriod": "2",
      "CompanyCode": "001",
      "Amount": 6000,
      "Currency": "USD"
    }
  ]
}
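The primary keys of each record travel inside the @odata.id string. A minimal sketch in Python of extracting them (the function name is illustrative, not part of the SAC API):

```python
import re

def parse_odata_id(odata_id: str) -> dict:
    """Extract key/value pairs from an @odata.id such as
    FactData(Year='2023',PostPeriod='1',CompanyCode='001')."""
    inner = odata_id[odata_id.index("(") + 1 : odata_id.rindex(")")]
    # Each key appears as Name='value'; collect all pairs into a dict.
    return dict(re.findall(r"(\w+)='([^']*)'", inner))

keys = parse_odata_id("FactData(Year='2023',PostPeriod='1',CompanyCode='001')")
# keys == {'Year': '2023', 'PostPeriod': '1', 'CompanyCode': '001'}
```

This is handy for deleted entities, where the payload carries only the @odata.id and no separate key columns.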

The staging table at the OData consumer side:

Year | PostPeriod | CompanyCode | DeltaLink         | UpdateTime   | Type   | Amount | Currency
2023 | 1          | 001         | First-deltaXXXXXX | <timestamp1> | Insert | 2000   | USD
2023 | 1          | 002         | First-deltaXXXXXX | <timestamp1> | Insert | 4000   | USD
2024 | 2          | 001         | First-deltaXXXXXX | <timestamp1> | Insert | 6000   | USD

 

Then there is a deletion and a change, like below: the amount 4000 is changed to 4500, and the row for 2024 is deleted.

Year* | PostPeriod* | CompanyCode* | Amount         | Currency
2023  | 1           | 001          | 2000           | USD
2023  | 1           | 002          | 4000 → 4500    | USD
2024  | 2           | 001          | 6000 (deleted) | USD

 

Here is how the SAC delta link looks:

{
  "@odata.count": "2",
  "value": [
    {
      "@odata.context": "#FactData/$entity",
      "@odata.id": "FactData(Year='2023',PostPeriod='1',CompanyCode='002')",
      "Year": "2023",
      "PostPeriod": "1",
      "CompanyCode": "002",
      "Amount": 4500,
      "Currency": "USD"
    },
    {
      "@odata.context": "#FactData/$deletedEntity",
      "@odata.id": "FactData(Year='2024',PostPeriod='2',CompanyCode='001')"
    }
  ]
}

Here is how the staging table looks. Depending on how you want to calculate the value later, you can decide whether an update should store only the after value (4500) or the delta part (500 in this case). Likewise, for a deletion you can decide whether to store the after value (0) or the delta part (-6000).

Year | PostPeriod | CompanyCode | DeltaLink         | UpdateTime   | Type   | Amount | Currency
2023 | 1          | 001         | First-deltaXXXXXX | <timestamp1> | Insert | 2000   | USD
2023 | 1          | 002         | First-deltaXXXXXX | <timestamp1> | Insert | 4000   | USD
2024 | 2          | 001         | First-deltaXXXXXX | <timestamp1> | Insert | 6000   | USD
2024 | 2          | 001         | DeltalinkID1      | <timestamp2> | Delete | 0      | USD
2023 | 1          | 002         | DeltalinkID1      | <timestamp2> | Insert | 4500   | USD
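The append-only staging logic above can be sketched in Python. This is a simplified in-memory model under the "store 0 for deletions, after-value for upserts" choice; the helper names are illustrative:

```python
import re
from typing import List

def _keys(odata_id: str) -> dict:
    """Primary keys from an @odata.id such as FactData(Year='2024',...)."""
    inner = odata_id[odata_id.index("(") + 1 : odata_id.rindex(")")]
    return dict(re.findall(r"(\w+)='([^']*)'", inner))

def apply_delta(staging: List[dict], payload: dict, delta_link: str, ts: str) -> None:
    """Append one staging row per record in a Data Export Service delta payload."""
    for rec in payload["value"]:
        # Deleted entities are marked by the $deletedEntity context and carry only keys.
        deleted = rec.get("@odata.context", "").endswith("$deletedEntity")
        row = _keys(rec["@odata.id"])
        row.update({
            "DeltaLink": delta_link,
            "UpdateTime": ts,
            "Type": "Delete" if deleted else "Insert",
            "Amount": 0 if deleted else rec["Amount"],
        })
        staging.append(row)

staging = []
payload = {"@odata.count": "2", "value": [
    {"@odata.context": "#FactData/$entity",
     "@odata.id": "FactData(Year='2023',PostPeriod='1',CompanyCode='002')",
     "Year": "2023", "PostPeriod": "1", "CompanyCode": "002",
     "Amount": 4500, "Currency": "USD"},
    {"@odata.context": "#FactData/$deletedEntity",
     "@odata.id": "FactData(Year='2024',PostPeriod='2',CompanyCode='001')"},
]}
apply_delta(staging, payload, "DeltalinkID1", "<timestamp2>")
```

Because rows are only appended, the staging table keeps the before image of every key, which the design relies on later.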

 

Similarly, there is another change in the SAC table, like below: the amount 2000 is changed to 2600.

Year* | PostPeriod* | CompanyCode* | Amount      | Currency
2023  | 1           | 001          | 2000 → 2600 | USD
2023  | 1           | 002          | 4500        | USD

 

Here is how the staging table could look:

Year | PostPeriod | CompanyCode | DeltaLink         | UpdateTime   | Type   | Amount | Currency
2023 | 1          | 001         | First-deltaXXXXXX | <timestamp1> | Insert | 2000   | USD
2023 | 1          | 002         | First-deltaXXXXXX | <timestamp1> | Insert | 4000   | USD
2024 | 2          | 001         | First-deltaXXXXXX | <timestamp1> | Insert | 6000   | USD
2024 | 2          | 001         | DeltalinkID1      | <timestamp2> | Delete | 0      | USD
2023 | 1          | 002         | DeltalinkID1      | <timestamp2> | Insert | 4500   | USD
2023 | 1          | 001         | DeltalinkID2      | <timestamp3> | Insert | 2600   | USD

 

Working with SAC Data Export Service delta API


In this section, we will use Postman to see how the API works.

1.     Enable SAC Data Export Service


The configuration to enable the SAC Data Export Service can be found in https://blogs.sap.com/2022/05/30/leverage-the-sap-analytics-cloud-data-export-service-to-extract-you... in the section "SAP Analytics Cloud configuration".

The prerequisites of Data Export Service could be found in this link.

https://help.sap.com/docs/SAP_ANALYTICS_CLOUD/14cac91febef464dbb1efce20e3f1613/db62fd76514b48f8b71d6...

 

2.     Check your provider ID


You can find the provider ID in your data foundation; we are going to use it later. Only FactData supports delta; master data does not have a delta.

https://SAPURL/sap/fpa/ui/app.html#/modeler&/m/model/C9ZOQZN2GI2L4HV6S4MK8ULFK


 

3.     Configuration in Postman


We configure Postman for the test. You need 1) the Access Token URL, 2) the Client ID and 3) the Client Secret, as below.

Below we are using two-legged OAuth in Postman. As an admin, you can find the required information in the SAC tenant under Admin → App Integration.



 

Here is the configuration of Postman:


 

4.     A test listing all the providers in SAC


Here we use this URL to list all the providers from the SAC tenant. We can also see that C9ZOQZN2GI2L4HV6S4MK8ULFK is there.

https://your-sac-tenant-url.cloud/api/v1/dataexport/administration/Namespaces(NamespaceID='sac')/Pro...



5.     Create a subscription to get the delta from SAC models



  • Any write operation in an SAP OData service needs a CSRF token first.


To get a valid x-csrf-token, you can use any valid URL. For example, we use the one below: https://SACURL/api/v1/dataexport/administration/

You need to include x-csrf-token: fetch in the request header. In the header of the response, you will get the CSRF token.




  • Create the first subscription. In the body you need to specify FactData (transaction data) for this ProviderID. ExternalID is self-defined; it is what you will use to retrieve the latest changes.


In the response, you can see it is successfully created with the externalID.


Here are the relationships between ChainID and SubscriptionID. We see how the externalID Del_Postman01 always points to the unretrieved delta link in ChainID sac_C9ZOQZN2GI2L4HV6S4MK8ULFK_FactData_2023-02-23T13:14:39.636Z.

One provider can have several chains, and each chain has its own subscriptions. Chains can be created by different OData API consumers such as Postman, S/4HANA or BW, and also in the SAC subscription UI. One consumer can also have several chains. Filters can also be added to the chains.



6.     Use a fixed API Endpoint to get the initial and delta


There is a fixed endpoint /providers/{namespaceID}/{providerID}/FactData?externalid={externalID} to get the last subscription in a chain.

We always use the same API to get data; this endpoint makes no difference between initial loading and delta.

  • Simulate initial loading
    The first time, I get 507 records. Those were the records originally in this FactData.


https://SACTenant/api/v1/dataexport/providers/sac/C9ZOQZN2GI2L4HV6S4MK8ULFK/FactData?externalid=Del_...

 

The first time Postman runs this API: /providers/{namespaceID}/{providerID}/FactData?externalid={externalID}


 

Here is another useful API that gets all the subscriptions from this ProviderID:

https://<SACURL>/api/v1/dataexport/administration/Providers(NamespaceID='sac',ProviderID='C9ZOQZN2GI2L4HV6S4MK8ULFK')/Subscriptions

I just need to search for the ChainID.


Now if you run the same API again, no data is returned, as the initial loading has finished and end users have not made any changes yet.


  • Update and delete


Then I make some changes and also deletions in the SAC story based on this model. With the same endpoint

https://<SACURL>/api/v1/dataexport/providers/sac/C9ZOQZN2GI2L4HV6S4MK8ULFK/FactData?externalid=Del_P...

you will get the upsert and delete records in the format below. @odata.id contains the primary keys. For deleted records, only the keys are present; the other transaction columns are not in the delta link, as you do not care about the values when you delete a record. You can decide whether to implement it as null or 0 in the staging table. The way we designed the staging table also makes it possible to check the before image, the value before the deletion/update. Thus it is not a problem if you also want to check the columns before the deletion/update.
{
  "@odata.count": "2",
  "value": [
    {
      "@odata.context": "#FactData/$entity",
      "@odata.id": "FactData(Year='2023',PostPeriod='1',CompanyCode='002')",
      "Year": "2023",
      "PostPeriod": "1",
      "CompanyCode": "002",
      "Amount": 4500,
      "Currency": "USD"
    },
    {
      "@odata.context": "#FactData/$deletedEntity",
      "@odata.id": "FactData(Year='2024',PostPeriod='2',CompanyCode='001')"
    }
  ]
}

Below is the logic of the delta links connected in a chain. Each chain can always find its predecessor via PreviousChainID. This logic can be used in exceptional cases, for example when the externalID has already moved on but the OData consumer side has not received all the data.
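That recovery walk can be sketched in Python. The chain structure is modeled as plain dicts keyed by ChainID; the helper itself is hypothetical, not part of the SAC API:

```python
def unretrieved_chains(chains: dict, latest_id: str, last_loaded_id: str) -> list:
    """Walk back from the latest ChainID via PreviousChainID until we reach
    the last chain the consumer fully loaded; return the missing ChainIDs
    oldest-first so they can be re-fetched in order."""
    missing = []
    current = latest_id
    while current is not None and current != last_loaded_id:
        missing.append(current)
        current = chains[current].get("PreviousChainID")
    return list(reversed(missing))

# Three chains: c1 -> c2 -> c3; the consumer has only loaded up to c1.
chains = {
    "c3": {"PreviousChainID": "c2"},
    "c2": {"PreviousChainID": "c1"},
    "c1": {"PreviousChainID": None},
}
missing = unretrieved_chains(chains, "c3", "c1")
# missing == ["c2", "c3"]
```

The consumer would then re-fetch those chains in order before moving the externalID forward again.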



Design considerations for moving the data (Transfer the carrots)


1.     Implementations required in your OData API consumers in a nutshell


Function | SAC API | ABAP implementation
Authentication | Tenant access → Admin, App integration | Transaction code OA2C_CONFIG: OAuth 2.0 client profile
Get all the providers | /api/v1/dataexport/administration/Namespaces(NamespaceID='sac')/Providers/ | HTTP GET request; could reuse method F4_HELP_PROVIDERS
Get metadata of a provider | /api/v1/dataexport/providers/sac/<ProviderID>/$metadata | HTTP GET request; create staging table at the consumer
Create subscriptions at SAC | /administration/Subscriptions (POST) | HTTP POST request (example of usage of CSRF token in ABAP for POST request); add entry in admin table
Get the initial loading and delta | /api/v1/dataexport/providers/sac/<ProviderID>/FactData?externalid=<ExternalID> | HTTP GET request; insert into staging table
Delete a chain of subscriptions | /administration/Subscriptions(NamespaceID='{namespaceID}',ProviderID='{providerID}',SubscriptionID='')?deleteChain=true | HTTP POST request; delete entry in admin table

 

You could implement the logic by calling those APIs in ABAP as well as in other programming languages or tools. Here is the important link for all these API endpoints and parameters.

 

2.     Two important tables to be created at the OData API consumers


Here are two important tables to be created.

Admin Table


The purpose of the admin table is to store the externalID and SAC model names, so that in your code for getting the initial load and delta via an HTTP request, you can substitute the variables to build a URL in the format below and replicate data into the staging table.

api/v1/dataexport/providers/sac/<ProviderID>/FactData?externalid=<ExternalID>

Model ID     | External ID  | Filters and many optional columns
<ProviderID> | <ExternalID> | optional

 

The admin table should be updated during the creation and deletion of subscriptions. You can add other columns, for example the filters used during the creation of a subscription, a timestamp, the ChainID and so on. All of those are optional.

If you already created a filter when you created the subscription, you do not need to append it to any endpoint; the filters are automatically applied in the delta link.
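Building the request URL from an admin-table entry can be sketched like this (Python; the function name and entry fields are illustrative):

```python
def export_url(tenant: str, entry: dict) -> str:
    """Compose the Data Export Service URL for one admin-table entry."""
    return (f"https://{tenant}/api/v1/dataexport/providers/sac/"
            f"{entry['ProviderID']}/FactData?externalid={entry['ExternalID']}")

url = export_url("your-sac-tenant-url.cloud",
                 {"ProviderID": "C9ZOQZN2GI2L4HV6S4MK8ULFK",
                  "ExternalID": "Del_Postman01"})
```

In ABAP the same substitution would happen before handing the URL to the HTTP client.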

Staging table


Do not insert into your target tables directly. Otherwise, if some carrots are lost during the transfer from SAC to the OData API consumer and the carrot soup has already been cooked, there is no way for the rabbit to know which carrot is missing.

The purpose of the staging table is to verify that all records are transferred; you can also choose to move only part of the non-key columns (transaction data) into S/4HANA. You can add a filter in the endpoint to get certain columns: $select=MyColumn1,MyColumn2. But the key columns must be selected.
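A small sketch of composing such a $select clause so the key columns are always included (Python; the helper name is illustrative):

```python
def select_clause(key_cols, wanted_cols):
    """Build a $select value that always includes the primary keys,
    preserving order and avoiding duplicates."""
    cols = list(key_cols) + [c for c in wanted_cols if c not in key_cols]
    return "$select=" + ",".join(cols)

q = select_clause(["Year", "PostPeriod", "CompanyCode"], ["Amount"])
# q == "$select=Year,PostPeriod,CompanyCode,Amount"
```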

 

Ideally, when a subscription is created, you loop over the metadata of the SAC fact table, allow the end user to select the transaction data they need, and create a staging table that includes only those columns. You can get the metadata, to define the proper primary key columns and transaction data, by calling:

https://your-sac-tenant-url.cloud/api/v1/dataexport/providers/sac/modelID/$metadata

Primary keys             | DeltaLink                    | Update time         | UpdateType       | Transaction data
@odata.id in each record | SubscriptionID in delta link | <current timestamp> | Upsert or Delete | <required columns>

 

3.     Verification that all records are transferred


The SubscriptionID in the delta link can be used to verify that all records are transferred.

The SAC delta link has @odata.count to tell how many "carrots" are to be delivered.

On the S/4HANA side, running the SQL SELECT COUNT(*) FROM <staging table> WHERE DeltaLink = '<SubscriptionID>' can be used to verify that all the "carrots" have arrived.
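The check can be sketched as follows (Python; in practice the arrived count would come from the SQL query above rather than an in-memory list):

```python
def all_arrived(expected_count: int, staging: list, subscription_id: str) -> bool:
    """Compare the delta link's record count with what landed in staging."""
    arrived = sum(1 for row in staging if row["DeltaLink"] == subscription_id)
    return arrived == expected_count

staging = [{"DeltaLink": "DeltalinkID1"}, {"DeltaLink": "DeltalinkID1"},
           {"DeltaLink": "First-deltaXXXXXX"}]
ok = all_arrived(2, staging, "DeltalinkID1")  # True: both records arrived
```

If the counts disagree, the consumer can re-fetch the chain before any transformation runs, which is exactly why staging is kept separate from the target tables.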



4.    Retention policy in subscriptions 


Please pay attention to the cases below. Some delta links may be deleted by SAC automatically:

  • A model supports up to 500 subscriptions by default. Past this limit, each new subscription automatically deletes the oldest subscription and any data associated with it.

  • An SAP Analytics Cloud tenant supports up to 10 000 subscriptions by default. Exceeding this limit generates an error that prevents you from creating a new subscription in that tenant. If you need an extra subscription, make sure to delete an existing subscription first.

  • When the start time of a subscription is older than 40 days, the subscription will be deleted.


5.     Possibility of losing delta by calling the externalID


Calling the endpoint /api/v1/dataexport/providers/sac/<ProviderID>/FactData?externalid=<ExternalID> moves the externalID on to the next delta. If this logic is already implemented in production and you call the API directly by mistake, for example in Postman, SAC considers the data already retrieved, but the production system such as S/4HANA never receives it; the data went to Postman instead. Direct calls to the last delta link for test or debug purposes should therefore be avoided in a production system.

6.     Helpful ABAP resources for implementations


Here are some useful blogs/notes for the ABAP implementation in S/4HANA; you can reuse parts of the full-load coding for the delta.

Using SAP Analytics Cloud API Data Export Service from SAP S/4 HANA

How to post SAC API output into BW/4HANA using ABAP

3241213 - Downport of integration tools for SAC (core version of PSM-FM tools)

 

7.     Helpful Integration Suite resources for implementations


 

Here is an example of implementing it in Integration Suite.

The use case is SAC → Cloud Integration → SuccessFactors: SAP Analytics Cloud Integration with SAP SuccessFactors Position Write Back Outbound Flow. This integration is full loading only, but it could be modified to support delta. Here is the document of predefined integration flows for OData.

Data transformation for individual logic (Cook your own carrot soup/muffin/pancake)


How the staging tables should be transformed into the target tables can be very different in each application: S/4 FI, budget control systems, and so on. Ideally, in your ABAP systems, you can reuse your original programs to read from the staging table(s) and do the field mapping, aggregations, or currency conversions.

Learn more

