Technical Articles
Zili Zhou

Realtime (delta) Data Replication from SAP Analytics Cloud to BW/4 HANA

Motivation:

[updated on 27.11.2023 with common mistakes or tips in the customers’ projects]

Planning data in SAC is often changed by business users, and these changes need to be replicated to target systems like BW/BPC, S/4HANA, or Data Warehouse Cloud. This is a feature highly demanded by many customers.

The Data Export Service API has been generally available since Q2 2022 for full data export. With SAC QRC4 2022, the Data Export Service delta feature will also be generally available without any delta toggle. Currently, if you are on a fast-track tenant, you can request that the feature be toggled on in your system. For further information, please have a look at the help document for the delta feature in this API.

This blog focuses on how to use the BW/4HANA (as target) integrated features (Smart Data Integration) to pull delta data from SAC (as source) using the Data Export Service. Please be aware that this is in contrast to a live connection or an import connection, where SAC is the target and BW is the source.

Content

Architecture and Prerequisites

Steps

Step 1: Install DP Agent and connect BW/4 to SAC
Purpose: prepare the connections.

Step 2: Use the SAC planning model as BW source
–        Verification at the HANA side
–        Create the BW source system
–        Create the Data Source
–        Understand the logic of changing data
–        Understand the logic of deleting data
Purpose: understand the delta logic from SAC to the replicated table at the BW source.

Step 3: Set up the BW data flow
–        Create an ADSO Z1DETADSO with change log
–        Create a DTP and a simple mapping from the data source to the ADSO
–        How changes at the SAC side are reflected in the ADSO inbound table, change log, and active data
Purpose: understand the delta logic from the replicated table to the BW ADSO.

FAQ

Further Links

Call for actions

Architecture and Prerequisites:

Below is the architecture we are going to use. Compared to using ABAP and the API to write to a specific table, the advantages here are: 1) it makes the best use of HANA artifacts and memory for real-time replication with SDI technology; 2) no programming is required.

Architecture

A DP Agent of version 2.5.3 or higher is necessary. It is recommended to use the latest DP Agent.

The target system can be BW/4HANA, DWC, or an on-premise HANA system (at the time of publication of this article). Please check the latest SDI PAM for the current status of supported target systems.


SDI CloudDataIntegrationAdapter PAM

Step 1: Install DP Agent and connect BW/4 to SAC

There is already a blog introducing this. Please refer to this help for the details on how to install the DP Agent and connect to the HANA system under BW/4: Leverage the SAP Analytics Cloud Data Export Service to extract your planning data to SAP HANA, SAP Business Warehouse and SAP S/4HANA

Important: If you do not change the DP Agent default settings, it only uses a maximum of 4 GB memory, even if the DP Agent is installed on a machine with much more memory. We see this cause performance issues in many customer cases. In most cases, a small or medium DP Agent sizing will meet your requirements for this SAC –> BW integration. Thus, it is recommended to set the DP Agent ini parameter Xmx to at least 8192m or higher, and to increase Xms to the same or a similar number.

More details can be found in SAP Note 2688382 – SAP HANA Smart Data Integration Memory Sizing Guideline.
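For illustration, the corresponding JVM entries in the DP Agent's dpagent.ini for a small sizing would look like the sketch below (the exact file layout may vary by agent version; adjust the values per the sizing table):

```
-vmargs
-Xmx8192m
-Xms8192m
```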

SMALL

Use case: a small scenario with:

·       One source system

·       Up to 40 tables

·       A weighted table size category of S-M

·       Federation (initial load) of tables balanced based on HANA target capacity

·       Modification rate less than 1,500,000 records/hour

·       The example above fits here

DP Agent server:

·       Hardware: 8-16 CPU cores, 16-32 GB of main memory, 2-3x disk space based on main memory

·       DPAgent ini updates: increase Xmx to 8192m or higher; increase Xms to the same or a similar number

·       Ensure 6-8 GB of free RAM for the OS and JVM variations*, above and beyond the Xmx setting

SAP HANA target system (for replication only):

·       Single remote source

·       ~1 additional CPU core

·       <1 GB memory (not including memory growth over time as data volume increases)

MEDIUM

Use case: a midrange scenario with:

·       Approximately 1-3 different source systems

·       And/or up to 100 tables in total

·       A weighted table size category of M-L

·       Federation (initial load) of tables done sequentially across sources and balanced based on HANA target capacity

·       Modification rate less than 5,000,000 records/hour

DP Agent server:

·       Hardware: 16-32 CPU cores, 32-64 GB of main memory, 2-3x disk space based on main memory

·       DPAgent ini updates: increase Xmx to 16384m or higher; increase Xms to the same or a similar number

·       Ensure 8-12 GB of free RAM for the OS and JVM variations*, above and beyond the Xmx setting

SAP HANA target system (for replication only):

·       Separate remote source(s) for high-volume modification rate tables

·       ~2-4 additional CPU cores

·       1-2 GB memory (not including memory growth over time as data volume increases)

LARGE

Use case: an upper mid-range scenario with:

·       Up to 6 different source systems

·       And/or up to 300 tables in total

·       A weighted table size category of M-XL

·       Federation (initial load) of tables done sequentially across sources and balanced based on HANA target capacity

·       Modification rate less than 10,000,000 records/hour

DP Agent server:

·       Hardware: 32-64 CPU cores, 64-96 GB of main memory, 2-3x disk space based on main memory

·       DPAgent ini updates: increase Xmx to 32768m or up to 65536m; increase Xms to the same or a similar number

·       Ensure 12-24 GB of free RAM for the OS and JVM variations*, above and beyond the Xmx setting

SAP HANA target system (for replication only):

·       Separate remote source(s) for high-volume modification rate tables

·       ~4-8 additional CPU cores

·       2-4 GB memory (not including memory growth over time as data volume increases)

 

Step 2: Using SAC Planning model as BW source

Verification at HANA side

Before starting anything in BW, you can verify on the HANA side under Provisioning –> Remote Sources; it looks like below. The planning models are under the SAC folder. If it is empty and you also do not get an error, in many cases the result was blocked by a firewall in your local network.

Verify the connection in HANA Studio

Create BW source system in BW Modeling Tool (BWMT)

Now we switch to the BW Modeling Tool to create a Smart Data Access type source system. We call it SACDES here. Attention: all SDA and SDI connection types are under Smart Data Access. We are in fact using SDI here; however, it is still called SDA in the UI.


BW Source System Type: Smart Data Access

 

Create Data Source in BWMT

Now we create a Data Source under SACDES. All planning transaction data is delta-enabled; master data does not support delta. Thus, we search for "FactData" here, which is the transaction data. Attention: it is going to take a while to retrieve all the metadata from SAC. You need to check your SAC modeler URL to find the models.

My model is https://<mySACURL>/sap/fpa/ui/app.html#/modeler&/m/model/C9ZOQZN2GI2L4HV6S4MK8ULFK


Choose the SAC model from BW

 

Here we created the data source called "Z1DELTAS".

Create a new data source

Go to the Extraction tab. Make the changes below for the delta-enabled transaction data. In the BW context, the real-time replication type UPSERT is like an after image, and INSERT is similar to before/after images. We will use UPSERT here and will see later how deltas are reflected.

Here is more information about the difference between UPSERT and INSERT.


Delta enabled extractions

 

Activate your data source first. Then click the "Manage" button next to "Remote Subscription Type UPSERT". You should see the status Initial here. With this step, BW generates a virtual table, a replication table, and a subscription, which we will check in detail later.


Manage HANA Remote Subscription

 

Tips: If you choose "without data transfer" above, the initial loading will normally still happen (if it is the first initial load with the same filters for this HANA data source). In case of huge data volumes, this can save initial loading time. "With data transfer" will transfer the data twice in most cases, but it will not lead to wrong data, as the second data transfer will not actually insert into the target table (/BIC/CMT…).

 

Now you execute the delta initialization in the background.


Execute the delta initialization

After the jobs are done successfully, you can check how many records are replicated into the replication table.

Here is how to find the virtual table, replication table, and subscription. Click Overview and you will see all subscriptions generated by BW in the system. A BW delta-enabled data source generates a virtual table (/BIC/CMV<Datasourcename>0000X000001), a replication table (/BIC/CMT<Datasourcename>0000X000001), and a subscription (/BIC/CMU<Datasourcename>0000X000001).


Overview of HANA Remote Subscriptions created by BW

In our case, the generated replication table is /BIC/CMTZ1DELTA00001000001, and our BW/4 schema is SAPHANADB, so you can run the SQL directly in HANA. We have replicated 508 records.


Check number of records at replicated table
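The same check can be done with a simple SQL statement in the HANA SQL console (schema and table name as generated above; adjust them to your system):

```sql
-- Count the records replicated into the SDI target table generated by BW
SELECT COUNT(*) FROM "SAPHANADB"."/BIC/CMTZ1DELTA00001000001";

-- Breakdown by change type: 'A' = inserted/updated, 'D' = deleted
SELECT "SDI_CHANGE_TYPE", COUNT(*)
FROM "SAPHANADB"."/BIC/CMTZ1DELTA00001000001"
GROUP BY "SDI_CHANGE_TYPE";
```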

Understand the logic of changing data

We will change a record (GLAccount = "Cost of Goods Sold", Product = "City", Date = "Jan (2022)") from 9.000k to 6.500k.

First, we check it in the replicated table; it looks like below. SignedData is 9.000k.


How the record looks like at replication table

 

Now we are going to modify a record in a SAC story: change this SignedData from 9.000k to 6.500k and click Publish. Important: only public versions can be pulled via the API.


Publish changed data in SAC

The virtual table points to SAC, so it is changed immediately. Verify with the query below:

select  *  from "SAPHANADB"."/BIC/CMVZ1DELTA00001000001"  where "GLAccount"='Cost of Goods Sold' and "Products"='City'


Check the virtual Table pointing to SAC planning model

 

Depending on the volume, after some seconds or maybe several minutes, you can see that the replicated table has also changed. SDI_CHANGE_TYPE "A" (as in Autocorrect) marks inserted or updated records.

select  *  from "SAPHANADB"."/BIC/CMTZ1DELTA00001000001"  where "GLAccount"='Cost of Goods Sold' and "Products"='City'


SAC changed data is automatically replicated to BW

Understand the logic of deleting data

Now I delete this record (GLAccount = "Cost of Goods Sold", Product = "City", Date = "Jan (2022)") from SAC and publish.


Publish data after deletion

Here is the result from the virtual table; nothing is found anymore.


Record is deleted at SAC

 

Here you will see that the record's SDI_CHANGE_TYPE has the value "D" for deleted in the replicated table.

SDI marks the record as D and cleans up the transaction data

Similarly, you can test remote subscriptions of type INSERT, which return the complete delta history in the replication target table.

Maybe you are already wondering how to clean up the replicated table after some time. The good news is that BW has also implemented housekeeping for the delta tables, as below. Here is the LINK.


House keeping of replicated table

Step 3: Setup BW data flow

 

Create an ADSO Z1DETADSO with change log.

 

Create a DTP and simple mapping from data source to the ADSO


Create DTP in BW

 

How changes at the SAC side are reflected in the ADSO inbound table, change log, and active data

Here are 3 tables generated behind this ADSO.


Inbound Table, change log and Active Data of ADSO

We look at the inbound table /BIC/A<ADSO>1 as below:


Check the Inbound Table

Here is the active data table after the request is activated. The 1 deleted record is not active, so only 507 records remain.

select * from "SAPHANADB"."/BIC/AZ1DETADSO2"


Check active table after deletion in SAC

After this, I changed one record in SAC; you will see from the ADSO requests that only 1 record is loaded and 1 is activated.


Loaded Request and Activate Request

Here is the change log for this ADSO.


Changelog in ADSO

FAQ

  1. Does the delta API support Classic Account Model?

    A: Yes, it supports both the SAC classic account model and the new model. It also supports both analytic models and planning models. All transaction data has delta; master data does not support delta.

  2. Do I need to filter out private versions while using the API?

    A: No, only published data is exposed by the API.

  3. How do I troubleshoot subscriptions on the HANA side?

    A: Download the SQL statement collection from Note 1969700 – SQL statement collection for SAP HANA and search for the keyword "SmartDataIntegration"; there are around 9 SQL statements. For example, "HANA_SmartDataIntegration_Agents" checks the status of the DP Agents connected to your HANA, and "HANA_SmartDataIntegration_RemoteSubscriptions" checks the remote subscriptions.
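In addition to the statement collection, a minimal sketch of direct checks against the HANA monitoring views (view and column availability may vary by HANA revision):

```sql
-- DP Agents connected to this HANA instance
SELECT * FROM SYS.M_AGENTS;

-- State of the remote subscriptions (including the ones generated by BW)
SELECT * FROM SYS.M_REMOTE_SUBSCRIPTIONS;
```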

  4. How to check the trace at the DP Agent side?

    A: SAP Note 3102531 – Where is dpagent framework trace located? – SDI

  5. How can I check whether my SAC OData API works correctly?

    A: You can test your SAC OData API in Postman or via the link below: Try Out | Data Export Service | SAP API Business Hub

  6. I have a Dev/QA/Production landscape for my on-premise systems; can they share the same DP Agent?

    A: No, a DP Agent can only have one target system. Thus, you have to install a DP Agent for each of your BW/4HANA, S/4HANA, or HANA systems.

  7. Can I create more than one subscription to one SAC planning model?

    A: Yes, one consumer (BW, HANA, and so on) can create several subscriptions to the same model.

Further Links

Further Blogs/links on the usage of SAC Data Export Service API:

Call for actions:

This method is a seamless combination of the SAC Data Export Service OData API with BW/4, utilizing SDI technology. There are still a few things we can tune for performance in case of huge data volumes. For example, the parameter "pageSize" of the data source in HANA is 10,000 by default; you could increase it 10 times or even more. The parameter "Poll Period" (in mins) is 5 minutes by default; it can be reduced if you want to poll more frequently.

Feel free to share your test results or any other comments here.

In case there is a need for guidance on how to use this architecture for HANA on premise or Data Warehouse Cloud, please also let me know.

Common issues customers face:

1. Using HDB Studio to change SDI parameters, and the remote source not working afterwards

We noticed that sometimes when customers use HDB Studio to change parameters like pageSize, some HANA data source properties get changed.

Solution: use SQL commands directly, or one of the tools below, for the changes:

SAP HANA Web-based Development Workbench, Web IDE, or SAP Business Application Studio

SQL example

ALTER REMOTE SOURCE "SACDatasourcename" ADAPTER "CloudDataIntegrationAdapter" AT LOCATION AGENT "SDIAGENTNAME" CONFIGURATION '<?xml version="1.0" encoding="UTF-8" standalone="yes"?><ConnectionProperties displayName="Configurations" name="configurations">
    <PropertyEntry name="host">SAC Host</PropertyEntry>
    <PropertyEntry name="port"></PropertyEntry>
    <PropertyEntry name="protocol">HTTPS</PropertyEntry>
    <PropertyEntry name="servicePath">/api/v1/dataexport/administration</PropertyEntry>
    <PropertyEntry name="pageSize">100000</PropertyEntry>
    <PropertyEntry name="auth_mech">OAuth2</PropertyEntry>
    <PropertyEntry name="oauth2_grant_type">client_credentials</PropertyEntry>
    <PropertyEntry name="oauth2_token_request_content_type">url_encoded</PropertyEntry>
    <PropertyEntry name="oauth2_token_endpoint">Token Endpoint</PropertyEntry>
    <PropertyEntry name="require_csrf_header">true</PropertyEntry>
</ConnectionProperties>'
WITH CREDENTIAL TYPE 'PASSWORD' USING
'<CredentialEntry name="oauth2_client_credential">
    <user>Client ID</user>
    <password>Client Secret</password>
</CredentialEntry>';

2. SDI exceptions preventing replication of new data

Please make sure the exceptions are cleared up. Otherwise, they will prevent all data sources from replicating data or creating new subscriptions.

The reason for the exceptions and how to process them is described below:
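As a hedged sketch (verify the view and statement against your HANA revision), the exceptions can also be listed and processed directly with SQL:

```sql
-- List open remote subscription exceptions that block further replication
SELECT EXCEPTION_OID, OBJECT_NAME, ERROR_MESSAGE
FROM SYS.REMOTE_SUBSCRIPTION_EXCEPTIONS;

-- Retry or ignore a specific exception by its OID
PROCESS REMOTE SUBSCRIPTION EXCEPTION <exception_oid> RETRY;
-- or: PROCESS REMOTE SUBSCRIPTION EXCEPTION <exception_oid> IGNORE;
```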

3. DP Agent does not have enough memory configured

Make sure that you have done a proper DP Agent sizing. The memory (parameter -Xmx) is 4 GB by default. Please check the document below and change it accordingly.

Sizing document and Note 2688382 – SAP HANA Smart Data Integration Memory Sizing

 

 

Comments
      William Yu

      Master piece Zili! Thank you so much!

      Carlos Jara Muñoz

      Amazing post, thank you very much for sharing Zili Zhou!!!

      Santosh Hiremath

      much needed feature for real time reporting for planning community!! Thanks Zili !!

      Cemal Aslan

      Great blog, thank you Zili. I am curious, how this would work with DWC?

      Zili Zhou
      Blog Post Author

      Hi Cemal Aslan ,

      In DWC, you can either do it via data builder (setup can refer to https://blogs.sap.com/2022/04/04/using-the-sac-data-export-api-with-data-warehouse-cloud-and-data-intelligence/) or data flow.

      In the data builder, go to the data integration monitor --> Remote table monitor --> select the corresponding FactData from SAC and change it to "Enable realtime Access".

      In Data Flow,  you can use the Upsert for the target table. However the delta part is calculated at DWC side. There might be some workaround for this.

       

      regards

      Zili

       

      Xin Lu

      Thank you for sharing, Zili.

      We have BW 7.5 SP25 as the backend. I tried to create a new data source pointing to SAC in BWMT; in the extraction tab, there is no "Overwritten delta with deletion" option. There are 3 options: "Delta only via Full upload", "Delta with After image", and "Delta with Before image".

      Is it needed to set up a delta process in SAC?

      Regards,

      Xin

      Zili Zhou
      Blog Post Author

      Hi Xin Lu

      I did not try it in BW on HANA. You can try "Delta with After image"; it should be similar. The activation of the data source takes care of the creation of the subscription (for delta). You do not need to create a delta process from the SAC side explicitly. But you need to ask your SAC contact person from SAP to turn on the toggle before the QRC4 release is there.

       

      regards

      Zili

      Shivaram Kashyap

      Hello Xin,

      We are on BW 7.5 SP19 and I was told that I can't create a Source System of type HANA SDA. How are you going about creating a Source System and a Data Source? Please point me to a blog or help docs that have this information.

      Thank You,

      Shivaram.

      Zili Zhou
      Blog Post Author

      Hi Shivaram,

      this is how the SDA source system creation looks in a BW on HANA system:

      BW on HANA

       

      regards

      Zili

      Harsha JALAKAM

      Hi Zili,

      Thanks for sharing the blog.

      Is there any way where we can filter the data coming into BW. At the moment, when we do initial load ( create subscription ), the SDA-->DP agent pulls all the data from SAC model into the virtual table. This causes a huge time delay in pulling data into BW and memory issues in the DP agent. So , to mitigate this is there any way where we can pull only the required data by filtering the data from the SAC model ,to pull only the required data rather than pulling all the data.

      Just to add, since all the data from the SAC model is being pulled into the Virtual Table in the first stage, the filter defined at the DTP level will not have any impact on the data that is being pulled into the system

      Regards,
      Harsha

      Zili Zhou
      Blog Post Author

      Hi Harsha,

      you can add filter at Hana source.  https://help.sap.com/docs/SAP_BW4HANA/107a6e8a38b74ede94c833ca3b7b6f51/75af177bf3ee437c8a4fe0cf10c38799.html?version=2.0.5

      adding filter at data source is not possible. So you need to create several HANA source system if you want to add different filters.

       

      regards

      Zili

       

       

       

      Harsha JALAKAM

      Hi Zili,

      Many thanks for your response.

      So you need to create several HANA source system if you want to add different filters. --> would this mean that we can 't add multiple conditions like  BPCVERSION = 'ACTUAL' & BPCENTITY = 'GLOBAL'?

      And if only single condition is supported is this format supported BPCVERSION IN ('ACTUAL','BUDGET')?

       

      is there any procedure or syntax in specifying the filters And where in BW we can specify?

      Regards,
      Harsha

      Bartłomiej Poteraj

      Hi Zili,

      this is very relevant for complex SAC planning models.

      Where would you actually configure such a filter on the HANA source system ?

       

      In the data source, I don't see such option (system is SAPK-75012INSAPBW )

       

      Bartłomiej Poteraj

      Managed to find the answers:

      1. You can define a SQL Static Filter in Data Source -> Extraction only after you maintain the RSLOGSYSDB table https://answers.sap.com/questions/724570/sdi-integration-to-bw4hana-not-possible-to-specify.html
      2. Only a single condition can be specified in the SQL filter https://me.sap.com/notes/0003337958
      Zili Zhou
      Blog Post Author

      Hi Barttmomiej,

      For BW, you can implement note 3355482, which skips the filter check when the RSADMIN parameter 'RSDS_SKIP_FILTER_CHECK' is enabled. Then you can add more than one filter in the BW data source.

      In purely HANA source, it is always possible to add more than one filter in Replication task.

       

      regards

      Zili

       

      Zhenhong Jiao

      Hello Zili,

      This is really an excellent blog. I followed your steps to enable SAC->BW data exporting successfully.

      But I still have one question that you might share some lights. I'm not able to enable the real-time data extraction.

      Please find my testing SAC DS below, with AIMD delta mode. I uploaded a couple of records on the SAC side, and they won't replicate to the BW side automatically. I manually executed the delta DTP on the BW side and successfully loaded the data into the ADSO.

      Are there any additional steps I need to do on the SAC or BW side to make the changed data load/activate in the ADSO in real time?

      Our SAC version is 2023.2.6, and BW on HANA version is 2022/03 SP23.

      BR,

      Zhenhong

      Zili Zhou
      Blog Post Author

      Hi Zhenhong,

      if you execute the delta DTP and the data lands in the ADSO, it means your data is already in /BIC/CMT<Datasourcename>0000X000001. You can first check whether the delta is already there.

      The DTP is not triggered automatically. But you can try a streaming process chain if you want the BW part to be triggered.

       

      regards

      Zili

      sureshkumar sundaramurthy

      Dear Zili Zhou,

      Thanks a lot for this blog. I have followed the same steps to implement it, but I am stuck at one point: we cannot see any SAC model to select while creating a Data Source under the SDA source (remote source) in the BW system. Can you please help in case we missed any access privileges or steps?

      But we could see the model information under the remote source in the HANA verification.

      we cannot see SAC model in Data source

       

      Regards,

      Suresh

      Giovanni Leggio

      Hi sureshkumar sundaramurthy , did you follow the steps in the below blog?

      https://blogs.sap.com/2022/05/30/leverage-the-sap-analytics-cloud-data-export-service-to-extract-your-planning-data-to-sap-hana-sap-business-warehouse-and-sap-s-4hana/

      Thanks,

      John

      Zili Zhou
      Blog Post Author

      Hi Suresh,

      it looks like the BW ABAP user does not have the select privilege for the tables. Please check with your Basis team which user owns this data source, and try to grant the privileges to the BW ABAP user.

       

      regards

      Zili

      sureshkumar sundaramurthy

      Hi Zili,

       

      Thanks for your response.

      it is resolved now, but we are facing another problem: delta records are not updating in the replication table after we make changes in SAC, so SAP suggested upgrading the DP Agent version (2.6+).

      Regards,

      Suresh

      sureshkumar sundaramurthy

      Hi John,

      Thanks.

      it got resolved after we used the Data Source Path Prefix "sac" instead of "sap".

       

      Regards,

      Suresh

      Wanja Spieß

      Hi Zili,

       

      thanks for this blog.

      We set everything up and everything works. However, we have an issue with the delta when deleting records. We make a transfer from one indicator to another indicator (e.g. from dimension value FIX to VAR). The value is first posted to the new indicator and then the record of the old indicator is deleted. In the delta, the new entry comes first and then the entry with record mode D (reverse order as I would expect).
      Since our ADSO in BW4 has all dimensions as a key, this is not a problem at first. If we post from this ADSO to a second ADSO, the data records are posted in the same order. However, the second ADSO does not have the indicator as a key (structure ACDOCP), which means that when it is activated, the new data record is activated first and then overwritten by the deleted one. This causes a huge problem.
      Do you have an idea what could be causing this? I would be very grateful for any help.

       

      Best regards,

       

      Wanja

      Zili Zhou
      Blog Post Author

      Hi Wanja,

      If at the SAC side the deletion happens first but BW handles it in the wrong sequence, you need to open an incident on the BW side.

      If at the SAC side the deletion came after the upsert, this is as designed. The problem is that the 2nd ADSO does not have the same PK. If you can find some patterns, you can try to write a routine in the transformation to not transfer the deletions to the 2nd ADSO.

       

      regards

      Zili

      Bartłomiej Poteraj

      Zili Zhou great description.

      I face a specific issue - the virtual table representing the Fact data of the model  does not contain fields SDI_CHANGE_TYPE, SDI_CHANGE_TIME

      Issue is visible in a 'classic' account model as well as in the migrated 'new' model type.

      Some time ago we followed the setup outlined here and the 'full' export is working fine

      System version is 2023.8

      Zili Zhou
      Blog Post Author

      Hi Bartłomiej,

      SDI_CHANGE_TYPE and SDI_CHANGE_TIME exist only in the target table "/BIC/CMT......" of a real-time replication. The virtual table "/BIC/CMV......" points to the SAC model and has the same structure as SAC; it does not contain the fields SDI_CHANGE_TYPE and SDI_CHANGE_TIME.

       

      regards

      Zili

      Bartłomiej Poteraj

      Hi, I figured out in the meantime that I was using a BW source system of type SAP HANA -> Local SAP HANA Database Schema , and in this case it is necessary to use SAP HANA -> SAP HANA Smart Data Access

      Zili Zhou
      Blog Post Author

      In BW, please implement note 3355482 which would skip the filter check , when the RSADMIN parameter 'RSDS_SKIP_FILTER_CHECK' is enabled. Please make sure to perform thorough testing before using the filter. You will be able to add more than one filter.

      In WebIDE, you can always add more than one filter in replication task. Then you can also use local HANA source in BW to consume the data.

      Bartłomiej Poteraj

      Zili Zhou is the note released?

      Josef Haid

      Hi Zili Zhou

      Thanks for this valuable post. We used your blog and Maximilian Paul Gander 's blog ("Leverage the SAP Analytics Cloud Data Export Service to extract your planning data to SAP HANA, SAP Business Warehouse and SAP S/4HANA") to establish the RemoteSource.

      Now I'm inside Step 2: I verified that my planning model exists in the remote source on HANA side. Creating the DataSource in BW4, I cannot find my planning model. BUT: when I copy my planning model to a different folder in SAC, the new planning model can be used in the DataSource.

      Have you ever seen such behavior in BW4 and do you know why this happens?

       

      Zili Zhou
      Blog Post Author

      Hi Josef,

      have you solved your problem? I have not seen this kind of issue. This blog uses 2-legged OAuth, and it should not be relevant to end-user authorization.

       

      regards

      Zili

      Josef Haid

      Hi Zili,

      no unfortunately I couldn't find a solution yet.

      Regards, Josef

      Zili Zhou
      Blog Post Author

      Hi Josef,

      please open an incident in component LOD-ANA-ML-DI.

       

      regards

      Zili

      Bartłomiej Poteraj

      Zili Zhou thanks again for all the information, after fixing some sneaky issues in the DP agent server I was able to start the delta replication.

      I have a next question regarding the transfer of changes from SAC to the replication table.

      Depending on the volume, after some seconds or maybe several minutes, you can see the replicated table is also changed.

      In my case, after changing <100 rows, it takes from 2 to 4 minutes to see the effect in the replication table, which is disappointing. The whole replication scope is about 9 million rows.

      1. Would reducing the overall replication scope help ?
      2. Are there specific configuration options in SAC/HANA/BW that could make the replication more performant?
      Zili Zhou
      Blog Post Author

      Hi Bartłomiej,

      by default, the delta is pulled from SAC to the replication table every 5 minutes. Thus, the 2 to 4 minutes you have seen is normal and not related to the data volume.

      Performance is impacted by many factors, including the total records in SAC, the number of delta records, the network, DP Agent memory/CPU, HANA memory/CPU on the BW side, and parameters like pageSize.

      Our experience is that increasing pageSize is a quick win if you do not encounter a bottleneck in other areas.

      <PropertyEntry name="pageSize">100000</PropertyEntry>

       

       

      best regards

      Zili