Consume CDS Views from SAP S/4HANA on-premise in SAP Data Warehouse Cloud, leveraging the ABAP Pipeline Engine (APE)
Overview
The purpose of this blog post is to provide an overview of how to extract CDS views from SAP S/4HANA 2020 on-premise leveraging the ABAP Pipeline Engine (APE) and consume them in SAP Data Warehouse Cloud (SAP DWC). We'll use Cloud Connector to establish the connectivity.
As a first step, we will look at the configurations to be applied in the SAP S/4HANA 2020 system.
In a second step, we’ll describe the required configuration in Cloud Connector for setting up the connection, including the allowlisting of the functions necessary for data extraction from the SAP S/4HANA on-premise source system.
Finally, we’ll create both a connection to the SAP S/4HANA 2020 on-premise system in SAP Data Warehouse Cloud and a Dataflow to ingest the data of the CDS View into a deployed SAP Data Warehouse Cloud table.
Before going into further details I would like to thank my colleagues, especially Daniel Ingenhaag and Dr. Christian Tietz for their valuable inputs.
Prerequisites
General information related to ABAP integration can be found in SAP Note 2890171, which describes the integration of various SAP systems such as SAP ECC/SLT, SAP S/4HANA, and SAP BW. In our case, we will focus on the integration of SAP S/4HANA on-premise systems.
The minimum prerequisites to consume CDS views from an SAP S/4HANA on-premise system are as follows:
- SAP S/4HANA on-premise system with at least SAP S/4HANA 1909 FPS01 (+ mandatory TCI note 2873666) or a higher SAP S/4HANA version.
- More information about SAP S/4HANA 1909 integration, including relevant notes, can be found in SAP Note 2830276. The relevant note for our scenario, which is built on SAP S/4HANA on-premise 2020, is SAP Note 2943599.
- Cloud Connector version 2.12.x or newer (Download link), installed on a dedicated machine in the on-premise network that has access to the internet and, more specifically, to SAP Business Technology Platform (SAP BTP).
More information about the prerequisites for ABAP sources can be found in the official SAP Data Warehouse Cloud documentation on help.sap.com: https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/a75c1aacf951449ba3b740c7e46da3a9.html
Configuration steps
SAP S/4HANA on-premise
The scenario built in this blog post is based on SAP S/4HANA 2020.
The general prerequisites for connecting SAP S/4HANA on-premise with SAP Data Warehouse Cloud can be found here. The documentation explains in detail the prerequisites, the supported connection types, and their properties.
Another important step is the definition and annotation of the CDS views to make them available for consumption. With the annotations, you specify which views are exposed in replication scenarios and are suitable for data replication. You can also enable the generic delta extraction functionality by specifying the element that should be used for filtering the data during delta loads.
My colleague Martin Boeckling wrote a great blog post on CDS creation and annotation which you can refer to for more details.
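For illustration, a minimal sketch of such an extraction-enabled CDS view could look as follows. The view name, the SQL view name, the source table and the element used for the generic delta are hypothetical examples, not objects from the scenario above:

@AbapCatalog.sqlViewName: 'ZSALESORD_EX'
@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'Sales order headers for extraction (example)'
-- Expose the view for extraction / replication scenarios
@Analytics.dataExtraction.enabled: true
-- Generic delta based on a date element of the view (example element)
@Analytics.dataExtraction.delta.byElement.name: 'ChangeDate'
@Analytics.dataExtraction.delta.byElement.maxDelayInSeconds: 1800
define view Z_CDS_SalesOrder_Extraction
  as select from vbak
{
  key vbeln as SalesOrder,
      erdat as CreationDate,
      aedat as ChangeDate
}

The @Analytics.dataExtraction.enabled annotation marks the view as suitable for data replication, while the delta.byElement annotations define the element used for the generic delta; Martin's blog post explains these annotations and further options in detail.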
Note: The scenario described in this blog post to extract data out of S/4HANA via the ABAP Pipeline Engine (APE) into SAP Data Warehouse Cloud does not currently support delta and real-time data extraction. For details about support for delta replication or real-time replication in SAP Data Warehouse Cloud, please check here.
Cloud Connector
Cloud Connector serves as a link between applications in SAP Business Technology Platform and on-premise systems and lets you use existing on-premise assets without exposing the entire internal landscape. It acts as a reverse invoke proxy between the on-premise network and the SAP Business Technology Platform.
For our scenario we followed the detailed configuration steps described in this excellent blog post, which explains how to:
- create a subaccount in the SAP Business Technology Platform
- assign a Cloud_Connector_Administrator role to your administrator user in the SAP Business Technology Platform
- add this subaccount to the Cloud Connector and connect it to the SAP Business Technology Platform
Once you are done with the installation and configuration of Cloud Connector, you can then add a new on-premise system.
Since we want to connect to an SAP S/4HANA on-premise system, let’s select ABAP system.
Add system mapping
Define the protocol to be used (RFC in our case) and maintain the system information (like application server, instance number…)
Enter server application
Click Next and configure the allow listing settings.
Please note that the list of allowed functions depends on the source ABAP system you are using.
More details can be found in the following SAP Note: https://launchpad.support.sap.com/#/notes/2835207
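For orientation only (please verify the exact entries against the SAP Note above, since they depend on your release and installed components), the allowlist for an SAP S/4HANA 2020 source using the embedded ABAP Pipeline Engine typically contains entries along these lines:
- RFC_FUNCTION_SEARCH (exact function name)
- Prefix DHAMB_* (metadata browsing)
- Prefix DHAPE_* (ABAP Pipeline Engine)
Older, DMIS/SLT-based sources rely on the LTAMB_* and LTAPE_* prefixes instead.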
Allow listing settings
Go back to your subaccount in the SAP Business Technology Platform cockpit and verify that the connection has been correctly established with the on-premise network
Connection Check in BTP
SAP Data Warehouse Cloud
In SAP Data Warehouse Cloud you need to create a connection to the SAP S/4HANA on-premise system.
Create Connection
Select a connection of type SAP ABAP
Select Connection Type
Define the connection and save it.
Connection Information
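As a rough orientation (the exact field labels may vary between SAP Data Warehouse Cloud releases), the connection definition for this scenario typically contains values along these lines, where the host and system details refer to the virtual host and system mapping you defined in Cloud Connector:
- SAP Logon Connection Type: Application Server
- Application Server: <virtual host defined in Cloud Connector>
- System Number: <instance number, e.g. 00>
- Client: <client, e.g. 100>
- System ID: <SID of the SAP S/4HANA system>
- Use Cloud Connector: true
- User Name / Password: <credentials of a technical extraction user>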
Select the connection and validate it
Validate the connection
Now switch to the Data Builder and create a new Data Flow
Create a new Dataflow
Define a business name and technical name for the newly created Data Flow, then go to Source and click on the pop-up icon next to your Data Source
Dataflow Properties
Search for your CDS View and click Next
Import Views
Select it and click Add Selection.
This will fetch the metadata of the view and import it into the Data Flow.
Fetching Views details
Your CDS view has now been added to the Data Flow.
Add a projection operator to map the columns you want to replicate and, optionally, add a filter to the data
Using a projection operator in Dataflow
Now add a target table and rename it.
Then click on Create and Deploy Table to generate it in the underlying SAP HANA Cloud database
Create target table
Confirm the creation and deployment of the table
Deploy table
Now save the Data Flow
Data Flow saved
As the next step, you can execute the saved Data Flow
Run the Dataflow
Look at the Run Details information.
Note that the status changes every couple of seconds and that it might take some time until the Data Flow execution is completed.
Run Details
Once the run has completed successfully, preview your loaded data in SAP Data Warehouse Cloud
Preview loaded data
I hope this was helpful.
If you have any questions, please use this link.
If you need more information related to SAP Data Warehouse Cloud, please check here.
References:
- SAP S/4HANA OP1909 FPS01 + TCI note: https://launchpad.support.sap.com/#/notes/2873666
- SAP S/4HANA OP2020: https://launchpad.support.sap.com/#/notes/2943599
- SAP ABAP in SAP Data Warehouse Cloud: https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/a75c1aacf951449ba3b740c7e46da3a9.html
- ABAP CDS replication in SAP Data Intelligence (Blog from Martin Boeckling): https://blogs.sap.com/2021/01/21/abap-cds-replication-in-data-intelligence/
- Replicate Data Changes in SAP Data Warehouse Cloud: https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/441d327ead5c49d580d8600301735c83.html
- Configure Cloud Connector: https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/1c7dc8c6acad44869ca9105d0b9d80c9.html?q=cloud%20connector
- How to use SAP Data Intelligence with Cloud Connector (Blog from Dimitri Vorobiev): https://blogs.sap.com/2021/03/16/how-to-use-sap-data-intelligence-with-sap-cloud-connector/