
Consume CDS Views from SAP S/4HANA on-premise into SAP Data Warehouse Cloud, leveraging the ABAP Pipeline Engine (APE)

Overview

 

The purpose of this blog post is to provide an overview of how to extract CDS views from an SAP S/4HANA 2020 on-premise system leveraging the ABAP Pipeline Engine (APE) and how to consume them in SAP Data Warehouse Cloud (SAP DWC). For that purpose, we'll use the Cloud Connector to establish the connectivity.

As a first step, we will look at the configurations to be applied in the SAP S/4HANA 2020 system.

In a second step, we’ll describe the required configuration in Cloud Connector for setting up the connection including the enablement of functions necessary for the data extraction from the SAP S/4HANA on-premise source system.

Finally, we’ll create both a connection to the SAP S/4HANA 2020 on-premise system in SAP Data Warehouse Cloud and a Dataflow to ingest the data of the CDS View into a deployed SAP Data Warehouse Cloud table.

Before going into further details I would like to thank my colleagues, especially Daniel Ingenhaag and Dr. Christian Tietz for their valuable inputs.

 

Prerequisites

 

General information related to the ABAP integration can be found in SAP Note 2890171, which describes the integration of various SAP systems such as SAP ECC/SLT, SAP S/4HANA and SAP BW. In our case we will focus on the integration of SAP S/4HANA on-premise systems.

The minimum prerequisites to consume CDS views from an SAP S/4HANA on-premise system are as follows:

  • SAP S/4HANA on-premise system with at least SAP S/4HANA 1909 FPS01 (+ mandatory TCI note 2873666) or a higher SAP S/4HANA version.
  • More information about SAP S/4HANA 1909 integration including relevant notes can be found in SAP Note: 2830276.

The relevant note for our scenario built on SAP S/4HANA on-premise 2020 is note 2943599.

Cloud Connector version 2.12.x or newer (Download link) must be installed on a dedicated machine in the on-premise network that has access to the internet, and specifically to SAP Business Technology Platform (SAP BTP).

More information about the prerequisites for ABAP sources can be found in the official SAP Data Warehouse Cloud documentation on help.sap.com: https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/a75c1aacf951449ba3b740c7e46da3a9.html

 

Configuration steps

 

  1. SAP S/4HANA on-premise

The scenario built in this blog post is based on SAP S/4HANA 2020.

The general prerequisites for connecting SAP S/4HANA on-premise with SAP Data Warehouse Cloud can be found here.

It explains in detail the prerequisites, the supported connection types and their properties.

Another important step is the definition and annotation of the CDS views to make them available for consumption. With the annotations, you specify which views are exposed in replication scenarios and are suitable for data replication. You can also enable the generic delta extraction functionality by specifying the element that should be used for filtering the data during delta loads.

My colleague Martin Boeckling wrote a great blog post on CDS creation and annotation which you can refer to for more details.
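For illustration, here is a minimal sketch of such an annotated CDS view, using the classic DDIC-based syntax available in SAP S/4HANA 2020. The view name, SQL view name and field selection are hypothetical; the @Analytics.dataExtraction annotations are the ones that expose the view for extraction and enable the generic delta, as described above.

    @AbapCatalog.sqlViewName: 'ZSDEXTRSALES'  // hypothetical SQL view name
    @AccessControl.authorizationCheck: #NOT_REQUIRED
    @EndUserText.label: 'Sales document header for extraction'
    // Expose the view for extraction / replication scenarios
    @Analytics.dataExtraction.enabled: true
    // Generic delta: the named element is used to filter changed records during delta loads
    @Analytics.dataExtraction.delta.byElement.name: 'LastChangeDate'
    @Analytics.dataExtraction.delta.byElement.maxDelayInSeconds: 1800
    define view Z_SalesDoc_For_Extraction
      as select from vbak
    {
      key vbeln as SalesDocument,
          erdat as CreationDate,
          aedat as LastChangeDate  // a UTC timestamp element is also supported and often preferable
    }

Keep in mind that the Data Flow scenario described below currently performs full loads only (see the note after this paragraph); the delta annotations remain relevant for delta-capable consumers of the APE.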

Note: The scenario described in this blog post, extracting data out of S/4HANA via the ABAP Pipeline Engine (APE) into SAP Data Warehouse Cloud, does not currently support delta and real-time data extraction. For details about support for delta replication or real-time replication in SAP Data Warehouse Cloud, please check here.

 

  2. Cloud Connector

Cloud Connector serves as a link between applications in SAP Business Technology Platform and on-premise systems and lets you use existing on-premise assets without exposing the entire internal landscape. It acts as a reverse invoke proxy between the on-premise network and the SAP Business Technology Platform.

For our scenario we followed the detailed configuration steps described in this excellent blog post, which explains how to:

  • create a subaccount in the SAP Business Technology Platform
  • assign a Cloud_Connector_Administrator role to your administrator user in the SAP Business Technology Platform
  • add this subaccount to the Cloud Connector and connect it to the SAP Business Technology Platform

Once you are done with the installation and configuration of Cloud Connector, you can then add a new on-premise system.

Since we want to connect to a SAP S/4HANA on-premise system, let’s select ABAP system.


Add system mapping

Define the protocol to be used (RFC in our case) and maintain the system information (like application server, instance number…)


Enter server application

 

Click Next and maintain the allow listing settings.

Please note that the list of allowed functions depends on the source ABAP system you are using.

More details can be found in the following SAP Note: https://launchpad.support.sap.com/#/notes/2835207


Allow listing settings

 

Go back to your subaccount in the SAP Business Technology Platform cockpit and verify that the connection has been correctly established with the on-premise network


Connection Check in BTP

 

  3. SAP Data Warehouse Cloud

In SAP Data Warehouse Cloud you need to create a connection to the SAP S/4HANA on-premise system.

Navigate to Space Management, click on Connections and then click on the + sign to create a new connection


Create Connection

 

Select a connection of type SAP ABAP


Select Connection Type

 

Define the connection and save it.


Connection Information

 

Select the connection and validate it


Validate the connection

 

Now switch to the Data Builder and create a new Data Flow.


Create a new Dataflow

Define a business name and technical name for the newly created Data Flow, then go to Source and click on the pop-up icon next to your Data Source


Dataflow Properties

Search for your CDS View and click Next


Import Views

 

Select it and click Add Selection.

This will fetch the metadata of the view and import it into the Data Flow.


Fetching Views details

 

Your CDS view has now been added to the Data Flow.

Add a projection operator to map the columns you want to replicate and, optionally, add a filter to the data.


Using a projection operator in Dataflow

 

Now add a target table and rename it.

Then click on Create and Deploy Table to generate it in the underlying SAP HANA Cloud database


Create target table

 

Confirm the creation and deployment of the table


Deploy table

 

Now save the Data Flow.


Data Flow saved

 

As the next step, you can execute the saved Data Flow


Run the Dataflow

 

 

Look at the Run Details information.

Note that the status changes every few seconds and that it might take some time until the Data Flow execution is completed.


Run Details

 

Once the run has completed successfully, preview your loaded data in SAP Data Warehouse Cloud


Preview loaded data

 

I hope this was helpful.

If you have any questions please use this link.

If you need more information related to SAP Data Warehouse Cloud please check here.

 

 

Comments
      Werner Dähn

      Hi Babacar, in what use cases would you copy the data once and then never again?

      I refer to the statement:

      [..] extract data out of S/4HANA via ABAP Pipeline Engine (APE) into SAP Data Warehouse Cloud does currently not support delta and real-time data extraction.

      Babacar Toure (Blog Post Author)
      Hi Werner,
      Thank you for your interest in my blog and your valuable remarks.
      Delta load is actually supported by the APE and this can be leveraged using SAP solutions like SAP Data Intelligence.
      SAP is currently working on enabling delta support for ABAP-based systems in Data Flows.
      I hope this answers your question.
      Regards,
      Babacar
      Werner Dähn

      Thanks Babacar. Do you mind describing in a bit more detail how the delta works? Say I want to see all sales order records from the ERP system that got changed or deleted since the last delta read. How is that achieved using the APE and what technology is used to find out the changes?

      Alberto Simeoni

      Hello Babacar,
      I would like to push this topic forward.

      It would be useful if dataflows in DWC could use CDS views as ODP with delta mode.

      Nowadays only real-time remote table replication can do so, but there are limitations:

      • the refresh rate of 15 seconds cannot be changed.
      • if DWC updates, you (as administrator) need to stop the connection before the update (there is no automatic way to manage this); otherwise data integrity is not guaranteed.

      Why wouldn't SAP allow dataflows to read from ODP as delta (so we can schedule data extraction properly)?

      Apart from Data Intelligence (which someone told me is the same service running under dataflows in DWC):

      • BODS 4.2 can read ODP and CDS views as ODP,
      • BW/4HANA integrated ETL (DataSource) can do it.

      Regards,
      Alberto

       

      Sven Knöpfler

      Hello Babacar,

      Are there any updates on supporting generic delta load for CDS Views in SAP Datasphere?

      Thanks and kind regards

      Sven