
ABAP Integration – Replicate ABAP CDS views via SAP Data Hub

How can an ABAP CDS view be consumed in SAP Data Hub?

Introduction

We enhanced our CDC (change data capture) mechanism and now have the ability to detect changes in and read/replicate ABAP CDS views, making the data available in a SAP Data Hub pipeline. Once the business data is available in a pipeline, you can leverage all the capabilities of the operators: moving the data to a file storage, a messaging queue, or a self-developed operator. Or all of them at the same time, with a multiplexer in between. One-time extraction, multiple targets.

If you are not familiar with the overall concept of the ABAP Integration, please have a look at the overview blog for ABAP Integration.

Prerequisites

As you can imagine, new technologies are based on new code :P. We shipped our new CDC mechanism with the new ABAP Pipeline Engine, which was made available with SAP S/4HANA 1909. This is the minimum on-premise release; a downport is not planned and not even possible. For the cloud, we aim to ship it with the SAP S/4HANA 1911 cloud edition.

SAP Data Hub has to be on version 2.7 or SAP Data Intelligence on 1909.

Besides, you need to be able to establish an RFC connection from your SAP Data Hub system to the SAP system. Ideally, you have already created this connection via the SAP Data Hub Connection Management. For more details about the connectivity, have a look at the following note: 2835207 – SAP Data Hub – ABAP connection type for SAP Data Hub / Data Intelligence

Use Case

Getting business data into SAP Data Hub to consume it there, link it with big data, or write it to a target are just a few of the possible use cases. Very often I see requests to move the data to cheap storage or to a messaging system like Kafka. In this blog I want to explain the most frequently asked scenario: “How can I move data to a file storage, SAP HANA, or SAP BW/4HANA?”

 

In the SAP S/4HANA source system (left) you will find the following artifacts:

  • ABAP Pipeline Engine: This is the environment where ABAP operators are executed.
  • CDC Engine: Internal framework that enables delta detection and movement. It is based on “SLT technology” and was improved to work with ABAP CDS views. In a nutshell, we again use database triggers and logging tables. For the ABAP CDS approach, the system automatically creates triggers and logging tables for all application tables involved in an ABAP CDS view. The new CDC engine also removes the 1:4 limitation known from SLT. The CDC engine is automatically triggered by the ABAP CDS Reader operator.

In the SAP Data Hub system the pipeline is modeled in the following way:

  • ABAP CDS Reader: This operator calls into the source system and starts the replication for the defined ABAP CDS view.
  • ABAP Converter: Within this operator, the data arriving in an internal ABAP format is converted to a string. This allows other operators to use the data.
  • Write File: Standard operator to write data to a target. This operator could be replaced in other scenarios, for example with a Kafka Producer operator to feed a messaging queue. Other operators such as a SAP HANA writer or any SAP application producer (e.g. write to BW/4HANA) can be used here as well.

And action – how to implement it in the system

For the example we used the well-known “FLIGHT” data model and created a custom ABAP CDS view (yes – custom ABAP CDS views are supported ;)).

The view consists of the table SCARR (which holds carrier information) and a custom table ZRATING_01 that holds rating information for the carriers. Both tables are joined, and the result should be replicated afterwards.
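To give you an idea of the data model, a minimal sketch of the custom rating table could look like the following. Note that the field names and types are assumptions for illustration; only the rating amount field is implied by the update example later in this blog.

  @EndUserText.label : 'Carrier rating'
  @AbapCatalog.tableCategory : #TRANSPARENT
  @AbapCatalog.deliveryClass : #A
  define table zrating_01 {
    // client-dependent application table, keyed by the carrier id as in SCARR
    key client  : abap.clnt not null;
    key carrid  : s_carr_id not null;
    // rating amount; this is the field we will update to 5000 later in this example
    rat_amount  : abap.int4;
  }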

ABAP CDS view in the source

We want to replicate the ABAP CDS view Z_CDS_RATING_01, which can be viewed via the ABAP Development Tools. See the screenshot below.

As you can see, we just go for a simple join, but annotations are used to activate the extraction and the delta capturing. We will explain this in more detail in an upcoming blog; a rough sketch is shown below.

The representation in the source system is the SQL view “Z_SQL_RATING_01”; you will see it in the screenshot below.
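To make this more tangible, here is a minimal sketch of what such a view could look like. The @Analytics.dataExtraction annotations shown here (we assume the automatic change data capture variant; the exact annotations are what the upcoming blog will cover) and the field names are illustrative, not a copy of the actual view:

  @AbapCatalog.sqlViewName: 'Z_SQL_RATING_01'
  @AccessControl.authorizationCheck: #NOT_REQUIRED
  @EndUserText.label: 'Carrier rating'
  // activate extraction and delta capturing (CDC) for this view
  @Analytics.dataExtraction.enabled: true
  @Analytics.dataExtraction.delta.changeDataCapture.automatic: true
  define view Z_CDS_RATING_01
    as select from scarr
      inner join zrating_01
        on scarr.carrid = zrating_01.carrid
  {
    key scarr.carrid,
        scarr.carrname,
        zrating_01.rat_amount
  }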

The existing records look like this: we have three records with the following values.

Note: If the screenshots are too small, you can right-click them and open them in full size in a new browser tab.

Model the pipeline in the SAP Data Hub Modeler

All operators can easily be dragged and dropped into the pipeline. We will use the ABAP CDS Reader for the load and replication from an ABAP CDS view. The operator ABAP Converter will be used to translate the internal ABAP format into a usable string (json, csv, xml). Afterwards, the Write File operator will create a new file in S3. The result should look like this.

Let us have a detailed look at the operators used.

ABAP CDS Reader

This operator ensures that the CDC engine is called correctly. Triggers and logging tables will be created in the source system, in our example for the application tables SCARR and ZRATING_01.

First you have to specify the ABAP Connection. We reused the connection named ABAP_RFC that was defined in the Connection Management.

In the field ABAP CDS Name you specify the name of your view, in this case Z_CDS_RATING_01.

With the Transfer Mode you can define how the data should be consumed.

  • I – Initial load only
  • R – Replication of delta information (including initial load)
  • D – Replication of delta information only (no initial load)

In this example we use R, so an initial load is performed first and every subsequent change is transferred afterwards.

You will find the full documentation here.

 

ABAP Converter

To move the data directly into an existing operator like the Write File operator, it is required to convert the data to a string. This can easily be done with the ABAP Converter operator. The conversion is done by the ABAP Pipeline Engine in the source system, and only the string is moved to the pipeline in SAP Data Hub.

Again, you first have to specify the ABAP Connection; we reused the connection ABAP_RFC from the Connection Management.

Afterwards, you select the format. This can be csv, xml, or json.
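To illustrate the difference, this is roughly how a single hypothetical carrier record could arrive in the pipeline, depending on the selected format. The values are invented for illustration, the field names come from the view sketch above, and the exact envelope may differ:

  csv:  AA,American Airlines,4500
  json: {"CARRID": "AA", "CARRNAME": "American Airlines", "RAT_AMOUNT": 4500}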

You will find the full documentation here.

 

Write File

We use the standard operator Write File to create a new file on S3. Every update after the initial load should be appended to this file.

First, select your preferred Service; in our case we want to write to S3. Afterwards, you need to select the Connection. The connection was already configured in the Connection Management and is named S3.

The location of the file is configured as follows; the resulting target path is shown after the list:

  • Bucket: bucket1
  • Path: abap/ta01/cds.csv
  • Mode: append
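Putting the configuration together, the data ends up in one object, in S3-style notation:

  s3://bucket1/abap/ta01/cds.csv

Because the mode is append, every delta record is added to this same file instead of creating a new object.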

You will find the full documentation here.

Having a look into the folders on S3 will show you the following:

As you can see, there are already other files stored within the same folder on our S3 bucket.

Note: This is just an example with MinIO to illustrate this better.

 

Get the data flowing

The pipeline is now ready to start, and the execute button can be hit.

The pipeline is now running, and we expect the file to be created on S3.

The new file cds.csv was created and the initial load was performed. After downloading, the file looks like the following.

These are the same values we have in the SAP S/4HANA source system (see above ;)).
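For readers without the screenshots: after the initial load, the content of cds.csv would look roughly like this. All three rows are hypothetical and merely continue the illustration from above:

  AA,American Airlines,4500
  LH,Lufthansa,4200
  SQ,Singapore Airlines,4900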

 

What does a delta replication look like?

Triggers and logging tables were automatically created when the pipeline was created. This means the CDC mechanism is active, and any change will automatically be transferred. We will now manually modify a record in the source system in one of the joined tables.

In this example we just change the RAT_AMOUNT to 5000 and save the record. The system will automatically detect this and send it via the pipeline to S3. As we configured the file to be appended, we will see an additional record in the cds.csv file.

The updated record was appended, and the information about the operation was added at the end of the record, in this case a U for update. For an insert (new record) or a delete of an existing record, the operation flag would be an I or a D.
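Continuing the hypothetical rows from above, the appended line could look like this; only the new rating of 5000 and the trailing U flag come from the actual example:

  LH,Lufthansa,5000,U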

Thank you for reading this blog. Feel free to try it on your own and share your feedback with us.

BR from WDF, Britta Thoelking and Tobias

 


      4 Comments
      Anders Kortbæk

      Hi Tobias,

      Thanks for the nice article. Can the same replication be done using calculation views as the source, and SLT instead?

      Our scenario is that we have calculation views (BW on HANA) and would like to replicate their results to another target (non-SAP). Not only a full replication, but also deltas when the underlying tables of the calculation views change.

      Thanks, Anders

      Tobias Koebler
      Blog Post Author

      Hi,

      you cannot specify a calculation view within SLT for replication. You can only take the tables underlying the view and replicate them individually. In the target, you then need to rebuild the structure on top. Nothing that really helps, in my opinion.

      BR, Tobias

      Werner Dähn

      The one point I do not get is the relationship to the statement "how can a full object be replicated out of an ECC system".

      Let's use VBAK and VBAP as an example: you can extract them individually, or you can create a view with data from both and replicate that with this feature.

      But what you cannot do is create a Sales Order business object, a nested structure where one message contains one order with all its line items embedded, similar to what we are used to from BAPIs and OData APIs.

      Is this statement of mine correct?

      If yes, that leads to a follow-up question about data consistency. The obvious requirement for any realtime data transfer is to maintain the global order of the changes. Otherwise you would end up with a sales order where the customer master or the material master has fallen behind by a few milliseconds, and thus the order contains a master data reference that does not exist yet. Or worse, a line item without a sales order header record yet.

      The only option would be to create a view with all data in it: VBAK, VBAP, KNA1, ADRC, MARA, MAKT, T001, ... which is not feasible either.

      I have the feeling I am missing a point. It would be nice if you could share your thoughts in that area, Tobias.

      Thanks in advance!

      Santhosh Kadiyala

      Tobias Koebler

      Thank you for the post. Could you please help me clarify the queries below:

      • Do we have any limitation that this works only for left outer joins?
      • Can't we use CDS views with associations?
      • How different is this from SLT? I see that we are more or less using the same SLT framework in a different way.
      • Can't we use a better approach such as SDI to achieve the same?
      • SAP recommends exposed associations, and many standard CDS views use associations; I would like to understand the limitations of this approach in that respect.


      thanks