
Azure Data Factory (ADF) SLT Configuration Setup from SAP

Requirements for POC:

The requirement is to replicate S/4 HANA system changes in real time using an SLT configuration. The replicated data is then loaded into the target system, Azure Blob Storage.

The architecture diagram is given below:

Architecture Details:

Source system: S49 (S/4 HANA 1909)

Replication system: EH8 (ECC 6.0 EHP8)

Orchestration tool: Azure Data Factory (ADF)

Target system: Azure Blob Storage

Prerequisites:

 

1. The latest 64-bit version of the SAP .NET Connector (SAP NCo) must be installed on the machine or server where the self-hosted integration runtime is installed.

It is mandatory to choose the “Install Assemblies to GAC” option during installation.
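This is because the self-hosted integration runtime resolves the SAP .NET Connector assemblies from the GAC; if the option is skipped, SAP connections will fail at runtime.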

 

2. Install the self-hosted integration runtime on the local machine.

Data Flow Configuration Setup in Azure Data Factory (ADF):

To carry out this POC, the configuration and orchestration settings below were applied.

To extract SAP SLT replication data, we can use the Azure data flow SAP CDC connector. The related data flow settings are given below.

 

1. A new data flow can be created by navigating to the data flow section of the data factory.
2. Click on the new data flow option, which creates a new data flow.
3. Give the data flow an appropriate name – SAP SLT Replication Flow.
4. Add a source to the data flow; in this case the source is S/4 HANA.

To connect to S/4 HANA, we need to establish a connection using a linked service and datasets. Those details are given below.

 

5. The S/4 HANA source is added using the related dataset and linked service.

6. Here we can choose either a dataset or inline; this approach uses the dataset option.

7. A new dataset is created on S/4 HANA using the SAP CDC connector.

 

To create the dataset we first need a linked service, which is shown in the next image.

 

8. This is a default option and can be left as it is.

 

There is a sampling option, which can be enabled or disabled based on the requirement.

To create the linked service, we need to pass the above-mentioned ECC replication server details.

Once the connection is tested, it should be successful.

After creating the linked service, we will be able to access the ODP context and ODP name, which were configured in the ECC replication system.
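For teams that script their factories, the same objects can also be created through the ADF REST API instead of the Studio UI. The sketch below is a minimal example rather than the exact setup of this POC: the subscription, host, user, and object names (SapSltLS, SapSltMara, SelfHostedIR, the subscriber name) are placeholders, and the SapOdp/SapOdpResource property names should be verified against the SAP CDC connector documentation for your Data Factory version.

# Minimal sketch: create the SAP CDC (SapOdp) linked service and dataset via the
# ADF REST API. All <...> values and resource names are placeholders.
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity requests

SUB, RG, FACTORY = "<subscription-id>", "<resource-group>", "<factory-name>"
BASE = (f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
        f"/providers/Microsoft.DataFactory/factories/{FACTORY}")
API = {"api-version": "2018-06-01"}
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
HEADERS = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

# Linked service pointing at the ECC/SLT replication server, reached through the
# self-hosted integration runtime on which SAP NCo is installed.
linked_service = {
    "properties": {
        "type": "SapOdp",
        "connectVia": {"referenceName": "SelfHostedIR",
                       "type": "IntegrationRuntimeReference"},
        "typeProperties": {
            "server": "<ecc-replication-host>",
            "systemNumber": "00",
            "clientId": "100",
            "userName": "<sap-user>",
            "password": {"type": "SecureString", "value": "<password>"},
            "subscriberName": "ADF_SLT_POC",  # name under which ADF subscribes to ODP
        },
    }
}
requests.put(f"{BASE}/linkedservices/SapSltLS", params=API, headers=HEADERS,
             json=linked_service).raise_for_status()

# Dataset exposing one ODP object; context and object name come from the SLT setup.
dataset = {
    "properties": {
        "type": "SapOdpResource",
        "linkedServiceName": {"referenceName": "SapSltLS",
                              "type": "LinkedServiceReference"},
        "typeProperties": {"context": "SLT~<queue-alias>", "objectName": "MARA"},
    }
}
requests.put(f"{BASE}/datasets/SapSltMara", params=API, headers=HEADERS,
             json=dataset).raise_for_status()

The same JSON payloads can equally be deployed via ARM templates; the REST call is shown here only to make the payload explicit.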

 

 

Under the source options, we need to select the options below.

Here the key columns are used to identify changed or inserted data from the source system.

The run mode can be selected from the list based on the requirements.
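For reference, at the time of writing the SAP CDC source offers run modes along the lines of “Full on every run”, “Full on the first run, then incremental”, and “Incremental changes only”; a full-then-incremental mode is what produces the first-run full load and second-run delta shown later in this post.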

What is ODP Context?

 

Using this ODP context and ODP name, we can get CDC-related data from the source system.

The ODP context and ODP name are explained below.

The Xtract ODP component can be used to extract data via the SAP Operational Data Provisioning (ODP) framework.

ODP is a framework in SAP ABAP applications for transferring data between systems. It provides a technical infrastructure for data extraction and replication from different SAP (ABAP) systems, e.g.:

ECC

S/4 HANA

BW

BW/4 HANA

The Xtract ODP component acts as a subscriber (consumer) and subscribes to a data provider, for example to an SAP Extractor or to a CDS View.

Operational data provisioning supports mechanisms to load data incrementally, e.g., from extractors, ABAP CDS views and DSO objects (see below). With SAP BW/4HANA, Operational Data Provisioning (ODP) becomes the central infrastructure for data extraction and replication from SAP (ABAP) applications to an SAP BW/4HANA Data Warehouse.

ODP Name:

Provide the table you want to extract in the “ODP Name” field.
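For an SLT scenario, for example, the ODP context typically appears as SLT~<queue alias>, where the queue alias comes from the SLT configuration created in LTRC, and the ODP name is simply the table to be replicated – MARA in this POC.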

SLT Replication Server Setup:

Log in to the ECC replication server and enter transaction code LTRC to set up the replication process.

Click on the Create button to launch the configuration wizard.

Enter an appropriate name and click Continue.

Select the RFC connection option and enter the source system destination connection.

Select the “Read from Single Client” option based on the requirement (optional).

 

Choose the “Other” option, select “Operational Data Provisioning (ODP)”, and click Next.

Enter the number of parallel jobs as shown in the above screenshot (this can vary based on the requirement) and choose the replication option type; in this case we have chosen “Real Time”.

Review all the given parameters and click on the Create button to complete the SAP SLT replication setup.

So far we have seen how to configure the SAP replication system.

Authorization Access in SAP:

 

The user must have the below role authorizations in both the source and replication systems:

SAP_IUUC_REPL_ADMIN

SAP_IUUC_REPL_REMOTE

SAP_IUUC_REPL_DISPLAY

 

After setting up the replication server in SAP, the sink (target) system, Azure Blob Storage, is added.

The sink can be added by creating a new dataset on Azure Blob Storage, as shown at number 6 in the screenshot.

The incoming stream should be selected – that is our source.

The linked service can be created by navigating through the corresponding folder, under which the file path can be selected.

The linked service setup is given below.
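The sink side can be scripted the same way. The sketch below is again a hedged example with placeholder names (BlobSinkLS, BlobSinkMara, the slt-data container): an Azure Blob Storage linked service plus a delimited-text dataset that the data flow writes to.

# Minimal sketch: Azure Blob Storage linked service and a CSV dataset used as the
# data flow sink. Connection string and names are placeholders.
import requests
from azure.identity import DefaultAzureCredential

SUB, RG, FACTORY = "<subscription-id>", "<resource-group>", "<factory-name>"
BASE = (f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
        f"/providers/Microsoft.DataFactory/factories/{FACTORY}")
API = {"api-version": "2018-06-01"}
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
HEADERS = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

blob_ls = {
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {"connectionString": "<storage-connection-string>"},
    }
}
requests.put(f"{BASE}/linkedservices/BlobSinkLS", params=API, headers=HEADERS,
             json=blob_ls).raise_for_status()

# Delimited-text (CSV) dataset pointing at the container/folder for replicated rows.
blob_ds = {
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {"referenceName": "BlobSinkLS",
                              "type": "LinkedServiceReference"},
        "typeProperties": {
            "location": {"type": "AzureBlobStorageLocation",
                         "container": "slt-data", "folderPath": "mara"},
            "columnDelimiter": ",",
            "firstRowAsHeader": True,
        },
    }
}
requests.put(f"{BASE}/datasets/BlobSinkMara", params=API, headers=HEADERS,
             json=blob_ds).raise_for_status()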

 

 

After the configuration setup is complete, we need to call this data flow from a pipeline.

Create a pipeline with an appropriate name, drag the Data Flow activity onto the canvas, and call the already-created data flow from this pipeline as mentioned in point 7.

Provide the staging storage account and container.
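As a sketch of what that pipeline looks like outside the UI, the snippet below creates a pipeline whose Execute Data Flow activity references the data flow built earlier and supplies the staging linked service and folder; the names (SltReplicationPipeline, BlobSinkLS, the staging folder) are placeholders carried over from the earlier sketches.

# Minimal sketch: pipeline with an Execute Data Flow activity that runs the
# 'SAP SLT Replication Flow' and provides the staging storage the SAP CDC
# source requires. Names and paths are placeholders.
import requests
from azure.identity import DefaultAzureCredential

SUB, RG, FACTORY = "<subscription-id>", "<resource-group>", "<factory-name>"
BASE = (f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
        f"/providers/Microsoft.DataFactory/factories/{FACTORY}")
API = {"api-version": "2018-06-01"}
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
HEADERS = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

pipeline = {
    "properties": {
        "activities": [{
            "name": "RunSltReplicationFlow",
            "type": "ExecuteDataFlow",
            "typeProperties": {
                "dataFlow": {"referenceName": "SAP SLT Replication Flow",
                             "type": "DataFlowReference"},
                # Staging account/container used for the CDC intermediate data.
                "staging": {
                    "linkedService": {"referenceName": "BlobSinkLS",
                                      "type": "LinkedServiceReference"},
                    "folderPath": "slt-staging",
                },
            },
        }]
    }
}
requests.put(f"{BASE}/pipelines/SltReplicationPipeline", params=API,
             headers=HEADERS, json=pipeline).raise_for_status()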

After all these settings have been made, the pipeline is executed.

As per the given logic, the pipeline loaded the full data of the given table (MARA) from the source on the first run.
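This matches standard ODP behaviour: the first extraction runs a full load that also initializes the delta subscription, and each subsequent run reads only the changes recorded by SLT in the operational delta queue (ODQ).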

 

 

On the second run, the delta data (CDC data) was captured and loaded into Blob Storage.

With this, SAP SLT replication data has been captured using the Azure data flow CDC connector.

 
