
Replication of data using SLT and SAP Data Intelligence Generation 2 Operators

Introduction

SAP Landscape Transformation Replication Server (SLT) is a product that allows users to replicate data between systems. While there are a lot of blogs out there covering SLT in general, I will focus on how SLT can be used with SAP Data Intelligence (DI) for replication.

SAP Data Intelligence is a product used to organize a heterogeneous data landscape of SAP and third-party systems. It is the main tool that I work with on the Data Management Team in Ireland. Within DI you have the option to build pipelines using Generation 1 Operators or Generation 2 Operators.

If you are using Generation 1 Operators, please have a look at this excellent blog by Martin Boeckling (https://blogs.sap.com/2021/07/20/replication-and-filtering-of-data-by-using-slt-and-sap-data-intelligence/).

In this blog, I will give you a step-by-step walkthrough of how you can build a pipeline in DI to replicate data from an SLT table and store it in any target using Generation 2 Operators.

Prerequisites

To follow this blog, you will need an SLT system (either a standalone DMIS system or an SLT installation on an SAP S/4HANA system) and a DI system (on-premise or cloud) with an ABAP connection to the SLT system you wish to use. This SAP Note (https://launchpad.support.sap.com/#/notes/2835207) contains details of how to connect the two systems.

SLT

Create your configuration

In order to connect an SLT system to a DI system, you must create a configuration within SLT using the SLT cockpit, which can be accessed via the LTRC transaction. By clicking the paper icon (in the red square below), you can create the configuration in the pop-up window that follows.


SLT Paper Icon

Follow this blog (https://blogs.sap.com/2019/10/29/abap-integration-replicating-tables-into-sap-data-hub-via-sap-lt-replication-server/) to create the connection. At the end, be sure to check that you have configured everything correctly in the review screen. If everything is correct, click Create to create the configuration.


SLT Create Button

Once the configuration has been created, you can access an overview and any current replications related to it using the glasses icon.

 


SLT Replication Details

Table replication

For this blog, we will use the SFLIGHT table to demonstrate this replication scenario. The table appears as follows within SLT:


SFLIGHT Table

SAP Data Intelligence

Integrate SLT configuration within SAP Data Intelligence

Now that we have created our SLT configuration, let's look at building the replication pipeline in DI. For this, we use the Modeler tile from the Launchpad. In order to use this tile and to view the replication results, the user must have the following policies assigned: app.datahub-app-data.fullAccess, sap.dh.member and sap.dh.developer (see https://launchpad.support.sap.com/#/notes/2981615).


Data Intelligence Launchpad

We will now connect to the SLT system, replicate the data to a file in DI and view the results.

Build your pipeline

For this, we will use the Read Data from SAP System Operator, which can be accessed from the “Operators” tab on the left, under the ABAP category. By right-clicking the operator and selecting the view documentation option, you can read the details of the operator and its parameters.


Read Data from SAP System Operator

Now we configure the operator. To do this, you can either right-click and choose the open configuration option, or use the shortcut that appears when you select the operator.


Open Configuration of Operator

This will open the configuration panel, where we must specify the ABAP connection to be used to connect to our SLT system and choose the version of the operator we wish to use.


ABAP Connection for SLT


Specify Operator Version

Once this has been specified, we will be able to see additional fields in the configuration panel, namely Object Name and Transfer Mode. When specifying the Object Name, we need to select the Mass Transfer ID we previously created; once we select that, we can search for the SFLIGHT table within it.


Selecting the Table

Next, we must specify the Transfer Mode we wish to use for our pipeline. Here we have three options: Initial Load, Delta Load and Replication. These determine which data gets replicated when the pipeline is run. Initial Load only replicates the current contents of the table and does not look for changes. Delta Load only replicates the changes that occur in the table; each delta record additionally carries a flag indicating the type of change (I = insert, U = update, D = delete). Lastly, Replication does both: it first performs the initial load and then replicates any changes that are subsequently made to the table.


Three Transfer Modes
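
To make the change flags concrete, here is a minimal Python sketch of how a downstream consumer could apply a stream of delta records to an in-memory copy of the table. It is purely illustrative: the record layout, the flag field name (CHANGE_TYPE) and the key field are assumptions for the example, not the actual payload produced by the Read Data from SAP operator.

```python
# Illustrative only: apply I/U/D delta flags to an in-memory copy of a table.
# The "CHANGE_TYPE" field and the key field are assumed names for this sketch,
# not the actual column names delivered by the Read Data from SAP operator.

def apply_delta(target, records, key_field="CARRID"):
    for rec in records:
        flag = rec.get("CHANGE_TYPE")   # 'I' = insert, 'U' = update, 'D' = delete
        key = rec[key_field]
        if flag == "I":
            target[key] = rec                       # new row arrives
        elif flag == "U":
            target.setdefault(key, {}).update(rec)  # existing row changes
        elif flag == "D":
            target.pop(key, None)                   # row was deleted in the source
    return target

# Example: an insert followed by a delete of the same key leaves the table empty.
table = apply_delta({}, [
    {"CHANGE_TYPE": "I", "CARRID": "LH", "CONNID": "0400", "PRICE": "666.00"},
    {"CHANGE_TYPE": "D", "CARRID": "LH"},
])
print(table)  # {}
```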

You may notice that an output port is automatically added by DI for the table we are going to read.


Output Port Added

Since we want to write our table data into a file, we first have to convert the table output. To use the Binary File Producer, the output of the Read Data operator must be compatible with the Binary File Producer's input port. If we try to connect the two operators directly, we will see an error due to incompatible port types.


Incompatible Port Types

To fix this, we will use the Table to Binary operator to convert the table to binary before producing the file from it. We connect the output port labelled binary to the input port of the Binary File Producer as follows:


Converting Table to Binary
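
Conceptually, the Table to Binary operator serializes structured rows into a byte stream that a generic file writer can persist. The short Python sketch below illustrates the same idea using CSV serialization; it is an analogy for what happens at this step, not the operator's actual implementation, and the SFLIGHT columns shown are only a subset chosen for the example.

```python
import csv
import io

# Analogy for the Table -> Binary step: serialize structured rows into bytes
# that a generic file writer (here, CSV-encoded UTF-8) can persist as-is.
def rows_to_csv_bytes(rows, fieldnames):
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue().encode("utf-8")

payload = rows_to_csv_bytes(
    [{"CARRID": "LH", "CONNID": "0400", "FLDATE": "20230101"}],
    fieldnames=["CARRID", "CONNID", "FLDATE"],
)
print(payload)  # b'CARRID,CONNID,FLDATE\r\nLH,0400,20230101\r\n'
```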

Now that we have matching ports, we just need to configure the Binary File Producer Operator using the Open Configuration shortcut as before.


Open Configuration


Binary File Producer Configuration

Firstly, we must specify which connection we wish to use for our target system. Let’s use the S3 connection for our example.


S3 Target

Next, choose the path mode you want and select your path to the target. If you want a new file to be created, just select the folder and add the file name (for example, file_name.csv) at the end. After this, the mode must be selected; this determines what happens to the file. We have three options here: Overwrite (if the file already exists, its contents are deleted and replaced with the new data), Append (if the file already exists, the new data is added at the end and the existing data is kept) and Create Only (nothing is changed if the file already exists).


Write File Modes
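
The three modes behave like standard file-open semantics. The Python sketch below illustrates them against a local file; it is only an analogy, since the Binary File Producer applies the equivalent behaviour to whichever connection you configured (S3 in our example).

```python
import os

# Local-file analogy for the three write modes of the Binary File Producer.
def write_file(path, data, mode):
    if mode == "overwrite":
        with open(path, "wb") as f:        # existing contents are replaced
            f.write(data)
    elif mode == "append":
        with open(path, "ab") as f:        # new data is added after existing data
            f.write(data)
    elif mode == "create only":
        if os.path.exists(path):           # leave an existing file untouched
            return
        with open(path, "wb") as f:
            f.write(data)

write_file("sflight.csv", b"CARRID,CONNID,FLDATE\n", mode="overwrite")
write_file("sflight.csv", b"LH,0400,20230101\n", mode="append")
write_file("sflight.csv", b"ignored\n", mode="create only")  # file exists, nothing happens
```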

That completes our pipeline!

Just for convenience, though, we can add the Wiretap operator at the end so we can view the records as they come in. This will display the traffic in a browser window.

Our completed pipeline should look like this:


Generation 2 Pipeline for SLT Replication

You can now save and run the pipeline using the controls highlighted in red above.

Note that you must specify the time interval for capturing snapshots for the Read Data from SAP operator to work. You do this by clicking the arrow beside the Run button and choosing the Run As option from the dropdown menu.


Capturing Snapshots

For errors regarding the same Mass Transfer ID being used multiple times, this SAP Note may be helpful: https://me.sap.com/notes/0003204663

Once your graph is running, you will be able to see it in the status tab:


Running Graph Status

To see the records coming in, open the UI for the wiretap. This will open in a new tab. Your results should look similar to this:


Open Wiretap UI


Wiretap Results

And if you browse the connections using the Metadata Explorer, you should be able to see the new file in the location you specified. The catalog will only contain the published files and folders.

Using the glasses icon, you can view the data that has been replicated.


View Factsheet

The data should be visible to you under the data preview tab.

Data Preview

Conclusion

Congratulations! You now have a pipeline that allows you to replicate data from SLT to DI using Generation 2 Operators.

If you have any further questions, feel free to comment down below. Feedback is also welcome.


      8 Comments
      Armaan Singla

Hi Pranchal,

      Thank you so much for writing the blog post.

Where do I specify the Generation 2 scenario while setting up the SLT configuration in LTRC? I am not getting this option when following the mentioned blog.

What is the recommended way to extract data at table level: using standalone SLT or SLT on an SAP S/4HANA system? How is the performance when using SLT on an SAP S/4HANA system in a real-world scenario?

      Regards,

      Armaan

      Pranchal Narang
      Blog Post Author

      Hi Armaan,

      Thank you for your response.

      If you are using DMIS 2018 SP06 or higher, you can specify the Generation 2 Scenario when creating the configuration within SLT
      on the Target System Connection details page. Here you get three options: RFC Connection,
      DB Connection and Other. If you select Other here, you will see the SAP Data Intelligence
      (Generation 2 Operators) option in the drop down menu.

      For older versions, you will have to work with Generation 1 operators.

As for the performance of standalone SLT vs. SLT on SAP S/4HANA, this blog shows the pros and cons of each:
      https://blogs.sap.com/2022/03/28/sap-landscape-transformation-replication-server-slt-a-cost-saving-use-case/

      I hope this helps. Please feel free to follow up if I can help with anything else.

      Best Regards,
      Pranchal Narang

       

      Poshan Reddy

      Hi Pranchal,

      It was a very clear explanation on usage of SLT with Gen 2 operators. Thanks for sharing the knowledge.

Can we use the SLT mechanism to replicate data from S/4HANA to a Snowflake DB?

I am looking for advice on how to handle upserts and deletions with Snowflake using the SLT mechanism. Can you please share your thoughts?

      Thanks

      Poshan

       

      Pranchal Narang
      Blog Post Author

      Hi Poshan,

      Thank you for your comment. Here is a blog post by Ankit Sharma that you might find helpful:

      https://blogs.sap.com/2023/01/19/loading-data-into-snowflake-database-through-sap-di-custom-operator/

      Best,

      Pranchal

      Este Scholtz

      Hi Pranchal,

      Well written in such a methodical manner!

      Kind regards,

      Esté

      Pranchal Narang
      Blog Post Author

      Thank you Este!

      Barbara Souza

      Hello Pranchal!

      Great article!

I have a question that I think you can answer quickly. Today I'm working on a solution where we use DI's SLT replicator. First we do the initial load and then we leave the delta load on. I noticed that in the operation type column we have the values X, I, U, L and A. What would the L and A values be? I tried to find this in the SAP documentation but I wasn't successful in my search. Is there any documentation explaining this operation type column?

Below is an image showing the delta movements we receive.

Kind regards,

Bárbara Souza

      Pranchal Narang
      Blog Post Author

      Hello Bárbara,

      Thanks for reaching out! This help page shows a table of the possible values at the very bottom. I have attached a screenshot here for your convenience.


      Possible Tags Delta Load SAP DI

      I hope this helps!

      Best Regards,

      Pranchal Narang