Martin Boeckling

Replication and filtering of data by using SLT and SAP Data Intelligence

SAP Data Intelligence provides different ABAP operators to extract data from ABAP systems. In my last blog post I looked at the extraction of CDS views from an SAP S/4HANA system using the CDS Reader operator, which is part of the ABAP operator section. SAP Landscape Transformation Replication Server, also called SLT, is a product that replicates data between systems. As there are already many good blog posts within the SAP community covering SLT in general, I want to focus in this blog post on one particular functionality of SLT: we will look at how SAP Data Intelligence can be combined with SLT, and especially how we can replicate only a subset of the data. Before I go into the technical details of how to configure the described scenario, let me briefly recap what SAP Data Intelligence and SLT are about:

SAP Data Intelligence is a product that is used to organize a heterogeneous data system landscape of SAP systems and third-party systems. A more detailed overview of SAP Data Intelligence with a focus on the ABAP integration can be found in the following blog post by Tobias Koebler.

SAP Landscape Transformation Replication Server is replication software that was originally based on SAP NetWeaver and can now also be deployed on the S/4HANA foundation. It can replicate data between different systems, either between SAP systems or to and from certain third-party databases. An overview of the different sources and targets, depending on the version and deployment form, can be found in the following document.

When we want to make use of data from a system, we are often only interested in a subset of the data with certain characteristics, for example data related to one specific company. Within SLT, a user can specify filters to replicate only a subset of the data. SAP Data Intelligence can make use of this functionality and extend the possible targets for processing the data. In this blog post, I will also explain three different ways to filter a table we want to replicate.

The blog post consists of two main sections. The first section explains the configuration within the SLT system, and the second section looks at the setup within SAP Data Intelligence.


To follow the hands-on tutorial it is helpful to have your own SLT system (either a standalone DMIS system or an SLT version on an SAP S/4HANA system) and an SAP Data Intelligence system to which the SLT system is already connected. If you still need to connect your SLT system to your SAP Data Intelligence system, you can follow the steps listed in this SAP Note.


Your starting point – Create your own configuration

To connect an SLT system to the SAP Data Intelligence system, the user must create a so-called configuration within the SLT system. To create the configuration, enter the transaction code LTRC to open the SLT Cockpit, which looks like the following picture:


Overview of LTRC screen

To create a configuration within SLT, click on the icon with the paper sheet, marked in red in the picture above. After clicking on the button, a pop-up window appears that guides the user through the process of creating a configuration. Overall, the user must fill in the necessary information across four sections. A detailed overview of how SLT can be connected to SAP Data Intelligence is explained in the blog post by Britta Thoelking.

To make sure that nothing was misconfigured, check carefully that the different settings are correct. If everything is configured correctly, the user can click the Create button in the bottom-right corner, which is marked with the red rectangle.


SLT Configuration overview

After creating the configuration, the user can view it by selecting the new entry in the list and clicking on the magnifying glass icon, which opens the overview of the created configuration, where the current replications related to the configuration are shown.

Table replication

The table that will be used for replication within this blog post is the well-known SFLIGHT table. It consists of 14 columns and contains 8 records. The image below displays the structure as it is within the system:


SFLIGHT table overview

For the filtering we will look in particular at the CARRID column. The records of interest for us are those that contain the carrier abbreviation LH. In the following section, I will explain the different filtering options within the SLT system.

Filtering records

For the filtering, the user can choose between three different approaches. The first option that I am going to look at in this blog post is filtering via the Performance Options. By filtering the table there, performance is optimized, as only a subset of the data is loaded and transferred. The other filtering possibilities can be configured in the Rule Assignment section within the LTRS transaction. The configuration of the different filtering possibilities is discussed in the sections below.

Performance Options filtering

To access the table filtering options, the user must enter the transaction LTRS to open the Advanced Replication Settings. The image below displays the UI of the Advanced Replication Settings.


Performance options filtering overview

To add a filter in the Performance Options section, the user has to right-click on the entry and select Add Table from the context menu.


Performance options add table screen

A window is presented in which the user can type the name of the table to be filtered.


Table name screen

After entering the desired table, the user can click the green check mark to create an entry in the Performance Options. To edit the entry, the user needs to right-click on the SFLIGHT entry and then click on the Change entry of the context menu.


Performance options table popup window

After clicking on the Change entry, the user can edit the different options displayed in the picture above. For our scenario we use the second option, which filters both the initial load and the replication. To create the rule, the user clicks on the sheet icon and is presented with the option to create a Filter or to add a Filter Condition. As a filter must exist before an additional filter condition can be added, we select the first option from the drop-down menu. A pop-up window appears in which the filter condition of the created filter must be specified. As we only want to replicate entries with Lufthansa (LH) as flight carrier, we filter on the field CARRID for the value LH. Below you can find the configuration of the filter condition.


Specify Filter Condition window
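In plain terms, the condition entered in this dialog corresponds to the following selection. The values shown are the ones used in this example; the exact labels in the dialog may differ slightly depending on your SLT release.

```text
Table:      SFLIGHT
Field Name: CARRID
Operator:   = (equals)
Value:      LH
```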

After clicking on the green check mark icon, the filter is created with the condition and should look like the following.


Filter configuration window

In the next sections we will elaborate on the two options an SLT user has for filtering table entries under the Rule Assignment.

Rule Assignment filtering

As with the Performance Options, we need to add the table to the Rule Assignment section. To do so, the user right-clicks on the Rule Assignment section, clicks on Add Table, and enters the table name SFLIGHT in the pop-up window. After creating the entry, the LTRS overview should look like the following picture.


Rule assignment overview

To create a filter condition, the user needs to click on the sheet icon, which is in the middle of the different icons in the rule overview in the picture above. After clicking on the icon, the user is presented with two different rule options. The first rule option is called Event-Related Rule, the second is called Field-Related Rule. First, we will look at the Field-Related Rule configuration.

Field-Related Rule

After selecting the Field-Related Rule, a pop-up window is displayed in which the user can select the field to filter on. In this blog post we use the CARRID field and click on the green check mark icon to create the Field-Related Rule.


Field related rule creation

After the creation is completed, the user needs to specify information in the lower section of the window. The user can either include a script or enter ABAP code in the Line of Code section. As we only want to filter the entries of one column, we do not create a separate script but enter the code directly in the last section.

To interact with a column in the Field-Related Rule type, the user needs to add the prefix ‘E_’ before the name of the column. In the ABAP code, SKIP_RECORD is used to skip those records whose carrier is not LH. Our code therefore looks like the following:

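A minimal sketch of such a rule, assuming the ‘E_’ field prefix described above (the exact code in your system may look slightly different):

```abap
* Skip every record whose carrier is not Lufthansa (LH).
* E_CARRID addresses the CARRID column via the 'E_' prefix.
IF e_carrid <> 'LH'.
  skip_record.
ENDIF.
```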

To activate the filter, the user must change the status of the rule from New to Released. Afterwards, the rule should look like the image below.


Field Related rule overview

Event-Related Rule

The starting point for the Event-Related Rule is the same as for the creation of the Field-Related Rule. Instead of selecting Field-Related Rule in the drop-down window, we select the Event-Related Rule in this case. And instead of specifying a field, we specify in this rule an event at which the entries should be filtered.

The different events lead to different filtering behavior. Below I have added an overview of the different event options. As we want to filter the records of a single table, we select the BOR (Begin of Records) option.


Window for creation of Event-Related Rule

Start Load/Replication
	BOP – Begin of Processing
		Get next portion from source system
		BOT – Begin of Block
		LOOP AT source
			BOL – Begin of Loop
			MOVE-CORRESPONDING source TO target
			BOR – Begin of Records
			{Individual field mapping}
			EOR – End of Records
			EOL – End of Loop
		Write to target system
		EOT – End of Block
	EOP – End of Processing
End Load/Replication

For more details and examples on how to use event-related rules, see also the SLT system documentation. As we want to filter the records of a single table, we use the event Begin of Records (BOR), which is executed for every record.

For the replication of the table, the user needs to specify the details for the created rule. For the rule to take effect, the user needs to change its status to Released. To filter the records of the table, we need to add code in the Line of Code section. To access a column of the SFLIGHT table, the user needs to add a prefix. In the case of the Begin of Records and End of Records events, the prefix <wa_s_table>-columnname needs to be used. The code that is used is displayed in the following.

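A minimal sketch of such a BOR rule, assuming the <wa_s_table> prefix described above (the exact code in your system may look slightly different):

```abap
* Skip every record whose carrier is not Lufthansa (LH).
* <wa_s_table>-carrid addresses the CARRID column of the current record.
IF <wa_s_table>-carrid <> 'LH'.
  skip_record.
ENDIF.
```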

Overview of Event Related Rule settings

With the different possibilities discussed in this blog post, the user can filter certain records out of the replication in SLT. An important difference between the Performance Options and the Rule Assignment, where we used coding, is that with the Performance Options the data is filtered before the transfer to SLT, whereas with the Rule Assignment the data is first transferred to SLT and the rules are then applied to the transferred data. How the filters defined in SLT can be integrated with SAP Data Intelligence is discussed in the following sections.

SAP Data Intelligence

Integrate SLT configuration within SAP Data Intelligence

After we have prepared our SLT system with one of the filtering options, we will now focus on the creation of our replication pipeline. For this we will mainly use the Modeler within SAP Data Intelligence. In the created pipeline, we connect to the SLT system, replicate the table for which we defined a filter in the SLT system, and view the result of the replication within SAP Data Intelligence. For this, we need the following two operators:

  • SLT Connector
  • Wiretap

In the following sections the configuration of the two operators will be presented. The output of the replication will then be discussed in the last section.

SLT Connector

To trigger a replication from SAP Data Intelligence, the user needs to add the SLT Connector operator to the created pipeline. The SLT Connector can be found under the ABAP operators.


SAP DI ABAP operator overview

After adding the operator to the pipeline, the user can specify the connection to the SLT system and the version of the operator. When the user clicks on the entry ABAP Connection, a list of the already connected ABAP systems is shown. Select the SLT system in which you defined the filter for the table in the Advanced Replication Settings. To use the SLT Connector operator, the user also needs to specify the version of the operator.


SLT connector operator configuration

For the SLT Connector operator, different versions exist. A detailed overview of the different versions can be found here. One of the main differences is the type of the output.

The difference between versions V0 and V1 on the one hand and version V2 on the other is the output type: in versions V0 and V1 the output of the SLT operator is of type abap, while in version V2 it is of type message. For our replication, we use version V2 so that we do not need an ABAP Converter to display our result in a Wiretap operator.


SLT Connector operator version overview

After selecting the version, more fields that need to be specified are displayed in the operator settings. For the operator we need to specify a Subscription Type, Subscription Name, Mass Transfer ID, Table Name and Transfer Mode. For the Records per Roundtrip setting we keep the default of 50000. As we want to create a new subscription for the connection we use, we select the entry New for the Subscription Type. Alternatively, we could use the entry Existing, where an already existing subscription is used to connect to the data source. To tell SAP Data Intelligence which SLT configuration to access, the user needs to enter the respective Mass Transfer ID. The table that we want to replicate also needs to be specified, in our case the SFLIGHT table. As we also want to capture delta changes on the table, we select the Transfer Mode Replication. The following picture displays the overview of the configuration for the operator.


SLT connector operator settings
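Summarized, the settings used in this example are as follows. The values in angle brackets are placeholders for your own system's values:

```text
ABAP Connection:       <your SLT system connection>
Operator Version:      V2
Subscription Type:     New
Subscription Name:     <your subscription name>
Mass Transfer ID:      <ID of your SLT configuration>
Table Name:            SFLIGHT
Transfer Mode:         Replication
Records per Roundtrip: 50000
```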


To display our replication results, we use the Wiretap operator from the Utilities category (the operator in the orange rectangle). To use the Wiretap in the pipeline, simply drag and drop the operator into the pipeline.


SAP DI Utilities operator overview

In the next section, I will look at how the replication using SLT works in the pipeline.


After connecting the output of the SLT Connector to the input of the Wiretap, the created pipeline looks as displayed in the following picture.


SAP DI SLT Pipeline

After saving the pipeline and clicking on the Play icon to run it, the pipeline should run without any problems, as can be seen in the picture below when checking its status.


SAP DI Pipeline running

The result of our replication can be checked by clicking on the icon beside the Wiretap operator, which opens the output in a new tab. As we can see, the output of our replication contains only Lufthansa entries.


Wiretap operator output

Takeaways

SLT offers a huge amount of flexibility for the replication of data. As this blog post focused only on the different possibilities for filtering data, it is worth mentioning that SLT offers further replication settings that can be used for data replication. An overview of the different replication settings can be found on the following Help Portal site. I want to thank Britta Thoelking, Sarah Detzler and Daniel Ingenhaag for their support while writing this hands-on tutorial.

If you have feedback, feel free to write it in the comments section.


      Cesar Sanchez Gutierrez

      Hi Martin Boeckling ,

      Very useful blog Martin ! Thanks a ton:-)


      Thank you very much, I am following your example and I have a concern.

      I am using SLT to replicate data from SAP ECC to DI. Is there any way to identify whether a record I get from the replication corresponds to an INSERT or UPDATE in the source, or to somehow get the timestamp at which the change occurred in the source system, in this case SAP ECC?

      The destination where I have to keep these records is BigQuery, where I don't have the option to update (UPDATE or UPSERT), so I need to identify the last modification of each record.


      Kind regards




      Fardeen Sookun

      Hi Cesar,


      Can you please advise if you managed to differentiate the records (changes) from the source system? In the SAP Help Portal it is mentioned that when using the SLT Connector in Replication mode you should get an indication of the change type of the records.

      A screenshot of the extract is attached below. Can you please advise if you managed to solve this?


      Best Regards


      Armaan Singla

      Hi Martin Boeckling,


      We are trying to do the same and read the data using SLT, but we are stuck with an "error during init process" error in DI. In LTRC, the table is in the "Scheduled" status.

      We tried many options but are still not able to resolve the error.

      Could you please suggest if you have ever faced this issue before?

      Thanks for the help!




      Rajesh PS

      Martin Boeckling,

      Requesting you to please check the question below and revert.

      Thank you in advance! I appreciate your valuable inputs on the above question.