
SAP Data Intelligence (3.1-2010): CSV as Target Using the SLT Connector Operator

Introduction:

SAP Data Intelligence provides a platform to orchestrate and integrate data.

SAP Data Intelligence offers a wide range of functionality, from simple file movement to real-time replication of data from multiple sources to multiple targets.

I have been working with SAP Data Intelligence for quite some time now and have worked on multiple aspects of it.

This blog post focuses on one piece of functionality of the ABAP SLT Connector operator: using a custom Python operator to write CSV files with headers.

 

Background:

In one of our customer projects we needed to replicate data from an ECC source system to Azure Data Lake Storage Gen2 (ADLS Gen2) via SLT. The customer requirement was to have the real-time replicated data in the data lake in CSV format.

Currently there is no direct provision in SAP Data Intelligence to generate CSV files with header information, other than building your own code in one of the languages supported by SAP Data Intelligence (Python, Go, JavaScript).

We chose to build this custom code in Python 3.

CSV Output with Default SLT Connector Operator

As you can see in the screenshot below, the SAP Data Intelligence SLT Connector operator by default generates CSV files without the actual column names. All CSV files carry generic column names such as C0, C1, and so on.
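As an illustration, such a file looks roughly like this (the row values are hypothetical; only the generic C0, C1, ... header pattern is the point):

```
C0,C1,C2
A001,Smith,New York
A002,Jones,Boston
```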

 

Note: We also tried generating JSON output, since it carries metadata such as column names and data types out of the box. However, JSON is quite an expensive approach from a performance point of view when dealing with high-volume tables (more than 50 million records) during the initial load.

 

ABAP SLT Connector:

The SLT Connector operator establishes a connection between SAP Landscape Transformation Replication Server (SLT) and SAP Data Intelligence.

The SLT Connector comes in different versions. Until a recent release, SAP Data Intelligence offered the V0 and V1 versions, where the output type is abap.

The recently released SLT Connector V2 version provides a message output.

 

SLT Connector V2 provides the flexibility to extract the body and the attributes separately from the input data.

The current SLT Connector operator can generate files in the below formats:

  • CSV
  • XML
  • JSON

 

Custom Python Code:

As explained earlier in this blog post, we were not getting the actual column names, so we created custom Python code that appends the headers to the CSV files being generated in the target, exploiting the functionality of the SLT Connector V2 version.

Basically, in the SLT V2 operator the message output has two sections:

  1. Attributes
  2. Data

In this Python code we extract the body and the attributes of the input message separately. The attributes of the input message contain the metadata, i.e. the column names, data types, and so on.

The attributes information looks something like this.
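In rough terms, the structure is the following. The `ABAP` key and the `Fields` list with a `Name` entry per column match what the code snippet below relies on; the other keys and the sample field names are illustrative assumptions only:

```python
# Illustrative shape of the message attributes (sample values are
# assumptions, not taken from a real system).
attributes = {
    "ABAP": {
        "Fields": [
            {"Name": "MANDT", "Kind": "C", "Length": 3},
            {"Name": "KUNNR", "Kind": "C", "Length": 10},
            {"Name": "NAME1", "Kind": "C", "Length": 35},
        ],
    }
}

# Extracting the column names, as the operator code does:
columns = [f["Name"] for f in attributes["ABAP"]["Fields"]]
print(columns)  # ['MANDT', 'KUNNR', 'NAME1']
```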

We simply extracted the column names from the metadata using the code snippet below and prepended them as a header to the body before writing the CSV file to the target.

import pandas as pd
from io import StringIO

def on_input(inData):
    # The message body carries the CSV payload without a header row
    data = StringIO(inData.body)

    # The message attributes carry the metadata (column names, types, ...)
    attr = inData.attributes
    ABAPKEY = attr['ABAP']

    # Collect the actual column names from the field metadata
    col = []
    for columnname in ABAPKEY['Fields']:
        col.append(columnname['Name'])

    # Parse the headerless CSV and re-emit it with the column names as header
    df = pd.read_csv(data, index_col=False, names=col)
    df_csv = df.to_csv(index=False, header=True)

    api.send("output", api.Message(attributes={}, body=df_csv))

api.set_port_callback("input1", on_input)
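The same transformation can be verified outside SAP Data Intelligence. The sketch below mimics the operator's input with a plain class (the `Message` stand-in and the sample values are illustrative assumptions, not part of the Data Intelligence API):

```python
import pandas as pd
from io import StringIO

# Minimal stand-in for the Data Intelligence message object, used only
# to exercise the header logic outside the pipeline.
class Message:
    def __init__(self, attributes, body):
        self.attributes = attributes
        self.body = body

def add_header(msg):
    # Column names from the metadata, then re-emit the CSV with a header
    col = [f["Name"] for f in msg.attributes["ABAP"]["Fields"]]
    df = pd.read_csv(StringIO(msg.body), index_col=False, names=col)
    return df.to_csv(index=False, header=True)

msg = Message(
    attributes={"ABAP": {"Fields": [{"Name": "KUNNR"}, {"Name": "NAME1"}]}},
    body="A001,Smith\nA002,Jones\n",
)
print(add_header(msg))
# KUNNR,NAME1
# A001,Smith
# A002,Jones
```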

 

CSV Output with Customized SLT Connector Operator

As you can see, the CSV now carries the actual column names coming from the source table.

 

Enhancements:

  • The code snippet works well for the Replication transfer mode. For the Initial Load transfer a few tweaks are required, because in the initial load the two additional columns Table_Name and IUUC_Operation are not present.
  • The files generated in the target follow the portion size created by SLT, which sometimes results in a huge number of files with a small chunk size (e.g. 1-2 MB). With a few enhancements to the code we can combine the portions and generate files of the size we require.
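A minimal sketch of the second enhancement, assuming the CSV bodies arriving per portion can simply be concatenated. The size threshold and the flush callback are illustrative assumptions, not part of the original operator code:

```python
# Sketch: buffer incoming CSV portions and flush them as one larger
# chunk once a size threshold is reached.
class PortionBuffer:
    def __init__(self, max_bytes, flush):
        self.max_bytes = max_bytes  # target chunk size in bytes
        self.flush = flush          # callback that writes one file
        self.parts = []
        self.size = 0

    def add(self, body):
        self.parts.append(body)
        self.size += len(body)
        if self.size >= self.max_bytes:
            self.flush("".join(self.parts))
            self.parts, self.size = [], 0

files = []
buf = PortionBuffer(max_bytes=10, flush=files.append)
buf.add("a,b\n")   # 4 bytes buffered
buf.add("c,d\n")   # 8 bytes buffered
buf.add("e,f\n")   # 12 bytes -> flushed as one combined chunk
print(files)       # ['a,b\nc,d\ne,f\n']
```

In a real operator, the flush callback would be the `api.send` call, and any remaining buffered portions would need a final flush when the graph shuts down.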

Note: Please visit this blog post by one of my colleagues for the enhanced scenario related to file sizing:

https://blogs.sap.com/2021/02/19/sap-data-intelligence-slt-replication-to-azure-data-lake-with-file-size-limit/

Summary:

We started with the challenges posed by the default SLT Connector operator offered by SAP Data Intelligence, explained the limitations of the earlier SLT Connector versions, and then showed how the latest version of the operator can be exploited to achieve the desired goal.

I hope I’ve shown you how easy it is to create custom code in SAP Data Intelligence on top of a base operator.

If you are interested in understanding how the process was carried out, or have ideas for the next blog post, please feel free to reach out to me in the comments section. Stay tuned!

For more information on SAP Data Intelligence, please see:

Exchange Knowledge: SAP Community | Q&A | Blogs
