
Data Intelligence: ABAP, SLT and More

Introduction

SAP Data Intelligence as a platform offers a lot of holistic features, especially for data scientists and data analysts. It goes without saying that there is much to explore in this tool; however, I wouldn’t want to digress from the main agenda of this blog.

I have been working with the ABAP connectivity, ABAP operators and SLT replication to DI for a while now, and the functionalities often don’t work as they are supposed to. Let’s be honest, as Murphy’s Law states: ‘Whatever can go wrong, will go wrong’.

My main objective in writing this blog is to look into the connectivity aspect, especially how DI connects with ABAP systems (the ABAP stack) and how different operations are performed. This is a technical write-up that deals with some issues I have faced, how to deal with them, and how to troubleshoot the ABAP side of the Data Intelligence story.

 

ABAP Connectivity to DI

From DI version 3.0 onwards, DI also offers the option of connecting to ABAP systems via SAP Cloud Connector. However, before I continue, let’s take a step backwards.

ABAP systems are usually hosted in an on-premise network (usually with its own firewall settings). DI as a platform, from a technical point of view, runs on a Kubernetes cluster. Such clusters can be hosted in an on-premise network (own Kubernetes installation) or in a cloud environment (GCP, AWS, Azure etc.). SAP Cloud Platform also provides Data Intelligence as a service which can be enabled.

Connectivity is relatively straightforward when both DI and the ABAP system are in the same environment (either both on-premise or both in an open cloud environment).

However, communication between an ABAP system hosted in an on-premise network and a DI cluster hosted in the cloud isn’t as straightforward as providing a host name and user details, because of the firewall settings of the on-premise network.

SAP Data Intelligence running on SCP can connect to an on-premise ABAP system via SAP Cloud Connector (from DI version 3.0 onwards) or via a VPN.

If DI is hosted on another cloud provider, GCP for example, connectivity to an ABAP system can be managed via a VPN.

There are a few good blogs on ABAP connectivity, so I won’t go into too much detail; this blog on connecting the dots literally sums up the whole picture.

There are a couple of points which still need to be checked, like the Support Package level (in case the system is not an S/4HANA system), or the version of the S/4HANA system and the SAP Notes installed.

In case you are facing issues connecting to an ABAP system, please check the function module DHAMB_SERVICE_SYSTEM.

Each connectivity request from DI to an ABAP system, irrespective of HTTPS / WebSocket RFC / RFC, comes through here first. If you need to check whether a connection is established, put a break-point in this FM and troubleshoot from there. If the debugger window opens up, at least that gives us some relief, a faint hope, like the light at the end of the tunnel.

We found this particularly interesting since we struggled to connect an S/4HANA 1909 system, and after debugging we figured out that the FM DHAMB_SERVICE_SYSTEM was raising an exception due to a Move Cast error.

In case someone is facing this issue: this bug was fixed with TCI note 2873666 for S/4HANA 1909.

 

ABAP SLT Operator

The blog has been relatively slow until now, and I myself had to pour in a lot of caffeine to keep going. Let’s spice things up a bit.

DI provides a couple of pre-configured ABAP operators and also gives developers the power to create custom ABAP operators with custom logic to meet specific business requirements.

One such built-in operator is the ABAP SLT Operator, which can transfer data from ABAP systems to DI in three separate modes: Initial Load, Delta and Replication. As the DI platform has evolved, the SLT operator has evolved with it, and it now comes with some pretty cool features:

  • Replication modes, which were missing in previous versions of the operator
  • A resilience feature (resuming delta loads from the point where a DI graph failed)
  • Control over the size of the data portion sent from ABAP to DI per transfer

 

As of today, the latest working version of the SLT operator is SLT Connector V1 (com.sap.abap.slt.reader.v1).

The latest versions of the available SAP operators can be leveraged by using the generic SAP ABAP Operator.

In the configuration of the operator, choose your ABAP connection and press F4 on the ABAP Operator field, which will give you a list of all the available SAP ABAP operators:

From here, select SLT Connector V1. After this step, the ABAP Operator will automatically generate all the parameters of this operator.

More details around the operator can be found under the documentation of this operator.

Now lets dig deeper into this operator.

From the ABAP side of the story, each operator is a BAdI implementation of one of the following BAdIs:

  • BADI_LTAPE_ENGINE_OPERATOR (ECC systems) or
  • BADI_DHAPE_ENGINE_OPERATOR (S/4HANA)

and of one of the following enhancement spots:

  • ES_LTAPE_ENGINE (ECC systems) or
  • ES_DHAPE_ENGINE (S/4HANA)

The enhancement spot implementations give a list of the already created SAP and custom implementations. Since my screenshots are from an ECC system, they show implementations following the LTAPE* naming; in an S/4HANA system, the naming would follow the DHAPE* convention.

The BADI implementation LTAPE_OPER_SLT_READER_V1 corresponds to the ABAP SLT operator.

 

This BAdI implementation uses the class CL_IUUC_LT_OPER_SLT_READER_V1 to execute the operator.

In general, the classes for the ABAP operators have two main methods:

  • GET_INFO – holds the information about the operator: the in-port and out-port in DI, the parameters, the name of the operator, etc.
  • NEW_PROCESS – where the whole logic of the operator is implemented.
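To make the pattern concrete, here is an illustrative skeleton of such a BAdI implementation class. This is a sketch only: apart from the GET_INFO and NEW_PROCESS method names taken from the blog, the interface name, structure fields and operator name below are assumptions, so check an SAP-delivered implementation (e.g. LTAPE_OPER_SLT_READER_V1) in your own system for the exact signatures.

```abap
" Illustrative skeleton of a custom ABAP operator implementation.
" ASSUMPTION: interface name and RS_INFO fields are placeholders -
" compare with the SAP-delivered implementations in your release.
CLASS zcl_my_custom_operator DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_badi_dhape_engine_operator.   " assumed BAdI interface name
ENDCLASS.

CLASS zcl_my_custom_operator IMPLEMENTATION.

  METHOD if_badi_dhape_engine_operator~get_info.
    " Describe the operator: technical name, ports and parameters.
    rs_info-name = 'com.example.my_custom_operator'.  " hypothetical name
    " ... register in-port, out-port and parameters here ...
  ENDMETHOD.

  METHOD if_badi_dhape_engine_operator~new_process.
    " The whole runtime logic of the operator lives here, typically
    " delegated to a local process class with ON_START / ON_RESUME
    " style callbacks, as described for the SLT reader below.
  ENDMETHOD.

ENDCLASS.
```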

Troubleshooting the SLT Operator

The SLT operator generally behaves pretty well and is quite stable, but when you are working with it on a daily basis, an error like the one below can become a common sight (depending on how much you want to explore with the data in the graph for post-processing etc.):

 

In case the operator doesn’t work after a few tries, the first and most common issue to check is a stale entry in the table IUUC_LT_TABLES.

This table holds entries for the currently running SLT tables, and sometimes, after graph failures, previous unsuccessful executions or debugging sessions, an entry might still be left behind in this table.

In case you see an entry for a table which isn’t running , please delete it.

Of course you might still question why this entry should be deleted, so let’s debug a situation where an entry in this table causes an issue.

As mentioned before, the basis of this operator is the class CL_IUUC_LT_OPER_SLT_READER_V1.

Now let’s look inside the local class used by the method NEW_PROCESS, which has several methods like ON_START and ON_RESUME. In this particular case, the issue we are discussing is inside the method ON_START, in the part of the code below (method GET_GRAPH_BY_TABNAME).

I couldn’t gather more screenshots of the response, but if this method finds an entry in the table, it returns it and an exception is raised.

The exception is genuine SAP coding with legitimate reasoning, namely to avoid multiple executions of the same graph, but there are times when an entry isn’t cleaned up automatically and this has to be done manually.

You can create a small ABAP routine to delete the specific record, or you can follow the steps below (in case you aren’t scared of an audit and of course have the relevant authorization):
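For the routine option, a minimal sketch could look like the report below. It assumes the key field of IUUC_LT_TABLES is called TABNAME; verify the actual field names in SE11 before running anything like this, and only ever delete entries you have confirmed are stale.

```abap
REPORT zdi_slt_stale_entry_cleanup.

" Deletes a stale SLT reader entry for a given table.
" ASSUMPTION: the key field is called TABNAME - verify in SE11 first.
PARAMETERS p_tab TYPE tabname OBLIGATORY.

DELETE FROM iuuc_lt_tables WHERE tabname = p_tab.
IF sy-subrc = 0.
  COMMIT WORK.
  WRITE: / |Deleted { sy-dbcnt } stale entry/entries for { p_tab }|.
ELSE.
  WRITE: / |No entry found for { p_tab } - nothing to do|.
ENDIF.
```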

Open the table, select the entries to be deleted, type /h in the transaction field, and press the glasses symbol (to view the data).

This will open the debugger. Continue the execution until you reach the relevant form.

Inside this form, enter OK_CODE in the variable view, change its value to DEL5 and press F8 (let the execution finish). This should bring up a confirmation screen asking you to confirm the deletion of the entries, as below:

As already mentioned, this isn’t an SAP-standard way and should only be used as a workaround to quickly mitigate the issue ;).

There could be more issues which I haven’t encountered yet, but these can then be debugged further in the class discussed in this blog.

The output sent by the V1 operator comes from the OUTPUT_DATA method, which provides two components, METADATA and DATA, which are self-explanatory.

Below is a screenshot of SFLIGHT data, 9 records and 14 columns of the table structure, sent to DI.

 

No Transfer of Data: Long-Running Graphs

There can be situations where the graph is running in DI but hasn’t received any data for a long time. The best way to check is to go into the ABAP system, open transaction LTRC and check whether there are any errors.

In case you encounter a problem like the one below in your LTRC configuration, stating that the migration object already exists:

Please go to table DMC_COBJ, find the specific migration object and delete it.

In case your LTRC configuration shows a data transfer situation like the one below (no records inserted, 0 finished and no processes in execution):

This would mean either that your background processes are compromised or that the maximum limit of work processes has been reached. Please check SM50 and, in case you see unwanted background processes, kill them.

 

What Next

There is so much more that I want to discuss and talk about, but to be honest I don’t want to over-stretch this blog. I hope I can make more time and write more about the DI side of the story in the coming weeks.

I hope this blog can be of some use to someone, somewhere, someday 🙂! Have a nice time ahead, and thank you in case you managed to survive this ;).

 

3 Comments
  • Hi Pranav, thanks for sharing!

    I just tried to configure SAP Data Intelligence, cloud edition, with an on-premise SLT configuration. I did it using the Cloud Connector and so far, so good. The problem is that there apparently is a step missing, because I have the job running for a long time in SLT, but no data is transferred to DI. Have you seen something like this?

    Thanks again!
    Tanabe

    • Hi Tanabe,

      This depends on your configuration. How have you connected the on-premise system to SLT? Is it via SAP CC?

      Maybe there is some whitelisting related issues?

      In the SLT configuration, do you see the number of processes running and finished? It could be that the system is compromised and there are no background jobs available for running the processes.

       

      Best Regards,
      Pranav

  • Hey Pranav

    I had the issue you discussed in this blog. I deleted the entries in the table and kicked off the pipeline again; it ran, loaded the initial load, and then the pipeline goes into error/dead again. I get the following issue:

    Are there other tables that need to be cleared as well?

     

    Thanks

    Mia
