By Britta Thoelking

ABAP Integration – Calling an ABAP function module within an SAP Data Intelligence pipeline


This blog post demonstrates how you can trigger an execution in a remote ABAP system, such as a Business Suite or SAP S/4HANA on-premise system, by implementing a custom ABAP operator. Whenever you have gained an intelligent insight that you would like to send back to the ABAP system, you can use ABAP operators to implement your custom logic.

This functionality is part of the ABAP Integration within SAP Data Hub. If you are not familiar with the overall concept of the ABAP Integration, please have a look at the overview blog for ABAP Integration.

Remark: For the purpose of this scenario, SAP Data Hub and SAP Data Intelligence can be treated exactly the same. For simplicity, I will mention only SAP Data Hub. If you would like to run this scenario with an SAP Data Intelligence system, the procedure is identical.


For SAP S/4HANA systems of release 1909 or higher, you are good to start – no installation required. If, however, you run this scenario with an SAP Business Suite system, you need to make sure that the non-modifying add-on DMIS 2018 SP02 (or DMIS 2011 SP17) is installed on that system.

Besides, you need to be able to establish an RFC connection from your SAP Data Hub system to the SAP system. Ideally, you have already created this connection via SAP Data Hub Connection Management. For more details on the connectivity, have a look at the following note: 2835207 – SAP Data Hub – ABAP connection type for SAP Data Hub / SAP Data Intelligence.

Use Case

In our example use case we have a file (rating.csv) that stores mass data for customer flight ratings. We want to read this data and write it to a custom table in our ABAP system. We will use standard SAP Data Hub functionality to read the file, but to store it in our custom table – and to do a simple calculation upfront – we will implement our own ABAP operator.

As you can see in the screenshot, we will make use of the following operators:

  • Read File: reads our S3 file rating.csv to get the customer flight ratings.
  • To String Converter: converts the message format coming from the Read File operator to a string format.
  • MY_OPER: our custom ABAP operator
  • Graph Terminator: terminates the pipeline once the execution within the custom ABAP operator has finished (as we do not want the pipeline to run forever)

To reduce manual activities to a minimum, there is a framework that supports you in creating all artifacts in the ABAP backend that are required for your own ABAP operator. The framework consists of two reports that must be executed in sequence: DHAPE_CREATE_OPERATOR_CLASS and DHAPE_CREATE_OPER_BADI_IMPL.
In the future this process will be simplified even further. For now, let’s get our “hands on” and see how things are working.


Implement the custom ABAP operator in the source system (ABAP system)

  1. Logon to the ABAP system and enter transaction se38.
  2. As report name, enter “DHAPE_CREATE_OPERATOR_CLASS” and click the execute button. Note: If your ABAP-based SAP system is not an SAP S/4HANA system, use report R_LTAPE_CREATE_OPERATOR_CLASS to create the operator implementation class.
  3. Now specify the parameters as below, giving your operator a suitable name (like “MY_OPER”) and execute the report.
  4. Note: The name specified here (in our case “MY_OPER”) is also the name that will be displayed later within the SAP Data Hub Modeler.
  5. Assign a package or choose “local object” in the popup that occurs.
  6. Enter se38 in the transaction field.
  7. Specify “DHAPE_CREATE_OPER_BADI_IMPL” as report name. Note: If your ABAP-based SAP system is not an SAP S/4HANA system, use report R_LTAPE_CREATE_OPER_BADI_IMPL to create the operator implementation class.
  8. Also here, provide the following parameters and execute the report. Note that the operator name needs to be the same everywhere.
  9. Assign a package or choose “local object” in the popup that occurs.
  10. Double-click on the “Implementing Class” to show the methods that the class contains.
  11. Double-click on the GET_INFO method.
  12. Add or remove inports, outports and parameters according to your needs. Remember: inports and outports are used to connect the operator to other operators to form a data pipeline, whereas parameters are used to influence the execution of the logic encapsulated within an operator. For our case we need one inport, one outport and no parameters.
  13. Now go back and double-click on the PROCESS method and on the next screen enter the local process method (lcl_process).
  14. Inside the process method, you can implement the actual logic that you would like to execute within your custom ABAP operator. A simple event-based model is offered here that allows you to implement one or more of the following methods:

    • ON_START: Called once before the graph is started.

    • ON_RESUME: Called at least once before the graph is started or resumed.

    • STEP: Called frequently.

    • ON_SUSPEND: Called at least once after the graph is stopped or suspended.

    • ON_STOP: Called once after the graph is stopped.

    Additionally, the abstract class provides several helper methods that you can use:

    • GET_PORT: Get an input or output port.

    • GET_CONF_VALUE: Get a value from the process configuration properties.

  15. As you can see, there is already some sample code offered, which we of course need to adapt for our scenario. The first thing we need to change is line 27. As we do not use any parameters within this operator, we comment out this line of code.
  16. Now we need to include the call of our function module. In our case, we will call the custom function module Z_STORE_RATING_00. The function module call needs to be part of the step event. Of course, this custom function module is just an example. You can call any function module here, or do anything else that you can think of.

    Note: You might want to check this documentation on how to create ABAP function modules.

  17. Save and activate the changes.
  18. In case you are interested, let’s have a brief look at a code snippet of our function module Z_STORE_RATING_00. There are a lot of things going on inside this function module; what I’d like to highlight is that we perform (after a number of calculations and accumulations of the ratings) an update on a custom table. This is important if we later want to verify whether the execution was successful.
  19. Via se16, we can see the current state of the custom table ZRATING_00. It is empty at the moment except for the three carrier IDs:
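To make steps 12 to 16 a bit more concrete, here is a minimal, illustrative sketch of what the adapted local process class might look like. The event methods (ON_START, STEP) and the GET_PORT helper come from the generated template described above, but the exact reader/writer method names on the port objects and the signature of Z_STORE_RATING_00 are assumptions for illustration only – use the code generated by DHAPE_CREATE_OPER_BADI_IMPL as your actual starting point.

```abap
* Illustrative sketch only: port reader/writer method names and the
* signature of Z_STORE_RATING_00 are assumptions, not the generated API.
METHOD on_start.
  " Fetch the inport and outport that were declared in GET_INFO
  mo_in  = get_port( 'in' ).
  mo_out = get_port( 'out' ).
ENDMETHOD.

METHOD step.
  DATA lv_data TYPE string.

  " Read the CSV payload arriving from the To String Converter
  IF mo_in->has_data( ) = abap_true.
    mo_in->read_data( IMPORTING ev_data = lv_data ).

    " Hand the ratings over to our custom function module
    CALL FUNCTION 'Z_STORE_RATING_00'
      EXPORTING
        iv_data = lv_data.

    " Forward a confirmation so the Graph Terminator can stop the graph
    mo_out->write_data( 'DONE' ).
  ENDIF.
ENDMETHOD.
```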

Great. The implementation of the custom ABAP operator in the ABAP system is done!
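For reference, a heavily simplified, hypothetical sketch of what a function module like Z_STORE_RATING_00 could look like. The CSV layout, the field names and the structure of ZRATING_00 are assumptions; the original function module performs more elaborate calculations.

```abap
FUNCTION z_store_rating_00.
*"  IMPORTING
*"    VALUE(iv_data) TYPE string
  " Split the CSV payload into lines (layout assumed: carrid;rating)
  SPLIT iv_data AT cl_abap_char_utilities=>cr_lf INTO TABLE DATA(lt_lines).

  " Assumes ZRATING_00 has a key field CARRID
  DATA lt_result TYPE SORTED TABLE OF zrating_00 WITH UNIQUE KEY carrid.

  LOOP AT lt_lines INTO DATA(lv_line).
    SPLIT lv_line AT ';' INTO DATA(lv_carrid) DATA(lv_rating).
    " ... accumulate the ratings per carrier (details omitted) ...
  ENDLOOP.

  " Persist the accumulated ratings in the custom table
  MODIFY zrating_00 FROM TABLE lt_result.
  COMMIT WORK.
ENDFUNCTION.
```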

Before you can actually work with the custom ABAP operator in SAP Data Hub, make sure, that the operator is whitelisted accordingly. Whitelisting is documented in the Security Guide attached to SAP note 2831756.

Implement the SAP Data Hub pipeline

Now we are actually ready to build the pipeline which uses among other operators also our MY_OPER operator.

  1. Build a pipeline like the one in the picture below by dragging and dropping the operators accordingly.
  2. We will start with the Read File operator. Be careful when linking the Read File operator to the To String Converter operator: make sure that the outFile port is linked to the inInterface port.
  3. Configure the Read File operator accordingly, specifying the service and connection to the S3 system as well as providing the bucket and file path of the rating.csv file. In our case the file lies in bucket “bucket1” within folder “abap”:
  4. The To String Converter operator does not require any specific configuration – it just works out of the box.
  5. For the custom ABAP Operator we need to drag and drop the “SAP ABAP Operator” into the workspace.
  6. Now we open the configuration of this operator and provide the connection to our SAP system. If we have already defined the connection in SAP Data Hub Connection Management, we can simply reuse it; if not, we can create it manually. Furthermore, we choose “MY_OPER” from the list of available operators. This step makes the operator automatically adapt to its implementation in this specific ABAP system, which means that the inport(s), outport(s) and parameter(s) are loaded accordingly. Consider that you potentially have your SAP Data Hub system connected to multiple ABAP systems of different releases, with different sets of standard ABAP operators and potentially also of custom operators.
  7. Link the custom ABAP operator to the To String Converter operator.
  8. Like the To String Converter, the Graph Terminator operator works out of the box – no configuration required here.
  9. Save your changes.

Execute the SAP Data Hub pipeline

Before we execute the pipeline, let’s have a quick look at our rating.csv file.

To access the file, we use a so-called MinIO Browser, which – as the name already tells – allows us to browse our S3 bucket for files and folders:

There are also other files available. Just ignore them :).

The data inside the rating.csv file looks like this:

We have hundreds of carrier ratings – one rating represented by a single record. The rating consists of different values (between 1 and 5) for different categories, such as service or comfort. They will later be processed by our function module and updated in our custom table in the ABAP system.
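To illustrate the shape of the file, a few hypothetical records: the values shown here are made up, only the structure matches the description above (one record per rating, category values between 1 and 5, semicolon-separated is an assumption).

```
CARRID;SERVICE;COMFORT
AA;4;5
LH;3;4
UA;5;2
```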

Now coming to the execution:

  1. To start the execution, go back to the Modeler and push the execute button.
  2. After the pipeline has completed, let’s go back to the ABAP system and display table ZRATING_00 via se16. We should see that the table records got updated. So after the execution of our pipeline, the custom ABAP operator has updated ZRATING_00 with the accumulated rating information from the rating.csv file in S3 – cool!


Thank you for reading this blog. Feel free to try it on your own and share your feedback with us.

      Arunkumar M

      Great job, well done Britta Thoelking

      Vijay Kamath

      Excellent blog Britta.


      I would like to know how to debug the ABAP code written in the BAdI implementation. I have tried with external breakpoints, but it does not seem to work. I'd appreciate it if you could provide some details on this.

      Serdar Simsekler

      Hi Britta Thoelking

      Thanks for the detailed blog. If we have a requirement to write data into a Business Suite system where upgrading DMIS add-on is not an option, would we have any options?

      For SAP DI to read data from the Business Suite system, it seems we can have SLT to be in between the Business Suite system and SAP DI. For pushing data from SAP DI to the Business Suite system, would that mean we can only execute functions in the SLT system?

      Lei Feng

      Hi Simsekler,

      If I understand correctly, your landscape looks like: source system -> SLT system -> DI.

      From a technical perspective, you can expose an RFC from your source system, then create a customized operator in the SLT system and, in that customized operator, call the RFC exposed by the source system.

      Your SLT server definitely has the DMIS add-on and therefore the ability to create customized operators.
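      [Editor's note] This suggestion boils down to a remote-enabled function call from the SLT-based custom operator, sketched below with placeholder names – 'SOURCE_RFC_DEST' would have to exist as an RFC destination in SM59, and Z_MY_REMOTE_FM would have to be a remote-enabled function module in the source system.

      ```abap
      " Placeholder names: destination and FM are hypothetical examples.
      CALL FUNCTION 'Z_MY_REMOTE_FM'
        DESTINATION 'SOURCE_RFC_DEST'
        EXPORTING
          iv_data = lv_data.
      ```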

      Thorsten Schneider

      Hi Britta,

      I noticed during implementing that one of the DMIS reports you mentioned (R_LTAPE_CREATE_OPER_BADI_IMPL_CLASS) is actually called R_LTAPE_CREATE_OPER_BADI_IMPL without the _CLASS.

      Vijay Kamath: The most common issue with external breakpoints not working is that they are by default created for the user who sets the breakpoint. So it will only jump into the debugger when the code runs under the user who created it (e.g. KAMVI or so). You need to set the breakpoint for the user stored in the connection in Data Intelligence. But beware that this will then probably also affect other users.



      Martin Donadio

      Hi Britta,


      "In the future this process will be simplified even further. For now, let’s get our “hands on” and see how things are working."


      Is there any update regarding this process ?








      Martin Donadio

      Hi Britta,

      I know that the point of this blog is to show how to call an ABAP FM from Data Intelligence, but in the example you use, I see that the file is written into the table without any additional transformation.


      For this case, could we use a HANA operator in DI to write data directly into the ABAP managed HANA table ?





      Rajesh Kumar Katkoori

      Hello Britta,

      We are on the S4 FPS 2 version. In program DHAPE_CREATE_OPERATOR_CLASS a new parameter 'Protocol Version' has been added. It looks like this is a mandatory field. It has possible values for the sub-engine version of v6 and v7.

      Could you please advise how to determine which value we need to choose? Basically, how do we determine the DHAPE_GRAPH_OPERATOR sub-engine version?

      Also, is there any possibility to delete or rename Custom ABAP Operator after creation? If yes, what is the process?





      Rajesh PS

      Britta Thoelking



      Is it possible to call an ABAP proxy from SAP DI?

      Rajesh PS

      Britta Thoelking  / Vijay Kamath / Martin Donadio / Thorsten Schneider / Lei Feng

      G'day! Requesting you to please check the question below and revert.

      Thank you in advance! I appreciate your valuable inputs on the above question.