Modern SAP ERP Data Integration using Apache Kafka
The classic approach to SAP ERP data integration is to use one of the many available APIs: read the required data and load it somewhere else. There are many APIs and tools for this, each with its own pros and cons.
The S/4HanaConnector provided by rtdi.io goes a different route. Instead, it utilizes technologies built for Big Data to solve the problem in a flexible, convenient, and less intrusive way.
Step 1: Establish the connection
Because all ABAP tables are transparent tables in S/4Hana, the connection is made at the database level for performance reasons.
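A minimal sketch of what such a database-level connection configuration might look like. The key names and host value below are illustrative assumptions, not the connector's actual configuration format; only the jdbc:sap URL scheme is the real SAP HANA JDBC convention.

```json
{
  "connection": {
    "jdbcurl": "jdbc:sap://s4hana.example.com:30015",
    "user": "KAFKA_READER",
    "password": "********"
  }
}
```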
Step 2: Defining the ERP Business Object
As a consumer I would like to get Sales Orders, Business Partners, Material Master data, and the like. Therefore the first step is to define the scope of these objects and where their data comes from. This can be achieved in several ways:
- Using ABAP CDS views, which already join all tables that belong together.
- Using predefined Business Objects.
- Using the ABAP data dictionary to define the object scope.
For option #3, the most complex one, the connector provides a UI to define such Business Object entities.
It allows browsing through all ERP tables (here VBAK, which contains the sales order header data) and dropping a table onto the output pane. A sales order also consists of sales items, so the VBAK table is expanded to show all its relationships to child tables. Dropping VBAP on the output side adds it as a child, and the relationship as defined by SAP is added automatically.
Finally, the entire Business Object gets a name, "SAP_Sales_Order", and is saved.
This creates a JSON file with the structure definition of the Sales Order object.
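The generated file might look roughly like the sketch below. The exact schema is the connector's own and is not documented here, so the key names are illustrative assumptions; VBAK, VBAP, and the VBELN join column are the real SAP names.

```json
{
  "name": "SAP_Sales_Order",
  "root": {
    "table": "VBAK",
    "children": [
      {
        "table": "VBAP",
        "relationship": "VBAK.VBELN = VBAP.VBELN"
      }
    ]
  }
}
```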
Step 3: Configure the Producer
All that is left is assigning the Business Object defined above to a producer.
From now on, all S/4Hana changes are sent to Apache Kafka and are available for further consumption: every single field of VBAK and VBAP, delivered as one object.
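On the consuming side, each Kafka message then carries one complete sales order: the VBAK header fields plus a nested array of VBAP items. The sketch below shows how such a message might be processed; the payload shape is an assumption for illustration (the column names VBELN, AUART, KUNNR, POSNR, MATNR, and KWMENG are real SAP fields, but the values and the exact message layout are made up), and the Kafka consumption itself is omitted.

```python
import json

# Hypothetical payload of one Kafka message for the SAP_Sales_Order object:
# VBAK header fields with the VBAP items nested as a child array.
sample_message = json.dumps({
    "VBELN": "0000012345",   # sales document number (VBAK key)
    "AUART": "TA",           # order type
    "KUNNR": "0000100007",   # sold-to party
    "VBAP": [                # child items, joined on VBELN
        {"POSNR": "000010", "MATNR": "MAT-001", "KWMENG": 5.0},
        {"POSNR": "000020", "MATNR": "MAT-002", "KWMENG": 2.0},
    ],
})

def flatten_order(payload: str) -> list[dict]:
    """Turn one nested sales-order message into flat header+item rows."""
    order = json.loads(payload)
    header = {k: v for k, v in order.items() if k != "VBAP"}
    # Each row repeats the header fields alongside one item's fields.
    return [{**header, **item} for item in order["VBAP"]]

rows = flatten_order(sample_message)
print(rows[0]["MATNR"])  # -> MAT-001
```

Because the whole object arrives in a single message, a consumer never sees a header without its items; flattening like this is only needed when a downstream target expects relational rows.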
Simple, isn’t it?