ABAP Integration for SAP Data Hub and SAP Data Intelligence – Overview Blog
ABAP Integration – the thing to connect the big data world with SAP data
Three years ago, SAP announced a new product called SAP Data Hub to help customers bring together data from business systems with data from the big data world. Initially, the focus was on connectivity to SAP HANA and SAP BW; we have since expanded the footprint to every ABAP-based SAP system and, in the context of SAP Data Hub and SAP Data Intelligence, simply call it ABAP Integration.
The idea of this blog series is to share unique insights into these new capabilities and to explain certain scenarios in more detail. We also want to use this platform for fruitful interaction and discussion with you.
When I say we, I mainly mean Britta Thoelking and myself, Tobias Koebler. We both belong to the SAP Data Hub / SAP Data Intelligence product management team with a background in ABAP, especially SAP LT Replication Server, aka SLT 😉
For all of you not too familiar with SAP Data Hub / SAP Data Intelligence, feel free to have a first look at it here:
- use the search 😉
Probably the first question you have is: “Why is he writing about both SAP Data Hub and SAP Data Intelligence, and what is the difference?”
Just a quick answer to that: SAP Data Hub is the on-premise shipment, which was released around three years ago. This year at TechEd we announced the full service offering of SAP Data Hub, which we call SAP Data Intelligence. On top of that, there are also some apps for ML and data science scenarios. For more details, check the links above.
ABAP Integration works almost identically for SAP Data Hub and SAP Data Intelligence. To keep things simple for us as authors, we will just write SAP Data Hub (but of course everything we describe also applies to SAP Data Intelligence).
ABAP Integration in a nutshell
The idea of ABAP Integration is to establish a unified model that consolidates all interaction scenarios between SAP Data Hub and an ABAP-based SAP system (uni-directional and bi-directional).
Certain capabilities can be derived from that interaction model:
- Provide ABAP METADATA to the SAP Data Hub Metadata Explorer
- ABAP DATA PROVISIONING to transfer data into SAP Data Hub or SAP Data Intelligence
- ABAP FUNCTIONAL EXECUTION that is triggerable as an operator in SAP Data Hub or SAP Data Intelligence
ABAP METADATA
SAP Data Hub comes with comprehensive features for discovering, managing, adjusting, and cleansing metadata from different sources. All metadata can be managed in the Metadata Explorer. In the past, however, this functionality did not cover data from ABAP-based SAP systems. Now metadata from business systems (like SAP S/4HANA) as well as older systems (like SAP ERP) can be made available to the Metadata Explorer, enabling users to get insights into this data as well. This means that you can see, for example, details about a table (fields, technical names, component, etc.) directly in SAP Data Hub.
ABAP DATA PROVISIONING
Getting access to and using real business data in an SAP Data Hub pipeline helps you to build new intelligent applications and data flows. For example, you may want to obtain the replication data from an SAP S/4HANA system, enrich this data in an SAP Data Hub pipeline, and then feed it to SAP HANA, a file storage in the cloud or any other application.
ABAP Data Provisioning gives you access to SAP S/4HANA and allows you to consume ABAP CDS views directly in a pipeline. ABAP CDS is the semantically rich data model in SAP S/4HANA and allows the consistent representation of a business object (such as a business partner). It is possible to get this data just as an initial load, but also to establish a streaming approach to consume every update, insert, and delete that happens in the SAP S/4HANA system.
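To make the initial-load-plus-delta idea concrete, here is a minimal, hypothetical Python sketch of what a downstream consumer of such a stream has to do: apply inserts, updates, and deletes to a local copy of the data. The record layout (the `change_kind` flag and `partner_id` key) is an assumption for illustration only, not the actual output format of the ABAP CDS Reader.

```python
# Hypothetical sketch: maintaining a local copy of a business object
# from an initial load plus a stream of delta records. The field names
# (change_kind, partner_id) are illustrative assumptions, not the real
# wire format of the ABAP CDS Reader operator.

def apply_cds_records(state, records):
    """Apply initial-load and delta records to an in-memory store.

    state   -- dict mapping partner_id -> latest record
    records -- iterable of dicts with a 'change_kind' field:
               'I' (insert), 'U' (update), 'D' (delete)
    """
    for rec in records:
        kind = rec["change_kind"]
        key = rec["partner_id"]
        if kind in ("I", "U"):
            state[key] = rec          # insert or overwrite
        elif kind == "D":
            state.pop(key, None)      # tolerate deletes for unknown keys
        else:
            raise ValueError(f"unexpected change kind: {kind!r}")
    return state


# Example: an initial load followed by one delta batch.
store = {}
apply_cds_records(store, [
    {"change_kind": "I", "partner_id": "BP01", "name": "ACME"},
    {"change_kind": "I", "partner_id": "BP02", "name": "Globex"},
])
apply_cds_records(store, [
    {"change_kind": "U", "partner_id": "BP01", "name": "ACME Corp"},
    {"change_kind": "D", "partner_id": "BP02"},
])
print(sorted(store))   # ['BP01']
```

The key point is that a delta stream only stays consistent if the consumer applies every operation type, including deletes; dropping them silently would leave stale records behind.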
For older releases, there is a table-based replication approach available to consume the data within a pipeline. To access even releases lower than SAP NetWeaver 7.00, the SAP LT Replication Server can be connected and will feed data in real time into SAP Data Hub. Existing SAP LT Replication Server deployments can be reused to lower the impact and footprint of an additional system.
ABAP FUNCTIONAL EXECUTION
In certain scenarios, it is required to enhance the scope of a data-driven application by accessing and writing data into an SAP S/4HANA or SAP NetWeaver 7.00 (or higher) system. For example, it may be necessary to execute a function module or BAPI within a pipeline to read data into SAP Data Hub, post information into an ABAP-based SAP system, or trigger an execution in the remote system. If you have a requirement of this type, you can now create your own operator in SAP Data Hub that references the corresponding ABAP functionality.
Will you now get ABAP operators for a pipeline?
The answer is YES! SAP Data Hub 2.7 and SAP Data Intelligence 1909 come with pre-delivered ABAP operators that are ready to use. You can find a full list of ABAP operators on the help page by navigating to Repository Object References: https://help.sap.com/viewer/p/SAP_DATA_HUB
We would like to highlight the three main operators; you can get more insight into their behavior in the linked blog posts.
ABAP CDS Reader
ABAP CDS is the semantically rich data model in SAP S/4HANA and allows the consistent representation of a business object like a business partner. The ABAP CDS READER operator allows the replication of SAP-delivered and customer ABAP CDS views in initial load and delta mode. ABAP CDS view replication is only available for SAP S/4HANA 1909 and SAP S/4HANA 1911 CE or higher.
You can find more details in the following blog post: ABAP Integration – Replicate ABAP CDS views via SAP Data Hub
SLT Connector
The SLT CONNECTOR operator establishes a communication channel to the remote ABAP system and consumes data (initial load and delta) via SAP LT Replication Server technology. This allows the consumption of business data directly within a Data Hub pipeline, leveraging a tight integration between business data and big data. It also enables data consumption from lower SAP NetWeaver (<7.00) and SAP S/4HANA (<1909) releases.
You can find more details in the following blog post: ABAP Integration – Replicating tables into SAP Data Hub via SAP LT Replication Server
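Inside a pipeline, a common pattern after such a table-based replication step is to route records by their operation type, for example sending inserts and updates to one target and deletes to another. The following Python sketch shows that grouping step; the `operation` field and its `I`/`U`/`D` values are hypothetical stand-ins, not the actual SLT Connector output format.

```python
# Hypothetical sketch: splitting a batch of replicated table records
# into per-operation groups, e.g. to route inserts/updates to one
# downstream target and deletes to another. The 'operation' field and
# its 'I'/'U'/'D' codes are illustrative assumptions.

from collections import defaultdict

def split_by_operation(records):
    """Group records by their replication operation code."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["operation"]].append(rec)
    return dict(groups)


batch = [
    {"operation": "I", "table": "MARA", "key": "M-100"},
    {"operation": "D", "table": "MARA", "key": "M-200"},
    {"operation": "U", "table": "MARA", "key": "M-100"},
]
groups = split_by_operation(batch)
print(sorted(groups))   # ['D', 'I', 'U']
```

Routing by operation type keeps each downstream writer simple: an upsert target never has to reason about deletes, and vice versa.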
Custom ABAP Operator
CUSTOM ABAP OPERATORS can be used to enhance the scope of a data-driven application. For example, it may be necessary to execute a function module within a pipeline to read data into SAP Data Hub, post information into an ABAP-based SAP system, or trigger an execution in the remote system.
You can find more details in the following blog post: ABAP Integration – Calling an ABAP function module within a SAP Data Hub pipeline
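Operators in a pipeline generally follow a port-callback pattern: the runtime injects an `api` object, the script registers a callback for an input port, and results are sent to an output port. So the sketch below runs standalone, it mocks a minimal `api` object; the port names, the message payload, and the mock itself are assumptions for illustration, not the real operator runtime.

```python
# Standalone sketch of the port-callback pattern used by scriptable
# pipeline operators. In a real pipeline the 'api' object is injected
# by the runtime; here we mock just enough of it (set_port_callback /
# send) so the example runs on its own. Port names and the message
# payload are hypothetical.

class MockOperatorApi:
    """Minimal stand-in for the operator runtime."""
    def __init__(self):
        self._callbacks = {}
        self.sent = []                      # recorded (port, data) pairs

    def set_port_callback(self, port, fn):
        self._callbacks[port] = fn

    def send(self, port, data):
        self.sent.append((port, data))

    def deliver(self, port, data):
        """Test helper: simulate a message arriving on an input port."""
        self._callbacks[port](data)


api = MockOperatorApi()

def on_input(data):
    # Imagine 'data' is the result of a remote function-module call;
    # this callback just tags it and forwards it downstream.
    api.send("output", {"source": "abap", "payload": data})

api.set_port_callback("input", on_input)

# Simulate one message arriving on the input port.
api.deliver("input", {"EV_RESULT": "OK"})
print(api.sent[0][0])   # output
```

The mock makes the control flow visible: the operator script itself is passive and only reacts when the runtime delivers a message to a registered port.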
How to get the software
We created a dedicated blog post to share some more insights into the software delivery and installation aspects. You can find it here: ABAP Integration – Software delivery and Installation aspects
Related blog posts and SAP Notes:
- ABAP Integration – Software delivery and Installation aspects
- ABAP Integration – Replicate ABAP CDS views via SAP Data Hub
- 2814951 – SAP Data Hub/ Data Intelligence ABAP Integration – DMIS 2011 SP17/ DMIS 2018 SP02
- 2830276 – SAP Data Hub/ Data Intelligence ABAP Integration – S4 OP1909
- 2835207 – SAP Data Hub – ABAP connection type for SAP Data Hub/ Data Intelligence
- 2849542 – ABAP Connection Check with Data Hub 2.7/Data Intelligence 1909 or higher
We hope that you got some valuable insights and a first overview of the topic. Feel free to share your opinion and bother 😉 us with questions. We are happy to help you and to move this topic to the next level.
Thanks for reading,
Britta & Tobias