Technical Articles
SAP BTP ABAP Environment – Setup Data Replication with SLT
Overview
One of the main scenarios of the SAP BTP ABAP Environment ("Steampunk") is to build side-by-side extensions to ERP core applications. This side-by-side approach requires data from the core application. To read and write this data you can use external APIs. If the data is required more frequently, data replication might also be a valid option. The recommended way to replicate on-premise custom tables (ERP core application) to Steampunk is via SLT, using the SLT SDK and a custom-defined Steampunk inbound RFC.
This approach is supported with the current Steampunk release and can be used immediately.
Reasons for this recommendation:
- SLT is a well-established and widely used tool for replication of mass data
- SAP DI strategy recommends SLT for ABAP to ABAP replication, also in the long term
- There are custom SLT installations with more than 30 million transferred data changes per hour (numbers for a Steampunk setup are not yet available / to be tested)
- Replication into Steampunk (application-level) instead of HANA Cloud (DB-level, below Steampunk) avoids lifecycle issues with DDIC tables and is nicely integrated into the existing Steampunk application Identity and Access Management (IAM) concepts
Limitations:
- Only custom tables are supported on target side (Steampunk), but no SAP-delivered tables
- Only replication from on-premise to Steampunk is supported
- According to the multitenancy approach of Steampunk, data replication must always be tenant-aware (not cross-CLIENT)
Architecture
The following picture illustrates the involved systems and the data flow:
General setup of data replication
The relevant systems and components are
(1) On-premise system(s)
- Database tables as SLT data source in on-premise system(s)
(2) SLT system / component
- SLT component can be used in a separate system or in one of the data source systems (1)
- Target system connection needs to be configured as SLT SDK
(RFC Connection is not yet supported for Steampunk)
- An SLT BAdI needs to be implemented to call custom-defined RFCs for initial load and delta load
(3) Steampunk system
- Custom-defined database tables need to exist. It is recommended to use the same structure as in the on-premise systems
- Inbound RFC needs to be configured based on a custom-defined Communication Scenario and Communication User
- Custom-defined RFC (called by the SLT BAdI implementation) is responsible to handle the initial load and delta load
Setup of Inbound RFC
As the data replication from the SLT on-premise system to Steampunk is based on inbound RFC, we start by developing a simple RFC in Steampunk and setting up the inbound RFC connectivity.
Implement a test RFC in the Steampunk system
The purpose of this test RFC is simply to validate the connectivity setup from on-premise (SLT system) to the Steampunk system.
The following steps are required in the Steampunk system via ABAP Development Tools (ADT) in Eclipse. All names (in bold) are only proposals:
- Create a package Z_SLT_MAIN. This step is optional but it helps to group the respective objects for later reference. Assign all objects to be created in the next steps to this package.
- Create a function group Z_SLT.
- Create an RFC-enabled function module Z_SLT_TEST in that function group which returns a string with a constant test value.
- Activate the function group and the function module. As a result, a COM inbound service and an IAM app are automatically generated, as you can see in the ADT Project Explorer.
- Create a communication scenario Z_SLT, assign the COM Inbound Service Z_SLT_TEST_RFC and set only Basic as Supported Authentication Method.
- Save the communication scenario and press Publish Locally.
- All required development objects for the test scenario have now been created in ADT. Next, open the SAP Fiori launchpad of the Steampunk system to perform the administration tasks that set up the connectivity to the Steampunk system.
- Open the app Communication Systems. Create a new communication system with ID and name SLT. Select the check box Inbound Only. Navigate to the section Users for Inbound Communication and create a communication user. The communication user will be used later to call the RFC. Save the communication system.
- Open the app Communication Arrangements. Create a new communication arrangement Z_SLT based on the communication scenario Z_SLT created in ADT. If Z_SLT does not appear in the value help, you missed the Publish Locally step for communication scenario Z_SLT in ADT. Assign the communication system SLT and the created communication user, and save the communication arrangement.
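The test function module from the steps above might look like the following minimal sketch (the exporting parameter name test matches the call in the test report further below; the constant value 42 produces the expected output mentioned there):

```abap
FUNCTION z_slt_test.
*"----------------------------------------------------------------------
*"*"Local Interface:
*"  EXPORTING
*"     VALUE(TEST) TYPE  STRING
*"----------------------------------------------------------------------
  "return a constant value to validate the connectivity setup
  test = `42`.
ENDFUNCTION.
```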
Setup the RFC Destination in the SLT System
Now you need to set up the RFC destination from the SLT system to the Steampunk system via transaction SM59 in the SLT system. Depending on the SAP_BASIS release of the SLT system:
- < 7.54: Create an RFC destination of type “3” RFC connection to ABAP system which also requires a Cloud Connector setup for the SLT system
- >= 7.54: Create an RFC destination of type “W” RFC connection to other ABAP system using WebSockets. This RFC type does not require a Cloud Connector setup and is the recommended option if the release prerequisite is fulfilled.
The following screenshots show the required settings for a WebSocket-based RFC connection. Start transaction SM59 in the SLT system and create a connection A4C_RFC of type “W”.
Navigate to the app Communication Arrangements and copy the API URL into the Host field of the RFC connection. Remove the leading https:// protocol prefix. Enter 443 as Port:
In the Logon & Security tab, select Explicit Client by Hostname and Authentication Method by User/Password with Alias User and enter the user name and the password of the communication user created in the app Communication Systems.
Save the connection and press the button Connection Test. Correct the settings in case of connection errors.
Call the test RFC in SLT system
Create local report ZSLT_TEST_RFC in the SLT system to test the RFC call:
REPORT zslt_test_rfc.

DATA lv_test TYPE string.

CALL FUNCTION 'Z_SLT_TEST' DESTINATION 'A4C_RFC'
  IMPORTING
    test   = lv_test
  EXCEPTIONS
    OTHERS = 1.

IF sy-subrc = 0.
  WRITE: / lv_test.
ELSE.
  WRITE: / 'Error occurred'.
ENDIF.
Activate and execute the local report. The RFC implemented in the Steampunk system is called and the expected output is 42.
Setup of SLT: Database Table
In our example, we use the following simple custom table defined in the source system. The described SLT approach requires that the custom table is created manually in the Steampunk system as well. It is recommended to use the same structure definition in the Steampunk system as in the source system. For that purpose, the technical settings (direct typing) are sufficient – there is no need to create all domains and data elements in the Steampunk system.
Create the following custom table ZSLT_TEST in the source system and in the Steampunk system with the following structure:
@EndUserText.label : 'SLT Data Replication Test'
@AbapCatalog.enhancement.category : #NOT_EXTENSIBLE
@AbapCatalog.tableCategory : #TRANSPARENT
@AbapCatalog.deliveryClass : #A
@AbapCatalog.dataMaintenance : #RESTRICTED
define table zslt_test {
  key client  : abap.clnt not null;
  key uuid    : abap.raw(16) not null;
  name        : abap.char(10);
  description : abap.char(100);
}
Create a local report ZSLT_TEST_FILL_DATA in the source system to fill test data into our test custom table:
REPORT zslt_test_fill_data.

PARAMETERS p_count TYPE i DEFAULT 10.
PARAMETERS p_delete AS CHECKBOX.

START-OF-SELECTION.
  PERFORM main.

FORM main.
  IF p_delete = abap_true.
    DELETE FROM zslt_test.
    COMMIT WORK.
  ENDIF.

  DATA lt_slt_test TYPE STANDARD TABLE OF zslt_test WITH EMPTY KEY.
  DO p_count TIMES.
    DATA(lv_index) = sy-index.
    TRY.
        APPEND VALUE #( uuid        = cl_system_uuid=>create_uuid_x16_static( )
                        name        = |{ lv_index NUMBER = USER }|
                        description = |{ sy-datlo DATE = USER } { sy-timlo TIME = USER }| )
               TO lt_slt_test.
      CATCH cx_uuid_error.
        ASSERT 1 = 2.
    ENDTRY.
  ENDDO.

  INSERT zslt_test FROM TABLE lt_slt_test.
  COMMIT WORK.
ENDFORM.
Activate and execute the local report with default settings to create 10 records in custom table ZSLT_TEST.
Setup of SLT: Configuration
Use transaction LTRC to create a new SLT configuration. The important setting for Steampunk is in step Specify Target System: Select Connection Details = Other and Scenario = SLT SDK.
For more information about other settings please refer to the SLT documentation below.
Implement the SLT SDK Business Add-In
Create an implementation for enhancement spot ES_IUUC_REPL_RUNTIME_OLO_EXIT. Create a BAdI implementation for BADI_IUUC_REPL_OLO_EXIT based on interface IF_BADI_IUUC_REPL_OLO_EXIT. Only two of the four interface methods are relevant:
- IF_BADI_IUUC_REPL_OLO_EXIT~WRITE_DATA_FOR_INITIAL_LOAD
- IF_BADI_IUUC_REPL_OLO_EXIT~WRITE_DATA_FOR_REPL
Create an empty implementation for the other two interface methods.
A generic implementation for the initial load of any custom database table might look like:
METHOD if_badi_iuuc_repl_olo_exit~write_data_for_initial_load.
  CLEAR ev_error_code.
  CLEAR et_return.

  "push initial load
  TYPES BEGIN OF ty_ls_table_w_payload_il.
  TYPES   tabname_source TYPE c LENGTH 30.
  TYPES   payload        TYPE xstring.
  TYPES END OF ty_ls_table_w_payload_il.
  TYPES ty_lt_table_w_payload_il TYPE STANDARD TABLE OF ty_ls_table_w_payload_il WITH DEFAULT KEY.

  DATA lt_table_w_payload_il TYPE ty_lt_table_w_payload_il.
  DATA lv_dbtab_xml TYPE xstring.
  FIELD-SYMBOLS <lt_dbtab_data> TYPE STANDARD TABLE.

  LOOP AT it_table_w_content ASSIGNING FIELD-SYMBOL(<ls_table_w_content>).
    ASSIGN <ls_table_w_content>-payload->* TO <lt_dbtab_data>.
    TRY.
        "serialize table data to xml or json
        IF 1 = 2. "xml
          CALL TRANSFORMATION id SOURCE root = <lt_dbtab_data> RESULT XML lv_dbtab_xml.
        ELSE. "json
          DATA(lo_xml_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
          CALL TRANSFORMATION id SOURCE root = <lt_dbtab_data> RESULT XML lo_xml_writer.
          lv_dbtab_xml = lo_xml_writer->get_output( ).
          FREE lo_xml_writer.
        ENDIF.
      CATCH cx_root INTO DATA(lx_root) ##catch_all.
        ev_error_code = 1.
        APPEND VALUE #( type = 'E' message = lx_root->get_text( ) ) TO et_return.
        RETURN.
    ENDTRY.
    APPEND VALUE #( tabname_source = <ls_table_w_content>-tabname_source
                    payload        = lv_dbtab_xml ) TO lt_table_w_payload_il.
    FREE lv_dbtab_xml.
  ENDLOOP.

  DATA lv_payload_xml TYPE xstring.
  CALL TRANSFORMATION id SOURCE root = lt_table_w_payload_il RESULT XML lv_payload_xml.
  FREE lt_table_w_payload_il.

  CALL FUNCTION 'Z_SLT_WRITE_DATA_INITIAL_LOAD' DESTINATION 'A4C_RFC'
    EXPORTING
      iv_payload    = lv_payload_xml
    IMPORTING
      ev_error_code = ev_error_code
      et_return     = et_return
    EXCEPTIONS
      OTHERS        = 1.
  IF sy-subrc <> 0.
    ev_error_code = sy-subrc.
    APPEND VALUE #( type = 'E' message = 'RFC error occurred' ) TO et_return ##NO_TEXT.
    RETURN.
  ENDIF.
ENDMETHOD.
A generic implementation for the delta load of any custom database table might look like:
METHOD if_badi_iuuc_repl_olo_exit~write_data_for_repl.
  CLEAR ev_error_code.
  CLEAR et_return.

  "push delta load
  TYPES BEGIN OF ty_ls_table_w_payload.
  TYPES   operation      TYPE c LENGTH 1.
  TYPES   tabname_source TYPE c LENGTH 30.
  TYPES   payload        TYPE xstring.
  TYPES END OF ty_ls_table_w_payload.
  TYPES ty_lt_table_w_payload TYPE STANDARD TABLE OF ty_ls_table_w_payload WITH DEFAULT KEY.

  DATA lt_table_w_payload TYPE ty_lt_table_w_payload.
  DATA lv_dbtab_xml TYPE xstring.
  FIELD-SYMBOLS <lt_dbtab_data> TYPE STANDARD TABLE.

  LOOP AT it_table_w_content ASSIGNING FIELD-SYMBOL(<ls_table_w_content>).
    ASSIGN <ls_table_w_content>-payload->* TO <lt_dbtab_data>.
    TRY.
        "serialize table data to xml or json
        IF 1 = 2. "xml
          CALL TRANSFORMATION id SOURCE root = <lt_dbtab_data> RESULT XML lv_dbtab_xml.
        ELSE. "json
          DATA(lo_xml_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
          CALL TRANSFORMATION id SOURCE root = <lt_dbtab_data> RESULT XML lo_xml_writer.
          lv_dbtab_xml = lo_xml_writer->get_output( ).
          FREE lo_xml_writer.
        ENDIF.
      CATCH cx_root INTO DATA(lx_root) ##catch_all.
        ev_error_code = 1.
        APPEND VALUE #( type = 'E' message = lx_root->get_text( ) ) TO et_return.
        RETURN.
    ENDTRY.
    APPEND VALUE #( operation      = <ls_table_w_content>-operation
                    tabname_source = <ls_table_w_content>-tabname_source
                    payload        = lv_dbtab_xml ) TO lt_table_w_payload.
    FREE lv_dbtab_xml.
  ENDLOOP.

  DATA lv_payload_xml TYPE xstring.
  CALL TRANSFORMATION id SOURCE root = lt_table_w_payload RESULT XML lv_payload_xml.
  FREE lt_table_w_payload.

  CALL FUNCTION 'Z_SLT_WRITE_DATA_DELTA_LOAD' DESTINATION 'A4C_RFC'
    EXPORTING
      iv_payload    = lv_payload_xml
    IMPORTING
      ev_error_code = ev_error_code
      et_return     = et_return
    EXCEPTIONS
      OTHERS        = 1.
  IF sy-subrc <> 0.
    ev_error_code = sy-subrc.
    APPEND VALUE #( type = 'E' message = 'RFC error occurred' ) TO et_return ##NO_TEXT.
    RETURN.
  ENDIF.
ENDMETHOD.
Both methods refer to the RFC modules implemented in the Steampunk system (see next section) and to the RFC destination A4C_RFC.
Implement the Steampunk SLT Inbound RFC
Create new RFC-enabled function modules in function group Z_SLT:
- Z_SLT_WRITE_DATA_INITIAL_LOAD for the initial load implementation
- Z_SLT_WRITE_DATA_DELTA_LOAD for the delta load implementation
Activate both RFC-enabled function modules, which automatically creates two COM inbound services and two IAM apps. Add the two COM inbound services to the custom communication scenario Z_SLT and save. Press Publish Locally for the communication scenario Z_SLT.
An example implementation for Z_SLT_WRITE_DATA_INITIAL_LOAD is:
FUNCTION z_slt_write_data_initial_load
  IMPORTING
    VALUE(iv_payload) TYPE xstring
  EXPORTING
    VALUE(ev_error_code) TYPE i
    VALUE(et_return) TYPE bapirettab.

  "get initial load
  TYPES BEGIN OF ty_ls_table_w_payload_il.
  TYPES   tabname_source TYPE c LENGTH 30.
  TYPES   payload        TYPE xstring.
  TYPES END OF ty_ls_table_w_payload_il.
  TYPES ty_lt_table_w_payload_il TYPE STANDARD TABLE OF ty_ls_table_w_payload_il WITH DEFAULT KEY.

  DATA lt_table_w_payload_il TYPE ty_lt_table_w_payload_il.

  TRY.
      CALL TRANSFORMATION id SOURCE XML iv_payload RESULT root = lt_table_w_payload_il.
      FREE iv_payload.
      DATA lr_dbtab_data TYPE REF TO data.
      LOOP AT lt_table_w_payload_il ASSIGNING FIELD-SYMBOL(<ls_table_w_payload_il>).
        CREATE DATA lr_dbtab_data TYPE STANDARD TABLE OF (<ls_table_w_payload_il>-tabname_source) WITH EMPTY KEY.
        CALL TRANSFORMATION id SOURCE XML <ls_table_w_payload_il>-payload RESULT root = lr_dbtab_data->*.
        FREE <ls_table_w_payload_il>-payload.
        INSERT (<ls_table_w_payload_il>-tabname_source) FROM TABLE @lr_dbtab_data->*.
        COMMIT WORK.
        FREE lr_dbtab_data.
      ENDLOOP.
    CATCH cx_root INTO DATA(lx_root) ##catch_all.
      ev_error_code = 1.
      APPEND VALUE #( type = 'E' message = lx_root->get_text( ) ) TO et_return.
      RETURN.
  ENDTRY.
ENDFUNCTION.
An example implementation for Z_SLT_WRITE_DATA_DELTA_LOAD is:
FUNCTION z_slt_write_data_delta_load
  IMPORTING
    VALUE(iv_payload) TYPE xstring
  EXPORTING
    VALUE(ev_error_code) TYPE i
    VALUE(et_return) TYPE bapirettab.

  "get delta load
  TYPES BEGIN OF ty_ls_table_w_payload.
  TYPES   operation      TYPE c LENGTH 1.
  TYPES   tabname_source TYPE c LENGTH 30.
  TYPES   payload        TYPE xstring.
  TYPES END OF ty_ls_table_w_payload.
  TYPES ty_lt_table_w_payload TYPE STANDARD TABLE OF ty_ls_table_w_payload WITH DEFAULT KEY.

  DATA lt_table_w_payload TYPE ty_lt_table_w_payload.

  TRY.
      CALL TRANSFORMATION id SOURCE XML iv_payload RESULT root = lt_table_w_payload.
      FREE iv_payload.
      DATA lr_dbtab_data TYPE REF TO data.
      LOOP AT lt_table_w_payload ASSIGNING FIELD-SYMBOL(<ls_table_w_payload>).
        CREATE DATA lr_dbtab_data TYPE STANDARD TABLE OF (<ls_table_w_payload>-tabname_source) WITH EMPTY KEY.
        CALL TRANSFORMATION id SOURCE XML <ls_table_w_payload>-payload RESULT root = lr_dbtab_data->*.
        FREE <ls_table_w_payload>-payload.
        CASE <ls_table_w_payload>-operation.
          WHEN 'I'.
            INSERT (<ls_table_w_payload>-tabname_source) FROM TABLE @lr_dbtab_data->*.
            COMMIT WORK.
          WHEN 'D'.
            DELETE (<ls_table_w_payload>-tabname_source) FROM TABLE @lr_dbtab_data->*.
            COMMIT WORK.
          WHEN 'U'.
            UPDATE (<ls_table_w_payload>-tabname_source) FROM TABLE @lr_dbtab_data->*.
            COMMIT WORK.
        ENDCASE.
        FREE lr_dbtab_data.
      ENDLOOP.
    CATCH cx_root INTO DATA(lx_root) ##catch_all.
      ev_error_code = 1.
      APPEND VALUE #( type = 'E' message = lx_root->get_text( ) ) TO et_return.
      RETURN.
  ENDTRY.
ENDFUNCTION.
Test the SLT Replication
Start transaction LTRC and navigate to the created SLT configuration. Click on Data Provisioning, add the table name ZSLT_TEST and select Start Replication, which includes Start Load automatically.
Take a look into the SLT logs and check whether the initially created 10 entries are transferred into the Steampunk system.
Use the local report ZSLT_TEST_FILL_DATA to delete the existing 10 entries and to create 5 new records in table ZSLT_TEST. Verify manually that these changes in the source system are immediately reflected in the Steampunk system.
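For the manual verification in the Steampunk system, a small console class can help. This is only a sketch; the class name ZCL_SLT_TEST_CHECK is a proposal:

```abap
CLASS zcl_slt_test_check DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_oo_adt_classrun.
ENDCLASS.

CLASS zcl_slt_test_check IMPLEMENTATION.
  METHOD if_oo_adt_classrun~main.
    "read the replicated records from the custom target table
    SELECT * FROM zslt_test ORDER BY name INTO TABLE @DATA(lt_data).
    out->write( |{ lines( lt_data ) } record(s) in ZSLT_TEST| ).
    out->write( lt_data ).
  ENDMETHOD.
ENDCLASS.
```

Run the class in ADT via F9 (Run as ABAP Application (Console)) and compare the output with the table content in the source system.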
References
Steampunk:
SLT:
Hi Frank,
Regarding limitations, does this approach guarantee transactional consistency across multiple tables? I.e., if multiple tables are updated in the same database commit on the source side, are they committed in the same database commit on the target side?
Hi Lars,
Yes, that is ensured by the SLT SDK. As you can see in the code sample above, the BAdI implementation works on a data structure based on a list of tables (not a single one), which represents the commit boundary.
Hi,
I would like to know if there are any limits on the amount of data that can be transferred during initial load.
Can we for example transfer 1000 000 purchase orders during initial load?
Will each PO be transferred with a single RFC call?
Hi Krzysztof,
You can configure the package size in the SLT configuration for your scenario, e.g. 1,000 or 10,000. With the SLT SDK approach, the BAdI is called once per package. Regarding your question: no, transferring each PO with a single RFC call would not be a good approach from a performance perspective, and transferring 1 million records in a single call is not recommended either. The SLT SDK supports you in implementing a generic data transfer with a configurable package size.
Hi Frank
I would like to replicate an EDIDC table from about 8 ECC systems into a BTP ABAP instance. I realise I will need to create the target table and have raised a question on the community about this but I'd also like to check whether this concept will work via SLT.
Reading your limitation of 'Only custom tables are supported on target side (Steampunk), but no SAP-delivered tables', this suggests to me that SAP delivered on source but custom on target is fine.
Is that the case?
Regards
Ian
Hi Ian,
Yes, your proposed solution is possible. "Only custom tables are supported on target side (Steampunk), but no SAP-delivered tables" refers to the actual (target) tables in the Steampunk system, independent of whether the table is an SAP-delivered table in the source system.
You can replicate the data of an SAP-delivered table (source system) into a custom-defined table in Steampunk (target system).
BR, Frank
Thanks Frank,
your answer <<<You can replicate the data of an SAP-delivered table (source system) into a custom-defined table in Steampunk (target system)>>> is what I have been looking for! This clarification is much appreciated.
For additional information on the BAdI setup see SAP note 2652704 (requires login)
I am getting an RFC error from the BAdI. But the RFC connection test from SM59 to the Steampunk system is fine. Any suggestions, please?
The SM59 connection test "only" checks whether the endpoint can be reached. If this is successful but the RFC call is not, the issue seems to be the logon to the Steampunk system. Did you perhaps configure the CC* user (created by the Steampunk system) in the SM59 settings instead of the entered alias user name?
And did you use in SM59 WebSocket RFC or due to the actual on-prem release the standard RFC incl. Cloud Connector?
Hi Frank,
I observed that when I use the CC* user in SM59, we get a "Name or Password incorrect" error, but not with the alias user.
Yes, I am using WebSocket RFC.
In case of further issues please create a ticket on BC-CP-ABA
Hi Frank,
We are working on a BTP-ABAP application. In this application we need F4 help for Plant. As we do not have any standard API's to read plant data from backend system, we would like to use SLT replication for T001W table, so that we can refer to local ZT001W for providing F4 help in BTP.
Is there any way to replicate standard tables using SLT, like the one explained above?
Thanks in Advance.
Standard tables are not supported for different reasons: for example, T001W is an application table and not part of SAP BTP ABAP Environment (Steampunk). Even if T001W were somehow generated as an SAP table (in the SAP namespace), direct SQL access to SAP tables is forbidden in Steampunk, because SAP tables can only be accessed via CDS views.
Therefore, you need to create your own table with the actually required fields (a subset of the fields of T001W, e.g. ZT001W). Direct SQL access to such a custom table ZT001W is allowed.
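Such a custom table ZT001W with a subset of the T001W fields might look like the following sketch (only WERKS and NAME1 are taken over here; lengths follow the standard plant table, extend as required):

```abap
@EndUserText.label : 'Plant Replica for Value Help'
@AbapCatalog.enhancement.category : #NOT_EXTENSIBLE
@AbapCatalog.tableCategory : #TRANSPARENT
@AbapCatalog.deliveryClass : #A
@AbapCatalog.dataMaintenance : #RESTRICTED
define table zt001w {
  key client : abap.clnt not null;
  key werks  : abap.char(4) not null;
  name1      : abap.char(30);
}
```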
Another option would be to create a custom API in your backend system (OData Service or RFC) and call this custom API in Steampunk instead of replicating the data. This approach would avoid data redundancy.