Data Migration from SAP ECC to S/4HANA using SAP Data Intelligence
I thought it was a good time to do a quick Proof of Concept: migrating data from an SAP ECC system to S/4HANA using SAP Data Intelligence. SAP has not positioned this as a data migration tool, but it could become a game changer later, since it is capable of performing most data-related tasks. The purpose of this blog is to show that SAP Data Intelligence can also handle SAP data migration.
This blog walks through the migration of the Bank Master data object using SAP Data Intelligence. It has two main sections: the first explains data analysis using the Metadata Explorer tile of SAP Data Intelligence, and the second looks at the data migration itself. The overall architecture is shown in the diagram below:
How can we perform data analysis of an existing legacy system using SAP Data Intelligence?
For the data analysis we rely heavily on the Metadata Explorer tile. The outcome of the data analysis phase is the mapping documents.
The data analysis phase is required to identify business objects and their corresponding migration objects. A business object is a single semantic entity used to model a business process, for example material, customer, or bank. For this demo we are using the Bank Master object.
A migration object represents an entity of a business object that is used to migrate its data, for example basic material data from system A, material sales data from system B, and so on. The relationship between business object and migration object is 1:N: there can be many migration objects for one business object, but only one business object per migration object.
In the case of Bank Master, we are using the BNKA table for our demo. There are two ways to perform data profiling in the Metadata Explorer tile of SAP Data Intelligence:
- The first way is direct profiling against the SAP ECC system using an ABAP system type connection. This is similar to profiling a table in SAP Information Steward using an Application type connection.
- The second way is to stage the data in a database and then profile it there.
To perform either operation, we need a connection from SAP Data Intelligence to the SAP ECC system (an RFC connection).
For this demo we opted for the second way: we staged the data and then performed the profiling. The data was staged using a Data Pipeline in the Modeler application of SAP Data Intelligence.
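To make the staging step concrete, here is a minimal, purely illustrative sketch of what the pipeline does conceptually: extract BNKA rows and bulk-insert them into a staging table. An in-memory SQLite database stands in for the real staging database, and the sample rows are hypothetical; only the BNKA column names (BANKS, BANKL, BANKA, SWIFT) come from the standard table definition.

```python
import sqlite3

# Hypothetical rows extracted from the legacy BNKA table
# (BANKS = bank country, BANKL = bank key, BANKA = bank name, SWIFT = BIC).
legacy_rows = [
    ("US", "011000015", "FEDERAL RESERVE BANK", "FRNYUS33"),
    ("DE", "10070000", "DEUTSCHE BANK", "DEUTDEBB"),
]

def stage_bnka(rows, conn):
    """Create the staging table and bulk-insert the extracted rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS STG_BNKA "
        "(BANKS TEXT, BANKL TEXT, BANKA TEXT, SWIFT TEXT)"
    )
    conn.executemany("INSERT INTO STG_BNKA VALUES (?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
stage_bnka(legacy_rows, conn)
count = conn.execute("SELECT COUNT(*) FROM STG_BNKA").fetchone()[0]
print(count)  # 2
```

In the actual pipeline this extract-and-load is done by the Modeler's graph operators rather than hand-written code; the sketch only shows the shape of the data movement.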
The fact sheet gives you detailed metadata of the dataset: the columns, their data types, tags, unique key, and descriptions. It also provides details about the dataset itself, such as the connection ID, dataset type, dataset size, last modified, last published, and more:
Column level profiling
Reviews and Ratings: We can review the data, add comments, initiate discussions on the dataset, and rate it. You can rate and comment on any published dataset. Ratings and comments express opinions about the quality of your dataset; the rating can mean whatever you want to convey to others about it.
We followed the same Data Pipeline structure in SAP Data Intelligence as the job design of the SAP Best Practices RDM (Rapid Data Migration) solution.
We used the SLT Connector in SAP Data Intelligence and created a connection to the SLT system. The steps follow below, but first we need to understand the prerequisites:
We performed table-based replication using SLT via the SLT Connector in SAP Data Intelligence. In my scenario SLT is configured on the same ECC system, but it can also be configured on a separate server that points to your ERP system. The SLT Connector in SAP Data Intelligence can be configured only if the following system requirements are in place:
- SAP ECC NW 7.52 with DMIS 2018 SP02 or higher;
- SAP ECC NW 7.00 with DMIS 2011 SP17 or higher;
- SAP S/4HANA on premise 1909 or higher.
Note that DMIS Add-On availability depends on your NetWeaver version. Whenever possible, use the latest version available, and use DMIS 2018 rather than DMIS 2011.
In my case it is
Once the prerequisites mentioned above are met, the next step is to maintain your security settings and grant the authorized users the respective authorization objects. (If this is not done, the requests from SAP Data Intelligence will fail.)
- Configure SLT for SAP DI using transaction LTRC to initiate and control the table-based replication process, and note down the Mass Transfer ID.
- Log in to the Connection Management of SAP Data Intelligence and create a connection to the SLT server.
- Configure the SLT operator and ABAP Converter and whitelist them in the SLT system (refer to Note 2831756 in the Important Assets section).
Mapping & Validation:
The Table Consumer operator extracts data from the staged data table, and the Data Transform operator transforms it. The Data Transform operation is used to map fields and transform them accordingly.
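Conceptually, the mapping performed in the Data Transform operator is a rename-plus-transform over each record. The following sketch shows that idea in plain Python; the target field names (BankCountry, BankInternalID, etc.) are illustrative stand-ins, not the operator's actual output schema.

```python
# Dict-driven sketch of the mapping step: rename legacy BNKA columns to
# target field names and apply a simple value transform per record.
FIELD_MAP = {
    "BANKS": "BankCountry",     # source: bank country key
    "BANKL": "BankInternalID",  # source: bank key
    "BANKA": "BankName",        # source: bank name
    "SWIFT": "SWIFTCode",       # source: SWIFT/BIC code
}

def transform(record: dict) -> dict:
    """Map a legacy record to the target structure."""
    out = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    # Example transform: trim and upper-case the bank name.
    out["BankName"] = out["BankName"].strip().upper()
    return out

row = {"BANKS": "DE", "BANKL": "10070000",
       "BANKA": " Deutsche Bank ", "SWIFT": "DEUTDEBB"}
print(transform(row))
# {'BankCountry': 'DE', 'BankInternalID': '10070000',
#  'BankName': 'DEUTSCHE BANK', 'SWIFTCode': 'DEUTDEBB'}
```

In SAP Data Intelligence this mapping is maintained graphically in the Data Transform node rather than in code.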
For validation, we used the Validation operator and loaded the data into a HANA database. In the Validation operator we first defined the input schema and then defined a validation rule on the SWIFT column.
In the Validation operator, the output ports are Pass, Fail, Fail Information, and Error Information; all are of string type. We route the passed and failed data separately and load them into separate SAP HANA tables (valid and invalid records).
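The rule on the SWIFT column can be sketched as follows. A SWIFT/BIC code is 8 or 11 characters: a 4-letter bank code, a 2-letter country code, a 2-character alphanumeric location code, and an optional 3-character branch code. The routing function below mirrors the Pass/Fail ports of the Validation operator; the sample records are hypothetical.

```python
import re

# BIC structure: 4-letter bank code + 2-letter country + 2-char
# alphanumeric location + optional 3-char alphanumeric branch.
BIC_RE = re.compile(r"^[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}(?:[A-Z0-9]{3})?$")

def split_by_swift_rule(records):
    """Route records to Pass or Fail lists, mirroring the operator ports."""
    passed, failed = [], []
    for rec in records:
        target = passed if BIC_RE.match(rec.get("SWIFT") or "") else failed
        target.append(rec)
    return passed, failed

rows = [
    {"BANKL": "10070000", "SWIFT": "DEUTDEBB"},     # valid 8-char BIC
    {"BANKL": "10070024", "SWIFT": "DEUTDEBB101"},  # valid 11-char BIC
    {"BANKL": "99999999", "SWIFT": "BAD"},          # too short -> Fail port
]
passed, failed = split_by_swift_rule(rows)
print(len(passed), len(failed))  # 2 1
```

The two resulting lists correspond to the valid and invalid record sets that the pipeline loads into separate HANA tables.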
Enrich & Data Loading
In the last Data Pipeline, we enrich the data and load it using the ABAP Pipeline Engine by creating a custom operator.
For loading, there are three different ways to load data into S/4HANA:
- Call the ABAP Pipeline Engine by creating a custom operator that encapsulates and calls an RFC function.
- If we have an existing SAP Data Services infrastructure and want to load data via IDoc, we can populate the enrich table and trigger only the IDoc dataflow of the job in SAP Data Services (bypassing the other dataflows at the DS level).
- Generate a file and use the Migration cockpit to load the data.
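As a small illustration of the third option, the enriched records can be written out as a delimited file for upload. Note that the real Migration Cockpit consumes its own downloaded templates (XML/XLSX workbooks); the CSV below is a simplified stand-in, and the field names reuse the illustrative target names from the mapping step.

```python
import csv
import io

# Illustrative target field names for the enriched bank records.
FIELDS = ["BankCountry", "BankInternalID", "BankName", "SWIFTCode"]

def write_migration_file(records, stream):
    """Write enriched records as a semicolon-delimited file with header."""
    writer = csv.DictWriter(stream, fieldnames=FIELDS, delimiter=";")
    writer.writeheader()
    writer.writerows(records)

buf = io.StringIO()
write_migration_file(
    [{"BankCountry": "DE", "BankInternalID": "10070000",
      "BankName": "DEUTSCHE BANK", "SWIFTCode": "DEUTDEBB"}],
    buf,
)
print(buf.getvalue())
```

In practice you would write to a file location the Migration Cockpit (or its staging tables) can reach, and match the template layout it expects for the Bank migration object.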
Using custom ABAP operators, or the SLT Reader or ABAP Converter operator in SAP DI, requires roles and authorisations along with whitelisting of the operator. Note 2831756 is helpful here (the link is given in the Important Assets section).
Thank you for reading this blog. Feel free to try it on your own and share your feedback with us.
Important Assets:
- Replication and filtering of data by using SLT and SAP Data Intelligence
- ABAP Integration – Calling an ABAP function module within an SAP Data Intelligence pipeline