
Deep dive on how to migrate your SAP data with Rapid Deployment Solutions

This is the second part of my blog about a better way to migrate data to SAP systems with rapid-deployment solutions. In this blog, I will provide a step-by-step deep dive into using the provided migration content in SAP Data Services.

The following is a list of all software components for data migration included in the RDS package: 

·     SAP Data Services: Data management platform to support all migration tasks

·     SAP Information Steward: Data profiling and assessment as well as data lineage and reconciliation from target to source

·     SAP BusinessObjects BI platform: Includes SAP Crystal Dashboard Design (formerly known as Xcelsius Dashboards), SAP BusinessObjects Web Intelligence, and SAP Crystal Reports

The following is a list of content for data migration included in the RDS package:

·     Migration content (available here) loaded into Data Services for mapping (field mapping) and validation

·     SAP BusinessObjects Web Intelligence reports for monitoring the data migration

·     Migration Services tool to map source data field values to SAP business context (value mapping)

·     Migration content for reconciling what was loaded in the target to the source

As mentioned in part one of this blog, the leading software for SAP Data Migration is Data Services. Data Services is a well-known and scalable data integration platform for ETL processes (Extract, Transform, Load). Some of the key features of Data Services include:

·     A single interface to design data structures and build transformation rules

·     A visualization of source-to-target metadata to analyze data lineage and impact

·     Profiling and analysis of any source data 

·     Database extraction to easily pull data from legacy systems

·     Easy mapping (reusable)

·     Reusable and sharable transforms and functions (drag and drop) to minimize custom programming

·     Web-based administration to manage job operations

·     Operational dashboards to monitor job execution and trends

Data Services can connect to any source or target system via the adapter framework. While it does have native connectivity to SAP systems, it also has connections to files, ODBC, mainframe, XML, Excel, and many other connectivity options. Architecturally, Data Services resides in the orchestration and integration layer of your existing architecture.  

From a business perspective, you can reuse Data Services after the data migration project for ongoing integration (e.g., master data integration with multiple systems). You can also use it for ongoing data quality, including an on-entry quality check within the SAP application. Additionally, it can help you drive data governance projects, especially as they relate to cleansing, quality, and the synthesis or removal of duplicate data.

Step-by-Step Demo on Using the Migration Content in Data Services

Now that you have a basic understanding of the solution, the software, and the content, the following is a step-by-step example of how to get started with SAP Data Migration. In this example we are going to migrate customer master data to an SAP CRM system. Normally companies do not embark on a data migration project alone, especially on their first data migration project to the SAP system. Use these steps below to get started with the software and the content:

Step 1. Check the solution overview and download the package. Optionally, you can review the overview presentation and watch demos. You can also download the DVD content from the Software Download Center even if you do not own the software.

Step 2. Once you complete the download, you can start to explore the content, accessing the getting-started guides and the documentation to get a complete picture about the content and the deliverables of the data migration package.

Step 3. At this point, even without the software installed, there are still key guides and good places to start planning the migration. The migration content includes Word documents and Excel templates for each of the objects in the SAP system (e.g., a document on business partner contact data and another on quotations). You can read the document to understand the object and use the Excel spreadsheet to start researching how the source data maps to the target.

Step 4. When you do decide to install the software, you can get a temporary license key. However, if you already have Data Services installed, you can use the guides to install the content manually. In addition, the RDS package contains a services component that helps you deploy the entire package, including the software, and get started with the first business objects within weeks.

Step 5 (optional). If you are new to Data Services, you might want to start by going through the Data Services tutorial that is included in the documentation when you install the product.

Step 6. Once everything is installed, you will see a project called AIO_BPFDM_IDOC in Data Services. This project has jobs for each of the business objects, such as customer master data or material master data.

Step 7. Drill into the job and you can see the mapping and validation that takes place (Figure 1). The validation is against required lookup fields in the SAP system, checking mandatory fields as well as the format (e.g., whether the phone numbers all include country codes and whether the U.S. ZIP code is in 5+4 format).


Figure 1 Mapping and validation in the Business Partner job
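As a rough illustration of what such validation rules check, consider this Python sketch (the field names and rules here are invented for illustration and are not the delivered content):

```python
import re

# Hypothetical format rules of the kind the validation transforms apply
ZIP_5_PLUS_4 = re.compile(r"^\d{5}-\d{4}$")           # U.S. ZIP in 5+4 format
PHONE_WITH_CC = re.compile(r"^\+\d{1,3}[\d\s-]{4,}$")  # phone must start with a country code

def validate_partner(record: dict) -> list[str]:
    """Return a list of validation errors for one business-partner record."""
    errors = []
    # Mandatory-field check
    for field in ("NAME1", "COUNTRY", "CITY"):
        if not record.get(field):
            errors.append(f"missing mandatory field {field}")
    # Format checks
    if record.get("COUNTRY") == "US" and record.get("POSTAL_CODE"):
        if not ZIP_5_PLUS_4.match(record["POSTAL_CODE"]):
            errors.append("U.S. ZIP code not in 5+4 format")
    if record.get("PHONE") and not PHONE_WITH_CC.match(record["PHONE"]):
        errors.append("phone number missing country code")
    return errors

rec = {"NAME1": "ACME Corp", "COUNTRY": "US", "CITY": "Newtown Square",
       "POSTAL_CODE": "19073", "PHONE": "610-555-0100"}
print(validate_partner(rec))
```

In the delivered content, failing records are routed to error tables rather than loaded, so they can be corrected and re-run.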

The mapping within the job, as shown in Figure 2, has the source fields on the left and the target fields on the right. This example shows the mapping of the country from the source to the target, which is done using drag and drop.


Figure 2 Mapping within the job

Step 8. The migration content includes Word documents explaining each IDoc, as well as spreadsheets for mapping the legacy Business Partner data to the SAP IDoc structure. Use the pre-delivered Excel template to start the mapping research of the source to the target prior to the mapping in Data Services. 

Step 9. The other component that is important during the migration is the Migration Services tool. This tool, which is delivered by SAP Best Practices, reads the IMG configuration data from the SAP system and enables mapping of source values to SAP configuration values. In this example, U.S.A., US, and Vereinigte Staaten von Amerika in the source system should map to US in the target system (Figure 3).


Figure 3 Migration Services tool
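Conceptually, the value mapping behaves like a simple lookup table. The following is a hypothetical sketch; in the real package the mapping is maintained in the Migration Services tool against the IMG configuration values read from the SAP system:

```python
# Invented example table: several legacy spellings collapse to one SAP value
COUNTRY_VALUE_MAP = {
    "U.S.A.": "US",
    "US": "US",
    "Vereinigte Staaten von Amerika": "US",
}

def map_country(source_value: str) -> str:
    # Fall back to the raw value so unmapped entries surface during validation
    return COUNTRY_VALUE_MAP.get(source_value.strip(), source_value.strip())

print(map_country("U.S.A."))  # -> US
```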

Using the steps above, you can smoothly and securely migrate your data into new systems. SAP Rapid-deployment solutions for Data Migration allow you to confidently map, cleanse, transform, and load your data, thus ensuring your data migration project is a success! The following solutions are currently available:

And everything can be downloaded here:

Look out for our next blog on data migration rapid-deployment solutions, which will explain how to visualize your data migration projects. Please read part 3.

  • There is a new date and availability to attend the all-new SAP Education Data Migration training workshop on-site. The course will cover SAP Data Services usage combined with the SAP RDS Rapid Data Migration content.

    New date is November 1-2, location is Newtown Square, PA.

    More information and how to register under the following link:

    The two-day workshop is available in SAP Education as course TZIM4M.

  • Today we’ve released the Rapid Data Migration to cloud solutions from SAP package including content for SAP Customer on Demand (CoD) and SuccessFactors Employee Central (EC). Please continue reading here.


    The official landing page in the SAP Service Marketplace is as below:

  • Hi Frank

    I am curious to know what that Migration Services tool is, because it doesn't look like either Data Services or Business Intelligence? 

    • Hello Abhilash,

      You are right, the Migration Services tool is neither part of SAP Data Services nor the BI Platform. This little tool comes as a part of the RDS package and is deployed on the Apache Tomcat webserver. The RDS documentation explains in detail the easy installation and also the connection to the SAP Data Services database.



      • Hi Frank,

        Thanks for the clarification. From what I read, RDS is free if you have procured SAP DS, SAP BI, and SAP ECC, right? So will this tool come free with the RDS install?

        • Hi Abhilash,

          You are right, you can access the RDS content for target system SAP ERP in the SAP Service Marketplace free of charge. Besides the content, the RDS package includes the software (a license is necessary for the tools you've listed) and the service component (for a fee). If you are a partner, you can even offer your own service for a customer and get qualified on the RDS package, which I highly recommend. This is also free of charge and shows your customers that they will get a trusted data migration. If you are a customer, just make sure to have the licenses and you can roll out your own data migration. Optionally, you can make use of the RDS service rolled out by SAP Consulting for a fixed price with a fixed scope.

          The Service Marketplace content includes a step-by-step guide with links to the download in the Software Download Center (SWDC); the correct path is also given in SAP Note 1791183.

          Best regards,


      • Hi Frank,

        We need to perform an initial load of customer master data from ECC to MDG. We have installed the RDS package for MDG (S/4HANA). Could you please let us know where we can find this Migration Services tool in the installed package to do the mapping, and how to configure this tool from the Designer.



  • Hi Frank,

    While I would agree that this is the best approach for SAP data migration, I see that a major step the customer/client needs to do is not highlighted, or is under-emphasized: preparing the source files/tables in the format that RDS expects. In the above example of customer addresses, for instance, the data in a source system like JDE is spread across multiple tables, and we would need either an extract job to create the address file or complex queries to join multiple tables.
    I am not sure if I am missing something. Please correct me if I am wrong.

    • Hello Abhilash,

      You are right, we've pre-built the content for the target side only, including all the validations. This is because we know the SAP side very well. But we don't know the source side, so there is still some work to do. The example described in my blog is only that, an example. While we deliver sample mapping, we are not asking the customer to come up with the data in a specific format. And this is the difference from so many data migration solutions out there: we allow you to complete the mapping from any source within the Data Services Designer tool. So what you want to do is connect to the JDE system, import the metadata of all the multiple tables (no extract job necessary, it works on the fly), and perform the mapping exercise. Our mapping templates will help you with that, and you can use drag & drop in the tool to actually complete the given data flows per object. You can do joins and complex mappings, including conversion functions, easily. With the content we provide you can even reuse a lot already, although we are not providing any source-to-target mapping out of the box for JDE. When you run the job, it will pull the data out of the legacy side.

      If you are a customer, you have to do this exercise and might want to get some help from our consulting service. If you are a partner, you can reuse the mapping for another customer project where the source might be JDE again. Of course, JDE is just an example in this scenario, it could be any source including mixed setups between systems, databases, flat files and Excel workbooks.
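      The join Frank describes can be sketched in plain Python. The table and column names below loosely imitate JDE conventions (F0101/F0116, ABAN8, ALADD1) but are invented for illustration; in practice this is a Query transform joining imported source tables in the Designer:

```python
# Hypothetical legacy tables, joined into a flat target-side record
customers = [  # JDE F0101-like address book master (invented sample)
    {"ABAN8": 1001, "ABALPH": "ACME Corp"},
]
addresses = [  # JDE F0116-like address table (invented sample)
    {"ALAN8": 1001, "ALADD1": "100 Main St", "ALCTY1": "Newtown Square"},
]

def join_customer_address(custs, addrs):
    """Join the two tables on the address-book number, like a Query transform."""
    by_key = {a["ALAN8"]: a for a in addrs}
    rows = []
    for c in custs:
        a = by_key.get(c["ABAN8"], {})
        rows.append({"NAME1": c["ABALPH"],
                     "STREET": a.get("ALADD1"),
                     "CITY": a.get("ALCTY1")})
    return rows

print(join_customer_address(customers, addresses))
```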

      Hope that helps. Have a great weekend!



      • Hi Frank,

        Thanks for clarifying; that's aligned with my expectations.

        We are a partner, and our client has already procured the tool but is just waiting for hardware to be procured and the suite to be installed.

        One area I couldn't align was cleansing. From what I read in the document for extensions (which referred to adding an EU tax file to the customer master), there is extraction, validation, enrichment, and then load for each IDoc segment. So I believe enrichment is meant to do both cleansing and transformation. Is that the intended architecture, or is there scope for cleansing outside this in RDS? Similarly, isn't the profiling we see in the RDS presentation also outside RDS, i.e., through Information Steward?

        • Hello Abhilash,

          Did you already watch the recorded demo in the Service Marketplace? You can download it here:

          I tried to explain the data quality process and the similar Map, Validate, Enrich steps in detail. As a part of the package, there is one Building Block BPD (Business Process Document) which has been specifically built for the Data Quality task. Also, you might want to leverage SAP Information Steward for this purpose.



          • Hi Frank,

            Thanks I have seen this video.

            It's doing Map, Validate, and Enrich in that order, as you mentioned, and not Extract & Profile, Cleanse, Transform, Validate & Load in the order shown in the RDS presentations. Profiling is done outside RDS using DQ, though there is some minimal profiling as part of Validate; Cleanse & Transform are part of Enrich.



  • Hi Frank,

    I installed everything and I'm happy to see that this package is so complete! Detailed validations we wouldn't think of and job execution with parameters to switch parts on or off.

    Only one question (for now 🙂 ): Is it possible to export to a file (xml or other) rather than an IDoc? I thought I read this somewhere but can't find it anymore.

    • Thanks for the feedback!

      Regarding your question: with SAP Data Services you can also send the output schema to flat files or XML, so the answer is yes. It's just that our pre-built content leverages the IDoc interface to send the IDocs directly via RFC. Of course, it's also possible to write the data (even as an IDoc) to a file and, for example, use LSMW to do the final load.
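      As a minimal sketch of the file-based alternative (the record layout is invented for illustration; in Data Services you would simply swap the IDoc target for an XML or flat-file target):

```python
import xml.etree.ElementTree as ET

# Invented output rows standing in for the job's output schema
rows = [{"NAME1": "ACME Corp", "COUNTRY": "US"}]

root = ET.Element("BusinessPartners")
for row in rows:
    bp = ET.SubElement(root, "BusinessPartner")
    for field, value in row.items():
        # One element per output field
        ET.SubElement(bp, field).text = value

ET.ElementTree(root).write("partners.xml", encoding="utf-8",
                           xml_declaration=True)
```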

  • Hi Frank

      I have an issue. My Basis team is trying to configure the Migration Services tool, but it was unsuccessful. Can you provide a link or document for configuring it? We want to do testing on this.


    Vijay Mukunthan

    • Hi Vijay,

      To deploy the tool, please follow the Configuration Guide; it's as simple as copying a file into a specific folder. The WAR file has to be placed into the Tomcat web application (webapps) folder.

      To configure the tool, I have uploaded a User Guide for the tool, please download it here:

      SAP Mobile Documents

      Best regards,


  • Hi Frank,

    I'm mapping the various objects one by one and I'm stuck at SD Pricing. The "Filter Segment with Seperated Condition Key (KOMG)" is part of it but I'm struggling to find the necessary info in the SAP database as KOMG is not available as a physical table. Where can I find this info?

    • Hi J.

      Conditions are normally stored in KONH (header), KONP (item), and KONM (quantity scale) or KONW (value scale).

      The main table of each condition is a so-called A table.

      For example, table A007 contains the condition keys for "Division/Customer"-based conditions (VKORG, VTWEG, SPART, KUNNR, DATBI, DATAB plus KAPPL and KSCHL) as well as the link to the KONH table via the key KNUMH.

      So all kinds of conditions are stored in KONH, KONP, etc. and linked to the respective A table via KNUMH.

      E1KOMG is a structure that contains the "Allowed Fields for Condition Structures". So the fields you have to fill are the control fields for the A table and the key fields.

      For A007 conditions you have to fill the structure as follows

      KVEWE = A  (=Sales)

      KOTABNR = 007   (=Division/Customer)

      KAPPL = V (= Sales Distribution)

      KSCHL = <condition type>, e.g. PB00  (= gross price)

      and then you have to fill the key fields:

      VKORG, VTWEG,SPART, KUNNR, DATBI, DATAB with your values.

      That's it, normally?
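      As a plain-data sketch, filling the E1KOMG segment for an A007 condition as described above might look like this (the helper function and record layout are invented for illustration; only the field values follow the description):

```python
# Hypothetical helper that assembles an E1KOMG-like segment for A007
def build_e1komg(kschl, vkorg, vtweg, spart, kunnr, datab, datbi):
    return {
        "KVEWE":   "A",    # condition usage, as described above
        "KOTABNR": "007",  # condition table A007 (Division/Customer)
        "KAPPL":   "V",    # application: Sales/Distribution
        "KSCHL":   kschl,  # condition type, e.g. PB00
        # key fields of A007:
        "VKORG": vkorg, "VTWEG": vtweg, "SPART": spart,
        "KUNNR": kunnr, "DATAB": datab, "DATBI": datbi,
    }

seg = build_e1komg("PB00", "1000", "10", "00", "0000100123",
                   "20240101", "99991231")
print(seg["KOTABNR"])
```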


      Frank (Finkbohner)

      P.S. For IDocs that use a BAPI to store the data, it's sometimes useful to check the documentation of the BAPI that is called by the IDoc inbound function module (which can be found in table TBDBA, field FNAME_INB). Unfortunately, the conditions IDoc type COND_A does not call a BAPI.


    • What you saw is the same solution; Rapid Data Migration is just the new name and has updated content. Updated means it's now available for ERP, Suite on HANA, and S/4HANA. So basically, SAP Best Practices for Data Migration and Data Quality became the Rapid Data Migration packages (see the date on that help page, it's from 2012). Same people behind it (including me) and same foundation, but new and updated content plus more objects. Check out The Data Migration Guys to learn more: Introducing the Data Migration Guys

  • Hi Frank,

    I have one doubt.

    I am trying to implement the SAP AIO content for the customer master.

    While doing so, there are a lot of lookup tables which we are using for check-table purposes.

    Is there any way to pull all of them into a SQL database? Do we have any such option or method?

    Thanks in advance.

    • With the downloaded content, you should have all the lookup-table values from our internal test system in Excel format as well. With the deployed content, you have them in the underlying Data Services database, too, where you should be able to access them with different tools, especially after you load the data from your SAP system into the application.

  • Hi Frank,

    We are planning to implement Rapid Data Migration to S/4HANA. As per the SAP Note, the content available with the package includes a Reconciliation Guide as well, but we are not able to find it in the content. Could you please share a link to the Reconciliation Guide for Rapid Data Migration to SAP S/4HANA Using Data Services 4.2?

    Also, could you please help with the Reporting Templates for the same package?

    Many thanks in advance.




    • Hi Tanvi,


      The reconciliation guide you mention is found in the RDM content here:

      The guide explaining the reconciliation process is called IDoc Status Check.  The direct link for it is found here: