
A better way to migrate your SAP data with Rapid Deployment Solutions

Data migration is a major task and key to a successful SAP implementation. SAP Rapid Data Migration with SAP Data Services enables you to migrate your data, ensuring that the data can be trusted by users and is ready for business process execution within SAP systems such as SAP Business Suite powered by SAP HANA, SAP ERP, SAP Business All-in-One, SAP CRM, and SAP industry solutions for utilities. New content includes data migration content for SAP’s cloud HCM suite SuccessFactors Employee Central, SAP Cloud for Customer, and SAP BW. The first-ever RDS package available for SAP HANA Cloud Integration has just been released, offering content for a rapid data migration to the HCM cloud solution Workforce Analytics (WFA).

Data migration projects are relevant for all companies that need to move data from one application to another, but planning a data migration project is a major task. Migration projects are risky: if not done correctly, they can cause a myriad of issues, including a delayed production go-live, poor data quality that prevents core business processing, and poor user adoption of the new or upgraded SAP system. Cleansing, transforming, and loading data into a new SAP application can seem like a daunting task. If you’re interested in data migration solutions that are fast and easy to deploy, or want state-of-the-art data migration best practices so that you can be a data migration maestro, check out SAP’s rapid-deployment solutions for rapid data migration.

Understanding SAP Rapid Data Migration

SAP Rapid Data Migration has three major components:

  • Software driven by SAP Data Services (hereafter Data Services)
  • Migration content developed by SAP’s Rapid-deployment solution organization
  • Consulting services provided by SAP Services and partners

Watch the demo recording:

Downloading SAP Rapid Data Migration content

SAP’s rich RDS content for a rapid data migration:

The combination of the software, migration content, and optional consulting services provides a complete solution for migration to SAP applications. Please find a list of the supported objects here.

Data Services is a key element of SAP’s solutions for enterprise information management (EIM). Data Services provides capabilities for data integration, data quality management, and text data processing. If you’re looking for more information about SAP’s EIM solutions, there is a new book, EIM with SAP, that explains the changing face of EIM at SAP: what the different products are, how they work together, and how to get started using them. For the purposes of data migration, the focus is on the data integration and data quality management capabilities. The data integration piece is an extraction, transformation, and loading (ETL) tool that handles the ETL aspects of moving data from one or many sources to one or many target systems. Together with the data quality piece, it is best suited for migrations from non-SAP applications to SAP.

Leveraging the SAP BI platform, the data migration solution also enables BI reporting and contains dashboard analyses. The migration content includes SAP Web Intelligence reports that give business users, as well as the project team, a better picture of the entire data migration process and its progress.

SAP Data Migration software is used across the following steps:

  1. Profile and extract data from the source system 
  2. Map the data to the target data structures
  3. Validate the data against the target business context (i.e., the business rules of the target system, such as whether required fields are filled or whether the country codes are valid)
  4. Load the data into the target SAP system
  5. Reconcile the data between the target SAP system and the source system
  6. Repeat the entire process iteratively until the data is ready for loading into the production target system

Let’s take a look at the architecture of data migration in Figure 1. We’ve called out six specific areas, which we discuss in more detail below.


Figure 1 Architecture of data migration using SAP software and SAP Rapid Deployment solution components

1.  Source and target systems: First, look at the target environment and the legacy data environment. These represent the target and source applications. When using the Rapid Data Migration for SAP content, the target environment is always an SAP application, typically implemented together with a rapid-deployment solution alongside the Data Migration package. The legacy environment can be any non-SAP environment, with connectivity to databases, legacy applications, flat files, or XML. 

2.  Extract and profile: The staging area is provided using Data Services. In this staging area, you extract and profile data from the source systems. The profiling of data is a critical step as it provides insight into the state of the existing source systems. Examples of important details you can check for are patterns across data. For example, in the U.S., ZIP codes are five digits plus an optional four-digit code. A data assessment can determine how many unique ZIP codes you have and how often the ZIP+4 extension is used across your source systems. Another example is to know the pattern of country designations. For example, for Germany the terms Germany, DE, or Deutschland could be used in the source system.
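
The kind of profiling described above can also be sketched outside the tool. Here is a minimal, self-contained illustration in Python; the column names and sample values are invented for the example (in the actual solution, Data Services performs the profiling):

```python
import re
from collections import Counter

# Hypothetical extract of a legacy customer table
rows = [
    {"postal_code": "19073", "country": "US"},
    {"postal_code": "19073-3160", "country": "USA"},
    {"postal_code": "8045", "country": "Germany"},
    {"postal_code": "80469", "country": "DE"},
    {"postal_code": "80469", "country": "Deutschland"},
]

# Pattern check: a US ZIP is five digits, optionally plus a four-digit extension
zip5 = re.compile(r"^\d{5}$")
zip9 = re.compile(r"^\d{5}-\d{4}$")

profile = Counter()
for row in rows:
    code = row["postal_code"]
    if zip9.match(code):
        profile["zip+4"] += 1
    elif zip5.match(code):
        profile["zip5"] += 1
    else:
        profile["other"] += 1

# Value-frequency check: how many spellings exist for the same country?
countries = Counter(row["country"] for row in rows)

print(profile)
print(countries)
```

Even this toy profile immediately surfaces the two issues mentioned above: postal codes that match no valid pattern, and several spellings for the same country.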

3.  Cleanse, transform, and validate: This includes updating the data so that it meets specific patterns, transforming the data according to rules, and validating data against the SAP business context. This can involve combining two fields into one, splitting fields, updating the data within a field to match certain rules (for example, telephone number formats), and validating data against required fields and lookup values from the SAP business context and configuration. 
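
To give a feel for what such rules look like, here is a hedged sketch in Python; the field names, rules, and thresholds are invented for illustration and are not taken from the RDS content:

```python
import re

def validate_record(rec, valid_countries):
    """Apply a few example validation rules to one record.
    Returns a list of error messages; an empty list means the record passes."""
    errors = []

    # Required-field rule
    if not rec.get("name"):
        errors.append("NAME is required")

    # Lookup rule: the country must exist in the target configuration
    if rec.get("country") not in valid_countries:
        errors.append("unknown country code: %r" % rec.get("country"))

    # Pattern rule: normalize the phone number to digits only,
    # then require at least 10 digits
    digits = re.sub(r"\D", "", rec.get("phone", ""))
    if len(digits) < 10:
        errors.append("PHONE must contain at least 10 digits")
    else:
        rec["phone"] = digits  # store the cleansed value

    return errors

valid = {"US", "DE", "FR"}
good = {"name": "ACME Corp", "country": "DE", "phone": "+49 (89) 1234-5678"}
bad = {"name": "", "country": "XX", "phone": "123"}

print(validate_record(good, valid))  # [] - passes, phone is cleansed in place
print(validate_record(bad, valid))   # prints three error messages
```

Records that fail are typically held back in the staging area, corrected, and re-validated in the next iteration rather than loaded.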

4.  Extraction of SAP configuration: As part of an SAP implementation, SAP is configured with many values such as plants, material types and groups, and sales territories. Mapping of the source data normally requires mapping fields that comply with the SAP configuration. The extraction of SAP configuration data takes the settings in the SAP system so that the source data can conform to the required format in the target system.
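
Conceptually, the extracted configuration becomes a set of lookup tables that the validation step checks source values against. A simplified sketch in Python (the table and field names are illustrative, not the actual SAP check tables; in the real solution, Data Services jobs read these values from the target system):

```python
# Hypothetical check-table extracts from the target SAP system,
# e.g. the list of configured plants and material types.
config_lookups = {
    "PLANT": {"1000", "2000"},
    "MATERIAL_TYPE": {"FERT", "ROH", "HALB"},
}

def conforms(field, value):
    """True if the value is present in the target system's configuration."""
    return value in config_lookups.get(field, set())

source_rows = [
    {"PLANT": "1000", "MATERIAL_TYPE": "FERT"},
    {"PLANT": "9999", "MATERIAL_TYPE": "ROH"},  # plant 9999 is not configured
]

for row in source_rows:
    failed = [f for f, v in row.items() if not conforms(f, v)]
    print(row, "OK" if not failed else "fails lookup: %s" % failed)
```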

5.  Reconciliation: Reconciliation looks at what was actually loaded versus what was expected to be loaded.
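
In its simplest form, reconciliation compares the set of keys staged for load against the set actually found in the target. Something along these lines (the keys are hypothetical):

```python
def reconcile(expected_keys, loaded_keys):
    """Compare the records staged for load with those found in the target."""
    expected, loaded = set(expected_keys), set(loaded_keys)
    return {
        "missing_in_target": sorted(expected - loaded),
        "unexpected_in_target": sorted(loaded - expected),
        "match_count": len(expected & loaded),
    }

# Hypothetical material numbers: staged for load vs. read back from the target
result = reconcile(["MAT-001", "MAT-002", "MAT-003"],
                   ["MAT-001", "MAT-003", "MAT-099"])
print(result)
# {'missing_in_target': ['MAT-002'], 'unexpected_in_target': ['MAT-099'], 'match_count': 2}
```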

6.  Reporting and dashboards: Throughout the process, dashboards are available so that everyone involved in the project knows the status of the migration. Additionally, the migration kicks off the broader process of setting data quality expectations and establishing governance around data management. 

The process outlined in Figure 1 enables companies to migrate data quickly and effectively – data migration is no longer a daunting task. The SAP Rapid-deployment solutions for data migration will empower you to cleanse, transform, and load data like never before. 

Look out for my next blogs on the data migration rapid-deployment solutions: for a step-by-step deep dive on using the migration content in Data Services, please read part 2. If you want to know how to visualize your data migration projects, check out part 3.

For more information and to ask questions to the community, please visit our Data Migration SCN space:

Data Migration

  • There is a new date available for attending the all-new SAP Education Data Migration training workshop on-site. The course covers SAP Data Services usage combined with the SAP RDS Rapid Data Migration content.

    New date is November 1-2, location is Newtown Square, PA.

    More information and how to register under the following link:

    The two-day workshop is available in SAP Education as course TZIM4M.

  • Very insightful, Frank. I missed the previous workshops. Are there any more online trainings or workshops coming up specifically for RDS Data Migration? Very interested in attending one.

    • Hi Caroline,

      As this was the first teach, we're currently reviewing the class and doing our lessons learned. Thus, there's currently no new date set. However, please check out the link above (for the search on TZIM4M class) once in a while for new dates.



  • Today we’ve released the Rapid Data Migration to cloud solutions from SAP package, including content for SAP Customer on Demand (CoD) and SuccessFactors Employee Central (EC). Please continue reading here.


    The official landing page in the SAP Service Marketplace is as below:

  • Hi Frank,

    Nice Article and great way of explanation.


    When is the usage of RDS recommended? I mean, we have a few customizations or custom fields in our SAP ERP (SD) implementation. Is it OK to proceed with RDS in place, given that it fits and is recommended more for standard implementations?

    What is the latest version of RDS available?

    What software should be in place to proceed with RDS data migration?

    We have BODS 4.1 and Information Steward 4.1 in place on the same server. Is there anything extra we need?

    FYI, we took a conventional approach in the last phase of the implementation, using BODS only for uplifting the legacy data and converting it to SAP template formats, and then LSMW to load the data into ECC. Now we are thinking of using RDS for this phase of the implementation.

    Thanks again for your valuable inputs and suggestions.



    • Hi Srinivas,

      Thank you for the feedback!

      Using the Rapid Data Migration content is recommended for any legacy data migration where high data quality is in focus. The built-in data validation rules that come out of the box with the RDS package are reusable even for different or custom data objects. The Data Services data quality capabilities are also a core differentiator compared to other data mapping or data load tools, like LSMW for example. Customizing is easy, as any of the data flows or jobs in Data Services can be used as a template and replicated for your own use. We even deliver an enhancement guide with the package explaining how to do so.

      The latest version is always available here:

      The current content for ERP and CRM target systems is available under the below link:

      If you browse through the package, you’ll be directed to all the material and also the SAP note which contains the Quick Guide. In this case the SAP note number is 1791183; the note shows you where to download the content:

      -> Installation and Upgrades

      -> Browse our download catalog

      -> SAP Rapid Deployment solutions

      -> Rapid data migration to SAP ERP and SAP CRM

        > SAP RDS BIP CONTENT V2 (BI Platform Content)

        > SAP RDS DS CONTENT V3 (Data Services Content)

        > SAP RDS MIGRATION SERVICES V1 (Migration Services Tool)

      This will work with SAP Data Services 4.1 and BusinessObjects BI Platform 4.0.

      The combined approach of using DS for the extract, profile, data quality, data validation, and mapping steps and LSMW for the load step is also possible, as the Rapid Data Migration content can also write to files rather than loading the IDocs directly via tRFC. These files can then be used in LSMW easily.

      Best regards,


    • Hi Srinivas,
      Would you mind sharing a sample (or PPT) of BODI/RDS with respect to extracting, transforming, and loading to SAP templates?

      Since you have done the data migration, I thought of seeking your help.

      kind regards

  • Hi Frank,

    Thanks a lot for your response and clarifications.

    We will leverage the positives of RDM, like the built-in data validation rules for the objects and the standard field-to-IDoc-segment mappings.

    Here is our scenario,

    We are not able to find a few data objects we require for our implementation in the list of RDM IDoc mapping templates. Can we still load them through IDocs (outside of RDM)?

    We have been provided with data object templates from the functional consultant for mapping to the legacy systems; is there any way to map these templates to the corresponding IDocs?

    As we have implemented MM and FICO already and are doing SD now, many objects are extensions of existing objects rather than first-time loads. Does IDoc load help here?

    Is it the right approach to go with a blended approach? A few objects with RDM and a few with the conventional approach using LSMW (for objects not part of RDM and objects under extension views)?



    • Hi Srinivas,

      The Rapid Data Migration content will be a good starting point also for any missing or custom objects. However, finding the right interface for these is something you want to look into with a functional resource. They will also help with the mapping. Don't forget: as part of the data migration content you also get Excel mapping sheets, where the target side has been pre-filled by us with the IDoc interface structure and the meanings of the fields. You always have two possibilities: use the mapping engine in Data Services to map directly from the legacy structure to this target IDoc structure, or have the functional people fill out our sample Excel source files.

      Regarding the use of LSMW: that's always possible with the file output. However, make sure to use Data Services for getting the data quality right. Also bear in mind that LSMW uses BAPIs and IDocs, too; both are used in our content via the BAPI/ALE interface (SAP transaction code BDBG) in the same way LSMW does it. The only missing parts are Direct Input, Batch Input, and Screen Recording, techniques for which you would want to run the load via LSMW.

      Hope that helps,


      • Hi Frank,

        Many thanks for the fantastic blog on RDS.

        It would be great if you could post a blog or paper on RDM using custom IDocs. In most scenarios we would need to change the standard IDocs to meet the requirements.

        thanks in advance & awaiting for your reply.



    • Hi Srinivas,

      thanks for the good questions you have asked about using custom IDocs as part of RDM.

      I am also looking for similar options. In phase 1 we migrated some data by writing ABAP programs to process the files generated by BODS; the flow was as follows: BODS --> FTP file --> ABAP program with BAPI to process the files. Now I want to try IDocs, but we have many custom fields. Have you found any feasible way to use custom IDocs for data migration? If yes, please let me know what kind of customization you did and what settings you made to process those custom IDocs in the SAP system.

      Many thanks,


  • Hi Frank:

    I would like to know some more info regarding one of the point you mentioned:

    "4.   Extraction of SAP configuration: As part of an SAP implementation, SAP is configured with many values such as plants, material types and groups, and sales territories. Mapping of the source data normally requires mapping fields that comply with the SAP configuration. The extraction of SAP configuration data takes the settings in the SAP system so that the source data can conform to the required format in the target system."

    How would I accomplish this in one shot? Is there a way I can connect to the target SAP system and pull in all the configuration at once? If yes, can the Rapid Data Migration toolset help me with this, and how?

    Thanks in advance!


    • Hi Avinash,

      With our SAP Data Services content, which we developed and you deploy into your own SAP Data Services instance, you get migration-object-related content that reads the check tables (aka lookups or T-tables) and sends the information back to SAP Data Services. With this we basically replicate the customizing settings from the target SAP system back into our migration platform. We also provide pre-configured content for this - but in any case, you want to run this initial lookup population job to learn the values in the target system so that a proper validation can be performed in the Rapid Data Migration process.

      Hope this helps.



      • Thanks Frank for your response!

        Does that mean that if I go ahead with RDS and Data Services, extraction of the target SAP configuration into my local repository/database would be possible in one stretch? If yes, could the extraction be scheduled on a weekly/monthly basis so that it runs automatically?

        Well, I am very new to RDS, so my questions may seem weird. Sorry for that.

        Thanks in adv!


  • See how Kellogg's came across all the big data pitfalls while using the Rapid Data Migration RDS during their project. This is a nice story on how a daunting task becomes GR-R-REAT! 🙂

    • Hi Karin,

      Unfortunately we don't have Rapid Data Migration content for Social Services objects yet. But there is another RDS package out there using the same methodology. This rapid-deployment solution takes care of social data load into SAP CEI; for more information, please read my colleague Bharath's blog, part 1 and part 2.



  • I'm installing the RDM content for Retail. When I try to install the BI content, it ends in an error. I think it's because I've installed the Information Platform Services. Do I need the full BI platform for this?

    • Yes, that is right. You can run the SAP Data Services (DS) content standalone with IPS (Information Platform Services), but then the BI content in the form of the WebI reports cannot be used. To leverage the rich BI content (not mandatory, but it helps a lot to resolve data errors during the migration), you want to have the SAP BI Platform. You can then install DS on top of that and don't need IPS anymore.

    • Hi Dominik,

      Did you also see this video?

      SAP Rapid Data Migration - Demo - YouTube

      Regarding your question: the Migration Services tool is part of every Rapid Data Migration package. The tool is simply deployed on the web server (underlying the BI Launch Pad), and the deployment is as simple as copying the WAR file into the right folder.

      I'd suggest to download the content of one specific package first, for example the ERP package. Use the 'Download it here' button and explore the content:

      There is also a link on that page to the SAP note, which includes the user guide for the Migration Services tool and the Quick Guide to install everything. The note also describes where to get the content, including the WAR file; it's in the Software Download Center (see the note for details). There you'll find the various parts:


      Migration Services V1 is what you are looking for.

      I hope that helps, please let me know in case you need anything else.



      • Hi Frank, thank you very much for your detailed reply. I've one last question, are there any license costs to use the SAP RDS? Or is this "best practice" for free? Thanks in advance!

        • Hi Dominik,

          There is no additional cost for the RDS content. You can simply download it with your S-user from the Service Marketplace and the Software Download Center. It's included in your software license and brings you additional value free of charge.

          Best regards,


  • Hi Frank

    First of all, thank you very much for the detailed blog.

    Can you please confirm whether there are any standard DS jobs (from the RDS content) to load Warehouse Management data (Bins, Initial Stock Balance, Storage Unit, Stock Transfer Orders) and Purchasing module data (Purchasing Info Records, Condition Records, Purchase Order Confirmations, Message Determination, Purchase Order Manifest)?

    Best Regards

    Balaram Agarwal

    • Hi Balaram,

      Thanks for your interest in the Rapid Data Migration packages! On the package page you will find the list of supported objects (Data Migration Objects).

      We cover some of the objects you mentioned; for some others I'm not completely sure which ones you are referring to. So let me quickly walk through it...

      Warehouse Management:

      Bins --> There is a standard program that can be used in LSMW (really old, not supported by our content)

      Initial Stock balance --> Inventory object, part of the ERP package

      Storage Unit --> Either customizing or part of Material Master (ERP package)

      Stock Transfer Orders --> Please check the Inventory object if the fields you are looking for are covered

      Purchasing Module Data:

      Purchasing Info Records --> Object exists

      Condition Records --> Object exists

      Purchase Order Confirmations --> Did you check the Purchase Order object?

      I'm sorry, but I do not have any idea what the objects Message Determination and Purchase Order Manifest are. Could you please give me more details?

      Best regards,


      • Thanks Frank for your response.

        Quick question: can you please confirm how we can use the Migration Services? Do I need to download any tool or install any software to use it?

        Thanks & Regards

        Balaram Agarwal 

        • Hi Balaram,

          The tool Migration Services is part of the RDS content. That means, once you download any of the Rapid Data Migration packages, it will be included.

          The deployment is quite easy: as it runs on top of the Apache Tomcat web server (which also powers the BI Platform's CMC/BI Launch Pad and the SAP Data Services web tools), it's as easy as copying the WAR file from the download into the dedicated Tomcat folder. And our documentation even explains it step by step!

          Here you'll get the download, just click on the dedicated Rapid Data Migration package, for instance rapid data migration to SAP ERP and SAP CRM right at the beginning.

          Hope that helps!



  • Hi Frank.   

    We're thinking of migrating our BW 7.01 to BW on HANA. We have read a lot of documentation and we have the following doubt: is it necessary to use the RDS methodology?



  • Hi Frank, thanks for the info. I'm on a banking project that's migrating deposits from a legacy system. The thing is, the deposit master data is already in SAP, but neither the balances nor the transactional data are. Would this apply to this project? Do you have any info I can take a look at? I've tried to look for info for the banking industry but was unable to find it.

    Thx in advance


    Nicolás Cieri

      • Thanks for your answer, Frank! I see what you told me, but when I click on the SAP LM data migration link it doesn't take me to the webpage. I was actually looking for a DM RDS or migration tool; if you can help, it would be much appreciated.



        • Hi Nicolas,

          That's right, the LM package is no longer available. We have to clean that up in all our documentation. Is that the package you were looking for? If yes, please send me a private message. In the meantime, please check the object list PDF to find out whether the objects you are looking for are covered.

          Best regards,


  • hi Frank,
    thanks for excellent posts on data migration topic.

    I am new to SAP data migration, but familiar with ETL (Informatica).

    We have some plans for next year: the data migration requirement is from PeopleSoft Finance & HCM to SAP ECC/HCM.

    My question is:

    Does the RDS package help in the above context by having a pre-defined set of extracts/transformations/loads to SAP ECC or HCM?

    For example, through RDS can I directly map PeopleSoft as a source and extract/load data to HCM infotype tables, etc.?

    Kindly share some pointers to pick up such information.

    Kind Regards


    • Hi Sri,

      Absolutely! The RDS covers most of the HCM content for migrating data to SAP ERP, based on a US Baseline deployment.

      You can find a list of the objects, including infotypes, here:

      What is in it?

      The package comes with mapping sheets to identify the target fields and infotypes in SAP. Also, the validations are pre-built, and the setup is ready to connect to PeopleSoft. Data Services will automatically provide the metadata of the source, and you are ready to go with the extract and profiling of the source data in the tool.

      What do you have to do?

      You go ahead and perform the mapping according to the mapping templates that have been filled with the help of business people who know the source. The tool supports field and value mapping (e.g. country keys).

      Please just go ahead and download the content here and visit the overview page:

      Hope that helps,


      • Thank You Frank for the response.

        I shall go thru the RDS and come back to you if any queries.

        Appreciate your information sharing and encouragement of newbies like me.

        Kind Regards


  • Hi Frank,

    Recently my client has a requirement to do a master data migration to ECC. Hence, after following your blog, we have decided to use RDS. Now here are a few questions, if you can help us with the info...

    1. We have had BODS 4.2, BI, etc. installed on our server for many years for DWH projects, so do we need to pay any license cost for the RDS package (Migration Services, ATL code, WebI reports, etc.)?

    2. If we also want to use Information Steward for data profiling, what would the license cost for Information Steward be?

    Please help. Thanks

    • Hi there,

      First of all, thanks for following my blog and also for considering the Rapid Data Migration packages.

      I'm not a sales guy, so I cannot talk numbers. But one thing I can tell you: the Rapid Data Migration content that you can download in the SAP Service Marketplace (SAP Service Marketplace - Rapid data migration to SAP ERP / CRM HANA) and Software Download Center (…) comes at no extra cost! Call it free of charge; I call it best practices delivered by RDS as part of your existing DS license. You are already on the latest version (you might need to upgrade the SP), so you can just deploy the ATL files and WebIs to DS and the BI platform. Again, at no cost for you.

      Regarding Information Steward, that comes with an extra license. My recommendation is to talk to your SAP sales contact as these costs are country specific. My suggestion: Go and start with the content and see what you can already do. Then you still can decide whether you'll need IS and what kind of value it would give to you.

      Best regards,


      • Hi Frank,

        Thanks for the detailed information. I am glad to update you that we have started the installation of RDS ERP. However, I am facing one problem in the Migration Services tool, which always gives me the message "MGMT_PROFILE_MAP is empty." I cannot see any object/view or lookup table in this tool. We have successfully executed the jobs Job_DM_Lookups_Initialise and Job_DM_Lookups_FileToApp, which I know populate the lookup tables, and I also checked the staging database; the lookup tables are populated with the data provided in the package content as lookup CSV files. I tried to investigate but could not understand why I am not able to see the details in Migration Services. Could you please suggest?

        • Hi Joy,

          I am from the rapid data migration development team. We are glad that your team decided to use our RDS solution.

          There can be many reasons causing your issue. Please check the following steps:

               STEP 1: Please check that the datastore for the STAGE database is configured successfully.

               STEP 2: Please check whether the table MGMT_PROFILE_MAP is really empty in the database.

               STEP 3: Please check whether the configuration of the Staging Area in the Migration Services tool is correct. For example, in SME mode the Repository name field does not need to be filled out.

          For remote support, could you please create a customer message under the component SV-RDS-EIM and provide the detailed system environment?

          We will process it ASAP.

          Best regards,


          • Thanks Benny, my issue is now resolved after removing the repository name field as per step 3. 🙂

            Just few questions:

            1. I can see that all the lookup files pre-exist in the Migration_erp folder and are already filled with data. Do we need to modify this data after reviewing our own ERP system and populate the lookup files with our own data? How should we proceed? The reason I ask is that we may have many other distinct values in our ERP tables which do not exist in the pre-existing sample lookup files.


          • You are right, the content comes already pre-filled with what we call the Baseline for US. It's like a sample, best-practices system. So there is a delta to your own ERP-system. But you can already get jump-started.

            This pre-filled content got populated when you first ran the Lookup "file to application" job in Data Services (Job_DM_Lookups_FileToApp). This read the data from the sample files we provide.

            You can get your own lookup data into the data migration application by running the "SAP to application" lookup job (Job_DM_Lookups_SAPToApp). This should be done once you have some of your own customizing in place. You can repeat it every time you change something in your SAP system setup.

            Hope that helps,


          • Benny,

            I also have the MGMT_PROFILE_MAP error with Migration Services and followed all the relevant steps. My staging area is on Sybase ASE and the repository is SQL Anywhere. This problem was with Migration Services 3.00.

            Fortunately I had the MigrationServices.war file for 2.42 and so I tried this one and it worked fine....

            Any clue as to why the new MigrationServices.war would give me this problem?


            As my staging area is Sybase ASE, I changed my universe connections to ODBC with ASE, and the lookup reports run fine. I do have problems with the Data Migration Object reports, as they do not like the syntax of the derived tables on which the universe objects are based. As Sybase ASE 15.7 is supported, I am surprised at this problem. Trying to work around it, the only resolution I can see is to take the derived-table syntax from the universe, input it into Data Services as freehand SQL, and then load it into a new table. This works but is very long-winded, as I need to do this for every derived table and then repoint each object in the universe...

            Appreciate any comments you have



          • Hi Chris,

            Could you add more detailed information about this issue?

            Starting with this version, you need to do some additional configuration. Could you please click the question mark at the top right corner and check the steps? You will find the detailed information there.

            Best regards,


          • Hi Chris,

            I'm not sure yet what the real reason is from our side.

            Please copy and zip the logs from your Tomcat installation directory, for example E:\Program Files (x86)\SAP BusinessObjects\tomcat\logs, and then create a ticket under the component SV-RDS-EIM and attach the logs.

            We will update you ASAP.


            Best regards,


          • Benny - I am using Migration Services 2.42 for now as it works fine in the same environment. I will reproduce the problem in 3.00 later and send in the error.

            What is the main difference in 2.42 and 3.00?

          • Hi Chris,


            About the difference between 2.42 and 3.00: some code in 2.42 did not follow SAP's new security policy, so in 3.00 we made changes to address that.



          • Hi Benny,

            We are using RDS for data migration. I am facing a similar issue: MGMT_PROFILE_MAP is empty.

            I have executed the jobs with the right parameters as per the RDS documents. Please help.



  • Hi Frank

    We are using the SAP Rapid Data Migration package to load material master data and trying to upload all the views of the materials in one single job using the IDoc (MATMAS_MASS_BAPI_03). However, the IDoc is failing with an out-of-memory issue.

    Is it advisable to load all segments of the materials in one go, or should we split the program to load view by view? Or is there anything that needs to be done on the application server from a performance-tuning point of view?

    Your suggestion will be highly appreciated.



    • Hi Balaram,

      First of all, great to hear that you are using our Rapid Data Migration RDS! I'm eager to hear your feedback after the project 🙂

      Regarding your issue, I'm assuming that the memory issue is only for material master data. As the IDoc that is created by SAP Data Services and our content is for mass upload, make sure that the memory settings in your system are correct.

      I suggest setting abap/heap_area_dia and abap/heap_area_total to 6 GB just for testing. You can use the RSMEMORY ABAP report for this purpose. The value can be set to 6000000000 for a quick test; please save your settings using the Copy button. No restart is necessary, as these parameters are set/changed dynamically.

      If you still get the dump (I assume you are talking about an out-of-memory or a no-roll dump), check the checkbox segments: a BAPI needs the so-called X-segments set to 'X' for every value that you want to update (e.g. E1BPE1MARAX for E1BPE1MARA). Maybe it would be easier to start with this check first. If the check segments are not filled correctly, the ALE IDoc inbound runtime may fail with an endless loop, resulting in such a dump.

      Please let me know if that was helpful. If the issue still persists, please send more information.

      Best regards,


      • Hi Frank

        Many thanks for your proactive response.

        I am not using the X-segment and deleted it from the program, as we have no requirement to update the material master data. Am I doing something wrong there? Is this segment necessary, and should it be populated with a default 'X' value?

        Secondly, can you please confirm the message type and process code values to be used for the IDoc? I am using 'MATMAS_MASS_BAPI' as the message type and 'BAPI' as the process code.

        Thanks once again.

        Balaram Agarwal

        • Hi Balaram,

          Then that's the issue. You have to use the X-segment whenever you use the data segment. What you need to do: just fill in an 'X' for every field that you want to update from the real segment. The X-segment is mandatory whenever you fill the segment. Please check how the Data Services data flows are built for the other segments.
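As an illustration of the pattern, here is a minimal sketch (hypothetical Python, not part of the RDS content) of deriving the X-segment from a filled data segment:

```python
def build_x_segment(segment):
    """Derive the checkbox (X-) segment from a filled data segment:
    every field carrying a value in e.g. E1BPE1MARA gets an 'X' in the
    matching E1BPE1MARAX field, telling the BAPI to update that field."""
    return {field: "X" for field, value in segment.items() if value not in (None, "")}

# Hypothetical sample values, not taken from the RDS content:
mara = {"MATL_TYPE": "FERT", "IND_SECTOR": "M", "BASE_UOM": "EA", "OLD_MAT_NO": ""}
marax = build_x_segment(mara)  # only the three filled fields are flagged
```

Note that in real BAPI X-structures, key fields typically carry the key value itself rather than an 'X'; the point here is simply that the X-segment mirrors whatever you fill in the data segment.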

          Regarding your other question, to make use of the mass load of BAPI MATMAS_MASS_BAPI, you have to use process code BAPM instead of BAPI. The 'M' stands for mass upload.

          Hope that helps,


          • Hi Frank

            Many thanks for your reply. The job has executed successfully.

            I must say the content of Rapid Data Migration is really good and is saving a lot of time and effort.

            In a few cases I have not used the Rapid Data Migration content, as it only uses IDocs; I built my own program where a BAPI was available to load the data. The suggestion I would like to bring to the table is to use a BAPI rather than an IDoc where possible, as the BAPI returns parameters, which makes reconciliation easier.

            Thanks once again


          • Hi Balaram,

            Great that you're moving along with using the RDM RDS.

            Would it be possible to get some feedback from you regarding your experience with the solution?

            You can reach me at

            Frank and I would be appreciative of your thoughts and feedback.



          • Hi Frank , Balaram,


            Can you please provide the details needed to create the 'BAPM' process code to process IDoc MATMAS_MASS_BAPI_03, for example the function module to use?



            Shiva Sahu

  • Hi Frank Densborn/ Michael Sanjongco

    Great article.

    I was going through the RDS for SuccessFactors and Workforce Analytics, specifically this:

    " HCI (DS) Configuration for SuccessFactors Workforce Analytics (X64)

    Building Block Configuration Guide", and I can see that there is a config step where we need to create a custom infotype (9001). At the moment we are using this infotype for other purposes...

    So if we create a different one, let's say 9020 instead, what other settings would we have to make to allow HCI-DS to communicate with SAP HR ECC?

    3.2 Create Infotype


    In this section, you create the infotype that will be used by HCI-DS to communicate with the SAP HCM backend system.


    • Hi Mariano,

      In your case, the minimum change is to replace the source table PA9001 with PA9020; no other changes are needed. Please follow these steps to update your HCI-DS tasks:

      Step 1: Go to the DATASTORES tab and import the table PA9020, which is created automatically once infotype 9020 has been created successfully via transaction PM01.

      Step 2: Go to the PROJECTS tab. You will see two tasks that need to be updated, WFA_PA9001_Delta and WFA_PA9001_Initial; you don't need to change the task names.

      Step 3: Replace the source table PA9001 with PA9020 in those two tasks. For example, in task WFA_PA9001_Delta you need to change two data flows, DF_PA9001_Delta_Stage and DF_PA9001_Deletes; you don't need to change the data flow names. In the Edit Data Flow screen, click the Source Table button, drag the source name PA9020 into the ABAP group box, delete the source table PA9001, and then adjust the mapping etc.

      In DF_PA9001_Delta_Stage, you may need to do the replacement twice, because the source table PA9001 is used twice.

      To verify that the change works, please access your HANA instance for Workforce Analytics and check that the data in the table STAGE_PA9001 is correct for infotype 9020.

      By the way, please back up the tasks before you change them.



  • Hi,

    I am deploying the RDS for a customer implementing SAP ERP, and all is going well apart from issues with the Migration.war file, which were resolved by using an older version, and with the prebuilt universe and reports. These are built for MS SQL Server, but my DB is Sybase ASE, so I needed to change a lot of syntax, as Sybase does not like quotes, and I had to alter the universe derived tables.

    Moving on, can you pass on your recommendations for the languages of business objects? For example, for equipment I am loading various items for my customer, and they are needed in French and English. As my target IDoc for equipment does not have a language field, I said I cannot meet this requirement and that LSMW should be used for languages instead. Is there a better way in SAP Data Services to handle language fields? This will come up for many other objects like vendor, material, etc.

    Any thoughts are appreciated. I am limited by my target IDoc: if there is no field there for the requirement, I am stuck unless I find another IDoc.



    • Hi Chris,

      The data migration for Equipment is done via a BAPI (EQUI.Create) that is used with an IDoc shell (EQUIPMENT_CREATE). LSMW would use the same interface and, as you said, there is no language field as part of this interface.

      What fields are you referring to that should be localized? If it's about long texts, typically this is loaded with a second step after the initial load with a different object.

      Please list all the fields that you are referring to and I'll check in detail.

      Best regards,


      • Two fields we wanted were -



        For the language fields we wanted -



        What is your recommendation for languages for RDS?

        Many thanks for all your input on this thread - it is highly appreciated

        • Hi Chris,

          If I understand right, it's about partner function and partner number. Typically, you'd load all the data in one language (e.g. English) with the correct partner function value for that language. If the customizing is done correctly, the partner function will display with another code depending on the logon language. The texts behind PARVW should be maintained in table TPART; there, the code refers to a text depending on the language.
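As a rough illustration of that language-dependent resolution, here is a toy lookup keyed the way TPART is (the entries below are made up for the sketch, not the real table contents):

```python
# Toy TPART-style lookup: partner function text keyed by (logon language, PARVW code).
TPART = {
    ("EN", "VN"): "Vendor",       # hypothetical sample entries
    ("FR", "VN"): "Fournisseur",
}

def partner_function_text(langu, parvw):
    """Resolve the display text for a partner function code in the given
    logon language, falling back to the raw code if no text is maintained."""
    return TPART.get((langu, parvw), parvw)
```

The same stored code thus displays differently per logon language, which is why the data itself only needs to be loaded once.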

          However, equipment-related texts are stored in the RM63T structure depending on the language. I'll need to have another look into this. Could you please let me know which fields you want to fill in the BAPI interface structure related to this (I'm not an equipment expert)?



    • Hi Chris,

      Regarding Sybase ASE, the solution already supports it. It is better to connect to the database via ODBC instead of a direct connection to avoid the error you encountered, because ODBC offers a setting called Use Quoted Identifiers on the Connection tab.



      • Hi Chris,

        We deliver a document called the package configuration guide, which has a chapter on configuring universe connections when using Sybase ASE.

        Below is the relevant excerpt:

        Select the ODBC drivers if you are using Sybase ASE 15.x. Make sure you have installed the Sybase ASE client before creating a new ODBC data source for Sybase ASE. Select the driver Adaptive Server Enterprise, and select the indicator named Use Quoted Identifiers on the Connection tab.

        • Great to know, Thomas, though I did not find that document; it would have saved me a fair bit of time. To be honest, I found all the documents a bit of a minefield to navigate; I think that could be improved by consolidating them into a couple of large PDFs. Could you tell me where this document is located in the downloads for SAP RDS for ERP and CRM, as I was unable to find it?

          A comment on the Excel template spreadsheets as a source: when a value like 000056789 is required, the Excel column should be formatted as text; otherwise I found the leading zeros are removed, and the lookup then fails.
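If reformatting the Excel column is not an option, the leading zeros can also be restored in a staging step; a small sketch (the key width of 9 is only an example, use your system's key length):

```python
def pad_key(value, width=9):
    """Re-pad a numeric key (e.g. 56789 -> '000056789') that lost its
    leading zeros because Excel treated the cell as a number.
    Non-numeric keys are passed through unchanged."""
    s = str(value).strip()
    return s.zfill(width) if s.isdigit() else s
```

This way both text-formatted and number-formatted cells end up as the same lookup key.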

          Once more, thank you for your time, and I hope you are happy to receive constructive feedback. If you need any information from me, please ask. I will be passing on my knowledge from the RDS to the customer so they can be self-sufficient in loading data, and I will look to pass on their feedback should it interest you. I am sure I will be using other RDS solutions in the future, and if you have any ramp-up programs to test the latest offerings, I am happy to be a guinea pig.

          Cheers again

          • Hi Chris,

            Sorry for not letting you know where to get the document; it was an attachment in SAP Note 1791183.

            Chapter 5.6 - Configure Universe Connections



  • I am trying to load profit centers into my ERP, where I have multiple company codes assigned to a single profit center. When loading through the IDoc PROFITCENTER_CREATE01 it does not seem to work: the first line with the profit center passes, but the second line gives me an error, as I believe it is loading the same profit center again. The documentation says segment E1BP_0015_7 allows one-to-many. I have attached a sample of the input file, wondering if it is filled incorrectly.

    Thanks for any feedback

    • Hi Chris,

      Sorry for the late response. The IDoc is OK; I did a small test with one profit center corresponding to multiple company codes, and it works fine.

      The issue occurs in the processing logic of the DS content when calling the IDoc: it wrongly generates two IDocs in this situation. The correct behavior would be one IDoc whose segment E1BP_0015_7 contains both company code entries.
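The intended one-IDoc-per-profit-center shape can be sketched like this (hypothetical Python; the field names PRCTR and BUKRS follow common SAP conventions and are not taken from the actual DS content):

```python
from itertools import groupby
from operator import itemgetter

def build_idocs(rows):
    """Group flat input rows so that each profit center yields ONE IDoc
    whose E1BP_0015_7 segment repeats once per assigned company code,
    instead of one IDoc per input row."""
    rows = sorted(rows, key=itemgetter("PRCTR"))
    return [
        {"profit_center": prctr,
         "E1BP_0015_7": [{"COMP_CODE": r["BUKRS"]} for r in grp]}
        for prctr, grp in groupby(rows, key=itemgetter("PRCTR"))
    ]

# Hypothetical sample input: one profit center with two company codes.
rows = [
    {"PRCTR": "PC100", "BUKRS": "1000"},
    {"PRCTR": "PC100", "BUKRS": "2000"},
    {"PRCTR": "PC200", "BUKRS": "1000"},
]
idocs = build_idocs(rows)  # two IDocs, not three
```

The same grouping idea applies inside the DS data flow before the IDoc target.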

      I will check the content and update you further.



    • Hi Chris,

      You need to make some changes to the DS data flow called DF_DM_ProfitCenters_GenerateIDOC; this is only a temporary solution. For the long term, the company-code-related info needs to be separated from the profit center master data into its own Excel worksheet to avoid the redundant records. Could you tell me which DS version you are using, so that I can build the DS content for you?



      • Thomas,

        Thanks for getting back to me. I am running DS 4.2, but don't worry about it for now, as we are not going to have multiple company codes. I will leave it for you guys to fix in the next release of the RDS.

        I am using the RDS for many objects, so if you want more feedback, shall I contact you directly or simply post here?

  • Hi,

    It is great the way SAP Data Services uses its own functionality to create new numbers for items such as materials, but this has limitations. If I run Data Services to create new materials, this works fine. However, if I then manually add a new material within SAP and later go to add more records with Data Services, the problem is that Data Services does not know I have added a new material and hence used a number from the available range. It will therefore try to use a number that is already in use.

    I have modified the function within DS which assigns a new material number so that it uses the SAP function NUMBER_GET_NEXT. I believe this is an enhancement, as Data Services and SAP then stay in sync with each other.
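A toy sketch of why a single shared number source avoids the collision (this is only an illustration of the idea, not the actual NUMBER_GET_NEXT implementation, which you would call in the SAP system):

```python
class NumberRange:
    """Toy stand-in for SAP's NUMBER_GET_NEXT: one shared source of truth
    for the next free number. If Data Services fetches numbers from here
    instead of keeping its own counter, manually created materials can no
    longer collide with numbers handed out to Data Services."""
    def __init__(self, current=0):
        self.current = current

    def get_next(self):
        self.current += 1
        return "%018d" % self.current  # material numbers are 18 characters

nr = NumberRange()
ds_number = nr.get_next()       # Data Services requests a number
manual_number = nr.get_next()   # a user creates a material in SAP manually
next_ds_number = nr.get_next()  # Data Services requests again - no clash
```

With a DS-local counter, the second and third requests would both have produced the same number.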

    I was just wondering if you have any comments on this customization I implemented.



    • Hi Chris,

      This is great and actually how we'd love to see our customers using the content as Best Practices and enhancing it! It would be great if you could share the code with the community!



  • Thanks for the blog - this is a great forum.

    I can see that running DS 4.x seems like a good idea. However, my client runs DS 3.2 - any comments on DS versions?

    Should they upgrade? Or will the RDS packages work anyway?

    • Hi Ole,

      We keep updating our content to make it run with the latest release. This means that the most recent content is built for DS 4.2 and won't work in DS 3.2. What is the reason for not upgrading? You could also install 4.2 and check whether you can downport the content.

      I doubt that you can simply import the DS 4.2 content into DS 3.2.



  • Excellent series of blogs on Rapid data migration for a beginner.

    This is what I was looking for! Thank you 🙂

  • Hello Frank,

    The hyperlink in paragraph 2 with the text "A better way to migrate" doesn't seem to work; the address is not reachable. Please update it.

  • Hi Frank,

    We are migrating customer master data from ECC to HANA with SAP BODS (Data Services Designer). We downloaded the standard packages from the SAP Marketplace, but when we try to import the ATL files from the package into the BODS system, it asks for a password. Could you tell me which password should be given to import the ATL file from the Rapid Data Migration package? Please reach me via private message if needed.



    • Hi Hari,

      There is no password (blank); just hit Enter or click OK. The password should be left blank, as also stated in the documentation. Did you download the documentation package on the Service Marketplace, too? Also, please check out all the content attached to the associated SAP Note.



  • Hello Frank,

    Are the Rapid Deployment Solutions also available for SAP S/4HANA on-premise 1511 and 1610?

    If yes, where can I find more information on this topic?

    Thank you in advance.



    • Hi Christian,

      Sorry about the late reply, still figuring out how I receive the notifications on our new blogging/collaboration platform...

      There are packages for SAP S/4HANA and actually, the Rapid Data Migration approach is the best way to migrate to a New SAP S/4HANA Implementation!

      You can find the package in the new Best Practices Explorer:

      For more information, please continue reading here:

      And probably best of all, SAP S/4HANA customers can get access to a free Data Integrator license key code for their SAP Data Services system:

      Another good source of information might be one of these books (depending on your preferred language):
      Hope this helps,

  • Hi Frank,

    In the S/4HANA customer presentation RDM_S4H_DSV42V2_Pack_Presentation, the following bullets of interest on loading G/L and customer/vendor (aka business partner) balances are shown on slide 23 of the deck:

    • Vendor Open Items (AP)
    • Customer Open Items (AR)
    • GL Open Items
    • GL Balances

    However, this seems to be applicable to the SAP migration scenario only, and I need clarification on whether these balances can be automated through the RDS for the non-SAP customer (greenfield) scenario moving onto 1610.

    My initial perception from the migration options presented is that the answer is no. Could you please clarify?

    ~ Wayne


    • Wayne,

      All these objects are for SAP S/4HANA new implementations, independent of an SAP source system (greenfield) or a non-SAP source system (legacy data migration). While the historical data should be left behind, the data migration tools allow for a migration of open transactional data.


  • Hi Frank,

    We've been using this product since we began our global migration to SAP ERP back in 2011. Every 18 months, after a cycle go-live, I have the opportunity to upgrade the software to move us back into support. One thing that is holding us back is that RDM2SAPERP is not certified on Oracle 12 or SQL Server 2012. Right now we are on extended support on Oracle 11gr2.0.4 just for this package, and our DS servers are still on 2008 R2 just for this package, while all our other data migration products have advanced (SQL 2012, etc.). My question: has anyone tried this package on Oracle 12 or SQL 2012? I fear SAP is not going to certify this product on any later versions...

    • Hi Chris,

      I understand your concern that our RDM solution was built on certain versions of the databases, but that doesn't mean our solution cannot run on newer versions. The reason is that our migration solution is built on SAP Data Services, so whether it works on the newer databases depends on whether SAP Data Services supports them. If SAP Data Services can run on SQL Server 2012, I don't see why our migration solution could not as well.

      Our latest RDM solution was built and tested on the following databases:

      SAP HANA 1.x
      Oracle 12c
      Microsoft SQL Server 2008 R2
      IBM DB2 v9.7
      Sybase ASE 15.7

    • Hi Surya,

      Unfortunately, we don't have RDS packages for migrating to SAP Ariba and we're not planning on that.

      However, we do have Best Practices content for integrating with SAP Ariba, please use the Best Practices Explorer as the single point of entry:


  • Hi Frank,

    This is a wonderful forum for RDM. I have a question: I want to convert the WIP (production orders with their components and routings) from legacy ECC 5.0 to ECC 6.0 on HANA, but SAP has low performance when I use the traditional method. I wonder if RDM is a quicker way to convert all this data.


    • Thank you for the feedback!

      First of all, let me ask why you want to consider a greenfield approach (data migration to a new system) rather than brownfield (upgrading your system and converting it to the HANA DB via a DB migration)? Do you have issues with the upgrade or the DB conversion (or both)?

      When you do a data migration into a new system instead, you should also do business process re-engineering. You're starting with a brand-new car rather than putting a new engine into your facelifted model. But it also means you have to set up the system from scratch.

      I'd recommend running an internal analysis and creating the business case for both options to compare and decide. Please validate that all required objects are part of the data migration content:


      • Hi Frank,

        Good Day,

        Quick question, I know you are the data guys: I want to download templates for each migration object for SAP Data Services, similar to downloading templates from the LTMC in the HANA cockpit. Where can we download templates from Data Services for each migration object, like material master, vendor, customer, etc.? Please let me know; I want to download the templates from Data Services or the RDM content along with the ATL files, because the functional team uses these templates to provide the data to load with Data Services.


        Thanks in advance


  • Hi Frank,

    Is there an RDM package for the Article Master object specific to SAP Retail? I can see Material Master in the list but not Article Master.

    • Hi Deepak,

      The standard ERP/CRM package does include the ARTMAS IDoc with the below segments:

      • E1BPE1MATHED: Header Segment with Control Info
      • E1BPE1MARART: Material Data at Client level
      • E1BPE1MARART1: Material Data at Client level 1
      • E1BPE1MAW1RT: Extension of Basic Data
      • E1BPE1MAKTRT: Material Descriptions
      • E1BPE1MARMRT: Units of Measure
      • E1BPE1MAMTRT: Unit-Dependent Material Texts
      • E1BPE1MEANRT: International Article Numbers (EANs)
      • E1BPE1MLEART: Vendor-specific EANs
      • E1BPE1MLANRT: Tax data

      However, as this is an older package, it might not fulfill your needs or be outdated for your system. You should double-check for that.

      This package can be downloaded here:

      Best regards,

  • Hi Frank,

    "The combination of the software, migration content, and optional consulting services provides a complete solution for migration to SAP applications. Please find a list of the supported objects here"

    In the above paragraph, the supported-objects URL seems to be unavailable.

    Could you please provide the new URL (after the migration to the new SAP Community) or the information that was in that link?




  • Hi Frank, do you know that in the recent BPDM/RDM package there is a job, Job_DM_Vendor_MC, which is not working? First of all, this job fails validation. It has fields like Title, many J_ fields, BPEXT, and so on in Vendor General and in two more places in the mapping. These fields are not present in the Excel input file defined as "MC_VENDORGENERALDATA"; they all have a red X, so it looks like validation did not even happen. Anyway, I solved this by mapping these fields to null. I then ran the job with the SAP-provided sample data, and it failed with the error below. We thought it could be a memory issue and bumped memory to 48 GB, but the issue affects 4-5 validate data flows; all of them are failing, and the validation conditions added in the Validate transform are causing it. (I am told that these are now part of interview questions to judge whether a candidate has used RDM.) Finally, I could resolve all issues and use RDM, but it was a shock at the beginning.

    |Session Job_DM_Vendor_MC|Data flow DF_DM_VendorGeneralData_Validate
    Data flow <DF_DM_VendorGeneralData_Validate> received a bad system message. Message text from the child process is:
    Collect the following and send to Customer Support:
    1. Log files (error_*, monitor_*, trace_*) associated with this failed job.
    2. Exported ATL file of this failed job.
    3. DDL statements of tables referenced in this failed job.
    4. Data to populate the tables referenced in the failed job. If not possible, get the last few rows (or a sample of them) from when the job failed.
    5. Core dump, if any, generated from this failed job.
    The process executing data flow <DF_DM_VendorGeneralData_Validate> has died abnormally. For NT, check errorlog.txt. For HPUX, check stack_trace.txt. Also, notify Technical Support.