
Data Migration: Installing Data Services and Best Practice Content

In a previous blog, First Look at SAP’s Data Migration Solution, I discussed the options for data migration and the technologies used, and noted that SAP delivers both software and content for data migration projects.  In this blog I’ll discuss the installation of the Data Services software and the loading of the best practice content.

In order to get started you can use online help or the bp-migration alias in SMP.  For this blog I used the best practices link in online help.  There is a lot of great collateral there to take advantage of, and the combination of the content download (bp-migration) and the online help gives you everything you need to get started!  It will take you to the ERP quick guide that literally walks you through what you need, including a silent installation.  I tried the silent installation with no luck, so I went to the appendix and followed the steps for the manual installation.  I’m sure the silent installation works if everything is set up correctly, but I wasn’t too worried about it since I wanted to see the required steps for the manual installation anyway.

Experience with the manual installation

With the manual installation you install Microsoft SQL Express and Data Services, and then do things like create specific repositories, load the best practice content into Data Services, and ensure everything is ready to run.  I had to do the manual installation between meetings, so it took me a couple of days, but in man-hours it probably only took half a day – and a lot of that was me reading the guide to make sure everything was done correctly.  There are a lot of scripts that you need to run, so I’d suggest setting the passwords to be the same as what the silent installation would set.

The installation of Microsoft SQL Express and Data Services is very straightforward.  I did them on my laptop, which has other SAP software installed, and didn’t run into any problems.  I couldn’t find a trial download of Data Services on SDN – hopefully your company has already purchased the product.  Someone mentioned you can download it and then just get a temporary key, but I’m not sure.  I also think there is a version of BI-Edge that includes it, but it’s not in the basic BI-Edge; you at least need the “BI Edge with Integrator Kit”.  If you have experience with using Data Services as a trial, please post a response and share!

Once the basic installation is done you start to import the content into Data Services.  The content is grouped into rar files that you download and unzip.  The import goes into a couple of Data Services repositories.  The first repository is related to all the lookups that are required during the migration.  The first content I unzipped had spreadsheets for IDOC mapping and files that will be used for lookups to validate data; all of this goes into the lookup repository.  One example is the following screenshot, which is for HCM data.  It has a description of each field, whether a lookup is required, and whether the field is required, along with other information.


Spreadsheet with HR IDOC information
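To make the idea of these mapping spreadsheets concrete, here is a minimal Python sketch of what reading one amounts to: each row describes a target field and flags whether a lookup and a value are required. The sheet content and field names below are invented for illustration (the real content ships as Excel workbooks in the best-practice download), and this is not the actual Data Services import logic.

```python
import csv
import io

# Hypothetical extract of a mapping spreadsheet; real sheets are Excel
# files delivered with the best practice content, and these rows are made up.
SHEET = """field,description,lookup_required,field_required
PERNR,Personnel number,N,Y
BUKRS,Company code,Y,Y
SPRSL,Language key,Y,N
"""

def load_field_rules(text):
    """Turn the sheet rows into a dict of per-field validation flags."""
    rules = {}
    for row in csv.DictReader(io.StringIO(text)):
        rules[row["field"]] = {
            "description": row["description"],
            "needs_lookup": row["lookup_required"] == "Y",
            "required": row["field_required"] == "Y",
        }
    return rules

rules = load_field_rules(SHEET)
print(rules["BUKRS"]["needs_lookup"])  # True
```

The point is simply that the spreadsheet is machine-readable metadata: the migration jobs can derive their validation behavior from it rather than hard-coding it per field.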

At this point I wanted to know what else is delivered to help understand the IDOCs, so I explored a couple of things.  The first is the documentation that comes with the best practices in online help.  In online help you’ll find sample project plans and links to the ASAP methodology, which has data migration content.  I haven’t explored the project management content in detail yet; I’ll do that and blog about it soon.  You can also download the documentation from SWDC on the Service Marketplace – the documentation is the same in online help and in the SWDC download area.  The documentation includes a Word document for each of the objects.  So, for example, in the HCM case mentioned above, the Word document is 32 pages and includes a description of the IDOC segments, number ranges, and process steps.

After the import I had datastores, jobs, and project content in Data Services.  Examples are below:

Project and jobs created for building lookup tables in the staging area for the relevant migration objects. These lookup tables will be used to validate the data when the migration jobs execute.

Delivered project


Data flows created in Data Services for the lookup table creation jobs:


Datastores were also created.  The datastores provide local storage for temporary data during the migration, as well as the link to the SAP target system.  I haven’t yet created a datastore for the source system.  The next step will be to update the DS_SAP datastore to point to the actual SAP system that will be the target of the migration.

Data stores
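Conceptually, what those lookup tables buy you is a validation pass over the staged legacy data before anything is sent to SAP. Here is a minimal Python sketch of that idea, assuming invented table contents and field names; the real checks run inside the Data Services jobs against the lookup repository, not in standalone code like this.

```python
# Hypothetical lookup tables: valid values that exist in the SAP target.
# In the real solution these come from the lookup repository.
LOOKUPS = {
    "BUKRS": {"1000", "2000"},   # valid company codes (invented)
    "SPRSL": {"EN", "DE"},       # valid language keys (invented)
}

def validate_record(record, lookups):
    """Return a list of (field, value) pairs that fail their lookup."""
    errors = []
    for field, valid_values in lookups.items():
        value = record.get(field)
        if value is not None and value not in valid_values:
            errors.append((field, value))
    return errors

legacy_row = {"PERNR": "00001234", "BUKRS": "9999", "SPRSL": "EN"}
print(validate_record(legacy_row, LOOKUPS))  # [('BUKRS', '9999')]
```

Rows that fail a lookup can then be routed to an error report instead of being turned into IDOCs, which is the whole point of staging the lookups locally first.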

The second repository holds all the migration jobs, with the resulting IDOCs targeted at the SAP system.  Once the import for this repository was completed I had a project with jobs related to the IDOC structures:

Project and jobs for migrating data




From this screenshot you can see the jobs for some of the content objects, such as accounts payable, bill of material, cost elements, etc.
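At their core, these migration jobs map legacy source fields onto IDOC segment fields and hand the populated segments to the SAP system. The following Python sketch illustrates only that mapping idea; the segment and field names are used here for illustration and this is not the delivered job logic, which is built as Data Services data flows.

```python
# Illustrative legacy-column -> (IDOC segment, IDOC field) mapping.
# Segment/field names are examples only, not the delivered mapping.
FIELD_MAP = {
    "vendor_id":   ("E1LFA1M", "LIFNR"),
    "vendor_name": ("E1LFA1M", "NAME1"),
    "city":        ("E1LFA1M", "ORT01"),
}

def to_idoc_segments(legacy_row, field_map):
    """Group mapped legacy values by target IDOC segment."""
    segments = {}
    for src_col, (segment, target_field) in field_map.items():
        segments.setdefault(segment, {})[target_field] = legacy_row[src_col]
    return segments

row = {"vendor_id": "100042", "vendor_name": "ACME Ltd", "city": "Boston"}
segments = to_idoc_segments(row, FIELD_MAP)
print(segments["E1LFA1M"]["LIFNR"])  # 100042
```

In the delivered content this mapping is what you customize per object: the spreadsheet documents the segments and fields, and the job fills them from your source data.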


That was pretty much it for the manual installation. The last piece was to add the repositories to the Data Services Management Console.  The management console is a web-based way to schedule and monitor the jobs.  OK – now I’m ready for the post-installation work!  Look for the next installment, which will discuss post-installation and start using the content!

  • Hello Ginger,

    I need to evaluate the Data Migration tool.
    Do you know if it is possible to get a license key just for evaluation? Other products have a similar license key that expires after 14 days.


    • Sorry,
      I was in a hurry.
      Your blog is very complete. The link you provided includes the license key!! Perfect.

      A little note: the automatic installation doesn’t work for me either. I don’t know why – Windows 7?
      With the manual installation it works fine.

      • Hi Sergio,
        I’m glad to know the license key worked – that’s super!  Regarding the automatic installation, it also did not work for me.  I’ll ask the development team about it.




    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services

    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer’s legacy data to SAP ERP and SAP CRM (New!).
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview  – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services– Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.

    Dec 13:
                    Monday 10AM – 5PM
                    Hotel Kalipeh  – Energy Room
                    Dietmar-Hopp-Allee 15                                                               
                    Walldorf, Germany 69190                                                             
                    Phone: +49 6227 778000
    Dec 14-15:
                    Tuesday 9AM – 5PM
                     Wednesday 8AM – 3PM
                     SAP AG WDF 1
                     Building 1, Biel Room (6th floor)
                     Dietmar-Hopp-Allee 16                                                   
                     Walldorf, Germany 69190             
                     Phone: +49 6227 743002 or 743344



  • Ginger

    I attended your session in Vegas back in Feb 2010 and have been playing with the DM Best Practices since. There is some really good material here to help us through the process.
    But I am now wrestling with the process of developing and modifying the BP code. Are there recommendations for how to scope a Project, if code check-in / check-out is to be performed at the Project level? I anticipate that I will move the Best Practice code into more, smaller projects than are currently in the install – but what Best Practice recommendations exist for scoping a Project within Data Services?

    • If you are still not in a live project situation I would recommend that you wait for the new release coming out at the end of this month (Sept 2011).  In this release we have improved the deployment options to make it easier to deploy and customise for both SME and LE deployments.  We will be running Partner Enablement Sessions to explain the new deployment logic and the additional DQ components that will plug into the solutions.  If you would like an invite to these sessions please post a reply to the blog and we can forward you the invite.

      The check-in / check-out process is described in the LE Deployment Guide, which comes as part of the documentation package for the latest content and will be available to download from the usual address.



    • Gary,

      If you are not in a production / ongoing project scenario right now, I would wait until the Sept 26th release of SAP BP for DM v1.40.  This new release contains a number of enhancements to the content and architecture to support both the 3.2 and 4.0 releases of DS and BOE.  It also improves the solution’s ability to be deployed in a multi-developer landscape by introducing the concept of shared lookup and management datastores and developer-owned object staging datastores.  We are running PSD (partner enablement sessions) on this new release on the 28th and 29th of Sept, where we will discuss the content, architecture, and new Data Quality features in more detail.  Please repost to the blog if you haven’t received the invite to the PSD session and we can post it here.  In the new documentation package there is a Large Enterprise deployment guide which discusses the use of central repositories and how to use the check-in / check-out functionality.  If you need more information now, please repost.

  • Hello,

    are these contents intended only for small companies, or can they be used for big companies as well?

    I am interested by the Job_AIO_BillOfMaterial_IDOC.

    Can I have a description of all the content of this job?

    Thank you.