Technical Articles
Soujanya Krishnareddy

S/4 Data Migration – How to Use the Data Migration Cockpit Effectively – Tips & Tricks – Part 1 of 2

In S/4 data migrations, the Data Migration Cockpit has become an essential tool. With the rapid pace of change in the S/4 world, SAP releases new enhancements to the Data Migration Cockpit almost every week.

Let us look into a few tips and tricks today. Please note that this blog is for teams already familiar with the Data Migration Cockpit and the basics of transactions LTMC and LTMOM.

If you would like a brief overview of these transactions, please let me know in the comments section and I will publish one.

Today, we will review the following points:

  1. How to move a Data Migration Cockpit project from system to system
  2. How to process records with the same key again
  3. How to overcome performance issues

Move a Data Migration Cockpit project from system to system:

Whether within the same landscape (Development -> Quality -> Production) or to another project landscape, we will eventually need to move an existing data migration project. Especially once we have extended existing objects and/or added new objects to the project, moving the project is essential to avoid rework.


There are two steps involved.

  1. When we extend objects and change their structures, the system creates a workbench transport. The same applies to creating new objects and adding new source code. Move the workbench requests for structure changes and source code changes just like any other workbench transport.

  2. Export and import the Data Migration Cockpit project. This step is very similar to the old LSMW days: the LTMOM mapping and all related artifacts are copied to a local file. To export, open LTMC in the source system client and click 'Export Content'.

In the target system client, upload the file (downloaded from the source system client) from your local machine.


Tip: Make sure to run 'Synchronize Structures' and 'Generate Runtime Object' for the data objects in every client. This can be done from transaction LTMOM: open the project, select the data object, and execute the two steps. Generation of data objects must be done in every client of the system.


Process records with the same key multiple times:

Full-blown change functionality is not yet supported by the standard data objects. We can enhance them, or create new data objects using LTMOM, to support it (we can discuss this further in the coming blogs).

There will be situations where we want to process the same record again, but the Data Migration Cockpit stops us with an error. This behavior is a good sanity check. However, if we do need to process the same record again, we must clear the old record from table DMC_FM_RESTART_K.

There is a standard SAP report that supports deleting old records for a data migration object. Launch SE38 and execute report DMC_RESET_FM_RESTART_TABLE.

Tip: If you are building custom data objects, please make sure you build similar functionality (not allowing duplicates unless one explicitly requests it). This comes in very handy during audits. Letting the system update any and every record defeats the purpose of these built-in checks.

Performance issues:

There are several performance concerns when using the Data Migration Cockpit. A few tricks can help improve performance.

  • Memory issues


If we get memory errors like the one below, clearing the logs helps.

No more memory available to add rows to an internal table.

Go to the tile 'Application Log – Delete Expired Log' or transaction SLG2. Provide the relevant input and delete the logs that are no longer needed. If a log exists for the project, the system tries to read through it, which hinders performance. Clearing the logs resolves that issue.

Once we execute the deletion of expired logs, a background job is generated to delete the logs.


  • File size


While working with files, the default size limit for each uploaded XML file is 100 MB. This limit is governed by the profile parameter icm/HTTP/max_request_size_KB, which controls the maximum size of an HTTP request; increase this parameter to raise the upload limit.

However, even after increasing the file size parameter, timeouts may still occur. In that case, adjust the timeout options for the ICM and the Web Dispatcher accordingly to avoid timeouts during file upload.

Please review SAP Note 2719524 for more information.
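As a sketch, the relevant settings might look like this in the instance profile (maintained via RZ10). The values below are illustrative assumptions, not recommendations; size them for your own landscape:

```text
# Instance profile – example values only
icm/HTTP/max_request_size_KB = 2097152   # max HTTP request size in KB (~2 GB)

# ICM HTTP port with extended keep-alive / processing timeouts (seconds)
icm/server_port_0 = PROT=HTTP,PORT=8000,TIMEOUT=600,PROCTIMEOUT=600
```

If a Web Dispatcher sits in front of the system, its icm/server_port_&lt;n&gt; timeouts need the same treatment; the SAP Note above lists the exact parameters per release.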

  • Cloning the data object


Working with staging tables is one solution to file size issues; indeed, that is the main purpose of staging tables.

However, the processing time issue remains. Processing times with the Data Migration Cockpit are somewhat higher than with the old direct input methods supported by LSMW, for various reasons. The approach that works best at this point is slicing the input file into multiple smaller files and cloning the data object, so that multiple background processes can work on those small files in parallel.

This step always needs to be balanced against the number of background processes available and how many users are consuming them at any given point.
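To illustrate the slicing step, here is a minimal Python sketch that splits a large delimited source file into smaller chunk files, each repeating the header row, so that cloned data objects can pick them up in parallel. The cockpit's upload templates are XML, so treat this as the idea rather than a drop-in tool; the file name, column names, and chunk size are assumptions for the example:

```python
import csv


def split_csv(path, rows_per_chunk=50000):
    """Split a CSV file into chunk files, repeating the header in each.

    Returns the list of chunk file paths created.
    """
    chunk_paths = []
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)          # keep the header for every chunk
        chunk, index = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                chunk_paths.append(_write_chunk(path, index, header, chunk))
                chunk, index = [], index + 1
        if chunk:                      # flush the final partial chunk
            chunk_paths.append(_write_chunk(path, index, header, chunk))
    return chunk_paths


def _write_chunk(path, index, header, rows):
    """Write one chunk file next to the source file and return its path."""
    out_path = f"{path}.part{index:03d}.csv"
    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
    return out_path
```

Each resulting chunk can then be assigned to its own cloned data object and processed in its own background job, subject to the balancing consideration above.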


In further blog posts, let us discuss more tips and tricks for the Data Migration Cockpit. Please refer to SAP Note 2537549 to keep up with what is new in the Data Migration Cockpit.

      Mopuri Nageswara Rao

      Very useful information.



      suryaprakash Undralla

      Very nice, informative post. Please continue with the rest of the cockpit series.

      Wendy Li

      Very useful article, thank you very much!

      I also want to share something that might be useful here:

      During my transport from DEV to QAS, the error "Error when uploading. The file name is invalid or the file is too big" occurred when I imported the content.

      After switching from IE to Chrome, shortening the name of the downloaded file, and putting the file on the desktop, I could import the file successfully.

      Soujanya Krishnareddy (Blog Post Author)

      Also, "file is too big" is a frequent issue when transferring through files. From 1809, we have the option of staging tables. And from 2020, we can use files and staging tables together.


      Louis Nicolas Arson


      The "Process records with same key multiple times" program DMC_RESET_FM_RESTART_TABLE seems not to be available in release 1909 SP02.

      I ran a lot of migration objects, and the table DMC_FM_RESTART_K is empty!

      Any idea?


      Soujanya Krishnareddy (Blog Post Author)

      Louis – sorry for the delay. What exactly is the issue? Are you not able to re-run records with the same key?

      Tom Takumi Okamoto

      Running 1809, none of my changes triggers the creation of a workbench transport. Can you please shed some light?

      Soujanya Krishnareddy (Blog Post Author)

      The LTMOM settings won't trigger a workbench request. Only code changes or underlying structure changes will.

      Radoslava Marina

      Dear Soujanya,

      During the migration of "FI-GL Customer open item – classic GL" and "FI-GL Vendor open item – classic GL" through Direct Transfer (1909), the following error appeared: "No more memory available to add rows to an internal table". The FM for these objects is BAPI_ACC_DOCUMENT_POST.

      Could you please advise how to resolve this issue, as it stops me at the "Migration" step? During data simulation, no errors appeared at this step.

      Thanks a lot in advance!

      Radoslava Marina

      Soujanya Krishnareddy (Blog Post Author)

      Try to limit the number of records if you are using a file or staging table. If you are using direct transfer, please use filters to restrict the data into smaller chunks.

      Vighnesh Vasudevan

      Hi Soujanya,

      When I try to activate the migration object "Product Master" in the direct transfer method (2020 version), we get the error NOT READY FOR PROCESSING.

      I have executed the Note Analyzer and implemented all relevant OSS notes in the system, but I still get the same error.

      Note: this error occurs only for specific objects, not for all.

      I am not sure if I am missing any step.

      Need your expert suggestion.


      Thanks and Regards,

      Vighnesh Vasudevan.