Enterprise Resource Planning Blogs by Members
In S/4HANA data migrations, the Data Migration Cockpit has become an essential tool. With rapidly evolving changes in the S/4HANA world, SAP is releasing new enhancements to the Data Migration Cockpit almost every week.

Let us look at a few tips and tricks today. Please note that this blog is intended for readers who are already familiar with the Data Migration Cockpit and know the basics of transactions LTMC and LTMOM.

If you want a brief overview of the above transactions, please let me know in the comments section, and I'll publish one.

Today, we will review the following points:

  1. How to move a Data Migration Cockpit project from system to system

  2. How to process records with the same key again

  3. How to overcome performance issues


Move a Data Migration Cockpit project from system to system:

Whether within the same landscape (Development -> Quality -> Production) or to another project landscape, we will need to move an existing data migration project at some point. In particular, once we extend existing objects and/or add new objects to the project, moving the project is essential to avoid rework.

 

There are two steps involved:

  1. When we extend objects and change their structures, the system creates a workbench transport request. The same applies when we create new objects or add new source code. Move these workbench requests for structure and source code changes just like any other workbench transport.




2. Export and import the Data Migration Cockpit project. This step is very similar to the old LSMW days: the LTMOM mapping and all related artifacts are copied to a local file. To export, log on to the source system client, open LTMC, and click 'Export Content'.



In the target system, upload the file downloaded from the source system client from your local machine.

 

Tip: Make sure to run 'Synchronize Structures' and 'Generate Runtime Object' for the data objects in every client. This can be done from transaction LTMOM: open the project, select the data object, and execute the two steps. Generation of data objects must be done in every client of the system.



 

Process records with the same key multiple times:

Full-blown change functionality is not yet supported with standard data objects. We can enhance existing data objects or create new ones in LTMOM to support it (we can discuss this further in coming blogs).

There will be situations where we want to process the same record again, but the migration cockpit stops us with an error. This check is a good sanity safeguard. However, if we do need to process the same record again, we must clear the old entry from table DMC_FM_RESTART_K.



SAP provides a standard report that deletes the old entries for a data migration object. Launch SE38 and execute report DMC_RESET_FM_RESTART_TABLE.



Tip: If you are building custom data objects, please make sure you build similar functionality (disallowing duplicates unless someone explicitly retries them). This comes in very handy during audits; letting the system update any and every record defeats the purpose of these built-in checks.
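The idea behind this restart-key check can be sketched generically. The class below is a hypothetical illustration in Python (not SAP code; the names are invented), mimicking how DMC_FM_RESTART_K blocks reprocessing until a key is explicitly reset:

```python
class RestartKeyGuard:
    """Generic illustration of a restart-key check: each record key is
    processed once; reprocessing requires an explicit reset first."""

    def __init__(self):
        self._processed = set()  # plays the role of the restart-key table

    def process(self, key: str) -> None:
        if key in self._processed:
            raise ValueError(f"Record {key!r} already processed; reset it first.")
        self._processed.add(key)

    def reset(self, key: str) -> None:
        # Analogous to running the reset report for one record
        self._processed.discard(key)
```

With this pattern, a second `process("MAT-1")` call fails until `reset("MAT-1")` has been run, which is exactly the audit-friendly behavior described above.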

Performance issues:

There are several performance concerns when using the Data Migration Cockpit. A few tricks can help improve performance.

  • Memory Issues


 

If we get memory errors like the one below, clearing the logs helps:

No more memory available to add rows to an internal table.

Go to the tile 'Application Log – Delete Expired Logs' or transaction SLG2. Provide the relevant input and delete the logs that are no longer needed. If a log exists for the project, the system tries to read through it, which hinders performance; clearing the logs resolves that issue.



Once we execute the deletion of expired logs, the system schedules a background job to delete them.



 

  • File size


 

While working with files, the default size limit for each uploaded XML file is 100 MB. This limit is governed by the profile parameter icm/HTTP/max_request_size_KB, which controls the maximum size of an HTTP request. You can increase the limit for each uploaded XML file by changing this parameter.

However, even after increasing the file size parameter, timeouts may still occur. In that case, adjust the timeout options for the ICM and the Web Dispatcher accordingly to avoid timeouts during file upload.

Please review SAP Note 2719524 for more information.
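As an illustration, the relevant settings in the instance profile might look like the fragment below. The values shown are assumptions for illustration only, not recommendations; size them for your own landscape and verify against SAP Note 2719524:

```text
# Instance profile (example values only)
# Maximum HTTP request size in KB (default is 102400 KB = 100 MB)
icm/HTTP/max_request_size_KB = 204800

# ICM processing and keep-alive timeouts per port, in seconds
icm/server_port_0 = PROT=HTTP,PORT=8000,PROCTIMEOUT=600,TIMEOUT=600
```

A restart of the ICM (or instance) is typically needed for such parameter changes to take effect.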

  • Cloning the data object


 

Working with staging tables can be one solution to the file size issue; indeed, that is their main purpose.

However, the processing time issue remains. Processing times with the Data Migration Cockpit are somewhat higher than with the old direct input methods supported by LSMW, for various reasons. The approach that works best at this point is to slice the input file into multiple smaller files and clone the data object, so that multiple background processes can work on the small files in parallel.
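The file-slicing step can be scripted before upload. Below is a minimal sketch in Python for a CSV source file (the chunk size, file naming, and CSV format are assumptions; adapt it to your actual source format):

```python
import csv
from pathlib import Path


def split_csv(source: str, rows_per_file: int, out_dir: str = ".") -> list:
    """Split a large CSV into smaller files, repeating the header in each slice."""
    out_paths = []
    with open(source, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)  # keep the header row for every slice
        chunk, part = [], 1
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_file:
                out_paths.append(_write_slice(source, part, header, chunk, out_dir))
                chunk, part = [], part + 1
        if chunk:  # write any remaining rows
            out_paths.append(_write_slice(source, part, header, chunk, out_dir))
    return out_paths


def _write_slice(source, part, header, rows, out_dir):
    """Write one slice as <source-stem>_partNNN.csv with the original header."""
    path = Path(out_dir) / f"{Path(source).stem}_part{part:03d}.csv"
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    return str(path)
```

Each resulting slice can then be fed to one of the cloned data objects, letting the background processes work through the small files in parallel.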



This step always needs to be balanced against the number of background processes available and how many users are consuming them at any given point.

 

In further blog posts, let us discuss more tips and tricks for the Data Migration Cockpit. Please refer to SAP Note 2537549 to keep up with the latest developments in the Data Migration Cockpit.