
SAP Data Migration – SAP LTMC and LTMOM / LSMW vs LTMC


 

Hello 😊,

Today we will discuss SAP data migration using SAP LTMC, and how to use LTMOM for extended customization.

When implementing an SAP S/4HANA solution, we can migrate our master data and business data from SAP or non-SAP systems to SAP S/4HANA by using the SAP S/4HANA migration cockpit.

In the past, I covered different data migration approaches in the following blog post, which walks through detailed LTMC steps for file-based master data migration:

https://blogs.sap.com/2019/09/24/sap-s4hana-data-migration-cockpit-insights-and-business-partner/

Let’s get into today’s actual topic:

LTMC is a newer, enhanced version of LSMW:

  • LSMW – Legacy System Migration workbench
  • LTMC – Landscape Transformation Migration Cockpit

For master/transactional data migration, we used LSMW to bring data from legacy systems such as ECC, other ERPs, or Tally – data like cost centers, profit centers, banks, G/L accounts, and business partners.

With larger volumes of data, we cannot create records one by one manually (definitely not 😉), so in the past we used a tool like LSMW.

And now we have an enhanced version called LTMC.

LSMW vs LTMC

Note: LSMW still exists in SAP, but SAP no longer recommends using it.

Data migration using LTMC can be achieved in two ways:

  1. File-based approach
  2. Staging tables approach

Here are the high-level pros, cons, and prerequisites:

  • Size limit – Files: 160 MB limit for SAP S/4HANA Migration Cockpit*; Staging tables: no limit.
  • System considerations – Files: none; Staging tables: the staging system uses an SAP HANA database.
  • Data provisioning – Files: enter data manually in each Microsoft Excel XML file; Staging tables: fill the tables manually or by using preferred tools (for example, Data Services or Syniti).
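To give a feel for the staging-table route, here is a minimal Python sketch of building a parametrized INSERT for a staging table. The table and column names (ZSTAGE_COSTCENTER, KOKRS, KOSTL, DATAB) are hypothetical illustrations, not the real generated names – the actual staging tables and their columns are generated by the migration cockpit for each migration object. The hdbcli usage in the comments refers to SAP's HANA Python client and is shown as an assumption about your setup, not a tested recipe.

```python
# Sketch: filling a migration cockpit staging table from Python.
# Table/column names below are hypothetical; check the generated
# staging table definition in your migration project.
def build_staging_insert(table, columns):
    """Return a parametrized INSERT statement suitable for executemany()."""
    placeholders = ", ".join("?" for _ in columns)
    return f'INSERT INTO "{table}" ({", ".join(columns)}) VALUES ({placeholders})'

sql = build_staging_insert("ZSTAGE_COSTCENTER", ["KOKRS", "KOSTL", "DATAB"])
print(sql)

# With a HANA connection (pip install hdbcli), one would then run e.g.:
#   from hdbcli import dbapi
#   conn = dbapi.connect(address=host, port=port, user=user, password=pw)
#   conn.cursor().executemany(sql, rows)   # rows: list of value tuples
```

The same idea applies to any tool that can write to the HANA staging schema; the cockpit then picks the rows up from the staging area for loading.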

 

What is LTMOM?

LTMOM – Migration Object Modeler

Where is it useful?

Scenario: a few custom fields must be added. LTMC alone can't help here, and LTMOM comes to the rescue.

Open the same project in LTMOM and double-click on Source Structures. This shows the list of fields already added to the migration template; not all of them may be visible in the template.

LTMOM allows us to create additional and custom fields in the source structure and map them to the target structure.

Here are the steps:

Step 1: Go to transaction LTMOM, then select the project and data object to which the additional/custom fields will be added.

LTMOM_TCODE

Step 2: Select the project:

Select Project

 

Step 3: Select the data object that needs custom fields:

Step 4: Go to the source structure and click Add Field:

Step 5: Enter the field name, data type, length, and other details as shown below:

Field details

Step 6: After adding the required field, generate the object:

Generate Object

Step 7: Wait for the generation to finish:

Generated Changes in Object

Note: an ABAP developer needs to add the same field to the target structure as well.

 

Step 8: Go to the LTMC screen:

LTMC

 

Step 9: Select the project.

Step 10: Go to the data object in the project:

Data Object – Cost Center

Step 11: Go to the template and download it:

Template Download

 

Step 12: The template is downloaded:

Template Downloaded

Step 13: Check whether the added/custom fields appear in the template:

Step 14: Then go to the source and target structures for mapping:

 

Once finished, fill the template, simulate the process, and load the data to the target:

LTMC File-Based Data Migration
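Filling the template can also be scripted when there are many rows. The sketch below writes cost center rows, including the hypothetical custom field ZZREGION (the kind of field one would add via LTMOM), to a CSV staging file. The column names and values are assumptions for illustration only: the real column layout comes from the template you downloaded, which is an Excel XML file you would populate via Excel or a suitable spreadsheet library rather than plain CSV.

```python
import csv

# Hypothetical column layout for a cost center sheet; the real template
# downloaded from LTMC defines its own columns, including any custom
# field (here ZZREGION) added via LTMOM.
COLUMNS = ["CO_AREA", "COSTCENTER", "VALID_FROM", "NAME", "ZZREGION"]

rows = [
    {"CO_AREA": "A000", "COSTCENTER": "CC1000", "VALID_FROM": "01.01.2024",
     "NAME": "Sales South", "ZZREGION": "SOUTH"},
]

def write_template_sheet(path, rows):
    """Write rows in the template's column order; missing fields stay blank."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

write_template_sheet("cost_centers.csv", rows)
```

The benefit of generating the rows programmatically is that the same script can be re-run whenever the source extract changes, before the data is copied into the official template for simulation and load.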

 

In the next blog post, we will see how to deal with staging tables and discuss further processing.

That’s all about this blog post.

Thanks for reading; please share your feedback.

Happy Learning, see you in my next blog 🙂

 

Thanks,

Venkatesh Golla

 

 

4 Comments
      Heinrich JHL

      What would you recommend to load India-specific data into thousands of new business partners? The standard templates do not have anything. Would you still think of a 100% improvement for ready-made templates and consultant dependency?

      I cannot agree at all that the omission of batch input recording is a 100% improvement and adds any value; from my point of view it is the total opposite. With LSMW and recording, the job would have been done in 10 minutes, while with LTMC the search for an option has not even finished.

      What would you do if you had to separate the house number from the street in old existing data?

      With LSMW recording, a 10-minute job; in S/4 just impossible. Recording is not available for BP, and there is no mass maintenance for address data.

      Can you elaborate on "changing fields"  where you have put YES in the LTMC column and use LTMOM under remarks?

       

      Avik Mazumder

      Heinrich – I do agree with your points. Moreover, to me the table shown in the blog is an oversimplification in hindsight. First we need to be hands-on with LSMW (deep dive), then with LTMC, and then with LTMOM; finally, drafting a pros & cons list would have been appropriate.

      A major challenge of LTMC is how we change mass data (obviously not considering the MASS option; the fields outside that scope). In practical implementations we still have to build custom ABAP programs for mass changes of master data (supplier, customer, BP, article, etc.), as far as a few of my past S/4 implementations are concerned. LTMC should have something for a mass data change mode: business might have given wrong file inputs for a few columns, which got loaded by mistake, or there might be a late requirement to add a few extra fields across work-streams. How do we mass-update all those already created data sets using LTMC? Those questions are still unanswered.

      Anupam Samanta

      Dear All,

      Thought of giving my input here.

      For data migration we generally consider the entire ETL part, where the migration cockpit (LTMC) is typically used for data loading and is meant for one-time data upload only.

      I know that using customization or creating custom projects in the Object Modeler (LTMOM), you can create change objects.

      So we cannot rule out LSMW / conversion programs where we need to change uploaded data in S/4HANA depending on requirements.

      In a broader scenario, the entire ETL can be handled in a hybrid approach.

      We can think of the below approaches for SAP data migration:

      1. Use Data Integrator (which is bundled with the S/4HANA license) as your extract and transformation tool, then push the data into the migration cockpit staging area for loading with LTMC (Migration Cockpit).

      @Heinrich JHL This approach can address your concern about huge data volumes. You may not need a data extract and transformation tool, but you need to fill the staging database with your pre-load data in migration cockpit format, and then start loading the data from the staging area.

      Also, to cater for any new field requirement, we need to add those fields in both the source and target mapping. Since LTMC uses BAPIs for data upload, we need to check whether that field is available in the BAPI as well to do the target mapping; if not, you need to enhance the BAPI to migrate data for that field.

      2. Use Data Integrator as your entire ETL tool. It will be an automated data load.

      Avik Mazumder: This approach can cater for mass changes efficiently in subsequent loads. In this case the migration cockpit will not be used.

      3. Use any third-party ET tool or a manual approach to prepare transformed data, then fill the staging area with the transformed data and start loading using the migration cockpit staging transfer method. It can handle large volumes of data.

      4. Use any third-party ET tool or a manual approach to prepare transformed data, then use the migration cockpit templates to upload the data using the file transfer method. In this case there is a file size limitation.

       

      Also, SAP has introduced a direct transfer method in the migration cockpit, which can transfer data directly from legacy SAP systems.

      To summarize, use the migration cockpit as a tool for one-time data creation.

      If you are looking for data management/maintenance using the migration cockpit, you need to customize the standard object or create change objects using the Object Modeler (LTMOM).

      For LTMOM you can refer to the links below:

      SAP S/4HANA Migration Cockpit - Creating Your Own Function Modules in LTMOM

      SAP S/4HANA Migration Cockpit - Deep Dive LTMOM for File/Staging

      Regards,

      Anupam

      Heinrich JHL

      Thanks for this long answer and the links.

      Staging table or file is not the real issue in the moment.

      Creating own objects using BAPIs might be an option where I have to dig deeper.

      Recently I stumbled into another gap with "Extend Supplier". I wanted to create new purchasing organization data, but then ran into an ASSERTION_FAILED dump, only to get pointed to the docu, which says: "This migration object is based on the assumption that business partner and supplier are using the same partner ID. If you have customized your system in a different way, you must adjust the migration object."

      I wonder if there is any hint or note, when you customize your system, that tells you to make sure the business partner numbers are synchronized; otherwise you cannot use LTMC as it is delivered. I have not seen such a hint and the consulting partner has not mentioned it, probably because we never had a chance to make it happen with a brownfield approach.
      But if you have a 30-year-old system and data since that time, then you rarely have large volumes for new creation; 90% are just extensions of existing data.