Migrating data to your (new) SAP S/4HANA
This blog discusses the ways in which your business data can be migrated to your newly deployed SAP S/4HANA system (on premise or cloud).
This is not the first blog on the topic, but I hope it offers a slightly different approach and perspective than the existing ones, hence my decision to post it. Researching it has certainly helped me understand this space, and I hope it will also help you. Please refer to the end of the article for links to other blogs and information sources on this topic.
The blog will be primarily focusing on New System Implementation transition scenario as illustrated:
In this scenario, you have decided to deploy a new SAP S/4HANA system and migrate selected data from your legacy system. By the way, that legacy system could be an SAP R/3 or ERP system or any 3rd-party system. If it is an SAP ERP system, you could transition to SAP S/4HANA (on premise) using the Conversion scenario, but you have decided not to.
If the above sounds like your case, read on…
In the context of this article, we focus on migration of business data and not system data, like for example configuration.
The New System Implementation scenario supports two distinct targets – SAP S/4HANA (lack of any suffix indicates on premise edition) and SAP S/4HANA Cloud.
But we are getting ahead of ourselves – before talking about how the migration can be done, it is prudent to discuss how data should be prepared for it. Broadly, the data in your current environment can be classified into the following buckets:
1. Good quality data which must be migrated to the new transactional system.
2. Good quality data which should be retained for analytical and/or auditing requirements.
3. Poor quality data which must be migrated to the new transactional system.
4. Poor quality data which should be retained for analytical and/or auditing requirements.
5. Data which is no longer required.
In this article, we are focusing on data which must be migrated to the new SAP S/4HANA transactional system – represented by buckets 1 and 3 above. So, before we move on (and after discarding bucket 5), what options do we have for buckets 2 and 4? The following could be considered:
- Use of corporate memory concept with analytical capabilities of SAP HANA and/or SAP BW (on HANA or BW/4HANA) combined with data management and tiering capabilities like Data Tiering Optimization, Native Storage Extension, Near-Line Storage, Dynamic Tiering and integration with Hadoop or other repositories.
- Use of SAP Information Lifecycle Management (ILM) and the retention warehouse – refer to https://www.sap.com/products/information-lifecycle-management.html for more information.
In essence, what you should have in place prior to undertaking a major migration project is a strategy, or at least a vision, for all kinds of information your organization manages. If you aren’t sure where to start, check the blog on the Information Governance Model.
Now, we are left with buckets 1 and 3 to deal with. The only difference between them is the quality of the data – the poorer the quality, the better the tooling and the more effort required to cleanse and/or enrich the data, which may influence the choice of migration tools.
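To make the cleanse/enrich idea concrete, here is a minimal Python sketch of the kind of standardisation a bucket-3 dataset typically needs before loading: trimming whitespace and mapping legacy codes to target values. The field names and the country-code mapping are hypothetical examples, not actual SAP template columns – in a real project this logic would live in your ETL tool or spreadsheet preparation.

```python
# Minimal sketch: cleanse legacy records before migration.
# Field names and the legacy-to-target country mapping are
# hypothetical examples, not actual SAP template columns.

LEGACY_COUNTRY_MAP = {"UK": "GB", "GER": "DE", "USA": "US"}

def cleanse_record(record: dict) -> dict:
    """Trim whitespace in all string fields and map legacy country codes."""
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in record.items()}
    country = cleaned.get("COUNTRY", "").upper()
    cleaned["COUNTRY"] = LEGACY_COUNTRY_MAP.get(country, country)
    return cleaned

records = [
    {"NAME": "  Acme Ltd ", "COUNTRY": "uk"},
    {"NAME": "Muster GmbH", "COUNTRY": "GER"},
]
cleaned = [cleanse_record(r) for r in records]
print(cleaned)
```

The point of the sketch is only that poor-quality data needs an explicit, repeatable transformation step – whether that step is a script, an LSMW-style rule, or a Data Services transform is exactly the tooling decision discussed below.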
The matrix below shows tools available in various data migration scenarios:
(Matrix of available migration tools, with one column per target: SAP S/4HANA and SAP S/4HANA Cloud.)
Let’s address the elephant in the room straight away – the notable absence of the Legacy System Migration Workbench (LSMW) from the matrix above. As per note 2287723 – LSMW in SAP S/4HANA on premise edition:
The use of LSMW for data load to SAP S/4HANA is not recommended and at the customer’s own responsibility. Instead, the SAP S/4HANA migration cockpit is SAP’s solution for data migration to SAP S/4HANA. You should always check if the object is available with the SAP S/4HANA migration cockpit before using the LSMW.
If you still use the LSMW, you have to test the processes carefully so that you can ensure that it is actually working for you. It might not work for you in every case.
Expect restrictions around transaction recording (as this is not possible with the new SAP Fiori screens) and changed interfaces (for instance the Business Partner CVI). Standard Batch Input programs may also no longer work as transactions may have changed functionality or may be completely removed within some areas. For example, due to security reasons batch import has been limited for RCCLBI02 program. Also, transactions for customer master data (FD*/XD*) and vendor master data (FK*/XK*) cannot be used anymore due to the change to the Business Partner data model in SAP S/4HANA.
Needless to say, LSMW is not available at all in the Cloud edition.
For the reasons above, we will not be discussing LSMW in this article. Instead, we will discuss two sets of tools and approaches – the built-in SAP S/4HANA Migration Cockpit with the Migration Object Modeler, and the rapid data migration content for SAP Data Services.
Please also refer to the following blog: Comparison: Migration Cockpit, Rapid Data Migration (with Data Services) and LSMW.
SAP S/4HANA Migration Cockpit (MC) and Migration Object Modeler (MOM)
For an easy introduction to SAP S/4HANA MC refer to this video (also check links at the bottom of this blog for more videos).
The SAP S/4HANA Migration Cockpit (referred to as MC hereafter) is a new migration tool shipped exclusively with SAP S/4HANA – it was initially available for the Cloud edition, but since SAP S/4HANA 1610 it is also shipped with the on-premise edition.
It is accompanied by Migration Object Modeler (referred to as MOM hereafter), which is a design tool for enhancements and modifications of pre-defined migration objects. We will get back to it a little bit later.
Since both of these tools are new, let’s elaborate a bit on their structure and capabilities to give you a better idea on what to expect of them.
MC is delivered with the standard deployment of SAP S/4HANA – that is, no additional add-ons or special UI activation activities need to take place after you have installed, upgraded or converted to SAP S/4HANA 1610 or higher. It can be launched using the Manage Your Solution Launchpad in the case of the Cloud edition, or using transaction LTMC in the case of on-premise. The MC itself has a browser-based (WebDynpro) interface. The following concepts are employed within MC:
- Migration Object – Migration Objects are defined and delivered by SAP, and describe how to migrate data from the source system (which tables are needed and the relationships between the tables) to SAP S/4HANA. Custom Migration Objects are not yet supported, but they are on the roadmap – refer to the section on the Migration Object Modeler further in this blog.
- Migration Project – Migration Projects are used to facilitate the transfer of data from a source system to SAP S/4HANA. In order to migrate data to SAP S/4HANA, you must first create a migration project. You use a migration project to specify the source system and the data that you want to transfer, and to monitor the status of the migration.
A Migration Project provides what is best described as an “organizational layer” – it allows grouping of the migration activities to suit project needs. Examples of criteria used to define separate projects (examples only, and they can be combined):
- source systems,
- company organizational structure (like company code),
- phases of deployment, etc.
A Migration Project has a transfer ID associated with it – a concept taken from System Landscape Transformation capabilities. The transfer ID acts as a unique identifier per project in order to facilitate the transfer of project-specific settings (including value mappings) between environments in the landscape – for example between QA and Production systems (performed using the Export/Import Content function and not the Change & Transport System (CTS)).
Once a Migration Project has been created, you can choose which Migration Objects will be utilized within that Project. On the Project’s overview screen, all available Objects are presented with the following information:
- Object Name.
- Progress – at the start this shows 0%. If the value is higher, it is an indication that the particular object has been copied to the Project’s specific repository and potentially other activities have been started.
- Documentation – link to the object’s specific documentation.
- Dependent Migration Object – another migration object which should be loaded first or already be present in the system. Depending on the version of your S/4HANA system, the completeness of this information may vary – that is, not all dependent objects may be listed, especially in older versions.
Once you pick a particular object for the first time, the associated object template is copied to the Migration Project – meaning this object’s standard template and transfer rules are copied. It should be noted that the template is only applied at the time of copy. This has the following consequences:
- Changes to the object template (for example, delivered with a new software update) do not affect Migration Objects within existing projects. Only newly created Migration Objects based on the updated template will inherit changes in the new version of the template.
- Changes to Migration Objects within existing projects do not affect the object template.
Now that you have selected your Migration Object(s) within the Migration Project, you can do the following:
- Access Migration Object documentation.
- Download template – the file, which you will need to populate with your legacy data.
- Upload file – the template above, filled with data from your legacy system(s). You can upload several files at the same time, which allows you to control the scope of migration at the file level. This action loads the data from the file – as-is, without any transformations – into a staging area within the SAP S/4HANA system (tables DMC_FILE_HDR, DMC_FILE_T and DMC_FILE_STORE).
- View and Edit (only on-premise) – you can view data uploaded to staging area and in case of on-premise edition, you can also directly edit the uploaded data. This changes values stored in the staging area (and not in the source file).
- Activate or deactivate file – only files in the Active state can be taken to the next stage, that is, initiating the transfer process. This allows you to work with multiple files and choose the scope of migration every time you initiate the transfer.
- Start the transfer – this initiates the guided process for transferring data from the selected (active) files. The transfer process has the following stages:
- Data validation (can be executed in the background) – data stored in the staging area is subjected to technical validation for issues like a missing value in a mandatory field, wrong data length or type, or missing mappings for key master/configuration data such as country codes or units of measure.
- Value conversion – you can define mapping rules to convert input values where validation encountered an error.
- Import simulation (can be executed in the background) – in this step, the data stored in the staging area is processed using the actual BAPIs used during the “real” import run – the only difference is that the changes are not committed to the database. This allows for much deeper, functional validation of interdependencies – such as the existence of a material master referenced by a purchase or sales order.
- Import execution – the data in staging area is submitted for posting in the SAP S/4HANA system using relevant BAPIs. Successfully imported records are persisted in the database.
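The data validation stage above can be illustrated with a toy sketch. MC performs this inside S/4HANA, but conceptually it is a rule check per field: mandatory fields must be filled and values must fit their maximum length. The field names and length limits below are hypothetical, not the actual SAP template definitions.

```python
# Toy sketch of the kind of technical validation MC performs on
# staged data. RULES is a hypothetical example, not SAP metadata.

RULES = {
    "MATERIAL": {"mandatory": True, "max_len": 40},
    "MATL_TYPE": {"mandatory": True, "max_len": 4},
    "DESCRIPTION": {"mandatory": False, "max_len": 40},
}

def validate(record: dict) -> list:
    """Return a list of validation error messages for one record."""
    errors = []
    for field, rule in RULES.items():
        value = record.get(field, "")
        if rule["mandatory"] and not value:
            errors.append(f"{field}: mandatory field is empty")
        if len(value) > rule["max_len"]:
            errors.append(f"{field}: exceeds maximum length {rule['max_len']}")
    return errors

print(validate({"MATERIAL": "M-001", "MATL_TYPE": "FERT"}))
print(validate({"MATERIAL": "", "MATL_TYPE": "FINISHED"}))
```

The import simulation stage then goes one level deeper than such field checks, validating cross-object dependencies via the actual posting BAPIs.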
It is important to understand that the APIs used to post data to the target SAP S/4HANA system only support INSERT actions. In other words, no UPSERT or UPDATE actions are supported. Thus, any data already loaded to the target system cannot be re-loaded. Similarly, if a particular data file was only partially successful, you should include only the failed records in a subsequent re-run to avoid errors caused by attempts to create duplicate records (for those which were successful during the first run).
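Preparing such a re-run file can be sketched as a simple filter over the original upload: keep the header row and only those records whose keys appear in the failed set. The key field name, CSV layout and the failed-key set below are hypothetical; in practice the failed keys come from the MC error log.

```python
# Minimal sketch: build a re-run file containing only the records
# that failed in the first import. "MATERIAL" as the key field and
# the CSV layout are hypothetical examples.
import csv
import io

def filter_failed(original_csv: str, failed_keys: set, key_field: str) -> str:
    """Return a CSV containing the header plus only the failed rows."""
    reader = csv.DictReader(io.StringIO(original_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row[key_field] in failed_keys:
            writer.writerow(row)
    return out.getvalue()

source = "MATERIAL,DESCRIPTION\nM-001,Pump\nM-002,Valve\nM-003,Motor\n"
rerun = filter_failed(source, {"M-002"}, "MATERIAL")
print(rerun)
```

Because the posting APIs are INSERT-only, this filtering step is what keeps a second run from tripping over the records that already posted successfully.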
To summarise key aspects of the SAP S/4HANA Migration Cockpit:
- It is available for both Cloud and on-premise editions (as of 1610 for the latter), but some capabilities may vary (like the ability to edit data in the staging area).
- It is built into the SAP S/4HANA system and does not require any additional deployments or configurations. The same applies to pre-delivered Migration Objects.
- It uses flat files to load legacy data, but there is a plan to provide the capability to extract data from ABAP-based SAP source systems directly.
- Supported Migration Objects are defined by SAP and at present can only be enhanced (for on-premise edition) using MOM. Creation of new custom objects is planned – refer to details in MOM section below.
Documentation of the SAP S/4HANA Migration Cockpit can be found here:
- For On-premise edition:
- Online help for SAP S/4HANA
- Available migration objects for SAP S/4HANA
- SAP Best Practices for SAP S/4HANA > Solution Scope > Database and Data Management > Enterprise Information Management
- Note 2537549 – Collective note and FAQ for SAP S/4HANA Migration cockpit (on-premise)
- Note 2481235 – SAP S/4HANA Migration Cockpit (on-premise) – restrictions and extensibility of pre-delivered migration objects
- For Cloud edition:
- Online help for SAP S/4HANA Cloud
- Available migration objects for SAP S/4HANA Cloud 1905
- SAP Best Practices for SAP S/4HANA Cloud for Enterprise Management > Solution Scope > Data Management > Enterprise Information Management
- Note 2538700 – Collective note and FAQ for SAP S/4HANA Migration Cockpit (Cloud)
- Note 2470789 – SAP S/4HANA Migration Cockpit – Cloud data migration template samples
The SAP S/4HANA Migration Object Modeler (MOM) is a tool delivered in the on-premise edition only and can be seen as a design-time tool for the definition of Migration Objects. Its enhancement capabilities are evolving and depend on which SAP S/4HANA version you are working with.
In essence, the following activities can be performed using MOM in S/4HANA 1610 up to and including FPS01:
- Display Target Structures – you can get an overview of the target structures/fields. These are dictated by the APIs used by the Migration Object to post data to SAP S/4HANA and thus cannot be changed here.
- Edit Source Structure – this function allows you to adjust the source structure and add fields as required.
- Edit Field Mapping – this function allows you to map new or modified fields from the source to the target structure.
In SAP S/4HANA 1610 FPS02 new features have been introduced:
- Integrate newly created objects – custom ones as well as SAP standard objects which have not yet been delivered as ready-to-use templates in Migration Cockpit.
- Enhance standard delivered objects – for example add new fields.
So, with the enhancements introduced recently, you are also able to migrate data into custom-built apps (within SAP S/4HANA) – as long as there is a suitable API to post to the particular data object.
Documentation for the SAP S/4HANA Migration Object Modeler can be found here:
SAP Data Services and associated best practice
Prior to the introduction of the SAP S/4HANA Migration Cockpit to the on-premise world, SAP Data Services was the only tool recommended and fully supported for the purposes of data migration to SAP S/4HANA (on-premise). Furthermore, we have also built and delivered associated accelerators in the form of the Best Practice for “rapid data migration to SAP S/4HANA (on premise)”, available at https://rapid.sap.com/bp/RDM_S4H_OP (always check for the latest version of the BP).
This best practice is built on the capabilities of SAP Data Services, the market-leading data integration tool with full data quality capabilities. The logical architecture and scope of this solution are depicted in the diagram below:
The content delivered with Best Practice includes:
- Detailed documentation for technical set-up, preparation and execution of migration for each supported object and extensibility guide.
- SAP Data Services (DS) files, including IDoc status check and Reconciliation jobs.
- IDoc mapping templates for SAP S/4HANA (in MS Excel).
- Lookup files for Legacy to SAP S/4HANA value mapping.
- Migration Services Tool for value mapping and lookup files management.
- WebI Reporting content for SAP BusinessObjects BI platform.
Before we proceed, it is worth distinguishing two use cases of this best practice, depending on your requirements and the license in place. As per Note 2239701 – SAP Rapid Data Migration for SAP S/4HANA, on premise edition:
If you own either a runtime (REAB) or full use SAP HANA license, this includes a limited use license of SAP Data Services software restricted to loading data into SAP HANA (called Data Integrator license). This fills the minimum requirement for the SAP Rapid Data Migration to SAP S/4HANA content which includes full ETL (Extract, Transform, and Load) used to extract data from heterogeneous source systems, the transformation and mapping, the validation and the data load.
In other words, the limited use license allows you to take advantage of all scope items in the diagram above. Additional licenses may be useful for two specific extensions to the core functionality – advanced data profiling and advanced data cleansing.
With regard to advanced data cleansing, there is a dedicated job to standardise, cleanse, match and de-duplicate Business Partner names and addresses, which uses Data Quality transforms and thus requires the full Data Services license.
SAP Information Steward comes into the picture to provide advanced data profiling capabilities based on its Data Insight module, which can run extensive types of profiling like:
- Columns – used to examine the values and characteristics of data elements, such as minimum, maximum, median, distribution of words, and so on.
- Address – used to determine the quality of an address. This sends data through the address cleansing transforms in Data Services to identify addresses that are valid, correctable, or invalid.
- Dependency – used to identify attribute-level relationships in the data. For example, for each state you can find the corresponding city name.
- Redundancy – used to determine the degree of overlapping data values or duplication between two sets of columns.
- Uniqueness – returns the count and percentage of rows that contain non-unique data for the set of columns selected.
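To give a feel for what column profiling produces, here is a toy illustration in the spirit of what Data Insight automates at scale: per-column minimum, maximum, null count and uniqueness. This is a made-up sketch with sample data, not Information Steward functionality.

```python
# Toy illustration of column profiling (min, max, null count,
# uniqueness) similar in spirit to Information Steward's Data
# Insight. Sample data and thresholds are made up.
from collections import Counter

def profile_column(values):
    """Profile one column: min/max, null count, % of rows with unique data."""
    non_null = [v for v in values if v not in (None, "")]
    counts = Counter(non_null)
    # rows whose value occurs more than once count as non-unique
    duplicates = sum(c for c in counts.values() if c > 1)
    return {
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
        "null_count": len(values) - len(non_null),
        "unique_pct": round(100 * (len(values) - duplicates) / len(values), 1),
    }

cities = ["Berlin", "Berlin", "Munich", "", "Hamburg"]
report = profile_column(cities)
print(report)
```

A real profiling run adds the address, dependency and redundancy checks listed above, which require data beyond a single column and, in the Address case, the cleansing transforms of Data Services.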
To take advantage of the capabilities listed above, the respective license is required. From here on, we will focus on capabilities included in the Data Integrator package.
SAP Data Services uses IDocs to post data into SAP S/4HANA, therefore the respective configuration on the SAP S/4HANA target system is a common requirement for all migration objects. A custom program is delivered to create the required partner profiles for each required message type (refer to building block “Data Migration IDoc Config Guide (W01)”).
We obviously also need to deploy SAP Data Services with its prerequisites, as well as the pre-delivered migration content – this is documented in “RDM_S4H_OP_DS42V2_Quick_Guide_EN_XX”, attached to Note 2239701. It is worth noting that there are two separate versions of this manual – one for Windows and one for Linux-based deployment, with support for the underlying repository database platforms varying between them. Also, the document may only detail set-up steps for one selected DB – for other supported DB platforms, refer to the standard SAP Data Services documentation.
Once the standard set-up is complete, the delivered content needs to be applied – from that point forward, the platform is ready for the preparation and execution of data migration for the selected objects.
When sourcing data from your legacy system, you have a choice of using flat files (text or XLS) as an intermediary, or connecting directly to your source system or database.
With the documentation and content delivered via the referenced Note, you can deliver your data migration project yourself. But SAP does have a packaged service to deliver the described scope – you can get more details from https://rapid.sap.com/bp/#/browse/packageversions/RDM_S4H_OP > Accelerators > Customer presentation. The service has a flexible scope and an experienced delivery team. It can be particularly interesting when you do not operate SAP Data Services in your environment and do not have the necessary skills available.
Closing remarks and considerations
The two toolsets and methods for migrating your data described above are quite different when it comes to tooling, capabilities and associated effort. The matrix below attempts to summarise and compare key aspects of each.
| | SAP S/4HANA Migration Cockpit (MC) and Migration Object Modeler (MOM) | Rapid Data Migration with SAP Data Services (DS) |
|---|---|---|
| Technical deployment | Built into SAP S/4HANA 1610 and later. | Separate deployment and set-up necessary for SAP Data Services, BI Platform and optionally Information Steward. |
| Commercial aspects | Capability provided as part of the SAP S/4HANA license. | Core capability included in selected SAP HANA licenses. Advanced functionality (for data cleansing) requires the full SAP Data Services license. |
| Data extraction methods | File-based load only at this stage. | File-based as well as direct load from the source system/database. |
| Delivery method | Best practice documentation and built-in migration object templates delivered as part of the solution. | Best practice documentation and pre-delivered migration content; also available as a packaged service from SAP. |
| Extensibility | Extensibility using MOM allows you to go beyond reliance on standard content in the Migration Cockpit. | Full extensibility using standard SAP Data Services capabilities. |
| Data quality support | None other than data validation during (simulation) posting. | Yes, but requires the full SAP Data Services license. |

Scope of supported migration objects: I decided not to attempt to compare the scope in this blog, as it changes rapidly and depends (especially in the case of MC/MOM) on the target SAP S/4HANA version. It is fair to say that up to a certain point SAP Data Services had the advantage in the number of supported objects, but this is changing rapidly. And with both toolsets supporting custom-built migration objects and scenarios, the sky is the limit…
The decision, dear reader, is yours.
Other useful links and references
- SAP knowledge articles:
- SAP HANA Academy – S/4HANA RIG: Migration Cockpit
- SAP HANA Academy – S/4HANA RIG: Migration Cockpit and Migration Object Modeler
- SAP S/4HANA Migration Cockpit: How to Migrate your SAP Data in S/4HANA
- SAP S/4HANA Migration Cockpit: Migrate Data Directly from SAP System SAP S/4HANA 2020
- SAP S/4HANA Migration Cockpit: Migrate Data Using Staging Tables, SAP S/4HANA Cloud 2008 and SAP S/4HANA 2020
- SAP S/4HANA Migration Cockpit: Transferring Data to SAP S/4HANA Using Files SAP S/4HANA 1909
- SAP S/4HANA Migration Object Modeler: Create an own mapping rule SAP S/4HANA 2020 FPS00
- SAP S/4HANA Migration Object Modeler: Migrate Data Directly from SAP System – Creating a new migration object SAP S/4HANA 2020 FPS00
- SAP S/4HANA Migration Cockpit: Enhanced error analysis capabilities SAP S/4HANA 2020 FPS01
- SAP S/4HANA Cloud – Data Migration Status – Overview
- SAP Rapid Data Migration with SAP Data Services (please note this video does not explicitly feature SAP S/4HANA, but the concepts presented are common for various data migration target solutions)
- Other informative blogs:
- Best Practice “Rapid data migration to SAP S/4HANA (on premise)”
- Blog “Introducing The Data Migration Guys”
- The SAP Press book “Migrating to SAP S/4HANA”
Excellent blog! Very informative! Thank you very much!
Very much appreciated!!
Nice and helpful
Two remarks from my side:
Cleansing functionality is part of every DS job in the Rapid Data Migration package. All migration jobs can run on top of the default Data Integrator license only. There's only one exception to the rule: a dedicated job to standardise, cleanse, match and de-duplicate Business Partner names and addresses. This job uses DQ transforms and requires the full DS license.
IS cannot cleanse data itself; it only supports the process with data profiling, definition of validation rules, and display of validation results (in a dashboard). Also here, one exception: it can be used, in combination with DS, in the de-duplication process and the creation of golden records.
Thank you Dirk, I have updated the relevant passage as per our conversation.
Very useful information. In lots of customer workshops, the SAP Data Migration & Landscape Transformation (DMLT) team used the transition net – see attached – to understand what the customer is looking for. Based on the answers to 5 guiding questions, we then had a look at the right scenario for the customer.
Any feedback is helpful.
Interesting – could you shed more light on the particular situation drawn on it?
Great blog, Very structured and simple to understand
Excellent blog, with clear separation of capabilities for tools, Thanks for the post.
Thanks for the blog. We are already using MC and its capabilities are highly appreciated by the customer in one of our green field implementation.
But, the matter of concern is the strong disclaimer from SAP about use of LSMW tool in note 2287723.
LTMC does not support batch input recording. Batch input recording is a powerful feature in LSMW for doing mass changes, and it is really quick.
But the disclaimer in the note, with statements like "The use of LSMW for data load to SAP S/4HANA is not recommended and at the customer’s own risk", makes the customer wary of any use of LSMW, especially in the pharma and life science industry, since they have to comply with GMP.
It would be great if SAP gave a clear message that LSMW is still an option wherever LTMC/LTMOM is not an option at the moment.
LSMW might propose incorrect migration interfaces; as LSMW is not object oriented, it may cause problems, so it should be tested carefully before use – as mentioned, "at the customer's own risk".
Thank you for this blog. It provides a very nice and concise review of the Cockpit & Modeler. Including Data Services is also a much appreciated addition. This helps clarify the positioning of and scenarios for the different offerings. Furthermore, the links make this even more beneficial. I will be sharing this with our team in NA and including a link on our jamsite.
One question, will this blog (and links) be actively updated periodically?
With Much Appreciation,
Thank you Todd,
I will strive to keep it up to date with changes in MOM and MC. I am planning the next update soon, to include features in S/4HANA 1709 FPS01 (released yesterday).
Very helpful, especially the schema. But why is your focus on new implementation, when the simplest seems to be system conversion – this seems to be close to an upgrade, both technical and functional, though of course data must be migrated.
I consider this from the point of view of my job as a project manager, and I am especially curious how the solution can be migrated in the case of system conversion.
Look, I focus on how to make the leap with the lowest possible level of disruption – do you have any advice on this? If you find a bit of time, please have a look at my entry considering the needed level of disruption, reflecting on risk, effort for the entire organisation and the level of governance of this kind of change:
The blog is only intended to focus on data migration strategies/tooling in case the customer chooses new implementation over conversion. In the process, I did not try to compare, nor suggest, which transition path is easier/better/quicker/…
There are several other blogs focusing on system conversion or on transition paths in general – for example, I recommend https://blogs.sap.com/2016/09/01/elements-for-designing-a-transition-roadmap-to-sap-s4hana/, which contains a link to the very useful document Elements for Designing a Transition Roadmap to SAP S/4HANA.
For conversion specifically, I recommend https://blogs.sap.com/2016/11/02/sap-s4hana-system-conversion-at-a-glance/.
Regarding disruption – and this is my personal view only – every project causes some level of disruption. Over the years I have been involved in several projects which attempted to perform a “technical transition only”. Yes, a simple update or patching could be treated as such. But if we are talking about adopting a new product which has been re-designed in several aspects, one should ask whether attempting to adopt it as a “technical transition only” makes any sense. There are things that you need to adopt as part of the transition to S/4HANA – user experience, for example. So instead of trying to “minimize” the disruption by adopting only a select few Fiori apps (“we’re doing it only because the old UI was removed”), I would rather go for maximum adoption to get the maximum advantage.
Adoption of SAP S/4HANA should be seen as an opportunity to achieve something meaningful and that cannot be done without some disruption.
Just to be clear – I have seen (and worked on) S/4HANA conversion projects which were treated in a minimalistic manner, done in 3-4 months, and which went live successfully. It can be done, but I am still of the view that they may have missed several improvement opportunities by trying hard not to make a change.
Just my personal opinion.
...Could you share any experience with minimal downtime? How much time do you need to get the data clean on HANA in PRD?
Can you clarify whether your question relates to data migration as per this blog or system conversion?
For conversions, there are several techniques/services available to minimise the downtime, for example:
- Minimized Downtime Service (MDS) is an umbrella term for specialised services offered by SAP to reduce the required downtime for any maintenance activity. The most relevant one in this case would be Near-Zero Downtime for SAP S/4HANA Conversion, which was featured in the Walmart presentation at ASUG. More details about MDS in Note 693168 - Minimized Downtime Service (MDS).
Apart from that, we are working continuously on reducing downtime through optimisations in the conversion procedure - refer to Note 2351294 - S/4HANA System Conversion / Upgrade: Measures to reduce technical downtime.
As part of these developments, we now support a special method as per Note 2293733 - Prerequisites and Restrictions of downtime-optimized conversion to SAP S/4HANA.
For transitions involving a new SAP S/4HANA build with data migration (the subject of my blog), the downtime will depend solely on how much data can be pre-loaded during uptime and how much is left to be migrated in the downtime phase.
Wow - this is the kind of answer that fully deserves to be called "exhaustive"! I need some time to assess it, but the video about Walmart is really impressive - business downtime for such a huge company limited to 25 hours, that is something!
And to give you the picture, I am right now considering how to minimize the disruption on day 1, so ideally I would keep the same process shape and then explore the new potential of S/4HANA in an agile fashion. That is why the favourite option is the first one: system conversion. You know: "run simple".
In the nice blog about conversion there is exactly what I was looking for: " This is the preferred option for Customers which want to bring their business processes to the new platform and want to adopt new innovations at their speed, then."
Do you have any experience with this option? Curious how practice meets theory in this case!
Thank you! have great day today over there!
Hello Jacek Klatt,
very helpful article. Thank you for that.
Considering the section regarding the "UPDATE" functions of the APIs:
Does that mean that if I use, let's say, the migration object for Material Master to create the master and sales view, it won't be possible to update that material master with plant data etc. in a subsequent migration run?
Thank you in advance for your response,
Thank you for the blog.
I have a question regarding the transfer ID (I am not fully sure this is the cause, but it could be).
I have been testing the Migration Cockpit in our development environment, but the vendor data I have migrated so far seems to be visible in both our development and production environments. What could be the reason? Could it be a wrong transfer ID? If so, how can I restrict my migration to specific environments?
Good job!! That's a great article and very informative!!
Yet we are unable to decide on the migration strategy (RDM, Migration Cockpit, DS, LSMW) based on the volume of data.
Please shed some light on the approach to consider based on data volumes:
What is the maximum volume the Migration Cockpit can handle, and how long does it take?
What is the next best approach to consider if the volumes are high?
Hi Jacek Klatt
Since some time has passed between this post and the present moment (2018/11), I've been searching for updates to this information, and I haven't found any mention of RDM for SAP S/4HANA in the latest SAP roadmaps, only the Migration Cockpit in the simplification list. Is SAP still recommending RDM for executing data migration in SAP S/4HANA projects?
Another thing I haven't found is a comparison of the execution performance of those two methods (RDM and MC+MOM) when tables are large. Do you have this comparison, or do you know where I can find it?
Jacek Klatt, an interesting article which covers SAP Data Services, the RDM content and the Migration Cockpit well. Did you review the functionality provided by the SAP Advanced Data Migration solution? It complements the SAP DS, IS and Migration Cockpit products and is used in many of the largest and most complex migration projects being implemented globally on both ECC and SAP S/4. I am the product manager for this solution and would be happy to set up a call so that you can understand the benefits and differences it brings to the area of data migration. I was also the main architect of the SAP RDM solution before leaving SAP, so I can provide context on the tooling you have already reviewed. firstname.lastname@example.org.
Hi Jacek, I have a query. If we do a re-implementation, can all the legacy transactional data be migrated through the Migration Cockpit, or do we have to keep our legacy system running as a reference for historical data? Historical data is required for audits or legal issues. Kindly reply.
A perfect blog on "Data Migration", which addresses one of the pain areas of the digital transformation era.
A very informative blog... Thanks for sharing!
Hello Jacek, witaj Jacku,
Could you please comment on the meaning and practical consequences of this statement in the original blog above:
"Display Target Structures You can get an overview of the target structures/fields – these are dictated by the APIs used by the Migration Object to post data to SAP S/4HANA, thus cannot be changed here."
If that is indeed so, what could be the possible reasons for changing an SAP-provided source structure that already provides all the elements permitted in the target structure?
In what practical case would I need to enhance such a source?
Just for the record, I am currently discovering both LTMC and LTMOM from the purely functional perspective of a long-time SAP consultant who does not deal with ABAP. I use release 1809, and much of what you describe above can be matched there.
What actual changes to your original presentation (dating back to 1610), if any, are you aware of when it comes to 1809? Could you share them with us here, if relevant?
Good summary. Thanks!
It's a fantastic article and summary!! Thanks for your great contribution and time.
Jacek, thank you for the blog.
There are two video links mentioned in the section "SAP Data Services and associated best practice": "To see the process involved further, please check the click-through demo or view the detailed recorded demo."
The first one is not accessible and the other one seems to be a private video.
Please let me know how we can access them.
Thank you, Nitin, for bringing this to my attention.
I have tried to source relevant videos and have updated the "Links" section at the bottom of the blog with a "Videos" sub-section, where I linked all the relevant videos I could find. For Rapid Data Migration with SAP Data Services, I found one which is (from memory) similar to the one linked originally.
Overall a nice and informative blog.
I see in the PDFs that for the SAP S/4HANA Cloud edition, the Data Migration Status app is available for post-load data validation and reconciliation. However, I do not see any clarity on this app's availability for RISE with SAP Private Cloud Edition.
If a customer implements the Migrate Your Data app and SAP offers no good way of validating and reconciling data, wouldn't that be a deal breaker? There should be a way to validate and reconcile the loaded data using standard SAP tools. Any direction here would be helpful.