- 1. Overview
- 2. Frequently faced problems during Data Migration
- 3. Importance of Planning
- 4. Factors to be considered while planning
- 4.1. Identify the appropriate masters & fields required during data migration
- 4.2. Tool Development
- 4.3. Tool Testing
- 4.4. Use of reusable assets
- 4.5. Master data awareness sessions to business users
- 4.6. Sample data validation and feedback to Core User
- 4.7. Status Meetings at Regular Interval
- 4.8. Cut off for receiving master Data
- 4.9. Validation and sign off
- 4.10. Uploading and Verifying uploaded data
1. Overview
Friends, this is indeed a huge topic with many angles to discuss. You will find many threads on data migration that deal with the technical aspects, such as specific problems related to data migration tools: whether LSMW is good, whether BDC gives faster and more accurate results, whether there is some other tool, and so on! Here I shall not be discussing the technical complexities of the tools. The idea is to show how neatly this entire activity can be planned, what factors should be considered while planning, and how it should be executed for a hassle-free experience during the data migration phase!
2. Frequently faced problems during Data Migration
Do you really think the only problems most of you face in the data migration phase are related to the tools used? I don’t think so! Other factors contribute here as well. If I look back at the projects I have worked on, or if you ask experienced consultants who have worked on data migration, the answers are along these lines:
- Delay in the receipt of the data from the business users
- No time left for technical validation as the Go Live deadline is approaching faster
- No authorization in production server for uploading the data
- Tool is not working or giving incorrect results when the uploading is in progress
- The concerned master data transaction is locked by another user at the same time, resulting in errors
- Records could not be loaded as dependent masters do not yet exist in the system
- Incorrect data, such as a field containing more characters than permissible; or sometimes a new field is added by the customer at the last moment and there is no provision in the tool to incorporate it
- Immediately after Go Live, someone from the business team comes with the complaint that the uploaded master data is not correct or the inventories were loaded with incorrect valuation.
- A few records were missed or not provided by the user and were never uploaded into the system
3. Importance of Planning
The above list could go on if we went down to very minute levels, and you can observe the variety of issues here: issues from the consulting side and from the business side, covering both technical and non-technical aspects. The important question is: can the above problems be avoided? According to me, the answer is yes. If you have strong planning and execution strategies in place, all of these can be suppressed. In planning, you define a check sheet (based on your past experience) for the activities you are supposed to do, along with timelines for every single step. The planning check sheet provides transparency and clarity to the consultant and to the business users about their individual roles in this phase.
Master data is treated as a key data element in any SAP project, and that is why it should be given the utmost importance. Master data serves as the main basis for the success or failure of the system, and hence a lot of awareness needs to be spread among the users who deal with it from the business side. At the same time, the consultant should also know its consequences. It is not only the data migration phase that is affected; it affects the entire business. Hence, accurate master data should be the topmost priority for any organization that will be running on SAP. In my opinion, successful data migration is a joint responsibility of the business and consulting teams.
4. Factors to be considered while planning
Many project managers and delivery managers plan this as a special event with a very detailed matrix of sequential events to be followed, along with responsibility fixation, which really helps in streamlining the project resources. In some cases a separate coordinator is also assigned to monitor this activity closely. Although detailed-level planning is done by most managers, it is equally important that every team member understands and is aligned to it. I may not be in a position to provide each and every single step here, as it could vary from project to project and industry to industry based on project size. However, below are a few common factors that can help plan this activity well!
4.1. Identify the appropriate masters & fields required during data migration
The ideal time to start is as soon as the Business Blueprints (BBPs) are signed off. The sign-off indicates that the requirement-gathering phase is over and there is little or no chance that new requirements will be entertained. So at this stage, or probably when you design the solution approach, you are pretty clear about what types of master data are required from the business side. You should start listing down such masters (e.g. Vendor Master, Customer Master, Material Master, Routing, Inspection Plans, BOM, Equipment Master, etc.) belonging to each module.
You will not utilize each and every field on the SAP screen for a given master; this depends upon the business requirements. And as the BBP phase has been signed off by now, you have an idea of the fields that need to be utilized. You should also consider custom fields if you are planning to develop them as part of your solution. List down all such fields, master-wise and module-wise. All of this serves as input for data migration tool development. Tool development also takes some time, hence it is better to start on it as soon as the BBPs are signed off.
4.2. Tool Development
Once you have identified the required masters and fields, the next step is to develop a tool for data migration. You should choose the tool wisely.
Although there is no specific rule about which tool should be used, I personally first try to find whether a standard tool is available, such as transaction codes MASS, MM50, QA08 or QI06. By opting for such standard transaction codes you minimise the risk of tool failure in the production environment. The pity is that you do not have a standard transaction for every master you want.
The next featured options are LSMW or BDCs. Some consultants prefer LSMW while others choose BDCs; there are a few cases where people rely on eCATTs as well. For some of the objects, standard LSMWs are available via the batch input method. LSMW saves much ABAP development time over BDCs, but there are a few objects where BDCs prove more useful than LSMW. You should choose wisely and finalize as early as possible which one you will adopt, taking into consideration the required fields you have already listed down.
4.3. Tool Testing
Testing of these tools is the next important factor. You should test each of these tools by uploading some dummy records into your test server first, and take corrective action on the tool based on the test results. You can verify in the SAP tables whether the records were generated or not. It is better to take a dump of the relevant tables before uploading these masters so that you can later compare for correctness.
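The before-and-after comparison of table dumps can be automated with a short script. Below is a minimal sketch in Python, assuming both dumps have been exported as CSV files; the file names and the key column (`MATNR`) are illustrative assumptions, not a prescribed format:

```python
import csv

def load_keys(path, key_field):
    """Read a CSV table dump and return the set of key values it contains."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key_field] for row in csv.DictReader(f)}

def compare_dumps(before_path, after_path, key_field):
    """Return (created, missing): keys added by the upload, and keys that vanished."""
    before = load_keys(before_path, key_field)
    after = load_keys(after_path, key_field)
    created = after - before   # records the tool actually generated
    missing = before - after   # should normally be empty
    return created, missing

# Usage (illustrative file names):
# created, missing = compare_dumps("mara_before.csv", "mara_after.csv", "MATNR")
# print(f"{len(created)} records created, {len(missing)} unexpectedly missing")
```

If the count of created records matches the dummy file you uploaded, the tool passed this round of testing.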
Once you are satisfied with the tool results, you should circulate the master data templates to the core users who will be responsible for filling in the master data. The templates should contain the below information:
- Field Name
- Field Description
- Maximum Number of characters allowed
- Explanation of each field
- Information about whether any field is mandatory
- Any other extra information if needed
It is better if you can share sample filled templates with the core users. That will help them understand how the master data needs to be filled in.
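The template information listed above (field lengths, mandatory flags) can also drive an automated pre-check of the filled files before a validation cycle is spent on them. A minimal sketch in Python; the field names and length limits below are illustrative assumptions, not actual SAP field definitions:

```python
# Per-field rules: maximum length and whether the field is mandatory.
# The fields and limits below are illustrative, not real SAP field lengths.
FIELD_RULES = {
    "MaterialNumber": {"max_len": 18, "mandatory": True},
    "Description":    {"max_len": 40, "mandatory": True},
    "BaseUnit":       {"max_len": 3,  "mandatory": True},
    "OldMaterialNo":  {"max_len": 18, "mandatory": False},
}

def validate_row(row, rules=FIELD_RULES):
    """Return a list of human-readable problems found in one template row."""
    problems = []
    for field, rule in rules.items():
        value = (row.get(field) or "").strip()
        if rule["mandatory"] and not value:
            problems.append(f"{field}: mandatory field is empty")
        if len(value) > rule["max_len"]:
            problems.append(f"{field}: {len(value)} chars exceeds limit {rule['max_len']}")
    return problems
```

Running every row of the received file through such a check gives the user a precise list of corrections instead of a vague "the file has errors".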
4.4. Use of reusable assets
Time saved translates into profitability! If you are working on a roll-out project, there is a chance that your earlier team, or you yourself, have already developed the tools for data upload. These are proven tools, as they have already been used for an upload. So why not use them again to upload the data for the new plant on the same server? I always recommend using such tools to the maximum extent you can in the new plant, assuming all the master data fields are the same for the new plant. You can thus minimize the threat of tool failure.
4.5. Master data awareness sessions to business users
Successful data migration does not only mean uploading the data into the system without any error; it means more than that. The best measure of this success is few or no data-related tickets immediately after go-live. It is thus a joint responsibility of business users and consultants. Business users are responsible for collecting the data and for the accuracy of the data provided, while it is the consultant’s skill to upload the data received successfully into the system. If either of the above fails, it has an adverse impact on the business.
Having said this, it is important that the business users are well aware of the consequences of providing inaccurate, incomplete or tentative ("to be") master data, and of how badly it will hit their future business. Consultants should therefore have healthy discussions with business users to make them aware of these facts and to explain the roles and responsibilities.
According to me, this phase should cover the below points:
- Publishing and explaining the data migration plan to business users along with clarity of roles of the user and the consultant
- Arrange the meetings with them to explain the overall importance of accurate master data and how it affects the future business conditions
- Importance of dependent master data – this ensures that all parties refer to the same masters. For example, the material master is prepared by the MM team, but the Bill of Materials for the same material is prepared by the PP team. Both should therefore use the same material code and description during data collection. This keeps all departments in sync with each other and minimises the risk of referring to the same code differently.
- In one-on-one sessions with the respective users, you can explain each and every field of the master data template. You should also make them aware of the field lengths and the other parameters that are mandatory to fill in.
- Duplication of records is another threat to data migration. It is good if the users are made aware of this as well.
4.6. Sample data validation and feedback to Core User
You should ask your core users for sample data for verification. This helps ensure that the users are filling in the template as required. It also helps avoid rework and clears up the understanding between you and the user. Once you give your consent, the user can start collecting the bulk data.
4.7. Status Meetings at Regular Interval
Perhaps this is the thing most folks dread. 😥 But it is always advisable to hold meetings at regular intervals to review the status of data collection. This is the forum where both sides (core users and consultants) can raise their concerns and solutions can be explored. The more transparency you maintain in these meetings, the fewer problems you face while loading the masters. Such meetings help you track the progress of data collection, especially for the dependent masters.
4.8. Cut off for receiving master Data
You should set a cut-off date in the planning phase for receiving the data from business users. Business users are expected to submit the entire applicable master data by this date, and the cut-off should be mutually agreed well in advance with them. This is really a good practice, I would say, as it can suppress many problems that may arise later. Imagine a situation where you have invested a week in validation work on the data received (using your Excel expertise) and the file is ready for upload. At this moment a user sends many additional records, saying he or she missed incorporating them in the earlier file. Gosh! 😯 If a proper idea had been given to them well in advance, they could have prepared the entire data by the cut-off date without hampering their regular office work.
I accept that this cannot hold true every time. But if it is mutually agreed with the customer, both parties can save a lot of time and concentrate on the further steps.
4.9. Validation and sign off
Once you receive the data from the users, you validate it. If any abnormalities are observed during validation, you send the file back to the users for correction. The cycle repeats until the final version of the file, in ready-to-upload format, is reached. Several tools are available these days for validation purposes. You can even develop a new validation tool in SAP itself, which can compare the values in the Excel or txt file with the dependent masters and give you the list of abnormalities in a few seconds. Using Excel techniques you can suppress duplicate records. You can thus ask your user to rectify just those entries and resend the corrected version. Backing up the data is also important here. You should always have the practice of storing a backup of every file you worked on. Version management helps you identify the latest file on which work was done. Losing a file you have worked on may cost you hundreds of hours, remember!
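The two checks described above – suppressing duplicate records and verifying dependent masters – can be sketched as follows. A minimal Python example, assuming the received file has already been read into a list of dicts and that the set of valid material codes is available (the field names and data are illustrative):

```python
def find_duplicates(rows, key_field):
    """Return the key values that appear more than once in the received file."""
    seen, dupes = set(), set()
    for row in rows:
        key = row[key_field]
        (dupes if key in seen else seen).add(key)
    return dupes

def find_missing_dependencies(rows, ref_field, valid_codes):
    """Return referenced codes that do not yet exist in the dependent master list."""
    return {row[ref_field] for row in rows if row[ref_field] not in valid_codes}

# Usage (illustrative data): a BOM file referencing material codes
bom_rows = [
    {"BOM": "B1", "Component": "M1"},
    {"BOM": "B1", "Component": "M2"},
    {"BOM": "B1", "Component": "M1"},  # duplicate component line
]
valid_materials = {"M1"}               # M2 was never created in the system
print(find_duplicates(bom_rows, "Component"))                             # {'M1'}
print(find_missing_dependencies(bom_rows, "Component", valid_materials))  # {'M2'}
```

The output of such a script is exactly the list of abnormalities you send back to the user for correction, so each validation cycle is targeted rather than a full re-review.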
Once the validation is done, there should be a formal sign-off from the business side. This authenticates that the contents submitted are correct from the business perspective, and it can avoid many disputes.
4.10. Uploading and Verifying uploaded data
Some teams prefer to upload the data first into a mock or QA server, to make doubly sure that no errors occur while uploading. This depends on the hardware and the availability of server infrastructure for this purpose. If the results are successful, you can upload the data into the production environment. A very detailed check sheet may be involved when you actually start the data migration; reproducing such a check sheet here is not possible, as it is more a matter of technical sequencing than administration. For example:
- Whether the programs or LSMWs have been transported to Production server?
- Have you got the necessary authorizations for upload in Production server?
- Whether the sequence of the master data to be uploaded is finalized?
- Whether all consultants are in sync with this?
It is also advisable that the business team be available while the data migration is in progress, so that any decisions required for last-moment data corrections can be taken. To cross-check the data uploaded in the production server, you should keep two days or so as a placeholder before the actual Go Live. You can use this time to rectify anything that went wrong. This ascertains that all is well! 😎
It is a bit challenging to provide a detailed check sheet of dos and don’ts here, as it may vary dynamically, but I am hopeful that if the above factors are considered during data migration planning, you will have fewer surprises in the data migration phase.
Nothing can replace your own prepared check sheet and the experience you have gained, or the things you have learnt from mistakes. You should include every minute point that you think could be important (no matter how silly it is) so that it does not slip your mind at later stages. Go on marking steps green as you complete them. Trust me, it helps a lot, especially when dependencies are involved. We all learn from the past, and every project adds new value to our repository. So keep refining the check sheet based on your project learnings and the experiences of your seniors.
As I mentioned at the beginning, this is indeed a huge topic, and I may well have missed a few important points. I sincerely invite all your ideas, suggestions or corrections to make this document even simpler and more neatly organised. Plan wisely! I reiterate: include every minute point that you think could be important! You will have fewer surprises in the data migration phase!
Thank you and good luck to you all! 🙂