Introduction: Today, organizations need effective SAP implementations to create value for their customers while reducing their own cost of services. Decision makers look at data to resolve key business issues and to make strategic business decisions, so effective data conversion is gaining importance.
Data migration basically means moving data from one system to another. It can be driven by several customer initiatives, such as an application change (e.g. moving from Oracle to SAP) or an upgrade (moving to a newer SAP release).
Conversion does not simply mean moving data from one system to another; rather, it means moving meaningful data.
To emphasize this point, consider a simple example: there is a data conversion requirement for the customer master, and during the data load a digit goes missing from a customer's contact number for unknown reasons. Imagine the impact this small miss has on customer service.
Data Conversion Challenges: Data migration is generally regarded as a simple task, which understates the real risks involved.
- A major challenge is getting to know the data before it is too late; otherwise we may lose both time and resources, eventually resulting in loss of money.
- The source of data governs the course of the migration, but the source itself may change, since other initiatives within the organization could be driving changes to the source system.
- The surge in data volumes, together with the need to migrate data from multiple source systems, poses further migration challenges.
How can a data pre-validation tool help?
This tool is a step closer towards smooth data migration: it gives the data migration team the ability to check the data before the actual load begins. We see pre-validation as a step between the sequential steps defined by the industry, Transform and Load, ensuring data quality and saving time on the migration activity.
This pre-load validation tool is generic and scalable; it can be used across SAP systems for diverse conversion requirements spanning data load activities in various functional modules.
It gives you the flexibility to identify and resolve data issues even before the data is loaded into SAP.
- The power of this tool lies in its ability to incorporate complex business rules to validate the data.
- It can carry out DDIC checks, harnessing the value of the field definition attributes.
- It focuses on saving time and cost by addressing the key considerations for ensuring data quality.
- Furthermore, in contrast to the standard SAP load programs, it can capture multiple errors for a single field value.
- The tool works for both standard and custom conversion requirements.
- Where moving data from a legacy system to SAP is a periodic task, the tool can be used to identify the root cause and request the legacy team to fix the issue on their side; for example, a length mismatch between the two systems.
The idea is to create a generic tool, and for that it is necessary to determine the input file structure at run time.
We will need to create a couple of custom tables along with their maintenance views:
- Header Table: holds a unique conversion ID that lets the program uniquely identify the file structure, along with a description and other attributes associated with the conversion.
- Item Table: holds the specifics of the file structure for each conversion ID, such as the table name associated with each field, the sequence in which the fields appear in the file, and additional attributes like a flag to ignore a field's value during the validation run.
These tables will serve as the backbone of the pre-load validation program.
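The two configuration tables can be sketched conceptually as follows. The real implementation would be ABAP DDIC tables with maintenance views; this Python sketch only illustrates the shape of the data, and all names and sample values (the `CUST01` conversion, the KNA1 fields) are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConversionHeader:
    """One row of the header table: identifies a conversion."""
    conv_id: str          # unique conversion ID, e.g. "CUST01"
    description: str      # human-readable description of the conversion

@dataclass(frozen=True)
class ConversionItem:
    """One row of the item table: one field of the input file."""
    conv_id: str          # foreign key to the header table
    seq: int              # sequence of the field in the input file
    table_name: str       # SAP table that owns the field, e.g. "KNA1"
    field_name: str       # field whose DDIC attributes drive the checks
    ignore: bool = False  # skip this field during the validation run

# Illustrative configuration for a customer-master conversion
header = ConversionHeader("CUST01", "Customer master pre-load validation")
items = [
    ConversionItem("CUST01", 1, "KNA1", "KUNNR"),
    ConversionItem("CUST01", 2, "KNA1", "NAME1"),
    ConversionItem("CUST01", 3, "KNA1", "TELF1"),
]
```

The item rows are keyed by conversion ID and sequence, so the report can reconstruct the file layout at run time from the configuration alone.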
- Error Classification: We can create a customizing table to store the categories specified below. The errors are classified into four categories:
- Length Mismatch
- Type Conflict
- Input Data not defined in SAP
- Input format issue
- Define Output Structure for ALV Display
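The four error categories above can be represented as a simple enumeration. In the actual tool this would be a customizing table in SAP; the Python enum below is only a conceptual stand-in, and the member names are hypothetical.

```python
from enum import Enum

class ErrorCategory(Enum):
    """The four predefined validation error categories."""
    LENGTH_MISMATCH = "Length Mismatch"
    TYPE_CONFLICT = "Type Conflict"
    VALUE_NOT_IN_SAP = "Input Data not defined in SAP"
    FORMAT_ISSUE = "Input format issue"
```

Keeping the categories in one place lets both the ALV error list and the graphical summary group errors consistently.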
The next step is to create a report program whose selection screen takes the conversion identification number and the input file path as mandatory inputs. After execution, the report displays an ALV output listing the errors for each record in the Excel file used as input. The output is easy to understand, with only a few fields: the Excel row containing the error, the field name, the field value, and the error description.
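One row of that ALV error list can be sketched like this. The real output structure would be an ABAP DDIC structure behind an ALV grid; the field names and the sample values below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ValidationErrorRow:
    """One row of the ALV error list shown after the validation run."""
    excel_row: int     # row of the input Excel file that failed
    field_name: str    # field in error
    field_value: str   # offending value, echoed back for context
    description: str   # human-readable error text

# Illustrative row: a non-numeric value found in a numeric customer number
row = ValidationErrorRow(
    excel_row=12,
    field_name="KUNNR",
    field_value="12A4567890",
    description="Type Conflict: non-numeric value in NUMC field",
)
```

Echoing the offending value alongside the row number means the user can fix the source file without re-opening it side by side with a log.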
The report can also generate a graphical output displaying the errors found for each field, grouped by the predefined categories.
The core validation logic of the report reads the DDIC attributes associated with each field and table name stored in the item table, and uses them to perform type checks, value-table or check-table checks, format checks for date or currency fields, and length checks.
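The validation logic described above can be sketched as a single function that runs every applicable check and collects all failures, mirroring the tool's ability to report multiple errors for one field value. This is a simplified Python stand-in for the ABAP logic: the `ddic` dictionary is an assumed, reduced model of the attributes the real program would read from the Data Dictionary (field length, built-in type, allowed values from a check or value table).

```python
import re

def validate_field(value, ddic):
    """Run length, type, format, and value checks on one field value.

    `ddic` is a simplified stand-in for the DDIC attribute lookup:
    {"length": int, "type": "NUMC"|"CHAR"|"DATS", "fixed_values": set|None}.
    Returns a list of (category, message) tuples; several checks can
    fire for the same value.
    """
    errors = []
    # Length check against the defined DDIC length
    if len(value) > ddic["length"]:
        errors.append(("Length Mismatch",
                       f"value exceeds defined length {ddic['length']}"))
    # Type check: NUMC fields must contain digits only
    if ddic["type"] == "NUMC" and not value.isdigit():
        errors.append(("Type Conflict", "non-numeric value in NUMC field"))
    # Format check: internal SAP date format is YYYYMMDD
    if ddic["type"] == "DATS" and not re.fullmatch(r"\d{8}", value):
        errors.append(("Input format issue", "date must be YYYYMMDD"))
    # Check-table / value-table check against the allowed value set
    if ddic.get("fixed_values") and value not in ddic["fixed_values"]:
        errors.append(("Input Data not defined in SAP",
                       "value not found in check/value table"))
    return errors

# An 11-character, non-numeric value in a NUMC(10) field fires two errors
errs = validate_field("12A4567890X",
                      {"length": 10, "type": "NUMC", "fixed_values": None})
```

Collecting errors in a list rather than stopping at the first failure is what distinguishes this from a standard load program, which typically rejects a record on its first error.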
Step 1: Create Header Table
Step 2: Create Item Table
Step 3: Define generic output structure for ALV display.
Step 4: Define a customizing table for the error categories described above (optional step).
Step 5: Create a report program in transaction SE38, with a name like “Z_VALIDATE_DATA_READ_VALIDATE” and the selection screen described in the section above.
Step 6: Build the code along the lines of the code snapshot in the appendix at the end.
Step 7: Review the ALV output / graphical output.
You can also choose the chart type to club together the error counts for the predefined categories.
Appendix: Code Snapshot is attached