Recently, after implementing an LMS data migration for one of our clients, I realized that the entire process is a huge learning experience in itself and worth sharing with all of you, for the simple tips, tricks, and lessons learned. This will be a series of posts in parts.
The SuccessFactors LMS conversion was carried out for one of our clients, a huge consumer-goods multinational with a global presence, where the key business decision was to migrate 14 years of learning data from the legacy system to SuccessFactors Learning.
For any conversion, a few key decisions need to be drilled down and settled early:
- The Data Migration Scope
- The Conversion Approach & Strategy
Data Migration Scope: After discussions with the business and a review of the existing learning system, the implementation partner and the project business team arrive at a consensus on the scope of the data migration. Key decisions focus on:
- The amount of employee data to be migrated
- The courses to be migrated and the criteria that qualify them for migration
- The content migration approach and strategy
- History data
- Constraints that determine what stays out of scope
The Conversion Approach & Strategy: I will use the classic ETL approach – Extract, Transform, Load – as the basis of the conversion approach.
- Extract – reading data from the specified source system(s) and taking the desired subset to be loaded into the SuccessFactors LMS.
- Transform – converting the acquired data into the desired state, using rules, mapping fields into SF or lookup tables, or creating combinations with other data.
- Load – writing the prepared data to the target system, i.e. your SF LMS.
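As a minimal sketch of these three steps (the legacy column names, the status mapping, and the output field names such as `STUD_ID` are illustrative assumptions, not an actual SuccessFactors connector schema):

```python
import csv

# Hypothetical mapping from legacy status codes to target completion
# statuses; real mappings come from the transformation rules agreed
# with the extraction team.
STATUS_MAP = {"C": "COMPLETE", "I": "INCOMPLETE", "W": "WITHDRAWN"}

def extract(path):
    """Extract: read rows from the legacy export (pipe-delimited)."""
    with open(path, encoding="utf-8", newline="") as f:
        return list(csv.DictReader(f, delimiter="|"))

def transform(rows):
    """Transform: apply mapping rules and keep only in-scope fields."""
    out = []
    for r in rows:
        out.append({
            "STUD_ID": r["user_id"].strip().upper(),
            "CPNT_ID": r["course_code"].strip(),
            "COMPL_STAT_ID": STATUS_MAP.get(r["status"], "UNKNOWN"),
        })
    return out

def load(rows, path):
    """Load: write the pipe-delimited file for the LMS connector."""
    with open(path, "w", encoding="utf-8", newline="") as f:
        w = csv.DictWriter(f, fieldnames=rows[0].keys(), delimiter="|")
        w.writeheader()
        w.writerows(rows)
```

In a real conversion each step is far more involved, but keeping the three stages as separate functions makes it easy to re-run a single stage during mock conversions.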
The strategy defines the items in scope and works through the following sequential process:
Data Extraction and cleansing
- Data extraction
- Data cleansing
- Data/File preparation
- Data migration using LMS connectors
Verification and Final Conversion
- Secure file transfer via SFTP
- Data File Processing
- Data archiving
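For the archiving step at the end of this process, one simple option (the file names and folder layout here are assumptions, not a prescribed SuccessFactors convention) is to zip each processed data file with a timestamp so every mock and production run stays traceable:

```python
import os
import zipfile
from datetime import datetime

def archive_processed_file(path, archive_dir):
    """Move a processed data file into a timestamped zip archive."""
    os.makedirs(archive_dir, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    base = os.path.basename(path)
    zip_path = os.path.join(archive_dir, f"{base}.{stamp}.zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(path, arcname=base)
    os.remove(path)  # the original now lives only inside the archive
    return zip_path
```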
With the approach and strategy covered, let's move on to generic best practices for implementing an LMS conversion:
- Collaboration with the Extraction POC/Teams: Close collaboration with the extraction team/POC, whether onshore or on the client side, is a must while working on the conversion. Transformation mappings and rules must be discussed in detail and documented so the extraction logic is correct, and a series of verification steps should validate the files processed at the extraction end.
A checklist should be prepared for both the extraction and conversion teams for each connector used, to ensure proper validation.
- Multiple SuccessFactors Environments: It is always advisable to have two or more SuccessFactors environments for mock conversions before performing the actual conversion in Production.
- Test IDs in a Different Number Range/Naming Convention from the Actual Data Load: While testing, always ensure that the test data follows a different naming convention and number range than the actual data load for the mock runs.
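A simple guard for this convention might look like the following (the `TEST_` prefix and the 900000–999999 reserved range are hypothetical examples; agree the actual convention with your project team):

```python
def is_test_id(user_id, prefix="TEST_", reserved_range=range(900000, 1000000)):
    """Return True if an ID follows the assumed test-data convention:
    either a TEST_ prefix or a numeric ID in the reserved range."""
    if user_id.startswith(prefix):
        return True
    try:
        return int(user_id) in reserved_range
    except ValueError:
        return False
```

Running a check like this over a mock-conversion file makes it easy to confirm that no test record can collide with real production IDs.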
- Primary Eligibility Checks: Always ensure the flat file is UTF-8 encoded, with proper pipe separators and end-of-record delimiters.
- Special Character Checks: Opening CSV files with OpenOffice, or using the import-from-text method in Excel, helps avoid special characters being loaded incorrectly.
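If you are scripting the checks instead of using OpenOffice or Excel, the same problem is avoided by reading the file with an explicit encoding; `utf-8-sig` also strips the byte-order mark that Excel often prepends when saving CSV:

```python
import csv

def read_csv_safely(path):
    """Read a CSV with an explicit encoding so special characters
    (accents, umlauts) are not mangled; 'utf-8-sig' additionally
    strips a leading byte-order mark if one is present."""
    with open(path, encoding="utf-8-sig", newline="") as f:
        return list(csv.DictReader(f))
```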
- Validation at Different Levels: Validate the data at the extraction level and run spot checks for verification. Apply standard checks for field formatting, required fields, and reference values, plus logical and functional checks to confirm the correctness of the transformation and mapping rules from the legacy system to SuccessFactors.
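The standard checks lend themselves to a small row validator. This is only a sketch: the required fields, date format, and reference values below are illustrative, and the real lists come from the connector specification and the agreed mapping rules:

```python
import re

# Illustrative rules only; real required fields and reference values
# come from the connector specification and the legacy-to-SF mapping.
REQUIRED = {"STUD_ID", "CPNT_ID", "COMPL_DTE"}
VALID_STATUSES = {"COMPLETE", "INCOMPLETE"}
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def validate_row(row):
    """Standard checks on one record: required fields present,
    date formatting correct, status a known reference value."""
    errors = []
    for field in REQUIRED:
        if not row.get(field, "").strip():
            errors.append(f"missing required field {field}")
    if row.get("COMPL_DTE") and not DATE_RE.match(row["COMPL_DTE"]):
        errors.append("COMPL_DTE not in YYYY-MM-DD format")
    if row.get("COMPL_STAT_ID") not in VALID_STATUSES:
        errors.append("COMPL_STAT_ID not a known reference value")
    return errors
```

Logical and functional checks (e.g. "every completion must reference a course that was actually migrated") sit above this level and are best done as cross-file queries after loading into a staging area.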
In the next post, we will look at tips, tricks, and best practices for each connector used in an LMS conversion, along with lessons learned for a successful implementation, so stay tuned for yet another post!