
After recently implementing an LMS data migration for one of our clients, I realized that the entire process of migrating data is a huge learning experience in itself, and that the tips, tricks, and lessons learnt should be shared with all of you. This will be a series of posts in parts.

The SuccessFactors LMS conversion that we carried out was for one of our clients, a large consumer-goods multinational with a global presence, where the key business decision was to migrate 14 years of learning data from the legacy system to SuccessFactors Learning.

For any conversion, a few key decisions need to be drilled down and agreed upon:

  • The Data Migration Scope
  • The Conversion Approach & Strategy

Data Migration Scope: After discussions with the business and a review of the existing learning system, the implementation partner and the project business team arrive at a consensus on the scope of the data migration. Key decisions focus on:

  1. The volume of employee data to be migrated
  2. The courses to be migrated, and the parameters that qualify them for migration
  3. The content migration approach & strategy
  4. History data
  5. Challenges that determine the not-in-scope details

The Conversion Approach & Strategy: I will use the classic (if clichéd) approach of ETL – Extract, Transform & Load – to form the basis of the conversion approach.

  • Extract – Reading data from the specified source system(s) and taking the desired subset to be loaded into the SuccessFactors LMS.
  • Transform – Converting the acquired data into the desired state, using rules, mapping fields into SF or lookup tables, or creating combinations with other data.
  • Load – Writing the transformed data to the target system, which here is your SF LMS.
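The three steps above can be sketched in Python. This is a minimal illustration, not the actual connector logic: the legacy export is assumed to be a simple CSV, and the target column names (STUD_ID, CPNT_ID, COMPL_DTE) are stand-ins for whatever the real connector template defines.

```python
import csv
import io

# Hypothetical mapping from legacy export columns to SF LMS connector columns.
FIELD_MAP = {"emp_id": "STUD_ID", "course_code": "CPNT_ID", "completed_on": "COMPL_DTE"}

def extract(raw_csv):
    """Extract: read rows from the legacy export (here, an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: rename fields per the mapping; drop rows missing the key ID."""
    out = []
    for row in rows:
        if not row.get("emp_id"):
            continue  # illustrative cleansing rule: skip records without an employee ID
        out.append({FIELD_MAP[k]: v.strip() for k, v in row.items() if k in FIELD_MAP})
    return out

def load(rows, path):
    """Load: write a pipe-delimited UTF-8 flat file for the LMS connector."""
    with open(path, "w", encoding="utf-8", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(FIELD_MAP.values()), delimiter="|")
        writer.writeheader()
        writer.writerows(rows)
```

Note that the load step already bakes in the pipe-delimited, UTF-8 file format that the connector eligibility checks demand.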

The strategy defines the items under scope and works through the following sequential process:

Data Extraction and cleansing

  1. Data extraction
  2. Data cleansing
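As an illustration of the cleansing step, here is a sketch with assumed rules: trimming whitespace, normalizing a US-style legacy date to ISO 8601, and dropping exact duplicates. The real cleansing rules come out of the scoping discussions above.

```python
from datetime import datetime

def cleanse(records):
    """Cleansing pass with illustrative rules: trim whitespace, normalize a
    US-style legacy date (MM/DD/YYYY) to ISO 8601, drop exact duplicates."""
    seen, cleaned = set(), []
    for rec in records:
        rec = {k: v.strip() for k, v in rec.items()}
        if "completed_on" in rec:
            try:
                rec["completed_on"] = datetime.strptime(
                    rec["completed_on"], "%m/%d/%Y").date().isoformat()
            except ValueError:
                pass  # already ISO, or left for manual review
        key = tuple(sorted(rec.items()))
        if key not in seen:  # drop exact duplicates after normalization
            seen.add(key)
            cleaned.append(rec)
    return cleaned
```

Normalizing before deduplicating matters: two legacy records that differ only in date format collapse into one record after normalization.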

Data Migration

  1. Data/File preparation
  2. Data migration using LMS connectors

Verification and Final Conversion

  1. Verification
  2. File Transfer via FTP for secure file transfer
  3. Data File Processing
  4. Data archiving
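For the final archiving step, a minimal sketch using only the Python standard library (the directory names and the `*.txt` glob are illustrative, not a connector requirement): each run's processed files are zipped with a timestamp so mock and production loads stay auditable.

```python
import zipfile
from datetime import datetime
from pathlib import Path

def archive_run(staging_dir, archive_dir):
    """Zip the processed migration files for one conversion run."""
    staging, archive = Path(staging_dir), Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    zip_path = archive / f"lms_conversion_{stamp}.zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(staging.glob("*.txt")):
            zf.write(f, arcname=f.name)  # keep only the file name inside the zip
    return zip_path
```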

[Figure: Data migration process flow]

Having discussed the approach and strategy, let us move on to the generic best practices for implementing an LMS conversion:

  • Collaboration with Extraction POC/Teams: Close collaboration with the extraction team/POC (onshore or on the client side) is a must while working on the conversion. Detailed discussions on transformation mappings and rules need to be documented to ensure correct extraction logic, and a series of verification steps should be built in to validate the files being processed at the extraction end.

       A checklist should be prepared for both the extraction and conversion teams for each connector being used, to enable proper validation.

  • Multiple SuccessFactors Environments: It is always advisable to have two or more SuccessFactors environments for mock conversions before performing the actual conversion in Production.
  • Different Test ID Number Range/Naming Convention: While testing, always ensure that the mock-up test data follows a different naming convention and number range from the actual data load.
  • Primary Eligibility Checks: Always ensure the flat file is UTF-8 compliant, with proper pipe separators and end delimiters.
  • Special Character Checks: Opening CSV files with OpenOffice, or using the import-data-from-text method in Excel, helps avoid special characters being loaded incorrectly.
  • Validation at Different Levels: Validate data at the extraction level and run spot checks for verification. Apply standard checks for field formatting, required fields, and reference values, plus logical and functional checks to validate the correctness of the transformation and mapping rules from the legacy system to SuccessFactors.
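The standard checks above (field formatting, required fields, reference values) can be sketched per line of the pipe-delimited flat file. The column names and the reference-value set here are hypothetical placeholders for the actual connector template and picklists.

```python
# Hypothetical connector columns and reference values for illustration only.
REQUIRED = ("STUD_ID", "CPNT_ID", "COMPL_DTE")
VALID_COMPONENTS = {"SAFETY-101", "GMP-200"}

def validate_line(line, expected_fields=3):
    """Standard checks on one pipe-delimited record: field count,
    required fields populated, and reference-value lookup."""
    errors = []
    fields = line.rstrip("\n").split("|")
    if len(fields) != expected_fields:
        errors.append(f"expected {expected_fields} fields, got {len(fields)}")
        return errors  # field count is wrong; further checks are meaningless
    row = dict(zip(REQUIRED, fields))
    for name in REQUIRED:
        if not row[name].strip():
            errors.append(f"{name} is empty")
    if row["CPNT_ID"] and row["CPNT_ID"] not in VALID_COMPONENTS:
        errors.append(f"unknown component {row['CPNT_ID']}")
    return errors
```

Running such a check on every file before the connector load, and again on a spot-check sample after, catches most formatting and mapping defects early.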

In the next post, we will look into the tips, tricks, and best practices for each connector used in an LMS conversion, and the lessons learnt for a successful implementation, so please stay tuned for yet another post!


9 Comments


  1. Silvia Strümper

    Nice process overview.


    The truth in cloud scenarios is that extraction of legacy data and conversion is often the client's responsibility (as also described in your section “Collaboration with Extraction POC/Teams”). This is a tough challenge, which is why I recommend using a tool that supports the data conversion process… and there are several tools out there.

    One example you might want to have a look at is Accenture Data Comparison Manager (DCM) for which I’m Product Manager.

    It specifically focuses on data migration from SAP HCM to SuccessFactors (SF).

    Therefore DCM already comes with many scenario-specific little helpers that assist you throughout the entire migration process, e.g.:

    • Analyzer tool to scan and understand your SAP HCM system,
    • Data Collectors for extraction of SAP HCM data,
    • Generator for SF import templates,
    • powerful Conversion Engine with pre-defined and easy to configure field mappings, conversion rules, data translations and validations against target values (e.g. picklists, foundation objects, etc.),
    • an exporting functionality that creates the csv import files for you.

    As DCM runs within SAP, no additional hardware is required, existing SAP authorizations are applied, no skills in new technologies need to be acquired and you work in an easy to use, familiar environment.

    After your data is loaded into SuccessFactors, validation reports are available to support your testing activities.

    To be fair: For SF Learning there are currently no pre-defined migration templates delivered with the product, but e.g. available for Employee Central.

    1. rekha shukla Post author

      @Silvia: Agreed, the extraction part is definitely challenging, as it forms the base of all the data transformations and mappings. Tools definitely make our job easier and the verification process more reliable.

      Thanks for mentioning DCM, but most of the time we see that the legacy LMS is not SAP Learning but another 3rd-party tool like SABA/Newton etc., where we first need to automate the process of file extraction in order to correctly drive the data transformation.

  2. rekha shukla Post author

    @Sharon: Thanks for sharing insights on the data migration SAP nexus, but here we are talking about migrating data from any legacy system (say SAP, for example) to the cloud (SuccessFactors) using connectors: a different approach right from file preparation to load, verification, and re-verification. We are not referring to data migration into SAP per se.

