Shireesh Mitragotri

Understanding the critical path on an HR Payroll data migration

I must admit, I am a big fan of the “critical path”, and it is all the more valuable on a data migration project in SAP. In this blog, I will illustrate how I used the critical path to identify and control the data migration loads for a SAP HR Payroll implementation within an estimated timeframe.

We start with the Organisational Management infotypes: HRP1008 (Account Assignment on the Org Units), HRP1013 (Employee Group / Employee Subgroup) and HRP1005 (Planned Compensation).

As this was not a greenfield implementation, we had to run RPUDELPP to remove existing assignments and reload several infotypes with new information. The challenge with RPUDELPP was enabling it to mass delete, so we created a wrapper ABAP tool that spread the deletions across multiple background jobs, as sketched below.
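The pattern looks roughly like the sketch below: split the personnel number range into blocks and submit one background job per block. The block size, the job names and the RPUDELPP selection field used here (pnppernr) are illustrative assumptions rather than the exact parameters of our tool.

REPORT z_rpudelpp_parallel.

CONSTANTS: gc_block_size TYPE i VALUE 5000,
           gc_blocks     TYPE i VALUE 30.

DATA: gv_jobname    TYPE tbtcjob-jobname,
      gv_jobcount   TYPE tbtcjob-jobcount,
      gv_from       TYPE i VALUE 1,
      gv_to         TYPE i,
      gv_pernr_low  TYPE pernr_d,
      gv_pernr_high TYPE pernr_d.

DO gc_blocks TIMES.
  gv_to         = gv_from + gc_block_size - 1.
  gv_pernr_low  = gv_from.
  gv_pernr_high = gv_to.
  gv_jobname    = |ZDELPP_{ sy-index }|.

  " Open a background job for this personnel number block
  CALL FUNCTION 'JOB_OPEN'
    EXPORTING
      jobname  = gv_jobname
    IMPORTING
      jobcount = gv_jobcount
    EXCEPTIONS
      OTHERS   = 1.
  CHECK sy-subrc = 0.

  " The selection field name is an assumption; adjust it to the
  " actual selection screen of RPUDELPP in your release.
  SUBMIT rpudelpp
    WITH pnppernr BETWEEN gv_pernr_low AND gv_pernr_high
    VIA JOB gv_jobname NUMBER gv_jobcount
    AND RETURN.

  " Release the job so it starts immediately
  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobname   = gv_jobname
      jobcount  = gv_jobcount
      strtimmed = abap_true
    EXCEPTIONS
      OTHERS    = 1.

  gv_from = gv_to + 1.
ENDDO.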

The next set of data loads on the critical path was the Actions: Initial Hire, Pre Go-Live, Go-Live and future movements.

Subsequently, infotypes PA0712, PA0007, PA0008, PA0009, PA0188, PA0227, PA0041, PA2012, PA2013 and PA2006 form the blocks of the critical path sequence of data loads leading into running Time Evaluation. The critical path may well have been formed differently on other HR/Payroll data migration projects, and I would be keen to understand how, and the pros and cons.

Workers Compensation data was maintained in T558A, and we had to develop a wrapper ABAP tool to process millions of records on this table in parallel. The tool was built using the asynchronous RFC (aRFC) mechanism in ABAP; the pattern is sketched below.
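In outline, the fan-out followed the sketch below. Z_LOAD_T558A_BLOCK stands in for an RFC-enabled function module that posts one block of records; the server group name, the concurrency cap and the block structure are assumptions for illustration.

REPORT z_t558a_parallel_load.

TYPES: BEGIN OF ty_block,
         records TYPE STANDARD TABLE OF t558a WITH DEFAULT KEY,
       END OF ty_block.

DATA: gt_blocks     TYPE STANDARD TABLE OF ty_block WITH DEFAULT KEY,
      gv_open_tasks TYPE i.

START-OF-SELECTION.
  " gt_blocks is assumed to be filled from the staging files beforehand
  LOOP AT gt_blocks INTO DATA(ls_block).
    DATA(lv_task) = |T558A_{ sy-tabix }|.
    DO.
      " Dispatch one block to a free dialog work process via aRFC;
      " Z_LOAD_T558A_BLOCK is a hypothetical RFC-enabled FM
      CALL FUNCTION 'Z_LOAD_T558A_BLOCK'
        STARTING NEW TASK lv_task
        DESTINATION IN GROUP 'parallel_generators' " RFC server group (RZ12); name is an assumption
        PERFORMING on_block_done ON END OF TASK
        TABLES
          it_records = ls_block-records
        EXCEPTIONS
          system_failure        = 1
          communication_failure = 2
          resource_failure      = 3.

      IF sy-subrc = 0.
        gv_open_tasks = gv_open_tasks + 1.
        EXIT.                                      " dispatched, take the next block
      ELSEIF sy-subrc = 3.
        " No free work process: wait for running tasks to drain, then retry
        WAIT UNTIL gv_open_tasks < 10 UP TO 5 SECONDS.
      ELSE.
        EXIT.                                      " log the failure and skip this block
      ENDIF.
    ENDDO.
  ENDLOOP.

  " Block until every dispatched task has called back
  WAIT UNTIL gv_open_tasks = 0.

FORM on_block_done USING p_task TYPE clike.
  " Confirm receipt of one finished task and free a slot
  RECEIVE RESULTS FROM FUNCTION 'Z_LOAD_T558A_BLOCK'
    EXCEPTIONS OTHERS = 1.
  gv_open_tasks = gv_open_tasks - 1.
ENDFORM.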

The successor task to the T558A load is the Pre Go-Live pay run, using the standard SAP payroll driver program for Australia, RPCALCQ0_CE. Depending on the volumes and the issues encountered in loading this data, this step can itself become the critical path leading into running Time Evaluation.

Lessons Learnt:

  1. Educating stakeholders and members of the ETL (Extract, Transform & Load) teams on the data migration critical path is important. Focus on the critical path data load objects is often lost and other objects are prioritised instead, which invariably increases the length of the entire data migration test cycle.

 

  2. The volume of each and every data object needs to be assessed in the design phase. I would recommend assuming larger volumes and getting wrapper ABAP tools built to feed the standard SAP programs, so that multiple threads can run in parallel.

 

  3. Develop robust tools to download BDC errors from multiple split BDC sessions. I would like to hear from others who have built ABAP tools or code for this, and whether they can be shared.

 

  4. Automation of pre-load validation tools is an absolute must, given that Actions depend heavily on getting begin and end dates correct (a simple validation sketch follows this list).
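To illustrate the kind of check meant in point 4, the sketch below validates begin/end dates and overlaps on a flat staging table of Action records; the structure and field names are assumptions, not our actual validation tool.

TYPES: BEGIN OF ty_action,
         pernr TYPE pernr_d,
         massn TYPE massn,
         begda TYPE begda,
         endda TYPE endda,
       END OF ty_action.

DATA: gt_actions TYPE STANDARD TABLE OF ty_action WITH DEFAULT KEY,
      gt_errors  TYPE STANDARD TABLE OF string WITH DEFAULT KEY,
      gv_msg     TYPE string.

" gt_actions is assumed to be filled from the staging file beforehand.
" Sort so that each employee's actions are contiguous and in date order.
SORT gt_actions BY pernr begda.

LOOP AT gt_actions INTO DATA(ls_action).
  DATA(lv_next_index) = sy-tabix + 1.

  " Begin date must not be after end date
  IF ls_action-begda > ls_action-endda.
    gv_msg = |{ ls_action-pernr }: BEGDA { ls_action-begda DATE = ISO } after ENDDA { ls_action-endda DATE = ISO }|.
    APPEND gv_msg TO gt_errors.
  ENDIF.

  " Consecutive actions of the same employee must not overlap
  READ TABLE gt_actions INTO DATA(ls_next) INDEX lv_next_index.
  IF sy-subrc = 0 AND ls_next-pernr = ls_action-pernr AND ls_next-begda <= ls_action-endda.
    gv_msg = |{ ls_action-pernr }: action starting { ls_next-begda DATE = ISO } overlaps the previous record|.
    APPEND gv_msg TO gt_errors.
  ENDIF.
ENDLOOP.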

 

Questions:

  1. Are there any BAPIs for loading Actions that are normally created through PA40? We would like to avoid screen-based BDC processing for larger volumes.

 

  2. Is there a standard way of scheduling BDC sessions one after another, so that each set is kicked off as soon as the previous one finishes? Note that for BDC1 there could be about 30 sessions running in parallel, and we do not want to start the BDC2 sessions until all 30 BDC1 sessions have finished (a possible polling approach is sketched below).
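One possible, non-standard way to sequence the waves is to poll the background job status table before releasing the next set, assuming the BDC1 sessions run as background jobs with a known name prefix (ZBDC1 here is an assumption):

" Hypothetical sequencing step: wait until every BDC1 background job is done
DATA: gv_running TYPE i.

DO.
  " TBTCO holds background job statuses:
  " 'P' scheduled, 'S' released, 'Y' ready, 'R' active, 'F' finished, 'A' cancelled
  SELECT COUNT( * ) FROM tbtco
    INTO gv_running
    WHERE jobname LIKE 'ZBDC1%'
      AND status IN ('P', 'S', 'Y', 'R').

  IF gv_running = 0.
    EXIT.                   " every BDC1 job has finished or been cancelled
  ENDIF.

  WAIT UP TO 60 SECONDS.    " poll once a minute
ENDDO.

" All BDC1 jobs are done at this point: release the BDC2 sessions here,
" for example by submitting RSBDCSUB for the BDC2 session names or by
" triggering the next wave of the wrapper tool.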

 

1 Comment

Poorna Ravichandra

Actions can be loaded via FM HR_INFOTYPE_OPERATION.
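For illustration, a minimal call of that function module to insert an additional action (infotype 0000) record for an existing employee could look as follows. The values are placeholders, and note that, unlike PA40, this does not process the action's infogroup, so the follow-on infotypes still have to be loaded separately.

DATA: ls_p0000  TYPE p0000,
      ls_return TYPE bapireturn1.

" Placeholder values for one action record of an existing employee
ls_p0000-pernr = '00001234'.
ls_p0000-begda = '20240101'.
ls_p0000-endda = '99991231'.
ls_p0000-massn = '01'.          " action type (placeholder)
ls_p0000-stat2 = '3'.           " employment status: active

" Lock the employee before changing master data
CALL FUNCTION 'BAPI_EMPLOYEE_ENQUEUE'
  EXPORTING
    number = ls_p0000-pernr
  IMPORTING
    return = ls_return.

" Insert the IT0000 record without dialog
CALL FUNCTION 'HR_INFOTYPE_OPERATION'
  EXPORTING
    infty         = '0000'
    number        = ls_p0000-pernr
    validitybegin = ls_p0000-begda
    validityend   = ls_p0000-endda
    record        = ls_p0000
    operation     = 'INS'
    tclas         = 'A'
    dialog_mode   = '0'
  IMPORTING
    return        = ls_return.

" Release the lock again
CALL FUNCTION 'BAPI_EMPLOYEE_DEQUEUE'
  EXPORTING
    number = ls_p0000-pernr.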