
BW QA Refresh Strategy

Most of the BW landscapes we have across various clients follow a 3-tier approach (i.e., Dev, QA, and Prod). There is always a challenge in having healthy, consistent data in QA for performing accurate user acceptance testing (UAT).
In this document, I will discuss various options/scenarios and the most cost-effective, optimal approach to performing a QA refresh for a particular functional area or for all modules (Spend, Finance, SRM, etc.).
The following approaches are considered:

  • Suppose Production holds 2 TB of data, and assume the Basis/DBA team takes a complete online backup of the database daily. This backup is at the DB level (applicable to SQL Server/Oracle), so we cannot restore specific tables or tablespaces. If only a specific functional area's data needs to be refreshed in QA, that is not possible, because the backup is a compressed, DB-level image. In this case it has to be a complete refresh to QA.
    Disadvantage: QA (Quality) must hold at least 2 TB of data, and its supporting infrastructure (RAM, processors) should closely match Production. There may also be data security issues if the complete production data is refreshed (for example, HR data).
    Benefits: With this approach, end users can perform test cases such as harmonization, query/report changes, and stress/performance testing.
  • The next approach is to make Production a source system for the Dev or QA system. With this approach, we extract records for a specific year or month and for a specific functional area.
    The following steps can be followed:
    Create the specific module acquisition DSOs as export DataSources, replicate them in Dev, and transport to QA (or replicate directly in QA).
    Create the specific module master data (attributes and texts) as export DataSources, replicate them in Dev, and transport to QA (or replicate directly in QA).
    Create the transformations, InfoPackages, etc. needed to extract the data from Prod into the QA PSA.
    Run the DTP up to the acquisition DSO in QA.
    Execute the regular load in QA, i.e., acquisition DSO -> transformation DSO -> reporting cube. This ensures the harmonization testing that the sourcing team performs is taken care of.
    Build a process chain for this flow in QA, so that it can be scheduled on an ad hoc basis on user request.
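The load sequence above can be sketched as a simple ordered flow (a minimal illustration only; the layer names stand in for the BW objects and do not correspond to any real SAP API):

```python
# Illustrative sketch of the QA refresh data flow described above:
# PSA -> acquisition DSO -> transformation DSO -> reporting cube.
# A process chain simply walks this sequence, one DTP per hop.
FLOW = ["PSA", "acquisition DSO", "transformation DSO", "reporting cube"]

def next_target(current_layer):
    """Return the layer the next DTP should load, or None at the end."""
    idx = FLOW.index(current_layer)
    return FLOW[idx + 1] if idx + 1 < len(FLOW) else None

layer = FLOW[0]
while (target := next_target(layer)) is not None:
    layer = target  # each iteration models one DTP run in the chain
```

Scheduling the process chain on request then amounts to re-running this walk end to end.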

       Benefits: This is useful for testing when there is a change in the BW transformation layer or a change in queries that users want to test.

A couple of points we need to take care of here:

Calculate the required space in QA accurately by identifying the /BIC/ tables of the DSOs, master data, etc. in Prod, then downloading the table size list from transaction DB02 and summing the sizes.
Alternatively, take the record count of each DSO and identify its record length. Then: record count * record length = space occupied in bytes.
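The sizing arithmetic above can be sketched as a quick estimate (the DSO names, record counts, and record lengths below are made-up figures for illustration; real values come from the Prod /BIC/ tables via DB02):

```python
# Rough QA space estimate: record count * record length, summed per DSO.
# All figures below are illustrative placeholders, not real Prod values.
dsos = {
    "ZSPND_A1": (12_500_000, 480),  # (record count, record length in bytes)
    "ZFIN_A1":  (8_200_000, 350),
    "ZSRM_A1":  (3_100_000, 260),
}

def estimate_bytes(dso_stats):
    """Sum count * length over all DSOs, giving total space in bytes."""
    return sum(count * length for count, length in dso_stats.values())

total = estimate_bytes(dsos)
print(f"Estimated QA space: {total / 1024**3:.1f} GiB")
```

Adding a safety margin on top of this figure (for indexes, activation queues, and growth) is advisable before requesting the QA storage.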

Please note that this refresh procedure will not help us perform UAT when we make any enhancement to a DataSource in the source system (ECC, SRM, etc.). For that scenario, we have to follow the next approach:

  • This approach is the traditional refresh procedure. It is convenient when your landscape is one-to-one, i.e., ECC1 -> BW1: ECC QA is refreshed regularly from Prod, and simultaneously BW QA is refreshed by running the extraction process chains against the ECC QA environment.
    Benefits: This is a far better and more convenient approach. It lets us test different functional scenarios, harmonization, DataSource enhancements in the source system, etc.