
During my interactions with clients (I love hearing their feedback on their challenges and opportunities and how they are dealing with them), one topic that frequently comes up is the impact of bad data, or a lack of data, on testing. Call it a side effect of data management, or rather data mismanagement; it is more pervasive than acknowledged.

 

It manifests in one or more of the following scenarios:

 

1. Not being able to test at all in non-production (development or QA) systems

2. Finding defects in production when there were none in development and QA

3. Inconsistencies within various test systems that yield inconsistent test results

 

The combined effect of these symptoms is lower-quality data and processes, an inability to measure the effectiveness of the QA organization, and increased time-to-market for software solutions. Trust in the software and its data is diminished, and the business cannot accurately assess the impact of forthcoming software releases. Ever-changing regulations and a stringent compliance climate make it imperative that QA is able to instill trust in the quality and robustness of the applications under test.

 

Some real-life examples:

 

  • A workflow in your QA system causes several thousand applicants in your HCM application to receive an email stating that they have been shortlisted for an interview. (Hey, you can always have someone call them back and apologize for the slight misunderstanding. Right? But there are only 20,000 of them, and it is going to take some time to get to all of them.)

 

  • A major SAP enhancement involving multiple applications was tested and worked as designed in the QA systems, but failed during the production upgrade, leading to countless hours of rework and production system downtime. Houston, we have a problem!

 

Look, when a root cause analysis of these issues is completed, it uncovers several small points of failure that could easily have been prevented by effective testing. But a lack of good, testable data meant a lack of good, effective testing.

 

Traditional QA organizations have little or no control over the source of their test systems. Data is provisioned as-is for testing by the Basis teams or developers using migration tools, full system copies, and scripts. This leads to disconnects between what the testers really need and what the system administrator thinks is required. In the absence of relevant data, testers – during both functional and regression test cycles – spend a large amount of time manually setting up test data.

 

SAP provides numerous solutions to copy and move data between SAP systems, such as client copies, system copies, and LSMW. Some of these solutions – such as Test Data Migration Server (TDMS) – are specifically designed to create and maintain development and test environments as well as training systems.

 

Having the right tools and the knowledge to operate them is only a step in the right direction. Below are best practices that have always stood us in good stead when setting up DEV and QA systems for our clients.

 

Take a process-centric approach to test data: Remember who you are provisioning the data for and the context in which it will be used. I may be asked to copy just a few tables or a handful of rows in a table, but there could be downstream effects. Always ask about and understand the business processes using this data. Involve your users and data stewards when structuring a data provisioning process. Take an end-to-end approach; this will ensure you do not create or modify data that will cause inconsistencies or duplications in the near future (see the sketch below).
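To make the point concrete, here is a minimal sketch of the difference between "copy the table I was asked for" and "copy what the business process needs." The table names and the dependency map are hypothetical, purely for illustration; real provisioning tools resolve these relationships from the data model itself.

```python
# Minimal sketch: a request for one table vs. what the end-to-end process needs.
# The table names and relationships below are hypothetical, for illustration only.

# Illustrative dependency map: each table and the tables it needs to stay consistent.
DEPENDENCIES = {
    "SALES_ORDER_HEADER": ["SALES_ORDER_ITEM", "CUSTOMER_MASTER"],
    "SALES_ORDER_ITEM": ["MATERIAL_MASTER"],
    "CUSTOMER_MASTER": [],
    "MATERIAL_MASTER": [],
}

def tables_to_provision(requested):
    """Walk the dependency map so the copy includes downstream tables too."""
    needed, stack = set(), list(requested)
    while stack:
        table = stack.pop()
        if table not in needed:
            needed.add(table)
            stack.extend(DEPENDENCIES.get(table, []))
    return needed

# The tester asked for one table; the end-to-end process needs four.
print(tables_to_provision(["SALES_ORDER_HEADER"]))
```

The design point is simply that the provisioning scope should be derived from the process, not from the original request.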

 

Have a test data team: No one knows test data better than the test team. Include representation from the test team, along with folks from the DBA, security, and Basis teams.

 

Don’t forget security: When real, sensitive data is involved, there is a chance that it could be compromised. Work with your enterprise security team to put controls in place governing how data moves between systems and which data needs privacy protection. Work with the business to define the right concepts for scrambling (or masking) sensitive data, as in the sketch below.
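As a rough illustration, here is a minimal sketch of the kind of scrambling rule set a security team might agree on. The field names and masking rules are hypothetical and not taken from any specific SAP tool; products such as TDMS ship their own scrambling functionality.

```python
import hashlib
import random
import string

# Hypothetical masking rules for a couple of sensitive fields. Real projects
# would drive these from the security team's data classification.

def mask_email(value):
    """Replace the address with a stable, hash-based placeholder so tests stay repeatable."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:10]
    return f"user_{digest}@example.test"

def mask_name(value):
    """Replace a name with a random placeholder of the same length."""
    return "".join(random.choices(string.ascii_uppercase, k=len(value)))

MASKING_RULES = {
    "EMAIL": mask_email,
    "LAST_NAME": mask_name,
}

def scramble_row(row):
    """Apply the agreed masking rule to every sensitive column; pass other columns through."""
    return {col: MASKING_RULES.get(col, lambda v: v)(val) for col, val in row.items()}

# Example: an applicant record copied from production into QA.
applicant = {"APPLICANT_ID": "00004711", "LAST_NAME": "Smith", "EMAIL": "j.smith@corp.com"}
print(scramble_row(applicant))
```

Note the deliberate choice in mask_email: a hash-based placeholder keeps the value anonymized but stable across refreshes, so test cases that key on it do not break.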

 

Consistency is key: Keep your entire landscape in sync. If you have six QA systems, make sure they all have the same configuration and master data. This includes any custom changes or code you may have created. Exceptions to this rule should be limited. A simple drift check like the one sketched below can help catch divergence early.
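One lightweight way to spot drift is to compare fingerprints of the relevant configuration and master data across systems. The sketch below assumes each QA system's tables have been exported to flat files in a directory per system; that layout, the paths, and the file names are assumptions for illustration only.

```python
import hashlib
from pathlib import Path

# Hypothetical layout (an assumption for illustration): one directory per QA
# system, each holding flat-file exports of the same configuration and master
# data tables. Comparing fingerprints highlights systems that have drifted.

def fingerprint(export_dir):
    """Hash each exported table file so whole systems can be compared quickly."""
    return {
        f.name: hashlib.sha256(f.read_bytes()).hexdigest()
        for f in sorted(export_dir.glob("*.csv"))
    }

def report_drift(systems):
    """Flag any table whose contents differ between the QA systems."""
    prints = {name: fingerprint(path) for name, path in systems.items()}
    all_tables = set().union(*prints.values()) if prints else set()
    for table in sorted(all_tables):
        values = {p.get(table) for p in prints.values()}
        if len(values) > 1:
            print(f"DRIFT: {table} differs across {', '.join(prints)}")

# Example: six QA systems that should be configured identically.
report_drift({f"QA{i}": Path(f"/exports/qa{i}") for i in range(1, 7)})
```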

 

Revisit your standards: The characteristics of data change over time. New entities are added; processes are modified or discontinued. Revisit your test data management procedures regularly to avoid redundancy.


5 Comments


  1. Luke Marson
    Hi Harmeet,

    An excellent blog and a very worthy point of discussion. I have lost count of the number of projects that have had problems in PRD because the data in QAS and DEV was poor, unrealistic and/or nothing like PRD data.

    I can understand how adequate test data can be difficult to come by in a brand-new implementation, given that the client has never used the live solution, but if SAP already exists at a client there really is no excuse. Any PM or consultant who doesn’t raise the issue of test data really should consider if they are in the right line of work.

    Keep up the good blogging!

    Luke

    1. Harmeet Sandhu Post author
      Hi Luke,

      Thanks for your feedback. What makes the situation worse, and rather unfortunate, is that with just a little planning and effort, test data does not have to be poor and unrealistic.

      However, I think the issue sits right at the core of overall data quality for most organizations. QA and test data are, and should be, part of a wider data quality initiative.

      Harmeet 

  2. Dewang TRIVEDI
    Nice blog, stating the pain areas that most implementations face. I have experienced setups wherein the users are unable to simulate a scenario due to a minor change in the IMG settings.

    Over a period of time, the strategy of refreshing your test or QA systems with production (PRD) data has worked wonders, as the users get good data along with a feel of the real thing.

    Cheers ,
    Dewang

  3. Kumud Singh
    Hi Harmeet,
    Good effort to bring out the importance of test data.
    However, there are various factors affecting it:
    1. Test data availability timeline. If test data is made available at a later point in time and the developer is not given adequate time to test, what’s the use?
    2. How about when a project is started entirely new? No one can forecast the exact type of test data required for effective testing. It is only gradually that people understand the type of data required.
    3. Deliverable timeline. What you have talked about seems very ideal to me. Do you think the budget and timelines are always flexible enough to make it happen like this? However, if it’s an established team, that’s good. I am not being negative, but I am trying to be pragmatic here.

    4. We face the same issue time and again, but coming up with a solution for all scenarios is a little difficult.

    5. How about preparing test data in the design phase, when the FS is being prepared? Is that possible?
    Regards,

    Kumud

    1. Harmeet Sandhu Post author
      Kumud,

      I agree that these factors you highlighted, amongst others, impact test data. By planning ahead and allocating time and effort for test data, most of these risks can be mitigated.

      Take, for example, your point about inflexible timelines. Most clients I have worked with have multiple landscapes – for example, N and N+1 landscapes where different lines of development and testing are carried out. Depending on the type of testing to be carried out in the box, available tools such as SAP TDMS, EPIUse, etc. can prep the client in 3-5 days, and this is for larger-in-scope test cycles such as integration and regression. For ad-hoc, kick-the-tires type testing, lean test clients can be built in as little as 2 hours or less.

      My whole point is that technology is available to beef up and refresh test systems if you know what your requirements are and have the technical infrastructure in place.

      What are the specific issues you face with test data, other than the timelines? I will try to answer some of those concerns.

      Harmeet

