During my interactions with clients (I love hearing their feedback on their challenges and opportunities and how they are dealing with them), one topic that frequently comes up is the impact of bad data, or a lack of data, on testing. Call it a side effect of data management, or rather data mismanagement; it is more pervasive than acknowledged.
It manifests in one or all of the following scenarios –
1. Not being able to test at all in non-production (development or QA) systems
2. Finding defects in production when there were none in development and QA
3. Inconsistencies within various test systems that yield inconsistent test results
The combined effect of these symptoms is lower-quality data and processes, an inability to measure the effectiveness of the QA organization, and increased time-to-market for software solutions. Trust in the software and its data is diminished, and the business cannot accurately assess the impact of forthcoming software releases. Ever-changing regulations and a stringent compliance climate make it imperative that QA be able to instill trust in the quality and robustness of the applications under test.
Some real-life examples:
- A workflow in your QA system causes several thousand applicants in your HCM application to receive an email stating that they have been shortlisted for an interview. (Hey, you can always have someone call them back and apologize for the slight misunderstanding. Right? But there are only 20,000 of them, and it is going to take some time to get to all of them.)
- A major SAP enhancement involving multiple applications was tested and worked as designed in the QA systems, but failed during the production upgrade, leading to countless hours of rework and production system downtime. Houston, we have a problem!
Look, when a root cause analysis of these issues is completed, it uncovers several small points of failure that could have been easily prevented by effective testing. But a lack of good, testable data meant a lack of good, effective testing.
Traditional QA organizations have little or no control over the source of their test systems. Data is provisioned as-is for testing by Basis teams or developers using migration tools, full system copies, and scripts. This leads to disconnects between what the testers really need and what the system administrator thinks is required. In the absence of relevant data, testers – during both functional and regression test cycles – spend a large amount of time manually setting up test data.
SAP provides numerous solutions for copying and moving data between SAP systems, such as client copies, system copies, and the Legacy System Migration Workbench (LSMW). Some of these solutions are specifically designed to create and maintain development, test, and training environments – most notably the Test Data Migration Server (TDMS).
Having the right tools and the knowledge to operate them is only a start in the right direction. Below are best practices that have stood us in good stead when setting up DEV and QA systems for our clients.
Take a process-centric approach to test data: Remember who you are provisioning the data for and the context in which it will be used. I may be asked to copy just a few tables, or a handful of rows in a table, but there could be downstream effects. Always ask about and understand the business processes that use this data. Involve your users and data stewards when structuring a data provisioning process. Take an end-to-end approach; this will ensure you do not create or modify data that will cause inconsistencies or duplications in the near future.
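To make the "downstream effects" point concrete, here is a minimal sketch of dependency-aware extraction. The table names, field names, and dependency map are purely illustrative (not real SAP tables): the idea is that before copying a handful of rows, you follow each table's known dependencies so the provisioned subset stays referentially consistent.

```python
# Parent table -> list of (dependent table, foreign-key field).
# Illustrative names only; in practice this map comes from your
# data stewards' knowledge of the business process.
DEPENDENCIES = {
    "sales_orders": [("order_items", "order_id"), ("deliveries", "order_id")],
}

def extract_subset(source, root_table, key_field, keys):
    """Copy the requested rows plus every directly dependent row,
    so the target system does not end up with orphaned records."""
    subset = {root_table: [r for r in source[root_table] if r[key_field] in keys]}
    for child_table, fk_field in DEPENDENCIES.get(root_table, []):
        subset[child_table] = [r for r in source[child_table] if r[fk_field] in keys]
    return subset
```

Copying order 1 alone would silently strip its items and delivery; the sketch pulls them along automatically, which is exactly the end-to-end thinking the practice above calls for.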
Have a test data team: No one knows the test data better than the test team. Have representation from the test team, along with folks from DBA, security, and Basis.
Don’t forget security: When real, sensitive data is involved, there is also a chance that this data could be compromised. Work with your enterprise security team to put controls in place governing how data moves between systems and which data needs to be privacy-protected. Work with your business to create the right concepts for scrambling (or masking) sensitive data.
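One property worth insisting on in any masking concept is determinism: the same input always maps to the same scrambled output, so the masked value still joins correctly across tables. The sketch below illustrates the idea with a salted hash; the field names and salt are illustrative assumptions, not a prescribed implementation.

```python
import hashlib

def mask_value(value, salt="test-landscape-salt"):
    """Deterministically scramble a sensitive value. The same input always
    yields the same masked output, so cross-table references stay intact,
    while the original value cannot be read back."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASKED_" + digest[:8].upper()

def mask_rows(rows, sensitive_fields):
    """Mask only the fields the security team has flagged as sensitive."""
    return [
        {k: mask_value(v) if k in sensitive_fields else v for k, v in row.items()}
        for row in rows
    ]
```

Because the mapping is deterministic, an applicant masked in the HCM tables carries the same scrambled name into every downstream test system, which keeps test scenarios realistic without exposing the real data.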
Consistency is key: Keep your entire landscape in sync. If you have six QA systems, make sure they all have the same configuration and master data. This includes any custom changes or code you may have created. Exceptions to this rule should be rare.
Revisit your standards: The characteristics of data change over time. New entities are added; processes are modified or discontinued. Revisit your test data management procedures regularly to avoid redundancy.