SAP has made a number of acquisitions in recent years. To help deal with this wealth of acquisition data and maintain timely delivery, the group established a Standardization and Matching program (SAM). This program uses SAP Data Services, SAP HANA, and SAP Information Steward.
The business purpose of the program is to provide a single matching tool that delivers good-quality results across multiple technical platforms and business processes: one tool. The team wanted to standardize and automate data verification and matching against reference records (or within a submission) to avoid creating duplicates.
The team needed to build a fast and repeatable process and avoid unnecessary costs. These costs were determined across multiple dimensions:
- Impact on customer satisfaction (both positive and negative)
- Cleansing a duplicate account costs between one and four hours of manual work, and once the duplicate enters the financial process it might not be reversible at all.
- Financial loss/risk, for example invoices going unpaid due to address issues, and credit exposure spread across multiple “customers”
- Sales pipeline, where the potential for upsell and avoiding conflict between sales channels must be managed
The team identified the Data Migration Quality Process as the larger framework to build on.
With the larger process in place, the group focused on the specific matching method needed to support it. The process uses SAP Data Services as part of the SAM project.
For example, this is how the approach worked on the SuccessFactors data. Notice the distinctly funnel-like nature of the process: the team started with 715,000 records, which were narrowed down to 83,000 potential customer accounts.
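To make the funnel idea concrete, here is a minimal sketch in plain Python of a standardize-then-match dedupe pass. This is purely illustrative: it uses simple fuzzy string similarity from the standard library as a stand-in for the much richer cleansing and match transforms that SAP Data Services provides, and the sample records and threshold are invented for the example.

```python
from difflib import SequenceMatcher

def standardize(record: str) -> str:
    """Normalize a raw account string: lowercase, trim, collapse whitespace.
    (Real cleansing would also parse addresses, expand abbreviations, etc.)"""
    return " ".join(record.lower().split())

def is_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy similarity between two standardized records.
    A real match transform would use weighted, field-level comparisons."""
    return SequenceMatcher(None, standardize(a), standardize(b)).ratio() >= threshold

def dedupe(submissions: list[str], reference: list[str]) -> list[str]:
    """The funnel: keep only submissions that match no existing reference record."""
    return [rec for rec in submissions
            if not any(is_match(rec, ref) for ref in reference)]

# Hypothetical data for illustration only.
reference = ["ACME Corporation, 1 Main St", "Globex Inc, 22 Oak Ave"]
submissions = [
    "Acme  corporation, 1 Main St",   # near-duplicate of a reference record
    "Initech LLC, 9 Elm Rd",          # genuinely new account
]
print(dedupe(submissions, reference))  # ['Initech LLC, 9 Elm Rd']
```

The near-duplicate is filtered out and only the genuinely new account survives, which is the same shape as the 715,000-to-83,000 reduction described above, just at toy scale.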
Notice the sheer number of applications that use the routines developed as part of SAM.
The results of this program are amazing. In fact, the SAP Data Governance Organization won two awards this year: the Nucleus ROI Award and a TDWI Best Practices award. Check out this blog for more details on how the metrics were calculated.
Related blogs on SAP’s Data Governance program: