It is not that businesses and implementers are unaware of the challenge. The problem is rather a lack of accountability, as well as of methodology and tools (or knowledge of them) for validation. There is an obvious lack of standards for what to check and how. I see three primary reasons for this lack of standards:
- IT landscapes look quite different at every implementation (even when the same solutions are deployed).
- Data quality standards vary from organization to organization.
- The focus of data criticality changes often and dynamically (e.g. while the accuracy of financial data is always important, its criticality peaks at the time of the financial close).

Given all of the above, it is clear that a rigid implementation of data quality checks would be expensive and inflexible, and its ROI often questionable.
A service-oriented architecture (SOA, or SAP’s ESA) deals effectively with those challenges and hence serves as an excellent infrastructure for data quality assurance. While not as ‘sexy’ as the average composite application, its business benefits in this context are immense and quite obvious. Let me quickly lay out the building blocks:
- Data check points (‘hotspots’) exposed as web services throughout the system landscape. In the simplest case, such a service could return the value of a purchase order, or the sales for product xyz in the current month (see the first sketch after this list).
- A central registry for those services.
- Intelligent (web) services to perform the data checks. In the simplest case, a service that compares the value of the same purchase order across two systems (see the second sketch after this list).
- A graphical modeling (business process management) tool to deploy and monitor those services in a very flexible manner.
- A methodology for deployment.
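To make the ‘hotspot’ idea concrete, here is a minimal sketch in Java. It assumes Java 8, where JAX-WS (`javax.jws`, `javax.xml.ws`) still ships with the JDK; the class name, endpoint URL, and the stubbed backend lookup are all hypothetical, standing in for whatever BAPI or query actually supplies the value.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical data 'hotspot': a single check point (the value of a
// purchase order) exposed as a SOAP web service.
@WebService
public class PurchaseOrderHotspot {

    @WebMethod
    public double getPurchaseOrderValue(String poNumber) {
        // Stub: in a real landscape this would call the backend
        // (e.g. a BAPI) that owns the purchase order data.
        return 10250.00;
    }

    public static void main(String[] args) {
        // Publish locally for testing; the WSDL appears at the URL + "?wsdl".
        Endpoint.publish("http://localhost:8080/hotspots/po",
                new PurchaseOrderHotspot());
    }
}
```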
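And a matching sketch of the comparison service. The `PoHotspot` interface stands in for the client proxy that would normally be generated from each hotspot’s WSDL; the stub lambdas in `main` simulate two systems reporting slightly different values for the same purchase order.

```java
// Sketch of an 'intelligent' check service: fetch the value of the same
// purchase order from two systems and flag any mismatch beyond a tolerance.
public class PurchaseOrderCheck {

    // Stand-in for the proxy interface generated from a hotspot's WSDL.
    interface PoHotspot {
        double getPurchaseOrderValue(String poNumber);
    }

    private final PoHotspot systemA;
    private final PoHotspot systemB;

    public PurchaseOrderCheck(PoHotspot systemA, PoHotspot systemB) {
        this.systemA = systemA;
        this.systemB = systemB;
    }

    public boolean valuesMatch(String poNumber, double tolerance) {
        double a = systemA.getPurchaseOrderValue(poNumber);
        double b = systemB.getPurchaseOrderValue(poNumber);
        return Math.abs(a - b) <= tolerance;
    }

    public static void main(String[] args) {
        // Stubs simulating two systems that disagree on the same PO.
        PurchaseOrderCheck check = new PurchaseOrderCheck(
                po -> 10250.00,   // e.g. the ERP system
                po -> 10245.50);  // e.g. the CRM system
        System.out.println(check.valuesMatch("4500001234", 0.01)); // false
    }
}
```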
SAP NetWeaver could serve as an excellent platform for realization. BAPIs, RFCs, and XML queries can be exposed as web services. SAP web services, as well as external ones, are registered in the Web Application Server’s (WAS) UDDI registry. XI’s BPM allows flexible graphical modeling of the processes. Finally, the Alert Framework and the Enterprise Portal (EP) are the tools of choice for visualizing the results.
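Purely for illustration, here is a rough sketch of that last mile: a check run on a schedule whose failure raises a notification. The scheduler, `runCheck`, and `raiseAlert` are hypothetical placeholders; in a NetWeaver landscape this role would be played by XI’s BPM and the Alert Framework, not by hand-rolled Java.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Illustrative stand-in for the monitoring/alerting step: run a data
// check periodically and raise an alert when it fails.
public class CheckMonitor {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            boolean ok = runCheck("4500001234");  // hypothetical check call
            if (!ok) {
                raiseAlert("PO 4500001234: values diverge between systems");
            }
        }, 0, 15, TimeUnit.MINUTES);
    }

    // Stub: would invoke the comparison web service sketched above.
    static boolean runCheck(String poNumber) {
        return false;
    }

    // Stub: in NetWeaver, the Alert Framework / portal would take over here.
    static void raiseAlert(String message) {
        System.err.println("ALERT: " + message);
    }
}
```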