
I don’t know about you, but I love cloud solutions! The independence from software downloads, package installs, system management, and constant upgrade planning, oh my! But with all this newfound freedom, I have started to see a trend: customers find it hard to keep a grasp on all the great innovations and improvements that are magically appearing in their cloud tenants. Well, lucky for you, I wrote this blog to help you get your grip back.

It has been close to 18 months since SAP launched SAP Data Quality Management, microservices for location data as a generally available solution. If you have not heard of DQM microservices, in a nutshell it offers cloud-based microservices for address cleansing, geocoding, and reverse geocoding. Since it is a cloud-based RESTful API, you can embed address cleansing and enrichment services within any business process or application and quickly reap the value of complete and accurate address and location data. SAP also provides prebuilt integrations into several applications such as Hybris C4C for accounts, CRM, ECC, MDG, S/4, Data Services, and SAP Data Hub (2.3 Beta). As you can see, this one cleansing and enrichment solution has the power to improve data across your enterprise landscape applications. All hail the cloud!
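To make the RESTful style concrete, here is a minimal sketch of what a single-address cleanse request body might look like. This is an illustration, not the official schema: the `addressInput`/`outputFields` structure is an assumption, and only the `std_addr_*`/`geo_asmt_*` output field names come from features mentioned in this post. Consult the API documentation for the real contract.

```python
import json

# Hypothetical request payload for a DQM microservices address-cleanse call.
# The key names "addressInput" and "outputFields" are illustrative stand-ins,
# not the documented schema.
payload = {
    "addressInput": {
        # A free-form address line plus a country hint.
        "mixed": "3999 West Chester Pike, Newtown Square PA 19073",
        "country": "US",
    },
    # Ask the service for cleansed address fields plus geocodes.
    "outputFields": [
        "std_addr_locality_full",
        "std_addr_locality_official",
        "geo_asmt_latitude",
        "geo_asmt_longitude",
    ],
}

# Serialize for the HTTPS POST body.
body = json.dumps(payload)
```

The body would then be sent as an HTTPS POST to the service endpoint along with your credentials; the response carries the requested output fields for the cleansed address.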

I could go on about features like support for 240+ countries, how it can be used to uncover geographical patterns, and its ease of configuration, but that is not the focus of this blog post. If that is what you are looking for, I suggest reviewing the previous blog outlining a more in-depth description of the DQM microservices key capabilities here.

What is new since the general release of SAP DQM microservices, and why should you care? In an effort not to weigh you down with every change to the product, I have elected to focus on the key improvements since general release that I feel are most significant.

New for 2018

  1. United Kingdom and New Zealand address cleansing is now available.
  2. Address cleansing for China is now available.
  3. Geocode coverage has been extended to include the United Kingdom, Macao, Singapore, and Taiwan.
  4. The process for consuming suggestion lists has been improved.
    • Customers can now work with an increased number of suggestion list responses: the limit is now 200 suggestions, up from 50. The increase allows you to present end users a larger suggestion dataset to choose from in cases where the input address is ambiguous.
    • A new property, addr_sugg_more_suggs_available, was added for integrators who want to build a page navigator. This enhancement works hand in hand with the larger suggestion list responses to ensure that bigger datasets can easily be paged and navigated.
    • Several suggestion list properties have been consolidated into a single property. This is part of a simplification effort so that customers no longer have to manage six individual data sets: the information previously provided in sugg_addr_prim_number_low/high, sugg_addr_floor_number_low/high, and sugg_addr_unit_number_low/high is now all provided by a new property named sugg_addr_range_type.
  5. A new output field, std_addr_locality_official, returns the official locality name. For most cities there is no difference between the data returned in the locality and locality_official fields; however, there are cases where these values may differ for the same address. A few examples of when this could occur:
    • When the city in the input request is an alias city name such as “Hollywood” where the official locality is different; in this case the value for official locality would be “Los Angeles”.
    • When the locality has local-language variations that differ from the input value. The input request of “Geneva” is a valid locality; however, the official locality would be returned as “Genève”.

If you are unsure which field to use in your integration, the most common scenario is to use the locality or locality_full options.
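The suggestion-list improvements above lend themselves to a simple client-side paging loop. The sketch below is a hypothetical illustration: the property names addr_sugg_more_suggs_available and sugg_addr_range_type come from this post, while fetch_page and the overall response shape are stand-ins for whatever HTTP call and JSON structure your integration actually uses.

```python
def collect_suggestions(fetch_page, max_total=200):
    """Accumulate suggestion-list entries page by page until the service
    reports no more are available (or the 200-suggestion cap is reached)."""
    suggestions, page = [], 1
    while len(suggestions) < max_total:
        resp = fetch_page(page)  # hypothetical HTTP call returning a dict
        suggestions.extend(resp.get("suggestions", []))
        # The paging flag described in this post: more pages remain if true.
        if not resp.get("addr_sugg_more_suggs_available"):
            break
        page += 1
    return suggestions

# Simulated two-page response for demonstration purposes only:
pages = {
    1: {"suggestions": [{"sugg_addr_range_type": "primary"}],
        "addr_sugg_more_suggs_available": True},
    2: {"suggestions": [{"sugg_addr_range_type": "unit"}],
        "addr_sugg_more_suggs_available": False},
}
result = collect_suggestions(lambda p: pages[p])
```

In a real integration, fetch_page would issue the suggestion-list request with a page parameter and the loop would drive your UI’s page navigator.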

New for 2017

  1. Batch address cleansing and geocoding API – we now provide a new endpoint that allows processing 1-n records per API call (HTTPS POST request) and adds the ability to compress the request and response as well. This offers better performance if you’re processing data in batch using this service. NOTE that customers are still billed on a per-record basis in this case, not per API call, so pricing doesn’t change when using this endpoint. You can find more information on how billing and the API work here.
  2. Support for various forms of country input – in early releases the service required that you enter the country as an ISO-2 country code. We now support many different forms of input and can normalize country data on output as well.
  3. Pass-thru fields – we now support passing through data fields that are not processed or changed in any way. The most common reason to do this is to pass through the primary key(s) for each record, which is necessary when using the batch processing endpoint.
  4. Usage information – the DQM microservices UI now includes a way for customers to access their own usage information and view it by service endpoint (API), country, and date.
  5. Geocoding is now supported for Russia and Mexico.
  6. Minimum assignment level for address cleansing – when cleansing an address, the “assignment level” output tells you to what level the address was validated. For example, for a given country we might be able to validate all the way to the apartment level, but based on the input we might only be able to match that address to the locality (city) level. If that is not valuable enough for you, you can set a minimum assignment level so that you only get back a fully cleansed address (and incur a billing charge) when your minimum assignment level is met. NOTE: The service can still provide parsing and standardization of addresses even if the assignment level isn’t met, so there is still value in using the service even when a high-quality match (validation) to the reference data isn’t possible.
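The batch endpoint and pass-thru fields described above work together: because a batch POST carries many records, you include each record’s primary key as a pass-thru field so the cleansed results can be matched back. The sketch below illustrates that idea only; the record structure and the "passthrough"/"record_key" names are assumptions, not the documented schema.

```python
import json

# Source records with primary keys that must survive the round trip.
records = [
    {"id": "CUST-001", "mixed": "1 Main St, Springfield"},
    {"id": "CUST-002", "mixed": "22 Oak Ave, Shelbyville"},
]

# Hypothetical batch payload: one entry per record, each carrying its
# primary key as a pass-thru field. The service leaves pass-thru data
# untouched, so each response record can be joined back to its source.
batch_payload = {
    "addressInput": [
        {"mixed": r["mixed"], "passthrough": {"record_key": r["id"]}}
        for r in records
    ],
}

body = json.dumps(batch_payload)
```

Remember that even though this is a single HTTPS POST, billing remains per record, so batching changes performance, not price.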

Become an expert user

Beyond this bullet list of feature innovations, I also wanted to bring to your attention additional useful DQM microservices information. Too often we get caught up in what is shiny and new in the product and overlook a solid understanding of key product features and how to maximize their usage. Ron Dupey has recently posted an excellent blog that details the functional use and capabilities of the DQM microservices. He has done this through a mixture of media (blog and video), and it is sure to help you become an expert user. – SAP Data Quality Management, microservices for location data: Functional use and capabilities

