
Merchandising Master Data Enrichment Use Case

With the advent of Hyperscalers and their available services, the obvious questions are which Hyperscaler is the best fit for a given landscape, and how to leverage and integrate its services seamlessly to build independent functions.

Recently, I got the chance to be part of a customer implementation involving SAP S/4HANA integration with legacy systems, leveraging Google Cloud Platform (GCP) services.

Brief Background of the Customer –

The customer is a home improvement retailer in the United States, supplying tools, construction products, and services. The company actively invests in new technology to improve its online and digital presence, its front-end retail operations, and its back-office activities such as inventory and supply-chain management.

Over the past few years, the company embarked on a journey toward “interconnected” retail, in conjunction with “omnichannel” retail. It required a single repository for all channel sales at the most granular transaction level, with the ability to aggregate in real time at any level, to improve marketing campaign effectiveness and increase customer satisfaction. This brought major transformation along with many challenges, due to the existing complex landscape and complex business scenarios.

Possible Scenarios of Operational Inefficiency –

  • The in-store channel allows multiple UPCs on a SKU, whereas the online store supports only a single UPC per SKU.
  • Unlike the in-store process, the online channel currently supports only one active vendor for an item.
  • Duplicate data maintenance in multiple systems.

With more than 10 years of SAP solution and operations experience, the company already has a long and continuous engagement history with SAP. It already owns HANA licenses, and due to the success of BW on HANA and CAR, it wanted to expand the usage of HANA to other areas of the business and increase HANA user adoption. In parallel to these ongoing initiatives, the company started the Merchandising Master Data project.

This project focused on supplier and product data, as well as representing their relationship intersection with location. We started with design discussion sessions with business stakeholders to understand their existing data, the available functions supported by SAP, possible areas of data consolidation, and company-specific scenarios that didn’t align with SAP best practices (the custom development scope).

Prior to this project’s initiation, the company had cFin and SAP Customer Activity Repository (CAR) implementations planned in its timelines. The current project was kicked off with the following key considerations –

Business Benefits: 

  • Process frequent changes that copy variation SKUs from a base SKU; utilize a grouping function to aggregate financial reporting.
  • A single identifier for a supplier, regardless of channel.
  • Channel-independent terms and references, to ensure a uniform customer experience both in-store and online.
  • Since the online channel doesn’t support multiple UPCs on a single SKU, two SKUs must be used, transitioning from the old SKU to the new SKU while monitoring DC inventory.
  • A one-time transition of in-store inventory from existing UPCs to their new unique SKUs; this included converting 104 UPCs to unique SKUs.
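The first point above can be sketched as a simple copy-with-overrides, where each variation keeps a grouping key back to its base SKU so financial reporting can aggregate the whole family. This is a minimal illustration; the field names and the grouping-key scheme are assumptions, not the customer’s actual data model.

```python
# Field names and the grouping-key scheme below are illustrative assumptions.
def copy_variation_sku(base_sku: dict, variation_id: str, overrides: dict) -> dict:
    """Create a variation SKU from a base SKU.

    The variation inherits all base attributes, applies its own
    overrides, and keeps a reporting_group key pointing at the base
    SKU so financial reporting can aggregate the family.
    """
    variation = dict(base_sku)       # never mutate the base record
    variation.update(overrides)      # variation-specific attributes win
    variation["sku"] = variation_id
    variation["reporting_group"] = base_sku["sku"]  # aggregate by base SKU
    return variation
```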


Technical Impact:

  • Ensure near-real-time replication of changes from DB2 to the SAP landscape.
  • Correctly capture all relevant attributes, timestamps, and user credentials for the identified master data – business partners, vendor discounts, shipment terms, and articles (SKUs).
  • Enrich already existing article data (loaded/created through prior parallel initiatives and projects) with merchandise-relevant values.
  • Build a SKU UI frontend as a single point of data entry for SKU creation and updates.
  • Change the SKU creation tool to generate a family relationship that handles multi-UoM scenarios.
  • Let suppliers view, set up, and maintain product data through Item Data Management (IDM).
  • Flexible data retention structures.
  • Simplify the data across all omnichannel systems.
  • Ensure SAP S/4HANA is integrated using the best SAP capabilities, such as OData APIs, OAuth 2.0 authentication, and GCP API management.
  • Minimal interruption to ongoing business and parallel projects.


Landscape, Technical Architecture –

This project implementation can be broadly categorized into two parts, namely Vendor and Articles, with integration between different DB tables and the SAP S/4HANA system. Google Cloud Platform (GCP) microservices are heavily used to initiate inbound OData API interfaces in SAP, while outbound interfaces are transmitted with PI/PO as middleware.


a) Vendor Integration –

Integration included all vendor-related master data, such as business partner maintenance, vendor discounts maintenance, and shipment terms maintenance. Whenever a user triggers a create/change to a vendor record in the legacy DB2 application (step 1), the changes are queued in a WMQ queue (step 2). A listener on Cloud Foundry (step 3) reads the queue and stores the changes in BigQuery (step 4) for historical, analytics, and audit purposes. The changes are also sent to GCP Datastore (step 5), where the data is cached and subsequently picked up by the publisher (step 6). Upon event publication to a Pub/Sub topic (step 7), a push subscription delivers the event to the subscribing app (step 8), which in turn triggers the SAP OData API. The DB2 data is then transformed and stored in the HANA database, with all relevant checks, validations, and dependencies applied.
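The push-delivery end of this pipeline (steps 7 and 8) can be sketched roughly as follows, assuming a small Python service receives the Pub/Sub push envelope and forwards it to SAP. The endpoint path and field names are illustrative assumptions, not the customer’s actual service; note how the HTTP status returned to Pub/Sub doubles as the ack/retry signal.

```python
import base64
import json

# Illustrative OData endpoint; the real service name is customer-specific.
SAP_ODATA_URL = "https://s4hana.example.com/sap/opu/odata/sap/ZVENDOR_SRV/Vendors"

def decode_push_envelope(envelope: dict) -> dict:
    """Decode the JSON body that Pub/Sub wraps in a base64 'data' field."""
    payload_b64 = envelope["message"]["data"]
    return json.loads(base64.b64decode(payload_b64))

def build_odata_request(change: dict) -> dict:
    """Map a DB2 vendor change record to an OData create/update request.

    Field names are assumptions for illustration only.
    """
    return {
        "url": SAP_ODATA_URL,
        "method": "POST" if change.get("event") == "CREATE" else "PATCH",
        "body": {
            "VendorId": change["vendor_id"],
            "Name": change["name"],
            "ChangedBy": change["changed_by"],
            "ChangedAt": change["timestamp"],
        },
    }

def handle_push(envelope: dict) -> int:
    """Return the HTTP status the push endpoint should answer with.

    Answering non-2xx makes Pub/Sub redeliver the message, which is
    the retry/retrigger mechanism this architecture relies on.
    """
    try:
        change = decode_push_envelope(envelope)
        build_odata_request(change)  # the real app would now call SAP here
        return 204                   # ack: message consumed
    except (KeyError, ValueError):
        return 500                   # nack: Pub/Sub retries the delivery
```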

The benefit of the above microservice architecture is that it kept each function simple, largely independent, and autonomous. This approach also provided a robust mechanism to retry/retrigger messages in case of failure at any function point.


b) Article (SKU) Integration –

Integration included all article-related master data, such as SKUs, UPCs, UoMs, and GTINs, and relationships such as Collections, Hierarchies, and Assortments.

The SKU UI frontend is built using the React web framework and integrated with SAP S/4HANA via OData APIs. Both the SKU UI and the SAP S/4HANA system are hosted on Google Cloud Platform, within the company’s secure network. Different checks and validations were put in place at the various integration points to ensure data integrity and enrichment of the article master.
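For illustration, SAP Gateway requires OData write calls to first fetch an X-CSRF-Token via a GET request carrying `X-CSRF-Token: Fetch`. A minimal sketch of the request assembly a SKU UI backend might perform follows; the entity set and body field names are hypothetical.

```python
# Hypothetical article service; the real entity set name is project-specific.
ARTICLE_SERVICE = "/sap/opu/odata/sap/ZARTICLE_SRV/Articles"

def csrf_fetch_headers(bearer_token: str) -> dict:
    """Headers for the GET that asks SAP Gateway to issue a CSRF token.

    Gateway returns the token in the X-CSRF-Token response header when
    the request carries 'X-CSRF-Token: Fetch'.
    """
    return {"Authorization": f"Bearer {bearer_token}", "X-CSRF-Token": "Fetch"}

def article_create_request(csrf_token: str, bearer_token: str, sku: dict) -> dict:
    """Assemble the POST the SKU UI would send to create an article."""
    return {
        "method": "POST",
        "path": ARTICLE_SERVICE,
        "headers": {
            "Authorization": f"Bearer {bearer_token}",
            "X-CSRF-Token": csrf_token,      # token from the fetch step
            "Content-Type": "application/json",
        },
        # Body field names are illustrative, not the customer's data model.
        "body": {"Sku": sku["sku"], "Upc": sku["upc"], "BaseUom": sku["uom"]},
    }
```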


The primary uses of this integration are to –

  1. Enrich existing articles with merchandising-relevant attributes – The company is going through a business transformation and has many parallel projects lined up, with go-lives planned one after the other; each go-live releases new functionality and features within SAP. In this specific instance, the CAR Retail and Central Finance projects were planned to go live prior to the Merchandising Master Data project; hence, enriching the existing article data was necessary to make it usable for the sales, purchasing, and discount processes lined up for the subsequent go-live.
  2. Create article master records in SAP S/4HANA for the non-existing SKUs that fall outside the scope of the prior transformation projects, plus structured articles.
  3. Establish SAP as the single source of truth once all attributes are captured from the base table.
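The enrichment in point 1 can be pictured as an overlay that fills in only the merchandising attributes missing on an already-loaded article, leaving values from the prior CAR/cFin loads untouched. A minimal sketch, with assumed attribute names:

```python
# Merchandising-relevant attributes to enrich; the names are illustrative.
MERCH_ATTRIBUTES = ("merch_category", "season", "discount_group", "valuation_class")

def enrich_article(existing: dict, merch_source: dict) -> dict:
    """Overlay merchandising attributes onto an already-loaded article.

    Attributes populated by prior project loads are left untouched;
    only empty merchandising fields are filled from the source record.
    """
    enriched = dict(existing)  # never mutate the input record
    for attr in MERCH_ATTRIBUTES:
        if attr in merch_source and not enriched.get(attr):
            enriched[attr] = merch_source[attr]
    return enriched
```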


Conclusion –

For this customer, using Google Cloud Platform (GCP) services proved easier and more flexible than the offerings of other Hyperscaler competitors. We faced many challenges in the integration and implementation of the above architecture; however, with good references and a robust development team, we were able to overcome most of the issues. There were many decision points; to name a few –

  • Which tool provides ease and flexibility for evaluating the data stream – GCP Pub/Sub or Kafka?
  • How will GCP services interact with and trigger SAP S/4HANA OData services while retaining the security protocols?
  • Is SAP S/4HANA capable of handling OpenID tokens (JWT, OIDC)?
  • How is user identity cascaded, and authenticity maintained, across the different GCP services until the SAP S/4HANA services commit to the database?
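On the last two points, one building block is inspecting the identity claims inside the OpenID (JWT) token as it passes between services. The sketch below decodes the payload only and deliberately skips signature verification, which the receiving service must still perform against the identity provider’s keys before trusting any claim:

```python
import base64
import json

def _b64url_decode(segment: str) -> bytes:
    """Decode a base64url segment, restoring the stripped '=' padding."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def jwt_claims(token: str) -> dict:
    """Return the claims of a JWT *without* verifying its signature.

    Real services must verify the signature against the IdP's keys;
    this helper only shows how identity (e.g. the 'sub' claim) travels
    inside the token from service to service.
    """
    _header_b64, payload_b64, _signature = token.split(".")
    return json.loads(_b64url_decode(payload_b64))
```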


In case you want to know more about the connectivity and integration options used in the above scenarios, please check out the references and documentation below.

I hope this article helps you decide on an interconnect strategy and leverage the best of the available Hyperscaler services. Please feel free to post your Hyperscaler integration experiences and queries.


References –

  1. Pub/Sub Topic
  2. Cloud Armor
  3. Using OAuth 2.0 from a Web Application
