
Overview


Retail sectors dealing in groceries and food supplies face frequent price changes. Unlike a typical commerce application, price rows in this industry are not static and are subject to change almost every day. This imposes serious design challenges around data volume, accuracy, and overall system performance.

A small change in external factors can impact millions of price rows, which then need to be propagated to all integrated systems. The system must update millions of price rows in a short span of time without compromising overall performance or data accuracy. Things get considerably more complicated if the commerce application supports customer-specific contract pricing. Contracts can be based on individual products, categories, customer segments, or industry-specific classifications. If we consider all possible product and category combinations, eighty to ninety thousand products already translate into millions of price rows. Once customer-specific contracts are added, the number of price rows can easily jump to a couple of billion. This imposes serious design challenges for any commerce application, as this volume of price rows cannot be handled with a traditional approach.

The obvious question from business is, “How can we support a dynamic pricing scenario without losing system agility and flexibility?”

Design Approach


The Pricing Engine is a building block of a commerce platform that serves the pricing needs of the Product Display Page (PDP), Product Listing Page (PLP), Cart & Checkout, and the Search Results Page. SAP Commerce ships with a pricing engine. If pricing conditions are static, the Pricing Engine together with the Promotion Engine can meet most commerce pricing requirements. If price rows change too often, the out-of-the-box Pricing Engine needs customization to meet dynamic pricing requirements. The right design approach needs to be thought through to build a scalable solution that supports billions of price rows without impacting overall system performance.

There are several feasible design options to meet the above business requirements. However, the following two design approaches are preferred by the industry.

 

Externalize Pricing Engine


In this approach, the Pricing Engine is kept outside the commerce platform (preferably in an ERP system), and the ERP system exposes it as a service to be consumed by the commerce application. This works best if the commerce and ERP systems are in the same cloud infrastructure, to avoid network latency. It provides a centralized Pricing Engine that can be used by commerce and any other enterprise application, which is good in the sense that the business does not need to maintain multiple copies of price rows.
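
To make the integration concrete, here is a minimal sketch of how the commerce side might consume such a service. The endpoint, payload, and field names are illustrative assumptions, not an actual ERP API; the important parts are the strict timeout and the single round trip per request.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class ExternalPricingClient {

    private final HttpClient http = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(1))
            .build();

    /**
     * Fetches the calculated price for one product/customer pair from the ERP-hosted
     * pricing service. URL and response shape are assumptions for illustration only.
     */
    public String fetchPrice(String productCode, String customerId) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://erp.example.com/pricing/v1/prices"
                        + "?product=" + productCode + "&customer=" + customerId))
                .timeout(Duration.ofSeconds(2))   // strict timeout to protect the page SLA
                .GET()
                .build();

        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() != 200) {
            // There is no fallback pricing source in this design, hence the availability
            // risk discussed in the cons below.
            throw new IllegalStateException("Pricing service returned " + response.statusCode());
        }
        return response.body();   // e.g. {"productCode":"...","price":12.34,"currency":"USD"}
    }
}
```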

Technically, it seems a sound approach. However, the scalability of this design has always been in question. It becomes more critical when the business expects the PDP, PLP, and search results pages to display calculated prices in real time with an SLA of 2-3 seconds; it is difficult to meet such a stringent SLA. If the commerce site supports wish lists and allows customers to add 80-100 line items to a wish list, which is common practice in B2B scenarios, the externalized pricing engine may become a performance bottleneck. Considering these limitations, the externalized pricing engine approach may be a good fit for commerce sites with a small set of products, or where customers are not expected to maintain large wish lists or carts.

 

 


(Figure: Externalized Pricing Engine)


 

 

 

Following are the pros and cons of the externalized pricing engine approach.

Pros.

  1. Centralized Pricing Engine. It is easy to maintain, as the business has to deal with a single, centralized pricing engine.

  2. The Pricing Engine can be exposed as a service and used across the enterprise.

  3. The pricing API's performance can be enhanced further by clustering and caching.


Cons.

  1. No fallback mechanism. If the ERP system hosting the pricing engine is down, it brings down the commerce site.

  2. Hard to meet a page response SLA of 3-5 seconds for PDP, PLP, cart & checkout, wish list and search results, which are critical for any commerce site.


 

Apache Solr as Pricing Engine


Another approach to implementing a pricing engine in commerce is to index price rows into a separate Solr core. In this design, price rows are managed in an ERP system and indexed into a dedicated Solr core through a middleware. The traditional approach is to load price rows into the commerce database and leverage the out-of-the-box index jobs to index them to Solr. In this approach, however, price rows are never brought into the commerce database: the ERP system sends the data to the middleware, and the middleware calls the Solr API to index the price rows directly into a separate Solr core. The objective is to avoid loading billions of price records into the commerce database.
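
A minimal sketch of what the middleware-to-Solr indexing step could look like using SolrJ. The Solr host, core name, field names, and the composite document ID are illustrative assumptions rather than the actual implementation.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

import java.util.ArrayList;
import java.util.List;

public class PriceRowIndexer {

    // Dedicated price core, kept separate from the product index (name is an assumption).
    private final SolrClient solr =
            new HttpSolrClient.Builder("http://solr-host:8983/solr/priceRowCore").build();

    /** Indexes one batch of price rows received by the middleware from the ERP system. */
    public void indexBatch(List<PriceRow> rows) throws Exception {
        List<SolrInputDocument> docs = new ArrayList<>();
        for (PriceRow row : rows) {
            SolrInputDocument doc = new SolrInputDocument();
            // Custom unique ID so later updates/deletes from the ERP hit the same document.
            doc.addField("id", row.productCode + "_" + row.contractId + "_" + row.currency);
            doc.addField("productCode_string", row.productCode);
            doc.addField("contractId_string", row.contractId);
            doc.addField("currency_string", row.currency);
            doc.addField("price_double", row.price);
            docs.add(doc);
        }
        solr.add(docs);   // batch size tuned separately (e.g. a few thousand docs per call)
        solr.commit();    // or rely on autoCommit/commitWithin for very large loads
    }

    /** Simple carrier for the price attributes sent by the ERP (illustrative only). */
    public static class PriceRow {
        String productCode, contractId, currency;
        double price;
    }
}
```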

 


(Figure: Apache Solr as Pricing Engine)



Characteristics of this design approach:



  • Price rows are indexed directly to Solr, skipping the intermediate commerce database.

  • This approach allows close to a billion price rows to be indexed to Solr within hours without impacting application performance.

  • Price row documents are usually small (around 10-15 attributes), so Solr indexing is comparatively fast. However, the Solr index batch size needs to be tuned to optimize the indexing process.


This approach results in a Solr node with a large index. If the index is too large for a single node to handle, it needs to be broken up and stored in sections by creating multiple shards. I observed that keeping 10-15 million price rows per Solr shard gave optimum performance.
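
If you control the Solr infrastructure (a self-managed SolrCloud cluster rather than the default managed setup), the sharded collection can be created up front through the Collections API. A sketch using SolrJ, where the ZooKeeper hosts, collection name, configset, and shard/replica counts are assumptions to be sized from the per-shard guideline above.

```java
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.request.CollectionAdminRequest;

import java.util.List;
import java.util.Optional;

public class PriceCollectionSetup {
    public static void main(String[] args) throws Exception {
        // ZooKeeper ensemble coordinating the SolrCloud cluster (hosts are placeholders).
        CloudSolrClient solr = new CloudSolrClient.Builder(
                List.of("zk1:2181", "zk2:2181", "zk3:2181"), Optional.empty()).build();

        // ~1 billion rows at 10-15 million rows per shard is on the order of 60-100 shards;
        // the numbers below are illustrative, not a recommendation.
        CollectionAdminRequest
                .createCollection("priceRowCollection", "priceRowConfig", 64, 2)
                .process(solr);

        solr.close();
    }
}
```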

I was using SAP Commerce Cloud (CCV2) as the commerce platform. SAP Commerce Cloud CCV2 generally allows 10 GB of disk space per Solr node. I was not able to index all the price rows within 10 GB, so I ended up asking SAP to increase the disk space to 30 GB, which was enough to index a billion price rows.

 

Following are other design considerations that improve the overall performance of the Solr Pricing Engine.

 

  • Leverage the dynamic attribute definition approach, which allowed me to add new attributes to the index dynamically.

  • I was dealing with a high volume of price rows, so the index needed regular housekeeping to keep Solr fast. Solr only soft deletes records (deleted documents remain in the index segments until the segments are merged), so I defined a batch job to hard delete all soft-deleted records and optimize the index (see the sketch after this list). This cleans up the Solr index and boosts search and overall Solr performance.

  • I defined a custom unique ID to uniquely identify a document within Solr and to create, update, and delete indexed documents based on that ID.
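
A minimal sketch of such a maintenance job using SolrJ, assuming direct access to the price core; in SAP Commerce this would typically be wrapped in a CronJob. The core name is an assumption, and `optimize()` is I/O heavy, so a job like this belongs in a low-traffic window.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class PriceIndexMaintenanceJob {

    private final SolrClient solr =
            new HttpSolrClient.Builder("http://solr-host:8983/solr/priceRowCore").build();

    /** Scheduled job: purge documents Solr has marked as deleted and compact the index. */
    public void run() throws Exception {
        // A Solr delete only marks documents as deleted inside the index segments;
        // optimize() merges segments and physically removes those tombstoned documents,
        // so queries no longer have to skip over them.
        solr.optimize();
    }
}
```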


Once indexing is complete, it is time to develop an API to serve price rows. For this, I used the out-of-the-box commerce implementation to query and filter price rows from Solr. Some customization was needed to filter price rows by category, contract, and segment; I was dealing with 11 layers of hierarchical pricing filter criteria. The price data structure was a key design factor here, because I did not want to make multiple calls to Solr. My objective was to make a single call to Solr that returns all price rows relevant to a product, and then filter within commerce to get the desired price rows. The bottom line is that we are not making multiple calls to Solr; otherwise the extra round trips would outweigh the performance gained by using Solr as the pricing engine.
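
A simplified sketch of that read path: one Solr query fetches every candidate price row for a product, and the contract/segment filtering happens in commerce code afterwards. Field names and the two-level filtering rule are illustrative assumptions; the real implementation walked 11 hierarchical filter levels.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrDocumentList;

import java.util.Optional;

public class SolrPriceLookupService {

    private final SolrClient solr =
            new HttpSolrClient.Builder("http://solr-host:8983/solr/priceRowCore").build();

    /** Single Solr round trip: fetch all candidate price rows for one product. */
    public SolrDocumentList findPriceRows(String productCode) throws Exception {
        SolrQuery query = new SolrQuery("productCode_string:\"" + productCode + "\"");
        query.setRows(500);   // generous ceiling; one product rarely has more candidate rows
        return solr.query(query).getResults();
    }

    /**
     * In-commerce filtering: prefer a contract-specific row, fall back to the default
     * list price row. Illustrative stand-in for the full hierarchical filter chain.
     */
    public Optional<SolrDocument> resolvePrice(SolrDocumentList rows, String contractId) {
        Optional<SolrDocument> contractRow = rows.stream()
                .filter(doc -> contractId.equals(doc.getFieldValue("contractId_string")))
                .findFirst();
        if (contractRow.isPresent()) {
            return contractRow;
        }
        return rows.stream()
                .filter(doc -> "DEFAULT".equals(doc.getFieldValue("contractId_string")))
                .findFirst();
    }
}
```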

Further optimization opportunities


The above design approach helped to build a scalable Pricing Engine capable of handling billions of price rows without impacting system performance. I observed that there were further improvement opportunities: enabling price row caching in commerce gives even better system performance.

The user makes a call to get the price row for a product. The system first checks whether the price row of the selected product is already cached. If it is, the system reads the price details from the cache and skips the Solr call. If the price row is not in the cache, the system calls the Solr API to get the desired price row, caches it in commerce, and returns the price data to the customer. This helped me further optimize the performance of the overall pricing engine.
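
A minimal sketch of that lookup flow, using a plain in-memory map as the cache and reusing the lookup service from the earlier sketch. In a real SAP Commerce setup this would sit behind the platform's caching layer with a sensible TTL and eviction policy, since prices change almost daily; class and field names are illustrative.

```java
import org.apache.solr.common.SolrDocumentList;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CachedPriceFacade {

    private final SolrPriceLookupService solrPriceLookup;   // from the earlier sketch
    // Product code -> price rows. A bounded cache with TTL/eviction is needed in practice.
    private final Map<String, SolrDocumentList> priceCache = new ConcurrentHashMap<>();

    public CachedPriceFacade(SolrPriceLookupService solrPriceLookup) {
        this.solrPriceLookup = solrPriceLookup;
    }

    public SolrDocumentList getPriceRows(String productCode) throws Exception {
        SolrDocumentList cached = priceCache.get(productCode);
        if (cached != null) {
            return cached;                       // cache hit: skip the Solr call entirely
        }
        SolrDocumentList rows = solrPriceLookup.findPriceRows(productCode);
        priceCache.put(productCode, rows);       // populate cache for subsequent requests
        return rows;
    }
}
```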

I leveraged this caching mechanism to cache the pricing details of hero products, i.e. the products most searched and visited by customers.

Downsides of this approach


Overall, this design approach sounds quite promising and has many advantages. However, there are a few downsides, and a proper mitigation plan is needed to address these challenges.

If the Solr node is down, it will bring down the whole commerce site. A distributed Solr setup managed by ZooKeeper and leveraging Solr shards needs to be properly configured so that Solr runs in high-availability mode: if one node goes down, the cluster state maintained in ZooKeeper allows requests to be served by replicas on the remaining nodes, keeping the commerce engine running.
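
One concrete way to get that behavior on the client side is to let the query client talk to the ZooKeeper ensemble instead of a single Solr node URL, so requests are routed to whichever replicas are currently live. The hosts and collection name below are placeholders, not the actual setup.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.CloudSolrClient;

import java.util.List;
import java.util.Optional;

public final class PriceSolrClientFactory {

    /** Client bound to the ZooKeeper ensemble rather than to a single Solr node. */
    public static SolrClient create() {
        CloudSolrClient client = new CloudSolrClient.Builder(
                List.of("zk1:2181", "zk2:2181", "zk3:2181"), Optional.empty()).build();
        // The collection is created with multiple shards and replicas (see the setup
        // sketch above), so losing a single node does not take pricing down.
        client.setDefaultCollection("priceRowCollection");
        return client;
    }
}
```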

If for any reason the Solr index gets corrupted, re-indexing billions of records from scratch takes time. Therefore, a process should be in place to back up and restore the Solr index so that the existing price rows can be recovered quickly. I worked with the DevOps team to set up jobs that back up the Solr index daily and restore the last successful backup in case of a Solr crash or other issues.
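
For a self-managed SolrCloud cluster, the Collections API offers BACKUP and RESTORE actions that such jobs can drive. A sketch via SolrJ, where the collection name, backup name, and shared backup location are assumptions; on SAP Commerce Cloud the equivalent would normally go through the cloud tooling rather than direct API calls.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.request.CollectionAdminRequest;

public class PriceIndexBackupJob {

    private final SolrClient solr = PriceSolrClientFactory.create();   // from the earlier sketch

    /** Daily backup of the price collection to a location visible to all Solr nodes. */
    public void backup(String backupName) throws Exception {
        CollectionAdminRequest.backupCollection("priceRowCollection", backupName)
                .setLocation("/backups/solr")
                .process(solr);
    }

    /** Restore the last successful backup after an index crash or corruption. */
    public void restore(String backupName) throws Exception {
        // RESTORE creates the target collection, so the corrupted collection is deleted
        // first (or a new name plus an alias is used) before running this.
        CollectionAdminRequest.restoreCollection("priceRowCollection", backupName)
                .setLocation("/backups/solr")
                .process(solr);
    }
}
```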

Another operational issue observed during the multi-node Solr implementation was a mismatch in the number of documents across Solr nodes. This created a major problem for the Pricing Engine, because the system returned different prices depending on which Solr node served the request. This situation arises when a particular Solr node has infrastructure issues and is not able to ingest newly indexed price rows. To address it, I looked carefully at the Solr logs for exceptions and fixed the root cause. I also defined a proper error handling mechanism to track errors and exceptions on both the Solr and middleware side to mitigate the problem.

Conclusion


The Pricing Engine is a core part of any commerce implementation. Every industry has specific pricing needs, and the commerce pricing engine must be customized to meet industry-specific requirements. I had to deal with a huge volume of pricing data (on the order of billions of rows) that was subject to change almost every day. There are several feasible design approaches to support this volatile, high-volume pricing scenario. However, I found the Solr-based Pricing Engine to be one of the best and preferred design approaches. It requires minimal customization to SAP Commerce and helps meet performance SLAs that are challenging to achieve with other approaches. The Solr-based pricing engine is also not tightly coupled with the commerce setup and can be scaled by increasing the number of Solr nodes.

 