At the very onset of a customer call, we always try to understand what kinds of problems the customer would like to solve, and often they come back with an optimization scenario. Whether it is price optimization (“What price should I sell my product for that will yield the highest profit?”) or inventory optimization (“How much inventory do I need in stock to ensure I meet customer demand and my customers don’t go elsewhere?”), when it comes to solving problems with predictive analytics, many customers equate *optimization* with *prediction*.

While it would be extremely convenient for a single tool to build predictive models *and* find those optimal sweet spots, most tools do one (prediction) or the other (optimization). Many customers simply don’t have the necessary historical data to build predictive models for optimization, so they rely on employees with PhDs in Applied Math or Operations Research to develop the model(s) and hope they get it right. And how easy is it to find those employees with PhDs in Applied Math and Operations Research? Not so easy!

BUT, for customers who *do* have historical data around sales at various prices, inventory levels, and so on, it IS possible to perform simple optimization using predictive modeling algorithms and an Apply-In dataset that contains a set of pricing scenarios. As an example, let’s do an optimization exercise where the goal is to figure out the ‘best price’: the price that maximizes Sales. In pricing optimization there is a sweet spot where the price isn’t so high that customers are deterred from buying, nor so low that the company is leaving potential money on the table. Let’s say the product is a box of candy that is sold in different types of stores at quite different price points.

**There are three high-level steps for doing a simple pricing optimization analysis: 1) Segmentation, 2) Predictive Modeling, 3) Optimization.**

## Step One: Segmentation

The first step, Segmentation, simply refers to how many models should be built. For example, should separate models be built for every SKU? How about every combination of SKU/Customer? Is there even enough historical data to build meaningful models at this level or should models be built higher up in a product hierarchy? Remember, predictive models are built using historical information where the Sales figures associated with different prices must be known.
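One way to reason about this segmentation decision is to count how much history each candidate segment actually has. The sketch below does this in plain Python; the records and the `MIN_ROWS` threshold are hypothetical, purely for illustration, and not data from the article.

```python
from collections import Counter

# Hypothetical historical sales rows: (sku, store_type, price, quantity_sold).
history = [
    ("CANDY-1", "Convenience", 9.49, 120),
    ("CANDY-1", "Convenience", 9.99, 110),
    ("CANDY-1", "Wholesaler", 5.49, 900),
    ("CANDY-1", "Wholesaler", 5.99, 850),
    ("CANDY-1", "Grocery", 7.49, 300),
]

MIN_ROWS = 2  # assumed minimum history needed to justify a model for a segment

# Count rows per candidate segment (here: SKU x Store Type).
counts = Counter((sku, store) for sku, store, _, _ in history)

segments = [seg for seg, n in counts.items() if n >= MIN_ROWS]
too_sparse = [seg for seg, n in counts.items() if n < MIN_ROWS]

print(segments)    # segments with enough data for their own model
print(too_sparse)  # roll these up to a higher level of the product hierarchy
```

In practice the threshold would be far higher than two rows; the point is that segments without enough history get rolled up the product hierarchy rather than getting their own model.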

In our candy example, let’s say the customer has enough historical data to build models for every combination of Product and Store Type. Store Type in this case refers to Convenience, Grocery, Wholesaler, and so on. As I’m sure you can imagine, Costco, 7-Eleven, and Safeway may all stock the same candy, but they do so at very different price points consistent with customer demand. Here is some example data for two different Store Types:

## Step Two: Predictive Modeling

The second step, Predictive Modeling, is all about building a model for each identified segment using an appropriate algorithm with all potential predictors. In reality, Price and Store Type are only two of the predictors in our scenario; many other factors should also be considered, such as geographical area, time of year, economic indicators, competitor product availability, promotions, and new product launches.

We’re going to keep our example simple and just use Price and Store Type along with the SAP Predictive Analytics Automated Regression algorithm. Once we have our predictive model, it’s on to the final step: Optimization!
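To make the modeling step concrete, here is a minimal stand-in for it in plain Python: a simple ordinary-least-squares fit of Quantity Sold against Price for one segment. The article itself uses SAP Predictive Analytics’ Automated Regression, not hand-rolled code, and the history below is invented for illustration.

```python
def fit_ols(prices, quantities):
    """Fit quantity ~ intercept + slope * price by least squares."""
    n = len(prices)
    mean_p = sum(prices) / n
    mean_q = sum(quantities) / n
    cov = sum((p - mean_p) * (q - mean_q) for p, q in zip(prices, quantities))
    var = sum((p - mean_p) ** 2 for p in prices)
    slope = cov / var
    intercept = mean_q - slope * mean_p
    return intercept, slope

# Hypothetical history for the Convenience segment: higher price, fewer units.
conv_prices = [6.99, 7.99, 8.99, 9.99, 10.99]
conv_qty = [150, 140, 125, 105, 80]

intercept, slope = fit_ols(conv_prices, conv_qty)

def predict(price):
    """Predicted Quantity Sold at a given price for this segment."""
    return intercept + slope * price

print(intercept, slope)
```

A real automated-regression tool would of course handle more predictors, non-linearity, and validation; the sketch only shows the shape of the step, one fitted demand model per segment.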

## Step Three: Optimization

Optimization consists of applying all possible scenarios to the model we built in Step 2. For our example, we generate a dataset with two columns, Store Type and Price, that includes every price we want to test for each Store Type. Constrain the prices to realistic scenarios so that you don’t end up predicting Sales that are out of the realm of possibility.

In the dataset below, you can see that for the convenience Store Type (7-Eleven), we’re testing out prices that range from $6.36 to $11, while Costco ranges from $4.99 to $7.56.
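Generating such a scenario dataset is straightforward. The sketch below builds the (Store Type, Price) grid in plain Python; the price ranges match the ones in the text, while the 25-cent step size is an assumption for illustration.

```python
def price_grid(low, high, step=0.25):
    """Enumerate candidate prices from low to high (inclusive) in fixed steps."""
    prices = []
    p = low
    while p <= high + 1e-9:  # small tolerance for float accumulation
        prices.append(round(p, 2))
        p += step
    return prices

# Price ranges per Store Type, taken from the text.
ranges = {
    "Convenience": (6.36, 11.00),
    "Wholesaler": (4.99, 7.56),
}

# The Apply-In dataset: one row per (Store Type, Price) scenario.
scenarios = [
    {"store_type": store_type, "price": price}
    for store_type, (low, high) in ranges.items()
    for price in price_grid(low, high)
]

print(len(scenarios))
```

The resulting rows are exactly what gets fed to the model in the apply step: every realistic price, for every Store Type, and nothing outside those bounds.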

Applying the model to these Prices and Store Types gives us a Sales *prediction* for each. To find the ‘best price,’ it’s just a matter of sorting the Sales predictions (such as Predicted Revenue) within each Store Type and reading off the Price associated with the highest value. Depending on what historical data you have access to, you may have to predict Quantity Sold and then multiply each predicted Quantity Sold by its associated price to find the price that maximizes Revenue. That’s a simple calculation you can do in the business intelligence tool of your choice.

In the predicted scenario below, we can see that the optimal price for Convenience Store Type is $9.89, which is calculated based on a model that predicts Quantity Sold. Predicted Revenue is simply Predicted Quantity Sold * Price:
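The sort-and-multiply step above can be sketched as follows. The linear demand curves here are invented stand-ins for the article’s SAP-built model (so the winning price differs from the article’s $9.89); the mechanics of Predicted Revenue = Predicted Quantity Sold * Price are the same.

```python
def predicted_quantity(store_type, price):
    """Hypothetical per-segment linear demand: quantity falls as price rises."""
    curves = {
        "Convenience": (277.0, -17.5),   # (intercept, slope) - illustrative only
        "Wholesaler": (2200.0, -220.0),
    }
    intercept, slope = curves[store_type]
    return max(intercept + slope * price, 0.0)  # demand can't go negative

def best_price(store_type, candidate_prices):
    """Pick the candidate price with the highest predicted revenue."""
    return max(
        candidate_prices,
        key=lambda p: predicted_quantity(store_type, p) * p,  # predicted revenue
    )

# Candidate prices for the Convenience segment: $6.36 up to $10.86 in 25c steps.
conv_candidates = [round(6.36 + 0.25 * i, 2) for i in range(19)]

print(best_price("Convenience", conv_candidates))
```

Note the hump shape this produces: revenue climbs as price rises, peaks at the sweet spot, then falls as the higher price drives predicted quantity down faster than price goes up.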

So, there you have it! This entire pricing optimization analysis can be accomplished using SAP Predictive Analytics and is possible because the historical data includes various prices within different Store Types for the products being analyzed.

## Learn More

For more information on how SAP Predictive Analytics can help solve your more advanced analytical problems, visit our product pages and read the rest of our blogs on predictive topics.

Great blog!!!

Are there any optimization algorithms available directly in SAP PAL? I have a long list of predictive analysis algorithms in PAL, and I’m wondering if there is anything for optimization.

Hi Rakshit – thank you for your question! Unfortunately the PAL doesn’t include any optimization algorithms. We tell customers to run their optimization analyses outside of HANA and host those engines in their container of choice. As an example, customers can use the FICO Xpress Optimization Suite; we can host tools like that in HANA applications, but the optimization itself is performed outside of HANA. Hope this helps!