Presales, my role at SAP, provides a variety of interesting customer engagements, and as I've shared earlier on this blog channel, in the best cases there is learning both for us and for our customers.

That was certainly the case in a recent engagement, where I led a team delivering a customized demonstration of IBP Demand and Inventory using a customer-provided dataset.

We don't always - or even often - do this.  There needs to be a win/win outcome to justify the mutual investment.

In this case, we had a customer executive with deep demand planning experience, newly with the company, who had been through this often enough to have a bit of "prove it" in him, and a couple of newer team members on our side whom I wanted to get into the details of our tools.  So we agreed.

Working through the process highlighted a number of the key value drivers of IBP and provided a few lessons.  In this post, I'll share both.

Configuration and Data Load

  • This was a custom demo, not a Proof of Concept (POC).  I'm particular about the difference: to me, a custom demo means no significant change to an existing SAP sample Planning Area and no significant change to the supply chain structure (if IBP for supply is in scope).  In this case, we found that sample Planning Area SAP3 (Inventory Optimization) provided the master data structures we needed.

  • Understand the provided data.  This is critical in a demonstration situation where we have limited time to prepare and present.  Our input data set included 10 million detail shipment records for over 25K active items, contained in an MS Access database.  We were fortunate to have on the team a colleague with solid Access skills, whose ability to define joins to validate data before loading was invaluable, even though the data provided was "pretty good".  He also created the relevant .csv files from the Access data for loading.  As with almost all demonstrations and most early-stage implementations, we used the Data Integration process in IBP to load data (a rough sketch of this kind of pre-load check appears after this list).

  • Understand your time profiles.  In this case, we leveraged the Technical Week construct already defined for SAP3, but with a new Time Series Profile.  Technical Weeks allow proper aggregation to calendar months while - in parallel - enabling calculation and reporting at a calendar week level.  This is critical in demand planning.

  • Make sure you get the Planning Levels right.  I've noted this in my book and in other blog posts.  Fortunately this was not an issue here, as the structure in SAP3 matched exactly what we needed.  This is an important point: SAP has put a lot of work into the standard sample Planning Areas, and for the initial stages, if not the full implementation, almost every effort can leverage the work done there.

  • Do not seek to store daily data.  At first, I did.  If you've read my prior blogs you know that "fail early and often" is one of my mantras.  I have colleagues who will say that "ready, fire, aim" is another, particularly in projects like this one.  But I know how simple it is to start over with IBP, so I don't feel the need for a fully detailed plan.  It drives the former consultants nuts.  In this case, I initially thought it would be interesting to store history at a daily level, as provided, and let IBP/HANA aggregate.  Yes, it worked just fine.  But on a small server (our demo systems are provisioned with a configuration significantly smaller than the smallest customer instance) it proved a bad idea.  I found I needed to scope forecast runs to a subset of the data, and even then performance was not what I wanted.  Of course, let's keep that in perspective: I could still forecast hundreds of parts with millions of data points in less than 30 minutes.  But the key lesson learned: if you're not configuring Demand Sensing, which we were not, let the load process aggregate daily data to Technical Weeks (the second sketch after this list illustrates the idea).
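To make the data-validation point above a bit more concrete, here is a rough sketch of the kind of pre-load checks our colleague performed with Access joins, redone in Python/pandas.  The file and column names are hypothetical and the checks are illustrative; the real dataset, and the IBP key figures it fed, will differ.

```python
import pandas as pd

# Hypothetical .csv exports of the customer's Access tables; names are illustrative.
shipments = pd.read_csv("shipments.csv", parse_dates=["ship_date"])   # ~10 million detail records
items = pd.read_csv("item_master.csv")                                # ~25K active items
locations = pd.read_csv("location_master.csv")

# Referential integrity: every shipment should reference a known item and location
# (in the project this was done with Access join queries before loading anything).
orphan_items = shipments[~shipments["item_id"].isin(items["item_id"])]
orphan_locs = shipments[~shipments["location_id"].isin(locations["location_id"])]
print(f"Shipments with unknown item: {len(orphan_items)}")
print(f"Shipments with unknown location: {len(orphan_locs)}")

# Basic sanity checks on quantities before trusting the history.
bad_qty = shipments[shipments["quantity"] <= 0]
print(f"Non-positive quantities: {len(bad_qty)}")

# Export a clean .csv for the IBP Data Integration upload.
drop_idx = orphan_items.index.union(orphan_locs.index).union(bad_qty.index)
shipments.drop(index=drop_idx).to_csv("shipments_clean.csv", index=False)
```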

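And here is a minimal sketch of the load-time aggregation the last bullet recommends: collapsing daily shipment history into Technical Weeks, i.e. calendar weeks split at month boundaries so that weekly buckets roll up exactly to calendar months.  Again the column names are assumptions, and this shows only the idea, not IBP's own implementation.

```python
import pandas as pd

daily = pd.read_csv("shipments_clean.csv", parse_dates=["ship_date"])

# Monday of the calendar week for each shipment date.
week_start = daily["ship_date"] - pd.to_timedelta(daily["ship_date"].dt.weekday, unit="D")

# A Technical Week is a calendar week further split by month: a week that spans
# a month boundary becomes two buckets, so months aggregate cleanly from weeks.
daily["tech_week"] = (week_start.dt.strftime("%Y-%m-%d")
                      + "/" + daily["ship_date"].dt.strftime("%Y-%m"))

weekly = (daily
          .groupby(["item_id", "location_id", "customer_group", "tech_week"],
                   as_index=False)["quantity"]
          .sum())

# This weekly file, not the daily detail, is what gets loaded via Data Integration.
weekly.to_csv("actuals_tech_week.csv", index=False)
```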

Initial Processes

  • One of the first things we did was run an A-B-C analysis on the data provided.  IBP's ability to configure and implement ABC is a great help in understanding the scope of data.

  • The data set we were provided included items with regular demand (including trend and seasonality) and many items with intermittent demand.  While it's possible to throw a best-fit model that includes all algorithms at the data and let the model figure it out, we wanted to test a variety of parameters, and so sought to use different models for intermittent and series demand.  ABC classification provided a tool to enable this.  We defined a key figure to test for the presence or absence of demand in any particular cell and set a zero/one value.  We then added an item master attribute (Intermittency Flag) and ran an ABC process using the counter key figure as input.  The results provided a filter for use when running a forecast model (a simplified sketch of this classification follows this list).

  • As a reference, here's what the counts looked like:

  • We also wanted to test the value of forecasting at different levels of granularity: product / location, or product / customer (group, in this case) / location.  IBP is well capable of either; the level of aggregation is specified at execution time.  So we created a version to store a set of forecast and forecast-error results at the alternative level (product / location) for comparison against a baseline (product / location / customer group).  As expected, we found, in general, that the higher level of aggregation yielded a better result as measured.

  • Experienced readers will see one problem with the aggregate approach.  If we forecast at the Product/Location level, then, since there is no set of existing values, IBP will disaggregate to the Customer Group level on an equal-distribution basis.  This is wrong.  So we created a new KF, used the Copy Operator to shift shipment history forward one year (1/16-1/17...), and used this new KF as the basis for disaggregation to the Customer Group level (a small example of the difference follows this list).

  • In parallel, a colleague was loading the master data needed for IBP for inventory.  As we were, of course, leveraging the same basic master data, only a small set of additional data was needed.  So as soon as we had a basic forecast in place, we were able to use the Copy operator to load the IO demand and variation key figures, and inventory planning was enabled.
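To illustrate the intermittency classification described above, here is a simplified sketch: a zero/one "demand present" indicator per item and bucket, counted up and turned into a flag.  In the demo the counter key figure fed an ABC run; the fixed 25% cut-off below is purely an illustrative assumption, as are the column names.

```python
import pandas as pd

weekly = pd.read_csv("actuals_tech_week.csv")

# Zero/one counter analogue: did the item ship anything in a given technical week?
total_buckets = weekly["tech_week"].nunique()
periods_with_demand = (weekly[weekly["quantity"] > 0]
                       .groupby("item_id")["tech_week"].nunique())

flags = periods_with_demand.to_frame("periods_with_demand")
flags["demand_ratio"] = flags["periods_with_demand"] / total_buckets

# Items with demand in only a small share of buckets get the Intermittency Flag;
# the 0.25 threshold is an assumption for illustration, not a project value.
flags["intermittency_flag"] = flags["demand_ratio"].lt(0.25).map(
    {True: "INTERMITTENT", False: "SERIES"})

print(flags["intermittency_flag"].value_counts())
```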

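The small example below shows why the equal-split behaviour matters and what the shifted-history key figure buys you: split factors derived from last year's shipments by customer group instead of an even distribution.  The customer groups and quantities are invented for illustration.

```python
import pandas as pd

# Last year's shipments by customer group for one product/location (invented numbers).
history = pd.DataFrame({
    "customer_group": ["RETAIL", "WHOLESALE", "ECOMMERCE"],
    "qty_last_year":  [6000, 3000, 1000],
})

aggregate_forecast = 1200.0   # product/location forecast for one future bucket

# Equal split: what you get when the disaggregation basis is empty.
history["equal_split"] = aggregate_forecast / len(history)

# Proportional split based on the shipment history shifted forward one year.
history["share"] = history["qty_last_year"] / history["qty_last_year"].sum()
history["proportional"] = aggregate_forecast * history["share"]

print(history[["customer_group", "equal_split", "proportional"]])
# RETAIL gets 720, WHOLESALE 360, ECOMMERCE 120, instead of 400 each.
```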

Results

One of my favorite quotes is "in order to achieve greatness, two things are necessary - a plan, and not quite enough time".  If that's true, this was great.  The objective was not the specifics of the forecasts generated, but the adaptability of the process and identification of problem items for improvement.

We generated a forecast using three models for series, mixed, and intermittent items respectively. Each model utilized a Best Fit selection of candidate algorithms appropriate to the demand pattern of the items to which it was applied.
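As a rough illustration of the "best fit" idea, and not of IBP's actual algorithm library: try a few candidate methods on each item's history, score them against a holdout window, and keep the winner.  The candidate methods and the error measure below are simplified stand-ins I'm using for the sketch.

```python
import numpy as np

def moving_average(history, window=4):
    # Average of the most recent buckets as the one-step forecast.
    return float(np.mean(history[-window:]))

def exp_smoothing(history, alpha=0.3):
    # Simple exponential smoothing; the final level is the one-step forecast.
    level = float(history[0])
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def croston_like(history):
    # Crude intermittent-demand rate: average nonzero size / average interval.
    nonzero = [(i, x) for i, x in enumerate(history) if x > 0]
    if not nonzero:
        return 0.0
    sizes = [x for _, x in nonzero]
    idx = [i for i, _ in nonzero]
    intervals = np.diff(idx) if len(idx) > 1 else np.array([1.0])
    return float(np.mean(sizes) / max(float(np.mean(intervals)), 1.0))

def best_fit(history, candidates, holdout=4):
    # Fit each candidate on the head of the series, score it on the tail,
    # and return the name of the method with the lowest mean absolute error.
    train, test = history[:-holdout], history[-holdout:]
    scores = {name: np.mean([abs(f(train) - actual) for actual in test])
              for name, f in candidates.items()}
    return min(scores, key=scores.get)

series_candidates = {"moving_average": moving_average, "exp_smoothing": exp_smoothing}
intermittent_candidates = {"croston_like": croston_like, "moving_average": moving_average}

smooth = [12, 15, 11, 14, 16, 13, 15, 17, 14, 16, 15, 18]
lumpy = [0, 0, 9, 0, 0, 0, 7, 0, 0, 11, 0, 0]
print(best_fit(smooth, series_candidates))
print(best_fit(lumpy, intermittent_candidates))
```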

The true power of IBP was showcased in the presentation, when our customer noted that in many cases it's more important to forecast at the customer group / product level - ignoring location - than at either of the levels we'd used.  We were able to do so interactively, "live", for a group of items in less than a minute, using the same set of models.

For those seeking to explore or implement IBP, perhaps the key here is not the details but the impact and productivity of the tool.  Within three calendar weeks, with limited investment of time, we were able to produce a model capable of generating production quality forecasts and inventory plans.  No doubt there is opportunity to improve results through tuning of parameters and processes, but ....

Finally, on behalf of my colleagues in the consulting community, please don't read this as "implementation best practice..."  Among other things, we did this asynchronously with limited contact with the customer through the process.  And took many shortcuts.  But the overall experience fully substantiated the representation of IBP as a tool that enables an iterative, agile approach to deployment with the opportunity for testing and assessment of results throughout.

For those interested, here's a rough assessment of the time required: