
The flexibility of bringing your own logic into your planning cycles encourages developers to work closely with planners and experiment with different algorithms. This also raises a question: where should you run your external algorithms? The choice of platform is left to the customer or developer. In this blog, I would like to share my experience of using Microsoft's Azure Machine Learning as that platform. In the 2205 release of SAP Integrated Business Planning for Digital Supply Chain, new OData APIs were introduced to extract planning data, apply external algorithms to it and influence a forecast plan. This blog describes them in detail. I have already explained how one could use Google's Vertex AI as an external platform for this job. In this blog, I am using Microsoft's Azure Machine Learning to do the forecast with a similar data set. The aim of this task is definitely not the performance validation of the different platforms, but rather to:




  1. Enable the developer to understand the usability of SAP IBP data in the external platform.

  2. Validate the technical feasibility of exchanging forecast data and results of the calculations.

  3. Explore the service and its offerings that could potentially be applied to a different dataset.


Landscape overview


For the experiments, we took an SAP Integrated Business Planning instance with the sample planning model SAP6 (Step 1). We defined a simple external forecast model and triggered (Step 2) the data exchange via SAP Cloud Integration Suite (Step 3). On the Microsoft Azure platform, we built a Python-based script to calculate a simple average (Step 5), which was exposed as an inference endpoint (Step 4). The results of the calculations were then sent back to SAP IBP as a response to the trigger, making this an online forecasting process.


Landscape Overview



SAP Integrated Business Planning


We created a planning area based on the sample planning model SAP6. We are using the Excel plugin to create a planning view with two standard key figures: Demand Planning Quantity and Statistical Forecast Quantity. In the planning models app, we created an external forecast model, configured to estimate 6 future months from a historical period of 12 months. We also defined the communication scenario with the endpoint of an iFlow running on an SAP Cloud Integration Suite instance. The setup closely follows the explanation in SAP Note 3170544.

When we connect to the SAP IBP system from the Excel plugin, we can see the external forecast model available under the Simulate menu item in the SAP IBP tab. We have some sample data that extends to September 2022.



SAP Cloud Integration Suite


When we simulate the external forecasting via the Excel plugin, a trigger is sent from the SAP IBP system to the configured endpoint. This endpoint refers to an iFlow built on an SAP Cloud Integration Suite instance. The flow gets the request ID of the call from the SAP IBP instance. Using the request ID, it retrieves the metadata and the forecast algorithm input needed for the Python logic. The iFlow also prepares the payload needed for the endpoint exposed on the Microsoft Azure Machine Learning platform, where the Python code is running. This is a synchronous call, which means the response from the forecast calculation is forwarded back to the SAP IBP system in the same flow.
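To make the mapping step more concrete, here is an illustrative sketch (written in Python for readability) of what such a merged payload could look like. The field names, such as RequestID, ForecastPeriods, AlgorithmDataInput, GroupID and HistoricalValues, are assumptions for this example; the actual structure is defined by the SAP IBP external forecast OData service and the mapping steps in the iFlow.

```python
# Illustrative only: the real structure comes from the SAP IBP external
# forecast OData service and the mapping steps in the iFlow.
payload = {
    "RequestID": "12345",        # taken from the trigger sent by SAP IBP
    "ForecastPeriods": 6,        # horizon configured in the external forecast model
    "AlgorithmDataInput": [
        {
            "GroupID": "PRODUCT_A@LOCATION_1",
            "HistoricalValues": [120, 135, 110, 150, 140, 130,
                                 125, 145, 138, 142, 128, 133],  # 12 historical months
        }
    ],
}
```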


iFlow on SAP Cloud Integration Suite


 

The call to the endpoint on Microsoft Azure Machine Learning is authenticated by the iFlow's Request-Reply adapter. It is also possible to use a standard OAuth client-credentials flow or certificates; for simplicity, we use the token generated by the Azure platform. Using a Content Modifier, we stored this token in an exchange property, and a Groovy script then sets it as the Bearer authorization header.
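For illustration, the Request-Reply call the iFlow makes can be sketched in Python roughly as follows. The scoring URL is a placeholder, the token is assumed to be supplied externally (just as the iFlow keeps it in an exchange property), and the payload stands in for the merged document sketched earlier.

```python
# A Python sketch of the synchronous call the iFlow's Request-Reply step performs.
# URL and token source are placeholders / assumptions, not values from this project.
import os
import requests

scoring_url = "https://<endpoint-name>.<region>.inference.ml.azure.com/score"
token = os.environ["AZUREML_ENDPOINT_TOKEN"]   # assumed: key/token generated by Azure

payload = {"RequestID": "12345", "ForecastPeriods": 6, "AlgorithmDataInput": []}
# ...in practice this is the merged metadata + algorithm input sketched earlier.

response = requests.post(
    scoring_url,
    json=payload,
    headers={"Authorization": f"Bearer {token}"},  # same header the Groovy script sets
    timeout=60,
)
response.raise_for_status()
forecast = response.json()   # returned synchronously to the iFlow and on to SAP IBP
```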

Microsoft Azure Machine Learning


Instead of describing how the platform works from an apprentice's point of view, I would rather point to the official documentation. Detailed tutorials on how to work with Azure Machine Learning are also available from Microsoft. But it makes sense to explain how the platform was used in the context of SAP Integrated Business Planning. On the Azure Machine Learning platform, one can use the Studio to author projects. Below is a screenshot from the Microsoft Azure Machine Learning Studio which highlights some of the main sections.


Microsoft Azure ML Studio


In this case, however, we use a custom Python script to compute the forecasts, hosted on a so-called Managed Online Endpoint provided by Azure Machine Learning. When we send the data from the Cloud Integration Suite, we merge the metadata and the algorithm input data into one JSON payload. That JSON document is then sent to the Managed Online Endpoint running our Python script, which computes the forecasts for us. There are multiple ways to develop and deploy the script, such as Azure ML Notebooks, JupyterLab, PyCharm or VS Code.
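As a rough sketch, a scoring script for a Managed Online Endpoint only needs an init() and a run() function; the one below computes a simple average over the historical values and repeats it over the forecast horizon. The payload field names are the same illustrative assumptions used above, not the exact structure of our productive script (see the repository linked below for the actual implementation).

```python
# score.py - a minimal sketch of a scoring script for an Azure ML Managed Online
# Endpoint. Field names such as "AlgorithmDataInput" and "HistoricalValues" are
# illustrative assumptions, not the productive payload structure.
import json


def init():
    # Nothing to load: the simple-average forecast needs no model artifact.
    pass


def run(raw_data):
    payload = json.loads(raw_data)
    horizon = int(payload.get("ForecastPeriods", 6))

    results = []
    for series in payload.get("AlgorithmDataInput", []):
        history = [float(v) for v in series.get("HistoricalValues", []) if v is not None]

        # Simple average of the available history, repeated over the horizon.
        average = sum(history) / len(history) if history else 0.0
        results.append({
            "GroupID": series.get("GroupID"),
            "ForecastValues": [average] * horizon,
        })

    # The response goes back to the iFlow, which maps it into SAP IBP.
    return {"AlgorithmDataOutput": results}
```

Because the logic is a plain average, init() has nothing to load; for a trained model, this is where the model artifact would be read before run() is called for each request.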



Visual Studio Code as an alternative


Once deployed, the script is started on the compute instance. Once testing is completed, you can submit the code to a compute cluster. The compute instance and the cluster are defined during the project setup in Azure Machine Learning. This is where an endpoint is generated with a valid OAuth token from the key vault. This token can be used by the caller to authenticate against the endpoint. For the full code and more details, check out this repo.
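For completeness, here is a minimal sketch of how such an endpoint and deployment could be created with the azure-ai-ml (v2) Python SDK. All names used here (endpoint name, instance type, base image, conda file and the placeholder subscription details) are assumptions for illustration and need to be replaced with values from your own workspace.

```python
# A minimal sketch of publishing score.py as a Managed Online Endpoint with the
# azure-ai-ml v2 SDK. Names and placeholders below are assumptions, not values
# from this project.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import (
    CodeConfiguration,
    Environment,
    ManagedOnlineDeployment,
    ManagedOnlineEndpoint,
)
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Endpoint with key-based authentication; the generated key is kept in the key vault.
endpoint = ManagedOnlineEndpoint(name="ibp-external-forecast", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Deployment that runs score.py on a small CPU instance. No model artifact is
# registered, since the simple-average logic lives entirely in the scoring script.
deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="ibp-external-forecast",
    code_configuration=CodeConfiguration(code="./src", scoring_script="score.py"),
    environment=Environment(
        image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest",
        conda_file="./src/conda.yml",
    ),
    instance_type="Standard_DS3_v2",
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()

# The key used as the bearer token by the iFlow can be read back like this:
keys = ml_client.online_endpoints.get_keys(name="ibp-external-forecast")
print(keys.primary_key)
```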



Endpoints generated on Microsoft Azure Machine Learning


From a usability perspective, it was possible to get the results of the forecast from this external algorithm in a synchronous manner. The results were updated in the SAP IBP instance via OData calls from the same iFlow that was triggered by the Excel planning view. As a reference, I have provided the iFlow, the Groovy scripts, the metadata, the Python script that implemented the external algorithm, as well as a simple JSON payload that was used, in this GitHub repository.

We are aware that this methodology might not be applicable to huge and complex data sets which require training a machine learning model. This is something we are currently exploring for our next blog.

I hope this gives the reader an idea of how simple it is to implement a synchronous inference process using Microsoft Azure Machine Learning Studio for an external forecasting model in SAP Integrated Business Planning. Using SAP Cloud Integration Suite, exchanging the payloads, transforming the data and authenticating the different services were quite straightforward.

A big thanks to Tamas K. from SAP and Timo K. from the Microsoft Azure ML team for creating the Python scripts and testing after working hours. Special thanks to Bartosz, Oliver, Rene and Lei for making things easy during the experiments.

Domnic Savio Benedict