
Architecture and deep-dive into leveraging Predictive Intelligence for S/4HANA

Part 2 of the blog series:

As a continuation of my previous blog, Leveraging Predictive Intelligence with S/4HANA, let us now do a deep dive into the architecture and the different approaches involved. Typically, in enterprise processes where the right information must be provided to the right person, at the right time and place, predictive algorithms must be integrated into these business processes. However, there is also a need for ad-hoc, exceptional, and exploratory use of these predictive algorithms to discover hidden insights and relationships in data.

Let us now focus on the different approaches and the architecture behind these approaches.


Use cases like forecasting, key influencer identification, trending, relationship analysis, or anomaly detection can be solved with algorithms such as regression, clustering, classification, or time series analysis. Usually these algorithms are not resource-intensive in terms of memory usage and CPU time. They can be implemented within the SAP S/4HANA stack, an approach called embedded ML (building into the core), where both the application data for model training and the ML-consuming business processes are located.
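To make this concrete, here is a minimal sketch in plain Python of the kind of lightweight algorithm these use cases involve: a least-squares trend forecast over a short series of application data. This is an illustration only, not the actual PAL/APL API, which is consumed via SQLScript or ABAP in a real S/4HANA system.

```python
def trend_forecast(series, horizon=1):
    """Fit y = a + b*t by ordinary least squares and extrapolate.

    A toy stand-in for the kind of lightweight time-series algorithm
    that embedded ML delegates to HANA's PAL/APL libraries.
    """
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den                    # slope of the fitted trend line
    a = y_mean - b * t_mean          # intercept
    return [a + b * (n - 1 + h) for h in range(1, horizon + 1)]

# Example: monthly sales quantities trending upward
print(trend_forecast([10, 12, 14, 16], horizon=2))  # → [18.0, 20.0]
```

Algorithms of this size run comfortably inside the transactional system, which is exactly why embedded ML keeps them next to the data.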

The embedded ML architecture is simple and powerful at the same time, as illustrated in the figure below. It is based on HANA ML and PAi (Predictive Analytics Integrator, now upgraded to ISLM; refer to the blog). HANA ML provides the required algorithms with the PAL (Predictive Analysis Library) and the APL (Automated Predictive Library), while PAi is in charge of the lifecycle management of ML models, their integration into ABAP, and the training of models on the customer side. Embedded ML has very low TCO (total cost of ownership) and TCD (total cost of development).

Use cases like image recognition, sentiment analysis, or language recognition require deep learning algorithms based on neural networks. For model training, these algorithms require huge amounts of data and CPU time. Therefore, the model training for these kinds of scenarios is moved out of the SAP S/4HANA stack to the Leonardo Foundation (now upgraded to AI Foundation) platform on SCP (SAP Cloud Platform), an approach called side-by-side ML (expanding around the core). The data required for these scenarios – images, audio files, video files, text documents, historical data – is stored not in SAP S/4HANA but in a big data solution on SCP. The Leonardo Foundation library complements the overall solution architecture where specific algorithms are not provided on the SAP S/4HANA stack, either because the classic methods (e.g., regression, classification) would consume too many resources of the transactional system, or because huge volumes of external data (e.g., Twitter, Facebook) are required for model training. Hence the SAP S/4HANA extensions consume the Leonardo Foundation services (now called AI Business Services) and HANA ML capabilities; as application data and business processes are hosted on SCP, the golden rule of bringing the algorithms to the data applies.


Building into the core (Embedded ML):

Now let us dive into the embedded ML approach in SAP S/4HANA. As shown in the figure below, the solution is based on two main architecture decisions: using CDS views and making use of the ML techniques provided by SAP HANA. The algorithms for embedded ML can be performance-intensive, as high volumes of application data must be processed. For performance optimization, the algorithms should be processed close to the data: SAP HANA provides the Predictive Analysis Library (PAL) and Automated Predictive Library (APL) application function libraries, which offer statistical and data mining algorithms; additional algorithms can be implemented as required. These algorithms take application data as input for model training. The trained models are exposed to business processes by wrapping them with CDS views. These CDS views can be combined with other VDM CDS views and then exposed to consumers. By consuming ML models through CDS views, existing content and concepts are reused, which results in a simple and very powerful solution architecture.

The purpose of the Predictive Analytics Integrator (PAi, now upgraded to ISLM) is to provide a common interface for the consumption of ML models, independent of the underlying predictive engine. PAi contains information about the installed SAP HANA libraries. It provides a repository for ML models that includes, for example, information concerning model types (e.g., regression, classification, or time series), model data sources (e.g., tables or views), model training data, and model quality figures. PAi also provides a pluggable infrastructure and adapters for automated (e.g., APL) and expert (e.g., PAL, R, or EML) libraries. Models created with SAP Predictive Analytics (automated or expert mode) can be imported into the PAi repository and executed.
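The repository-plus-common-interface idea behind PAi/ISLM can be sketched in plain Python. This is a conceptual illustration only: the real implementation is ABAP-based, and all class, scenario, and field names below are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class IntelligentScenario:
    """Hypothetical stand-in for a PAi/ISLM repository entry."""
    name: str
    model_type: str           # e.g. "regression", "classification", "time-series"
    data_source: str          # e.g. a CDS view or table name
    library: str              # e.g. "APL" (automated) or "PAL" (expert)
    predict: Optional[Callable] = None  # engine-specific scoring function

class ModelRepository:
    """Common interface for consumers, independent of the underlying engine."""
    def __init__(self):
        self._scenarios = {}

    def register(self, scenario):
        self._scenarios[scenario.name] = scenario

    def score(self, name, features):
        # The consumer neither knows nor cares which library trained the model.
        return self._scenarios[name].predict(features)

repo = ModelRepository()
repo.register(IntelligentScenario(
    name="late_payment_prediction", model_type="classification",
    data_source="I_BillingDocument", library="APL",
    predict=lambda f: "late" if f["days_overdue_avg"] > 10 else "on_time"))

print(repo.score("late_payment_prediction", {"days_overdue_avg": 14}))  # → late
```

The point of the sketch is the indirection: business code talks to the repository interface, so the underlying engine (APL, PAL, R, EML) can be swapped without touching the consumers.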

Embedded ML should be applied when the following criteria are met:

  • Business and ML logic reside on the SAP S/4HANA platform
  • The use case is simple, such as forecasting or trending, where the algorithms have low demands for data, RAM, and CPU time
  • The data located in SAP S/4HANA is sufficient for model training; no huge external training data is needed
  • The required algorithms are provided by HANA ML (e.g., PAL, APL, Text Analysis) and handled by PAi/HEMI in terms of ML model lifecycle management and ABAP integration

Expanding around the core (side-by-side ML):

While embedded ML targets scenarios where the business and ML logic reside in the SAP S/4HANA stack, the term side-by-side ML is used in the following use cases:

  • SAP S/4HANA ML app based on SCP: The SAP S/4HANA application and the corresponding business logic are based on SAP Cloud Platform. Such applications should consume the required ML services directly from Leonardo ML (now AI Foundation) or HANA ML, following the rule of bringing the algorithms to the data.
  • SAP S/4HANA ML app based on ABAP AS: The SAP S/4HANA applications and the corresponding business logic are based on the SAP S/4HANA stack. However, the required ML capabilities, e.g., image and language recognition or sentiment analysis, are not available on the S/4HANA stack. These features are consumed from Leonardo ML (now AI Foundation) remotely, based on the exchange of trained models.
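Conceptually, remote consumption boils down to an HTTP call from the S/4HANA extension to an inference endpoint on SCP. The sketch below only assembles the request without sending it; the endpoint URL, service path, and payload fields are invented for illustration, as each real service defines its own API.

```python
import json

def build_inference_request(service, payload, api_key):
    """Assemble (but do not send) an HTTP request for a hypothetical
    side-by-side ML inference service hosted on SCP."""
    return {
        "url": f"https://ml.example.hana.ondemand.com/api/v2/{service}",
        "headers": {
            "Authorization": f"Bearer {api_key}",   # service credentials
            "Content-Type": "application/json",
        },
        "body": json.dumps(payload),
    }

# Example: a business process asks a sentiment service to score a complaint text
req = build_inference_request(
    "text/sentiment", {"texts": ["Delivery was delayed again."]}, "demo-key")
print(req["url"])
```

Whatever HTTP client is used on the ABAP side, the shape is the same: authenticate, post the unstructured input, and feed the returned prediction back into the business process.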

Hence side-by-side ML scenarios are usually based on Leonardo ML (now AI Foundation, or SAP Data Intelligence, running on SAP Cloud Platform). The architecture of Leonardo ML is illustrated in the figure below. Leonardo ML is organized into three different development initiatives: Foundation, functional services, and business services.

  • The Leonardo Machine Learning Foundation (LMLF) defines a framework for creating and executing “intelligent” algorithms in the context of the overall SAP ecosystem.
  • The ML functional services run on top of the LMLF and provide ready-to-use implementations of a selected set of standard use cases, such as image classification and sentiment analysis.
  • Finally, the ML business services implement complex use cases that are typically tightly integrated with SAP application business processes in S/4HANA.

Side-by-side ML should be applied when the following criteria are met:

  • The ML logic resides on the SCP platform, while the business logic can be based on SAP S/4HANA or SCP
  • The use case is complex, such as image recognition or natural language processing, requiring, among others, neural networks with high demands for data, RAM, and CPU/GPU time
  • Huge volumes of external data are required for model training; the main focus is on processing unstructured data
  • The required algorithms are not provided by HANA ML, but by other libraries, e.g., TensorFlow or scikit-learn
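The "exchange of trained models" mentioned above can be illustrated with a toy example: a model is trained outside the core, serialized to a plain artifact, and applied locally. The nearest-centroid classifier below is deliberately trivial; in reality the training step would be a TensorFlow or scikit-learn job on the platform, and the artifact format would be service-specific.

```python
import json, math

def train_centroids(samples):
    """'Side-by-side' training step: compute per-class centroids
    from (hypothetical) external training data."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def export_model(centroids):
    """The trained model is exchanged as a plain artifact (JSON here)."""
    return json.dumps({"type": "nearest-centroid", "centroids": centroids})

def apply_model(artifact, features):
    """'Core' application step: load the artifact and score locally."""
    centroids = json.loads(artifact)["centroids"]
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

model = export_model(train_centroids(
    [([1.0, 1.0], "ok"), ([1.2, 0.8], "ok"), ([5.0, 5.0], "anomaly")]))
print(apply_model(model, [4.5, 4.8]))  # → anomaly
```

The split mirrors the side-by-side criteria: the heavy, data-hungry training runs on the platform, while the scoring step that the business process needs stays cheap.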


Extending the core:

Furthermore, you should be able to enhance and modify the predictive models embedded in SAP S/4HANA processes by leveraging SAP Analytics Cloud – Smart Predict; this is being evaluated and shall be communicated in 2021 or later. You are already able to create new predictive models using SAC Smart Predict and push them back into the SAP S/4HANA processes. In the meantime, with the upgrade of PAi to ISLM (Intelligent Scenario Lifecycle Management), updating and modifying models becomes possible. Please check the blog and blog series in this regard.

The below figure explains the different ways of leveraging end-to-end predictive scenarios:

In the next blog, we will discuss in more detail the different approaches and process flows involved in leveraging Predictive Intelligence.

Here are some quick links to the blogs in this series to give you a complete understanding of how Predictive Intelligence is infused into SAP S/4HANA.

Happy predicting the future!!

