By Venkata Raghu Banda

Architecture and deep-dive into leveraging Predictive Intelligence for S/4HANA

Part 2 of the blog series:

A podcast on the architecture concepts is here.

As a continuation of my previous blog, Leveraging Predictive Intelligence with S/4HANA, let us now do a deep-dive into the architecture and the different approaches involved. Typically, in enterprise processes where the right information must be provided to the right person at the right time and place, predictive algorithms must be integrated directly into those business processes. However, there is also a need for unforeseen, exceptional, and irregular use of these predictive algorithms to discover hidden insights and relationships in the data.

Let us now focus on the different approaches and the architecture behind these approaches.

Architecture:

Use cases like forecasting, key-influencer identification, trending, relationship analysis, or anomaly detection can be solved with algorithms such as regression, clustering, classification, or time-series analysis. These algorithms are usually not resource-intensive in terms of memory usage and CPU time. They can therefore be implemented within the SAP S/4HANA stack itself, an approach called embedded ML (building into the core), where both the application data for model training and the ML-consuming business processes are located.

The embedded ML architecture is simple and powerful at the same time, as illustrated in the figure below. It is based on HANA ML and PAi (Predictive Analytics Integrator, since upgraded to ISLM; refer to the blog). HANA ML provides the required algorithms through the PAL (Predictive Analysis Library) and the APL (Automated Predictive Library), while PAi is in charge of the lifecycle management of ML models, their integration into ABAP, and model training on the customer side. Embedded ML therefore has a very low TCO and TCD.
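To make the embedded approach concrete: PAL's real algorithms run inside SAP HANA, close to the application data, and the lightweight workloads they target are things like trend forecasts. The sketch below mimics that kind of workload in plain Python with an ordinary-least-squares trend forecast; it is an illustration of the technique only, not the actual PAL API.

```python
# Illustrative sketch only: PAL's real algorithms run inside SAP HANA.
# This mimics a simple trend forecast (ordinary least squares on a
# time index), the kind of lightweight workload embedded ML targets.

def fit_trend(series):
    """Fit y = a + b*t over t = 0..n-1 by ordinary least squares."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den
    a = y_mean - b * t_mean
    return a, b

def forecast(series, horizon):
    """Extrapolate the fitted trend `horizon` steps beyond the series."""
    a, b = fit_trend(series)
    n = len(series)
    return [a + b * (n + h) for h in range(horizon)]

monthly_sales = [100, 110, 120, 130, 140]
print(forecast(monthly_sales, 3))  # continues the +10/month trend
```

In the real architecture this computation happens in-database; only the model metadata and results surface to the ABAP layer.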

 

Use cases like image recognition, sentiment analysis, or language recognition require deep-learning algorithms based on neural networks. These algorithms require huge amounts of data and CPU time for model training. Therefore, model training for these kinds of scenarios is moved out of the SAP S/4HANA stack to the Leonardo Foundation (since upgraded to AI Foundation) platform on SAP Business Technology Platform (SAP BTP), an approach called side-by-side ML (expanding around the core). The data required for these scenarios, such as images, audio files, video files, text documents, and historical data, is stored not in SAP S/4HANA but in a big-data solution on SAP BTP. The AI Foundation library complements the overall solution architecture in cases where specific algorithms are not provided on the SAP S/4HANA stack, either because the classic methods (e.g., regression, classification) would consume too many resources of the transactional system, or because huge volumes of external data (e.g., Twitter, Facebook) are required for model training. The SAP S/4HANA extensions therefore consume the Leonardo Foundation services (now called AI Business Services) and HANA ML capabilities; since the application data and business processes are located on SAP BTP, the golden rule of bringing the algorithms to the data still applies.

Building into the core (Embedded ML):

Now let us dive into the embedded ML approach in SAP S/4HANA. As shown in the figure below, the solution is based on two main architecture decisions: using CDS views, and using the ML techniques provided by SAP HANA. The algorithms for embedded ML can be performance-intensive, as high volumes of application data must be processed. For performance optimization, the algorithms should be processed close to the data: SAP HANA provides the Predictive Analysis Library (PAL) and the Automated Predictive Library (APL), application function libraries that offer statistical and data-mining algorithms; additional algorithms can be implemented as required. These algorithms take application data as input for model training. The trained models are exposed to business processes by wrapping them with CDS views. These CDS views can be combined with other VDM CDS views and then exposed to consumers. By consuming ML models through CDS views, existing content and concepts are re-used, which results in a simple and very powerful solution architecture.

The purpose of the Predictive Analytics Integrator (PAi, since upgraded to ISLM) is to provide a common interface for the consumption of ML models, independent of the underlying predictive engine. PAi contains information about the installed SAP HANA libraries. It provides a repository for ML models that includes, for example, information on model types (e.g., regression, classification, or time-series), model data sources (e.g., tables or views), model training data, and model quality figures. PAi also provides a pluggable infrastructure and adapters for automated (e.g., APL) and expert (e.g., PAL, R, or EML) libraries. Models created with SAP Predictive Analytics (automated or expert mode) can be imported into the PAi repository and executed.
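The PAi/ISLM repository described above can be pictured as a registry keyed by model metadata (type, data source, library, quality figures), independent of the predictive engine. The Python sketch below is a loose analogy of that idea; none of the class or field names come from SAP, and the real repository lives inside the ABAP stack.

```python
# Loose analogy of the PAi/ISLM model repository described above:
# a registry recording model type, data source, library, and quality
# figures, independent of the underlying predictive engine.
# All names here are illustrative, not SAP APIs.

from dataclasses import dataclass, field

@dataclass
class MLModel:
    name: str
    model_type: str      # e.g. "regression", "classification", "time-series"
    data_source: str     # e.g. a table or CDS view name
    library: str         # e.g. "APL" (automated) or "PAL" (expert)
    quality: dict = field(default_factory=dict)  # e.g. {"mape": 0.08}

class ModelRepository:
    """Common interface for registering and looking up ML models."""
    def __init__(self):
        self._models = {}

    def register(self, model: MLModel):
        self._models[model.name] = model

    def get(self, name: str) -> MLModel:
        return self._models[name]

repo = ModelRepository()
repo.register(MLModel("demand_forecast", "time-series",
                      "I_SalesOrderItem", "APL", {"mape": 0.08}))
print(repo.get("demand_forecast").library)  # -> APL
```

The point of the analogy is that consumers look models up by name and metadata, never by engine-specific handles, which is what makes the engine pluggable.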

Embedded ML shall be applied when the following criteria are valid:

  • Business and ML logic reside on the SAP S/4HANA platform
  • The use case is simple, like forecasting or trending, where the algorithms have a low demand for data, RAM, and CPU time
  • The data located in SAP S/4HANA is sufficient for model training, with no need for huge external training data
  • The required algorithms are provided by HANA ML (e.g., PAL, APL, Text Analysis) and handled by ISLM (PAi)/HEMI in terms of ML model lifecycle management and ABAP integration

The following figure explains the architecture of embedded ML at a high level. Notice the SAP HANA layer, where the SQL views are built on the application tables and the ISLM repository accesses the ML models. The PAi framework has since been updated to the ISLM (Intelligent Scenario Lifecycle Management) framework, which we discuss in detail in part 6 of this blog series while explaining embedded ML. In the SAP S/4HANA layer, the CDS views are built on the SQL views from the SAP HANA layer, and the corresponding CDS views for the ML model are created as well. The analytical engine on top processes these results and delivers them to the built-in Fiori app for the consumer.

Expanding around the core (side-by-side ML):

While embedded ML targets scenarios where the business and ML logic reside in the SAP S/4HANA stack, the term side-by-side ML is used in the following use cases:

  • SAP S/4HANA ML app based on SAP BTP: The SAP S/4HANA application and the corresponding business logic are based on SAP Business Technology Platform. Such applications should consume the required ML services directly from Leonardo ML (now AI Foundation) or HANA ML, following the rule of bringing the algorithms to the data.
  • SAP S/4HANA ML app based on ABAP AS: The SAP S/4HANA applications and the corresponding business logic are based on the SAP S/4HANA stack. However, the required ML capabilities, e.g., image and language recognition or sentiment analysis, are not available on the S/4HANA stack. These features are consumed remotely from Leonardo ML (now AI Foundation), based on an exchange of trained models.
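The second case above boils down to the S/4HANA side calling a remote ML service over HTTP. The sketch below assembles such a call in plain Python; the endpoint, route, and payload shape are hypothetical stand-ins for whatever the actual service contract would be, and the request is built but deliberately not sent.

```python
# Hedged sketch of side-by-side consumption: business logic stays in
# S/4HANA while inference is delegated to a remote ML service on SAP BTP.
# The URL, route, and payload shape below are hypothetical.

import json
from urllib import request

def build_inference_request(base_url: str, token: str, text: str):
    """Assemble (but do not send) a POST request to a hypothetical
    sentiment-analysis service running side-by-side on SAP BTP."""
    payload = json.dumps({"document": text}).encode("utf-8")
    return request.Request(
        url=f"{base_url}/v1/sentiment",     # hypothetical route
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_inference_request("https://ml.example.com", "demo-token",
                              "Delivery was late but support was great.")
print(req.full_url)
```

In a real landscape the token would come from the platform's OAuth flow and the response would carry the model's prediction back into the S/4HANA business process.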

Hence, side-by-side ML scenarios are usually based on AI Foundation or SAP Data Intelligence running on SAP Business Technology Platform. The concept of AI Foundation is illustrated in the figure below in the form of building blocks, showing how it is leveraged across the SAP offerings.

  • AI Functions can be realized as AI Services (e.g. Document Information Extraction), via Packages (e.g. Generic Line Item Matching), or Code using frameworks of choice (e.g. TensorFlow, R).
  • Consumption of all AI Functions is unified by an SAP-governed AI API, regardless of whether they are deployed on SAP technology (SAP HANA, SAP Data Intelligence, SAP AI Core) or partner technology (e.g. Azure, GCP, AWS).
  • Management and operations of AI Functions (versioning, deployments, monitoring) can be handled across SAP via SAP AI Launchpad.
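The "one AI API over many runtimes" idea in the bullets above can be sketched as a single client interface with pluggable backends, where routing is configuration rather than caller code. All class and function names below are made up for illustration; this is not the actual SAP AI API.

```python
# Illustrative sketch of a unified AI API: callers use one interface
# while a given AI function may be served by different backends
# (SAP-hosted or hyperscaler). Names are invented for illustration.

from abc import ABC, abstractmethod

class AIBackend(ABC):
    @abstractmethod
    def infer(self, function: str, inputs: dict) -> dict: ...

class HanaBackend(AIBackend):
    def infer(self, function, inputs):
        return {"backend": "hana", "function": function}

class HyperscalerBackend(AIBackend):
    def infer(self, function, inputs):
        return {"backend": "hyperscaler", "function": function}

class AIAPI:
    """Single entry point; which backend serves which AI function
    is configuration, invisible to the caller."""
    def __init__(self, routes: dict):
        self._routes = routes

    def infer(self, function: str, inputs: dict) -> dict:
        return self._routes[function].infer(function, inputs)

api = AIAPI({"line-item-matching": HanaBackend(),
             "document-extraction": HyperscalerBackend()})
print(api.infer("document-extraction", {})["backend"])  # -> hyperscaler
```

The caller's code is identical for both functions; only the route table knows where each one actually runs, which is the property the SAP-governed AI API provides.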

Side-by-Side ML shall be applied when the following criteria are valid:

  • The ML logic resides on SAP BTP, while the business logic can be based on SAP S/4HANA or SAP BTP
  • The use case is complex, like image recognition or natural language processing, requiring, among others, neural networks with a high demand for data, RAM, and CPU/GPU time
  • Huge volumes of external data are required for model training, with the main focus on processing unstructured data
  • The required algorithms are not provided by HANA ML but by other libraries, e.g., TensorFlow or scikit-learn

Now that we understand the background of the AI Foundation used for side-by-side ML, let us briefly touch on the architecture of these side-by-side ML models. Notice that you can leverage one of three approaches: SAP Data Intelligence, SAP AI Business Services, or partner offerings based on hyperscaler platforms such as AWS, GCP, or Azure.

Continuing a bit further, I would also like to provide a brief architectural representation of how side-by-side ML models can be built exclusively with SAP Data Intelligence. The architecture below shows how ML models can be built using SAP DI, which leverages the pipeline engine and other data-science tools. The business logic stays in SAP S/4HANA, and the ML service is consumed by the SAP S/4HANA app as shown below. Part 7 of this blog series discusses in detail how these ML services are consumed in the context of SAP S/4HANA business processes.
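A Data Intelligence pipeline is, at its core, a chain of operators where each operator consumes the output of the previous one. The minimal sketch below captures that shape in plain Python; the operator names and the toy data are invented for illustration and have nothing to do with DI's actual operator catalog.

```python
# Minimal sketch in the spirit of SAP Data Intelligence's pipeline
# engine: a chain of operators, each transforming the previous output.
# Operators and data are invented for illustration.

def extract(_):
    return [3.0, 1.0, 4.0, 1.5]            # pretend: read from S/4HANA

def clean(values):
    return [v for v in values if v > 1.0]  # pretend: drop invalid rows

def score(values):
    return sum(values) / len(values)       # pretend: apply a trained model

def run_pipeline(operators, payload=None):
    """Run each operator in order, feeding each the previous result."""
    for op in operators:
        payload = op(payload)
    return payload

result = run_pipeline([extract, clean, score])
print(result)
```

In real DI pipelines the operators run as separate containers connected by typed ports, but the data-flow idea is the same: the graph, not the caller, defines how data moves from extraction to scoring.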

The following figure also explains how side-by-side ML models can leverage the re-usable AI Business Services; we shall explain this in detail in a later blog. As you can see, the AI Business Services utilize the underlying AI Foundation layer to retrieve the AI functions and build re-usable business services that are consumed by the SAP apps.

 

Extending the core:

Furthermore, you can extend the predictive models currently embedded in SAP S/4HANA processes by leveraging SAP Analytics Cloud's Smart Predict. You can already create new predictive models with SAC Smart Predict based on SAP S/4HANA business processes and visualize the predictions in SAP Analytics Cloud dashboards. In this approach, the business user has the added opportunity to try out or create templates or user stories based on the whitelisted CDS views from SAP S/4HANA. While some machine-learning functionality is already embedded in SAP S/4HANA business processes, in some instances the business user might simply want to create or showcase new BI dashboards based on the available functionality in SAP S/4HANA. A quick prediction or simulation leveraging the Smart Insights and Smart Predict functionality in SAP Analytics Cloud helps the business user present this functionality to end users and executive management. If some of this functionality needs to be embedded into the business process, a request can be raised and developed into the SAP S/4HANA business process as an embedded ML model/process.

Meanwhile, with the upgrade of PAi into ISLM (Intelligent Scenario Lifecycle Management), updating and modifying models is already possible with the embedded ISLM in S/4HANA. Please check the blog and blog series in this regard.

The below figure explains the different ways of leveraging end-to-end predictive scenarios using SAP Analytics Cloud:

In the next blog, we will discuss in more detail the different approaches and process flows involved in leveraging Predictive Intelligence.

Here are some quick links to the blogs in this series to give you a complete understanding of how Predictive Intelligence is infused into SAP S/4HANA.

Happy predicting the future!!

 
