Open source Machine Learning applied on SAP ERP Central Component
After reading the article “How to use machine learning for anomaly detection and condition monitoring”, completing the openSAP courses “SAP Data Intelligence for Enterprise AI” and “SAP Leonardo – Enabling the Intelligent Enterprise“, participating in Kaggle competitions and working for many years with continually growing SAP landscapes, I wondered how to use open source Machine Learning (ML) if a company is anchored to SAP ERP Central Component or even earlier versions. I sketched the following architecture.
The goal of ML is to solve problems of growing complexity. A problem came to mind that I have seen over time, from SAP R/3 Release 4.0 to SAP ECC 6.0: in materials management, severe problems arise from time to time from incorrect postings of material documents. Most often users add or drop a zero in a quantity or price, or make other severe mistakes. The sooner a mistake is found, the easier it is to reverse and correct all linked transactions. A standard way to prevent such problems is to create a validation, or to broadcast a custom report based on deterministic rules covering all known cases. This is an ongoing process, because it is difficult to predict every case from the beginning. When internal audit detects and reports new mistakes, analysts and developers have to analyze the findings and add them to the validation as additional rules.
Even if your company does not have an entire SAP S/4HANA landscape with SAP Data Intelligence and SAP Leonardo, there are ways to start a proof of concept with open source ML that detects mistakes using anomaly detection. When the time comes, transfer the concept to a native SAP data lake and scale it to a real-time pipeline model.
The architecture for the proof of concept is as follows:
- Develop an ABAP program to save historical (posted) material document data for the relevant fields (features) in table ZTAB_FACT.
- Develop an ABAP Remote Function Call (RFC) ZRFC_FACT that returns the data from table ZTAB_FACT.
- Create, train, validate and save a Python ML model ml_material_anomaly for anomaly detection, which calls ZRFC_FACT to read the training data from table ZTAB_FACT.
- Develop an ABAP program that periodically saves newly posted material document data for the same fields (features) in table ZTAB_ACTUAL.
- Develop an RFC ZRFC_ACTUAL that returns the data from table ZTAB_ACTUAL.
- Develop an RFC ZRFC_PREDICTED, which saves the predicted data in table ZTAB_PREDICTED.
- Develop a Python program that runs on a server every hour, calls ZRFC_ACTUAL to read data from table ZTAB_ACTUAL, executes the ML model ml_material_anomaly to detect (predict) anomalies, and calls ZRFC_PREDICTED to save the results in table ZTAB_PREDICTED.
- Develop a report that displays the detected anomalies from table ZTAB_PREDICTED.
- Develop an ABAP report that broadcasts the detected anomalies from table ZTAB_PREDICTED to a distribution list.
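The hourly Python scoring job can be sketched as below. This is a minimal illustration, not the actual implementation: the RFC calls to ZRFC_ACTUAL and ZRFC_PREDICTED are stubbed with sample rows (in a real landscape they would go through pyrfc and a connection to the backend system), and a simple modified z-score check stands in for the trained ml_material_anomaly model. The field names and document numbers are invented for the example.

```python
# Sketch of the hourly scoring job from the architecture above.
# In a real system, fetch_actual() would call ZRFC_ACTUAL via pyrfc, e.g.
#   conn = pyrfc.Connection(ashost=..., sysnr=..., client=..., user=..., passwd=...)
#   rows = conn.call("ZRFC_ACTUAL")
# and save_predicted() would call ZRFC_PREDICTED to fill ZTAB_PREDICTED.
from statistics import median

def fetch_actual():
    """Stub for ZRFC_ACTUAL: recently posted material document lines."""
    return [
        {"mblnr": "5000000001", "qty": 100.0,  "price": 12.5},
        {"mblnr": "5000000002", "qty": 105.0,  "price": 12.4},
        {"mblnr": "5000000003", "qty": 98.0,   "price": 12.6},
        {"mblnr": "5000000004", "qty": 1000.0, "price": 12.5},  # extra-zero typo
    ]

def detect_anomalies(rows, field, threshold=3.5):
    """Flag rows whose value deviates strongly from the median
    (modified z-score). A trained model such as ml_material_anomaly
    would replace this stand-in rule."""
    values = [r[field] for r in rows]
    med = median(values)
    mad = median(abs(v - med) for v in values) or 1.0  # median absolute deviation
    flagged = []
    for r in rows:
        score = 0.6745 * (r[field] - med) / mad
        if abs(score) > threshold:
            flagged.append({**r, "score": round(score, 2)})
    return flagged

def save_predicted(flagged):
    """Stub for ZRFC_PREDICTED: would persist results in ZTAB_PREDICTED."""
    for row in flagged:
        print(f"anomaly: {row['mblnr']} qty={row['qty']} score={row['score']}")

anomalies = detect_anomalies(fetch_actual(), "qty")
save_predicted(anomalies)
```

With these sample rows, only the document with the extra zero in the quantity is flagged; the downstream ABAP report would then display and broadcast it.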
One can apply this idea to other domains; for payment orders, for instance, it could provide detection of mistakes or even fraud.
To facilitate consumption, the RFCs can be exposed as OData RESTful APIs.
SAP Netweaver Gateway comes with design-time tools that facilitate modeling OData services for consumption. These tools improve the user experience and provide automatic connectivity to the SAP backend, which reduces development effort and improves productivity.
SAP Netweaver Gateway sits on top of the existing SAP backend infrastructure. A set of add-on components needs to be installed on the SAP backend system to enable SAP Netweaver Gateway services. It is this architecture that enables consumption of SAP data through a variety of interfaces such as BAPIs and RFCs.
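Once an RFC such as ZRFC_ACTUAL is exposed through SAP Netweaver Gateway as an OData service, the Python side can read it over plain HTTP instead of an RFC connection. The sketch below only illustrates the shape of such a call: the service path ZMM_ANOMALY_SRV, the entity set ActualDocSet and the field names are hypothetical assumptions, and the HTTP request itself is replaced by a sample OData V2 JSON response body.

```python
# Hypothetical OData consumption of an RFC exposed via SAP Gateway.
# Service name, entity set and fields are illustrative, not a real service.
import json

BASE = "https://sapgw.example.com/sap/opu/odata/sap/ZMM_ANOMALY_SRV"

def build_url(entity_set, top=100):
    """Compose an OData query URL with $format and $top query options."""
    return f"{BASE}/{entity_set}?$format=json&$top={top}"

def parse_results(payload):
    """Extract entity rows from an OData V2 JSON response body
    (results sit under the 'd' wrapper object)."""
    return json.loads(payload)["d"]["results"]

# Sample response body in the OData V2 JSON shape Gateway returns:
sample = json.dumps({"d": {"results": [
    {"Mblnr": "5000000001", "Qty": "100.000", "Price": "12.50"},
]}})

url = build_url("ActualDocSet", top=50)   # what an HTTP GET would request
rows = parse_results(sample)              # what the client would parse
```

In production the GET request would also carry authentication and, for write access, a CSRF token; those concerns are omitted here for brevity.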
Hi Sergio, that's an interesting idea. I'm not so sure about allowing outside applications to manipulate data in the database via RFC. Wouldn't a simple REST API provided by the Python model, with the prediction results requested at runtime of your ABAP report, be a bit more elegant? Or did you want to avoid creating a database for it?
SAP Netweaver Gateway works with RFC.
I added: SAP Netweaver Gateway comes with design-time tools to facilitate modeling OData services for consumption.
Hello Sergiu Iatco,
did you make any progress on your project?
I am sketching a proof of concept, and if the data and model are promising I will move on to development.