
SAP Predictive Asset Insights, machine learning engine extension

Please note that this way of enabling Machine Learning extensibility is not actively developed further.

SAP Intelligent Asset Management solutions enable digital transformation of asset management using machine learning and real-time data from IoT-enabled smart assets to bring intelligence to your entire supply chain. You can continuously improve processes and predict outcomes, collaborate across networks and offer differentiated service, while generating new revenue streams and developing new ways to compete and disrupt your industry (see https://www.sap.com/about/announcement/intelligent-asset-management.html).
SAP Intelligent Asset Management is the name for the group of five cloud-based SAP offerings within the overall EAM portfolio: SAP Asset Intelligence Network, SAP Asset Strategy and Performance Management, SAP Predictive Asset Insights, SAP Predictive Engineering Insights, and SAP Mobile Asset Management.

SAP Predictive Asset Insights combines sensor data with business information to help improve service profitability, reduce maintenance costs and increase asset availability.

Machine learning is used as part of SAP Predictive Asset Insights to detect failure patterns in the collected sensor data and thus raise alerts about necessary maintenance. To this end, the solution contains a variety of standard algorithms which were developed in close collaboration with our customers and are suitable for a generic approach to predictive maintenance, supporting all asset types.

However, the algorithms offered as part of the standard are not suitable for all use cases – they focus on Anomaly Detection and Failure Prediction. To enable addressing additional use cases and to allow customers and partners to use their existing – potentially asset-specific – algorithms in conjunction with SAP Predictive Asset Insights, a way of extending the standard functionality is required.

Overview

To address this need, the SAP Predictive Asset Insights, machine learning engine extension has been released. It comprises a set of client tools that extract a combined view of the sensor and maintenance data and write the analysis results back into SAP Predictive Asset Insights. The tool can be used from the command line, and wrappers for R and Python allow it to be embedded directly in the user's analysis scripts.

 

The process followed when using the SAP Predictive Asset Insights, machine learning engine extension is depicted in Figure 1. Using the provided client tools, the user can extract a combination of sensor/indicator and master data from the SAP Predictive Asset Insights backend. The starting point is a dataset definition stating for which equipment or equipment model data is to be retrieved. The dataset definition also specifies which indicators to read and how to aggregate their values. Optionally, it may contain a failure flag indicating at which points in time the equipment experienced a failure.

Based on this definition together with the time frame for which data should be extracted, a dataset can be retrieved using the client tool.
The extracted dataset may be used for exploratory analysis, for developing and testing a new algorithm, or productively for applying a previously defined model. Using the tool, the resulting scores (time series linked to an equipment) can be written back to SAP Predictive Asset Insights. Inside SAP Predictive Asset Insights the results are modelled as indicators, so it is possible to define rules on top of the results to detect threshold violations and trigger follow-up actions such as creating a notification or alert.

Technical Details

The SAP Predictive Asset Insights, machine learning engine extension offers three different interfaces: a Java command line tool, an R package which offers a binding to R data.table, and a Python package offering a binding to pandas dataframes.

In this blog post, examples will focus on the Python interface. For more details on the other interfaces as well as installation instructions, please refer to the README file that is provided as part of the delivery.

Credentials

Before downloading or uploading data, you have to establish a connection to the SAP Predictive Asset Insights system. This requires API access keys for the underlying Asset Central Foundation and SAP Internet of Things (SAP IoT). You can create the service keys from the Cloud Foundry marketplace in the subdomain that is subscribed to Asset Central Foundation and SAP IoT using the commands cf marketplace, cf create-service, and cf create-service-key, and retrieve them using cf service-key.
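Service keys are returned as JSON documents. As a minimal sketch (the field names and values below are placeholders, not the actual broker output), you could parse such a key in Python before handing it to the connector:

```python
import json

# Hypothetical service-key JSON as returned by `cf service-key`;
# the actual structure and field names depend on the service broker.
raw_key = '''
{
  "endpoints": {"asset-central-service": "https://ac.example.com"},
  "uaa": {"clientid": "sb-example",
          "clientsecret": "secret",
          "url": "https://auth.example.com"}
}
'''

# Parse the key so individual fields can be inspected or passed on.
ac_key = json.loads(raw_key)
print(ac_key["uaa"]["clientid"])
```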

Establish connection

To establish the connection to the backend you need the following access keys: AC_SERVICE_KEY, IOT_AE_SERVICE_KEY, PDMS_SERVICE_KEY. You may then use the following code snippet to configure access:


from mle_connector import MLEConnector

connector = MLEConnector(jar_path="mle-cli.jar",
                         asset_central_credentials=ac_key,
                         iot_ae_credentials=iot_key,
                         pdms_credential=pdms_key)

Retrieve data

When collecting features from the system, the connector requires a dataset definition (dataset). This dataset is the JSON representation of the dataset in the ‘Health Indicator Data Set Configuration’.

You can use the UI to create a dataset with your data selection and aggregations and copy the JSON to your Python or R IDE.

To obtain the dataset JSON, use this URL:

https://<IAM Launchpad URL>/comsapdsciamhic.comsappdmsdssdataSetConfigs/mle/api/v2/dataset-configs/<dataset configuration name>

If, for example, your dataset is named test123, visit the following URL,

https://foo-bar-unified.iam.cfapps.eu10.hana.ondemand.com/comsapdsciamhic.comsappdmsdssdataSetConfigs/mle/api/v2/dataset-configs/test123

and copy the JSON to Python or R.
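If you prefer to assemble the URL programmatically, this sketch simply concatenates the launchpad host (the host below is the placeholder from the example above) with the dataset name:

```python
# Assemble the dataset-config URL from the launchpad host and the
# dataset name; the host is a placeholder from the example above.
launchpad_url = "https://foo-bar-unified.iam.cfapps.eu10.hana.ondemand.com"
dataset_name = "test123"

config_url = (
    f"{launchpad_url}"
    "/comsapdsciamhic.comsappdmsdssdataSetConfigs"
    f"/mle/api/v2/dataset-configs/{dataset_name}"
)
print(config_url)
```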

dataset = '{"name":"n",
"description":"d",
"equipmentModelName":"200 Series",
"equipmentModelId":"9E2E",
"nullValueStrategy":"Ignore",
"features": [{"templateId": "GH0100304AEFE7A616005E02C64AE887", "indicatorGroup":"Rotating_Equipment_Measurements", "indicatorGroupId": "SA0100304AEFE7A616005E02C64AE827",
"indicator":"Temperature", "indicatorId": "IA0100304AEFE7A616005E02C64AE998",
"aggregateFunction":"AVG","bucketSizeMultiple":1,"bucketOffsetMultiple":0}],
"duration":"PT2M"}'

data = connector.collect("2018-01-01T00:00:00Z", "2019-10-01T00:00:00Z", dataset)

The result is a table with the following structure:

Equipment, EquipmentModel, Timestamp, Rotating_Equipment_Measurements.Temperature_AVG_PT2M

A sample dataset with Alerts as Labels looks like this:

{"name":"n",
"description":"",
"equipmentModelName":"n",
"equipmentModelId":"1CEE3",
"nullValueStrategy":"Ignore",
"features": [{"templateId": "GH0100304AEFE7A616005E02C64AE887", "indicatorGroup":"Rotating_Equipment_Measurements", "indicatorGroupId": "SA0100304AEFE7A616005E02C64AE827",
"indicator":"Temperature", "indicatorId": "IA0100304AEFE7A616005E02C64AE998",
"aggregateFunction":"AVG","bucketSizeMultiple":1,"bucketOffsetMultiple":0}],
"labels":{"predictionWindowMultiple":1,"leadTime":"PT5H","alerts":[{"type":"EDBEAF86D3D547C58E46DCBDA0B285BE", "severity":1}]},
"duration":"PT2M"}
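Stray commas or unbalanced braces in a hand-edited definition tend to surface only as server errors, so it can help to round-trip the JSON through Python's json module before sending it. The definition below is a trimmed sample using the IDs from the example above:

```python
import json

# Sanity-check a dataset definition before sending it: json.loads
# rejects stray commas and unbalanced braces immediately.
dataset = '''{"name":"n",
 "equipmentModelId":"1CEE3",
 "nullValueStrategy":"Ignore",
 "features":[{"templateId":"GH0100304AEFE7A616005E02C64AE887",
   "indicatorGroup":"Rotating_Equipment_Measurements",
   "indicator":"Temperature",
   "aggregateFunction":"AVG","bucketSizeMultiple":1,"bucketOffsetMultiple":0}],
 "labels":{"predictionWindowMultiple":1,"leadTime":"PT5H",
   "alerts":[{"type":"EDBEAF86D3D547C58E46DCBDA0B285BE","severity":1}]},
 "duration":"PT2M"}'''

parsed = json.loads(dataset)     # raises ValueError on malformed JSON
print(sorted(parsed["labels"]))  # → ['alerts', 'leadTime', 'predictionWindowMultiple']
```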

Once you have retrieved the data, you might use it in your environment for exploration purposes, for training or for scoring a model.
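For illustration only, here is a deliberately naive "model" (an absolute z-score over the aggregated temperature values) standing in for whatever algorithm you would actually train; it shows only the shape of the scoring step, not a recommended method:

```python
from statistics import mean, stdev

# Naive anomaly score: absolute z-score of each aggregated reading.
# A stand-in for a real trained model, for illustration only.
def score_series(values):
    mu, sigma = mean(values), stdev(values)
    return [abs(v - mu) / sigma for v in values]

temps = [20.1, 20.3, 19.9, 20.2, 35.0]   # last bucket looks anomalous
scores = score_series(temps)
print(scores[-1] > scores[0])
```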

Persist Scores

Once you have finished your analysis, you may upload the results to SAP Predictive Asset Insights.
In this example, scores is a pandas dataframe with the columns Equipment (string), EquipmentModel (string), Timestamp (datetime64[ns, UTC]), and score (float64).

The mapping defines how each dataframe column maps to an indicator.

mapping = [{"name": "score",
            "templateId": "GH0100304AEFE7A616005E02C64AE887",
            "indicatorGroup": "Health_Indicators",
            "indicatorGroupId": "SA0100304AEFE7A616005E02C64AE827",
            "indicator": "Health_Score",
            "indicatorId": "IA0100304AEFE7A616005E02C64AE998"}]

connector.persist(scores, mapping)

The persist method has no return value and throws a ValueError in case something goes wrong.

 

Credentials Security

You use API access keys to establish a connection to the SAP Predictive Asset Insights system in order to download and upload data (for more information, see the Credentials section). The authentication of these keys is handled by the SAP Business Technology Platform and we achieve this authentication by binding our service to the User Account and Authentication (UAA) service.

You are responsible for protecting these keys and for securing the machines on which they are stored, to prevent unauthorized or illegal use.

Keep in mind that you may be working on devices that are not fully under the control of SAP or your company's IT department, and that security features such as screen lock or storage media encryption work differently across devices. Store the keys only on a secure device that, in case of theft or loss, is protected by appropriate encryption mechanisms.

Otherwise, if attackers gain control of your PC, they can bypass protection measures, access the keys and unauthorized data, change your source code, and potentially perform a denial-of-service (DoS) attack.

Conclusion

Using the newly released SAP Predictive Asset Insights, machine learning engine extension you can extend the currently available machine learning capabilities and thus include your own algorithms and analysis methods into the enhanced maintenance processes enabled by SAP Predictive Asset Insights.

Resources

The SAP Predictive Asset Insights, machine learning engine extension can be obtained here.
You can find more detailed documentation of the functionality in the README provided as part of the delivery.
We also plan to deliver another blog post with an end-to-end example of the process.
