Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Chandni
Product and Topic Expert

Introduction

We are all aware of SAP Analytics Cloud's (SAC) powerful predictive capabilities, which help generate Time Series, Classification, and Regression forecasts. But ML enthusiasts who want control over the prediction algorithms, and who want to make forecasts even more robust without giving up what SAC offers, need flexibility and openness for seamless SAC integration. The SAC Data Export and Import Services can be an aid in these scenarios.

SAC offers various REST APIs and OData endpoints that help end users interact with SAC objects. You can find details about all the APIs at the SAP API Hub. In this blog, we will focus on the SAC Data Export Service (DES) and Data Import Service (DIS).

Scenario

A waste management organisation is looking for ways to predict waste generation in the coming quarters. They have waste statistics for different waste types, but looking at the Recycling Rate (Waste Recycled / Waste Generated), plastic waste recycling is not up to the company-defined thresholds. Before the company takes any action on recycling techniques, they want to predict plastic waste generation for 2023Q2-2024Q1 across the 5 regions they operate in.
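As a quick illustration of the Recycling Rate formula above, here is a minimal sketch with made-up numbers; the column names are hypothetical and not taken from the actual "Waste Management" model:

```python
import pandas as pd

# Hypothetical sample of waste statistics per region (illustrative values only)
stats = pd.DataFrame({
    "Region": ["Central", "North"],
    "Waste_Generated": [12000.0, 9500.0],
    "Waste_Recycled": [3100.0, 2800.0],
})

# Recycling Rate = Waste Recycled / Waste Generated
stats["Recycling_Rate"] = stats["Waste_Recycled"] / stats["Waste_Generated"]
print(stats)
```

A region whose rate falls below the company-defined threshold would be the candidate for the forecasting exercise that follows.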

[Image: Chandni_2-1709726208758.png]

Let's look at the model "Waste Management" that will be used going forward.

[Image: Chandni_0-1709725447444.png]

What to achieve?

Predict plastic waste generation for the period 2023Q2-2024Q1 using external prediction engines, and bring the predicted data back to SAC via the Data Export and Import Services.

How to achieve?

For this scenario, we are leveraging The SAP Analytics Cloud Model REST API Wrapper, a Python wrapper for the SAC model import and export APIs.
To start, you need to import the required modules, followed by connecting to SAC using an OAuth connection. If you want to learn how to create OAuth clients in SAC, please refer to OAuth clients.

import json
import warnings

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas import json_normalize
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.api import ExponentialSmoothing, Holt

import globalConfig
import sacapi
from sacapi import SACConnection

globalConfig holds all the URLs required to make a connection with SAC. Once the connection is successful, we will call the provider API (Data Export) to get the model ID for the model "Waste Management".
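Since globalConfig is just a plain Python module, a minimal sketch of what it might contain is shown below; the attribute names match the calls in the code that follows, while the values are placeholders you replace with your own tenant's details:

```python
# globalConfig.py -- a minimal sketch; values are placeholders, not real credentials
tenantName = "mytenant"          # name of your SAC tenant
dataCenter = "eu10"              # data center the tenant runs on
clientId = "sb-example-client"   # OAuth client ID created in SAC
clientSecret = "<client-secret>" # OAuth client secret from the same client
```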

# Connect to SAC using the OAuth client credentials held in globalConfig
sac = sacapi.SACConnection(globalConfig.tenantName, globalConfig.dataCenter)
sac.connect(globalConfig.clientId, globalConfig.clientSecret)

# Look up the model ID of the "Waste Management" model via the provider API
modelId = sac.searchProviders('Waste Management')['Waste Management']
print(f"model Id - {modelId}")

Once we get the model id, we can check the dimensions available in the model.

metaData = sac.getModelMetadata(modelId)
dimensionKeys = metaData.dimensions.keys()
print([key for key in dimensionKeys])

We can apply filters to focus on the relevant dataset. In my case, I am restricting the waste type to "Plastics" and taking one region at a time. I am also taking training data from 2018 onwards rather than the whole dataset. You can apply filters based on your requirements.

sac.addLogicalFilter(modelId, 'Waste_Type', 'Plastics', sac.filterOperators.EQUAL)
sac.addLogicalFilter(modelId, 'Date', '201801', sac.filterOperators.GREATER_THAN_OR_EQUAL)
sac.setFilterOrderBy(modelId, 'Date', 'asc')
sac.addLogicalFilter(modelId, 'Region', 'Central', sac.filterOperators.EQUAL)
print('Filters are applied!')

Once the filters are applied, we can get the fact data from the model and convert it to a DataFrame for further processing. We also have to decide which version will be used for the prediction; here I am using the Forecast version.

factdata = sac.getFactData(metaData)
dataFrame = pd.DataFrame.from_records(factdata)
publicBaselineDataFrame = dataFrame[dataFrame['Version']=='public.Forecast']

The DataFrame can now be used for generating predictions; make sure the expected data format is maintained. In my case, I am using Holt's exponential smoothing method with a damped trend (from the Holt-Winters family) for time series forecasting. To fit our problem statement, we will get predictions for the 12 months covering 2023Q2-2024Q1.

# publicBaselineDataFrameSeries is the measure column of publicBaselineDataFrame
# indexed by date; its exact construction depends on your model's column names
fit = Holt(publicBaselineDataFrameSeries, damped_trend=True,
           initialization_method="estimated").fit(smoothing_level=0.8, smoothing_trend=0.2)
fcast = fit.forecast(12)

Output:

2023-04     7625.740556
2023-05     7865.936682
2023-06     8103.730847
2023-07     8339.147070
2023-08     8572.209131
2023-09     8802.940572
2023-10     9031.364698
2023-11     9257.504582
2023-12     9481.383068
2024-01     9703.022769
2024-02     9922.446073
2024-03    10139.675144
Freq: M, dtype: float64

Before pushing the data to our SAC model, we need to make sure that the DataFrame has corresponding data for all the dimensions in the model. For my scenario, the dates should be in the YYYYMM format accepted by SAC.
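One way to reshape the forecast series into such a DataFrame is sketched below. The dimension and measure names are assumptions based on the filters used earlier, and a small stand-in series replaces the real `fcast` so the snippet is self-contained:

```python
import pandas as pd

# Stand-in for the 12-month forecast series produced by the Holt fit above
idx = pd.period_range("2023-04", periods=12, freq="M")
fcast = pd.Series(range(12), index=idx, dtype=float)

# Build records matching the model's dimensions; the dimension and measure
# names here are assumptions, not taken from the actual model definition
fDataFrame = pd.DataFrame({
    "Version": "public.Forecast",
    "Date": fcast.index.strftime("%Y%m"),  # YYYYMM format accepted by SAC
    "Waste_Type": "Plastics",
    "Region": "Central",
    "Waste_Generated": fcast.values,
})
print(fDataFrame.head())
```

With each dimension present in every record, the upload step below can map the rows onto the model's fact table.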

# Convert the forecast DataFrame into a list of records and push the
# fact data to the model via the Data Import Service
forecast_dict = fDataFrame.to_dict(orient='records')
sac.upload(metaData, forecast_dict, factOnly=True)
print("Records pushed to SAC Model")

If the data format is correct and there are no missing values for the dimensions, the data will be imported into the SAC model using the Data Import Service. You can also plot the predicted data using the matplotlib library.
Please repeat the steps in case you need to predict for different datasets by applying filters. Once the data is imported into the model, refresh your story to see the predicted data.
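A minimal plotting sketch is shown below; the two short stand-in series replace the real historical data and `fcast` output so the snippet runs on its own:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt

# Stand-ins for the historical series and the Holt forecast output
history = pd.Series([7000.0, 7200.0, 7400.0],
                    index=pd.period_range("2023-01", periods=3, freq="M"))
fcast = pd.Series([7600.0, 7800.0],
                  index=pd.period_range("2023-04", periods=2, freq="M"))

# Plot actuals and forecast on the same axes
fig, ax = plt.subplots()
history.plot(ax=ax, label="Actuals")
fcast.plot(ax=ax, label="Forecast", linestyle="--")
ax.set_title("Plastic waste generation - Central region")
ax.set_ylabel("Waste generated")
ax.legend()
fig.savefig("forecast.png")
```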

[Image: Chandni_1-1709726173979.png]

Reference

For detailed steps and a deeper understanding, please refer to the Devtoberfest session, which explains a much more extensive use of DES and DIS.

Thanks for reading! I hope you find this blog helpful. Please share your thoughts and questions via the comments, and follow for future blogs on SAP Analytics Cloud.
