SAP Analytics Cloud: Live Smart Predict with SAP HANA
In this blog I will describe the steps needed to use SAC Live Smart Predict (also known as Live Predict), which is now available with SAP Analytics Cloud (SAC). It allows you to use Smart Predict with live (remote) datasets that reside on-premise in SAP HANA. The data is not imported into SAC; it stays on-premise.
For this HANA connection you require the SAP Cloud Connector (SCC), an SAC data repository, and an SAC (live) dataset. These are required in addition to the SAP HANA CORS (HTTPS) configuration that I have described previously.
The official SAP documentation for this feature can be found here, including an architecture diagram.
- SAP HANA (on-premise) with the Automated Predictive Library (APL)
- SAP Cloud Connector (you will install this shortly)
- SAP Cloud Platform Account
- SAP Analytics Cloud CF Tenant (us10, eu10, jp10) and not Neo (us1, eu1, jp1)
- Link SAC Tenant to SAP Cloud Platform Account
- Verify SAP HANA (on-premise) has the Automated Predictive Library installed
- Install the SAP Cloud Connector
- Configure the SAP Cloud Connector
- Add SAC Data Repository
- Create SAC Reference to Dataset
- Create Predictive Scenario
1. Link SAC Tenant to SAP Cloud Platform Account
In the SAC environment you need to link the tenant to a SAP Cloud Platform user account.
This is done simply by entering your username (email) under System Administration, Datasource Configuration.
Once complete, you will see a Subaccount and Region Host; these will be entered into the Cloud Connector configuration later.
2. Verify SAP HANA (on-premise) has the Automated Predictive Library installed
Within HANA, the Automated Predictive Library (APL) version 4, release 1906 or higher, is required. You can verify this with the SQL call statement below.
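As a quick check, the APL PING procedure returns the installed version information. This assumes APL was installed into the standard SAP_PA_APL schema; adjust the schema name if your installation differs.

```sql
-- Return the installed APL version details
-- (assumes the standard SAP_PA_APL installation schema)
call "SAP_PA_APL"."sap.pa.apl.base::PING"(?);
```

Look for the APL version entries in the result set and confirm the release is 1906 or higher.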
If you don’t have this installed, are missing permissions, or don’t have the correct version, you should fix that first. I have previously described How to Install the Automated Predictive Library, but that was for APL v1.1, so it’s probably best to check Andreas’s blog, Installing the Automated Predictive Library (APL) on SAP HANA Express, which includes an APL v4 update.
3. Install the SAP Cloud Connector
For simplicity I installed the Linux version of the SAP Cloud Connector (SCC) on the HANA box.
You download the SCC from the SAP Development Tools site
The Cloud Connector requires a JVM; to ensure compatibility I downloaded the SAP JVM.
You should now have two files downloaded onto the Linux box.
Unzip the JVM by extracting it to the target directory.
Check if there is a JAVA_HOME already set
Install the Cloud Connector package
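The steps above can be sketched as follows; the archive and package file names shown here are examples and depend on the versions you downloaded:

```shell
## Extract the SAP JVM to a target directory (file name is an example)
tar -xzf sapjvm-8.x-linux-x64.tar.gz -C /opt/sap/

## Check whether JAVA_HOME is already set; point it at the SAP JVM if not
echo $JAVA_HOME
export JAVA_HOME=/opt/sap/sapjvm_8

## Install the Cloud Connector RPM package
rpm -ivh ./com.sap.scc-ui-2.12.1-5.x86_64.rpm
```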
If you see any errors during installation, it is easiest to correct them by uninstalling and re-installing with rpm commands similar to the following.
```shell
## Query for scc package
rpm -qa | grep scc
## Erase / Uninstall
rpm -e com.sap.scc-ui-2.12.1-5.x86_64
## Install Package
rpm -ivh ./com.sap.scc-ui-2.12.1-5.x86_64.rpm
```
The SAP Cloud Connector (scc_daemon) can be controlled with the systemctl commands.
```shell
systemctl status scc_daemon
systemctl restart scc_daemon
```
4. Configure the SAP Cloud Connector
You need to open the Cloud Connector homepage; you may see some warnings because the SSL certificate is self-signed. I had to use Safari in private browsing mode to get it to open and accept the insecure (untrusted) SSL certificate. The certificate can be replaced with a signed one to avoid this issue.
The default SCC username and password are Administrator / manage; you will be prompted to change them on first login.
You connect to the SAC tenant subaccount from step 1 using your SCP credentials (usually an S-user ID or email). For SAP employees, this is your global password.
With the Subaccount defined, you add a “Cloud To On-Premise” connection.
The wizard guides you through mapping the internal physical host to a virtual host.
You should see the mapping is added and the host is reachable.
5. Add SAC Data Repository
Using the Cloud Connector mapping just defined, you can add the HANA system as a remote data repository.
Verify the connection works
6. Create SAC Reference to Dataset
In the HANA environment I have already imported the “APL_SAMPLES” schema and datasets.
Using the APL samples, I can see the AUTO_CLAIMS_FRAUD table.
The table is populated with sample data.
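A quick row count confirms the data is there (the schema and table names are from the APL samples; your row count will differ):

```sql
-- Confirm the sample table contains data
SELECT COUNT(*) FROM "APL_SAMPLES"."AUTO_CLAIMS_FRAUD";
```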
In SAC, I create a reference to this data
Connect to Live Data Repository
Choose the data repository
Select the desired table or SQL view.
If your data is in a Calculation View, you will need to create a SQL view pointing to it.
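A minimal sketch of such a wrapper view, assuming a calculation view CV_CLAIMS in a package my.package (both names hypothetical); activated calculation views are exposed as column views in the _SYS_BIC schema, which a plain SQL view can select from:

```sql
-- Wrap a calculation view in a plain SQL view so it can be
-- selected as an SAC dataset (package and view names are hypothetical)
CREATE VIEW "APL_SAMPLES"."V_CLAIMS" AS
  SELECT * FROM "_SYS_BIC"."my.package/CV_CLAIMS";
```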
This becomes our dataset, but the data remains in the source HANA environment
You can preview the data from the remote dataset
7. Create Predictive Scenario
In just a few clicks you can create a predictive scenario using the live dataset.
Select the required dataset
You can set the model parameters, select the target variable and input fields.
Once the model has been trained you get some useful output
From here you can choose to keep the model or modify the model parameters. You can also choose to add a new model to the scenario, or apply the existing model to a separate dataset.