
SAP Analytics Cloud: Live Smart Predict with SAP HANA

In this blog I will describe the steps needed to use SAC Live Smart Predict (also known as Live Predict), now available with SAP Analytics Cloud (SAC). This allows you to use Smart Predict with live (remote) datasets that reside on-premise in SAP HANA. The data is not imported into SAC; it stays on-premise.

For this HANA connection you require the SAP Cloud Connector (SCC), an SAC data repository, and an SAC (live) dataset. This is in addition to the SAP HANA CORS (HTTPS) configuration that I have described previously.

The official SAP documentation for this feature can be found here, including an architecture diagram.

Components Needed

  • SAP HANA (on-premise) with the Automated Predictive Library (APL)
  • SAP Cloud Connector (you will install this shortly)
  • SAP Cloud Platform Account
  • SAP Analytics Cloud CF Tenant (us10, eu10, jp10) and not Neo (us1, eu1, jp1)

Steps required

  1. Link SAC Tenant to SAP Cloud Platform Account
  2. Verify SAP HANA (on-premise) has the Automated Predictive Library installed
  3. Install the SAP Cloud Connector
  4. Configure the SAP Cloud Connector
  5. Add SAC Data Repository
  6. Create SAC Reference to Dataset
  7. Create Predictive Scenario

1. Link SAC Tenant to SAP Cloud Platform Account

In the SAC environment you need to link the tenant to a SAP Cloud Platform user account.

This is done by simply setting your username (email) in the system administration, datasource configuration.

Once complete you will see a Subaccount and region host that will be entered into the Cloud Connector Configuration later.

2. Verify SAP HANA (on-premise) has the Automated Predictive Library installed

Within HANA, the Automated Predictive Library (APL) v4, 1906 or higher is required. You can verify this with the SQL call statement below.

If you don’t have APL installed, are missing permissions, or don’t have the correct version, you should fix that first. I have previously described How to Install the Automated Predictive Library, but that covered APL v1.1, so it’s best to check Andreas’s blog, which includes an APL v4 update: Installing the Automated Predictive Library (APL) on SAP HANA Express

call "SAP_PA_APL"."sap.pa.apl.base::PING"(?)



3. Install the SAP Cloud Connector

For simplicity I installed the linux version of the SAP Cloud Connector (SCC) on the HANA box.

You download the SCC from the SAP Development Tools site

The Cloud Connector requires a JVM; to ensure compatibility I downloaded the SAP JVM.

You should have 2 files downloaded onto the linux box.

Unzip the JVM by extracting it to the target directory

Check if there is a JAVA_HOME already set

Install the Cloud Connector package
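As a sketch, the JVM extraction, JAVA_HOME check, and package installation could look like the following. The archive and package file names here are assumptions; substitute the versions you actually downloaded.

```shell
# Extract the SAP JVM (archive name is an assumption -- use the file
# you downloaded from the SAP Development Tools site)
unzip sapjvm-8-linux-x64.zip -d /opt/

# Check whether a JAVA_HOME is already set, then point it at the SAP JVM
echo "${JAVA_HOME:-JAVA_HOME is not set}"
export JAVA_HOME=/opt/sapjvm_8

# Install the Cloud Connector package (file name is an assumption)
rpm -ivh ./com.sap.scc-ui.rpm
```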

If you see any errors during installation, it is easiest to uninstall and re-install with rpm commands similar to the following.

## Query for the scc package name
rpm -qa | grep scc
## Erase / Uninstall (substitute the package name from the query above)
rpm -e <scc-package-name>
## Install the package (substitute the file you downloaded)
rpm -ivh ./<scc-package-file>.rpm

The SAP Cloud Connector (scc_daemon) can be controlled with the systemctl commands.

systemctl status scc_daemon
systemctl restart scc_daemon
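You can also enable the daemon at boot and confirm the administration UI is listening. Port 8443 is the Cloud Connector's default administration port; adjust the check if you chose a different port during installation.

```shell
# Start the Cloud Connector automatically at boot
systemctl enable scc_daemon
# Confirm the administration UI is listening on the default port 8443
ss -tlnp | grep 8443
```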

4. Configure the SAP Cloud Connector

You need to open the Cloud Connector Homepage, you may see some warnings because the SSL certificate is self-signed.  I had to use Safari in private browsing mode to get it to open and accept the insecure (untrusted) SSL cert.  This certificate can be replaced with a signed one to avoid this issue.


The default SCC username / password is Administrator / manage; you will be prompted to change this on first login.

You connect to the SAC tenant Subaccount from step 1, using your SCP credentials (usually your S-user ID or email).  For SAP employees, this is your global password.

With the Subaccount defined, you add a “Cloud To On-Premise” connection

The wizard guides you through mapping the internal physical host to a virtual host

You should see the mapping is added and the host is reachable.
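Independently of the Cloud Connector's own reachability check, you can verify from the SCC host that the mapped HANA SQL port answers. The host name and port below are placeholders; the HANA SQL port follows the pattern 3&lt;instance&gt;15 (for example, 39015 for instance 90 on HANA Express).

```shell
# Placeholder host and port -- substitute your internal HANA host and SQL port
nc -zv my-hana-host 39015
```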

5. Add SAC Data Repository

With the Cloud Connector mapping defined, you can add it as a remote data repository in SAC

Verify the connection works

6. Create SAC Reference to Dataset

In the HANA environment I have already imported the “APL_SAMPLES” schema and datasets.

Using the APL Samples, I can see the AUTO_CLAIMS_FRAUD Table

The table is populated with sample data.

In SAC, I create a reference to this data

Choose datasource

Connect to Live Data Repository

Choose the data repository

Select the desired table or SQL view.
If your data is in a Calculation View, you will need to create a SQL view pointing to it.
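As a sketch, such a wrapper view could be created as follows; the schema, package, and view names are placeholders. Activated Calculation Views are exposed as column views under the _SYS_BIC schema.

```sql
-- Placeholder names: substitute your own schema, package and Calculation View
CREATE VIEW "MY_SCHEMA"."V_CLAIMS_FOR_SAC" AS
  SELECT * FROM "_SYS_BIC"."my.package/CV_CLAIMS";
```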

This becomes our dataset, but the data remains in the source HANA environment

You can preview the data from the remote dataset

7. Create Predictive Scenario

In just a few clicks you can create a predictive scenario using the live dataset.

Select the required dataset

You can set the model parameters, select the target variable and input fields.

Once the model has been trained you get some useful output

From here you can choose to keep the model or modify the model parameters. You can choose to add a new model to the scenario or apply the existing model to a separate dataset.


  • Great blog Ian! I think one point that is worth highlighting as well when it comes to Smart Predict based on live SAP HANA data is the ability to create live BI stories in SAP Analytics Cloud. These stories effectively combine actuals on one hand and predictions on the other hand, which fully meets the promise of Augmented Analytics. 

  • Hi, great blog, it is really helpful. I have only one question: does this work only with HANA on-premise, or can I make a dataset from a Live Data Connection to HANA Cloud (Cloud Foundry or Neo)?

    Thanks and Regards

    • Thanks Nicolas,

      I haven't tested it, but I expect you can get an SAC Live Dataset from either HANA Cloud environment using the SAP Cloud Connector; however, we do not yet support the Automated Predictive Library (APL) v4 in these environments.

  • Great Blog.

    I'm getting an error that "Data repository is not reachable from SAC" when I try to preview the data, even though I get a message that "Live dataset is created successfully"

  • Hello Punitha, feel free to raise a support ticket with SAP Support if you face problems. The component that should be used is LOD-ANA-PR. Kind regards, Antoine

  • Hi Ian,

    Excellent blog! I have set up the Data Repository without any issues in the past but recently it stopped working. In the meantime, we have moved to new CF tenants and I want to reconfigure the connection but I get an authorization error.

    {packageId: "", objectName: "message_dataRepository",…}
    args: []
    bUIMessage: false
    bWarning: false
    httpStatus: 403
    message: "You are not authorized to create data repository."
    objectName: "message_dataRepository"
    packageId: ""
    stack: "Exception@/sap/fpa/services/core/system/Exception.xsjslib:122↵RestException@/sap/fpa/services/dataRepository/RestException.xsjslib:37↵throwUnauthorizedError@/sap/fpa/services/dataRepository/helper/DataRepoAuthCheck.xsjslib:86↵assertPrivilegeOnDataRepo@/sap/fpa/services/dataRepository/helper/DataRepoAuthCheck.xsjslib:56↵RestDataRepository.prototype.doPost@/sap/fpa/services/dataRepository/api/RestDataRepository.xsjslib:65↵Rest.prototype.dispatch@/sap/fpa/services/core/rest/Rest.xsjslib:177↵handleRequest@/sap/fpa/services/GetResponse.xsjslib:184↵main@/sap/fpa/services/GetResponse.xsjslib:216↵@/sap/fpa/services/GetResponse.xsjs:9↵"
    status: 403

    Do you have any idea?

    Kind regards,

    Martijn van Foeken | Interdobs

  • Hi Ian,

    After going through your blog, which I found very informative while exploring Live Smart Predict, I have one query regarding the live data connection.

    For example, in your use case "APL_AUTO_CLAIMS_FRAUD" is used as live data for creating the predictive scenario. But if the data is saved locally in SAC, how can it be considered live data? Or can it automatically update inside SAC when the data is updated in HANA?



    Chandra Bhushan

    • Hi Chandra,

      The dataset APL_AUTO_CLAIMS_FRAUD is still "live", meaning in SAC we just have a reference to the data set stored in HANA. You preview the data via SAC, but we do not copy the data.  When the predictive scenario is created we push the processing to the data (HANA), using the APL in the source HANA system and only the summary results and metadata are kept in SAC.

  • Hi Ian,

    Thank you for your response. My query was how "APL_AUTO_CLAIMS_FRAUD" is stored in HANA (manually or through an automatic SQL query)?

    And what is SAC's definition of Acquired versus Live data (in both cases we are manually importing the data, either into SAC or into HANA)?



    Chandra Bhushan

    • Hi Chandra,

      Sorry, I think I misunderstood your question. To make this clearer, I have just added some additional screenshots and descriptions to Step 6. Create SAC Reference to Dataset.

      Acquired data is imported into SAC (I am not describing this scenario)

      Live data remains in the source HANA and we only preview/visualise that in SAC. (This is the scenario I am using)

      Please let me know if that clears things up?

      • Hi Ian,

        Thanks for the clarification, but I have one query:

        If the data is manually imported into HANA and then used in SAC for a predictive scenario, do we call that Live?
        Chandra Bhushan
        • Ah, that's a different question I suppose. 🙂

          For me it is live because that is the "source" of the data and it could be updated/modified.  We are still connecting to that source and not duplicating data.  Yes, in this example I used a sample dataset that is static, but the same configuration would also apply to data that has been loaded into HANA in real time through replication. Alternatively, you may have an application that runs on HANA, and its data could also become a dataset for Smart Predict.

  • Hi Ian,

    thanks for the instructions and easy-to-follow guide. We did configure this, but get a connection error. I was wondering if this could have something to do with the mapping of the virtual/local host: from what I understood, our virtual host seems to be the same as the local one, but in your instructions you use the virtual one in step 5, where you set up the repository in SAC. Is the step of creating a separate virtual host mandatory?


    Kind regards


    • Hi Jasmin,

      You can use the same virtual and local host as long as it connects to the SQL port of your HANA instance. Can you share the configuration you have made in the Cloud Connector for the subaccount of your SAP Analytics Cloud tenant?

      Kind regards,

      Martijn van Foeken | Interdobs