Technical Articles
Gonzalo Hernan Sendra

How to run a HANA Stored Procedure in SAP Data Warehouse Cloud from SAP Analytics Cloud

There are situations where one might need to run backend calculations from a story in SAP Analytics Cloud (SAC). For example, when we want to apply complex logic or calculations to data or parameters that have just been updated.

In this post I will show how to use Multi-Actions in SAP Analytics Cloud to call a Stored Procedure residing in SAP HANA Cloud or SAP Data Warehouse Cloud (DWC).

For this purpose I will use the recently published blog about how to create an API in Cloud Foundry to call Stored Procedures in SAP HANA Cloud, using the SAP Cloud Application Programming (CAP) model. In that post I created a stored procedure that adds a record to a table CALL_HISTORY with the current timestamp and some additional information. The same procedure will be used here, but deployed in SAP Data Warehouse Cloud. I will use Multi-Actions in an SAP Analytics Cloud story to call that API. A view will be created in SAP DWC to display the content of CALL_HISTORY in the same SAC story.

Project Setup

The HANA stored procedure will be deployed in an HDI container that can reside in SAP HANA Cloud or within SAP Data Warehouse Cloud. This container will hold the HANA native artefacts: a table and stored procedures.

On top of that, SAP Cloud Application Programming model will be used to provide the API service as well as the user authentication in Cloud Foundry.

Option 1: HDI container in SAP HANA Cloud

This scenario is fully covered in the blog: how to create an API in Cloud Foundry to call Stored Procedures in SAP HANA Cloud, using the SAP Cloud Application Programming (CAP) model.

In the next sections I will use the name DWC_API_Service_Test for the application and services. It is exactly the same as the HANA_API_Service_Test created in that blog, but pointing to the HANA Cloud instance within SAP Data Warehouse Cloud, as described in the next option.

Option 2: HDI container within SAP Data Warehouse Cloud

In this case the same blog (how to create an API in Cloud Foundry to call Stored Procedures in SAP HANA Cloud, using the SAP Cloud Application Programming (CAP) model) will be used as a baseline. However, as the project will be deployed in the HANA Cloud instance within SAP Data Warehouse Cloud, there are some additional considerations to take into account:

  • The BTP space needs to be connected to the SAP DWC tenant.
  • The HDI container needs to be deployed in the HANA Cloud instance of the SAP DWC tenant.
  • The HDI content (table and stored procedures) needs to be made accessible from SAP DWC by adding roles to the project. These roles should provide the SELECT and SELECT METADATA privileges to read the table and the EXECUTE privilege to run the stored procedures.

All these considerations are explained in the blog Develop on SAP Data Warehouse Cloud using SAP HANA Deployment Infrastructure (HDI), by Nidhi Sawhney.
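As an illustration, such a role can be defined in the HDI project as an .hdbrole artefact along the following lines. The role name CALL_HISTORY_ACCESS and the procedure name REGISTER_CALL are hypothetical placeholders, and the exact privilege set (including any grant options required by SAP DWC) should be taken from the blog referenced above:

```json
{
  "role": {
    "name": "CALL_HISTORY_ACCESS",
    "object_privileges": [
      {
        "name": "CALL_HISTORY",
        "type": "TABLE",
        "privileges": ["SELECT", "SELECT METADATA"]
      },
      {
        "name": "REGISTER_CALL",
        "type": "PROCEDURE",
        "privileges": ["EXECUTE"]
      }
    ]
  }
}
```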

API credentials

Once the API service is deployed in Cloud Foundry, it can be found in the BTP Cockpit, within the space applications. My application is named DWC_API_Service_Test-srv, which appears as follows:

SAP BTP Cockpit – Applications

By opening the DWC_API_Service_Test-srv application, the API URL can be found in the Application Routes.

SAP BTP Cockpit – API URL

Apart from that, the user authentication information will be needed, which can be found in the service instances. In my case, this is the DWC_API_Service_Test-auth service:

SAP BTP Cockpit – Services Instances

By clicking on View Credentials (top-right), all the details are displayed. Only the first three parameters are needed: clientid, clientsecret, and URL.

SAP BTP Cockpit – Service Credentials

With all this information we are ready to create a connection to the API in SAC.

HTTP API Connection in SAP Analytics Cloud

In SAP Analytics Cloud I will create a new connection for HTTP API:

SAP Analytics Cloud – New Connection

The Data Service URL is the Application URL from the previous step.

For this API, the authentication type is OAuth 2.0 Client Credentials. The next three parameters are from the authentication service:

  • OAuth Client ID = clientid
  • Secret = clientsecret
  • Token URL is the user authentication URL with the addition of /oauth/token

SAP Analytics Cloud – HTTP API connection details

After clicking OK, the connection should be successfully created.
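Behind the scenes, SAC obtains a bearer token from the authentication service using the OAuth 2.0 client-credentials grant. As a rough sketch, useful for testing the API outside of SAC, the token request can be built as follows in Python. All credential values are placeholders; substitute the clientid, clientsecret, and url fields from your own service credentials:

```python
# Sketch of the OAuth 2.0 client-credentials token request that SAC
# issues against the authentication service. All values below are
# placeholders, not real credentials.
import base64
import urllib.parse


def build_token_request(auth_url, client_id, client_secret):
    """Return (token_url, headers, body) for a client-credentials grant."""
    # The token endpoint is the authentication URL plus /oauth/token
    token_url = auth_url.rstrip("/") + "/oauth/token"
    # Client ID and secret are sent via HTTP Basic authentication
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {creds}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode({"grant_type": "client_credentials"})
    return token_url, headers, body


# Placeholder subaccount values for illustration only
url, hdrs, body = build_token_request(
    "https://mytenant.authentication.eu10.hana.ondemand.com",
    "sb-my-client-id", "my-client-secret")
```

Sending this request (for example with urllib or curl) returns a JSON payload whose access_token field is the bearer token for subsequent API calls.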

Multi-Actions in SAP Analytics Cloud

The next step is to create a new Multi-Action by clicking Multi-Actions in the left menu. I named it DWC_API_Service_Test. Then I added an API step.

SAP Analytics Cloud – Multi-Action

A name should be given to the API step. In my case I used Stored_Procedure_API.

At the time of writing, only POST APIs are allowed in Multi-Actions. Therefore, I use the service act_register_call_with_param (see the previous blog: how to create an API in Cloud Foundry to call Stored Procedures in SAP HANA Cloud, using the SAP Cloud Application Programming (CAP) model).

In the API URL field, the application URL is entered with the addition of /catalog/.

All parameters should be in the body:

{
    "comment": "Calling from SAC",
    "name": "Hernan",
    "value": 1
} 

For the execution results, I selected “Synchronous Return” and saved it.
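For reference, the POST request that this API step issues can be approximated in Python. The application URL and bearer token below are placeholders; the /catalog/ path and the act_register_call_with_param action name come from the steps above, assuming the action is appended directly after /catalog/:

```python
# Sketch of the POST call that the Multi-Action API step performs.
# The application route and bearer token below are placeholders.
import json
import urllib.request


def build_api_call(app_url, token, comment, name, value):
    """Build the POST request with the same JSON body used in the API step."""
    endpoint = app_url.rstrip("/") + "/catalog/act_register_call_with_param"
    payload = json.dumps({"comment": comment, "name": name, "value": value})
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        endpoint, data=payload.encode(), headers=headers, method="POST")


# Placeholder application route and token for illustration only
req = build_api_call(
    "https://dwc-api-service-test-srv.cfapps.eu10.hana.ondemand.com",
    "my-bearer-token", "Calling from SAC", "Hernan", 1)
# urllib.request.urlopen(req) would then execute the stored procedure
```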

SAP Analytics Cloud – Multi-Action API Step

View in SAP Data Warehouse Cloud

In order to access the table CALL_HISTORY, the HDI container needs to be added to the SAP DWC space:

SAP Data Warehouse Cloud – Space Management

Once the HDI container is added, a graphical view can be created in the Data Builder. When creating a view, the table should be visible under Sources. I created an analytical view with VALUE as a measure so that it can be consumed from SAP Analytics Cloud.

SAP Data Warehouse Cloud – New Graphical View

After its deployment, it should be accessible from SAP Analytics Cloud.

Story in SAP Analytics Cloud

In SAP Analytics Cloud, I am going to create a basic story that connects to SAP Data Warehouse Cloud. After creating the new canvas, I added data from a data source, in this case live data from SAP Data Warehouse Cloud.

SAP Analytics Cloud – Data Source

In this example I just created a table to display the data and inserted the Multi-Action. The Multi-Action appears in the insert options as a Planning Trigger.

SAP Analytics Cloud – Insert Multi-Action

In the Multi-Action settings I selected the Multi-Action name previously defined.

Then I added a table to display the View_CALL_HISTORY from SAP DWC.

SAP Analytics Cloud Story – Table configuration

Now, clicking the Multi-Action control triggers the API. Once its execution has finished, the data can be refreshed by clicking the Refresh button in the toolbar.

SAP Analytics Cloud story – Refresh data

Summary

This blog post showcases an example of how to trigger the execution of a HANA Stored Procedure via an API from SAP Analytics Cloud. The Stored Procedure can reside in SAP Data Warehouse Cloud as well as in an independent SAP HANA Cloud instance.

In this way, business users can execute logic in the backend directly from a report, which enables more complex use cases. For example, Monte Carlo simulations leveraging the SAP HANA Predictive Analysis Library (PAL) could be triggered directly from SAP Analytics Cloud.

I want to thank Ian Henry, Maria TZATSOU and Nektarios Vasileiou for all the discussions, shared experiences and received support in relation to this blog post!
