
Data Intelligence integration with SAP Analytics Cloud

In this blog I will describe how we can push datasets from SAP Data Hub or SAP Data Intelligence (both share this capability) to SAP Analytics Cloud (SAC).

In Data Hub, we have two new pipeline operators: SAC Formatter and SAC Producer.
There is a great example graph, Push to SAP Analytics Cloud, to help get you started.

To achieve this, we used the following capabilities:

  • SAP Data Hub Modeler
    • Decode Table
    • SAP Analytics Cloud Formatter
    • SAP Analytics Cloud Producer
  • SAP Analytics Cloud
    • App Integration – Data Set API
    • Dataset consumption
  • Dataset Limitations
    • 2 billion rows
    • 1000 columns
    • 100 MB per HTTP request (see the batching sketch after this list)
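The 100 MB per request limit is the one most likely to bite with larger extracts. The SAC Producer handles request sizing for you; the plain-Python sketch below only illustrates the batching idea, and all names in it are my own rather than part of any SAP API.

```python
# Minimal sketch: group rows so each HTTP request body stays under a size cap.
# MAX_BYTES mirrors the documented 100 MB per-request limit of the dataset API.
MAX_BYTES = 100 * 1024 * 1024

def batch_rows(rows):
    """Yield lists of CSV lines whose combined UTF-8 size stays under MAX_BYTES."""
    batch, size = [], 0
    for row in rows:
        row_bytes = len(row.encode("utf-8")) + 1  # +1 for the newline separator
        if batch and size + row_bytes > MAX_BYTES:
            yield batch
            batch, size = [], 0
        batch.append(row)
        size += row_bytes
    if batch:
        yield batch  # flush the final partial batch

# Usage: each chunk would become one HTTP request to the dataset API.
# for chunk in batch_rows(csv_lines): upload(chunk)
```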

Previously I shared How to Automate Web Data acquisition with SAP Data Hub; now we can extend that by pushing the resulting dataset to SAC.

The previous graph, shown below, runs in Data Hub 2.7.

Before we can push the data to SAC, we need to format it into the new message.table format. For this we use the Decode Table operator, with the input format set to CSV.
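Conceptually, Decode Table parses the raw CSV into rows of values plus column metadata. A rough Python illustration of that shape follows; the actual message.table structure used by the Modeler may differ in detail, so treat this purely as a mental model.

```python
import csv
import io

# Sample headerless exchange-rate rows, as downloaded from the web
raw_csv = "2019-10-01,0.8891\n2019-10-02,0.8876\n"

# Parse the CSV text into rows of values, roughly what Decode Table emits
rows = list(csv.reader(io.StringIO(raw_csv)))

# Illustrative table-message shape: column metadata plus the row data.
# The column definitions come from the Output Schema (described next).
table_message = {
    "columns": [],   # e.g. a name and type for each CSV column
    "rows": rows,    # [['2019-10-01', '0.8891'], ['2019-10-02', '0.8876']]
}
```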

Using the SAP Analytics Cloud Formatter (Beta), we specify how SAC should create the dataset.

The exchange rate data downloaded from the web does not have usable column headings, so those need to be provided in the Output Schema.
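For example, the exchange rate file has two unlabelled columns, so the Output Schema must name and type them. The names below (Date, EUR_GBP) are hypothetical placeholders of mine, just to show the idea:

```python
# Hypothetical Output Schema for the headerless exchange-rate CSV:
# the Formatter needs a name and a type for every incoming column.
output_schema = [
    {"name": "Date",    "type": "date"},    # first CSV column
    {"name": "EUR_GBP", "type": "number"},  # second CSV column: the exchange rate
]
```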

Switching to SAP Analytics Cloud.

SAC has API access that we need to enable; to do so, we navigate to App Integration.
The API is used both to authenticate and to publish the data.
We enable this access by adding a new OAuth Client.

Complete the OAuth Client request as below, replacing <SAP_Datahub_host> with your Data Hub host name:

https://<SAP_Datahub_host>/app/pipeline-modeler/service/v1/runtime/internal/redirect

Press the Add button, and it will generate a Client ID and Secret to use in our Data Hub operator.
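For context, the Client ID and Secret drive a standard OAuth 2.0 authorization-code flow, which the Producer operator performs for us. Below is a rough sketch of the token exchange happening under the hood; the token endpoint URL is an assumption on my part, so check the App Integration page of your SAC tenant for the real one.

```python
import requests

# Assumptions: placeholder hosts and endpoint, not documented values
SAC_TOKEN_URL = "https://<SAC_tenant>/oauth/token"
CLIENT_ID = "<client_id_from_sac>"
CLIENT_SECRET = "<client_secret_from_sac>"
REDIRECT_URI = ("https://<SAP_Datahub_host>/app/pipeline-modeler"
                "/service/v1/runtime/internal/redirect")

def exchange_code_for_token(auth_code: str) -> str:
    """Swap the one-time authorization code for an access token."""
    resp = requests.post(
        SAC_TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": REDIRECT_URI,
        },
        auth=(CLIENT_ID, CLIENT_SECRET),  # HTTP Basic auth with the OAuth client
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```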

Copy and paste this information into our SAP Analytics Cloud Producer (Beta).

Our completed pipeline looks like this:

Run the pipeline and check the Wiretap connected to the SAC Producer.
It shows an openapi.status_code 401 Unauthorized message.

To resolve the openapi.status_code 401 Unauthorized, we must authorize the API Token access. This is available through the Open UI option in the SAP Analytics Cloud Producer.

Opening the UI brings you to the page shown below; click on the link to grant authorization.

Granting permission shows you the Access Token.

We should stop and re-run the pipeline, and check the Wiretap once more.

If you see other messages, such as openapi.status_code 403 Forbidden, then you will likely need to get the SAC dataset API enabled on your SAC tenant. This can be done by logging a support ticket, or a Jira if you are an SAP employee.
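If you are probing the API yourself (for example with the requests sketch above), the same two status codes are easy to branch on. A small helper, purely illustrative, taking a requests.Response:

```python
def explain_status(resp):
    """Map the status codes seen in the Wiretap to their likely causes."""
    if resp.status_code == 401:
        print("401 Unauthorized: authorize API Token access via Open UI first")
    elif resp.status_code == 403:
        print("403 Forbidden: the dataset API may not be enabled on this tenant")
    elif resp.ok:
        print("Success:", resp.status_code)
    else:
        resp.raise_for_status()  # surface anything unexpected
```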

If we do have the dataset API enabled, the output should be similar to the Wiretap below, showing that we received two API responses.

At 11:07:57 we see the existing SAC datasets and their IDs.

At 11:08:01 we see the creation of our new dataset DH_WEBDATA_EUR_GBP.
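Those two responses correspond to a list call followed by a create call. The sketch below shows roughly what the Producer is doing on our behalf; the endpoint paths and payload shape are my assumptions for illustration, not documented API details.

```python
import requests

API_BASE = "https://<SAC_tenant>/api/v1"  # assumption: illustrative path only
HEADERS = {"Authorization": "Bearer <access_token>"}

# 1) List the existing datasets and their IDs (the 11:07:57 response)
existing = requests.get(f"{API_BASE}/datasets", headers=HEADERS)
print(existing.json())

# 2) Create the new dataset (the 11:08:01 response)
created = requests.post(
    f"{API_BASE}/datasets",
    headers=HEADERS,
    json={
        "name": "DH_WEBDATA_EUR_GBP",
        "columns": [
            {"name": "Date", "type": "date"},
            {"name": "EUR_GBP", "type": "number"},
        ],
    },
)
print(created.json())
```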

Switching to SAC, we can verify the dataset has been created under “My Files”.

Opening the dataset shows our data safely in SAC.

From here we can quickly build some cool visualisations and even use that data with Smart Predict.

Conclusion

We can now seamlessly push data into SAC from either SAP Data Hub or SAP Data Intelligence. This could be simple data movement, or integrating the output of machine learning models from Data Hub or Data Intelligence into SAP Analytics Cloud. I hope this blog has helped you better understand the integration.

5 Comments
  • Hi Ian,

    Thanks for your article!

    I’ve recently got a question from a customer regarding integration from SAP Analytics Cloud, but it was about exporting the data from SAC and not pushing it. As an example, in a planning scenario the customers may export the forecast and publish it somewhere else or somehow process it further. What would be the right way to do it? I’ve heard that the CPI-PI iFlow operator may be used here, but it’s not quite clear to me how to do it.

    Thanks,

    — Maria

  • Hi Ian,

    First of all, great blog! I do have a question however: is there a parameter to specify in which folder in SAP Analytics Cloud the dataset is created/updated? If the dataset ends up in the root folder of the user specified in the ‘job’, it will be an additional step to set up sharing for each individual dataset.

    Thanks for your reply!

    Kind regards,

    Martijn

    • Hi Martijn,

      Thanks for the feedback. You are correct, the dataset will be stored in the root folder of the user who authenticated to SAC. Currently, moving the dataset would be an additional required step.

      Thanks, Ian.

      • Hi Ian,

        Thanks for your reply. If you move the dataset, will it break the flow in Data Intelligence? Or does it refer to the internal ID of the dataset in SAC?

        Kind regards,

        Martijn van Foeken | Interdobs

        • Yes, you can move the dataset once created.

          When you append to a dataset, you are required to specify the Dataset ID; the Dataset ID is used to refer to the existing dataset.
          Beware: currently Data Hub only supports new and append, not overwrite, for datasets.