
Data Intelligence integration with SAP Analytics Cloud

In this blog post I will describe how we can push datasets from SAP Data Intelligence to SAP Analytics Cloud (SAC).

In Data Intelligence, we have two pipeline operators: the SAC Formatter and the SAC Producer.
There is a great example graph, Push to SAP Analytics Cloud, to help get you started.

To achieve this, we use the following capabilities:

  • SAP Data Intelligence Modeler
    • Decode Table
    • SAP Analytics Cloud Formatter
    • SAP Analytics Cloud Producer
  • SAP Analytics Cloud
    • App Integration – Data Set API
    • Dataset consumption
  • SAC Dataset Limitations
    • 2 billion rows
    • 1000 columns
    • 100 MB per HTTP request
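These limits matter when pushing larger datasets: a payload over 100 MB has to be split across multiple HTTP requests. As a rough sketch (a hypothetical helper, not part of the DI operators), batching CSV rows so each request stays under the limit could look like this:

```python
import csv
import io

MAX_REQUEST_BYTES = 100 * 1024 * 1024  # SAC limit: 100 MB per HTTP request

def chunk_rows(rows, max_bytes=MAX_REQUEST_BYTES):
    """Yield batches of CSV rows whose serialized size stays under max_bytes."""
    batch, size = [], 0
    for row in rows:
        buf = io.StringIO()
        csv.writer(buf).writerow(row)
        row_bytes = len(buf.getvalue().encode("utf-8"))
        if batch and size + row_bytes > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(row)
        size += row_bytes
    if batch:
        yield batch
```

Each batch could then be sent as its own append request against the same dataset.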

Previously I shared How to Automate Web Data Acquisition with SAP Data Hub; now we can extend that pipeline by pushing the resulting dataset to SAC.

The previous graph looks as below and runs on Data Hub 2.7.

Before we can push the data to SAC, we need to format it into the new message.table format. For this we use the Decode Table operator, with the input format set to CSV.

Using the SAC Cloud Formatter, we specify how SAC should create the dataset.

The exchange rate data downloaded from the web does not have usable column headings, so these need to be provided in the Output Schema.
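Conceptually, this step turns the raw headerless CSV into rows plus the column names we supply. A minimal sketch in plain Python (the column names here are hypothetical; in the pipeline they come from the Output Schema, not code):

```python
import csv
import io

# Hypothetical column names for the headerless exchange-rate file;
# in the pipeline these are supplied through the Output Schema.
COLUMNS = ["DATE", "EUR_GBP_RATE"]

def decode_csv(raw_bytes):
    """Parse headerless CSV bytes into (columns, rows), roughly what the
    Decode Table operator emits as a message.table body."""
    text = raw_bytes.decode("utf-8")
    rows = [row for row in csv.reader(io.StringIO(text)) if row]
    return COLUMNS, rows

cols, rows = decode_csv(b"2020-01-02,0.8508\n2020-01-03,0.8492\n")
```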

Switching to SAP Analytics Cloud.

SAC has API access that we need to enable. Navigate to System Administration – App Integration.
Here we see the parameters used to authenticate.
We enable this access by adding a new OAuth Client.

Complete the OAuth Client request as below, replacing the hostname with your Data Hub hostname:

https://<SAP_Data_Intelligence_Hostname>/**


Figure x: OAuth Client Information

Press the Add button, and it will generate a Client ID and Secret to use in our Data Hub operator.
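Behind the scenes, the Producer exchanges this Client ID and Secret for an access token. A sketch of assembling such a token request — note that the /oauth/token path and the grant type here are assumptions, so use the Token URL your App Integration page actually shows:

```python
import base64

def build_token_request(host, client_id, client_secret):
    """Assemble an OAuth token request for the SAC tenant.

    The /oauth/token path is an assumption; check your tenant's
    App Integration page for the real Token URL.
    """
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": f"https://{host}/oauth/token",
        "headers": {
            "Authorization": f"Basic {basic}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "body": "grant_type=client_credentials",
    }

req = build_token_request("mytenant.sapanalytics.cloud", "my-id", "my-secret")
```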

Figure x: OAuth Client Config

Copy and paste this information into our SAP Analytics Cloud Producer.

Our completed pipeline looks like this:

Run the pipeline and check the WireTap connected to the SAP Producer.
It shows an openapi.status_code 401 Unauthorized message.

To resolve the openapi.status_code 401 Unauthorized, we must authorize the API token access. This is available through the Open UI option in the SAP Analytics Cloud Producer.

Opening the UI brings you to this page; click on the link to grant authorization.

Granting permission shows you the Access Token.

We should stop and re-run the pipeline, and check the Wiretap once more.

If you see other messages such as openapi.status_code 403 Forbidden, then you will likely need to get the SAC dataset API enabled on your SAC tenant. This can be done by logging a support ticket, or a Jira if you are an SAP employee.

If the dataset API is enabled, the output should be similar to the Wiretap below, showing that we received two API responses.

At 11:07:57 we see the existing SAC datasets and their IDs.

At 11:08:01 we see the creation of our new dataset DH_WEBDATA_EUR_GBP

Switching to SAC, we can verify the dataset has been created under “My Files”.

Opening the dataset shows our data safely in SAC.

From here we can quickly build some cool visualisations and even use that data with Smart Predict.

Conclusion

We can now seamlessly push data into SAC from either SAP Data Hub or SAP Data Intelligence. This could be simple data movement, or integrating the output of machine learning models from Data Intelligence into SAP Analytics Cloud. I hope this blog post has helped you better understand another integration option.

30 Comments
  • Hi Ian,

    Thanks for your article!

    I’ve recently got a question from a customer regarding integration with SAP Analytics Cloud, but it was about exporting the data from SAC, not pushing it. For example, in a planning scenario the customers may export the forecast and publish it somewhere else or process it further. What would be the right way to do it? I’ve heard that the CPI-PI iFlow operator may be used here, but it’s not quite clear to me how to do it.

    Thanks,

    — Maria

  • Hi Ian,

    First of all, great blog! I do have a question however: Is there a parameter to specify in which folder in SAP Analytics Cloud the dataset is created/updated? If the dataset ends up in the root folder of the user specified in the ‘job’, it will be an additional step to set up sharing for each individual dataset.

    Thanks for your reply!

    Kind regards,

    Martijn

    • Hi Martijn,

      Thanks for the feedback.  You are correct, the dataset will be stored in the root folder of the user who authenticated to SAC.  Currently moving the dataset would be an additional step required.

      Thanks, Ian.

      • Hi Ian,

        Thanks for your reply. If you would move the dataset will it break the flow in Data Intelligence? Or does it refer to the internal ID of the dataset in SAC?

        Kind regards,

        Martijn van Foeken | Interdobs

        • Yes, you can move the dataset once created.

          When you append to a dataset, you are required to specify the Dataset ID.  The Dataset ID is being used to refer to the existing datasets.
          Beware: currently Data Hub only supports new and append, not overwrite, for datasets.

  • Hello Ian,

    Thank you so much for the article! I am trying to get this to work, but when I select “API Access” for the Purpose it does not ask for a “Redirect URI”, but when I select “Interactive Usage” it does. What option should I go for?

     

    “Interactive Usage”:

    “API Access”:

     

    Thank you so much!

    Mary Liu

  • Hi Mary Liu,

    I checked my SAC tenant today; I am on 2020.3.1.
    Your SAC OAuth Client screens appear to be different from my tenant’s.
    For me, both Interactive Usage and API Access have the Redirect URI.

    I would try Interactive Usage, as it looks the most similar to my configuration.

    Let me know how you get on.

  • Hello Ian,

     

    I tried to do the same thing, but in my Wiretap I get a 400 error.

     

    I get an access token in the SAP Analytics Cloud Producer but don’t know why I get “HTTP 400 Bad Request”. The SAC host should be right.

    • Hi Jurgen,

      HTTP 400 Bad Request, suggests to me that SAC doesn’t like what you are sending it.

      I would add a multiplexer and review how the Table Decoder is outputting the message.table.

      You may have the wrong definition or not specified the column headings and/or data types correctly.

    • I also got many 400 errors and could not find a comprehensive description of what is allowed by SAC.

      • CSV file headers can contain only English letters, numbers (no Asian characters) and some special characters such as - or _
      • They must start with an English letter.
      • To use the headers from your CSV directly, the data type must be “string”. Then you have to switch it back to an integer in the SAC model.
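      Based on those observed restrictions, a small hypothetical sanitizer could pre-clean headings before pushing (assuming the rules above are accurate):

```python
import re

def sanitize_header(name):
    """Rewrite a column heading to satisfy the restrictions observed above:
    English letters, digits, '-' and '_' only, starting with a letter."""
    cleaned = re.sub(r"[^A-Za-z0-9_-]", "_", name)
    if not re.match(r"[A-Za-z]", cleaned):
        cleaned = "C_" + cleaned  # arbitrary letter prefix for non-letter starts
    return cleaned
```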
  • Hi Ian,

    Thank you for the article, it is very helpful.

    I get the error:

    {"message.request.id":"","openapi.header.content-type":"application/json","openapi.status_code":"403"}{"status":403,"message":""}

    You mention: “If you see other messages such as openapi.status_code 403 Forbidden then you will likely need to get the SAC dataset API enabled on your SAC tenant, this can be done by logging a support ticket or a Jira if you are SAP employee.” .

    What needs to be done exactly (if not an SAP employee)? I cannot find any documentation/note for the dataset API.

    Thank you,

    Ana Maria

    • Hi Ana Maria,

      Thanks. The dataset API toggle needs to be enabled on your SAC tenant. The process is actually the same: log a support ticket and ask for the dataset API toggle to be enabled.

      Mention that you are attempting to push a dataset from Data Intelligence.

  • Hi Ian,

    Thank you very much for this blog!

    I have one question: Does it work with the DI on-premise version as well? If I have DI 3.0 on-premise deployed on AWS EKS, how should I configure the connection to SAC?

    BR, James.

    • Thanks James,

      Yes, it is exactly the same with DI on-premise or DI cloud edition. I have configured both with no changes required other than the redirect URL hostname for the SAC OAuth Client.

  • Hi,

    The connector creates a dataset in SAC. How would we create a model and update it on a regular basis from DI?

    Thanks for any hint.

     

    Regards

    Marcus

    • Hi Marcus,

      The Data Intelligence integration happens at the dataset level.  You would build a model on top of the dataset. You could then use Data Intelligence integration to update the data in this model.

      • Hi Ian,

         

        Thanks for the reply. We were told that a model (with planning) could not be updated on a schedule. We only find the typical list of connections in the menu (e.g. Google Drive, which we actually use to write to from DI and then update the model from in SAC). Could you add a bit more on the procedure to schedule a model data upload from DI?

        Many Thanks

        Marcus

  • Hello Ian,

     

    Any clue on this message, guessing maybe something to do with SAC Authorizations ?

    I’m using SAP DI cloud instance.

    {"message.request.id":"initialRequest","openapi.header.content-type":"application/json","openapi.status_code":"500"}{"status":500,"message":"Access to V1 datasets is not allowed."}

     

    Regards,

    Vinay

    • Hi Vinay,

      In the “SAC Cloud Formatter” Are you creating a “New” dataset or trying to “Overwrite” an existing dataset named V1?

      Do you see the existing SAC datasets retrieved?

      Cheers, Ian.

  • Hi Ian,

    I got a message as below. I’ve successfully granted authorization like you mentioned above. Could you please give me some hints?

    {"message.request.id":"","openapi.header.content-type":"application/json","openapi.status_code":"500"}{"status":500,"message":"The resource does not exist, or you do not have permission to read it"}

    Thanks in advance,

    Nina

  • Hi Nina,

    Do you see anything before this in the wiretap?
    You would usually see your existing datasets being retrieved.

    I suspect one of the URLs is incorrect, perhaps the tenant name in the SAC Formatter or the host in the SAC Producer?

    Have you logged a ticket to enable the Dataset API?

  • hi Ian,

    I have my instance in Azure, but I keep getting this error, even after I stop and restart the pipeline.

    I see my DI server is shown as Not Secure in the browser URL bar; not sure if it is because of that.

    When I view the Wiretap, it says lookup error / no such host (error shown in the second image below). Not sure what I can do to fix this. Thanks,


    SAC Producer UI Error

     


    Wiretap error

     

    • This could be a networking issue.

      Perhaps the NAT gateway or similar on your cluster is not correct?

      Can you connect to other external sites using a fully qualified name?

      You could test with the HTTP Client.

      • Thank you Ian.

        Yes. I am connecting to this FQDN (DI launchpad) using my SAP system, though the instance is in the customer’s Azure.

        The cluster load balancer is open to the public, though it is inside the customer’s on-premise Azure. For example, I was able to connect to the SAP HANA instance (using the Connection Manager on port 30015) and it worked fine.

        Will the HTTP connection be a problem even with that?

        Thank you.

        • I was thinking of the networking going out to the Internet and not in to DI.

          Does the DI system understand external (internet) traffic/addresses?

          • If it is name resolution (for sacgo-1.us10.sapanalytics.cloud), I doubt that it can do that. Not sure if I can put the IP address from an nslookup of the same host.

          • In case anyone else experiences this issue, it was caused by the tenant parameter in the SAC Producer.

            It should be the hostname only, without any https:// prefix.
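            A quick way to guard against this mistake is to normalize the tenant value before configuring the operator; a hypothetical helper, sketched here:

```python
from urllib.parse import urlparse

def to_hostname(value):
    """Return the bare hostname, stripping any scheme or trailing path,
    as the SAC Producer's tenant parameter expects."""
    if "://" not in value:
        value = "https://" + value  # urlparse needs a scheme to split out the host
    return urlparse(value).hostname

host = to_hostname("https://sacgo-1.us10.sapanalytics.cloud/")
```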