Technical Articles
Ian Henry

Data Intelligence integration with SAP Analytics Cloud (SAC)

In this blog post I will describe how we can push datasets from SAP Data Intelligence to SAP Analytics Cloud (SAC).

In Data Intelligence, we have two pipeline operators: SAC Formatter and SAC Producer.
There is a great example graph, Push to SAP Analytics Cloud, to help get you started.

To achieve this, we use the following capabilities:

  • SAP Data Intelligence Modeler
    • Decode Table
    • SAP Analytics Cloud Formatter
    • SAP Analytics Cloud Producer
  • SAP Analytics Cloud
    • App Integration – Data Set API
    • Dataset consumption
  • SAC Dataset Limitations
    • 2 billion rows
    • 1000 columns
    • 100 MB per HTTP request
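These limits are worth checking before pushing data. The sketch below (Python, purely illustrative — the SAC Producer operator handles batching internally, and this is not its actual implementation) validates a dataset against the stated limits and splits rows into request-sized batches:

```python
# Sketch: check a dataset against the SAC dataset limits listed above and
# split rows into batches that stay under the per-request size cap.
# The constants come from the list above; the batching strategy is illustrative.

MAX_ROWS = 2_000_000_000               # 2 billion rows per dataset
MAX_COLUMNS = 1_000                    # 1000 columns per dataset
MAX_REQUEST_BYTES = 100 * 1024 * 1024  # 100 MB per HTTP request

def validate_dataset(rows, columns):
    """Raise if the dataset exceeds the documented SAC limits."""
    if len(columns) > MAX_COLUMNS:
        raise ValueError(f"too many columns: {len(columns)} > {MAX_COLUMNS}")
    if len(rows) > MAX_ROWS:
        raise ValueError(f"too many rows: {len(rows)} > {MAX_ROWS}")

def batch_rows(rows, max_bytes=MAX_REQUEST_BYTES):
    """Group rows into batches whose serialized CSV size stays under max_bytes."""
    batches, current, size = [], [], 0
    for row in rows:
        line = ",".join(str(v) for v in row) + "\n"
        if current and size + len(line) > max_bytes:
            batches.append(current)
            current, size = [], 0
        current.append(row)
        size += len(line)
    if current:
        batches.append(current)
    return batches
```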

Previously I shared How to Automate Web Data acquisition with SAP Data Intelligence; now we can extend that by pushing the dataset to SAC.

The graph from that post runs in Data Intelligence and looks as below.

Before we can push the data to SAC, we need to format the data into the new message.table format. For this we use the Decode Table operator, setting the input format to CSV.
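Conceptually, the Decode Table operator turns the raw CSV byte stream into a typed table message. A rough Python illustration of that conversion (the real operator is configured in the Modeler, not hand-written; the column names here are supplied separately because the downloaded file has no usable header row):

```python
import csv
import io

def decode_csv_to_table(raw_bytes, column_names):
    """Illustrative sketch of a CSV -> message.table conversion:
    parse the CSV payload and pair the rows with externally supplied
    column names (the web download has no usable header row)."""
    reader = csv.reader(io.StringIO(raw_bytes.decode("utf-8")))
    body = [row for row in reader if row]
    return {"columns": column_names, "rows": body}
```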

Using the SAP Analytics Cloud Formatter, we specify how SAC should create the dataset.

In the exchange rate data downloaded from the web, we do not have usable column headings; these need to be provided in the Output Schema.

Switching to SAP Analytics Cloud.

SAC has API access that we need to enable; navigate to System Administration – App Integration.
Here we see the parameters used to authenticate.
We enable this access by adding a new OAuth Client.

Complete the OAuth Client request as below, replacing the Data Intelligence host name with your own.

For CF SAC tenants, use this format for the redirect URI:

https://<SAP_Data_Intelligence_Hostname>/**

For Neo tenants, use:

https://<SAP_Data_Intelligence_Hostname>/app/pipeline-modeler/service/v1/runtime/internal/redirect
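The two formats can be captured in a small helper. A sketch, assuming only the two tenant types described above (the function and its `tenant_type` values are hypothetical, for illustration):

```python
def redirect_uri(di_hostname, tenant_type):
    """Build the SAC OAuth redirect URI for a Data Intelligence host.
    tenant_type is 'cf' (Cloud Foundry) or 'neo', matching the two
    formats shown above."""
    if tenant_type == "cf":
        return f"https://{di_hostname}/**"
    if tenant_type == "neo":
        return (f"https://{di_hostname}"
                "/app/pipeline-modeler/service/v1/runtime/internal/redirect")
    raise ValueError(f"unknown tenant type: {tenant_type}")
```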


 SAC CF OAuth Client Information

Press the Add button, and it will generate a Client ID and Secret to use in our Data Intelligence operator.

SAC CF OAuth Client Config

 

If you have a Neo tenant, the New OAuth Client screen is slightly different: you are required to enter the OAuth Client ID and Secret, which we will need in Data Intelligence.


SAC Neo OAuth Client Configuration

 

Copy and paste your OAuth URLs from SAC and your OAuth Client ID and Secret information into the SAP Analytics Cloud Producer.
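Behind those settings, the producer presumably performs a standard OAuth 2.0 authorization-code exchange with these values. A hedged sketch of the token request it would assemble (all URLs and values here are placeholders — copy the real ones from the App Integration screen; this is the generic OAuth 2.0 flow, not the operator's actual code):

```python
from urllib.parse import urlencode

def build_token_request(token_url, client_id, client_secret,
                        auth_code, redirect_uri):
    """Assemble a standard OAuth 2.0 authorization-code token request.
    Argument values are placeholders; in practice they come from the SAC
    App Integration screen and the OAuth Client created there."""
    body = urlencode({
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    })
    return {
        "url": token_url,
        "headers": {"Content-Type": "application/x-www-form-urlencoded"},
        "body": body,
    }
```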


SAP Analytics Cloud Producer Node Config

Our completed pipeline looks like this:

Run the pipeline and check the Wiretap connected to the SAP Analytics Cloud Producer.
It shows an openapi.status_code 401 Unauthorized message.

To resolve the openapi.status_code 401 Unauthorized, we must authorize the API token access. This is available through the Open UI option in the SAP Analytics Cloud Producer.

Opening the UI brings you here; click on the link to grant authorization.

Granting permission shows you the Access Token.

We should stop and re-run the pipeline, and check the Wiretap once more.

If you see other messages, such as openapi.status_code 403 Forbidden, you will likely need to have the SAC dataset API enabled on your SAC tenant. This can be done by logging a support ticket, or a Jira if you are an SAP employee.

If we do have the dataset API enabled, the output should be similar to the Wiretap below, showing that we received two API responses.

At 11:07:57 we see the existing SAC datasets and their IDs.

At 11:08:01 we see the creation of our new dataset, DH_WEBDATA_EUR_GBP.

Switching to SAC, we can verify the dataset has been created under “My Files”.

Opening the dataset shows our data safely in SAC.

From here we can quickly build some cool visualisations and even use that data with Smart Predict.

Conclusion

We can now seamlessly push data into SAC from SAP Data Intelligence Cloud or on-premises. This could be simple data movement, or integrating the output of machine learning models from Data Intelligence into SAP Analytics Cloud. I hope this blog post has helped you better understand another integration option.

Assigned Tags

      44 Comments
      Maria Laricheva

      Hi Ian,

      Thanks for your article!

      I've recently got a question from customer regarding integration from SAP Analytics Cloud, but it was about exporting the data from SAC and not pushing it. As an example, in planning scenario the customers may export the forecast and publish it somewhere else or somehow process it further. What would be the right way to do it? I've heard that CPI-PI iFlow operator may be used here, but it's not quite clear for me how to do it.

      Thanks,

      -- Maria

      Martijn van Foeken

      Hi Ian,

      First of all, great blog! I do have a question however: Is there a parameter to specify in which folder in SAP Analytics Cloud the dataset is created/updated? If the dataset ends up in the root folder of the user specified in the 'job' it will be an additional step to setup sharing for each individual dataset.

      Thanks for your reply!

      Kind regards,

      Martijn

      Ian Henry
      Blog Post Author

      Hi Martijn,

      Thanks for the feedback.  You are correct, the dataset will be stored in the root folder of the user who authenticated to SAC.  Currently moving the dataset would be an additional step required.

      Thanks, Ian.

      Martijn van Foeken

      Hi Ian,

      Thanks for your reply. If you would move the dataset will it break the flow in Data Intelligence? Or does it refer to the internal ID of the dataset in SAC?

      Kind regards,

      Martijn van Foeken | Interdobs

      Ian Henry
      Blog Post Author

      Yes, you can move the dataset once created.

      When you append to a dataset, you are required to specify the Dataset ID.  The Dataset ID is being used to refer to the existing datasets.
Beware: currently DataHub only supports new and append, not overwrite, for datasets.

      Mary Liu

      Hello Ian,

      Thank you so much for the article! I am trying to get this to work, but when I select "API Access" for the Purpose it does not ask for a "Redirect URI", but when I select "Interactive Usage" it does. What option should I go for?

       

      "Interactive Usage":

      "API Access":

       

      Thank you so much!

      Mary Liu

      Ian Henry
      Blog Post Author

      Hi Mary Liu,

      I checked my SAC tenant today, I am on 2020.3.1
      Your SAC OAuth Client screens appear to be different to my tenant.
      For me both Interactive usage and API Access have the Redirect URI.

I would try Interactive Usage; it looks the most similar to my configuration.

      Let me know how you get on.

      Jürgen Wiebe

      Hello Ian,

       

I tried to do the same thing, but in my Wiretap I get a 400 error.

       

I get an access token in the SAP Analytics Producer but don't know why I get "HTTP 400 Bad Request". The host of SAC should be right.

      Ian Henry
      Blog Post Author

      Hi Jurgen,

      HTTP 400 Bad Request, suggests to me that SAC doesn't like what you are sending it.

      I would add a multiplexer and review how the Table Decoder is outputting the message.table.

      You may have the wrong definition or not specified the column headings and/or data types correctly.

      Maxime Simon

      I also got many 400 errors and could not find a comprehensive description of what is allowed by SAC.

      • CSV file headers can contain only English letters, numbers (no Asian characters) and some special characters such as - or _
      • They must start with an English letter (not a number).
      • To use the headers from your CSV directly, the data type must be "string". You then have to switch it back to an integer in the SAC model.
      Bowen Ren

      Thanks for the summary, it really helps out!~

       

      Tom Hu

Besides the limitations you shared, the column names have to be at least 2 characters long. Column names with a single letter also cause a 400 error.
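Taken together, the header rules reported in these comments can be checked up front. A sketch (the rules are as reported by commenters here, not from official documentation, so treat them as assumptions):

```python
import re

# Header pattern as reported in the comments: starts with an English letter,
# continues with English letters, digits, underscore or hyphen, and is at
# least 2 characters long in total.
_HEADER_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_-]+$")

def valid_sac_header(name):
    """Return True if a CSV column header satisfies the reported SAC rules."""
    return bool(_HEADER_RE.fullmatch(name))
```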

      Ana-Maria TARCA

      Hi Ian,

      Thank you for the article, it is very helpful.

      I get the error:

      {"message.request.id":"","openapi.header.content-type":"application/json","openapi.status_code":"403"}{"status":403,"message":""}

      You mention: "If you see other messages such as openapi.status_code 403 Forbidden then you will likely need to get the SAC dataset API enabled on your SAC tenant, this can be done by logging a support ticket or a Jira if you are SAP employee." .

      What needs to be done exactly (if not an SAP employee), I do not find any documentation/note for dataset API.

      Thank you,

      Ana Maria

      Ian Henry
      Blog Post Author

      Hi Ana Maria,

Thanks.  The dataset API toggle needs to be enabled on your SAC tenant.  The process is actually the same: log a support ticket and ask for the dataset API toggle to be enabled.

      Mention that you are attempting to push a dataset from Data Intelligence.

      James Yao

      Hi Ian,

      Thank you very much for this blog!

      I have one question: Does it work in DI on-prem version as well ? If I have a DI 3.0 op deployed on AWS EKS, how should I config the connection to SAC ?

      BR, James.

      Ian Henry
      Blog Post Author

      Thanks James,

Yes, it is exactly the same with DI on-prem or DI cloud edition. I have configured both with no changes required other than the redirect URL hostname for the SAC OAuth Client.

      James Yao

      Hi Ian,

      I have successfully configured SAC integration in my DI 3.0 op environment. It works well.

      Thanks again!

      Ian Henry
      Blog Post Author

      Excellent, glad you got it connected.

      Thank you for the feedback.

      Aleksey Salinin

Ian, do I understand correctly that in the case of a DI on-prem in conjunction with a SAC dataset, the data is not uploaded to the cloud?

      Ian Henry
      Blog Post Author

      Hi Aleksey,

      With the SAC integration the dataset is uploaded to the SAC tenant in the cloud.

      This is true for both a DI on-prem or DI Cloud setup.

      Marcus Schiffer

      Hi,

The connector creates a dataset in SAC. How would we create a model and update the model on a regular basis from DI?

      Thanks for any hint.

       

      Regards

      Marcus

      Ian Henry
      Blog Post Author

      Hi Marcus,

      The Data Intelligence integration happens at the dataset level.  You would build a model on top of the dataset. You could then use Data Intelligence integration to update the data in this model.

      Marcus Schiffer

      Hi Ian,

       

Thanks for the reply. We were told that a model (with planning) could not be updated on a schedule. We only find the typical list of connections in the menu (e.g. Google Drive, which we actually use to write to from DI and update the model from in SAC). Could you add a bit more on the procedure to schedule a model data upload from DI?

      Many Thanks

      Marcus

      Ian Henry
      Blog Post Author

      Hi Marcus,

I think we could be talking at cross purposes. Your scenario seems different, and I am not aware of how to achieve this.  You should post this question on https://answers.sap.com

      Thanks, Ian.

      Vinay Bhatt

      Hello Ian,

       

      Any clue on this message, guessing maybe something to do with SAC Authorizations ?

      I’m using SAP DI cloud instance.

{"message.request.id":"initialRequest","openapi.header.content-type":"application/json","openapi.status_code":"500"}{"status":500,"message":"Access to V1 datasets is not allowed."}

       

      Regards,

      Vinay

      Ian Henry
      Blog Post Author

      Hi Vinay,

      In the "SAC Cloud Formatter" Are you creating a "New" dataset or trying to "Overwrite" an existing dataset named V1?

      Do you see the existing SAC datasets retrieved?

      Cheers, Ian.

      Nina Sun

      Hi Ian,

      I got a message as below. I've successfully granted authorization like you mentioned above. Could you please give me some hints?

      "{"message.request.id":"","openapi.header.content-type":"application/json","openapi.status_code":"500"}{"status":500,"message":"The resource does not exist, or you do not have permission to read it"}"

      Thanks in advance,

      Nina

      Ian Henry
      Blog Post Author

      Hi Nina,

      Do you see anything before this in the wiretap?
      You would usually see your existing datasets being retrieved.

      I suspect one of the URLs is incorrect, perhaps the tenant name in the SAC Formatter or the host in the SAC Producer?

      Have you logged a ticket to enable the Dataset API?

      Vasi Venkatesan

      hi Ian,

      I have my instance in Azure, but I keep getting this error. Even after I stop and restart the pipeline.

I see my DI server is Not Secure in the browser URL. Not sure if it is because of that.

When I view the Wiretap, it says lookup error / no such host (error shown in the second image below). Not sure what I can do to fix this. Thanks,


      SAC Producer UI Error

       


      Wiretap error

       

      Ian Henry
      Blog Post Author

      This could be a networking issue.

      Perhaps the NAT gateway or similar on your cluster is not correct?

Can you connect to other external sites using a fully qualified name?

      You could test with the HTTP Client.

      Vasi Venkatesan

      Thank you Ian.

Yes. I am connecting to this FQDN (DI launchpad) using my SAP system, though the instance is in the customer's Azure.

The cluster load balancer is open to the public, though it is inside the customer's on-premise Azure. For example, I was able to connect to the SAP HANA instance (using the connection manager on port 30015) and it worked fine.

Will the HTTP connection be a problem even with that?

      Thank you.

      Ian Henry
      Blog Post Author

      I was thinking of the networking going out to the Internet and not in to DI.

      Does the DI system understand external (internet) traffic/addresses?

      Vasi Venkatesan

If it is name resolution (for sacgo-1.us10.sapanalytics.cloud), I doubt that it can do that. Not sure if I can put in the IP address from an nslookup of the same host.

      Ian Henry
      Blog Post Author

      In case anyone else experiences this issue, it was caused by the tenant parameter in the SAC Producer.

      It should be the hostname without any https://
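That check can be automated in any custom pre-processing step. A sketch that strips a pasted-in scheme so only the bare hostname remains (a hypothetical helper, not part of the SAC Producer operator):

```python
from urllib.parse import urlparse

def normalize_tenant(value):
    """Strip any scheme and path so only the bare hostname remains,
    as the SAC Producer's tenant parameter expects."""
    value = value.strip()
    if "://" in value:
        value = urlparse(value).netloc
    return value.split("/")[0]
```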

      Kyrill Köhn

      Hi Ian,

      This is a great blog from you again!

Unfortunately I have been struggling with creating the OAuth Client in SAP Analytics Cloud.

Our SAP Data Intelligence is running on AWS EKS. When adding the hostname, just as you showed in the screenshot, I get the following error:

      OAuth client cannot be created. Please try again.
      Correlation ID: 26612419-5521-4717-a796-177593929321

      Do you have any ideas for the reason of this error?

       

      Best Regards,

      Kyrill!

      Ian Henry
      Blog Post Author

      Hi Kyrill,

Do you have a Neo or CF SAC tenant? When creating the OAuth Client, the Data Intelligence host is not checked. I would double-check there are no spaces at either end, and try a simple https://sap.com or similar.

      Cheers, Ian.

      Kyrill Köhn

      Hi Ian,

We have a CF SAC tenant. I've just tried it again, entering https://sap.com/** into the Redirect URL box. Sadly the same error occurs.

      Best Regards, Kyrill

      Ian Henry
      Blog Post Author

      I just tried to create a new OAuth Client on my tenant, and I also receive the same error.

      Can you please log this with our support team?
      In parallel, I will do the same and update you once I get some feedback.

      Ian Henry
      Blog Post Author

      Hi Kyrill,

My incident has been resolved; I can create the OAuth Clients again.
      I hope you logged your issue too.

      Chandra Bhushan

      Hi Ian,

      Hope you are doing well!

      This blog is very informative and was quite helpful.

We have a use case where the users will have the option to enter the desired inputs in an SAC Analytical Application, which then should connect back to DI, perform the required calculations, and refresh with appropriate values in SAC. Could you suggest an operator that can perform this task of interacting continuously between SAC and DI as the user input changes?

      We have referred to your blog which is for DI to SAC data push not vice versa. Could you help/suggest with an approach to achieve this?

       

      Regards,

      Chandra Bhushan

      Iñigo de la Maza

      Hi Ian,

      Thank you very much for the post, it was very helpful. I managed to get a dataset sent to SAC once, following one by one your instructions, and everything seemed to be working fine.

      However, one week after, and without having modified any parameter of the SAP DI pipeline, it does not work anymore. The problem is that once I grant access and can visualize the Access Token (valid for one hour in theory), and I stop and re-run the pipeline, it asks me to grant permission once again. Somehow, the pipeline does not "remember" or detect, that access has already been granted on the previous run.

      Is there any requirement on the internet connection needed to have the pipeline detect that the access has already been granted?

      What do you think could be the reason?

      Thank you very much in advance and looking forward to your answer,

      Iñigo

      Ian Henry
      Blog Post Author

      Hi Iñigo,

      I presume your SAC Producer output shows 401 Unauthorized.

      This has been identified as a bug.  Please log a support ticket and we can get the fix pushed to your Data Intelligence tenant.

      Thanks, Ian.

       

      Martijn van Foeken

      Hi Ian,

      Currently it's required to grant authentication/access each time you run the pipeline. Is there a way to automate this process or is this going to change in a future release?

      Kind regards,

      Martijn van Foeken | Interdobs

      Ian Henry
      Blog Post Author

      Hi Martijn,

Yes, this is the current behaviour. The credentials are not stored anywhere. As I understand it, the pipeline could run continuously; new data could be pushed through and you would already be authenticated.  I am not aware of any changes on the roadmap.

      Thanks, Ian.