
Trigger SAP Analytics Cloud Data Management Schedule via REST API

Scope

SAP Analytics Cloud offers many possibilities for reporting and planning. SAP Analytics Cloud core planning in particular includes features such as “Generating Private Versions” and “Advanced Commenting”. To use SAP Analytics Cloud core planning, data has to be imported into SAP Analytics Cloud planning models. Planning models can use global dimensions that import data from different sources. In a hybrid scenario with SAP BW as the source system, it can be very useful to use existing InfoObjects as the import source for SAP Analytics Cloud global dimensions and to keep the data in sync.

Prepare

In this example an SAP BW InfoObject ZEPPART is used as the template for our SAP Analytics Cloud global dimension.

In SAP Analytics Cloud a new global dimension called ZEPPART_001 was created with the objective of loading the key and country key from the SAP BW source system.

In Data Management the data source SAP BW (Including BW/4HANA) with data source type “On-Premise” was selected, followed by the import data source for the SAP BW system – in our case ZW2_IMPORT.

Filter to the right data source.

Select the SAP BW Import Connection and click next.

After searching for the InfoObject – in our case ZEPPART – we selected the fields to be extracted, did some mapping and created the global dimension.


Search for the InfoObject – in our case ZEPPART.

Select the data you want to integrate and go on to finish the mapping.

Now we were ready to schedule a data load in SAP Analytics Cloud. We scheduled a load and received 89 records from SAP BW (ZW2).

Get into scheduling

A first look at the requests with the help of the Chrome Developer Tools helps to identify the requests that are used to trigger schedules and to read out the status of the scheduling.


The main service that is called is located at /sap/fpa/services/rest/fpa/dataintegration – in our case https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration. Different parameters steer the scheduling via this service.

To get the logs by model, a request is posted to the main service with the parameters action=getScheduleLogsByModel&tenant=G:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=getScheduleLogsByModel&tenant=G


The request payload of this call includes the model name in JSON format.

In our example:

{"jobType":"bpcDataImport","modelName":"t.G:ZEPPART_001","mappingId":[]}


To schedule a run, a request is posted to the main service with the parameters action=runScheduledJob&tenant=G:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=runScheduledJob&tenant=G

The payload includes the name of the DataImport "F58CA15F78AA9039E200586F05E2BB93" and the mappingId "F48CA15F78AA9039E200586F05E2BB93" in JSON format like:

{"jobType":"DAFrameworkDataImport","name":"F58CA15F78AA9039E200586F05E2BB93","description":null,"recurrence":null,"param":{"mappingId":"F48CA15F78AA9039E200586F05E2BB93","optParams":{}},"status":"STOPPED"}

Trigger schedule via REST API Call

Wouldn’t it be great to be able to trigger the load from outside, for instance from a process chain?

As a first step, we succeeded in triggering and checking a run with the help of the Advanced REST Client version 10.0.12 in Chrome.

User Authorization for REST API Calls

The first hurdle is being able to post a request from outside. The blog by Patrick Volker, https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/, helped us a lot to establish the setup and to manage our REST API calls with OAuth.

If you did the setup correctly, you should now have a cookie and a bearer token that enable you to call the REST API located in your SAP Analytics Cloud instance.
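As an illustration, the following minimal Python sketch shows how such a bearer token could be requested with a client_credentials grant. The token URL, client ID and client secret are placeholders for the values from the OAuth client configured in your tenant, and the use of the requests library is our assumption, not part of the original setup.

import requests

# Placeholders - take these values from the OAuth client configured in your tenant.
TOKEN_URL = "https://<TOKEN_URL_OF_YOUR_TENANT>/oauth/token"
CLIENT_ID = "<CLIENT_ID>"
CLIENT_SECRET = "<CLIENT_SECRET>"

# Request a bearer token with the client_credentials grant.
token_response = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# The bearer token is sent in the Authorization header of all further calls.
headers = {"Authorization": "Bearer " + access_token}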

Get Schedule Logs by Model

First we checked the current status of the schedules of our model ZEPPART_001 by posting a request to the main service /sap/fpa/services/rest/fpa/dataintegration with the parameter action=getScheduleLogsByModel, including the following request payload: {"jobType":"bpcDataImport","modelName":"t.G:ZEPPART_001","mappingId":[]}

In our example:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=getScheduleLogsByModel&tenant=G


As already mentioned, the request payload includes the model name – in our case ZEPPART_001.


As a response we receive a list of runs, each identified by a "ScheduleID", including the status and the number of rows imported.
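As a sketch, the same call could look like this in Python, reusing the bearer token headers from the snippet above; the tenant host remains a placeholder.

import requests

BASE_URL = "https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration"

# Read the schedule logs of model ZEPPART_001.
params = {"action": "getScheduleLogsByModel", "tenant": "G"}
payload = {"jobType": "bpcDataImport", "modelName": "t.G:ZEPPART_001", "mappingId": []}

response = requests.post(BASE_URL, params=params, json=payload, headers=headers)
response.raise_for_status()

# The response lists the previous runs, each identified by a ScheduleID,
# together with their status and the number of imported rows.
print(response.json())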

Trigger your Schedule

In a next step we now try to trigger our load with ?action=runScheduledJob.

In our case, a request is posted to the main service with the parameters action=runScheduledJob&tenant=G:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=runScheduledJob&tenant=G

In the payload of the request, the name of the DataImport "F58CA15F78AA9039E200586F05E2BB93" and the mappingId "F48CA15F78AA9039E200586F05E2BB93" have to be provided like:

{"jobType":"DAFrameworkDataImport","name":"F58CA15F78AA9039E200586F05E2BB93","description":null,"recurrence":null,"param":{"mappingId":"F48CA15F78AA9039E200586F05E2BB93","optParams":{}},"status":"STOPPED"}

The DataImport name and mappingId can be found in the previous response that retrieved the logs for the model in question.

Status 200 in the payload of the response confirms that our request was executed successfully. In addition we receive the scheduleId that identifies the load we triggered.

{
  "status": 200,
  "data": {
    "scheduleId": "F58CA15F78AA9039E200586F05E2BB93"
  }
}
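Put together, the trigger call could look like the following Python sketch; it reuses BASE_URL and headers from the previous snippets and reads the scheduleId out of the response shown above.

# Trigger the data load; the DataImport name and mappingId come from the log response.
params = {"action": "runScheduledJob", "tenant": "G"}
payload = {
    "jobType": "DAFrameworkDataImport",
    "name": "F58CA15F78AA9039E200586F05E2BB93",
    "description": None,
    "recurrence": None,
    "param": {"mappingId": "F48CA15F78AA9039E200586F05E2BB93", "optParams": {}},
    "status": "STOPPED",
}

response = requests.post(BASE_URL, params=params, json=payload, headers=headers)
response.raise_for_status()

result = response.json()
if result.get("status") == 200:
    # The scheduleId identifies the load we just triggered.
    schedule_id = result["data"]["scheduleId"]
    print("Triggered schedule:", schedule_id)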

In SAP Analytics Cloud we can now spot a new schedule triggered by our REST API call.

Check Data Load

It is also very helpful to be able to check the status of our run via the REST API.

We call the REST API again with the parameter action=getScheduleLogsByModel for our model. With the help of the scheduleId "F58CA15F78AA9039E200586F05E2BB93" provided by the previous response, we are able to identify the status and some metadata of our load.
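Continuing the sketches above, such a status check could look like this; note that the exact structure of the log entries in the response is an assumption on our side and may differ on your tenant.

# Read the schedule logs again and look for the run we just triggered.
params = {"action": "getScheduleLogsByModel", "tenant": "G"}
payload = {"jobType": "bpcDataImport", "modelName": "t.G:ZEPPART_001", "mappingId": []}

response = requests.post(BASE_URL, params=params, json=payload, headers=headers)
response.raise_for_status()

# Assumption: the log entries are returned as a list of dictionaries
# that carry a "scheduleId" and a "status" field.
for entry in response.json().get("data", []):
    if entry.get("scheduleId") == schedule_id:
        print("Status of our load:", entry.get("status"))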

Conclusion

With this approach you are able to execute a data load in SAP Analytics Cloud with the help of a REST API call and to check its execution status. This can help you to integrate and/or trigger a load with the help of external tools.

In our scenario, SAP BW could trigger the SAP Analytics Cloud load with the help of a process chain right after updating the InfoObject.

16 Comments
    • Hi Jef Baeyens

      SAP supports the SAP Analytics Cloud application, the functioning of its API framework, and the availability of its Data Acquisition services.

      However, any custom development on top of that falls within the remit of the customers' own code support.

      In this case, I think it's Postman being used to test the program logic - you wouldn't use that to send REST requests productively.

      Regards
      H

      • Thanks Henry. Great to hear the API framework itself is officially supported!
        Now I wish we could use them more easily in application designer 🙂

  • Fantastic article, thank you!

    How can I obtain that x-csrf-token? I am able to obtain a bearer token but I do not know where I should find the x-csrf-token value and also the cookie value as mentioned in one of your screenshots. I am able to successfully use the API for the users and groups resources but this "modelling API" seems to need a slightly different approach.

    Thanks for any additional info!

    Regards,

    PN

    • At this moment I am getting the message below while calling the getScheduleLogsByModel action:

      Error Code: 3000, Error Message: User is not properly configured or user may not exist in tenant
      Thank you!
      PN
  • How do you get this setup to work if you're using an identity provider to authenticate users and not the built-in authentication?

    • Hi Vi Tran,

      We are using OAUTH2 with an OAUTH2 client from the backend.

       

      Like described in https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/

      br mario

      • Is there a good guide out there for using OAUTH2 with SAML? We are using a SAML IDP to authenticate the user and not the built-in SAC user authentication.

        • I am trying to do a similar thing. We are using Azure AD FS for single sign-on federation in our SAC environment for end users and it is working fine. Ours is a Cloud Foundry (CF) instance as opposed to a NEO instance. Much of the documentation I've stumbled upon makes more references to NEO environments than to CF environments.

          My goal is to create a PowerShell script that can start an SAC dimension data management refresh job which would be called by an external job scheduler.  I have reviewed https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/ many times but am unable to get it working.

          Using Postman, I am able to retrieve an OAuth token with basic authentication and a grant type of client_credentials.

          OAuth Client URLs

          Using that token as a bearer token for calls to various endpoints under https://{SAC_Hostname}.hcs.cloud.sap/api/v1, I am able to retrieve a list of groups, users, get details on an individual user, and retrieve a list of resources (SAP Analytics Cloud User and Team Provisioning API - SAP Help Portal).

          However, this same bearer token does not work for the dataintegration endpoint which is used to initiate data refreshes within SAC (https://{SAC_Hostname}.hcs.cloud.sap/sap/fpa/services/rest/fpa/dataintegration).

           

          The path is different, and I'm assuming it is a different API altogether, requiring its own authentication and token retrieval steps. I can't figure out how to authenticate to this API to interact with it, and cannot find any useful documentation besides this blog and the other one referenced (https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/).

          Any additional guidance you can provide or working examples in a similar environment as ours would be much appreciated!

          • Hi Kurt,

            I was trying exactly the same thing and was also facing the same issue. Working with groups, users and so on does work, but I couldn't find any solution so far to work with the endpoint/API for scheduling.

            In detail, when I try to use this API I get some HTML/JavaScript content containing the comment "..<!-- we should only load this content when not authenticated -->...".

            Any additional help/guidance for this setup would really be great!

          • Hi Michaela,

            Thank you for validating my issue.

            I am getting the same response you are seeing:

            <html>
            <head>
            </head>
            <body>
                <!-- we should only load this content when not authenticated-->
                <script>
                    if (window.parent === window) {
                  // the page is not loaded in an iframe
                  var currentUrl = window.location.href;
                  // e.g. https://myhost:433
                  var baseUrl = window.location.protocol + '//' + window.location.host;
                  var currentRelativeUrl = currentUrl.substring(baseUrl.length);
                  var encodedRedirectUrl = encodeURIComponent(currentRelativeUrl);
                  var redirectEndpoint = '/approuter/v1/redirect?url=' + encodedRedirectUrl;
                  window.location.href = redirectEndpoint;
                } else {
                  var url = '/approuter/v1/embedded-auth';
                  var popupWindow;
                  window.addEventListener('message', function(event) {
                    // check event.origin
                    console.log(event.data);
                    if (popupWindow) {
                      popupWindow.close();
                      window.location.reload();
                    }
                  });
                  popupWindow = window.open(url, 'SAP Analytics Cloud', 'width=600,height=500,menubar=0,dependent=1,status=0');
                }
                </script>
            </body>
            </html>
          • We are facing the same issue with our Okta SAML IDP. Still playing with things as far as getting a destination server setup to see if that will allow this workaround to work.