
Trigger SAP Analytics Cloud Data Management Schedule via REST API


SAP Analytics Cloud offers many possibilities for reporting and planning. SAP Analytics Cloud core planning in particular includes features like “Generating Private Versions” and “Advanced Commenting”. To use SAP Analytics Cloud core planning, data has to be imported into SAP Analytics Cloud planning models. These models can use global dimensions that import data from different sources. In a hybrid scenario with SAP BW as the source system, it can be very useful to use existing InfoObjects as the import source for SAP Analytics Cloud global dimensions and keep the data in sync.


In this example the SAP BW InfoObject ZEPPART is used as a template for our SAP Analytics Cloud global dimension.

In SAP Analytics Cloud a new global dimension called ZEPPART_001 was created, with the objective of loading the key and country key out of the SAP BW source system.

In Data Management, the data source SAP BW (Including BW/4HANA) with Data Source Type “On-Premise” was selected, followed by the import data source for the SAP BW system – in our case ZW2_IMPORT.

Filter to the right data source.

Select the SAP BW Import Connection and click next.

After searching for the InfoObject – in our case ZEPPART – we selected fields to be extracted, did some mapping and created the global dimension.


Search for the InfoObject – in our case ZEPPART.

Select the data you want to integrate and go on to finish the mapping.

Now we were ready to schedule a data load out of SAP Analytics Cloud. We scheduled a load and received 89 records out of SAP BW (ZW2).

Get into scheduling

A first look at the requests with the help of Chrome Developer Tools helps to identify the requests that are used to trigger schedules and read out the status of the scheduling.


The main service that was called is located at /sap/fpa/services/rest/fpa/dataintegration – in our case https://&lt;TENANT&gt;/sap/fpa/services/rest/fpa/dataintegration. Different query parameters steer what the service does.
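Since this service is undocumented, we derived the URL pattern purely from the captured traffic. As a sketch (the &lt;TENANT&gt; host and the tenant id "G" are placeholders from our system), the request URL can be assembled like this:

```python
from urllib.parse import urlencode

# Base URL of the (undocumented) data integration service; <TENANT> is a
# placeholder for your SAP Analytics Cloud host name.
BASE = "https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration"

def service_url(action: str, tenant: str = "G") -> str:
    """Build the service URL with the action and tenant query parameters."""
    return BASE + "?" + urlencode({"action": action, "tenant": tenant})

print(service_url("getScheduleLogsByModel"))
# https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=getScheduleLogsByModel&tenant=G
```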

To get the logs by model, a request was posted to the main service with the parameters action=getScheduleLogsByModel&tenant=G.



The request payload of this call includes the model name in JSON format.

In our example :



To schedule a run, a request was posted to the main service with the parameters action=runScheduledJob&tenant=G.


The payload includes the name of the data import "F58CA15F78AA9039E200586F05E2BB93" and the mappingId "F48CA15F78AA9039E200586F05E2BB93" in JSON format like:


Trigger schedule via REST API Call

Wouldn’t it be great to be able to trigger the load from outside, for instance from a process chain?

In a first step, we succeeded in triggering a run and checking its status with the help of Advanced REST Client version 10.0.12 in Chrome.

User Authorization for REST API Calls

The first hurdle is being able to post a request from outside. The blog by Patrick Volker helped us a lot to establish the setup and manage our REST API calls with OAuth.

If you did the setup correctly, you should now have a cookie and a bearer token enabling you to call the REST APIs located in your SAP Analytics Cloud instance.
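As a minimal sketch of the headers we attached to every call (the token values are placeholders, and the exact header set may differ on your tenant – we needed the bearer token from the OAuth flow plus the x-csrf-token and session cookie):

```python
def build_headers(bearer_token: str, csrf_token: str) -> dict:
    """Header set we used for POSTs against the service (sketch only;
    token values come from the OAuth setup described above)."""
    return {
        "Authorization": f"Bearer {bearer_token}",
        "x-csrf-token": csrf_token,
        "Content-Type": "application/json",
    }

# Usage with placeholder credentials:
headers = build_headers("<BEARER_TOKEN>", "<CSRF_TOKEN>")
```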

Get Schedule Logs by Model

First we checked the current status of the schedules of our model ZEPPART_001 by posting a request to the main service /sap/fpa/services/rest/fpa/dataintegration with the parameter action=getScheduleLogsByModel, including the following request payload: {"jobType":"bpcDataImport","modelName":"t.G:ZEPPART_001","mappingId":[]}

In our example:



As already mentioned, the request payload includes the model name – in our case ZEPPART_001.


As a response we receive a list of runs identified by a “ScheduleID”, including the status and the number of rows imported.
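The log request can be sketched in Python as below. The payload matches the one captured in the browser; the actual network call is commented out so the structure can be shown without a live tenant, and the field names used in the parser are assumptions based on the responses we observed, not a documented contract:

```python
import json

# Payload as captured in the browser for our model ZEPPART_001.
payload = {
    "jobType": "bpcDataImport",
    "modelName": "t.G:ZEPPART_001",
    "mappingId": [],
}
body = json.dumps(payload)

# Against a live tenant you would post it roughly like this
# (url and headers built as sketched earlier; requests assumed installed):
# resp = requests.post(url, headers=headers, data=body)
# runs = resp.json()["data"]

def summarize_runs(runs: list) -> list:
    """Reduce each log entry to (scheduleId, status); key names assumed."""
    return [(r.get("scheduleId"), r.get("status")) for r in runs]
```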

Trigger your Schedule

In a next step we now try to trigger our load with action=runScheduledJob.

In our case a request to the main service with parameters action=runScheduledJob&tenant=G:


In the payload of the request, the name of the DataImport "F58CA15F78AA9039E200586F05E2BB93" and the mappingId "F48CA15F78AA9039E200586F05E2BB93" have to be provided like:


DataImport name and mappingId can be found in the previous response getting logs for the model in case.
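Triggering and verifying the trigger can be sketched as follows. The JSON key names in the payload are placeholders – the real names are visible in the captured browser payload on your tenant – while the response handling matches the {"status": 200, "data": {"scheduleId": ...}} shape we received:

```python
# Placeholder key names; inspect the captured browser payload for the
# real ones on your tenant.
trigger_payload = {
    "dataImportName": "F58CA15F78AA9039E200586F05E2BB93",  # hypothetical key
    "mappingId": "F48CA15F78AA9039E200586F05E2BB93",       # hypothetical key
}

def extract_schedule_id(response_json: dict) -> str:
    """Fail loudly on a non-200 status; otherwise return the new scheduleId."""
    if response_json.get("status") != 200:
        raise RuntimeError(f"trigger failed: {response_json}")
    return response_json["data"]["scheduleId"]
```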

A status of 200 in the payload of the response confirms that our request was executed successfully. In addition, we receive the scheduleId that identifies the load we triggered.


{
  "status": 200,
  "data": {
    "scheduleId": "F58CA15F78AA9039E200586F05E2BB93"
  }
}



In SAP Analytics Cloud we can now spot a new schedule triggered by our REST API call.

Check Data Load

It would be very helpful to check the status of our run via the REST API as well.

We call the REST API again with the parameter action=getScheduleLogsByModel for our model. With the help of the scheduleId "F58CA15F78AA9039E200586F05E2BB93" provided by the previous response, we are able to identify the status and some metadata of our load.
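A small helper can pick our run out of the returned log list (the field names are again assumptions based on the responses we observed):

```python
def find_run(runs: list, schedule_id: str) -> dict:
    """Return the log entry matching the scheduleId we triggered;
    key names are assumed from the observed responses."""
    for run in runs:
        if run.get("scheduleId") == schedule_id:
            return run
    raise KeyError(f"no run found for scheduleId {schedule_id}")
```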


With this approach, you are able to execute a data load in SAP Analytics Cloud with the help of a REST API call and check its execution status. This could help you to integrate and/or trigger a load from external tools.

In our scenario, SAP BW could trigger the SAP Analytics Cloud load via a process chain right after updating the InfoObject.

    • Hi Jef Baeyens

SAP supports the SAP Analytics Cloud application, the functioning of its API framework, and the availability of its Data Acquisition services.

However, any custom development on top of that falls within the remit of the customers’ own code support.

In this case, I think it’s Postman being used to test the program logic – you wouldn’t use that to send REST requests productively.


      • Thanks Henry. Great to hear the API framework itself is officially supported!
        Now I wish we could use them more easily in application designer 🙂

  • Fantastic article, thank you!

How can I obtain that x-csrf-token? I am able to obtain a bearer token, but I do not know where I should find the x-csrf-token value and also the cookie value as mentioned in one of your screenshots. I am able to successfully use the API for users and groups resources, but this “modelling API” seems to need a slightly different approach.

    Thanks for any additional info!



    • At this moment I am getting the message below while calling getScheduleLogsByModel action:

      Error Code: 3000, Error Message: User is not properly configured or user may not exist in tenant
      Thank you!