Edward Zhang

Integrate External Applications Using Multi Actions in SAP Analytics Cloud

This blog introduces how to set up the HTTP API step in multi actions to integrate external applications, both SAP and non-SAP products.

The new feature, called the API step, lets you integrate your own functionality into SAP Analytics Cloud multi actions without modifying the original applications. The API step acts as a customer exit: a hook onto which you can attach your own add-on functionality.

Before building the API step in a multi action, you must create a connection dedicated to API integration. To create the connection:

  1. Go to Connections > Add Connection.

    add a connection

  2. Under Connect to Public API, choose Select HTTP API Connection.

select HTTP API connection


Fill in the mandatory connection configuration. The API step in multi actions targets backend service integration, so the HTTP API connection supports basic authentication and the OAuth 2.0 client credentials grant. The public API you provide must therefore support at least one of these authentication methods.

Put simply, to save the connection the system validates the endpoint with a GET request that must use basic or client-credentials authentication and return HTTP code 200 with a JSON MIME type response. This validation works the same way as for other connection types.

For an on-premise connection, the port must be specified explicitly due to SAP Cloud Connector (SCC) restrictions.
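As an illustration of the validation above, here is a minimal Python sketch (hypothetical helper names, not SAP code) that builds a basic authentication header and checks whether a validation response satisfies the 200-plus-JSON rule:

```python
import base64


def basic_auth_header(user: str, password: str) -> str:
    """Build an HTTP Basic authentication header value (RFC 7617)."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"


def is_valid_validation_response(status_code: int, content_type: str) -> bool:
    """Mirror the saving check described above: the validation GET must
    return HTTP 200 with a JSON MIME type (charset suffix allowed)."""
    mime = content_type.split(";")[0].strip().lower()
    return status_code == 200 and mime == "application/json"
```

Any endpoint that fails this shape of check will prevent the connection from being saved.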

configure HTTP API connection


The connection also supports on-premise landscapes. If the HTTP API resides in a network that SAP Analytics Cloud cannot access, you need to set up SAP Cloud Connector in your network; it acts as the bridge between SAP Analytics Cloud and your on-premise service. For how to set up and configure SAP Cloud Connector, refer to the blog Installation and Configuration of SAP Cloud Connector.


Create API Step

Once the above prerequisites are in place, go to multi actions and create an API step from the toolbar or the popup menu. The connection you created now appears in the list; select the one you want to use.

Next, specify the URL of the HTTP API to integrate with SAP Analytics Cloud. The API step supports only the POST method and only the JSON MIME type, which is predefined in the default headers.
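To make the request shape concrete, the following Python sketch (illustrative only; the API step builds this internally) constructs the kind of request the step issues, a POST with a JSON body and JSON content type:

```python
import json
import urllib.request


def build_api_step_request(url: str, body: dict) -> urllib.request.Request:
    """Build a POST request with a JSON payload and JSON MIME type,
    matching the only method and content type the API step supports."""
    data = json.dumps(body).encode("utf-8")
    return urllib.request.Request(
        url,
        data=data,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
    )
```

The URL and body here are placeholders; the real values come from your API step configuration.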

basic configuration


If needed, you can add header fields. Additional header fields allow only 'Prefer: respond-async' and custom headers whose keys begin with 'X-', except 'x-forwarded-host'.

To set multiple values for a header field, you can separate the values using a comma or enter multiple header fields with the same key.
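The header rules above can be sketched as a small validation function (hypothetical name; assuming header keys are matched case-insensitively):

```python
def is_allowed_custom_header(key: str, value: str) -> bool:
    """Check a header against the rules above: only 'Prefer: respond-async'
    and custom 'X-' keys are accepted, excluding 'x-forwarded-host'."""
    k = key.lower()
    if k == "prefer":
        # 'Prefer' is only accepted with the value 'respond-async'.
        return value.strip().lower() == "respond-async"
    if k == "x-forwarded-host":
        # Explicitly blocked even though it starts with 'x-'.
        return False
    return k.startswith("x-")
```

Anything else, such as an Authorization header, is rejected by the editor.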

specify request headers


Now specify the body of the POST request, if any. You can also click the Edit JSON button to open an edit dialog if the inline editor is too small.

when triggering


Depending on the API provider, you can determine the trigger status by HTTP code alone or by the combination of HTTP code and response body.

According to the common design standard:

The HTTP API returns code 202 if the asynchronous activity is triggered successfully, meaning the remote API has started processing the backend job in the remote system. If it returns 40x/50x, the HTTP API failed to process the backend job, which causes the API step to fail in the multi action.

If the status of the HTTP API is determined by both HTTP code and response body, the response body must contain two mandatory properties (status, jobId) and may contain one optional property (message).

The status property represents the asynchronous activity's state; its accepted values are “DONE”, “FAILED”, and “IN_PROCESS”. The jobId property identifies the asynchronous activity, and its value can be used to query the status of the running job remotely. The message property carries detailed warning or error messages from the HTTP API trigger.
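A minimal sketch of parsing such a trigger response body, assuming only the three properties described above (the function name is illustrative):

```python
import json

# Accepted values for the mandatory 'status' property.
VALID_STATUSES = {"DONE", "FAILED", "IN_PROCESS"}


def parse_trigger_response(raw: str) -> dict:
    """Extract the properties described above from a trigger response body:
    'status' and 'jobId' are mandatory, 'message' is optional."""
    body = json.loads(raw)
    status = body["status"]          # KeyError if missing: mandatory
    if status not in VALID_STATUSES:
        raise ValueError(f"unexpected status: {status!r}")
    return {
        "status": status,
        "jobId": body["jobId"],      # mandatory
        "message": body.get("message", ""),  # optional
    }
```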

As mentioned above, to track the asynchronous activity, the system assumes a corresponding query API is available. You are supposed to set its URL in the Get API Execution Result section. However, if the API follows the OData standard, setting the URL here is not mandatory, because the Location header in the trigger response can be used to query the status.
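The polling contract can be sketched as a loop. In this illustrative sketch the `fetch` callable is injected so the code stays self-contained and does not assume any real endpoint; it returns an `(http_code, body)` pair per poll:

```python
import time
from typing import Callable, Tuple


def poll_until_done(fetch: Callable[[], Tuple[int, str]],
                    interval: float = 0.0,
                    max_attempts: int = 10) -> str:
    """Poll a status endpoint the way described above (OData style):
    202 means the job is still running, 200 means it has finished."""
    for _ in range(max_attempts):
        code, body = fetch()
        if code == 200:
            return body                      # job completed
        if code != 202:
            raise RuntimeError(f"polling failed with HTTP {code}")
        time.sleep(interval)                 # still in process, wait
    raise TimeoutError("job did not complete in time")
```

In a real multi action the polling cadence and timeout are managed by SAP Analytics Cloud, not by your code.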

when polling



API Step integration case description

The integration supports the two most popular styles: OData and plain REST APIs. If the triggering response follows the OData standard, the API step expects HTTP code 202 and a corresponding Location header. Refer to the following example:

Triggering – OData standard


If the polling response follows the OData standard, it expects 202 for HTTP code if the job is still in process, and 200 if the job is completed.

Polling – OData standard


If the API is a non-OData interface, the API step accepts HTTP code 200 or 202 together with concrete information in the response body indicating the job status.

Triggering – non-OData standard


The corresponding polling response for non-OData APIs accepts 202 or 200 while the job is still in process; only the details in the body indicate whether the job has completed.

Polling – non-OData standard
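The two conventions above can be summarized in a compact, hypothetical classifier (not SAP code): for OData the HTTP code alone decides, while for plain REST a 200/202 must be paired with the 'status' property in the JSON body:

```python
import json


def interpret_poll_response(protocol: str, http_code: int, body: str = "") -> str:
    """Classify a polling response per the two conventions described above."""
    if protocol == "odata":
        # OData: 200 = finished, 202 = still running, anything else = failure.
        return {200: "DONE", 202: "IN_PROCESS"}.get(http_code, "FAILED")
    # Plain REST: 200/202 are acceptable, but only the body tells the status.
    if http_code in (200, 202):
        return json.loads(body).get("status", "FAILED")
    return "FAILED"
```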


Finally, the API step also offers the option to judge the API status by HTTP code only. If you choose this option, the response body is ignored.


Feel free to share your feedback or thoughts in the comments section, and refer to the SAP Community resources below for more information.


Using the instructions in this blog post, you can set up an HTTP API connection and integrate an external API with the API step in multi actions.

I hope this blog post helps your team implement the same integration faster.

If you find this blog post helpful, please press the like button 🙂

      Gerrit Posthumus

      Based on your blog post, I have the following additional questions:

      1. Is it possible to use parameters in the JSON body so we can query the API dynamically, and can these parameters be set via prompts or via the JavaScript API in an analytic application?
      2. Is it possible to post-process the response data sent back from the API, for example in a data action step, and is it also possible to get the response data back via the JavaScript API in an analytic application?
      Edward Zhang (Blog Post Author)

      Hi Gerrit Posthumus,

      Thanks a lot for reading and for the questions.

      Q1: Is it possible to use parameters in the JSON body so we can query the API dynamically, and can these parameters be set via prompts or via the JavaScript API in an analytic application?

      A: It is not supported in the current release, but it was definitely considered when we started the feature implementation. It is on the roadmap for the near future.


      Q2: Is it possible to post-process the response data sent back from the API, for example in a data action step, and is it also possible to get the response data back via the JavaScript API in an analytic application?

      A: Unfortunately, this is not supported in the current release either. So far, the design only extracts certain properties from the response, as mentioned in the article. Because of the diversity of APIs, the feature must let customers do the mapping themselves, for example mapping run_id (on the customer side) to jobId (which SAC expects), so that most public APIs can be integrated with the multi action API step. However, this mapping is restricted to those three properties; handling complex cases such as transforming the transaction data returned in the response is not in scope, because the API step is designed as a custom exit to the calculation engine. For cases that require handling large amounts of data, you should use the data import and export APIs on both sides. I am not certain about your detailed requirement, though, so feel free to raise it here.

      Kind Regards




      Gerrit Posthumus

      Thanks for the response.

      The URL to the wiki does not work; I get an error that I am not authorized.

      Currently I have two use cases:

      1) We have created an OpenAPI servlet step in our SAP DI data pipeline that imports and transforms data files sent by our retailers, and we built in the option to trigger activities like:

      1. Reprocess a failed file after we have updated the file definition configuration
      2. Reprocess all historical files within a file definition
      3. Reprocess files based on load status
      4. Update the file status of a particular file instance
      5. etc

      We would like to integrate the above into the administration cockpit application that we have built as an SAC analytic application. To make this work we need support for parameterization and the POST command, because the DI OpenAPI servlet step does not support the GET command. This all needs to be triggered via the JavaScript API in the analytic application.

      2) We need near real-time inference against our price optimization model on AWS SageMaker to simulate price optimization, predicting how much more or less can be sold when prices are raised or discounts are given.

      To integrate the above we need to parameterize the body of the request and process the result into our data model. Ideally we would like to post-process the data from the response body into the target data model. I think this could also be done with a data import step and a data action, if the data import step can be parameterized to retrieve only the data matching a parameterized filter.

      Edward Zhang (Blog Post Author)

      Hi Gerrit Posthumus ,

      Thank you for reaching out.

      Which URL doesn't work? Do you mean the URL to be set when you save the connection?

      The feature was released with the minimal requirements, and parameterization is one of the most important backlog items to deliver in our plan.

      Regarding the detailed requirement to 'post process the data from the response body': I would kindly note that if the data volume is very large, the feature is not ideal for that case, because its purpose is to trigger a series of remote activities. If such remote data needs to be imported into a SAC model, we normally suggest using the import step in multi actions.

      So the proposed workflow for your case 2 would be:

      1. Be triggered by multi actions to start reading data from AWS SageMaker.
      2. The data is stored in the data model remotely.
      3. Import the data via the import step in multi actions; this step is assumed to be triggered automatically when steps 1 and 2 are done.
      4. etc.
      Jonas Neurath

      Hi Edward,

      Thank you very much for the blog post. We have a related issue, and since you stated that some functions will be on the roadmap in the near future, I want to ask for some updates.

      Q1: Is it still not possible to pass data from a SAC model to the API? Maybe by using the new parameter function from QRC Q3? Or is there another possibility to pass data from a model (live or import) to an API in the meantime?

      Q2: Do you know any way to pre-process the data coming back from the API in the response body and integrate it into either a story or a model? It would be enough for us to just display the responses.

      For our current project it would be very helpful to get some answers on these questions.

      Thanks in advance!

      Best, Jonas

      Ilaria Martiradonna

      Hi Edward Zhang ,

      Let's say I have an API that retrieves description fields, normally translated, next to standard ID fields. The texts are read from a companion entity translated according to the input language.

      I create a connection to SAC as you described and a model in SAC. Will I see the text fields based on my SAC login language? That is, will the descriptions be translated?





      Edward Zhang (Blog Post Author)

      Hi Ilaria Martiradonna ,

      The API step only reads the given information and shows it as is. Therefore, the description will be shown unchanged; no translation is applied.



      Airat Kugashev

      Good day Edward.


      Could you please tell me whether the multi action API step fully supports the OData V4 protocol? It's not clear from the article.


      BR, Airat.

      Edward Zhang (Blog Post Author)

      Hi Airat,

      It supports the asynchronous part of the OData V4 protocol.



      Blirona Keraj

      Hi Edward Zhang,

      Thank you for this informative blog post. I have a question related to the response body.

      I am currently using the API connection to call an endpoint, which triggers an action that returns relevant results in the response body. At the moment I can successfully ping the endpoint, but I only get status 200 back. How can I also get the response body content?

      I tried the parameter field mapping, but the response body of the API I use has only one key-value pair, not jobId or status. I am new to SAC, so I am wondering whether there is any option I am missing to get the results and store them in SAC, rather than only the status 200.

      Thank you in advance,