Technical Articles
Pavan G

SAP Cloud Integration with Azure Blob, Azure Storage Account and Azure Container


How to read, write and delete an Azure Blob, Azure Storage Account and Azure Container using REST APIs via SAP CPI.


There are multiple reasons for integration consultants to use REST APIs to connect with Azure or any other third-party application:

  • A great deal of flexibility and accessibility
  • Secure connectivity options
  • Handling multiple types of calls and different data formats
  • Less custom code to connect applications, and simpler debugging when errors occur

REST is an architectural style that allows you to interact with a service over an internet protocol such as HTTP/HTTPS. It is platform- and software-independent, whether running on the server or the client.

In this blog post I will discuss creating an Azure Storage Account and provide the API details for Azure Blob and Azure Container.

The Azure Storage Account REST API provides HTTP operations such as PUT, GET and DELETE to write, read and delete an Azure Storage Account respectively.

Each REST API call has three parts: authorization, request and response.

Let’s discuss in detail how to authorize an Azure REST API request, get a response, and implement the same in SAP CPI.


Authentication has two parts:

  • Creating an Azure Service Principal and assigning a role to the application
  • An OAuth2 token request to get the access token

Creating an Azure Service Principal and assigning a role to the application:

We will have to create an Azure Service Principal so that it can be used for authentication when a request to read, write or delete a storage account is submitted.

To access Azure resources, Azure provides the concept of a service principal identity, which can be created for use with applications and automated tools. A service principal is assigned one or more roles to provide access to resources in a controlled manner. It is recommended to use service principals with applications or other tools to access Azure resources rather than allowing them to use a user identity.

These service principals are non-interactive Azure accounts. Like any other user, their permissions are managed with Azure Active Directory.

To create a service principal for your application:

  • Sign in to your Azure account through the Azure portal.
  • Select Azure Active Directory –> App registrations –> + New application registration.
  • Provide a name and URL for the application. Select Web app / API as the type of application. After setting the values, select Create.
  • Select the subscription to assign the application to.
  • Now click on “Access control (IAM)” –> “Role assignments” –> + Add –> Select “Add Role Assignment”.
  • Select the “Contributor” and “Storage Blob Data Owner” roles to allow the application to execute actions like reboot, start and stop instances, and to access the Azure Storage services respectively.
  • Select Save to finish assigning the roles. You will see your application in the list of users with roles for that scope. The Service Principal is now set up.

OAuth2 token request to get the access token:


We will use OAuth 2.0 Client Credentials Grant Flow which permits a web service (confidential client) to use its own credentials (service principal) instead of impersonating a user, to authenticate when calling another web service.


  • The client application authenticates to the Azure AD token issuance endpoint and requests an access token.
  • The Azure AD token issuance endpoint issues the access token.
  • The access token is used to authenticate to the secured resource.
  • Data from the secured resource is returned to the client application.
  • To use the Service Principal, we need the Client ID and an authentication key:
    • Select our application, copy the Application ID (Client ID) and store it.
    • To generate an authentication key, select Settings –> Keys.
    • Provide a description and a duration for the key. When done, select Save. The key value will be used as the Client Secret.
  • The OAuth2 token request URL to get the access_token is https://login.microsoftonline.com/{{directoryId}}/oauth2/token, where the directory (tenant) ID can be found in the Azure Active Directory application we created.
  • For the OAuth2 token request URL we must pass the below parameters in the body of the request to read the token:
    • grant_type = client_credentials
    • resource = https://management.azure.com/ (the Azure Resource Manager endpoint)
    • client_id = the Application (Client) ID copied above
    • client_secret = the key value generated above
  • The token request can be verified with a REST client such as Postman before building the iFlow.
  • Once the service principal is authenticated, the access_token can be used in requests to access the resource.
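Outside CPI, the same token request can be sketched in a few lines of Python (a testing aid, not part of the iFlow; the tenant ID, client ID and secret below are placeholders, and the resource default assumes the Azure Resource Manager endpoint):

```python
from urllib.parse import urlencode

def build_token_request(directory_id: str, client_id: str, client_secret: str,
                        resource: str = "https://management.azure.com/"):
    """Build the Azure AD OAuth2 client-credentials token request."""
    url = f"https://login.microsoftonline.com/{directory_id}/oauth2/token"
    # The body must be POSTed as application/x-www-form-urlencoded.
    body = urlencode({
        "grant_type": "client_credentials",
        "resource": resource,
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return url, body

url, body = build_token_request("my-tenant-id", "my-client-id", "my-secret")
```

POSTing this body to the returned URL yields a JSON document whose access_token field is the Bearer token used in the calls that follow.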

Request API format and required parameters:


To create a new Azure Storage Account, we must use a PUT HTTP request. Likewise, we use GET and DELETE HTTP requests to read and delete the details of the storage account.

To build the API request, the request headers, URI parameters and request body are required.

Request headers:

  • Content-Type: Required. Set to application/json.
  • Authorization: Required. Set to a valid Bearer access token.

 URI parameters:

  • subscriptionId: The subscription ID that identifies an Azure subscription.
  • resourceGroupName: The name of the resource group that contains the resource.
  • accountName: The name of the storage account. Following storage account naming best practices is recommended.
  • api-version: The API version to use for the request. This blog post covers api-version 2018-02-01, which is passed as a query parameter of the request URL.
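Putting these URI parameters together, the request URL can be assembled as in the sketch below (the host is the standard Azure Resource Manager endpoint; the parameter values are placeholders):

```python
def storage_account_url(subscription_id: str, resource_group: str,
                        account_name: str, api_version: str = "2018-02-01") -> str:
    """Assemble the storage account request URL from its URI parameters."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Storage"
        f"/storageAccounts/{account_name}"
        f"?api-version={api_version}"
    )

url = storage_account_url("my-subscription-id", "my-resource-group", "mystorageacct")
```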

Request body:

The below JSON properties are required in the request body. A sample request body is shown below.

  • location (string): The resource location where you would like to create the storage account.
  • kind (Kind): Specifies which type of storage account to create. The general-purpose StorageV2 kind is recommended and used in this blog post.
  • sku (Sku): Defines the capabilities of the storage account, such as the redundancy strategy and encryption. This blog post uses Geo-Redundant storage (Standard_GRS).



{
  "sku": {
    "name": "Standard_GRS"
  },
  "kind": "StorageV2",
  "location": "eastus2"
}


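The sample body above can also be generated programmatically, which helps when the location or SKU varies per environment (a sketch; the defaults mirror the values used in this post):

```python
import json

def storage_account_body(location: str, sku_name: str = "Standard_GRS",
                         kind: str = "StorageV2") -> str:
    """Serialize the minimal JSON body for the storage account PUT request."""
    return json.dumps({
        "sku": {"name": sku_name},
        "kind": kind,
        "location": location,
    })
```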

Handling the response:

Successful requests to create a new account return a 201 or 202 status code with an empty response body. The storage account is created asynchronously. If the account already exists or is being provisioned, the response has a 200 status code with the configuration of the existing storage account in the response body.
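A sketch of how a caller might interpret these status codes after the call returns (the mapping follows the behaviour described above):

```python
def interpret_create_response(status_code: int) -> str:
    """Map the storage-account PUT status code to an outcome."""
    if status_code in (201, 202):
        return "created"          # creation accepted; provisioning is asynchronous
    if status_code == 200:
        return "already-exists"   # existing account's configuration is returned
    return "error"                # anything else needs investigation
```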

Implementation in CPI:

We will use the API details discussed above and create a custom iFlow in SAP CPI to create a new storage account in Azure.

Steps to be followed in creating the iFlow:

Step 1: Create a Package with Name: Azure Storage Account

Step 2: Create an IFlow with Name: Create Storage Account

Step 3: Create an Integration flow with the following components

  • Start Timer
    • Configure the timer to run once.
  • Set Header (Content Modifier)

In the content modifier create the below properties:

  • Header:
    • Content-Type: application/x-www-form-urlencoded
  • Message Body:
    • grant_type=client_credentials&resource=https://management.azure.com/&client_id={{Client ID}}&client_secret={{Client Secret}}
  • Get Token (Request-Reply)
    • Use the HTTP adapter and make a POST call to the OAuth2 token URL https://login.microsoftonline.com/{{DirectoryID}}/oauth2/token (the token endpoint accepts only POST).
  • Azure AD (Receiver Participant)
  • JSON to XML Converter
    • Default Configurations
  • Read Token from XML body (Content Modifier)

In the content modifier create the below property:

  • Exchange Property:
    • token: java.lang.String: //access_token
  • Set Auth (Content Modifier)

In the content modifier create the below properties:

  • Header:
    • Content-Type: application/json
    • Authorization: java.lang.String: Bearer ${property.token}
  • Message Body:


{
  "sku": {
    "name": "Standard_GRS"
  },
  "kind": "StorageV2",
  "location": "australiaeast"
}


  • Create Storage Account (Request-Reply)
    • Use the HTTP adapter and make a PUT call to the create storage account API URL without any authentication configured on the adapter (the Authorization header set above carries the token).
    • API: https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}?api-version=2018-02-01
  • Azure (Receiver Participant)
  • Response Body (Content Modifier)
    • Message Body: ${in.body}

Step 4: Once the configuration is done and the iFlow is deployed successfully, the Azure Storage Account will be created.

As there is no provision to pass the resource parameter in the CPI OAuth2 credential artifact, we make an HTTP call via a Request-Reply step to read the token first, and then pass the token to the next HTTP call, which creates the storage account.
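The token-extraction step above (JSON-to-XML converter followed by the XPath //access_token) can be mimicked outside CPI as follows; the root element name is illustrative, since CPI derives it from the converter configuration:

```python
import xml.etree.ElementTree as ET

# Token response as it might look after the JSON-to-XML converter
# (the root element name here is illustrative).
xml_body = """
<root>
  <token_type>Bearer</token_type>
  <access_token>eyJ0eXAiOiJKV1Qi...</access_token>
</root>
"""

# Equivalent of the XPath //access_token used in the content modifier.
token = ET.fromstring(xml_body).findtext(".//access_token")
auth_header = f"Bearer {token}"
```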


This is one of the ways to consume Azure REST APIs. With the same approach we can extend the API URL to read or delete a storage account. Similarly, we can consume the Azure REST APIs for containers and blobs as well.

The APIs for Azure Blob and Azure Container (Read Blob, Read Container, Write Container and Delete Container) can be found in the official Azure Storage REST API documentation.



Thank you for reading this blog post.

Please feel free to share your feedback or thoughts or ask questions in the Q&A tag below.



      Sivachandran Deivasigamani

      Hi Pavan,

      Thanks for the blog. We have a similar kind of requirement: we need to place a CSV file from SCPI into Azure Blob Storage. Could you please help with how I can adapt this to my requirement?

      Pavan G (Blog Post Author)

      Hi Sivachandran Deivasigamani ,

      You can try the same iFlow mentioned in the blog and send your CSV file in the body. Try this and let me know here if you face any issues.



      Sivachandran Deivasigamani

      Hi Pavan G,


      1. While uploading the Excel file(.xlsx) into blob storage, the file is getting corrupted.
      2. With CSV file, special characters are replaced by "?"

      Have you faced this kind of issue?




      Pavan G (Blog Post Author)

      Hi Sivachandran Deivasigamani ,

      I didn't face any issues with a CSV file while placing it into a blob. Which special characters are coming in the CSV file? Let me know and I'll check.

      May I know the idea behind storing the Excel file in blob storage? Blob storage is usually used to store unstructured raw data. You can store a CSV file though.

      If you want a tabular structure, I would suggest using Azure Table storage. You can also use the Azure Files service.


      Pavan G


      kumar vibhash

      Hi Pavan,

      Thanks for the great blog. We have a scenario wherein we wanted to push data from SAP on-prem system to azure blob storage. We installed "abap sdk on azure", and have written ABAP report to call blob api and using sdk we are pushing to blob. For error handling we store the log in ztable and planning to push the same to azure blob to have a power BI dashboard which will show the log analytics etc. Do you think it's a sustainable approach? Any experience doing it this way.

      Anurag Mathur

      Hi Pawan,

      Thanks for the blog.

      I have a similar request from my client.

      I have developed an integration which extracts demographic information - the output is a payload in CSV format - which my client wants to place in their Azure BLOB storage.

      But they want the csv file to be stored as "MyData.csv" in their BLOB storage.

      I don't see any place to specify a filename in the HTTP adapter. How can I store the payload from my integration as a specific filename?

      Also, they've given me the following Azure credentials.


      Storage Account Name


      Do I need to create a User Credential or an OAuth2 Client Credential to connect to their Azure Blob?


      Any and all help will be deeply appreciated.



      Rajesh PS

      Pavan G

      Azure Blob natively supports SFTP, REST and NFS, so why is HTTP used instead of a cloud connector, which would be more precise and secure?

      subhashini Ponnambalam

      Hi Pavan,


      I'm unable to get the token from Azure. Should we authenticate via certificate?

      Getting error , The endpoint only accepts POST, OPTIONS requests. Received a GET request.","strTraceId":"","iErrorCode":900561,"iHttpErrorCode":400"


      How did you fix this issue?



      Pavan G (Blog Post Author)

      Hi Subha,

      As per the error message mentioned by you, the end point only accepts the POST method.

      Please make an API call with the POST method instead of GET.