
Cloud Integration with Commerce Azure Blob Storage using REST API

Introduction:

In this blog post, I will explain how to use the REST API to transfer a file to Azure Blob Storage.

The hot folder is a file-based, asynchronous integration mechanism for importing data into Commerce. Traditionally, SFTP, FTP, or NFS drivers are used to push files to the hot folder. SAP Commerce Cloud, however, uses cloud hot folders backed by Microsoft Azure Blob Storage, which removes the need for local or shared directories.

Azure Blob Storage has three levels of resources:

  1. Storage account: a globally unique account name
  2. Container: a grouping of blobs, similar to a directory
  3. Blob: the stored file itself

Default container name: hybris

Default hot folder path: master/hotfolder

Reference Link: Cloud Hot Folder
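Putting the three levels together, a file uploaded to the default cloud hot folder can be addressed by a URL of the following shape (the account name and file name below are placeholders; container and path are the defaults mentioned above):

```text
https://<storage-account>.blob.core.windows.net/<container>/<blob>

e.g. https://mystorageaccount.blob.core.windows.net/hybris/master/hotfolder/product.csv
     account:   mystorageaccount                 (placeholder)
     container: hybris                           (default container)
     blob:      master/hotfolder/product.csv     (default hot folder path + file name)
```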

Scenario:

Transfer a CSV file from an FTP server to Azure Blob Storage using SAP Cloud Integration. A block blob (one of the blob types) will be used to transfer the file to Blob Storage.

Prerequisite Setup:

  1. Set up an SAP Integration Suite trial. Help link: Setup.
  2. Install Azure Storage Explorer. Help link: Azure Storage Explorer.

Design Solution in Cloud Integration:

Integration Flow

Step 1:

Configure the FTP sender channel to pick up the CSV file from the FTP server.


FTP Sender Channel

Step 2:

Use a Content Modifier to set the properties and the version header, as shown below.


Set Version Header

The version header is set as an externalized parameter so that it can easily be changed to a newer version in the future.


Set Properties

‘accountKeyAlias’ refers to the secure parameter that contains the account key.
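As a sketch, the exchange properties referenced by the Groovy script in the next step could be set as follows (the account name and alias values are placeholders; container and folder path are the defaults mentioned earlier):

```text
container       = hybris
folderPath      = master/hotfolder
accountName     = <your storage account name>
accountKeyAlias = <alias of the Secure Parameter holding the account key>
```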


Secure Parameter

Step 3:

Use a Groovy script to set up the remaining headers for calling the REST API with a shared key.

The format for authorization header is shown below:

Authorization="SharedKey <AccountName>:<Signature>"

Signature is a Hash-based Message Authentication Code (HMAC), constructed from the request and computed using the SHA256 algorithm, and then Base64 encoded.

Reference Link: Authorize with Shared Key
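As a standalone illustration of that signing step (shown here in Java; the logic mirrors the Groovy script below): the Base64-encoded account key is decoded, used as the HMAC-SHA256 key over the string-to-sign, and the resulting digest is Base64-encoded. The key and string-to-sign in the usage are placeholders, not real credentials.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch of the Shared Key signature computation: decode the Base64 account
// key, compute HMAC-SHA256 over the string-to-sign, Base64-encode the digest.
public class SharedKeySignature {

    static String sign(String base64AccountKey, String stringToSign) {
        try {
            byte[] key = Base64.getDecoder().decode(base64AccountKey);
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(key, "HmacSHA256"));
            byte[] digest = mac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8));
            return Base64.getEncoder().encodeToString(digest);
        } catch (Exception e) {
            throw new IllegalStateException("Unable to compute HMAC-SHA256 signature", e);
        }
    }

    public static void main(String[] args) {
        // Placeholder key and string-to-sign, for illustration only.
        String fakeKey = Base64.getEncoder()
                .encodeToString("not-a-real-key".getBytes(StandardCharsets.UTF_8));
        String signature = sign(fakeKey, "PUT\n\n\n42\n\ntext/csv; charset=UTF-8");
        System.out.println("Authorization: SharedKey <AccountName>:" + signature);
    }
}
```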

import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.securestore.SecureStoreService
import com.sap.it.api.securestore.UserCredential
import com.sap.it.api.securestore.exception.SecureStoreException
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec
import java.security.InvalidKeyException
def Message processData(Message message) 
{
    //Set Current Time
    // RFC 1123 date in English locale (required for the signature), GMT timezone
    def sdf = new java.text.SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss", Locale.ENGLISH)
    sdf.setTimeZone(TimeZone.getTimeZone("GMT"))
    def now = sdf.format(new Date()) + " GMT"
    //Set header for datetime
    message.setHeader("x-ms-date", now)
    //Get Content Length
    def body = message.getBody(String)
    def contentLength = body.getBytes("UTF-8").length
    String contentType = 'text/csv; charset=UTF-8'
    // Get file name
    String filename = message.getHeaders().get("CamelFileNameOnly")
    //Get container name
    String container = message.getProperties().get("container")
    //Get folder path
    String folderPath = message.getProperties().get("folderPath")
    //Get Blob name
    String blobname = folderPath + '/'+ filename
    //Get Account Name
    String account = message.getProperties().get("accountName")
    // Set canonicalized Resource
    String canonicalizedResource = '/'+ account + '/'+ container +'/' + blobname
    //Set Blob Type
    String blobType = 'BlockBlob'
    // Set header for Blob type
    message.setHeader("x-ms-blob-type", blobType)
    // set verb as requested method
    String verb = 'PUT'
    //Get version
    String version = message.getHeaders().get("x-ms-version")
    //Set Signature String
    String StringToSign = verb +'\n'+'\n'+'\n'+ contentLength +'\n'+ '\n' + contentType +'\n'+'\n'+'\n'+'\n'+'\n'+'\n'+'\n'+'x-ms-blob-type:'+ blobType +'\n'+'x-ms-date:'+ now +'\n' +'x-ms-version:' + version + '\n'+ canonicalizedResource
    //Get Account Key from Secure Parameter
    String accountKeyAlias = message.getProperties().get("accountKeyAlias")
    def accountKey = getAccountKey(accountKeyAlias)
    // Decode Account Key
    def decodedKey = accountKey.decodeBase64()
    //Get Hash Value
    String hash = hmac_sha256(decodedKey, StringToSign)
    //Set Authorization header
    String auth = 'SharedKey'+ ' ' + account + ':' + hash
    message.setHeader("Authorization", auth)
    //Set Content-Type header
    message.setHeader("Content-Type", contentType)
    //Set Content-Length header
    message.setHeader("Content-Length",contentLength)
    //Set message body
    message.setBody(body)
    return message
}

String getAccountKey(String accountKeyAlias)
{
   def secureStorageService =  ITApiFactory.getService(SecureStoreService.class, null)
    try
    {
        def secureParameter = secureStorageService.getUserCredential(accountKeyAlias)
        return secureParameter.getPassword().toString()
    } 
    catch(Exception e)
    {
        throw new SecureStoreException("Secure Parameter not available")
    }
}

String hmac_sha256(byte[] secretKey, String data) 
{
    try 
    {
        Mac sha256_HMAC = Mac.getInstance("HmacSHA256")
        SecretKeySpec secret_key = new SecretKeySpec(secretKey, "HmacSHA256")
        sha256_HMAC.init(secret_key)
        byte[] digest = sha256_HMAC.doFinal(data.getBytes("UTF-8"))
        return digest.encodeBase64()

    } catch (InvalidKeyException e) 
    {
        throw new RuntimeException("Invalid key exception while converting to HMac SHA256")
    }
}

The above code can be adapted to your requirements.
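For reference, the string-to-sign assembled in the script follows the Shared Key format for Put Blob: the HTTP verb, twelve standard-header fields (only Content-Length and Content-Type are populated here), the canonicalized x-ms-* headers in alphabetical order, and finally the canonicalized resource. A sketch of that assembly in Java (all values in the usage are placeholders):

```java
// Sketch of the Shared Key string-to-sign built by the Groovy script above:
// VERB, twelve standard-header fields (only Content-Length and Content-Type
// are populated), the canonicalized x-ms-* headers, then the resource.
public class PutBlobStringToSign {

    static String build(String verb, long contentLength, String contentType,
                        String blobType, String date, String version,
                        String canonicalizedResource) {
        return String.join("\n",
                verb,
                "",                                // Content-Encoding
                "",                                // Content-Language
                String.valueOf(contentLength),     // Content-Length
                "",                                // Content-MD5
                contentType,                       // Content-Type
                "",                                // Date (empty; x-ms-date is used instead)
                "",                                // If-Modified-Since
                "",                                // If-Match
                "",                                // If-None-Match
                "",                                // If-Unmodified-Since
                "",                                // Range
                "x-ms-blob-type:" + blobType,      // canonicalized headers, sorted
                "x-ms-date:" + date,
                "x-ms-version:" + version,
                canonicalizedResource);
    }

    public static void main(String[] args) {
        // Placeholder values for illustration only.
        System.out.println(build("PUT", 42, "text/csv; charset=UTF-8", "BlockBlob",
                "Tue, 01 Jan 2030 00:00:00 GMT", "<x-ms-version>",
                "/myaccount/hybris/master/hotfolder/test.csv"));
    }
}
```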

Step 4:

Configure HTTP receiver channel as below.

HTTP Receiver Channel
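In effect, the receiver channel sends a Put Blob request of the following shape (values are placeholders; the headers are those set in the earlier steps):

```text
PUT https://<storage-account>.blob.core.windows.net/<container>/<folderPath>/<file>.csv
x-ms-date: <current GMT timestamp>
x-ms-version: <externalized version parameter>
x-ms-blob-type: BlockBlob
Content-Type: text/csv; charset=UTF-8
Content-Length: <body length in bytes>
Authorization: SharedKey <AccountName>:<Signature>
```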

Step 5:

In exception sub-process, use Groovy Script to log the HTTP error response. 

Reference Link: Exception Handling in HTTP Receiver

Test Execution:

Below is a sample CSV file. Please note that the data used in executing this scenario is test/fake data only.

Sample CSV Input File

After successful execution of the message in Cloud Integration, the file is uploaded in Azure Blob Storage.


Azure Blob Storage Explorer

What happens to the placed file?

  1. The cloud hot folder moves any file placed in the blob directory to a temporary processing directory.
  2. The cloud hot folder then downloads the file from the processing directory to a standard hot folder in SAP Commerce Cloud.
  3. The standard hot folder decompresses the file and converts it into ImpEx format using an ImpexConverter for import.
  4. When the hot folder finishes processing the file, the cloud hot folder moves it from the temporary processing directory to the error or archive directory.

Conclusion:

Since cloud hot folders use Azure Blob Storage, the FTP/SFTP/NFS options are no longer available. Azure Blob Services offers several REST operations over HTTP for connectivity; the Put Blob operation is used here to transfer files to Blob Storage.

Thank you for reading this blog post. Please feel free to share your feedback or thoughts, or ask questions in the Q&A section below.


Regards,

Priyanka Chakraborti
