Cloud Integration with Commerce Azure Blob Storage using REST API – Part 1
Introduction:
In this blog post, I will explain how to use the REST API to transfer a file to Azure Blob Storage.
A hot folder is a file-based, asynchronous integration mechanism for importing data into Commerce. Traditionally, SFTP, FTP, or NFS drives are used to push files to the hot folder. SAP Commerce Cloud, however, uses Cloud Hot Folders backed by Microsoft Azure Blob Storage, which removes the need for local or shared directories.
Azure Blob Storage organizes data in three levels of resources:
- Storage Account: a globally unique account name
- Container: a placeholder (grouping) for blobs
- Blob: the file itself
Default container name: hybris
Default hot folder path: master/hotfolder
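The three levels map directly onto the blob endpoint URL. A minimal sketch (the account name and file name here are illustrative):

```python
def blob_url(account: str, container: str, blob: str) -> str:
    """Build the blob endpoint URL: storage account -> container -> blob."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob}"

# With the Commerce Cloud defaults (container 'hybris', path 'master/hotfolder'):
url = blob_url("myaccount", "hybris", "master/hotfolder/product.csv")
print(url)  # https://myaccount.blob.core.windows.net/hybris/master/hotfolder/product.csv
```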
Reference Link: Cloud Hot Folder
Scenario:
Transfer a CSV file from an FTP server to Azure Blob Storage using SAP Cloud Integration. A block blob (one of the blob types) will be used to transfer the file to Blob Storage.
Prerequisite Setup:
- Set up an Integration suite trial. Help Link: Setup.
- Azure Storage Explorer. Help Link: Azure Storage Explorer.
Design Solution in Cloud Integration:
Integration Flow
Step 1:
Configure the FTP sender channel to pick up the CSV file from the FTP server.
FTP Sender Channel
Step 2:
Use a Content Modifier to set the properties and the header for the API version, as shown below.
Set Version Header
The version header is defined as an externalized parameter so that it can be switched to a newer API version in the future.
Set Properties
'accountKeyAlias' refers to the secure parameter that contains the account key.
Secure Parameter
Step 3:
Use a Groovy script to set up the remaining headers for calling the REST API with a Shared Key.
The format of the authorization header is shown below:
Authorization="SharedKey <AccountName>:<Signature>"
The signature is a Hash-based Message Authentication Code (HMAC) constructed from the request, computed with the HMAC-SHA256 algorithm, and then Base64-encoded.
Reference Link: Authorize with Shared Key
import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.securestore.SecureStoreService
import com.sap.it.api.securestore.exception.SecureStoreException

import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec
import java.security.InvalidKeyException
import java.text.SimpleDateFormat

def Message processData(Message message) {
    // Current time in RFC 1123 format (GMT), as required by the x-ms-date header
    def sdf = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss 'GMT'", Locale.US)
    sdf.setTimeZone(TimeZone.getTimeZone('GMT'))
    def now = sdf.format(new Date())
    message.setHeader("x-ms-date", now)

    // Content-Length must be the byte count of the transmitted body,
    // which differs from the character count for non-ASCII payloads
    def body = message.getBody(String)
    def contentLength = body.getBytes('UTF-8').length
    String contentType = 'text/csv; charset=UTF-8'

    // File name, container, folder path and account from headers/properties
    String filename = message.getHeaders().get("CamelFileNameOnly")
    String container = message.getProperties().get("container")
    String folderPath = message.getProperties().get("folderPath")
    String blobname = folderPath + '/' + filename
    String account = message.getProperties().get("accountName")

    // Canonicalized resource: /<account>/<container>/<blob>
    String canonicalizedResource = '/' + account + '/' + container + '/' + blobname

    // Blob type header (block blob)
    String blobType = 'BlockBlob'
    message.setHeader("x-ms-blob-type", blobType)

    // Request method and service version
    String verb = 'PUT'
    String version = message.getHeaders().get("x-ms-version")

    // String-to-sign: VERB, eleven standard header fields (only Content-Length
    // and Content-Type are used here), canonicalized x-ms-* headers, resource
    String StringToSign = verb + '\n' +
            '\n' +                 // Content-Encoding
            '\n' +                 // Content-Language
            contentLength + '\n' + // Content-Length
            '\n' +                 // Content-MD5
            contentType + '\n' +   // Content-Type
            '\n' +                 // Date (x-ms-date is used instead)
            '\n' +                 // If-Modified-Since
            '\n' +                 // If-Match
            '\n' +                 // If-None-Match
            '\n' +                 // If-Unmodified-Since
            '\n' +                 // Range
            'x-ms-blob-type:' + blobType + '\n' +
            'x-ms-date:' + now + '\n' +
            'x-ms-version:' + version + '\n' +
            canonicalizedResource

    // Account key from the secure parameter, Base64-decoded before signing
    String accountKeyAlias = message.getProperties().get("accountKeyAlias")
    def accountKey = getAccountKey(accountKeyAlias)
    def decodedKey = accountKey.decodeBase64()
    String hash = hmac_sha256(decodedKey, StringToSign)

    // Authorization: SharedKey <AccountName>:<Signature>
    message.setHeader("Authorization", 'SharedKey ' + account + ':' + hash)
    message.setHeader("Content-Type", contentType)
    message.setHeader("Content-Length", contentLength)

    message.setBody(body)
    return message
}

String getAccountKey(String accountKeyAlias) {
    def secureStoreService = ITApiFactory.getService(SecureStoreService.class, null)
    try {
        def secureParameter = secureStoreService.getUserCredential(accountKeyAlias)
        return secureParameter.getPassword().toString()
    } catch (Exception e) {
        throw new SecureStoreException("Secure Parameter not available")
    }
}

String hmac_sha256(byte[] secretKey, String data) {
    try {
        Mac sha256_HMAC = Mac.getInstance("HmacSHA256")
        sha256_HMAC.init(new SecretKeySpec(secretKey, "HmacSHA256"))
        byte[] digest = sha256_HMAC.doFinal(data.getBytes("UTF-8"))
        return digest.encodeBase64().toString()
    } catch (InvalidKeyException e) {
        throw new RuntimeException("Invalid key exception while computing HMAC-SHA256", e)
    }
}
The above script can be adapted according to your requirements.
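To sanity-check the signing logic locally, the same computation can be reproduced in a short Python sketch. The account name and key below are made up; only the mechanics (Base64-decode the key, HMAC-SHA256 the string-to-sign, Base64-encode the digest) mirror the Groovy script:

```python
import base64
import hashlib
import hmac

def shared_key_signature(account_key_b64: str, string_to_sign: str) -> str:
    """Decode the account key, HMAC-SHA256 the string-to-sign, Base64 the digest."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Illustrative values only -- not a real account or key.
fake_key = base64.b64encode(b"not-a-real-storage-account-key").decode("ascii")
sig = shared_key_signature(fake_key, "PUT\n\n\n15\n\ntext/csv; charset=UTF-8")
authorization = f"SharedKey myaccount:{sig}"
```

Comparing this locally computed signature against the one attached in CPI helps narrow down whether a mismatch comes from the key or from the string-to-sign.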
Step 4:
Configure the HTTP receiver channel as shown below.
HTTP Receiver Channel
Step 5:
In the exception subprocess, use a Groovy script to log the HTTP error response.
Reference Link: Exception Handling in HTTP Receiver
Test Execution:
The image below shows a sample CSV file. Please note that the data used in executing the scenario is test/fake data only.
Sample CSV Input File
After successful execution of the message in Cloud Integration, the file is uploaded to Azure Blob Storage.
Azure Blob Storage Explorer
What happens to the placed file?
- The Cloud Hot Folder moves any file placed in the blob directory to a temporary processing directory.
- It then downloads the file from the processing directory to a standard hot folder in SAP Commerce Cloud.
- The standard hot folder decompresses the file and converts it into ImpEx format using an ImpexConverter for import.
- When the hot folder finishes processing the file, the Cloud Hot Folder moves it from the temporary processing directory to the error/archive directory.
Conclusion:
Since Cloud Hot Folders use Azure Blob Storage, the options to use FTP/SFTP/NFS are no longer available. Azure Blob Services offer several REST operations over HTTP for connectivity; the Put Blob operation is used to transfer files to Blob Storage.
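The request that the Put Blob operation expects can be sketched as below. This is purely illustrative: the account, container, and Authorization values are placeholders, and the real call also carries the x-ms-date header with a GMT timestamp:

```python
def build_put_blob_request(account, container, blob, body: bytes, version, authorization):
    """Assemble the URL and headers for an Azure Put Blob (block blob) call."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    headers = {
        "x-ms-version": version,
        "x-ms-blob-type": "BlockBlob",
        "Content-Type": "text/csv; charset=UTF-8",
        "Content-Length": str(len(body)),  # byte count of the transmitted body
        "Authorization": authorization,    # SharedKey <AccountName>:<Signature>
    }
    return url, headers

url, headers = build_put_blob_request(
    "myaccount", "hybris", "master/hotfolder/product.csv",
    b"code,name\n1,shirt\n", "2020-04-08", "SharedKey myaccount:signature")
```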
Thank you for reading this blog post. Please feel free to share your feedback and thoughts, or ask questions in the Q&A tag below.
To achieve the same functionality using open connectors, check this blog post.
Regards,
Priyanka Chakraborti
Next – Part 2
Interesting, but this Groovy signature is very similar to the one I did for an AWS S3 bucket.
Groovy Signature AWS S3 Bucket
It's important to include references.
Thanks for sharing.
Hi Ricardo,
Thanks for the reference, but I was not aware of this.
Regards,
Priyanka
Hello,
It's just a matter of being polite and kind; this is the way we all grow.
Congratulations.
Very nice explanation, really helpful.
Regards,
Anish
Thank you 🙂
Nice One. 🙂
Thanks 🙂
Hi Priyanka,
Thanks for the blog.
I am using this to get a file from Blob Storage, but it shows 'Authentication failed' and 'Signature does not match with any computed signature'.
Could you please help?
I followed the same process to generate the Signature/Authorization, but with the GET verb.
Thanks & Regards
Saurabh
kumarsaurabh8618@gmail.com
Hi Saurabh,
The signature string will be different for the GET method. You can try the below script; modify the property names as per your iflow configuration. For the GET method, you also have to specify the exact file name. In the below script, the file name is retrieved from the property 'fileName'.
For example, if you want to retrieve a file named 'Sample.csv', set the property fileName to 'Sample.csv'.
HTTP Receiver settings:
HTTP Receiver Adapter
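For reference, the GET string-to-sign differs from the PUT one mainly in that Content-Length and Content-Type stay empty and no x-ms-blob-type header is sent. A hypothetical sketch of the field layout (the resource path and date are placeholders):

```python
def string_to_sign(verb, content_length, content_type, extra_headers,
                   date, version, canonicalized_resource):
    """Lay out the Shared Key string-to-sign; unused fields stay empty."""
    fields = [
        verb,
        "",              # Content-Encoding
        "",              # Content-Language
        content_length,  # empty for GET, byte count for PUT
        "",              # Content-MD5
        content_type,    # empty for GET
        "",              # Date (x-ms-date is sent instead)
        "",              # If-Modified-Since
        "",              # If-Match
        "",              # If-None-Match
        "",              # If-Unmodified-Since
        "",              # Range
    ]
    headers = extra_headers + [f"x-ms-date:{date}", f"x-ms-version:{version}"]
    return "\n".join(fields + headers + [canonicalized_resource])

# GET sends no x-ms-blob-type header and no body:
get_sts = string_to_sign("GET", "", "", [],
                         "Mon, 01 Jan 2024 00:00:00 GMT", "2020-04-08",
                         "/myaccount/hybris/master/hotfolder/Sample.csv")
```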
Regards,
Priyanka
Hi Priyanka,
Thanks for your swift response.
It worked for one dedicated file.
Is it possible to get all the files from the folder path that have been created or modified after the last poll by CPI?
Thanks and Regards
Saurabh
Hi Saurabh,
First, you can use the List Containers REST API to get the list of file names, filter them by the last-modified timestamp, and then use the GET API to retrieve the files one by one.
Link: https://docs.microsoft.com/en-us/rest/api/storageservices/list-containers2
Regards,
Priyanka
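The approach above (list the blobs, filter by last-modified, then GET each file) can be sketched roughly as follows, assuming the XML shape of a blob listing response; the file names and dates are made up:

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def blobs_modified_since(listing_xml: str, since):
    """Return blob names whose Last-Modified is after 'since' (an aware datetime)."""
    names = []
    for blob in ET.fromstring(listing_xml).iter("Blob"):
        modified = parsedate_to_datetime(blob.findtext("Properties/Last-Modified"))
        if modified > since:
            names.append(blob.findtext("Name"))
    return names

from datetime import datetime, timezone
xml = """<EnumerationResults><Blobs>
<Blob><Name>a.csv</Name><Properties><Last-Modified>Mon, 01 Jan 2024 00:00:00 GMT</Last-Modified></Properties></Blob>
<Blob><Name>b.csv</Name><Properties><Last-Modified>Wed, 03 Jan 2024 00:00:00 GMT</Last-Modified></Properties></Blob>
</Blobs></EnumerationResults>"""
since = datetime(2024, 1, 2, tzinfo=timezone.utc)
print(blobs_modified_since(xml, since))  # ['b.csv']
```

Each returned name would then feed one GET call in a loop.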
Hi Priyanka,
The list with blob names is retrieved, but there are multiple blobs. Is it possible to get the files where the blob name spans multiple directories, e.g. myaccount -> container -> blobs -> blob name -> A/B/C/FileName1...N.csv?
Is it possible to list only these files FileName1...N? I tried with a prefix, but it did not work. Then, after filtering based on last modified, I would pass them one by one to the next REST API call to retrieve each file.
Condition: CPI should search for files modified after the last poll from CPI. For this comparison, where should I store the timestamp of the last poll, and how do I pass it in for comparison?
Hi Saurabh,
The scenario sounds similar to one that I designed previously. Please check out the new blog post on this. Link: Part 2
Hope this helps.
Regards,
Priyanka
Thanks a lot,
Really helpful, great blog, simply awesome.
Hello Priyanka,
Thanks for your blog,
I have followed the same steps, but I am getting the below error. Could you please help with the same?
Error
In the monitoring it is showing a 403 status code, but with the same key I am able to open the Blob Storage.
Do I need to check anything on the Azure Blob side?
Regards,
Kumar
Hello Kumar,
I can see the Content-Length is missing. To debug the issue, you can add an attachment for StringToSign (specified in the Groovy script). Both StringToSign and the server-generated signature string should match. Content-Length is a must for the PUT call.
Regards,
Priyanka
Hi Priyanka,
Thanks,
I have added the Content-Type.
I am not aware of how to do this ('add an attachment for StringToSign, then check that StringToSign and the server-generated signature string match'), as I am very new to CPI.
I have copied and pasted your script and made some changes.
I put a debug on it and found this error:
"<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:82095516-401e-006e-7502-fb1478000000 Time:2021-12-27T09:16:38.4731631Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request 'sNsO93QmQ5RwzpSpoIIFYvSP8XMM2xjxYaXEwylaMK8=' is not the same as any computed signature. Server used following string to sign: 'PUT"
Thanks a lot.......
Hi Kumar,
To add attachment, refer to the link: https://help.sap.com/viewer/368c481cd6954bdfa5d0435479fd4eaf/Cloud/en-US/17dba92e6ed4402f8cb0f05093a34269.html?q=attachment
You can add the script lines to the same Groovy script. In that case, use it like below:
messageLog.addAttachmentAsString('Signature', StringToSign, 'text/plain')
This will help you check the signature string generated by the script.
Hi Kumar,
Were you able to fix the issue? Was it something to do with the version?
Thanks,
Poorna
Hi Priyanka,
Thanks for all your quick replies; they really helped me resolve this.
I have added the Signature as a header in the HTTP communication channel, and now the issue is not coming.
I had tried spaces and some other combinations, but no luck.
Please suggest.
Thanks a lot.......
Hi Madhu,
You don't have to add the signature as a header. The Authorization is calculated based on the signature, and the headers that need to be passed are already shown in the screenshots attached in the blog post.
Regards,
Priyanka
Hi Priyanka,
Thanks for the valuable blog. I got a requirement with a similar setup, but with PDF files.
I used the same design to process the PDF file, with the Content-Type as application/pdf.
Below are the signature details:
Signature
The required details are available. With CSV files I receive the expected result, but I'm facing an error for PDFs. Below is the error message we receive while processing PDF files.
<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:77abe719-e01e-0043-32c6-75fc92000000
Time:2022-06-01T14:44:02.7676748Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request 'gruflAzW4bDMxttgMWkZ0qK7oqmX2HQR10noUi2BpQU=' is not the same as any computed signature. Server used following string to sign: 'PUT
51604
application/pdf; charset=ISO-8859-1
x-ms-blob-type:Blockblob
x-ms-date:Wed, 01 Jun 2022 14:44:02 GMT
x-ms-version:2020-04-08
/-------------------------------/invoices/nonprod/invoice_test1.pdf'.</AuthenticationErrorDetail></Error>
Error
Could you please help me resolve this?
Thanking you in advance.
Aavez
Hi Aavez,
Were you able to find a fix for this issue?
Thanks,
Poorna
Hello Priyanka,
Thanks for the blog post, it works perfectly!
Thanks,
René
**EDIT:
I ran into several 403 errors at the Azure endpoint, as the Azure-calculated content-length of the body was sometimes different from the value in the StringToSign variable. Here is why:
The approach above calculates the length like the standard HTTP Content-Length header, based on byte size.
But Azure seems to calculate the string length, which can differ depending on the payload. I fixed this by using body.length() instead.
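Whichever length the server ends up comparing against, the two measures only diverge for non-ASCII payloads, which would explain intermittent 403s. A quick illustration with made-up CSV bodies:

```python
ascii_body = "id,name\n1,cafe\n"
utf8_body = "id,name\n1,café\n"

# For pure ASCII, character count and UTF-8 byte count agree...
print(len(ascii_body), len(ascii_body.encode("utf-8")))  # 15 15
# ...but a single accented character makes the byte count larger.
print(len(utf8_body), len(utf8_body.encode("utf-8")))    # 15 16
```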