Mayura Belur Mohana

Connecting to Azure Blob storage from SAP Cloud Platform Integration using Camel-Azure component

This blog is a continuation of the earlier post https://blogs.sap.com/2020/07/16/apache-camel-community-adapters-usage-in-sap-cloud-platform-integration/, which describes the usage of Apache Camel components as SAP Cloud Platform Integration ADK-compliant integration adapters in integration flows.

In this installment, let us understand how to use the Apache Camel Azure component as an SAP Cloud Platform Integration adapter.

You can find the adapter artefacts in the SAP apibusinesshub-integration-recipes GitHub repository, linked below:

https://github.com/sap/apibusinesshub-integration-recipes/tree/master/Recipes/for/azure-integration-adapter

The Apache Camel Azure component looks for a bean to resolve the Azure cloud account credential details. In this code, that bean lookup is bypassed with a manual credential configuration, which you have to provide in the project source.

Import the project source into the Eclipse IDE using the Maven project import wizard, then edit the createEndpoint method of the BlobServiceComponent.java class and provide the Azure cloud account credential configuration as shown below:
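As a rough sketch (not the verbatim project source), the manual wiring inside createEndpoint could look like the fragment below. The account name and key are placeholders, and StorageCredentialsAccountAndKey comes from the Azure Storage SDK that the camel-azure component depends on:

```java
// Sketch of the manual credential wiring inside BlobServiceComponent.createEndpoint(...)
// "myaccount" and the key value are placeholders — replace them with the account
// name and access key from your Azure Blob Storage account in the Azure Portal.
StorageCredentials credentials =
        new StorageCredentialsAccountAndKey("myaccount", "<your-access-key>");
configuration.setCredentials(credentials);
```

Hard-coding credentials like this is only for demonstration; for anything beyond a test, keep the key out of source control.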

 

You have to configure the account name and account key, which you can obtain from the Azure Blob Storage account in the Azure Portal.

 

After making the changes, build the project using "mvn clean install" to generate the adapter ESA file, which you then deploy to the CPI integration runtime.
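The build step is the standard Maven invocation (assuming Maven is installed and the project was imported as above; the output location follows the typical Maven layout):

```shell
# run from the root of the imported adapter project
mvn clean install
# the generated adapter .esa file should appear under the project's target/ directory
```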

 

Deploy the integration adapter ESA file to the SAP Cloud Platform Integration runtime.

After successful deployment, you can see the adapter under the deployed artefacts.

 

As a next step, you can import the integration flow provided as part of the artefacts, which demonstrates the usage of the Apache Camel Azure component.

 

After importing the integration flow, you can provide the Azure Blob storage container configuration.

An example configuration is shown in the adapter property sheet.
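For orientation, the camel-azure blob endpoint URI generally follows the pattern azure-blob://accountName/containerName/blobName, with the operation passed as a query parameter. A minimal sketch of composing such a URI — the account, container, and blob names here are hypothetical placeholders:

```java
public class AzureBlobEndpointUri {
    // Builds a camel-azure blob endpoint URI of the form
    // azure-blob://<account>/<container>/<blob>?operation=<op>
    static String blobUri(String account, String container, String blob, String operation) {
        return String.format("azure-blob://%s/%s/%s?operation=%s",
                account, container, blob, operation);
    }

    public static void main(String[] args) {
        // Hypothetical values — substitute your own storage account and container names.
        String uri = blobUri("mystorageacct", "cpi-container", "payload.txt", "updateBlockBlob");
        System.out.println(uri);
    }
}
```

The updateBlockBlob operation writes the message body as a block blob, matching the scenario in this blog.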

 

Input provided in the integration flow for storing the message payload into Azure block blob storage:

 

 

After providing all the configuration, save and deploy the integration flow, and verify in CPI monitoring that the integration flow message processing completed successfully.

 

 

You can see the Exchange payload from the CPI iFlow message processing being stored as an Azure block blob.

 

When you view the blob file content, you can see the Exchange payload.

 

Going forward, we will continue to provide more samples of Apache Camel components (https://github.com/apache/camel/tree/master/components) as Cloud Integration ADK-compliant adapters. All the components we provide are based on Apache Camel version 2.17.4.

Disclaimer: This adapter is shipped as a "community contribution" under the Apache 2.0 license; support for this adapter should be requested by raising an issue in the GitHub repository, not through standard SAP support channels.


2 Comments

Kiran Deevela

      Very Useful

Anton Delitsch

      Nice article.

      Do you know how to make the adapter evaluate expressions when it is called? I am trying to set blobname to the value of an exchange property.

       

I added a field for blobName (and removed some other logic), but when setting the value to ${header.something} the adapter fails with the error below.

      java.lang.IllegalArgumentException: Illegal character in path at index 66: https://<storage>.blob.core.windows.net/<path>/${property.SAP_MessageProcessingLogID}