Shankar Narayanan SGS

Integrating Azure Storage Services in a Big Data Workflow with SAP BODS

Azure Storage Services are among the oldest and most widely used cloud-based storage solutions. They can store ad-hoc flat files ranging from a few bytes up to petabytes, which qualifies them as "Big Data" services. In this blog, we will explore how enterprises can use these services as file stores and integrate them into their SAP BODS workflows.

Mounting file storage in Windows Explorer

The Azure File Store can be mounted onto the system so that BODS can read files from it and write output to it. In the Azure File Store, click New File Share to create a new file share, or select an existing one.
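
As a side note, the file share can also be created from the command line instead of the portal. The sketch below uses the Azure CLI; the share name "bodsfiles" is just an example, and the storage account name and key are placeholders you would replace with your own values.

    # Create a file share in an existing storage account (example share name: bodsfiles)
    az storage share create \
        --name bodsfiles \
        --account-name <storage-account-name> \
        --account-key <storage-account-key>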

[Screenshot: Integrating Azure Storage Services in a Big Data Workflow with SAP BODS - 1]

You can connect the file share as a network drive using PowerShell or the regular Windows command prompt. If you are running Linux, you can use CIFS to mount the drive. Click the Connect button to see the connection commands.
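
For reference, the connection commands generated by the Connect button look roughly like the ones below. This is only a sketch: the drive letter, mount point, storage account name, share name, and account key are placeholders that you would replace with the values shown in your own portal.

    REM Windows: map the file share as a network drive (command prompt or PowerShell)
    net use Z: \\<storage-account>.file.core.windows.net\<share-name> <storage-account-key> /user:AZURE\<storage-account> /persistent:yes

    # Linux: mount the same share over SMB/CIFS (requires the cifs-utils package)
    sudo mount -t cifs //<storage-account>.file.core.windows.net/<share-name> /mnt/<share-name> \
        -o vers=3.0,username=<storage-account>,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino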

The drive letter should be the same on both the BODS client and the server, and the account under which the BODS services run must have access to the mounted drive. Once you mount it, you should see the drive in Windows Explorer.

[Screenshot: Integrating Azure Storage Services in a Big Data Workflow with SAP BODS - 2]

Accessing file store in BODS

The mounted file store can be accessed in BODS through its mount directory. Create a new local File Location that points to the mounted directory.

[Screenshot: Integrating Azure Storage Services in a Big Data Workflow with SAP BODS - 3]

Now create a new Flat File format. Select the File Location created in the previous step and the name of the file inside the file store.

[Screenshot: Integrating Azure Storage Services in a Big Data Workflow with SAP BODS - 4]

You can now use this flat file in your workflow for reading and writing data. Some of the other benefits of this approach are:

  • Since the files are stored directly in the cloud, the storage can scale up in size as needed.
  • It can act as backup storage for staging files, logs, etc.
  • It can be used to consume very large files without wasting local disk space on the server.

Next in this series, we will explore how to use cloud services directly in a BODS workflow.

      11 Comments

      sree tiruchanur

      Good one!!!

      Shankar Narayanan SGS
      Blog Post Author

      Thanks 🙂

      SNEHASISH GHOSH

      Hi Shankar,

      Can we use AWS S3 as a Target? I checked your other blog which suggested Azure can be used as a Target.

      Thanks,

      Snehasish

      Shankar Narayanan SGS
      Blog Post Author

      Yes, you can. BODS can connect to AWS S3.

      Regards

      srujan G

      Hi Shankar,

      Very interesting read; it is very helpful.

      Is it possible to read the data directly from the Azure Blob storage path instead of mounting the path to our local system?

      We are working on this approach, but we are unable to read the data directly from Blob storage. Could you please advise?

      Carlos Cardenas

      Hello Shankar,

      Is it possible to connect an SAP ERP system with Data Services to Azure Data Lake as the destination?

      Regards

      Dirk Venken

      Sure, define a datastore for the SAP source system.

      Also note that you don't need the workaround for accessing Azure described in this blog. DS features a native interface (File Location) to Azure Cloud Storage.

      Carlos Cardenas

      And with this, can we load delta extractions?

      Dirk Venken

      Yes, we can.

      Siddharth Krishna

      How is the performance with big tables like BSEG and BKPF?

      We would extract such tables as split output files and place them in the cloud. I would like to know of any performance measures, if someone has them from experience.

      Srinivas Reddy

      Hi Shankar,

      I need your help.

      I want to put a Customer.csv file in an AWS S3 bucket using a BODS job.

      Details:

      DS 4.2 SP10 and Unix environment.

      I have prepared a BODS job with the help of your blog. The job completed successfully, but the file is not being placed in the S3 bucket.

      When I ran the job in Debug mode, it showed the error "HTTP client error:<35>:<SSL connect error>."

      Can you please help me to resolve this issue?

      Note: Do we need to install any libraries?

      Regards,

      Srinivaas