Azure Storage Services are among the oldest and most widely used cloud-based storage solutions. They are used for storing ad-hoc flat files ranging from a few bytes up to petabytes, which classifies them as “Big Data” services. In this blog, we will explore how enterprises can use these services as file stores and integrate them into their SAP BODS workflows.
Mounting file storage in Windows Explorer
The Azure File Store can be mounted on the system that BODS reads files from and writes output to. In the Azure File Store, click New File Share to create a new file share, or click on an existing one.
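If you prefer the command line to the portal, the file share can also be created with the Azure CLI. The sketch below assumes the CLI is installed and logged in, and uses placeholder storage account, key, and share names:

    # Create a file share named "bodsfiles" in an existing storage account (placeholder names)
    az storage share create --account-name mystorageaccount --account-key <storage-account-key> --name bodsfiles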
You can connect the file share as a network drive using PowerShell or the regular Windows command prompt. If you are running Linux, you can mount the drive using CIFS. Click the Connect button to see the connection commands.
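For reference, the connection commands generated by the portal look roughly like the ones below. The storage account name, share name, drive letter, and mount point are placeholders to be replaced with your own values; the account key comes from the storage account's Access keys blade.

    :: Windows (command prompt): map the file share to drive Z: (placeholder names)
    net use Z: \\mystorageaccount.file.core.windows.net\bodsfiles <storage-account-key> /user:AZURE\mystorageaccount

    # Linux: mount the same file share over CIFS (placeholder names)
    sudo mount -t cifs //mystorageaccount.file.core.windows.net/bodsfiles /mnt/bodsfiles -o vers=3.0,username=mystorageaccount,password=<storage-account-key>,dir_mode=0777,file_mode=0777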
The drive letter should be the same on both the BODS client and the server, and the account running the BODS services should have access to the mounted drive. Once mounted, the drive is visible in Windows Explorer.
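Because drives mapped in an interactive session are not automatically visible to services, one common approach is to store the share credentials in the Windows Credential Manager for the account that runs the BODS job server. The command below is only a sketch using the same placeholder names as above, run under that service account:

    :: Persist the Azure Files credentials so the share can be reached outside the interactive session
    cmdkey /add:mystorageaccount.file.core.windows.net /user:AZURE\mystorageaccount /pass:<storage-account-key>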
Accessing file store in BODS
The mounted file store can be accessed in BODS through its mount directory. Create a new local File Location pointing to the mounted directory.
Now create a new Flat File format. Select the File Location created in the step above and the file name inside the file store.
You can now use this Flat File in your workflow for reading and writing data. Some of the other benefits of this approach are:
- Since the files are stored directly in the cloud, the storage can scale up in size as needed.
- It can act as backup storage for staging files, logs, etc.
- Large files can be consumed without wasting space locally on the server.
Next in this series, we will explore how to use cloud services directly in a BODS workflow.
Good one!!!
Thanks 🙂
Hi Shankar,
Can we use AWS S3 as a target? I checked your other blog, which suggested Azure can be used as a target.
Thanks,
Snehasish
Yes, you can. BODS can connect to AWS S3.
Regards
Hi Shankar,
Very interesting read, it is very helpful.
Is it possible to read the data directly from the Azure Blob storage path instead of mounting the path to our local system?
We are working on this approach but are unable to read the data directly from Blob storage. Could you please advise?
Hello Shankar,
Is it possible to connect an SAP ERP system with Data Services to Azure Data Lake as the destination?
Regards
Sure, define a datastore for the SAP source system.
Also note that you don't need the workaround for accessing Azure described in this blog: DS features a native interface (File Location) to Azure Cloud Storage.
And with this we can load delta extractions?
Yes, we can.
How is the performance with big tables like BSEG, BKPF?
We would extract such tables as split output files and place them on the cloud. I would like to know of any performance measures, if someone has them from experience.
Hi Shankar,
I need your help.
I want to put a Customer.csv file into an AWS S3 bucket using a BODS job.
Details:
DS 4.2 SP10 and Unix environment.
I have prepared a BODS job with the help of your blog. The job completed successfully, but the file is not being placed in the S3 bucket.
When I ran the job in Debug mode, it showed the error “HTTP client error:<35>:<SSL connect error>”.
Can you please help me resolve this issue?
Note: Do we need to install any libraries?
Regards,
Srinivaas