
In an earlier blog, we discussed Integrating Big Data Workflow with SAP BODS. In this blog, we will explore how to use cloud storage services directly in a BODS workflow.

Cloud Storages are services provided by the major cloud platforms that can store and handle large numbers of files of considerable size. AWS S3, Azure, and Google Cloud all offer cloud storage that is commonly used for ad-hoc files such as logs, flat files, and data dumps. SAP BODS 4.2 SP7 introduced support for these cloud storage services.

Consuming Data from Cloud Storages in SAP Business Objects Data Services

In this blog, we will consume data from AWS S3. The steps for the other cloud services are similar.

Configuring Cloud Storage Services

The cloud storage service must be configured so that SAP BODS can connect to it. Follow the configuration guide published by the cloud vendor.

To connect to AWS S3, we need an IAM user with access to the bucket. Once the IAM user is set up, an access key and secret key must be generated for that user; BODS uses these credentials to consume data from S3.

The access key and secret key can be generated from the Users section of the IAM console. Copy both keys after generation, as the secret key is displayed only once.
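If you prefer the command line over the console, the same setup can be sketched with the AWS CLI. This is a minimal, hedged example: the user name `bods-s3-user` is an assumption, not something from this post, and the read-only managed policy is just one reasonable choice.

```shell
# Create a dedicated IAM user for BODS (the name is an example)
aws iam create-user --user-name bods-s3-user

# Grant read access to S3 via the AWS-managed read-only policy
aws iam attach-user-policy \
    --user-name bods-s3-user \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

# Generate the access key / secret key pair that BODS will use;
# the SecretAccessKey in the response is shown only this once
aws iam create-access-key --user-name bods-s3-user
```
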


Place the required files in the S3 bucket so that they can be consumed in SAP BODS.
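Files can be placed in the bucket through the S3 console or with the AWS CLI. A minimal sketch, assuming placeholder bucket and file names:

```shell
# Create the bucket if it does not exist yet (bucket names are globally unique)
aws s3 mb s3://my-bods-bucket

# Upload a flat file that BODS will later consume
aws s3 cp sales_data.csv s3://my-bods-bucket/

# Verify the file is in place
aws s3 ls s3://my-bods-bucket/
```
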


Configuring BODS with the Cloud Services

We need to create a File Location in SAP BODS that points to AWS S3. Log in to the Designer and navigate to Formats in the Local Object Library.


In the File Locations context menu, select New to create a new File Location.


Create the File Location by selecting Amazon S3 Cloud Storage as the protocol. Fill in the security details (access key and secret key), select the region, provide the name of the bucket from which the data is to be fetched, and configure the other necessary parameters.
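The resulting File Location corresponds roughly to the following settings. This is an illustrative sketch only: the values are placeholders, and the exact field labels in the Designer dialog may differ by BODS version.

```
Protocol:          Amazon S3 Cloud Storage
Access Key:        <IAM access key>
Secret Key:        <IAM secret key>
Region:            us-east-1          (example)
Bucket:            my-bods-bucket     (example)
Remote Directory:  /                  (optional key prefix inside the bucket)
```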


Different configurations can be maintained for your Dev, Quality, and Production environments. Azure and Google Cloud can be configured in a similar manner.

Create a new Flat File or Excel file format, depending on the data source, and enter the format of the file.


Drag and drop the file format into a Data Flow; the object can then be used as a source for transformations and other operations.

Azure and Google Cloud storage can be configured using the same method, and BODS can be used to move files between these services, or to combine files from all of them and process them together.
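Outside of BODS, the same cross-cloud file movement can be sanity-checked with each vendor's own CLI. A hedged sketch, with placeholder bucket, account, and container names:

```shell
# AWS S3: download a file locally
aws s3 cp s3://my-bods-bucket/sales_data.csv .

# Azure Blob Storage: upload the same file with AzCopy
# (requires azcopy login or a SAS token appended to the URL)
azcopy copy sales_data.csv \
    "https://myaccount.blob.core.windows.net/mycontainer/sales_data.csv"

# Google Cloud Storage: upload the file with gsutil
gsutil cp sales_data.csv gs://my-gcs-bucket/
```
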


4 Comments


  1. Former Member

     Hi Shankar,

     I tried to create a File Location for Amazon S3 in Data Services, but I could not succeed.

     I am not really sure what needs to be passed for the File System fields (remote directory and bucket).

     Can you please guide me on this?
      1. V S Srirangarajan

        Hi,

        I have a requirement to connect BODS with AWS. I have successfully uploaded a file into an Amazon S3 bucket, but the requirement is to upload the file into a subfolder under the bucket.

        I checked the S3 product notes, which mention that S3 has a flat structure under a bucket. I found that other applications have similar issues with subfolders, and they use the "key name" concept to refer to file objects. I am not sure how BODS can handle this:

        https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingMetadata.html

        I have tried to specify the subfolder name using different options, in the File Location connection as well as in the file properties, but without success.

        Would appreciate any input on this.

        Thanks
