
Consuming Data From Cloud Storage in SAP BusinessObjects Data Services

In my earlier blog, we discussed Integrating Big Data Workflow with SAP BODS. In this blog, we will explore how to consume Cloud Storage services directly in a BODS workflow.

Cloud Storage is a service provided by the major cloud platforms that can store and handle large numbers of files of considerable size. AWS S3, Azure and Google Cloud provide cloud storage that is typically used for ad-hoc files such as logs, flat files and data dumps. SAP BODS 4.2 SP7 introduced support for these Cloud Storages.


In this blog, we will consume data from AWS S3. The Steps for the other Cloud Services are similar.

Configuring Cloud Storage Services

The Cloud Storage service should be configured so that SAP BODS can connect to it. The configuration steps are described in the guide published by the respective cloud vendor.

To connect to AWS S3, we need an IAM user with access to S3. Once the IAM user is created, an access key and secret key must be generated for it; BODS uses these keys to consume the data from S3.

The access and secret key can be generated from the Users section in IAM. Copy the access and secret key after generation.
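Before configuring BODS, it is worth confirming that the generated key pair can actually read the bucket. Below is a minimal sketch using the AWS SDK for Python (boto3); the key values, region and bucket name are placeholders, not values from this post.

import boto3

# Placeholder credentials, region and bucket name - substitute the values generated above.
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="your-secret-key",
    region_name="us-east-1",
)

# List a few objects to confirm the IAM key pair can read the bucket.
response = s3.list_objects_v2(Bucket="my-bods-source-bucket", MaxKeys=5)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])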


Place the required files in the S3 bucket so that they can be consumed in SAP BODS.
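If you prefer to stage the files from a script instead of the S3 console, a short boto3 sketch like the one below can upload them; the local file name, bucket and key prefix are illustrative only.

import boto3

# Credentials are resolved from the environment or an AWS profile here.
s3 = boto3.client("s3")

# Upload a local flat file so that BODS can pick it up from the bucket.
# "customers.csv", the bucket and the "inbound/" prefix are made-up examples.
s3.upload_file("customers.csv", "my-bods-source-bucket", "inbound/customers.csv")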


Configuring BODS with the Cloud Services

We need to create a File Location in SAP BODS that points to AWS S3. Log in to the Designer and navigate to Formats in the Local Object Library.


In the File Locations context menu, select New to create a new File Location; the Flat File or Excel format that uses it will be created later, depending on your source.


Create the File Location by selecting Amazon S3 Cloud Storage as the protocol. Fill in the security details (access key and secret key) and select the region. Provide the name of the bucket from which the data has to be fetched and configure the other necessary parameters.
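If the File Location fails to connect, one possible cause is selecting a region that does not match the bucket's actual region. As a quick check outside BODS, the following boto3 sketch (bucket name again a placeholder) reports where the bucket lives.

import boto3

s3 = boto3.client("s3")  # credentials resolved from the environment or an AWS profile

# get_bucket_location returns a null LocationConstraint for buckets in us-east-1.
location = s3.get_bucket_location(Bucket="my-bods-source-bucket")
print(location.get("LocationConstraint") or "us-east-1")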


Different configurations can be set for your Dev, Quality and Production environments. Azure and Google Cloud can be configured in a similar manner.

Create a new Flat File or Excel file format, depending on the data source, point it to the File Location created above, and enter the format of the file.


Drag and drop the file format into the Data Flow; you can then use that object to perform transformations and other operations.

Azure and Google Cloud Storage can be configured using the method described above, and BODS can be used to move files between these services, or to combine files from them and process the result.
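Once a job has written its output back to a cloud File Location, you can verify from outside BODS that the objects landed as expected. A minimal boto3 listing sketch, again with a placeholder bucket and an illustrative "outbound/" prefix:

import boto3

s3 = boto3.client("s3")  # credentials from the environment or an AWS profile

# List the job output under the illustrative "outbound/" prefix.
response = s3.list_objects_v2(Bucket="my-bods-target-bucket", Prefix="outbound/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["LastModified"])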

9 Comments
  • Hi Shankar,

    I tried to create a File Location for Amazon S3 in Data Services, but I could not succeed.

    I am not really sure what needs to be passed for File System (remote directory and bucket).

    Can you please guide me on this?

     

      • Hi,

        I have a requirement to connect BODS with AWS. I have successfully uploaded a file into an Amazon S3 bucket, but the requirement is to upload a file into a subfolder under the bucket.

        I checked the S3 product notes and they mention that S3 has a flat structure under the bucket. I found that other applications also have similar issues using subfolders, and they use the "key name" concept to refer to the file object. I am not sure how BODS can handle this.

        https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingMetadata.html

        I have tried to specify the subfolder name using different options, such as in the File Location connection as well as the file properties, but was not successful.

        Would appreciate any input on the same

         

        Thanks

         

        • Hello,

          BODS version: 4.2 SP7

          I am facing the same issue. I tried several options and couldn’t solve it.

          I am trying to upload a file to Amazon S3 through BODS. I keep getting the error – ‘ AWS S3 bucket <awxyz-sdsa> does not exist’. I have access only to subfolders under the S3 bucket. 

          I tried from S3 browser and I am able to connect to the subfolders under S3 bucket.

          Any help is greatly appreciated!

          Thank you.

           

  • Hi Shankar,

     

    Is it possible to read files from Azure Blob storage? If yes, could you please provide the steps?

     

    I have created a File Location for Blob storage, but I am not sure how to read the file from that location.

  • Hi Shankar,

    Have you been able to export such an AWS S3 Cloud Storage from Data Services?

    When I do an export with an export password, I get empty nodes/items in the XML/ATL file.

    Importing it into another repo destroys the storage definition.

    In the ATL / XML file it looks like this:

    <fl_s3_accesskey></fl_s3_accesskey>

    <fl_s3_secretkey></fl_s3_secretkey>

     

    I also cannot find a way to set fl_s3_accesskey / fl_s3_secretkey with an executable from Data Services.

    So automating the execution of a job will fail and I have to create such Stores manually.

     

    A bit of background:
    I export the job as an XML file and integrate it into an overall ETL process codebase.

    I create a Jenkins pipeline which takes the XML and imports it into a repo.

    The pipeline imports the system configuration, datastores, flat files, etc.

    The pipeline executes the job after import.

    The pipeline also includes other deployment items which are related to this DS job.

     

    Everything works fine except the AWS S3 storages. I cannot create an XML/ATL export and reimport it (including the password) without destroying it. Creating those stores manually is not a very generic approach. I would like to automate this.

     

    Would be great if you have a hint (something like al_engine.exe –setSecretOfAWS_accesskey=theKey).

     

    Thanks,

    Mansur