Sreekanth Surampally

Exporting data from HANA cloud to Amazon S3 Storage

SAP HANA Cloud offers data import and export options to public cloud storage services such as Amazon S3, Azure Storage, and Alibaba Cloud OSS. In this blog, I will explain how to export data from HANA Cloud objects to Amazon S3 storage. The process is largely the same for the other cloud services as well. For this workshop, I am using a HANA Cloud trial and an AWS Free Tier S3 bucket.

First, I access the Database Explorer, invoke the SQL console, and create a DB table with some data records for the export.
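As a minimal sketch of this step (the table and column names are my own example, not anything prescribed by HANA):

```sql
-- Create a demo column table and load a few rows to export
CREATE COLUMN TABLE SALES_ORDERS (
    ORDER_ID INTEGER PRIMARY KEY,
    CUSTOMER NVARCHAR(50),
    AMOUNT   DECIMAL(15,2)
);

INSERT INTO SALES_ORDERS VALUES (1, 'Customer A', 1500.00);
INSERT INTO SALES_ORDERS VALUES (2, 'Customer B', 2300.50);
```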

In the next step, I set up the Amazon S3 storage service and create a bucket to store the data files. I chose the Canada data center and configured the bucket's public access settings.

Next, I get the trust certificate from the Amazon S3 service. The steps to obtain the certificate are shown in the video below. I have also provided the certificate in this GitHub file, along with the other SQL statements to run for creating the SSL trust, so that you can easily access the code.
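The trust setup follows this general pattern (the PSE name here is arbitrary and the certificate body is a placeholder; use the actual Amazon root CA certificate from the GitHub file):

```sql
-- Create a PSE and register the Amazon S3 trust certificate
CREATE PSE S3_SSL;

CREATE CERTIFICATE FROM '-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----' COMMENT 'Amazon S3 root CA';

-- Find the generated certificate ID and add it to the PSE
SELECT CERTIFICATE_ID FROM CERTIFICATES WHERE COMMENT = 'Amazon S3 root CA';
ALTER PSE S3_SSL ADD CERTIFICATE <certificate_id>;

-- Mark the PSE as the one used for outgoing remote-source connections
SET PSE S3_SSL PURPOSE REMOTE SOURCE;
```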

Everything is set up now. Let us export the data from the table: select the table and choose Export Data from the context menu. Then choose Amazon S3 as the cloud store option and provide the region and path for the S3 bucket as shown below. How to get the access key and secret key is also explained in the video tutorial.

The export is finished and the file is now placed in the S3 bucket. It can be downloaded and moved to another secure location for external usage.

Alternatively, we can run a SQL command to do this export; the EXPORT INTO statement is used for that. The code snippet can be found in the GitHub file.
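The EXPORT INTO variant looks roughly like this (the region, keys, bucket, and table name are placeholders to substitute with your own values):

```sql
-- Export table data as a CSV file directly into the S3 bucket
EXPORT INTO CSV FILE 's3-<region>://<access_key>:<secret_key>@<bucket_name>/<file_name>.csv'
FROM <schema_name>.<table_name>;
```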

That is it: we have successfully exported data from a HANA Cloud table to Amazon S3 cloud storage. With this feature, you can also export catalog objects and data from calculation views.

A video recording of this session is below. https://youtu.be/z1fnNJZuXek

 

Thanks

Sreekanth

5 Comments
Vivek Kumareng verma

      Nice blog

Anton Efremov

      Thank you Sreekanth for your blog!

Another helpful statement at the end, as step 5, was:

      SET PSE SSL PURPOSE REMOTE SOURCE;

      Some additional information can be found here.

Michael Kaufmann

      Hi Anton, the link does not work (seems to be internal). Regards, Michael

Prashanth Mally

Providing a few steps which worked for me:

      Login to SQL Console

      Step#1

      CREATE PSE AWS; -- AWS can be any Name

      Step#2

CREATE CERTIFICATE FROM  -- Ask your AWS team to provide the AWS S3 bucket's certificate
'-----BEGIN CERTIFICATE-----
MIIEDzCCAve
-----END CERTIFICATE-----'
COMMENT 'S3 AWS Extract';

*Please note -- after executing the above statement, you will find the certificate ID in the CERTIFICATES table

      Step#3

SELECT * FROM CERTIFICATES WHERE COMMENT = 'S3 AWS Extract';  -- Take the certificate ID and use it in the ALTER statement below

      Step#4

      ALTER PSE AWS ADD CERTIFICATE XXX170;

-- If anything goes wrong and you decide to remove the certificate, use the ALTER statement with DROP (example: ALTER PSE AWS DROP CERTIFICATE XXX170;)

      Step#5
SET PSE AWS PURPOSE REMOTE SOURCE; -- This final step is important and is missing from the GitHub file

I just performed the steps above and they worked for me; hopefully this helps others too.

Ivan Despotovic

      Thank you, Sreekanth, for a great blog.

I have a similar requirement, but I am struggling to find a solution. I was wondering if someone could assist.

I need to export data from HANA Cloud to an Azure blob container in a specific file format (CSV or Parquet). I followed the Importing and Exporting with Microsoft Azure Storage guide and was able to export data using an Azure SAS token.

The problem is that, at my client, all Azure services (in this case the Azure storage account/container) must be accessed using a service principal; the SAS token approach is not allowed.

So my question is: is it possible to use the EXPORT INTO functionality to export data from HANA Cloud to Azure blob storage using Azure service principal credentials instead of a SAS token?

      Best regards,
      Ivan