Exporting Data from SAP HANA Cloud to Amazon S3 Storage
SAP HANA Cloud offers data import and export options for public cloud storage services such as Amazon S3, Azure Storage, and Alibaba Cloud OSS. In this blog, I will explain how to export data from SAP HANA Cloud objects to Amazon S3 storage. The process is the same for the other cloud services as well. For this workshop, I am using an SAP HANA Cloud trial and an AWS Free Tier S3 bucket.
First, I am going to open the Database Explorer, invoke the SQL console, and create a database table with some data records for the export.
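A minimal sketch of this step (the table name, columns, and sample rows here are my own, purely for illustration):

```sql
-- Hypothetical demo table to be exported later
CREATE TABLE EMPLOYEES (
    ID    INTEGER PRIMARY KEY,
    NAME  NVARCHAR(100),
    CITY  NVARCHAR(50)
);

-- A few sample records
INSERT INTO EMPLOYEES VALUES (1, 'John Doe', 'Toronto');
INSERT INTO EMPLOYEES VALUES (2, 'Jane Smith', 'Vancouver');
```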
In the next step, I am going to set up the Amazon S3 storage service and create a bucket to store the data files. I have chosen the Canada data center and configured the public access settings.
Now, I am going to get the trust certificate from the Amazon S3 service. The steps to obtain the certificate are shown in the video below. I have also provided the certificate in this GitHub file, along with the other SQL statements to be run for creating the SSL trust, so that you can easily access the code.
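The trust setup boils down to a few SQL statements. A sketch is shown below (the PSE name and comment are arbitrary, and the full PEM-encoded certificate body goes between the markers):

```sql
-- Create a personal security environment (the name is arbitrary)
CREATE PSE AWS;

-- Add the Amazon trust certificate (paste the full PEM body between the markers)
CREATE CERTIFICATE FROM
'-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----'
COMMENT 'S3 AWS Extract';

-- Look up the generated certificate ID
SELECT CERTIFICATE_ID FROM CERTIFICATES WHERE COMMENT = 'S3 AWS Extract';

-- Attach the certificate to the PSE, using the ID returned above
ALTER PSE AWS ADD CERTIFICATE <certificate_id>;
```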
Everything is set up now. Let us export the data from the table: select the table and choose Export Data from the context menu. Then choose Amazon S3 as the cloud store option, and provide the region and path for the S3 bucket as shown below. How to get the access key and secret key is also explained in the video tutorial.
The export is finished and the file is now placed in the S3 bucket. It can be downloaded and moved to another secure location for external usage.
Alternatively, we can run a SQL command to do this export; the EXPORT INTO statement is used for that. The code snippet can be found in this GitHub file.
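A sketch of what such a statement can look like (the bucket name, access key, secret key, and table name below are placeholders; the region segment must match your bucket's region):

```sql
-- Export a table as CSV directly to an S3 bucket
-- Path format: s3-<region>://<access_key>:<secret_key>@<bucket_name>/<object_key>
EXPORT INTO CSV FILE
  's3-ca-central-1://<access_key>:<secret_key>@<bucket_name>/employees.csv'
FROM EMPLOYEES;
```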
That is it; we have now successfully exported data from a HANA Cloud table to Amazon S3 cloud storage. With this feature, you can also export catalog objects and data from calculation views.
The video recording of this session is below. https://youtu.be/z1fnNJZuXek
Thank you Sreekanth for your blog!
Another helpful statement, which I added at the end as step 5, was:
Some additional information can be found here.
Hi Anton, the link does not work (seems to be internal). Regards, Michael
Providing a few steps which worked for me:
Log in to the SQL console
CREATE PSE AWS; -- AWS can be any name
CREATE CERTIFICATE FROM -- ask your AWS team to provide the AWS S3 bucket's certificate
'-----BEGIN CERTIFICATE-----
MIIEDzCCAve...
-----END CERTIFICATE-----'
COMMENT 'S3 AWS Extract';
-- Please note: after executing the above statement, you will find the certificate ID in the CERTIFICATES view
SELECT * FROM CERTIFICATES WHERE COMMENT = 'S3 AWS Extract'; -- take the certificate ID and use it in the ALTER statement below
ALTER PSE AWS ADD CERTIFICATE XXX170;
-- If anything goes wrong and you decide to delete or drop it, use the ALTER statement with DROP (example: ALTER PSE AWS DROP CERTIFICATE XXX170)
Thank you, Sreekanth, for a great blog.
I have a similar requirement, but I struggle to find a solution. I was wondering if someone could assist.
I needed to export data from HANA Cloud to an Azure blob container in a specific file format (CSV or Parquet). I followed the Importing and Exporting with Microsoft Azure Storage guide and was able to export data using an Azure SAS token.
The problem is that, at my client, all Azure services (in this case, the Azure storage account/container) must be accessed using a service principal, as the SAS token approach is not allowed.
So, my question is: is it possible to use the EXPORT INTO functionality to export data from HANA Cloud to Azure blob storage using Azure service principal credentials instead of a SAS token?