Using the SAP HANA Cloud Athena API adapter to access S3 data

In this blog post we will explore how to access CSV data from Amazon S3 in SAP HANA Cloud. This can be a handy approach when customers want to ingest S3 data directly into SAP HANA Cloud.

Prerequisites
- You should have admin access to the AWS console, along with the Access Key ID and Secret Access Key for your AWS user.
- You have SAP Cloud Platform access and an SAP HANA Cloud instance.
Log into the AWS console and perform the following steps.
- Edit your AWS user and attach the following two policies:
  - AWSQuicksightAthenaAccess
  - AmazonAthenaFullAccess
- Create an S3 bucket in AWS with two folders: one for uploading the CSV data file and the other to store query results.
- Create an Athena service, then click on “set up a query results location in Amazon S3”.
- Set the query results location to the query-results folder in the S3 bucket.
- Click on the workgroup and add the query results location there as well.
- Create an Athena database and a table based on the CSV file. Use the “create table (from S3 bucket data)” wizard in the query editor.
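For reference, the wizard generates a Hive-style `CREATE EXTERNAL TABLE` statement behind the scenes. A minimal sketch of what that DDL looks like for a CSV file with a header row; the database, table, column, bucket, and folder names below are all placeholders, not values from this walkthrough:

```sql
-- Illustrative Athena DDL for a CSV file stored in S3.
-- Replace mydb, table1, the columns, and the S3 location with your own.
CREATE EXTERNAL TABLE IF NOT EXISTS mydb.table1 (
  id INT,
  name STRING,
  amount DOUBLE
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LOCATION 's3://<bucket_name>/data/'
TBLPROPERTIES ('skip.header.line.count' = '1');
```

Pointing `LOCATION` at the data folder (not the query-results folder) keeps Athena's own output files out of the table scan.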
- Open the SAP HANA database explorer from the Open in option under the SAP HANA service information. Note that you need the CREATE REMOTE SOURCE system privilege; if your user does not have it, ask your administrator to grant it.
- The Athena adapter needs both Athena and S3 certificates for TLS/SSL. Download the required root certificates from the following sites:
  - https://aws.amazon.com/ (Amazon Root CA 1)
  - https://<bucket_name>.s3.amazonaws.com (Baltimore CyberTrust Root)

  I would suggest downloading all the certificates from these two sites: https://aws.amazon.com has five certificates, and https://hscbucket.s3.amazonaws.com has three.
- Once the certificates are downloaded, run the following set of commands to create a PSE, add the certificates, and set a purpose:
CREATE PSE PSE1;
CREATE CERTIFICATE FROM '
-----BEGIN CERTIFICATE-----
MIIF+DCCBOCgAwIBAgIQA7LS7ojPMbWMYHYrCWXTaDANBgkqhkiG9w0BAQsFA
## lines removed ##
MlnNeJZsTV8qxACwZEO9v+mLjAc5gGRK/T4jZTMwUF2O1QJcvSzuW5B9y4w=
-----END CERTIFICATE-----
' COMMENT 'AWS1CERT';
- Repeat the above CREATE CERTIFICATE FROM command for all eight certificates.
- Add the certificates to the PSE created above:
SELECT CERTIFICATE_ID FROM CERTIFICATES WHERE COMMENT = '<comment>';
ALTER PSE PSE1 ADD CERTIFICATE <certificate_id>;
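Putting the two statements together for one certificate: assuming the certificate created earlier with comment 'AWS1CERT', and an illustrative certificate ID of 475 (the actual ID will differ in your system), the sequence looks like this:

```sql
-- Look up the certificate ID by the comment given at creation time
SELECT CERTIFICATE_ID FROM CERTIFICATES WHERE COMMENT = 'AWS1CERT';

-- Suppose the query returned 475; add that certificate to the PSE
ALTER PSE PSE1 ADD CERTIFICATE 475;
```

Repeat this lookup-and-add pair once per certificate comment until all eight certificates are in the PSE.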
- Set the purpose of the PSE to remote source using the following SQL:
SET PSE PSE1 PURPOSE REMOTE SOURCE;
Here is a screenshot of my certificate store once the above exercise is done.
- Now you can create a new remote source by right-clicking Remote Sources in the database explorer. You will need the Access Key ID and Secret Access Key for the AWS user, and you will need to choose Athena as the adapter.
- You will be able to see the databases and tables you created in Athena. You can now check the box against the table name and create a virtual object.
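Checking the box in the UI is equivalent to issuing a CREATE VIRTUAL TABLE statement. A minimal sketch, assuming the remote source was named ATHENA_RS; the remote database, schema, and table names are placeholders that you should copy from what the database explorer shows for your system:

```sql
-- Create a virtual table in the DBADMIN schema that points at the Athena table.
-- The remote name is <remote source>.<database>.<schema>.<object>.
CREATE VIRTUAL TABLE "DBADMIN"."table1"
  AT "ATHENA_RS"."<database>"."<schema>"."<table>";
```

The virtual table stores no data in SAP HANA Cloud; queries against it are delegated to Athena at run time.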
- Now you will be able to see table1 in the schema DBADMIN.
- You can now display the data of the S3 bucket in SAP HANA Cloud.
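As a quick sanity check, the virtual table can be queried like any other HANA table (assuming the virtual object created in the previous step is table1 in schema DBADMIN):

```sql
-- This query is federated: SAP HANA Cloud sends it to Athena,
-- which in turn reads the CSV data from the S3 bucket.
SELECT * FROM "DBADMIN"."table1" LIMIT 10;
```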
Conclusion

In this blog post, we created an S3 bucket and loaded a CSV file into it. We then created an Athena table on top of it. After that we created a remote source using the SAP HANA Cloud Athena API adapter and created a virtual table in SAP HANA Cloud that accesses data from AWS Athena. With the right configuration on AWS and the right SSL certificates in SAP HANA Cloud, it is quite simple to consume S3 data in SAP HANA Cloud.