Vitaliy-R
Developer Advocate






This post builds on top of the previous ones, but hopefully it also works as a standalone reference without you having to go through all the earlier posts.

In my SAP HANA database in the SAP HANA Cloud trial, I have four calculation views built: three dimensions (Date_Dim, Item, and Warehouse) and one cube with a star join (Inventory).


In this post let's see how to use the SAP Analytics Cloud trial to create a live data connection to the SAP HANA Cloud database using a technical user authenticated with a password. That connection would allow us to build stories or analytic applications using the data from the cube.

Some time ago ian.henry already described how to Connect SAP Analytics Cloud to HANA Cloud in Three Clicks. We are going to follow the same clicks, but will practice a few different options for users and the required authorizations in our trial environment.

Please note that we use a trial environment to learn a few things, not to describe the single best practice 🤓

Option 1: Reuse of the existing HDI Service Key


The aforementioned documentation mentions that the user used for the connection would "need the access_role and external_privileges_role for the HDI container".

Checking these roles, we can see they are assigned to the existing application user (with the _RT suffix) used in our project in SAP Business Application Studio, as well as in the Database Explorer, to work with the project's target container. This user comes from the shared HDI service key SharedDevKey, also used in the .env file, as we saw in Modeling in SAP HANA Cloud (part 2): starting with the project files and users.


The access_role specifically gives the user the SELECT privilege on the whole HDI container's schema, which contains our calculation views.
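
If you want to verify this yourself, you can run a quick query against the GRANTED_ROLES system view, for example in a SQL console opened as DBAdmin. A small sketch; the LIKE pattern simply matches any user name ending in _RT:
-- List roles granted to the runtime (_RT) application users
SELECT GRANTEE, ROLE_NAME
  FROM GRANTED_ROLES
 WHERE GRANTEE LIKE '%\_RT' ESCAPE '\';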

It sounds like we can quickly reuse this user as the technical user for our HANA live connection in SAP Analytics Cloud. Let's try!

Create a connection in SAP Analytics Cloud


Please note that I am using a trial tenant of SAP Analytics Cloud, which has already been updated to version 2021.9, and therefore has the upgraded UI described in Introducing a New Product Navigation for SAP Analytics Cloud by sean.mcgregor.


So, let's add a new entry in Connections with a live data connection to SAP HANA.

Back in SAP Business Application Studio, collect the values required for the connection definition, similarly to the way it was done in previous posts.
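# Extract the host, user, and password for the connection from the VCAP_SERVICES value stored in db/.env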
cat db/.env | cut -d "=" -f 2- -z | \
~/bin/jq '.hana[] | select(.tags[]=="mta-resource-name:hdi_db") .credentials | {host: .host, user: .user, password: .password}'


Now let's use the host (without the port number!), user, and password values in SAP Analytics Cloud.



Create a model


Go to the Modeler tool in SAP Analytics Cloud and create a new live data model, selecting SAP HANA as the system type, the previously created connection HCTrialTPC, and the Inventory cube as the data source.


We do not need to modify anything in this example, so simply save the model as Inventory in a new folder TPC.



Check data in a simple story


Once the model is saved, the navigation in the new shell bar allows you to quickly create a new story based on it.


Because of the mandatory Date-to-Report variable defined in the HANA calculation view, we need to provide a date in the story's prompt. Let's pick 1998-01-01, which has data in the cube.


Then, in the story, let's quickly check the data by switching to the Data view and adding the dimension Category to display a default bar chart for the only measure in the model, Quantity on Hand.


That's all we need for now to check model and story building with the connection we have created.

Our focus today is on other options to create working connections.

Option 2: A dedicated Service Key


Using the same user for everything might be a bad idea even in a trial environment. Another approach would be to create a separate service key (and a separate set of database users) for our HDI container.

As a reminder, you can get the target container's service instance name by running the following command...
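# Read the HDI container's service instance name from the credentials stored in db/.env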
cat db/.env | cut -d "=" -f 2- -z | \
~/bin/jq -r '.hana[] | select(.tags[]=="mta-resource-name:hdi_db").instance_name'

... and then use the service name to create a service key called e.g. SACKey, and retrieve the user and the password of the application (_RT) user created with the new key.
cf create-service-key tpcds-hdidb-ws-x4bb9 SACKey
cf service-key tpcds-hdidb-ws-x4bb9 SACKey
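
If you prefer to grab just the relevant values, you can pipe the key through jq again. A small sketch, assuming the cf CLI prints two header lines before the JSON document (as cf CLI v6 does; adjust the tail offset if your version formats the output differently):
cf service-key tpcds-hdidb-ws-x4bb9 SACKey | tail -n +3 | \
~/bin/jq '{host: .host, user: .user, password: .password}'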




Obviously, we could get the same done from the SAP BTP Cockpit, but we are trying to reduce the number of jumps between windows and clicks 🤓



Let's check the roles assigned to the new user TPCDS_HDI_DB_1_7JZOO8ISOFUVQKP00K9V8MVEX_RT. Indeed, it automatically got the container's ::access_role assigned.


Let's try this user in the connection.

Replace the user in the connection in SAP Analytics Cloud


Edit the previously created connection, replacing the user and password values with those of the newly created user, and confirm the change.



Create a story for the existing model


Now let's go to the Stories tool and create a new Canvas...


...with the data coming from the existing model.


Select the previously created model Inventory from the folder TPC, and in the next step provide the mandatory date, e.g. 1998-01-01.


Select the measure Quantity on Hand, and you should see a value returned from the cube in SAP HANA Cloud.



Remove the service key SACKey


That was our second option, but the user gets many more roles than just the ::access_role required to read data from the calculation views.

So, before we move forward, let's remove the service key we created in this step.
cf delete-service-key tpcds-hdidb-ws-x4bb9 SACKey
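
To confirm the key is gone, you can list the remaining service keys of the instance:
cf service-keys tpcds-hdidb-ws-x4bb9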



Option 3: Assign only ::access_role to a new user


Let's create a new user TPC_ANALYST and assign the container's access role to it.

In SAP Business Application Studio, create a file analytics_user_with_roles.sql in the misc folder with the following content.
CREATE USER TPC_ANALYST PASSWORD "H3LL0C!oud" NO FORCE_FIRST_PASSWORD_CHANGE SET USERGROUP DEFAULT;
GRANT "TPCDS_HDI_DB_1::access_role" TO "TPC_ANALYST";

As a reminder, in the previous post Modeling in SAP HANA Cloud (part 1): set up the project in SAP Business Application Studio we installed SAP HANA Clients and created an entry HANACLOUDTRIAL_DBADMIN in the local user store for the DBAdmin database user.
~/sap/hdbclient/hdbuserstore list
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -I misc/analytics_user_with_roles.sql
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -A \
"SELECT GRANTEE, ROLE_NAME, GRANTOR FROM GRANTED_ROLES WHERE GRANTEE='TPC_ANALYST';"




Granting this role was possible thanks to the fact that the DBAdmin user has the ROLE ADMIN system privilege.
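
You can verify this from the terminal as well; system privileges are listed in the GRANTED_PRIVILEGES view alongside object privileges. A quick sketch following the same hdbsql pattern as above:
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -A \
"SELECT GRANTEE, PRIVILEGE FROM GRANTED_PRIVILEGES WHERE GRANTEE='DBADMIN' AND OBJECT_TYPE='SYSTEMPRIVILEGE';"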



From now on I'll skip the steps of updating the connection and verifying the objects in SAP Analytics Cloud, but feel free to check that the modified security setup still works.

Revoke the role from the user


Let's say we do not want to give the technical user used in the connection from SAP Analytics Cloud the DELETE, INSERT, and UPDATE privileges on the schema, which come with the ::access_role.


Using the DBAdmin user you can revoke the role, e.g. by executing the following commands in the terminal.
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN \
"REVOKE \"TPCDS_HDI_DB_1::access_role\" FROM \"TPC_ANALYST\";"
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -A \
"SELECT GRANTEE, ROLE_NAME, GRANTOR FROM GRANTED_ROLES WHERE GRANTEE='TPC_ANALYST';"



Option 4: Assign privileges to SELECT from the container's schema


In a real project, the other issue with the previous option might be that a developer like you or me would not have access to DBAdmin or another user with the system-wide ROLE ADMIN privilege. In such a case we should use the HDI container API SQL procedures GRANT_CONTAINER_SCHEMA_ROLES or GRANT_CONTAINER_SCHEMA_PRIVILEGES.

The latter grants privileges on the entire schema, and it is what we are going to use now. The procedure is located in the container's #DI schema, and the privilege to execute it is given to the design-time (_DT) users.


Let's open an SQL Console as an "Admin" of the HDI container in SAP Database Explorer...


...and verify the user and the schema.
SELECT 'Current user' as "Property", Current_User as "Value" FROM DUMMY
UNION ALL
SELECT 'Current schema', Current_Schema FROM DUMMY;


The current user is the _DT user, as required.

Let's copy the schema name and append #DI to it before setting the schema for the execution of the following SQL statements, which assign the SELECT privilege on the container's schema to the user TPC_ANALYST.
SET SCHEMA "TPCDS_HDI_DB_1#DI";
CREATE LOCAL TEMPORARY COLUMN TABLE "#PRIVILEGES" LIKE "_SYS_DI"."TT_SCHEMA_PRIVILEGES";
INSERT INTO "#PRIVILEGES"("PRIVILEGE_NAME", "PRINCIPAL_SCHEMA_NAME", "PRINCIPAL_NAME")
VALUES ('SELECT', '', 'TPC_ANALYST');
CALL "GRANT_CONTAINER_SCHEMA_PRIVILEGES" ("#PRIVILEGES", "_SYS_DI"."T_NO_PARAMETERS", ?, ?, ?);
DROP TABLE "#PRIVILEGES";
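
If you want to double-check the result in SQL rather than in a screenshot, a user with catalog read access, like DBAdmin, can query the GRANTED_PRIVILEGES view. A sketch reusing the user store entry from above:
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -A \
"SELECT GRANTEE, SCHEMA_NAME, PRIVILEGE FROM GRANTED_PRIVILEGES WHERE GRANTEE='TPC_ANALYST';"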


We should see the privilege assigned to the user...


...and it can be used successfully in the connection defined in the SAP Analytics Cloud tenant.

Revoke privileges on the schema


But let's say we want to be even more restrictive about what the technical user used in the connection can see, and that the user should not have SELECT access to all the objects in the container's schema.

Change the previous code from GRANT_CONTAINER_SCHEMA_PRIVILEGES to REVOKE_CONTAINER_SCHEMA_PRIVILEGES and re-execute the SQL statements.
SET SCHEMA "TPCDS_HDI_DB_1#DI";
CREATE LOCAL TEMPORARY COLUMN TABLE "#PRIVILEGES" LIKE "_SYS_DI"."TT_SCHEMA_PRIVILEGES";
INSERT INTO "#PRIVILEGES"("PRIVILEGE_NAME", "PRINCIPAL_SCHEMA_NAME", "PRINCIPAL_NAME")
VALUES ('SELECT', '', 'TPC_ANALYST');
CALL "REVOKE_CONTAINER_SCHEMA_PRIVILEGES" ("#PRIVILEGES", "_SYS_DI"."T_NO_PARAMETERS", ?, ?, ?);
DROP TABLE "#PRIVILEGES";


Refreshing the previous screen in SAP HANA Cockpit's security management should show that the privilege is no longer assigned to the user.



Option 5: Create a role with .hdbrole and grant it


In our project in SAP Business Application Studio let's create a new CalcViews_Select artifact of the type Role (hdbrole).


Then, using the graphical editor, add all four calculation views with the SELECT privilege...


...which will produce the following code.
{
  "role": {
    "name": "vital.tpcds::CalcViews_Select",
    "object_privileges": [
      {
        "name": "vital.tpcds::Inventory",
        "type": "VIEW",
        "privileges": [ "SELECT" ]
      },
      {
        "name": "vital.tpcds::Item",
        "type": "VIEW",
        "privileges": [ "SELECT" ]
      },
      {
        "name": "vital.tpcds::Warehouse",
        "type": "VIEW",
        "privileges": [ "SELECT" ]
      },
      {
        "name": "vital.tpcds::Date_Dim",
        "type": "VIEW",
        "privileges": [ "SELECT" ]
      }
    ]
  }
}

Deploy it and check that the role has been created.
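
One way to check is to look for the role in the ROLES system view; HDI roles are schema-local, so they show up with the container schema as the role schema. A sketch, again run as DBAdmin:
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -A \
"SELECT ROLE_SCHEMA_NAME, ROLE_NAME FROM ROLES WHERE ROLE_NAME='vital.tpcds::CalcViews_Select';"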


Now, similarly to the previous option, let's use the HDI container SQL API, but this time with the GRANT_CONTAINER_SCHEMA_ROLES procedure.
SET SCHEMA "TPCDS_HDI_DB_1#DI";
CREATE LOCAL TEMPORARY COLUMN TABLE "#ROLES" LIKE "_SYS_DI"."TT_SCHEMA_ROLES";
INSERT INTO "#ROLES"("ROLE_NAME", "PRINCIPAL_SCHEMA_NAME", "PRINCIPAL_NAME")
VALUES ('vital.tpcds::CalcViews_Select', '', 'TPC_ANALYST');
CALL "GRANT_CONTAINER_SCHEMA_ROLES" ("#ROLES", "_SYS_DI"."T_NO_PARAMETERS", ?, ?, ?);
--CALL "REVOKE_CONTAINER_SCHEMA_ROLES" ("#ROLES", "_SYS_DI"."T_NO_PARAMETERS", ?, ?, ?);
DROP TABLE "#ROLES";
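
As before, the assignment can also be verified with the GRANTED_ROLES query we used earlier:
~/sap/hdbclient/hdbsql -U HANACLOUDTRIAL_DBADMIN -A \
"SELECT GRANTEE, ROLE_SCHEMA_NAME, ROLE_NAME FROM GRANTED_ROLES WHERE GRANTEE='TPC_ANALYST';"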


Back in SAP HANA Cockpit you should see the TPC_ANALYST user now assigned to the role (after hitting the Refresh icon, if required).


You can check that the user with these authorizations still allows you to analyze data from these calculation views in the SAP HANA Cloud instance.

Similarly to the previous options, you can use the REVOKE_CONTAINER_SCHEMA_ROLES procedure, but I am going to leave the role assigned for now.

Commit changes to the local Git repository


Before we close this post let’s commit our changes to the local Git repository.
git add --all
git commit -am "Add the role to select from calc views"
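# 'git hist' is a custom alias for a formatted 'git log', configured earlier in this series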
git hist







And with that, we are done with the basic exercise of bringing data from CSV files in cloud-native storage into the SAP HANA Cloud database and then modeling calculation views to expose this data for live analysis in SAP Analytics Cloud.

If you have any comments, or would do something differently or additionally, please share your thoughts in the comments!
-Vitaliy, aka @Sygyzmundovych