As our customers adopt SAP Data Warehouse Cloud, we often need to help them set up new users for both training and productive use.  This can be a significant administrative task when there are many users, spaces, connections, and shares needed for each user.  NOTE: SAP provides the SAP Data Warehouse Cloud command line interface (CLI) for automating some provisioning tasks.

For a specific customer, we needed to create 30 training users, along with a Space per user, multiple Connections per user, and numerous shares from a common space.  This could all have been accomplished using the SAP Data Warehouse Cloud user interface, but we wanted to go faster and, more importantly, make the process repeatable.

To support these types of automated and repeatable provisioning activities, we created a utility, called provisioner, to simplify and automate these setup steps.

This is the first of a series of blog posts on this tool.  Future blogs will review the code and provide deep dives into all the ways provisioner can help automate the management of SAP Data Warehouse Cloud environments.

Creating Users


Users for our training environments are always created using the SAP Data Warehouse Cloud "Importing or Modifying Users from a File" feature (link).  This built-in feature quickly creates the needed users, but we also wanted visibility into how often each user accesses our tenant.

The provisioner utility "users list" command shows us more information - specifically, how often a user has visited the tenant.
provisioner users list --wildcard company.com

Now with a simple command, I can list any user with "company.com" anywhere in their definition (--wildcard).  The output is configurable, so I can see not only when each user last logged into the tenant but also how active they have been.
User        Email                        Last Login  Days Visited
----------  ---------------------------  ----------  ------------
XCOCHRAN    xavier.cochran@company.com   2022-06-22  20
BCROW       bob.crow@company.com         2022-06-28  1
BPURDY      barb.purdy@company.com       None        None
CPATTON     curt.patton@company.com      2022-06-28  5
DTAYLOR     d.taylor@company.com         2022-03-14  1
DVALENCIA   danny.valencia@company.com   None        None
EMCMIILLAN  sally.mcmillan@company.com   None        None
GFERRAZ     greg.foster@company.com      None        None
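Under the hood, a --wildcard filter of this kind reduces to a case-insensitive substring test across a user's fields. Here is a minimal Python sketch of the idea, using a hypothetical record layout (not provisioner's actual schema):

```python
# Minimal sketch of a --wildcard style filter over user records.
# The field names below are hypothetical, not provisioner's actual schema.
users = [
    {"user": "BCROW", "email": "bob.crow@company.com", "last_login": "2022-06-28"},
    {"user": "DTAYLOR", "email": "d.taylor@other.org", "last_login": "2022-03-14"},
]

def wildcard_match(record, pattern):
    """True if the pattern appears, case-insensitively, in any field of the record."""
    p = pattern.lower()
    return any(p in str(value).lower() for value in record.values())

matches = [u["user"] for u in users if wildcard_match(u, "company.com")]
print(matches)  # ['BCROW']
```

Because the match runs over every field, the same pattern can hit the user ID, the email, or any other attribute in the definition.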

Everything SAP Data Warehouse Cloud knows about a user is available for reporting and analytics.  Look for more information in future blogs.

Creating Spaces


The next step for each of our new users was to create a space just for them.  Of course, SAP Data Warehouse Cloud lets administrators create Spaces using the GUI.  There are also blogs (see here) that show how the process can be automated.  We wanted something even simpler and faster.

To create a single space, the provisioner has the "spaces create" command:
provisioner spaces create \
--template COMMON_TEMPLATE \
--disk 1 \
--memory .5 \
MY_NEW_SPACE \
mark.greynolds@sap.com otheruser@company.com anotheruser@company.com

What's really helpful in this command is the ability to reference an existing space (--template) to copy options (disk, memory, or priority) from one space to another.  In this example, I also specified override values for disk and memory.  As a last step in the command, I specified the users I wanted to be members of the space - this can be any number of users.
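Conceptually, the --template logic starts from the template space's options and overlays any flags given explicitly. A rough Python sketch of that merge (the option names are illustrative, not provisioner's actual internals):

```python
def merge_space_options(template_options, overrides):
    """Start from the template space's options, then apply any explicit flags."""
    merged = dict(template_options)
    merged.update({k: v for k, v in overrides.items() if v is not None})
    return merged

template = {"disk": 2.0, "memory": 1.0, "priority": 5}   # read from COMMON_TEMPLATE
flags = {"disk": 1.0, "memory": 0.5, "priority": None}   # --disk 1 --memory .5
print(merge_space_options(template, flags))
# {'disk': 1.0, 'memory': 0.5, 'priority': 5}
```

Any option not overridden on the command line (priority, in this sketch) keeps the template's value.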

In many of our engagements, we need to create many unique spaces, one for each person new to SAP Data Warehouse Cloud.  While the command above is useful, we needed something even faster.  We added the provisioner command "spaces bulk create" to create a list of spaces from a CSV file.
provisioner spaces bulk create \
--skip 1 \
--force \
--template COMMON_TEMPLATE \
spaces-example.csv

With this example, we skip the first line of the CSV file (--skip), i.e., the heading line, re-create each space even if it already exists (--force), specify the template Space, and finally provide the name of a CSV file containing the list of Spaces to create. Note: --force drops the space if it already exists before creating it again.
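Conceptually, the bulk operation parses the CSV into one create request per row. A minimal Python sketch of that parsing (the column layout here is an assumption, matching the format described below):

```python
import csv
import io

def parse_spaces_csv(text, skip=1):
    """Parse a spaces CSV, skipping heading lines, into one create request per row."""
    rows = list(csv.reader(io.StringIO(text)))[skip:]
    return [
        # Assumed column order: Space Id, Label, Disk, Memory.
        {"space_id": r[0], "label": r[1], "disk": float(r[2]), "memory": float(r[3])}
        for r in rows
    ]

sample = "Space Id,Label,Disk,Memory\nTRAINING_01,Training One,1,.5\n"
print(parse_spaces_csv(sample))
# [{'space_id': 'TRAINING_01', 'label': 'Training One', 'disk': 1.0, 'memory': 0.5}]
```

Each parsed row would then be handed to the same create logic the single-space command uses.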

The format of the CSV is very straightforward:

Column    Description
--------  -----------
Space Id  The technical name of the new Space - see here.
Label     The business name of the space; maximum of 30 characters, spaces and special characters allowed.
Disk      Amount of disk storage to assign in gigabytes, e.g., 1 = 1 GB.
Memory    Amount of in-memory storage to assign in gigabytes, e.g., .5 = 500 MB.
Template  The name of the template space (if any) to use as the definition.  A different template can be specified per space.
Force     Recreate the space if already present - this deletes the Space before creating the new version.
Users     List of members added to the new space.


Spaces CSV file format
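A spaces-example.csv matching this layout might look like the following (the space names, sizes, and users are illustrative; the column order follows the table above):

```
Space Id,Label,Disk,Memory,Template,Force,Users
TRAINING_01,Training Space 1,1,.5,COMMON_TEMPLATE,true,user1@company.com
TRAINING_02,Training Space 2,1,.5,COMMON_TEMPLATE,true,user2@company.com
```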



Connections


Setting up a set of training users also means giving each user the same connections, perhaps as identified in a training guide.  SAP Data Warehouse Cloud provides a GUI and each user could create their own connections, but this may be too advanced or tedious for new users.  We created a provisioner command to quickly create the same connection in many spaces at the same time.
provisioner connections create \
--wildcard \
--force \
TRAINING \
connection-def-file.json

This command uses the --wildcard option to match any space whose name contains the word TRAINING; provisioner then force-creates the connection defined in the JSON file in each matching space.

The contents of the JSON file are specific to the type of connection being created; an SAP HANA connection differs from an ABAP connection, and both differ from a generic ODBC connection.  See below for an example of a common connection type.

Shares


One of the important features of SAP Data Warehouse Cloud is the ability to share objects (tables, views, etc.) from one Space to another.  Just like all the other features of SAP Data Warehouse Cloud, there is a GUI for setting up shares, but we wanted a way to script the process.  The provisioner "shares create" command bulk shares an object to one or many other spaces.
provisioner shares create \
--wildcard \
--sourceSpace COMMON_SPACE \
--sourceObject FINANCE_DATA_VIEW \
--targetSpace TRAINING ONE_SPACE OTHER_SPACE

This example identifies the FINANCE_DATA_VIEW object (view) in the COMMON_SPACE and creates a share in all the spaces containing the word "TRAINING," e.g., TRAINING_SPACE, TRAINING_USER_1, COMMON_TRAINING.  In this example, the source object is also shared with any space containing the string ONE_SPACE or OTHER_SPACE.
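Resolving the target spaces with --wildcard amounts to a substring match over the tenant's space names. A small Python sketch of that resolution (the space list is illustrative):

```python
def resolve_targets(all_spaces, patterns):
    """Return the spaces whose name contains any of the given substrings."""
    return sorted({s for s in all_spaces
                   for p in patterns if p.upper() in s.upper()})

spaces = ["TRAINING_SPACE", "TRAINING_USER_1", "COMMON_TRAINING",
          "ONE_SPACE", "OTHER_SPACE", "FINANCE"]
print(resolve_targets(spaces, ["TRAINING", "ONE_SPACE", "OTHER_SPACE"]))
# ['COMMON_TRAINING', 'ONE_SPACE', 'OTHER_SPACE', 'TRAINING_SPACE', 'TRAINING_USER_1']
```

A space matching more than one pattern is still shared to only once, since the matches are collected as a set.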

NOTE: the --wildcard option does not affect the --sourceSpace or --sourceObject object selection parameters.

After creating the shares, the provisioner can list all the shares.  This helps us visually inspect the shares to ensure they are all present.
provisioner shares list --sourceSpace COMMON_SPACE

Conclusion


The provisioner tool lets us quickly bulk provision SAP Data Warehouse Cloud users with Spaces, Connections, and Shares.  In future blogs, I will review the provisioner tool in more detail and discuss how to install and use it in your environment.




Example Connection

This example connection is for an on-premise HANA connection using an SAP Data Provisioning agent named ONPREMISE_AGENT.
{
  "data": {
    "businessName": "HANA On-premise",
    "name": "HANA_Onpremise",
    "description": "Example of a HANA on-premise connection via DP Agent.",
    "location": {
      "agentName": "ONPREMISE_AGENT",
      "location": "agent",
      "connected": true
    },
    "credentialMode": "technicaluser",
    "configuration": {
      "ConnectionProperties": {
        "configurations": {
          "cdc_properties": {
            "prefix": "DWC6GFQQR_",
            "enable_manageable_trigger_prefix": "false",
            "conn_pool_size": "8",
            "min_scan_interval": "0",
            "max_scan_interval": "1",
            "max_batch_size": "128",
            "ddl_scan_interval": "10",
            "batch_queue_size": "64",
            "max_transaction_count": "1000",
            "max_scan_size": "50000",
            "statement_level_trigger": "false",
            "record_pk_only": "false",
            "shadow_table_type": "COLUMN",
            "trigger_queue_type": "COLUMN",
            "data_pattern_analysis": "true",
            "eventual_consistency": {
              "transmit_data_in_compaction": "false",
              "enable_transaction_merge": "false"
            }
          },
          "database": {
            "category": "onpremise",
            "host": "host.name.for.db@company.com",
            "port": "30015",
            "whitelist_table": "",
            "schema": "",
            "connection_options": "",
            "version_in_load_table_dictionary": ""
          },
          "schema_alias_replacements": {
            "schema_alias": "",
            "schema_alias_replacement": ""
          },
          "gateway": {
            "cloud_connector": "false"
          },
          "security": {
            "encrypt": "false"
          }
        }
      },
      "CredentialProperties": {
        "credentials_mode": "technicaluser",
        "credential": {
          "user": "HANA_USER",
          "password": "notRealPassword!"
        }
      },
      "ConnectionFeatures": {
        "features": {
          "remoteTables": true,
          "dataProvisioningOption": "dpAgent",
          "dpAgent": "ONPREMISE_AGENT",
          "dataflows": true
        }
      }
    },
    "typeId": "HANA",
    "capabilityHanaSdi": "true",
    "capabilityDisDataflow": "true",
    "capabilityModelTransfer": "false"
  }
}