SAP Analytics Cloud – Replicating BW Analysis Authorizations into Dimension Read/Write Property
When it comes to import models, the first question that comes to anyone’s mind is whether the authorization model will need to be redesigned all over again on SAP Analytics Cloud. In my previous blog post, I showed a method to automate authorization replication from SAP SuccessFactors for a certain use case.
In this blog post, I will tackle a much more common use case: replicating BW Analysis Authorizations into the read and write properties of SAP Analytics Cloud public dimensions using a HANA calculation view.
- At least SAP HANA 2.0 SPS 1.
- User IDs for each user on SAP Analytics Cloud to be the same as their BW users.
- If a prefix or a suffix is involved, it can be handled in the code by concatenating it to the final UNAME.
- In case the IDs are not the same, a mapping containing the UNAME (BW user ID) and the SAC user ID can be loaded either to an attribute of an InfoObject or to a new ADSO. The table containing the relevant data will then have to be joined through UNAME with “AGR_USERS” and either the InfoObject table or the ADSO table. The SAC user ID will replace all references to UNAME in the presented code, after the selection of the “Main” table.
- An import connection setup between the HANA 2.0 database and SAP Analytics Cloud.
- Backend user to have “Read” rights on the BW schema.
- Works for BW/4HANA as well as BW on HANA.
- The code is fully dynamic and can be deployed on any HANA 2.0 database. Once it is executed, you will be prompted to enter:
- The schema name
- Namespace
- InfoObject name
- No tables, functions, or any objects other than the HANA calculation view need to be created; everything is handled temporarily during execution through the use of table variables as well as scalar variables.
- It works for the Equals (EQ), Between (BT), and Contains Pattern (CP) options specified within an Analysis Authorization object.
- No redundant or derived member IDs are generated; all are retrieved from the InfoObject’s master data table.
This version of the solution does not currently handle hierarchy-based authorizations.
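Where the BW and SAC user IDs differ, the mapping join mentioned in the prerequisites could be sketched as follows. This is only an illustration: the schema “SAPABAP1” and the ADSO active table “/BIC/AZSACMAP2” with columns UNAME and SAC_UID are assumed names, not part of the delivered solution.

```sql
-- Sketch only: "/BIC/AZSACMAP2" is a hypothetical ADSO active table
-- holding the BW user (UNAME) to SAC user (SAC_UID) mapping.
SELECT m.SAC_UID, u.AGR_NAME
FROM "SAPABAP1".AGR_USERS u
JOIN "SAPABAP1"."/BIC/AZSACMAP2" m
  ON m.UNAME = u.UNAME;
```

SAC_UID would then take the place of UNAME wherever the presented code selects it, after the “Main” table is populated.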
Before I jump into the technical aspects of the code, I would like to shed some light on the basis of this idea.
Our main goal, in the end, is to get a list of specific members for a certain dimension in one column and, in another column, the user IDs of everyone that has access to each member, whether through the EQ, BT, or CP option.
The authorization-relevant InfoObject values are maintained in an analysis authorization object via transaction “RSECADMIN”. A role is then created holding the authorization “S_RS_AUTH” for that specific object, where “BIAUTH” must be set to the name of the authorization object we created first.
To get the user ID assigned to the role, and the value assigned to a certain object within the analysis authorization, we need to go through three tables:
- From the “AGR_USERS” table, we get the linkage between user IDs (UNAME) and role names (AGR_NAME).
- From the “AGR_1251” table, we get the linkage between role names and the analysis authorization object name.
- From “RSECVAL_STRING”, we get the values maintained for a certain InfoObject within a specific authorization object.
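In static form, the three-table join that the calculation view builds dynamically (shown later in the code) looks roughly like this; the hard-coded schema “SAPABAP1” is an example value that the view actually receives through the SCMA prompt:

```sql
SELECT r.tctlow, r.tcthigh, u.UNAME, r.tctiobjnm, r.tctoption, r.tctauth
FROM "SAPABAP1".AGR_USERS u            -- user id <-> role name
JOIN "SAPABAP1".AGR_1251 a             -- role name <-> authorization value
  ON u.AGR_NAME = a.AGR_NAME
JOIN "SAPABAP1".RSECVAL_STRING r       -- analysis authorization values
  ON a.low = r.tctauth;
```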
The letters beside the table names are the aliases used in the code explained below. Joins between the three tables are maintained as in the diagram; the green fields represent the unprocessed expected final outputs, while the other fields shown are not everything in scope, but the most important ones we have.
Using dynamic SQL, I dynamically select the R table of a given InfoObject through the concatenation of the schema “SCMA”, the namespace “NMSPC”, the letter “R”, and the InfoObject name “IOBJNM”. For example, for the InfoObject “ZE_PLANT”, the calculation view will fetch the values from the “/BIC/RZE_PLANT” table. The main reason for choosing the R table in particular is that it includes only the member IDs and the SIDs, with no unnecessary attribute columns to call.
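To make the concatenation concrete, this is a simplified sketch of the EXEC pattern used later in the code, shown with the example values SCMA = 'SAPABAP1', NMSPC = 'BIC', and IOBJNM = 'ZE_PLANT':

```sql
-- The concatenation below resolves to:
--   SELECT "/BIC/ZE_PLANT" AS mid FROM "SAPABAP1"."/BIC/RZE_PLANT"
EXEC 'SELECT "/'||NMSPC||'/'||IOBJNM||'" AS mid
      FROM "'||SCMA||'"."/'||NMSPC||'/R'||IOBJNM||'"' INTO RES;
```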
Step by Step Guide
Log in to the HANA database. In a new or an existing package, right-click and select New > Calculation View, specify a name, then make sure you set the type to “SQL Script”.
From the left panel, select “Script_View”. In the right window, under the Output panel, click (+) Add. We need to create two attributes: “MID” as NVARCHAR(60) and “READ” as NVARCHAR(1333) (similar in size to the NCLOB type), plus one INTEGER measure, “DUMMY”, which is a placeholder (having a measure is mandatory in any calculation view).
Right-click Parameters > New. We need to create three prompts: SCMA NVARCHAR(40), NMSPC NVARCHAR(10), and IOBJNM NVARCHAR(15). All of them are of parameter type “Direct”, and all of them are mandatory.
Below is the SCMA parameter setup and default value; basing the default value on your application database schema will save the effort of having to enter it each time the view is called.
From the left panel, click “Semantics”. Mark MID and READ as attributes by selecting each of them and clicking the blue rectangle in the top left panel; as for the DUMMY column, set it to a measure by clicking the orange rectangle.
This is how your Output panel should look right now:
Now let’s get to the code itself:
/********* Begin Procedure Script ************/
BEGIN
  -- Main table from which the user ids per dimension member will be captured
  DECLARE MAIN TABLE (id NVARCHAR(60), tcthigh NVARCHAR(60), read NVARCHAR(1333), iobj NVARCHAR(15), option NVARCHAR(2), tctauth NVARCHAR(15));
  -- Final 'EQ' result table, straightforward
  DECLARE EQ TABLE (mid NVARCHAR(60), read NVARCHAR(1333));
  -- Result set of filtering the Main table based on the option 'BT'
  DECLARE BTRES TABLE (id NVARCHAR(60), tcthigh NVARCHAR(60), read NVARCHAR(1333), rnum INTEGER);
  -- Initial loop result table, used as a temp area supplying the final value in the 'BT' table
  DECLARE RES TABLE (mid NVARCHAR(60), read NVARCHAR(1333));
  -- Final 'BT' result table, containing all the members between each line of range
  DECLARE BT TABLE (mid NVARCHAR(60), read NVARCHAR(1333));
  -- Count of result lines from the BT query result in BTRES
  DECLARE LNBT INTEGER;
  -- Result set of filtering the Main table based on the option 'CP'
  DECLARE CPRES TABLE (id NVARCHAR(60), tcthigh NVARCHAR(60), read NVARCHAR(1333), rnum INTEGER);
  -- Final 'CP' result table, containing all the members matching a certain pattern
  DECLARE CP TABLE (mid NVARCHAR(60), read NVARCHAR(1333));
  -- Count of result lines from the CP query result in CPRES
  DECLARE LNCP INTEGER;
  -- 'BT' and 'CP' loop-related variables
  DECLARE i INTEGER DEFAULT 1;
  DECLARE LOW NVARCHAR(60);
  DECLARE HIGH NVARCHAR(60);
  DECLARE RD NVARCHAR(1333);
  DECLARE PTRN NVARCHAR(60);

  -- Resolve user -> role -> analysis authorization -> maintained values
  EXEC 'SELECT DISTINCT tctlow AS id, tcthigh, UNAME AS read, tctiobjnm AS iobj, tctoption AS option, tctauth
        FROM (SELECT r.tctlow, r.tcthigh, u.UNAME, r.tctiobjnm, r.tctoption, r.tctauth, a.AGR_NAME, a.AUTH
              FROM "'||SCMA||'".AGR_USERS u
              JOIN "'||SCMA||'".AGR_1251 a ON u.AGR_NAME = a.AGR_NAME
              JOIN "'||SCMA||'".RSECVAL_STRING r ON a.low = r.tctauth
              WHERE r.tctlow NOT LIKE ''%:%'' AND r.objvers = ''A''
                AND r.tctauth != ''0BI_ALL'' AND r.tctiobjnm = '''||IOBJNM||''')' INTO MAIN;

  -- Handle the Equals 'EQ' scenario
  EQ = SELECT id AS mid, read FROM :MAIN WHERE option = 'EQ';

  -- Handle the Between 'BT' scenario
  BTRES = SELECT id, tcthigh, read, ROW_NUMBER() OVER (ORDER BY id ASC) AS rnum
          FROM :MAIN WHERE option = 'BT';
  SELECT COUNT(*) INTO LNBT FROM :BTRES;
  FOR i IN 1..LNBT DO
    SELECT id INTO LOW FROM :BTRES WHERE rnum = i;
    SELECT tcthigh INTO HIGH FROM :BTRES WHERE rnum = i;
    SELECT read INTO RD FROM :BTRES WHERE rnum = i;
    EXEC 'SELECT "/'||NMSPC||'/'||IOBJNM||'" AS mid, '''||RD||''' AS read
          FROM "'||SCMA||'"."/'||NMSPC||'/R'||IOBJNM||'"
          WHERE "/'||NMSPC||'/'||IOBJNM||'" BETWEEN '''||LOW||''' AND '''||HIGH||'''' INTO RES;
    INSERT INTO :BT (SELECT * FROM :RES);
    -- Note: no manual "i = i + 1" here; the FOR loop increments i itself,
    -- and incrementing it manually would skip every other range line.
  END FOR;

  -- Handle the Contains Pattern 'CP' scenario
  CPRES = SELECT id, tcthigh, read, ROW_NUMBER() OVER (ORDER BY id ASC) AS rnum
          FROM :MAIN WHERE option = 'CP';
  SELECT COUNT(*) INTO LNCP FROM :CPRES;
  FOR i IN 1..LNCP DO
    SELECT REPLACE(id, '*', '%') INTO PTRN FROM :CPRES WHERE rnum = i;
    SELECT read INTO RD FROM :CPRES WHERE rnum = i;
    EXEC 'SELECT "/'||NMSPC||'/'||IOBJNM||'" AS mid, '''||RD||''' AS read
          FROM "'||SCMA||'"."/'||NMSPC||'/R'||IOBJNM||'"
          WHERE "/'||NMSPC||'/'||IOBJNM||'" LIKE '''||PTRN||'''' INTO RES;
    INSERT INTO :CP (SELECT * FROM :RES);
  END FOR;

  -- Final result: one row per member id, with all authorized users aggregated
  var_out = SELECT DISTINCT mid, STRING_AGG(read, ';') AS read, 1 AS DUMMY
            FROM (SELECT * FROM :EQ UNION SELECT * FROM :BT UNION SELECT * FROM :CP)
            GROUP BY mid;
END
/********* End Procedure Script ************/
All that needs to be done is to copy everything above into the Script_View and activate the calculation view; then you are good to go, as simple as that.
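Before heading to SAC, the activated view can be smoke-tested directly in SQL. The package path “my_package” and view name “CV_AUTH_READ” below are placeholders for whatever names you chose:

```sql
SELECT "MID", "READ"
FROM "_SYS_BIC"."my_package/CV_AUTH_READ"
     ('PLACEHOLDER' = ('$$SCMA$$', 'SAPABAP1'),
      'PLACEHOLDER' = ('$$NMSPC$$', 'BIC'),
      'PLACEHOLDER' = ('$$IOBJNM$$', 'ZE_PLANT'));
```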
Before we see it on SAP Analytics Cloud, let me explain what this code is expected to do. Let’s assume three simple cases.
Let’s start with the dimension members in the InfoObject “ZE_PLANT”:
Case 1: USER1 has access to plant AA01 only.
Case 2: USER2 has access to plants between AA01 and AA05 only.
Case 3: USER3 has access to all plants that start with A.
Finally, if these analysis authorization objects are active and assigned to certain roles, which are in turn assigned to certain users, this would be the CV’s combined result:
This is exactly what SAP Analytics Cloud needs in order to supply the read property of “ZE_PLANT” with the appropriate users.
Finally, I will be showing actual screenshots from the calculation view results and the import job on SAP Analytics Cloud.
We have 3 users:
- TEST_B1 has access to company code 011118 only.
- TEST_B2 has access to company codes between 000000 and 099999.
- AUTH_USR5 has access to all company codes.
Now let’s create an import job in the “Data Management” tab of the public dimension that I created on SAP Analytics Cloud, “ZCOMPCOD6”. Once the calculation view is selected, the prompts will show up:
Map the dimension ID and the property:
And just like that, any change made in the analysis authorizations with respect to the company code InfoObject will be reflected through the scheduled import job. This will also work for the write property.
Thank you for your time reading this, I hope you have found this useful and informative. Hopefully the solution provided can solve one of the use cases that you currently have.
Looking forward to hearing your thoughts!