Thomas Jung

SAP Tech Bytes: SAP HANA / CAP Access Tables from a Different Schema

Introduction

I read on social media about a New Year's resolution idea: instead of answering questions sent in direct communication, write the response as a blog post and send the requester a link to the post. It sounds like a great idea to better utilize time and share knowledge, so I decided to give it a try. Fourteen days into the new year and so far I've failed spectacularly. I find that when I go to write a blog post I want to provide more background and detail. All this takes more time than you can usually squeeze into the day. This blog post represents my attempt to take at least one question I've received and answer it via blog post, although admittedly after already responding to the original request.

The question I received:

I have created one CAP project which is using HDI to store entities/tables... but now I have a requirement to reuse tables from a different schema as well.


The Question at Hand

Good question! It's a question that will eventually hit all kinds of developers. HANA native developers working with XSA, HDI, or HANA Cloud are going to encounter this. But most SAP Cloud Application Programming Model developers are going to hit it eventually as well. Maybe you need to consume an existing table from an SAP application system schema, a replicated table, a virtual table, a Calculation View from another developer's project, or even a CAP entity from a separate project. All these scenarios eventually lead to this same problem and solution.

History

Before we talk about the solution to the question, let's go back in time and discuss the reasons why this is even a question. If you aren't interested in the background and are just here for the answer, feel free to skip ahead to the next section of this blog post.

Why is this even a question? Why can’t developers simply access every table or database object in the same database instance? For the origins of this question we need to go back to the earliest days of SAP HANA. Just after the big bang and before the universe started to cool and expand…

Image: a model of the expanding universe. By NASA/WMAP Science Team, Public Domain.

Well, not quite that far back. Maybe only back to HANA 1.0 Service Pack 1 and 2. In these earliest days of HANA, it was primarily used as a secondary database for reporting. A subset of data was often replicated to a "side-car" HANA system from the transactional system and the reporting was done within HANA.

Security in the source systems (which were often ABAP-based) was generally handled at the application server layer. A technical DB user is used to connect and perform SQL statements. Therefore, when the data is replicated to HANA, no table-level security controls come over with it. Within a pure SQL database you would normally control authorizations via SQL grants directly on tables and views. Development user "DEV_A" would need SELECT authorization on any table to include it in their development objects. Likewise, business user "USER_B" would need SQL SELECT authorizations to run reports on the tables in question.

But for development with HANA, SAP wanted to simplify this approach and not tie so much security directly to the database users. We wanted the security to act more like the authorizations within the applications themselves. So there were two problems to solve: how to improve the security access for the business user reporting on the data, and how to improve the security access for the developer creating views and other reporting content within the database.

The first problem is often resolved within databases using views. A view allows you to build in restrictions and aggregations and basically control how the underlying tables are used. Say you have a Sales Order Header and an Item table. They each contain a hundred or more columns, some of which might even be sensitive financial or personal data. Yet you want to give a group of users reporting access to see aggregated sales totals. You don't want to give each reporting user access to both database tables; you only want to give them access to a view which contains a subset of the content.

And this is where the SAP HANA Calculation View was born. It uses a security abstraction approach that was already common in SQL databases for stored procedure execution, called definer rights. The basic idea of definer rights is that the underlying SQL query within a view or procedure is not performed by the user requesting the query but by the DB user who created the DB object in question.

So user DEV_A creates a view named V_SALES. V_SALES aggregates data from both SALES_HEADER and SALES_ITEM. USER_B, the business reporting user, is then granted SELECT on V_SALES but has no access to the underlying tables SALES_HEADER or SALES_ITEM. When they run a report that performs a SELECT against V_SALES, within the database the query is not performed by their user but is actually executed by DEV_A, the user who created the view. This approach solves part of the security control issues (the part concerning the business user) but still leaves us with issues around the development user. Hence the HANA Repository was born.
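
To make the pattern concrete, here is a minimal SQL sketch. The join and column names are invented for illustration; only the user, view, and table names come from the example above.

    -- Created by DEV_A, who holds SELECT on the base tables.
    -- Because the view has definer rights, its inner query always
    -- executes with DEV_A's privileges, regardless of who queries it.
    CREATE VIEW V_SALES AS
      SELECT I.PRODUCT_ID, SUM(I.NET_AMOUNT) AS TOTAL_SALES
        FROM SALES_HEADER AS H
        JOIN SALES_ITEM   AS I ON I.ORDER_ID = H.ORDER_ID
       GROUP BY I.PRODUCT_ID;

    -- USER_B only needs SELECT on the view, not on the base tables.
    GRANT SELECT ON V_SALES TO USER_B;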

The HANA Repository

The issue that remains is twofold. The user DEV_A now needs SELECT access to the two underlying tables (SALES_HEADER and SALES_ITEM). Even in production, the developer user that created the view will need this access due to definer rights. But this leads to an even bigger problem: if the developer user gets locked or deleted, the view stops working for everyone. It's really not ideal to have development objects dependent upon the lifecycle of the development user who created them.

This is where the HANA Repository comes to the rescue. The HANA Repository uses an abstracted approach to the creation of database artifacts called activation. It basically plays on the idea of definer rights again, but now to create the database artifacts in the database itself. When a development user activates a Calculation View, it doesn't perform CREATE VIEW SQL commands as that database user. Instead, the activate operation calls a stored procedure to perform the creation, altering, or deletion of database objects. These stored procedures are created with definer rights and were in turn all created by a single system DB user called _SYS_REPO.

This way _SYS_REPO becomes the database object owner of all database objects created by the HANA Repository. So the ownership problem with the development user is solved. But this solves another problem as well: the development user no longer needs access to the underlying database tables (SALES_HEADER and SALES_ITEM) either. As long as _SYS_REPO has access to all or most reporting tables, developers can freely create views against most everything. And hence the distinction between runtime and design time artifacts in HANA:


Content Management in SAP HANA circa 1.0 SPS 05

But ultimately this leads to another issue. You now have a single, nearly all-powerful system user in _SYS_REPO. How then to restrict access between different organizations or development teams? Maybe you don't want Department_A to access and reuse the views from Department_B? The HANA Repository has the concept of Packages and a whole set of security built around Packages to control just such situations. But ultimately this approach doesn't meet all the security restriction, ease of use, or usage traceability requirements of all HANA users. These requirements, along with others like version management/branching and application server layer access, ultimately led HANA to its next major evolution: the change from the SAP HANA Repository to the SAP HANA Deployment Infrastructure, or HDI for short.

HDI – HANA Deployment Infrastructure

The HANA Deployment Infrastructure, or HDI, takes the general concepts of the HANA Repository but breaks them up into smaller pieces that can more easily be controlled and used standalone. Instead of one single repository and one all-powerful system user, HDI creates what is almost a new repository (with both design time and runtime management) per project. This is referred to in HDI as an HDI Container. HDI itself was first introduced in HANA 1.0 SPS 11.

Each container gets its own unique technical user to control database artifact creation, similar to _SYS_REPO. But HDI goes even a step further and actually has two main technical users per container. One acts as the Object Owner and performs all create/delete/alter SQL commands on the database artifacts themselves. The other is the Container Runtime User, which is used for all external data access (SELECT, INSERT, etc.) to the database artifacts. This provides an even better layer of separation than what _SYS_REPO could offer.


Diagram of an HDI Container

But this brings us to the real root of the original question: how do you access tables from a different schema or container?

The Answer

The two HDI container technical users intentionally have no access or visibility to most other objects in the database outside of their own container. Only a few required system views/procedures are automatically granted to these technical users. So while they are all-powerful within the scope of their own container's content, they are nearly powerless outside that container.

For the first problem, visibility, we can use Synonyms. Synonyms are like a pointer or alias to another object in the database. But the beauty is that synonyms live within a specific schema, or in the case of HDI, a container (hint: within the DB the container is really just a schema). This way we can have a table in a foreign schema or container and use a synonym to give the technical users in our container visibility to it. A SELECT against the foreign schema from within the container will fail, but a SELECT against the synonym will work (once the synonym is created, of course).
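
As a plain SQL illustration of the idea (in HDI you use a design-time artifact instead, as shown in the step-by-step section below; the schema and table names are placeholders):

    -- Fails from inside the container: the technical user has neither
    -- visibility of nor privileges on the foreign schema.
    SELECT COUNT(*) FROM FOREIGN_SCHEMA.SALES_HEADER;

    -- A synonym in our own container points at the foreign object...
    CREATE SYNONYM SALES_HEADER FOR FOREIGN_SCHEMA.SALES_HEADER;

    -- ...so this works, once the necessary grant is also in place.
    SELECT COUNT(*) FROM SALES_HEADER;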

But that brings us to the other part: how to create the synonym. If the technical users of Container_A can't see or access the objects of your foreign schema or container, how can they create a synonym to them? Simple answer: they can't.

They need to be granted that access specifically by the owner of the foreign schema or container. This way, control of who accesses the tables or other database objects still resides with the owner. This is unlike the _SYS_REPO situation, where any developer could essentially use most anything in the system. This special grant is done via the hdbgrants artifact in the HDI container where you want to create the synonym. It uses a user-provided service or a direct binding to the foreign container to perform the actual granting. This way a user from the foreign environment is the one granting access to the technical users in your container. But the definition of the grant (think of it as the request for the grant) is a design time artifact within your development project. This way the request for the grant is just another development object that is transported along with your content, and the grant itself is performed upon deployment of the HDI container.
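
For orientation, here is a minimal sketch of what such an hdbgrants file can look like. The service name external-schema-ups and the object name are placeholders, and the exact shape (schema-qualified names, roles, grant options, etc.) varies by scenario; treat the hdi-deploy documentation mentioned below as the authoritative reference.

    {
      "external-schema-ups": {
        "object_owner": {
          "object_privileges": [
            { "name": "SALES_HEADER", "privileges": ["SELECT"] }
          ]
        },
        "application_user": {
          "object_privileges": [
            { "name": "SALES_HEADER", "privileges": ["SELECT"] }
          ]
        }
      }
    }

The foreign schema itself comes from the user-provided service definition, and the two blocks let you give the Object Owner and the Application User different privileges.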

A little hint: the README.md document of the @sap/hdi-deploy module on npm actually contains some of the best documentation you can find on the whole cross-container/cross-schema subject. I highly suggest a further reading of the "Permissions to Container-External Objects" section of this document to learn even more about this topic. It goes deeply into sub-topics like procedure grantors, revoking, etc.


hdi-deploy documentation

The Answer – Step by Step

Now that we have discussed all the theory and background around this question, let’s walk through the how-to in a more step-by-step approach.

  1. If connecting to a foreign schema (non-HDI Container), you are going to need a User-Provided Service. This is a way to configure a connection to the database, including which user and password you will use to connect. This should be a user in the foreign schema which has grantor rights to the objects you want to add to your container. Although it might be tempting to just utilize a user with full rights to the foreign schema, it's strongly recommended to configure a separate user for each usage with the most restrictive rights possible: only what is absolutely needed to fulfil the use case of this one HDI container. (See the first sketch after this list.)
  2. If you are developing using the SAP Business Application Studio, the HANA Projects view has a great wizard behind the option "Add Database Connection" that's going to perform the hard part of the next step for you. Don't worry though if you are using other development tools; approaches for on-premise development are shown in the links in the final section.
  3. This wizard is going to ask if you want to bind to an existing User-Provided Service, create a new User-Provided Service (both of those options are for foreign schema access), or bind to an existing HDI container. You can then select the target depending upon the first choice. This is going to bind the target container or schema to your project (updating your mta.yaml), making it possible to connect at deploy time later and use this connection to perform the actual grant. (The first sketch after this list shows the shape of these mta.yaml entries.)
  4. Now that the setup steps are done, we can go about creating the grants themselves. That's its own development artifact, the hdbgrants file shown earlier, which contains which roles or direct access types you want to assign to your container users. You can assign different authorizations to your Container Object Owner (who creates and alters the DB objects) as opposed to the Application User (the one used for all runtime access to database objects in your container).
  5. Personally, during development I like to deploy to the database at this point. This will test all the security configuration and grant the necessary rights before we start trying to add synonyms to the container definition.
  6. The final step is to create the synonym. This is another HDI design time artifact, called hdbsynonym. One trick: you don't have to make the synonym object name the same as the target's. This can be useful when working in the Cloud Application Programming Model, where some of the HANA DB naming standards aren't compatible with CAP. This is your opportunity to rename things to be compatible. (See the second sketch after this list.)
  7. You can now use the synonym as you would any other DB object, for instance in your Calculation Views.
  8. There is one additional step if you want to bring the synonym into your Cloud Application Programming Model as an entity. Synonyms aren't directly or automatically brought into your CAP metamodel. You must create a proxy entity in the CAP CDS definition for the synonym with the special annotation @cds.persistence.exists. That annotation is very important: it's what tells CAP not to try to create this entity in the database but to assume it is already there. (See the third sketch after this list.) Another hint: the hana-cli tool and its inspectTable or inspectView commands can help save time and avoid mistakes with the generation of the proxy entity, since it must contain the column names and data types of all fields in the target of the synonym.
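
First sketch (steps 1 and 3): creating the user-provided service and the resulting mta.yaml wiring. This is only a plausible shape, not a definitive one: the service name external-schema-ups, the grantor user, and the schema are placeholders, and the exact keys expected in the credentials JSON can vary by landscape (on-premise scenarios, for example, also need host, port, and driver entries).

    cf create-user-provided-service external-schema-ups -p '{
      "user": "GRANTOR_USER",
      "password": "********",
      "schema": "FOREIGN_SCHEMA",
      "tags": ["hana"]
    }'

And the corresponding fragments of the mta.yaml after binding (the db module requires both its own HDI container and the user-provided service):

    modules:
      - name: db
        type: hdb
        path: db
        requires:
          - name: hdi_db
          - name: external-schema-ups
    resources:
      - name: hdi_db
        type: com.sap.xs.hdi-container
      - name: external-schema-ups
        type: org.cloudfoundry.existing-service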
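
Second sketch (step 6): a minimal hdbsynonym artifact, again with placeholder names. The outer key is the synonym name, i.e. the name you will use inside your container, so this is also where you can rename a target to something CAP-friendly:

    {
      "SALES_HEADER": {
        "target": {
          "schema": "FOREIGN_SCHEMA",
          "object": "SALES_HEADER"
        }
      }
    }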
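
Third sketch (step 8): a CDS proxy entity for that synonym. The columns here are invented for illustration; in a real project they must match the names and types of the synonym's target exactly, which is where the hana-cli inspect commands help:

    // @cds.persistence.exists tells CAP the database object already
    // exists, so cds deploy will not attempt to create it.
    @cds.persistence.exists
    entity SALES_HEADER {
      key ORDER_ID   : String(10);
          PARTNER_ID : String(10);
          NET_AMOUNT : Decimal(15, 2);
    }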

There are also existing tutorials, sample code, and videos that show this process. See the next section for additional links if you want further help.

Additional Links and Resources on This Topic:

Help For HDI Administration:
SAP HANA Deployment Infrastructure (DI) Administration – SAP Help Portal

Help for HDI Development:
SAP HANA Cloud, SAP HANA Database Deployment Infrastructure Reference – SAP Help Portal

The original blog post first announcing HDI back in HANA 1.0 SPS 11 (interesting, but the development tooling has certainly changed since then):
SAP HANA SPS 11: New Developer Features; HDI | SAP Blogs

A tutorial that shows step-by-step how to perform this cross-schema operation for an SAP HANA Cloud project in the Business Application Studio:
https://developers.sap.com/tutorials/hana-cloud-access-cross-container-schema.html

Another tutorial; although it's focused on CAP and using a Calculation View in CAP, the process of creating the proxy entity shown there is exactly the same for Synonyms as well: https://developers.sap.com/tutorials/hana-cloud-cap-calc-view.html

A very good collection of resources from Philip Mugglestone on the topic of HDI and cross container/schema access:
Going beyond! Using synonyms to access data external to your HDI container | SAP Blogs

A GitHub Sample that contains a project with both Cross-Schema and Cross-Container access along with SAP Cloud Application Programming Model:
https://github.com/SAP-samples/hana-opensap-cloud-2020

Video showing Cross Schema Access in On Premise HANA Express with SAP Web IDE for SAP HANA: SAP HANA Basics For Developers: Part 5.6 Non-Container Schema Access – YouTube

Video showing Cross Container Access in On Premise HANA Express with SAP Web IDE for SAP HANA:
SAP HANA Basics For Developers: Part 5.7 Cross Container Local – YouTube

HANA Academy Video for Cross Schema Access on HANA Cloud:
HANA Cloud: Access Schema from HDI Container – YouTube

HANA Academy Video for Cross HDI Container Access on HANA Cloud:
HANACloud: Intra-HDI Container Access – YouTube

Video showing Cross Schema Access in HANA Cloud with SAP Business Application Studio:
Building hana-opensap-cloud-2020 Part 5: Cross Schema Access – YouTube

Video showing Cross Container Access in HANA Cloud with SAP Business Application Studio:
Building hana-opensap-cloud-2020 Part 6: Cross Container – YouTube


      22 Comments
      Denys van Kempen

      Congratulations, Tom

      An excellent New Year’s resolution plan.

      Keep it up and keep them coming!

      Cheers

      Jason Scott

      Great post Thomas Jung on what is an easily misunderstood topic. The background info (the why) is great!

      Peter Baumann

      Thank you Thomas Jung for your insights and a deeper understanding of how to make better use of HANA!

      Martin Stenzig

      Thomas, great blog.

      Tangential question. Is there a way to grant e.g. the schema.admin role to a global role as part of the deployment process?

      Background: to allow people to access HDI objects (i.e. calculation views) from SAC I created a global role that represents that SAC access. As DBADMIN I can obviously perform the grant, but I have not found a way to do that grant as part of the deployment.

      Thomas Jung
      Blog Post Author

      I don't think so, and that is on purpose. By design you are supposed to only grant container access to container-specific roles. Those roles can then be granted to users/global roles (for scenarios like SAC access) but intentionally not at deploy time. It isn't intended that container access becomes global in any way. And although possibly convenient, for security reasons I don't think you really want actionable roles granted actively at deploy time. I know sometimes people will do things with the global_access_role of the container to achieve something similar; but that just feels dangerous to me and goes against the design of the architecture.

      Niels Jelsma

      You can achieve this by adding a post-deploy script to your db module.

      Add a "poststart" line to the "scripts" section in your package.json. You can point to a custom script that contains any logic you want.
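
      For example, a minimal sketch of what Niels describes (the grant-roles.js script name is hypothetical; npm automatically runs a "poststart" script after "start", which for an HDI db module is the hdi-deploy call):

          {
            "scripts": {
              "start": "node node_modules/@sap/hdi-deploy/deploy.js",
              "poststart": "node grant-roles.js"
            }
          }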

      As Thomas mentions, granting roles should arguably be done separately instead of at deploy time.

      Martin Stenzig

      Niels, thanks again for your insight. Just FYI, I decided to write a little support package over the weekend. https://www.npmjs.com/package/hdi-deploy-poststart

       

      Deepak Kumar Paramashetty

      Thomas Jung Great blog and thank you for all the explanation. We need more such blogs.

      Simon Gaudek

      Hi Thomas Jung,

      great blog, thanks for that.

      But what do you see as the benefit of sharing schemas across multiple CAP applications? Instead of sharing a schema, wouldn't it be better to implement a microservice for it, through which other apps access the data?

      Sure, sharing a schema makes accessing the data very easy. But what if the schema changes? Then all apps that use the schema have to be adapted. This can be prevented by using your own microservices. As long as the API of the service remains the same, no app needs to be adapted.
      Or if access constraints are to be introduced, they must be implemented in all apps. With a central API, this handles the access and no access checks need to be implemented in the various apps.

      Regards
      Simon

      Thomas Jung
      Blog Post Author

      Of course, a microservice approach for encapsulation is wise. But not all CAP projects can have the entire data model self-contained in the same project.

      First, really, an HDI container is a microservice implemented at the DB level. Maybe you use Calculation Views as your "interface" layer. So you have a reusable HDI container of Calc Views that you then want to reuse in your CAP project.

      Or you have large data volumes that should be processed via code pushdown within the database and not accessed purely via an HTTP based interface.

      Or you have an existing data model. Say a set of tables in your SAP applications themselves. In order to service-enable those, you might first access them via a cross container or classic schema approach.

      Kanika Malhotra

      Hi Thomas Jung

      Is there a way to create an entity dynamically from a calculation view via CAP? Are there methods available to connect to HDB, get the calculation view object, and do the inspectView, which can return the entity in CDS format, and then deploy it using cds.deploy? Basically, I want to create an OData service dynamically once I have the calculation view name.

      Thanks

      Kanika

      Thomas Jung
      Blog Post Author

      There is no API that would do that. You are welcome to look at the logic in the hana-cli that does the inspectView. You can see we just had to write a bunch of select statements against the metadata tables and then format the results according to what CDS expects. The hana-cli tool does this dynamically - that's what the hana-cli cds command is doing. It dynamically looks up the target view/table, generates the CDS, and then runs a CAP server with that CDS content.

      hana-developer-cli-tool-example/cds.js at main · SAP-samples/hana-developer-cli-tool-example (github.com)

      Most of the logic that you are describing is in a library file, dbInspect.js, so that I can reuse it in the cds and inspect* commands. hana-developer-cli-tool-example/dbInspect.js at main · SAP-samples/hana-developer-cli-tool-example (github.com)

      There are separate calls for the different pieces of metadata and then a call to formatCDS to convert that metadata.

            object = await dbInspect.getView(db, schema, prompts.table)
            fields = await dbInspect.getViewFields(db, object[0].VIEW_OID)
            let viewSource = await dbInspect.formatCDS(db, object, fields, null, "view", "preview")
      Kanika Malhotra

      Thanks Thomas Jung. I figured as much, but was wondering if there is some wrapper or library on npm which would do the magic for me.

      Can you guide me on how to connect to the DB without making any kind of synonym in the db folder, and just connect to the db and get the view details?

      Regards

      Kanika

      Thomas Jung
      Blog Post Author

      You don't need a synonym to read these metadata types as they aren't HDI managed. You just create a regular SQL connection with a named user (which an admin can create manually) or even your own development user.

      Thomas Jung
      Blog Post Author

      If there was enough demand, I could consider pulling dbInspect out into a separate npm module. Over time I pulled some other functionality that started off in hana-cli out into the separate HANA Promisified module: SAP-samples/hana-hdbext-promisfied-example: Example of how to use @sap/hdbext (standard node.js SAP HANA interface) via a promisfied wrapper. (github.com). If that portion has reusable value, it is better off as a separate npm module. But I've personally never needed that logic outside of hana-cli.

      Kanika Malhotra

      Well, that would be lovely. It might just make creating OData services from HANA calculation views easy and a reusable component. Just a thought 🙂

      Kanika Malhotra

      Thanks Sir.

      Will try my hands on it; looks complicated though, at least for a Node.js and CAP newbie.

      Regards

      Kanika

      Kanika Malhotra

      Hi Thomas Jung

      In the formatCDS method, could you please explain the significance of the line below, as I am unable to find __xRef on the global object and am getting an error.
      global.__xRef.push(xref);

      ERROR: Cannot read properties of undefined (reading 'push')

      Thanks

      Kanika

      Thomas Jung
      Blog Post Author

      This is specific to the processing of the hana-cli and the cds command. In this command I'm dynamically generating a CAP service for the target entity but also, in the service implementation, performing the query against the source table (since it isn't a really deployed CAP application). The database name of the target isn't always CAP compatible. Therefore, I must keep a cross-reference table of real DB object names to the temporary entity names that can be used in the CAP read handler later.

      Kanika Malhotra

      Thanks Thomas Jung, it generated a matching entity definition for the Calculation View perfectly.

      I now want to create the entity and deploy it to the HANA DB via code (with the aim of adding the entity to my OData services for calculation views).

      Could you help me understand how I can do this deployment in my CAP project? I am trying as below, but with no luck.

      const db = await cds.connect.to("db");
      let myNewCsn = cds.compile(cdlString) // console.log below for myNewCsn
      await cds.deploy(myNewCsn).to(db) // Probably it is not deploying anything to the db.
      console.log('NewCDL: ', JSON.stringify(myNewCsn))
      let entity = myNewCsn.definitions[entityName]
      cds.model.definitions[entityName] = entity;
      let t = await db.read(entity) // Gives error : Could not find table/view V_TEST in schema 
      
      //myNewCsn console.log
      NewCDL:
      {
        "definitions": {
          "V_TEST": {
            "kind": "entity",
            "@cds.persistence.exists": true,
            "@cds.persistence.calcview": true,
            "elements": {
              "CODE": { "@title": "CODE: CODE", "key": true, "type": "cds.String", "length": 3 },
              "TEXT": { "@title": "TEXT: TEXT", "key": true, "type": "cds.String", "length": 10 }
            }
          }
        },
        "meta": { "creator": "CDS Compiler v2.15.4", "flavor": "inferred" },
        "$version": "2.0"
      }

       

      Thanks

      Kanika

      Thomas Jung
      Blog Post Author

      I don't think that cds.deploy is a valid API -- at least not a released one. I don't know of any API way to deploy to HANA like this. I'd suggest just calling an npm script to trigger the cds command line and its deploy command.

      Carlos Pinto

      Excellent blog post, Thomas Jung. Thanks to this blog post I have finally managed to connect SAP Analytics Cloud with Redshift via BTP. Thanks a lot.