Nidhi Sawhney

Develop on SAP Datasphere using SAP HANA Deployment Infrastructure (HDI)

A previous version of this blog post was called “Develop on SAP Datawarehouse Cloud using SAP HANA Deployment Infrastructure(HDI)”. With the launch of SAP Datasphere on March 8, 2023 the following steps also apply to SAP Datasphere tenants, as SAP Datasphere is an evolution and the next generation of SAP Data Warehouse Cloud. All references to SAP Data Warehouse Cloud or DWC below apply equally to SAP Datasphere.

In this blog post I will do a quick walkthrough of the steps needed to enable an SAP Data Warehouse Cloud (DWC) tenant to access and use views created using SAP HANA Deployment Infrastructure (HDI).

SAP HDI provides services to manage and deploy database artefacts, as described in the reference manual here. Combined with the power of SAP WebIDE, this provides version-controlled management of design-time artefacts via integration with Git.

With SAP DWC you can access and consume the artefacts created as HDI objects and use them for further data modelling via the Data Builder and for dashboarding via Story Building or through the embedded SAC in DWC.

Note: The access works only for SAP DWC tenants and SCP spaces in the same data center.

The high-level architecture that supports this scenario is very well described in this blog post from Axel Meier.

In the example below we will use SAP WebIDE tooling to create an HDI container the contents of which can then be accessed by the SAP DWC tenant.

Step 1: Open a ticket to connect your SAP Cloud Platform (SCP) space to your SAP DWC tenant

First you will need to enable access from the SAP DWC Space Builder area.

In the Schema Access area, go to HDI Containers and click Enable Access. A pop-up window appears informing you that you need to open a support ticket so that SAP can map the HDI containers to your SAP Data Warehouse Cloud space. The details required for the ticket are described here.

Step 2: Create an HDI container and build and/or deploy it in the HANA Cloud (HC) instance of your SAP DWC tenant

Once the ticket confirms that your SCP space is mapped to your DWC tenant, you can create a new HDI container or deploy an existing one to the HANA Cloud instance of your DWC tenant.

Using SAP WebIDE, we have here a sample project which provides the minimum needed to create an HDI container that can be accessed via DWC.

Sample HDI MTA Project

 

DWC_access is an MTA (Multi-Target Application) project with a single HANA DB module called dwcaccess_db.
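The db module contains the design-time database artefacts that get deployed into the HDI container. As a purely hypothetical illustration (these files are not part of the sample project and the names are made up), a simple table and view defined as follows would become objects that DWC can later consume:

```sql
-- src/SALES_ORDERS.hdbtable (hypothetical file): a plain column table
COLUMN TABLE "SALES_ORDERS" (
  "ORDER_ID"  INTEGER,
  "REGION"    NVARCHAR(40),
  "AMOUNT"    DECIMAL(15, 2)
)

-- src/SALES_BY_REGION.hdbview (hypothetical file): a view aggregating the table above
VIEW "SALES_BY_REGION" AS
  SELECT "REGION", SUM("AMOUNT") AS "TOTAL_AMOUNT"
  FROM "SALES_ORDERS"
  GROUP BY "REGION"
```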

In the mta.yaml file you need to edit the config for database_id and provide the database ID of the HANA Cloud instance of your DWC tenant. This can be obtained from the DWC tenant's Space Builder: the hostname contains the ID needed in the mta.yaml file, as sketched below.

Get HANA Cloud Database Id
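A minimal sketch of how the database_id fits into the mta.yaml is shown below; the resource name and the placeholder value are illustrative, and only the config section with database_id needs to be adapted with the ID obtained from the hostname above:

```yaml
_schema-version: "3.1"
ID: DWC_access
version: 1.0.0

modules:
  - name: dwcaccess_db
    type: hdb
    path: db
    requires:
      - name: dwcaccess-hdi-container      # illustrative resource name

resources:
  - name: dwcaccess-hdi-container          # illustrative resource name
    type: com.sap.xs.hdi-container
    parameters:
      service: hana
      service-plan: hdi-shared
      config:
        database_id: <id-from-the-DWC-hostname>   # GUID of the DWC HANA Cloud instance
```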

 

In addition, the project must provide the following DWC access roles to make the HDI container accessible from DWC:

DWC_CONSUMPTION_ROLE.hdbrole
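This role grants read access on the container schema to the consuming user. A minimal sketch of such a role definition is shown below; the privilege list is an assumption for a plain read-only consumption scenario and not the exact file from the sample project, so check the SAP documentation for the definitive content:

```json
{
  "role": {
    "name": "DWC_CONSUMPTION_ROLE",
    "schema_privileges": [
      {
        "privileges": ["SELECT", "SELECT METADATA"]
      }
    ]
  }
}
```

Depending on what the container exposes, additional privileges (for example EXECUTE for procedures) may be needed.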

 

DWC_CONSUMPTION_ROLE_WITH_GRANT.hdbrole
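The grant variant uses privileges_with_grant_option so that DWC can pass the access on to its own users; again, this is a hedged sketch under the same assumptions rather than the exact file from the sample project:

```json
{
  "role": {
    "name": "DWC_CONSUMPTION_ROLE_WITH_GRANT",
    "schema_privileges": [
      {
        "privileges_with_grant_option": ["SELECT", "SELECT METADATA"]
      }
    ]
  }
}
```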

 

Then build your HDI project. Once your project is successfully built, it is ready to be accessed from DWC.

Step 3: Access the HDI content from SAP DWC

In the SAP DWC Space Builder, under HDI Containers, you can now add the HDI container using the + option.

 

Once you have added the HDI container, it is available as a source in the Data Builder in SAP DWC and can be used to build views etc. for consumption.

HDI Source
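As an illustration, once the container is added as a source, a SQL view created in the Data Builder could simply select from an exposed HDI view; the view name below reuses the hypothetical SALES_BY_REGION sketch from earlier and is not part of the sample project:

```sql
-- Hypothetical Data Builder SQL view on top of the HDI view
SELECT "REGION", "TOTAL_AMOUNT"
FROM "SALES_BY_REGION"
```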

 

That's it for now, happy DWC developing with the power of HDI!

Comments
Enio Terra

      Excellent Nidhi! SAP DWC is a great tool and is getting better! Thanks for sharing.

      BR

      Enio Terra

Sarath Kannan

Please tell me how we can connect to an on-premise system instead of HANA Cloud.

Axel Meier

      Hi Sarath,

Please feel free to check the blog post below, which describes the topic.

      https://blogs.sap.com/2020/06/19/sap-data-warehouse-cloud-hybrid-access-to-sap-hana-for-sql-data-warehousing/

       

      Thanks,

      -Axel

Sarath Kannan

      Thanks Axel

Nidhi Sawhney (Blog Post Author)

      @sarathk85

      Hi Sarath,

I am not sure what you mean. SAP DWC is Data Warehouse Cloud, so it's a cloud application, and in the above scenario we are deploying to the HANA Cloud within DWC. There isn't an on-prem equivalent of it. You can of course deploy HDI containers on-prem via HANA XSA.

      BR,

      Nidhi

Sarath Kannan

      Thanks Nidhi

Chang Run Lin

      Hi Nidhi,

      Will the same work for BAS (Business Application Studio)?

      Thanks,

      Changrun

Nidhi Sawhney (Blog Post Author)

      Yes, the same applies for developing in BAS.

Ravi Kashyap

      Hi Nidhi & SAP Experts,

Is it possible to consume existing artefacts like calculation views (CVs), tables or stored procedures built on HANA XSA to avoid data replication? Can we connect an existing HDI container built in HANA XSA to the Data Builder as a source?

       

      Thanks
      Ravi

Nidhi Sawhney (Blog Post Author)

      Hi Ravi Kashyap,

The Datasphere tenant connectivity requires the artefacts to be in a Cloud Foundry space on BTP (more details here: link from help.sap.com), so it's not possible to bring in XSA content, which is on-prem, without copying it over to a BTP space.

      Hope this helps,

      Nidhi

       

Ravi Kashyap

      Hello Nidhi,

      Thanks for your response.

Given that we've developed Calculation Views (CVs) using XSA, a cloud-native approach, the question arises whether it is necessary to transfer the data to BTP and reconstruct the CVs in the Data Builder. This is in contrast to the situation with BW, where we can utilize ODP to consume ADSOs in BTP.

      Thanks

      Ravi.

Ian Henry

      Hi Ravi,

      As Nidhi states, the HDI Container integration happens by linking the BTP subaccount to your Datasphere tenant. This allows data federation to the HDI container from your Datasphere space.

      For on-premise integration we can also use data federation with the Cloud Connector or the Data Provisioning Agent. Using either of these components, we are able to connect to on-prem HANA and access your XSA based Calc Views.

      Does this help?

      Thanks, Ian.