By Vishwa Gopalkrishna

SAP Datasphere, SAP HANA Cloud HDI CI/CD Automation Approach

This blog post is part of a series on SAP Datasphere and SAP HANA Cloud CI/CD. I recommend checking the follow-on blog “SAP Datasphere SAP HANA Cloud HDI Automation CI/CD Pipelines Details” for implementation details, including a code walkthrough.


In this blog post, I’ll go over a lifecycle management approach for an end-to-end scenario involving SAP Datasphere, an SAP HANA Cloud HDI container, and SAP Analytics Cloud. For illustration, I’ll walk through a scenario that leverages database artifacts from an existing SAP HANA Cloud instance, models and enriches them in SAP Datasphere, and visualizes the results in SAP Analytics Cloud. Along the way, I’ll analyze some of the typical questions and challenges faced by IT departments, listed below.

  • How can I leverage my existing footprint and investments in the SAP HANA Cloud and SAP SQL Data Warehousing?
  • Can we build CI/CD pipeline for use in SAP Datasphere and SAP HANA Cloud?
  • Can GIT be used for the SAP Datasphere?
  • Can multiple developers work on artifacts in SAP Datasphere?
  • Can I use my local development environment and command line interface (CLI) to develop on SAP Datasphere and SAP HANA Cloud? Similarly, can I leverage SAP Business Application Studio to develop on SAP Datasphere and SAP HANA Cloud?

This blog post builds on some earlier blogs from the SAP HANA Database & Analytics, Cross Product Management, and Product Management teams. It is not intended as a showcase of SAP’s best practices, an all-in-one guide, or a hands-on follow-through tutorial. The goal is to analyze challenges, point out gaps, and discuss possible solutions or mitigations, if any.

The following paragraphs cover the use case scenario and scenario illustration, followed by CI/CD automation approach and conclusion. Let’s jump in.

Use Case Scenario

  • The marketing team in your organization wants to launch a promotional campaign to help with strategic organizational goals. How are products doing in the Americas region? What does the sales number look like per product? Which product should we develop? These are the questions they have for the data team, and they want reports and visualization for further analysis.
  • The IT department wants to address marketing teams’ requirements quickly and take care of software lifecycle elements and CI/CD aspects, laying a platform for innovation agility in meeting such needs from other lines of business (LOBs).

Scenario Illustration

Figure (a) shows the three-step solution approach for the illustration scenario, and Figures (b) and (c) elaborate on those steps.


Figure (a) Illustration scenario solution approach

The first step is to review and reuse an existing Salesorder model (tables or a ready calculation view) from SAP HANA Cloud HDI. Using SAP Business Application Studio or VS Code, deploy it (push the design-time model artifacts to runtime) on the SAP HANA Cloud DB tenant. This HDI container will then be linked to the SAP Datasphere space.
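As an illustrative sketch (not the blog’s actual pipeline code), the build-and-deploy step above can be scripted. This assumes the Cloud MTA Build Tool (`mbt`) and the Cloud Foundry CLI with the MultiApps plugin are installed and logged in; the archive name and project layout are hypothetical.

```python
import subprocess

def build_and_deploy(mtar_name: str, project_dir: str = ".", dry_run: bool = False):
    """Build the MTA project and deploy the resulting archive to the
    Cloud Foundry space hosting the SAP HANA Cloud HDI container.

    Returns the commands that were (or, with dry_run=True, would be) executed.
    """
    cmds = [
        # Build the design-time artifacts into an MTA archive (lands in mta_archives/).
        ["mbt", "build", "-p", "cf", "--mtar", mtar_name],
        # Push the archive; the HDI deployer module deploys the design-time
        # objects into the container at staging time.
        ["cf", "deploy", f"mta_archives/{mtar_name}", "-f"],
    ]
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, cwd=project_dir, check=True)
    return cmds
```

In a Jenkins/Piper pipeline these two commands would typically run as separate build and deploy stages, so a failed build never reaches the tenant.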


Figure (b) Solution Approach Model

In the second step, enhance the SAP Datasphere model with product-relevant information sourced from a CRM system and expose it as an analytical dataset with the relevant measures per the requirements. Finally, in the third step, use the SAP Datasphere model to publish an SAP Analytics Cloud story answering the marketing department’s questions with two charts: a table showing product-wise sales and discount analysis for the North America sales organization, and a linked heat map visualizing the gross sales of each product.

Figure (c)  SAP Analytics Cloud Story using the SAP Datasphere Model

Automation CI/CD Pipelines

Let’s start with the transport landscape and then jump into the pipelines. Figure (d) depicts the transport landscape. The main point is that both the DEV and QA HDI containers are under the same subaccount (linked Cloud Foundry org) and space. This can be extended to a three-system landscape with DEV, QA, and PRD, with production on a separate SAP HANA Cloud tenant or a similar setup. Currently, SAP Analytics Cloud artifacts can be promoted only through the Analytics Content Network (ACN), and no public APIs are available for automation; hence, I have not included SAP Analytics Cloud in the transport landscape.

Figure (d)  Transport Landscape setup

Figures (e) and (f) show the tooling stack and the flow used to realize the automation. There are two pipelines, linked to two separate GIT repos, for the HDI container and the SAP Datasphere artifacts. For automation on the SAP Datasphere side, we leverage @sap/dwc-cli. Project “Piper” and its Jenkins build server are used for automation execution, coordination, and sequencing. SAP Continuous Integration & Delivery is not included in the diagram, as Piper is used for that purpose and offers more flexibility with SAP Datasphere and coordination. The SAP Datasphere pipeline’s build and deploy steps are executed from within a Docker container, ensuring the @sap/dwc-cli dependencies are met; an additional Docker repository is maintained for the @sap/dwc-cli dependencies.
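To give a flavor of the @sap/dwc-cli step, the sketch below builds the command lines a pipeline stage would invoke (inside the Docker container) to export and re-deploy a space definition. The `spaces read`/`spaces create` subcommands and flag names follow the @sap/dwc-cli documentation at the time of writing; verify them against your installed version. The tenant URL and space name are hypothetical placeholders.

```python
import subprocess

# Hypothetical tenant URL; in a real pipeline this comes from configuration.
DWC_HOST = "https://<your-tenant>.eu10.hcs.cloud.sap"

def dwc_read_space_cmd(space: str, out_file: str):
    # `spaces read` downloads a space definition as JSON (flag names per the
    # @sap/dwc-cli docs at the time of writing).
    return ["npx", "@sap/dwc-cli", "spaces", "read",
            "--host", DWC_HOST, "--space", space, "--output", out_file]

def dwc_deploy_space_cmd(definition_file: str):
    # `spaces create` creates or updates a space from a JSON definition,
    # which is how the QA space gets the transported artifacts.
    return ["npx", "@sap/dwc-cli", "spaces", "create",
            "--host", DWC_HOST, "--filePath", definition_file]

def run(cmd):
    # Thin wrapper so pipeline stages fail fast on a non-zero exit code.
    subprocess.run(cmd, check=True)
```

Authentication (passcode or OAuth, depending on the CLI version) is deliberately left out here; the follow-on blog covers the full wrapper.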

In general, for each development, one would create a feature branch in the GIT repo; then, per the process setup, finish development, unit testing, and validation, and create a pull request to merge the branch into the main branch. While merging, a rebase and adjustments may be required if the same entities were changed in the main branch. A webhook then triggers the automation and transport through the landscape.
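The webhook routing described above can be sketched as a small dispatcher. Since the HDI and SAP Datasphere artifacts live in two separate GIT repos, the repo name of the push event determines which pipeline runs; the repo and pipeline names below are illustrative placeholders, not the actual setup.

```python
def pipeline_for_push(repo_name: str, target_branch: str):
    """Map a webhook push event to the pipeline it should trigger.

    Only pushes to the main branch (i.e. merged pull requests) kick off the
    transport; feature-branch pushes are ignored here. Repo and pipeline
    names are hypothetical.
    """
    if target_branch != "main":
        return None  # feature branches are built/validated via the PR, not here
    mapping = {
        "hdi-artifacts": "hdi-pipeline",
        "datasphere-artifacts": "datasphere-pipeline",
    }
    return mapping.get(repo_name)
```

A real Jenkins setup would express the same routing declaratively (one multibranch job per repo with a webhook trigger); the function just makes the rule explicit.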

Figure (e) DevOps Architecture and tooling stack

Figure (f) Automation flow

As shown in figure (f), the flow can start from either the HDI container pipeline or the SAP Datasphere pipeline. Suppose it involves committing HDI container artifacts via VS Code or SAP Business Application Studio. In that case, the HDI pipeline is triggered with build, deploy, validation, and uploading of the MTA archives to SAP Cloud Transport Management, which moves the archives through the landscape. If all the earlier steps are successful, the SAP Datasphere pipeline is triggered; it flows through the build, deploy, and validation of SAP Datasphere artifacts, deploying them into the QA space.
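The sequencing rule in figure (f), namely that the SAP Datasphere pipeline runs only after every HDI stage has succeeded, can be sketched as a minimal orchestrator. The stage names are illustrative; in practice Jenkins/Piper provides this behavior.

```python
def run_stage(name, action):
    """Run one pipeline stage; report failure instead of raising (illustrative)."""
    try:
        action()
        return True
    except Exception:
        return False

def run_flow(hdi_stages, dsp_stages):
    """Run the HDI stages (e.g. build, deploy, validate, upload to
    SAP Cloud Transport Management) in order, halting on the first failure.
    Only if all of them succeed do the SAP Datasphere stages run.
    Returns (executed stage names, overall success)."""
    executed = []
    for name, action in hdi_stages:
        executed.append(name)
        if not run_stage(name, action):
            return executed, False
    for name, action in dsp_stages:
        executed.append(name)
        if not run_stage(name, action):
            return executed, False
    return executed, True
```

The key property is fail-fast ordering: a broken HDI validation never triggers a Datasphere deployment into the QA space.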


This blog post introduced the use case scenario, the automation flow, the challenges, and an approach that can be used for CI/CD automation with SAP Datasphere and SAP HANA Cloud. With SAP Cloud Transport Management, project “Piper”, and @sap/dwc-cli, CI/CD automation can be realized. Check the follow-on blog “SAP Datasphere SAP HANA Cloud HDI Automation CI/CD Pipelines Details” for implementation details, including a code walkthrough.

Let me know what you think about the approach, and feel free to share this blog. All feedback and comments are welcome. If you have any questions, please do not hesitate to ask in the Q&A area as well.


      Elson Simoes:

      Hello Vishwa Gopalkrishna,

      The HANA CI/CD pipeline is quite clear, and I guess a lot of people already have it in place; however, I am quite intrigued by the pipeline you drew for SAP DWC.


      How can it work if DWC does not have an embedded connection to GIT? Also, the DWC-CLI is only able to extract some of the artifacts (tasks, for example, are not supported). Does dwc-cli also allow automated deployment?


      Kind Regards,


      Vishwa Gopalkrishna (Blog Post Author):

      Hi Elson Simoes ,

      Thanks for your comment. Yes, automated deployment is possible with a wrapper around dwc-cli. I'll ping you when my next blog with the code is published; it should not be too long, as it's almost ready. Take a look and see if it helps with your requirement. And yes, if there is a specific artifact that is missing, I can reach out to the product management team and check; they are working on enabling them for dwc-cli.

      Best Regards,


      Elson Simoes:

      Looking forward to that blog 🙂


      But I guess the integration with GIT is also not there. Do you also use a pipeline to extract from DWC and import it into GIT?


      Kind regards,


      Vishwa Gopalkrishna (Blog Post Author):

      Hi Elson Simoes,

      Check this follow-on blog; it has implementation details and a code walkthrough.

      Best Regards,