
How to connect your CAP application to SAC – Creating HANA Calculation View

CAP applications, in either Node.js or Java, are a powerful tool for rapid development in the cloud. With their simplified syntax, it is reasonably easy to create a beautiful and powerful application.

Creating an application based on HANA Cloud simplifies the integration with powerful analytics tools like SAP Analytics Cloud on Cloud Foundry.

Usually, a CAP application has the following structure:

  • /db
  • /srv
  • .hdiconfig

The db folder contains your domain models. Here you usually have a schema.cds where you define your entities and relationships.

In the srv folder, you create your OData services, apply rules such as restricted access, and implement your custom handlers.
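As a minimal illustration of such a service (the service and entity alias names here are my own, not from this project), a definition in /srv exposing the entity from the db model could look like this:

```cds
// srv/analytics-service.cds -- hypothetical file name
using sap.dw as dw from '../db/schema';

service AnalyticsService {
    // Expose the entity read-only; consumers cannot modify the data
    @readonly entity Blocks as projection on dw.KPI_B_BLOCK;
}
```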


Before we start with our calculation view, please ensure that your .hdiconfig contains the following code fragment:

"hdbcalculationview": {
   "plugin_name": "com.sap.hana.di.calculationview"
}

This mapping ensures that files with the suffix .hdbcalculationview are interpreted as HANA calculation views.
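For reference, a minimal .hdiconfig covering both the tables generated from your CDS models and the calculation view might look like this (the exact set of plugins depends on your project):

```json
{
  "file_suffixes": {
    "hdbtable": {
      "plugin_name": "com.sap.hana.di.table"
    },
    "hdbview": {
      "plugin_name": "com.sap.hana.di.view"
    },
    "hdbcalculationview": {
      "plugin_name": "com.sap.hana.di.calculationview"
    }
  }
}
```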

In the next step, navigate to your /db folder and create a new folder /src if you don’t have one already. There, create a new file for your HANA calculation view. In this example, I call mine:


This file is XML-based, so you may initially want to give it the suffix .xml so that your IDE applies syntax highlighting.


Understand your data schema

Before you proceed with your calculation view, you should understand which entity you want to expose to your SAC. In my example, I will not join anything but just build a straightforward projection of an entity. Feel free to add some of your code samples for more complex calculation views.

My schema.cds looks as follows:

namespace sap.dw;

entity KPI_B_BLOCK {
    key timestamp     : DateTime;
        blockTime     : Double;
        nextRetarget  : Double;
        difficulty    : Double;
        estimatedSent : Double;
        minersRevenue : Double;
        blockSize     : Integer;
}
I want to expose the entity “KPI_B_BLOCK”. Most important are your entities, their attributes, and your namespace.

Since we directly work on the entity level, you may also understand why we add our HANA Calculation View into the /db folder.


Begin with the HANA Calculation View

In your HANA calculation view, you now start with the following code piece:

<?xml version="1.0" encoding="UTF-8"?>
<Calculation:scenario xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:Calculation="http://www.sap.com/ndb/BiModelCalculation.ecore" id="CV_Bitcoin_Blocks" outputViewType="Projection" applyPrivilegeType="NONE" dataCategory="CUBE" schemaVersion="3.0" cacheInvalidationPeriod="NONE" enforceSqlExecution="false" propagateInstantiation="true">
  <descriptions defaultDescription="BitcoinBlocks"/>
  <dataSources>
    <DataSource id="KPI_B_BLOCK">
      <resourceUri>SAP_DW_KPI_B_BLOCK</resourceUri>
    </DataSource>
  </dataSources>

As you recognise, we are going to design a CUBE. The id is the same as the file name.

Most important are our dataSources. Here we refer to the entities we use within our calculation view. I usually give a DataSource the same name as the entity. For the resourceUri, we need the name the table has within your HDI container.

It is <namespace>_<entity>. Be aware that the dot notation of the namespace is replaced with underscores, and the name is written in ALL CAPS.
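The naming rule can be sketched as a tiny helper (the function name is my own illustration, not a CAP API):

```javascript
// Illustrates how a CDS entity name maps to the runtime table name in the
// HDI container: the namespace and entity are joined, dots become
// underscores, and the result is upper-cased.
function toHdiTableName(namespace, entity) {
  return `${namespace}.${entity}`.replace(/\./g, '_').toUpperCase();
}

console.log(toHdiTableName('sap.dw', 'KPI_B_BLOCK')); // SAP_DW_KPI_B_BLOCK
```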


Add a calculation view

Within our XML, we add the following code fragment as a child of Calculation:scenario:

  <calculationViews>
    <calculationView xsi:type="Calculation:ProjectionView" id="Projection_1">
      <viewAttributes>
        <viewAttribute id="TIMESTAMP"/>
        <viewAttribute id="BLOCKTIME"/>
        <viewAttribute id="NEXTRETARGET"/>
        <viewAttribute id="DIFFICULTY"/>
        <viewAttribute id="ESTIMATEDSENT"/>
        <viewAttribute id="MINERSREVENUE"/>
        <viewAttribute id="BLOCKSIZE"/>
      </viewAttributes>
      <calculatedViewAttributes/>
      <input node="KPI_B_BLOCK">
        <mapping xsi:type="Calculation:AttributeMapping" target="TIMESTAMP" source="TIMESTAMP"/>
        <mapping xsi:type="Calculation:AttributeMapping" target="BLOCKTIME" source="BLOCKTIME"/>
        <mapping xsi:type="Calculation:AttributeMapping" target="NEXTRETARGET" source="NEXTRETARGET"/>
        <mapping xsi:type="Calculation:AttributeMapping" target="DIFFICULTY" source="DIFFICULTY"/>
        <mapping xsi:type="Calculation:AttributeMapping" target="ESTIMATEDSENT" source="ESTIMATEDSENT"/>
        <mapping xsi:type="Calculation:AttributeMapping" target="MINERSREVENUE" source="MINERSREVENUE"/>
        <mapping xsi:type="Calculation:AttributeMapping" target="BLOCKSIZE" source="BLOCKSIZE"/>
      </input>
    </calculationView>
  </calculationViews>

With this code fragment, we are doing two things:

  • Define the attributes of our calculation view
  • Map the calculation view attributes to our entity

Since we don’t have any joins here, we have a simple 1:1 mapping of our attributes to our KPI_B_BLOCK entity. In joined use cases, you must state from which entity each attribute comes.

Also here, use ALL CAPS.

We have now successfully created our calculation view called “Projection_1”.


Logical model

In the next step, we need to formulate our logical model. So far, we have only mirrored the structure of our entity. We must now tell the calculation view which attributes can serve as measures.

I don’t want to make it too tricky here, so I state simple mappings. Here you could also enhance your calculation view with dynamically calculated measures.

You put the following code fragment, also as a child of Calculation:scenario:

<logicalModel id="Projection_1" ignoreMultipleOutputsForFilter="true">
  <attributes>
    <attribute id="timestamp" order="1" displayAttribute="false" attributeHierarchyActive="false">
      <keyMapping columnObjectName="Projection_1" columnName="TIMESTAMP"/>
    </attribute>
  </attributes>
  <baseMeasures>
    <measure id="blocktime" order="2" measureType="simple">
      <descriptions defaultDescription="BLOCKTIME"/>
      <measureMapping columnObjectName="Projection_1" columnName="BLOCKTIME"/>
    </measure>
    <measure id="nextretarget" order="3" measureType="simple">
      <descriptions defaultDescription="NEXTRETARGET"/>
      <measureMapping columnObjectName="Projection_1" columnName="NEXTRETARGET"/>
    </measure>
    <measure id="difficulty" order="4" measureType="simple">
      <descriptions defaultDescription="DIFFICULTY"/>
      <measureMapping columnObjectName="Projection_1" columnName="DIFFICULTY"/>
    </measure>
    <measure id="estimatedSent" order="5" measureType="simple">
      <descriptions defaultDescription="ESTIMATEDSENT"/>
      <measureMapping columnObjectName="Projection_1" columnName="ESTIMATEDSENT"/>
    </measure>
    <measure id="minersRevenue" order="6" measureType="simple">
      <descriptions defaultDescription="MINERSREVENUE"/>
      <measureMapping columnObjectName="Projection_1" columnName="MINERSREVENUE"/>
    </measure>
    <measure id="blockSize" order="7" measureType="simple">
      <descriptions defaultDescription="BLOCKSIZE"/>
      <measureMapping columnObjectName="Projection_1" columnName="BLOCKSIZE"/>
    </measure>
  </baseMeasures>
</logicalModel>

As you see, I’ve defined only the timestamp as an attribute and have determined all my other attributes as measures.

You see, we often reference our calculation view via "Projection_1". Here, too, we need mappings, analogous to the attribute mappings above. You could add various enhancements, like defining a measure aggregation type via aggregationType="sum" or adding calculated measures. However, for simplicity reasons, I left these out.
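For instance, a measure with an explicit aggregation type would look like this (a sketch based on the blocktime measure above):

```xml
<measure id="blocktime" order="2" aggregationType="sum" measureType="simple">
    <descriptions defaultDescription="BLOCKTIME"/>
    <measureMapping columnObjectName="Projection_1" columnName="BLOCKTIME"/>
</measure>
```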



Deployment

The fantastic thing is that we can now simply proceed with the standard deployment of our application, either via MTA or via a custom-built deployment script. The CAP CDS module will automatically deploy your calculation view along with it.

Ensure you have no errors during deployment and that you renamed your file back to the .hdbcalculationview suffix. You may also verify that your cube is in your HDI container. When everything has worked so far, you can easily use this calculation view within your SAP Analytics Cloud.
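One way to verify the cube is, for example, a quick query in the SAP HANA database explorer against your HDI container. The runtime object name below assumes it matches the view id from our file; adjust if yours differs:

```sql
-- Hypothetical smoke test: fetch a few rows from the deployed calculation view
SELECT TOP 10 * FROM "CV_Bitcoin_Blocks";
```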


I hope this blog post helped you enhance your beautiful cloud-native application with a smooth integration into your SAC 🙂

  • Great blog, just a few cautionary remarks for the community.

    1. Your example here will work if
      1. You are on HANA Cloud. If you are still on the HANA Service, you will need to create a HANA Analytics Adapter deployment module to make this work
      2. Your SAC tenant is on Cloud Foundry. If you are on a NEO stack you will not see HANA Cloud as an option in the drop down. You can still connect to the HANA Service via the HANA Analytics Adapter.