Use CAP to expose HANA Cloud tables as OData services
When you want to expose data residing in a HANA database to the outside world, the recommended best practice is to use OData.
Recently, SAP started promoting a new Cloud Application Programming model (CAP). The SAP Cloud Application Programming model is a framework of languages, libraries, and tools for building enterprise-grade services and applications. It guides developers along a ‘golden path’ of proven best practices and a great wealth of out-of-the-box solutions to recurring tasks.
CAP-based projects benefit from a primary focus on the problem domain, instead of delving into overly technical disciplines.
In this blog, I will use the SAP Cloud Application Programming Model to create tables on SAP HANA Cloud and expose these tables as OData services. With this method, data is exposed using OData v4.0, as opposed to the traditional xsodata method, where data is exposed using OData v2.0.
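One visible consequence of the protocol difference is in the query options a client sends. For example, OData v2 requests a row count with $inlinecount=allpages, while v4 replaced it with $count=true. A small illustrative sketch (the base URL and entity are hypothetical placeholders):

```javascript
// Illustrative only: the same "count the matching rows" request differs
// between OData v2 (xsodata) and OData v4 (CAP).
function countQuery(base, entity, version) {
  // OData v2 uses $inlinecount=allpages; v4 replaced it with $count=true.
  return version === 'v2'
    ? `${base}/${entity}?$inlinecount=allpages`
    : `${base}/${entity}?$count=true`;
}

console.log(countQuery('/incident', 'SafetyIncidents', 'v2'));
// → /incident/SafetyIncidents?$inlinecount=allpages
console.log(countQuery('/incident', 'SafetyIncidents', 'v4'));
// → /incident/SafetyIncidents?$count=true
```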
Prerequisites
- Free Trial account of SAP BTP with SAP HANA Cloud and Business Application Studio.
Get your development environment ready
When your HANA Cloud instance is set up, and you are ready to start, open the subscriptions in your subaccount and click on SAP Business Application Studio.
Create a new Dev space.
Select the SAP Cloud Business Application Template and provide a name for your Dev Space.
Wait until the status changes from Starting to Running, then click on the tile with the Dev Space name. In the background, the Dev Space has been prepared with all of the necessary components that you would otherwise have to install on your laptop, for example Node.js and CDS.
Create a CAP Project from a template
Now Business Application Studio is started, configured, and ready for use. On the Welcome tab, click on “Create project from template”.
Select the @sap/cap Project template.
Check the hana box in order to include SAP HANA-related features in your project.
Behind the scenes, your project will be generated. Once complete, the screen will return and you will see a pop-up message box in the bottom right to open a workspace with your project. Click on the Open in New Workspace button.
The editor will reopen in a new workspace and now you can start creating.
Notice the blue bar at the bottom of your screen indicating that the space has not been set with Cloud Foundry.
Click on this bar to connect Business Application Studio to the space where you want to deploy your OData service.
Insert the Cloud Foundry endpoint, then enter your credentials and select the space in which you want to work.
Create your database model
Now that Business Application Studio is connected to your Cloud Foundry space, let's create objects in your database model.
From the file structure on the left, right-click on the db folder and create a new file ending with .cds. I call mine schema.cds. The name of the file can be anything; it will define all of the objects (tables, views) deployed in your HANA database.
Within the new schema.cds, create your first CAP structure:
namespace scp.cloud;
using {
cuid,
sap.common
} from '@sap/cds/common';
entity SafetyIncidents : cuid {
title : String(50) @title : 'Title';
description : String(1000) @title : 'Description';
}
In this example, we are defining a namespace scp.cloud.
We then import the library @sap/cds/common and use the cuid aspect, which automatically defines an ID column for us in the entity SafetyIncidents. Learn more about aspects in the CAP documentation.
An entity defined in CAP will be deployed as a table in your database.
Open a terminal window by going to the Top Menu and selecting Terminal -> New Terminal.
Within your project folder, execute the command npm install
Now we will introduce you to a very useful command: cds watch.
Whenever you feed your project with new content, for example by adding or modifying .cds, .json, or .js files, the server automatically restarts to serve the new content.
Execute cds watch in the terminal window.
As long as that command is running, each time you change the project structure, it will automatically save and redeploy those project changes.
After a few seconds, the cds watch command generates your OData service.
It also creates the table from schema.cds in an SQLite database within your development environment.
Click on the button Expose and Open to see if your initial empty service gets rendered in the browser window.
It is still empty now.
Expose an entity as an OData service
Now that the SafetyIncidents entity is defined, you can easily add a service definition to expose it as an OData service. Let's do that now! Create a new file within the srv folder called incidentService.cds and enter the following code:
using scp.cloud from '../db/schema';
service IncidentService {
entity SafetyIncidents as projection on cloud.SafetyIncidents;
}
The first line references the schema.cds file we created earlier. The service block then exposes cloud.SafetyIncidents in an OData service called IncidentService.
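Once the service is exposed, clients query it with standard OData v4 options such as $select and $top. The sketch below builds such a URL; the mount path (/incident) is an assumption, since CAP derives it from the service name, so check your service's index page for the actual path:

```javascript
// Build an OData query URL from a map of query options.
// buildQuery is a hypothetical helper for illustration, not a CAP API.
function buildQuery(base, options) {
  const parts = Object.entries(options).map(([k, v]) => `$${k}=${v}`);
  return parts.length ? `${base}?${parts.join('&')}` : base;
}

const url = buildQuery('/incident/SafetyIncidents', {
  select: 'title,description', // only return these columns
  top: 10,                     // first 10 rows
});
console.log(url);
// → /incident/SafetyIncidents?$select=title,description&$top=10
```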
If you closed your preview tab, you can always re-open it: open the command palette (View -> Find Command) and search for the command Ports: Preview. This will open a preview of the currently exposed ports.
Now let’s insert some data into our table. Start by creating a new folder called data within the db folder.
Within that folder, create a file called scp.cloud.SafetyIncidents.csv with the following entries:
ID;title;description
067460c5-196c-4783-9563-ede797399da8;Broken machine;The printing machine is leaking
efec3e9f-ceea-4d17-80a7-50073f71c322;Software bug;The computer is on fire
The file name has to match the name space (scp.cloud) and the entity name (SafetyIncidents) where you want to insert data.
Double-check, as shown in the screenshot, that the data folder sits under the db folder and that the filename is spelled correctly. Make sure that the column names in the CSV file match the entity's element names.
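The naming convention and file format above can be expressed as a tiny sketch. Both helper functions are mine, for illustration only, not CAP APIs:

```javascript
// CAP looks for a CSV named "<namespace>.<EntityName>.csv" inside db/data.
function csvFileName(namespace, entity) {
  return `${namespace}.${entity}.csv`;
}

// CAP accepts semicolon-delimited files; the header row must match the
// entity's element names. A minimal parser for such a file:
function parseCsv(text, sep = ';') {
  const [header, ...rows] = text.trim().split('\n');
  const cols = header.split(sep);
  return rows.map(r =>
    Object.fromEntries(r.split(sep).map((v, i) => [cols[i], v])));
}

console.log(csvFileName('scp.cloud', 'SafetyIncidents'));
// → scp.cloud.SafetyIncidents.csv
const rows = parseCsv('ID;title\n1;Broken machine');
console.log(rows[0].title); // → Broken machine
```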
If the cds watch is still running, stop it once and execute cds run in the terminal to ensure data is imported into your SQLite table.
The message > filling scp.cloud.SafetyIncidents from db/data/scp.cloud.SafetyIncidents.csv tells you that data is being imported.
Now that it's running, open the service and click on the SafetyIncidents entry; you should see the following data:
You now have a table deployed on your SQLite database within your development environment, filled with some test data. This table is exposed through an OData service which can be accessed from outside through REST calls.
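Those REST calls return the OData v4 JSON format: result rows are wrapped in a value array, alongside an @odata.context annotation. A sketch of unpacking such a payload (the sample payload mirrors the test data above, but is typed out by hand here, not captured from a live service):

```javascript
// Shape of a GET <service>/SafetyIncidents response in OData v4 JSON.
const payload = {
  '@odata.context': '$metadata#SafetyIncidents',
  value: [
    { ID: '067460c5-196c-4783-9563-ede797399da8', title: 'Broken machine' },
    { ID: 'efec3e9f-ceea-4d17-80a7-50073f71c322', title: 'Software bug' },
  ],
};

// Client code typically just reads the `value` array:
const titles = payload.value.map(r => r.title);
console.log(titles); // → [ 'Broken machine', 'Software bug' ]
```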
Deploy your data model and your OData service to SAP BTP
Now that you have the backend services running on SQLite in a local environment, it’s time to get this project running on SAP HANA Cloud.
Quick recap
- A schema for the incident management application has been created (schema.cds)
- A service definition has been added to expose the correct entities (incidentService.cds)
- The SQLite node module let us run the application connected to SQLite, with data loaded into a table
Prepare your project for HANA Cloud
On SAP HANA Cloud, CDS models are deployed through the hdbtable and hdbview formats instead of hdbcds. Edit your package.json to set the deploy-format to hdbtable.
Add the following line in the “cds” section of package.json.
"hana" : { "deploy-format": "hdbtable" }
Your code should be similar to this screenshot:
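For reference, the relevant excerpt of package.json as text (other keys such as name and dependencies omitted):

```json
{
  "cds": {
    "hana": {
      "deploy-format": "hdbtable"
    }
  }
}
```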
Build your project
Within the Node.js world, there is an environment variable called NODE_ENV. Until now, we have been using the “development” environment; it is time to switch that variable to “production”, which affects the way CDS behaves. In order to deploy your project to SAP BTP, the following commands must be run from the terminal window.
- Stop your running cds process with CTRL+C if it’s already running.
- Execute :
export NODE_ENV=production
- After this command runs successfully, execute :
cds build/all --clean
This command builds all of the relevant HANA artifacts and places them in a newly created folder called gen. If you expand it, you should see two folders, db and srv. As you might expect, the db folder contains the HANA DB artifacts, and the srv folder contains new files as well.
Create your HDI Container and deploy objects
Once the build process has completed, you will execute three commands in succession in order to (1) create the HANA Deployment Infrastructure (HDI) container on Cloud Foundry, (2) deploy the HANA artifacts, and (3) deploy the SRV artifacts.
Notice that in your terminal, the build process tells you which command you need to run in order to create the HDI container.
Execute the following command: cf create-service hana hdi-shared cap_project-db
(The creation of the container can take a few minutes, you should wait before executing the next step!)
This will create a HDI container called cap_project-db.
Note: On the screenshot below, I run the command cf create-service hanatrial hdi-shared cap_project-db, which actually deploys on “HANA as a Service”. Use the parameter hana to deploy on HANA Cloud.
Next, deploy the db module (typically with cf push -f gen/db). This deploys the generated hdbtable and hdbview objects to your HDI container. Remember that the HDI container creation takes a couple of minutes; wait for it to complete before deploying.
Execute the following command: cf push -f gen/srv --random-route -k 320M
This will deploy the Node.js application exposing your OData service.
If all three of the last commands executed correctly, you should see a route specified towards the bottom of the terminal window. The --random-route option directs the process to create a random URL.
Once you find the route name that was generated uniquely for you, you can paste that URL into a browser to validate that it is running and available on the internet.
Open a web browser, paste your newly created route and you should see a familiar screen that looks like this when you open your entity. This fully deployed service is now available on the internet and using SAP HANA Cloud as a persistence layer.
Good job, you just deployed an OData service on SAP HANA Cloud!
Explore further
If you want to build a more complex OData service, here are some ideas to get you started. I will use this service in my next blog, where I create a Fiori UI for an application in which users can report safety incidents. Explore the entity definitions below and the CAP documentation to learn how CAP makes your life easier.
Replace your schema.cds file with the following code:
namespace scp.cloud;
using {
cuid,
managed,
sap.common
} from '@sap/cds/common';
entity SafetyIncidents : cuid, managed {
title : String(50) @title : 'Title';
category : Association to Category @title : 'Category';
priority : Association to Priority @title : 'Priority';
incidentStatus : Association to IncidentStatus @title : 'IncidentStatus';
description : String(1000) @title : 'Description';
incidentResolutionDate : Date @title : 'ResolutionDate';
assignedIndividual : Association to Individual;
incidentPhotos : Association to many IncidentPhotos
on incidentPhotos.safetyIncident = $self;
incidentHistory : Association to many IncidentHistory
on incidentHistory.safetyIncident = $self;
}
entity Individual : cuid, managed {
firstName : String @title : 'First Name';
lastName : String @title : 'Last Name';
emailAddress : String @title : 'Email Address';
safetyIncidents : Association to many SafetyIncidents
on safetyIncidents.assignedIndividual = $self;
}
entity IncidentHistory : cuid, managed {
oldStatus : Association to IncidentStatus @title : 'OldCategory';
newStatus : Association to IncidentStatus @title : 'NewCategory';
safetyIncident : Association to SafetyIncidents;
}
entity IncidentPhotos : cuid, managed {
@Core.IsMediaType : true imageType : String;
@Core.MediaType : imageType image : LargeBinary;
safetyIncident : Association to SafetyIncidents;
}
entity IncidentsCodeList : common.CodeList {
key code : String(20);
}
entity Category : IncidentsCodeList {}
entity Priority : IncidentsCodeList {}
entity IncidentStatus : IncidentsCodeList {}
Replace your incidentService.cds with the following code:
using scp.cloud from '../db/schema';
service IncidentService {
entity SafetyIncidents as projection on cloud.SafetyIncidents {*,assignedIndividual: redirected to Individual };
entity Individual as projection on cloud.Individual {*,safetyIncidents : redirected to SafetyIncidents};
entity SafetyIncidentsNoImages as projection on cloud.SafetyIncidents{ID ,createdAt, priority, incidentStatus,description};
entity IncidentPhotos as projection on cloud.IncidentPhotos {*,safetyIncident : redirected to SafetyIncidents};
entity IncidentHistory as projection on cloud.IncidentHistory {*,safetyIncident : redirected to SafetyIncidents};
entity IncidentsByCategory as select from cloud.SafetyIncidents { count(ID) as categories : Integer, key category } group by category;
@readonly entity Category as projection on cloud.Category;
@readonly entity Priority as projection on cloud.Priority;
}
Here are a few more advanced examples of using CAP to develop applications:
- Developing CAP applications in SAP Business Application Studio
- E2E: SAP BTP Internet of Things to SAP Analytics Cloud
- Developing a Fiori elements app with CAP and Fiori Tools
- Create Calculation views and OData services in a single project with HANA Cloud
- Architecting, Solutioning & Scaling Complex Apps using SAP BTP Extension Suite: Real Time CAPM Scenarios
Maxime SIMON
Very useful. Thanks for sharing.
You said you are deploying this to HANA Cloud, but in your HANA HDI container instance creation you are using the hanatrial service:
cf create-service hanatrial hdi-shared cap_service-db
The hanatrial service is actually the older HANA As A Service offering not HANA Cloud. Actually the hana service in the create-service command should be used for the HANA Cloud trial as well. It is not just for production as you stated in this blog post.
Hello Thomas, thanks for the comment.
As you said, I deployed my HDI container on HANA as a Service as I did not have a HANA Cloud instance running.
I updated the blog to use the hana service in the create-service command, in order to stick to a HANA Cloud tutorial.
Hi Maxime,
If you want to try it out with SAP HANA Cloud, you can sign up for trial. Alternatively, reach out to me and I can share more info.
Hi Maxime,
Fantastic blog! It explains how to easily create a CAP based application with a HANA Cloud db. I have one remark:
When creating the HDI service you say:
Execute the following command:
cf create-service hana hdi-shared cap_service-db
While the name of the service should be cap_project-db
You may want to change that.
Thanks!
Thanks, I corrected the mistake
When executing the create-service command, wouldn't it create an HDI instance under the SAP HANA Service rather than SAP HANA Cloud?
We are on a migration scenario, so I want to make sure the container is bound to the SAP HANA Cloud service.
SAP documentation indicated this pattern to be used:
https://help.sap.com/viewer/db19c7071e5f4101837e23f06e576495/LATEST/en-US/2863434ddda042b8b8011a3f24856281.html
The -c parameter allows you to define the database ID if you have several databases running in the same space, for example: cf create-service hana hdi-shared cap_project-db -c '{"database_id":"<your-database-id>"}' (fill in your own database ID).
Hi Maxime,
The Blog is really wonderful. Can you please share sample data (csv files for all tables) for Incident Example.
Thanks
a srikanth
How to expose cloud HANA db tables & CDS views as XSOData services
CDS views need to be exposed as OData services through the ABAP layer of S/4HANA.
This blog does not cover S/4HANA.
This was very helpful 🙂 One question though, as we are in the process of implementing something similar. Just to have a clear idea about the possibilities after exposing the OData service: can it be consumed as a destination in Cloud Foundry and used in another app created on BAS?
Yes. After you deploy your OData service, it can be consumed from the destinations in Cloud Foundry and from other apps.
Hi,
I am trying to acquire data in SAP Analytics Cloud through OData services. I have carried out all the required steps in this blog. I wanted to know which authentication option (Basic Authentication, OAuth, no authentication) we need to select when connecting it to SAC. Also, is the Data Service URL the same as the random URL generated through --random-route?
This is the dialog box in SAC
Hello, in this blog I did not cover authentication. If you replicated my steps, authentication is not required to access your OData service.
If you want to set up authentication, you need to use token-based authentication (XSUAA).
Hi, thank you for the reply. Could you please also tell me if you have an idea about the Data Service URL?
Is it the same as the random URL generated through --random-route?
Is the "Application Routes" value the Data Service URL to be used?
Yes, this is correct, as explained in the "Create your HDI Container and deploy objects" section of the blog.
Hi Siddharth, Maxime,
I am facing a similar issue with importing data into SAC through the deployed OData services. I used the HANA academy generator to create a simple CAP project with user authentication (no authorization) and successfully deployed the services.
I am also able to use Postman and test the OData service successfully; however, I have not had any luck connecting from SAP Analytics Cloud to my OData service with OAuth 2.0 Client Credentials.
I am getting an error "Connection to server failed: oData services"
Could you please share how you were able to create this import connection in SAP Analytics Cloud?
Thank you
Thank you for the blog post. All steps were successful for me except for deployment of tables and views.
Hi Ruchi,
Could you please check if the HANA Database instance is running, in which your HDI container is deployed? I got the same error and it worked for me when I started my database instance.
My database is running. HDI container was created, however the tables/ views were not deployed.
Thanks for sharing blog post. You have well explained.
I am getting an error while running the command below.
Error details:
It seems to be a network connectivity error. Check that your terminal can reach the Cloud Foundry endpoint where you are trying to deploy the app: use "cf login" to log in, then "cf services" to list the available services.
If both commands do not work correctly, check network connectivity.
If both commands work, you should be able to use "cf push" to deploy apps. If you still cannot deploy your app, post details about your issue on answers.sap.com.
Hi Vikas,
I had exactly the same problem when pushing the srv module. The solution was to add the following section in the package.json file:
You can get both versions by running these commands:
Actually, the node's version in my case was 14.17.6, but when placing that value in the package.json file, the push command would crash and it offered me these versions:
I picked the nearest version (ie 12.22.7).
After editing the package.json file, execute "cds build/all --clean". The push command should now work properly.
Hi Juan,
Thank you for your comment soooo much!
I had exactly the same error and now it's been solved!
Anna
My node version = 14.17.6 and npm version = 8.13.2 but configuring these values did not solve the issue. Strangely, I tried your version and the push was successful. Don't know why but thank you:)
HI All,
I am also facing a similar issue, and even after trying every solution provided here it's not working. Any help, please?
Thanks
Kapil
Post details of your issue on https://answers.sap.com/index.html in order to get help from the community.
You can also raise an incident on the SAP portal as a customer/partner to get enterprise support.
Link this blog as a reference and explain at which step you are facing issues.
Hi Maxime, I'm having some troubles while reaching to the end of this blog. Could you give me some advice?
While running this command:
The service fails to start, and in cf logs --recent I can see the following error:
I searched for this error but couldn't find anything that addresses this same issue.
Thanks in advance.
Juan.
From the error, it seems the Node.js package 'passport' cannot be imported.
Can you check whether the cds build/all --clean command correctly imported passport into your node_modules folder ?
Dependencies are listed in the package.json file. If it is not imported correctly, a possible reason might be that the package.json is not up-to-date, you might need to increase the version of the 'passport' package.
Thanks Maxime... it was a bit more tricky than that.
Only way I could make it work was:
After doing all that, the service app is correctly deployed and started. And from that point on, any modification to the database or service can be pushed with the "cf push -f ..." command.
I have no idea why SAP made it this complicated... this is by no means something easy to do or solve for a person who is new to CAP.
Nice article !
After pressing the connect bar, I entered the CF API endpoint, but there is no response when I press the enter key.
Is there any solution?
Hi Simon,
I have tried exposing a calculation view as an OData service through the CAP model. It is a good approach with a lot of flexibility, but when testing the POST method I am facing some issues: I am not able to see the inserted records in the calculation view.
I know I am not giving you any steps, but do we have any option to insert new records through the OData service of a calculation view? Also, if possible, we can connect via my Gmail.