mariusobert
Developer Advocate
In this eighth post of my CloudFoundryFun series, I want to show you how to combine a CAP app running on Cloud Foundry with an Azure Storage Account. And the best thing is: I published the entire code base on GitHub.

TechEd App Space Preview


TechEd 2019 is only a couple of days away, and I can tell you, I'm super excited about it! In my last post, I already mentioned that I built a sample app which combines the Cloud Application Programming Model with Azure cloud services. In the App Space at TechEd, you'll have the chance to complete a tutorial and learn how to deploy this sample app to your own SAP Cloud Platform account. The application exposes a list of sample entities which represent cities. Each entity contains several properties, such as a name, a region, and an image. The user can navigate to a Fiori Object Page and replace the default image with a newly uploaded file. This file will be stored in an Azure storage account, and the URL that references the image will then be stored in a table within SAP HANA.
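
To give you an idea of the underlying data model, the cities entity could be modeled in CDS roughly like this. This is a minimal sketch; the actual entity and property names in the project may differ.

// db/schema.cds – hypothetical sketch of the data model
entity Cities {
    key ID       : Integer;
        name     : String;
        region   : String;
        imageUrl : String;  // references the blob in the Azure storage account
}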

Demo video

If you are going to be at any of this year's TechEds, stop by the App Space and look for me or my colleague iinside. We are more than happy to chat about this sample app and to help you deploy it to SAP Cloud Platform.

In this post, on the other hand, I want to focus on how I developed and tested the application locally.

Architecture


The architecture of this sample app is pretty straightforward. Like all Cloud Foundry apps, this application consists of multiple microservices and backing services. The project descriptor, the mta.yaml file, refers to these microservices as modules (a condensed sketch of this descriptor follows the architecture diagram below):

Application router module: The entry point of the application, which redirects all incoming traffic to the following two microservices. This module also contains the source code of the Fiori Elements user interface.

Server module: Connects to the HDI container and exposes the annotated OData service via HTTP.

Uploader module: A service that uploads files to the Azure storage account and returns the URL to access the created resource.

Database module: A Cloud Foundry task that runs once to set up the schema in the HDI container and to import the sample data. Once these steps are completed, the app shuts down and stops consuming memory and CPU quota.

Besides the modules, this sample app leverages two backing services:

SAP HANA service: SAP Cloud Platform provides HDI containers to allow apps to efficiently store business data in a state-of-the-art in-memory database. This data store is used for structured data.

Azure Storage Account: This service is provided by Microsoft to store large files (also called blobs). We will use this store for unstructured data.


The architecture of my end-to-end sample app.
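
Here is the condensed sketch of the mta.yaml descriptor mentioned above. The module names, types, and the resource name are my assumptions, not the exact project descriptor; check the repository for the real file.

ID: cloud-foundry-cap-azure-cities
version: 1.0.0
modules:
  - name: app          # application router + Fiori Elements UI
    type: nodejs
    path: app
  - name: srv          # CAP server exposing the OData service
    type: nodejs
    path: srv
    requires:
      - name: cities-db
  - name: uploader     # uploads files to the Azure storage account
    type: nodejs
    path: uploader
  - name: db           # one-off task that sets up the HDI schema
    type: hdb
    path: db
    requires:
      - name: cities-db
resources:
  - name: cities-db    # HDI container backing service
    type: com.sap.xs.hdi-container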



Simulate Azure Storage Account


When we deploy an mtar archive to the cloud, Cloud Foundry usually provisions all backing services for us. Next, Cloud Foundry injects the credentials of these backing services into the environment variables of our application so that it can access them. The following snippet shows how this access should look.
// read the service credentials that Cloud Foundry injected into the environment
const cfenv = require('cfenv').getAppEnv();
const credentials = cfenv.getService('azure-blob-storage').credentials;

As we want to run our application locally, this environment variable won't exist in the local runtime. We can work around this by provisioning a storage account manually; you can follow these instructions to create one. Next, you need to create a key to access this storage account remotely. Save all this information in a file named db/credentials.json, which should have the following structure:
{
  "storageAccountName": "",
  "accessKey": "",
  "primaryBlobServiceEndPoint": "https://<ENDPOINT>.blob.core.windows.net/"
}

Now we can replace the snippet from above with this code to read the credentials from the file in the local runtime.
const cfenv = require('cfenv').getAppEnv();
const credentials = cfenv.isLocal ?
    require('./credentials') :
    cfenv.getService('azure-blob-storage').credentials;

Do not forget to add the credentials file to your .gitignore list. You don't want to check them into your version control system!
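
To illustrate how the uploader module might consume these credentials, here is a minimal sketch based on the @azure/storage-blob SDK. The SDK choice, the container name, the route, and the Express setup are my assumptions; the actual project may be implemented differently.

// uploader/app.js – a hypothetical sketch, not the project's actual implementation
const express = require('express');
const { BlobServiceClient, StorageSharedKeyCredential } = require('@azure/storage-blob');

// reuse the local/cloud credential switch from above
const cfenv = require('cfenv').getAppEnv();
const credentials = cfenv.isLocal ?
    require('../db/credentials') :
    cfenv.getService('azure-blob-storage').credentials;

const sharedKey = new StorageSharedKeyCredential(
    credentials.storageAccountName, credentials.accessKey);
const blobService = new BlobServiceClient(
    credentials.primaryBlobServiceEndPoint, sharedKey);

const app = express();

// PUT /upload/<name> stores the request body as a blob and returns its URL
app.put('/upload/:name', express.raw({ type: '*/*', limit: '10mb' }), async (req, res) => {
    const container = blobService.getContainerClient('images'); // assumed container name
    const blob = container.getBlockBlobClient(req.params.name);
    await blob.upload(req.body, req.body.length);
    res.json({ url: blob.url }); // this URL ends up in the HANA table
});

app.listen(process.env.PORT || 4005);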

Database Switch


If you are familiar with CAP, you already know that (at the moment) this framework supports two types of databases for structured data. For local development scenarios, we store the data in an SQLite DB, and for the cloud deployment, we choose SAP HANA. This switch can be defined in the CDS config file .cdsrc.json as follows:
"db": {
"kind": "sqlite",
"model": [
"db",
"srv"
],
"credentials": {
"database": "cities.db"
},
"[production]": {
"kind": "hana"
}
}
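
The [production] block is a configuration profile: the CAP Node.js runtime activates it when NODE_ENV is set to production, which Cloud Foundry does automatically for Node.js apps. Assuming you have access to an HDI container, you could even verify the HANA configuration locally by forcing the profile:

# uses the default configuration -> SQLite
cds run

# activates the [production] block -> SAP HANA
NODE_ENV=production cds run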

VS Code Launch Options


I used VS Code to design and implement this project. To debug my microservices locally, I wanted to be able to start each service in debug mode during development. For this, we need to start three processes:

  1. The CDS process, which sets up the SQLite DB and starts the OData server. Usually, we use the cds run command for this, but as we want to start the Node process in debug mode, we use the following script: node --inspect=localhost:4014 node_modules/.bin/cds run. Note that we also specify the port on which the debugger shall listen here.

  2. The app router will redirect all incoming requests either to our Fiori Elements application or to the CDS process from step 1. As we run this process locally, we need to inject the destination which represents the OData server and the port on which the server shall listen. We define these variables in a .env file (a sample is sketched after this list) and leverage the dotenv package to load them into the environment variables of this process: node --inspect=localhost:4016 -r dotenv/config node_modules/@sap/approuter/approuter.js.

  3. The uploader server shall listen on port 4005 and the corresponding debugger on port 4015. Similar to above, this can be achieved with the following command: PORT=4005 node --inspect=localhost:4015 app.js.
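
For reference, the .env file for the app router could look roughly like this. The destination name, the CDS default port 4004, and the app router port 4006 are assumptions based on common CDS and app router defaults; check the project's xs-app.json for the actual destination name.

PORT=4006
destinations=[{"name":"srv-api","url":"http://localhost:4004","forwardAuthToken":true}]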


All these scripts are defined as npm scripts which we call "dev" (see the cds, app router, and uploader scripts). Next, we let VS Code know how to start these three debugging processes. This can be done with the following launch.json file. Note that this file specifies the debugging ports of the local processes.
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Start CDS",
      "cwd": "${workspaceFolder}/",
      "runtimeExecutable": "npm",
      "runtimeArgs": [
        "run",
        "dev"
      ],
      "port": 4014
    },
    {
      "type": "node",
      "request": "launch",
      "name": "Start Uploader",
      "cwd": "${workspaceFolder}/uploader/",
      "runtimeExecutable": "npm",
      "runtimeArgs": [
        "run",
        "dev"
      ],
      "port": 4015
    },
    {
      "type": "node",
      "request": "launch",
      "name": "Start Approuter",
      "cwd": "${workspaceFolder}/app/",
      "runtimeExecutable": "npm",
      "runtimeArgs": [
        "run",
        "dev"
      ],
      "port": 4016
    }
  ]
}
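
As a small convenience, VS Code also supports compound launch configurations, so you could start all three processes with a single click. This is optional and not part of the sample project; the following fragment would go into the same launch.json next to "configurations":

"compounds": [
  {
    "name": "Start All",
    "configurations": ["Start CDS", "Start Uploader", "Start Approuter"]
  }
]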

Hands-on


The hands-on part of this post is actually not about Cloud Foundry directly. This time, we will start all microservices locally in debug mode for development purposes. In case you want to run the full app on your machine, you need access to an Azure Storage account. Don't worry if you don't have an Azure account; you can start the application with limited functionality without one.

This project consists of several microservices. For local development, each one can be started independently.





  1. Clone the project.



    git clone https://github.com/SAP-samples/cloud-foundry-cap-azure-cities
    cd cloud-foundry-cap-azure-cities



  2. Open the project with VS Code.



    code .



  3. You can skip this step if you don't have an Azure Storage account. Please be aware that the application won't allow you to replace images if you skip this step.


    Add a db/credentials.json file, containing the credentials for the Azure storage account, as described above.



  4. Initialize the SQLite database via npm run deploy:cds from your command line.


  5. Switch to the debugging view and start the first module in debug mode.




  6. Use the spinner control to see all launch options.




  7. Launch the following three modules and make sure they are all running.




    • Start CDS

    • Start Uploader

    • Start Approuter





  8. Open http://localhost:4006 in your browser. You should now see the application's start screen.





Summary


In this edition, you have learned:

  • What the Cloud Application Programming Model is

  • How to develop and debug a CAP app locally

  • About the great things you can expect at the TechEd App Space

  • How to use VS Code's launch options

  • How to debug the sample application

  • How to switch between a local SQLite DB and an HDI container for cloud deployments

  • How to leverage the dotenv tool to inject environment variables


#CloudFoundryFun #9 – Develop with the SAP Business Application Studio

About this series









This was the eighth blog post of my new monthly series #CloudFoundryFun. The name already says all there is to it; this series won't be about building enterprise apps on Cloud Foundry. I think there are already plenty of great posts about those aspects out there. This series rather thinks outside the box and demonstrates unconventional Cloud Foundry use cases.