Kevin Hu

Part 4 Deploy to Cloud Foundry as multi-target app #epl-app

Part 1: the story begins

Part 2: reuse, localization, annotations

Part 3: build a UI and deploy to Cloud Foundry

Part 4: deploy to Cloud Foundry as multi-target app

 

So in the last few posts of the series, I built an app using the SAP Cloud Application Programming Model (CAP) and deployed it to Cloud Foundry as a single-module app using a manifest.yml file.

In this part I am going to try a so-called “production” deployment to CF (just like the question Manjunath Gudisi asked me in the first part of the series), using an SAP HANA HDI container as our db module target and deploying the service layer and app layer as individual applications in Cloud Foundry.

There are a few steps to follow for the separate modules. An alternative way is to use the Web IDE, generate a “Business Service” application from the template, and then copy the relevant files into our local project. Here I am just going to walk through the steps.

The Plan

1. Copy the relevant files into the local project

2. Build the whole project into a multi-target .mtar archive ready for deployment

3. Deploy using the Cloud Foundry CLI

The goal is to have one local dev environment which can do quick local dev/testing with SQLite and also do remote deployments to Cloud Foundry and a HANA database.
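
These two workflows end up as two npm scripts in the root package.json shown further below; roughly:

# local quick dev/testing against SQLite
npm run watch

# "production" build and deployment to Cloud Foundry / HANA
npm run deploy:cf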

Get Started

DB layer

1. Create (or copy from the Web IDE template) db/package.json as below. This file includes the script that deploys our db artifacts into the HDI container.

{
  "name": "deploy",
  "dependencies": {},
  "engines": {
    "node": "^8"
  },
  "scripts": {
    "postinstall": "node .build.js",
    "start": "node node_modules/@sap/hdi-deploy/deploy.js"
  },
  "devDependencies": {
    "@sap/hdi-deploy": "^3.11.5"
  }
}
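
As a side note, the deployer can also be run by hand. This is just a sketch, assuming a default-env.json file with the HDI container credentials already sits in the db folder (cds deploy creates one for us later); during CF staging the postinstall and start scripts above run automatically.

cd db
npm install   # triggers the postinstall hook (node .build.js)
npm start     # runs node node_modules/@sap/hdi-deploy/deploy.js with the local credentials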

2. Create a new file: db/src/.hdiconfig

{
    "file_suffixes": {
        "csv": {
            "plugin_name": "com.sap.hana.di.tabledata.source"
        },
        "hdbafllangprocedure": {
            "plugin_name": "com.sap.hana.di.afllangprocedure"
        },
        "hdbanalyticprivilege": {
            "plugin_name": "com.sap.hana.di.analyticprivilege"
        },
        "hdbcalculationview": {
            "plugin_name": "com.sap.hana.di.calculationview"
        },
        "hdbcds": {
            "plugin_name": "com.sap.hana.di.cds"
        },
        "hdbcollection": {
            "plugin_name": "com.sap.hana.di.collection"
        },
        "hdbconstraint": {
            "plugin_name": "com.sap.hana.di.constraint"
        },
        "hdbdropcreatetable": {
            "plugin_name": "com.sap.hana.di.dropcreatetable"
        },
        "hdbflowgraph": {
            "plugin_name": "com.sap.hana.di.flowgraph"
        },
        "hdbfulltextindex": {
            "plugin_name": "com.sap.hana.di.fulltextindex"
        },
        "hdbfunction": {
            "plugin_name": "com.sap.hana.di.function"
        },
        "hdbgraphworkspace": {
            "plugin_name": "com.sap.hana.di.graphworkspace"
        },
        "hdbhadoopmrjob": {
            "plugin_name": "com.sap.hana.di.virtualfunctionpackage.hadoop"
        },
        "hdbindex": {
            "plugin_name": "com.sap.hana.di.index"
        },
        "hdblibrary": {
            "plugin_name": "com.sap.hana.di.library"
        },
        "hdbmigrationtable": {
            "plugin_name": "com.sap.hana.di.table.migration"
        },
        "hdbprocedure": {
            "plugin_name": "com.sap.hana.di.procedure"
        },
        "hdbprojectionview": {
            "plugin_name": "com.sap.hana.di.projectionview"
        },
        "hdbprojectionviewconfig": {
            "plugin_name": "com.sap.hana.di.projectionview.config"
        },
        "hdbreptask": {
            "plugin_name": "com.sap.hana.di.reptask"
        },
        "hdbresultcache": {
            "plugin_name": "com.sap.hana.di.resultcache"
        },
        "hdbrole": {
            "plugin_name": "com.sap.hana.di.role"
        },
        "hdbroleconfig": {
            "plugin_name": "com.sap.hana.di.role.config"
        },
        "hdbsearchruleset": {
            "plugin_name": "com.sap.hana.di.searchruleset"
        },
        "hdbsequence": {
            "plugin_name": "com.sap.hana.di.sequence"
        },
        "hdbstatistics": {
            "plugin_name": "com.sap.hana.di.statistics"
        },
        "hdbstructuredprivilege": {
            "plugin_name": "com.sap.hana.di.structuredprivilege"
        },
        "hdbsynonym": {
            "plugin_name": "com.sap.hana.di.synonym"
        },
        "hdbsynonymconfig": {
            "plugin_name": "com.sap.hana.di.synonym.config"
        },
        "hdbsystemversioning": {
            "plugin_name": "com.sap.hana.di.systemversioning"
        },
        "hdbtable": {
            "plugin_name": "com.sap.hana.di.table"
        },
        "hdbtabledata": {
            "plugin_name": "com.sap.hana.di.tabledata"
        },
        "hdbtabletype": {
            "plugin_name": "com.sap.hana.di.tabletype"
        },
        "hdbtextconfig": {
            "plugin_name": "com.sap.hana.di.textconfig"
        },
        "hdbtextdict": {
            "plugin_name": "com.sap.hana.di.textdictionary"
        },
        "hdbtextinclude": {
            "plugin_name": "com.sap.hana.di.textrule.include"
        },
        "hdbtextlexicon": {
            "plugin_name": "com.sap.hana.di.textrule.lexicon"
        },
        "hdbtextminingconfig": {
            "plugin_name": "com.sap.hana.di.textminingconfig"
        },
        "hdbtextrule": {
            "plugin_name": "com.sap.hana.di.textrule"
        },
        "hdbtrigger": {
            "plugin_name": "com.sap.hana.di.trigger"
        },
        "hdbview": {
            "plugin_name": "com.sap.hana.di.view"
        },
        "hdbvirtualfunction": {
            "plugin_name": "com.sap.hana.di.virtualfunction"
        },
        "hdbvirtualfunctionconfig": {
            "plugin_name": "com.sap.hana.di.virtualfunction.config"
        },
        "hdbvirtualpackagehadoop": {
            "plugin_name": "com.sap.hana.di.virtualpackage.hadoop"
        },
        "hdbvirtualpackagesparksql": {
            "plugin_name": "com.sap.hana.di.virtualpackage.sparksql"
        },
        "hdbvirtualprocedure": {
            "plugin_name": "com.sap.hana.di.virtualprocedure"
        },
        "hdbvirtualprocedureconfig": {
            "plugin_name": "com.sap.hana.di.virtualprocedure.config"
        },
        "hdbvirtualtable": {
            "plugin_name": "com.sap.hana.di.virtualtable"
        },
        "hdbvirtualtableconfig": {
            "plugin_name": "com.sap.hana.di.virtualtable.config"
        },
        "jar": {
            "plugin_name": "com.sap.hana.di.virtualfunctionpackage.hadoop"
        },
        "properties": {
            "plugin_name": "com.sap.hana.di.tabledata.properties"
        },
        "tags": {
            "plugin_name": "com.sap.hana.di.tabledata.properties"
        },
        "txt": {
            "plugin_name": "com.sap.hana.di.copyonly"
        }
    }
}

3. Create db/.build.js as below. The check for the parent package.json means the root build only runs at build time, not at CF staging time.

const fs = require('fs');
const childproc = require('child_process');

if (fs.existsSync('../package.json')) {
    // true at build-time, false at CF staging time
    childproc.execSync('npm install && npm run build', {
        cwd: '..',
        stdio: 'inherit'
    });
}

Project Root

1. Update .cdsrc.json. It sets up the module build targets and options.

{
    "build": {
        "target": ".",
        "tasks": [
            {
                "for": "hana",
                "src": "db",
                "options": {
                    "model": [
                        "db",
                        "srv",
                        "index.cds"
                    ]
                }
            },
            {
                "for": "node-cf",
                "src": "srv",
                "options": {
                    "model": [
                        "db",
                        "srv",
                        "index.cds"
                    ]
                }
            }
        ]
    }
}
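
With these two tasks in place, one build run produces both the HANA artifacts and the compiled model. Roughly what to expect (based on the later steps in this post):

cds build/all --clean
# hana task    -> db/src/gen (.hdbcds and .hdbtabledata artifacts)
# node-cf task -> srv/gen/csn.json (the compiled model the service module serves)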

2. package.json. Note the new scripts for building the MTA and deploying to CF.

{
  "name": "sap-cap-epl",
  "version": "1.0.0",
  "description": "Generated by cds init",
  "repository": "<Add your repository here>",
  "license": "ISC",
  "dependencies": {
    "@sap/cds": "^3.18.3",
    "express": "^4.17.1",
    "hdb": "^0.17.1"
  },
  "engines": {
    "node": "^8.9"
  },
  "scripts": {
    "build": "cds build/all --clean",
    "deploy": "cds deploy",
    "start": "cds run",
    "watch": "cds deploy --to sqlite:db/epl.db && cds watch .",
    "build:mta": "cds build/all && mbt build -p=cf",
    "deploy:cf": "npm run build:mta && cf deploy mta_archives/${npm_package_name}_${npm_package_version}.mtar"
  },
  "files": [
    "srv",
    "db",
    "index.cds"
  ],
  "cds": {
    "odata": {
      "version": "v4"
    },
    "requires": {
      "db": {
        "kind": "sqlite",
        "model": [
          "db",
          "srv"
        ],
        "credentials": {
          "database": "db/epl.db"
        },
        "[production]": {
          "kind": "hana"
        }
      }
    }
  },
  "devDependencies": {
    "sqlite3": "^4.1.0"
  }
}
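
The [production] block above is what switches the db kind from sqlite to hana. A small sketch of how to activate that profile from the command line:

# Windows (cmd)
set CDS_ENV=production

# macOS / Linux
export CDS_ENV=production

# any cds command run afterwards (cds deploy, cds run, ...) now resolves the db kind to "hana"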

3. mta.yaml. Note I have also set up a UAA instance and an app router. They are optional at this stage, of course; I just include them for future use.

_schema-version: 2.0.0
ID: sap-cap-epl
version: 1.0.0
modules:
  - name: sap-cap-epl-db
    type: hdb
    path: db
    parameters:
      memory: 256M
      disk-quota: 256M
    requires:
      - name: sap-cap-epl-db-hdi-container
  - name: sap-cap-epl-srv
    type: nodejs
    path: srv
    parameters:
      memory: 512M
      disk-quota: 256M
    provides:
      - name: srv_api
        properties:
          url: ${default-url}
    requires:
      - name: sap-cap-epl-db-hdi-container
      - name: sap-cap-epl-uaa
  - name: sap-cap-epl-ui
    type: nodejs
    path: app
    parameters:
      memory: 256M
      disk-quota: 256M
    requires:
      - name: srv_api
        group: destinations
        properties: 
          forwardAuthToken: true
          strictSSL: true
          name: srv_api
          url: ~{url}
resources:
  - name: sap-cap-epl-uaa
    type: org.cloudfoundry.managed-service
    parameters:
      service-plan: application
      service: xsuaa
      config:
        xsappname: sap-cap-epl-${space}
        tenant-mode: dedicated
  - name: sap-cap-epl-db-hdi-container
    type: com.sap.xs.hdi-container
    properties:
      hdi-container-name: ${service-name}
    parameters:
      service: hanatrial

4. xs-security.json for the UAA settings

{
    "xsappname": "epl-cap-app",
    "tenant-mode": "dedicated",
    "description": "Security profile of called application",
    "scopes": [
      {
        "name": "uaa.user",
        "description": "UAA"
      }
    ],
    "role-templates": [
      {
        "name": "TESTUSER",
        "description": "UAA",
        "scope-references": [
          "uaa.user"
        ]
      }
    ]
  }

An interim payoff

Now we can deploy only the db module to the HDI container and have a look.

set CDS_ENV=production
cds deploy

You may be prompted to install the SAP Cryptographic Library if you are on Windows and haven’t done so before. Follow this link to install it, then run the command again.

After that we can check the HDI container from the Web IDE database explorer.

Under the hood

Once cds deploy starts, it does the following:

1. First, it builds the HANA db artifacts (hdbcds files) into the db/src/gen folder. Note the build also generates .hdbtabledata files for us, so we no longer need to map the fields manually for the CSV data loads.

2. It creates an HDI container service in Cloud Foundry.

3. It creates a service key and copies it into your local project as default-env.json. Make sure you add this file to your .gitignore, as it contains all the sensitive information needed to connect to the database (see the snippet after this list). The file is also copied to the root folder so that the other modules can use it.

4. It starts the deployment using the hdi-deployer.
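
As mentioned in step 3, keep default-env.json out of version control. A minimal sketch, assuming a .gitignore already exists at the project root (a pattern without a slash also matches the copy in db/):

echo default-env.json >> .gitignore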

 

Service Layer

We continue with the service layer changes.

1. srv/package.json. The start script here does a similar job to “cds run”.

{
    "name": "project-srv",
    "version": "1.0.0",
    "dependencies": {
        "@sap/cds": "^3.18.3",
        "express": "^4.17.1",
        "@sap/hana-client": "^2.4.144"
    },
    "engines": {
        "node": "^10"
    },
    "scripts": {
        "start": "cds serve gen/csn.json"
    },
    "cds": {
        "requires": {
            "db": {
                "kind": "hana",
                "model": "gen/csn.json"
            }
        }
    }
}

App Layer

1. app/package.json. I made no changes other than setting it up as an app router, since it will be the entry point of the MTA application.

{
  "name": "epl-app-ui",
  "dependencies": {
    "@sap/approuter": "^6.5.1"
  },
  "engines": {
    "node": "^10"
  },
  "scripts": {
    "start": "node node_modules/@sap/approuter/approuter.js"
  }
}

2. xs-app.json. I will go through this in future posts when diving into the authentication topics.

{
  "welcomeFile": "webapp/",
  "authenticationMethod": "none",
  "routes": [{
      "source": "^/webapp/(.*)$",
      "target": "$1",
      "localDir": "webapp/"
  }, {
      "source": "^(.*)$",
      "destination": "srv_api"
  }]
}

Run the application locally connecting to HDI

set CDS_ENV=production
cds run

There is some ambiguous output, but it can just be ignored. We can easily prove that the app is actually connecting to the remote HDI container instead of SQLite by inserting some data from the Web IDE.
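
A quick, optional smoke test from a second terminal, assuming the default CAP port 4004 (the exact service paths are printed in the cds run output):

curl http://localhost:4004/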

Build MTA

Historically, the way to make an MTA archive locally was the old “Multi-Target Application Archive Builder”, a Java jar library run with a java -jar command (which needs a JDK installation…), so it sounds like quite an unpleasant job.

Now we have a new tool called mbt (Cloud MTA Build Tool), which we can run from a Node environment to generate the .mtar file. Check here to download it and install the dependencies (like Chocolatey and Make).
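
A minimal sketch of getting mbt via npm, assuming Node.js/npm are already installed (on Windows, GNU Make is also needed, as the linked instructions explain):

npm install -g mbt
mbt --version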

Once done, we can start the build by running:

npm run build:mta
#    "build:mta": "cds build/all && mbt build -p=cf",

cds build picks up the “db” and “srv” modules from the .cdsrc.json file; the csn.json file is also generated.

Finally, the mbt tool packages all the modules into one .mtar file (in the mta_archives folder), ready for deployment.

Deploy

Before deployment, make sure to install the “MultiApps CF CLI plugin” so that cf deploy can be run.
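
A sketch of installing it from the Cloud Foundry community plugin repository (the repo may already be registered in your CLI):

cf add-plugin-repo CF-Community https://plugins.cloudfoundry.org
cf install-plugin multiapps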

npm run deploy:cf
#    "deploy:cf": "npm run build:mta && cf deploy mta_archives/${npm_package_name}_${npm_package_version}.mtar"

# for Windows Users
npm run build:mta && cf deploy mta_archives/%npm_package_name%_%npm_package_version%.mtar

Go to the CF cockpit and check the services and apps deployed.

Some glitches

It looks like the HANA build/deploy tooling does not generate values for UUID-typed key fields when they are not supplied in the CSV data load. This problem did not occur with SQLite, so I have temporarily changed the key field to a string type.

Next Step

There are some really interesting things I want to try next: a league ladder (ranking table)? Or trying out the really exciting “SAP Business Application Studio” to see how it fits into CAP development.

 

Stay tuned. #epl-app

 

Part 4 branch on GitHub

 

One More Thing

In case you want to delete everything we created, you can either delete the apps and services from the CF cockpit one by one, or, better, run cf undeploy:

cf a
# 3 apps
cf s
# 2 services

cf undeploy sap-cap-epl --delete-services -f

cf a
# 0 apps
cf s
# 0 services

      2 Comments

      Tarun Jain

      Excellent blog series Kevin, nicely explained!

      Thanks, I had some questions for a while about deploying a local CAPM project to SAP HANA / CF MTA. This answers most of them.

       

      Thanks,

      Tarun

      Luiz Gomes

      in webide> build path: db/

      "src/.hdiconfig": Configuration does not define a build plugin for file suffix "cds"