qmacro
Developer Advocate
This is a searchable description of the content of a live stream recording, specifically “Ep.50 - Calling our Business Rule via the runtime API” in the “Hands-on SAP dev with qmacro” series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the “Catch the replays” section of the series blog post.

This episode was streamed live on Fri 17 Jan 2020 and is approximately 60 minutes in length. The stream recording is available on YouTube.



(Nominal title: "Chinposin'")

Brief synopsis: We have a simple business rule defined - now it’s time to use the runtime API to call it. As it’s on Cloud Foundry and authentication is via OAuth, we’re in for a fun and interesting episode! (There is a slight change to the plans here after the cloud platform outage yesterday, which has left me bereft of a fully functional Cloud Foundry trial account.)

00:01:20 Saying good morning to everyone, including 31a8856c1f6f4bcfa7f3d890a0b88fd2 in Melbourne, who is working his way through the Hands-on SAP Dev live stream recordings and came across an interesting gotcha in Ep.1, relating to interpolation in the Bash shell. Nice one!

00:05:20 Highlighting an SAP CodeJam that I’ll be running next week, on Thu 23 Jan, in Lyon (on CAP - Node.js), organised by pdominique-bio and documented in his blog post “SAP CodeJam goes south!”.

00:07:05 Also there’s another SAP CodeJam I’ll be running shortly after that, in Oslo, on the same subject, organised this time by the one and only rsletta. It’s on Wed 05 Feb and there are still some places available.

00:08:15 I wander briefly off piste trying to think of why “railway tracks” reminded me of Bob Dylan. Turns out (having just looked it up) that it was a combination of two things that brought about this distraction: in 1975 he released the (now legendary) album “Blood On The Tracks”, and he also painted an art piece called “Train Tracks”. So there you have it, not a completely bonkers association 🙂

00:08:40 Bringing to our attention the re>≡CAP (un)conference! Organised by a wonderful bunch of folks (get involved, head on over to the recap2020 Slack channel) - it will be in Heidelberg on 15 May this year. The website will be appearing soon, so watch out for it!

00:13:50 Talking about OpenFaaS as a Functions-as-a-Service project that we can run on a Raspberry Pi-based cluster (amongst other places!) - this is something I definitely want to start looking into, both as a means to an end (something to run over a Docker Swarm or Kubernetes-managed cluster) and as an end in itself (FaaS is an important layer in today’s cloud world).

00:14:05 Don’t forget to subscribe to iinside’s YouTube Channel - Max is live streaming on a regular (monthly) basis.

00:14:40 A quick shoutout to a great blog post from rsletta talking about their live streaming approach and equipment for SAP Inside Track Oslo last year - “Stream on! - A look at how we broadcasted SAP Inside Track OSLO 2019”.

00:14:55 This brings us neatly on to a related and relatively new initiative called OpenSIT, from oliver and friends. The mission is “to make the session content recorded at SAP Inside Tracks easily accessible to SAP community members”. Check out this introductory blog post to find out more: “Introducing OpenSIT”.

00:16:00 Talking about yesterday’s outage in Frankfurt, which caused me to rethink the plans for this live stream quite quickly! 🙂 In any case, you can use the SAP Cloud Platform status page to monitor services.

00:18:50 Delighted that christian.drumm picked up on the two command line utilities I used to create a little logo, those are figlet and lolcat.
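
If you want to try this yourself, the combination looks something like the following - note that the banner text here is just an illustrative placeholder, and both utilities need to be installed first:
# render some text as a banner with figlet, and colourise it with lolcat
figlet "Hands-on SAP dev" | lolcat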

00:20:00 Setting the scene for something I wanted to share on CAP environments, and managing, say, development and production profiles separately. This means you can use SQLite in development locally, for example, but use HANA for production, and use the same build and deploy process, controlling the environment switch with an environment variable (NODE_ENV). The scene is the last exercise in the CAP - Node.js CodeJam content, specifically step 6 where I had just modified the instructions to get the attendees to overwrite the DB type in package.json. There’s a better way, which I knew at the time but just couldn’t remember fully.

The better way, as we see from the short demo (using the relatively new design time package @sap/cds-dk), is to use the ‘profile’ concept, described in the Runtime Configuration for Node.js section of the CAP docs.

(And yes, during this demo I installed the mbt build tool for no reason - as I was only intending to do a cds build/all - doh!).

00:34:30 Another short distraction on languages that start indexing at 1 rather than 0, ABAP being one of them, as well as FORTRAN, Julia, Smalltalk and APL. I dug into this a little bit more just now and it seems that zero-indexing came from BCPL, specifically from the compilation of that language for the classic IBM 7094 mainframe of the early 1960s. My vague (and ultimately incorrect) musing that Pascal or Modula-2 might also be one-indexed languages was because I was confusing the father of those two languages, Niklaus Wirth, with Edsger W. Dijkstra, who wrote a paper “Why numbering should start at zero”. The amount of rubbish that is wrongly tangled up in my brain is ridiculous.

00:36:30 This is what the resulting modified cds -> requires section of package.json looks like after the addition of a [production] section:
"cds": {
"requires": {
"db": {
"kind": "sqlite",
"model": [
"db/",
"srv/",
"app/"
],
"[production]": {
"kind": "hana"
},
"credentials": {
"database": "db.db"
}
}
}

When NODE_ENV is set to production, the profile-specific value “hana” is used for cds -> requires -> db -> kind rather than “sqlite”. Neat! (Thanks very much to david.kunz2 who was very patient with me on this yesterday, by the way!)
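
To make that concrete, here’s a minimal sketch of the switch in action, assuming the package.json above and the cds CLI from @sap/cds-dk installed:
# default (development) profile: db.kind resolves to "sqlite"
cds run

# production profile: the "[production]" block overrides db.kind to "hana"
NODE_ENV=production cds run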

You may be interested to know that I’ve now updated the CodeJam content (exercise 10 step 6) to include this, with this commit: update 10.6 to describe use of profiles.

00:40:30 Moving over to Business Rules and the API Hub now, to complete the chain of HTTP calls that we need to invoke API endpoints in the set of Business Rules APIs, following on from where we left off in Ep.49.

00:43:25 Looking at what we had in that previous episode, which I now have formed into a simple shell script (gettoken) for the purposes of demonstration and quick re-running:
#!/bin/bash

# gettoken - request an OAuth 2.0 bearer token for the Business Rules APIs.
# The --netrc option is expected to supply the basic auth credentials (the
# OAuth client ID and secret) from ~/.netrc; the user's password itself is
# expected in the PASSWORD environment variable.

curl \
  --silent \
  --netrc \
  --header "Content-Type: application/x-www-form-urlencoded" \
  --data "grant_type=password&response_type=token&username=dj.adams@sap.com&password=$PASSWORD" \
  https://i347491trial.authentication.eu10.hana.ondemand.com/oauth/token

This is a call to the OAuth 2.0 authentication endpoint, to request a bearer token with which to make subsequent calls to the actual Business Rules API endpoints.

00:45:10 Using command substitution, we grab the token using jq --raw-output '.access_token' and save the value in the TOKEN environment variable (the name of this variable is not significant).
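
In shell terms that step looks something like this (a sketch, assuming the gettoken script above is executable in the current directory, and exporting the variable so that subsequent scripts can see it):
# grab just the access_token property from the JSON response
export TOKEN="$(./gettoken | jq --raw-output '.access_token')"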

00:47:50 We can then use this token to authenticate an actual API call, which we pick from the rules repository (i.e. design time) endpoint. The call looks like this:
#!/bin/bash

# getprojectinfo - list projects via the Business Rules repository
# (design time) API, authenticating with the bearer token in $TOKEN.

curl \
  --silent \
  --netrc \
  --header "Authorization: Bearer ${TOKEN}" \
  https://bpmrulerepository.cfapps.eu10.hana.ondemand.com/rules-service/rest/v1/projects

In fact, there’s something we used in this call that was completely unnecessary - can you spot what it was? Let me know in the comments to this blog post.

00:52:10 We finish off with a little bit of jq syntax, to pull out properties from the response returned from the design time API endpoint:
./getprojectinfo | jq '.[0].name'

… as well as a short discussion on OAuth 2.0 authorisation endpoints, real API endpoints, and CSRF (XSRF) tokens.

And that just about wraps it up for this episode - thanks as always for helping make our live stream series fun and engaging!