Annotated links: Episode 53 of Hands-on SAP dev with qmacro
This is a searchable description of the content of a live stream recording, specifically “Ep.53 – Starting to construct a HandsOnSAPDev info API” in the “Hands-on SAP dev with qmacro” series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the “Catch the replays” section of the series blog post.
This episode was streamed live on Fri 14 Feb 2020 and is approximately 60 minutes in length. The stream recording is available on YouTube.
Brief synopsis: In previous episodes, and particularly in Ep.52, we chatted about introducing a way to programmatically search for episodes of this series, by title, tag, date, or some other metadata. We talked about the possibility of an OData service (CAP-powered, of course) and/or an npm module on the GitHub package registry. In this episode we’ll make some first steps towards this goal.
00:01:00 Stream starts, welcome to the Valentine’s Day stream!
00:02:30 Moving to the description of what we’re going to do on the stream today, with a bit of an explanation of how the topics we want to cover can interweave with other topics that I also want to work on, such as the Workflow Service on Cloud Foundry. As Andrew Barnard said in the chat, too – so many topics, so little time!
00:04:50 Celebrating the fact that Pieter Janssens has caught up with every episode recording, which is quite a feat! Some stickers coming your way, Pieter, congratulations!
00:05:45 Pointing out the Valentine’s Day gift I gave to M this morning – the Amicable Numbers pair of keyrings that I got from Maths Gear. Very nice!
00:07:30 Talking about the approach to what we’re setting off on doing, building as we go, making “the simplest thing that could possibly work”, a phrase I learned from one of my heroes Ward Cunningham (inventor of the Wiki and much more). You can listen to an interview with him on my Tech Aloud podcast, specifically this episode: The Simplest Thing that Could Possibly Work, A conversation with Ward Cunningham, Part V with Bill Venners.
00:09:10 Thinking of the two different forms the API could or should take – an HTTP-based service (CAP-based OData, of course) (and then perhaps a GraphQL endpoint if there’s interest) plus a JSON representation of the entityset in the form of the value exported from an installable Node.js npm module. Sort of an “online and offline” pair of API endpoints.
00:11:15 A quick shoutout to UI5con Belgium, including a tweet from Nicolas Goris listing the live stream (and recording) links.
00:12:10 Sharing a link to a great video from Harry Wolff which takes you through the new GitHub Package Registry in 10 mins, showing how to create and publish a package and then consume it. Well worth watching: How to use the GitHub Package Registry. There’s also the GitHub documentation on this subject, which is what we’re about to loosely follow to get things set up: Configuring npm for use with GitHub Packages.
00:13:20 A quick discussion on npm package scopes. Originally there was just the main npm registry (at npmjs.com) but since then there are other registries for so-called “scoped” packages, i.e. packages with a sort of namespace prefix. We know of one of those, of course, the SAP npm registry, which is at npm.sap.com and contains lots of goodness including the @sap/cds-dk package family. With initiatives like the GitHub Package Registry, package sharing and reuse is now democratised, in that anyone can publish packages in their own scope, which in this case reflects that person’s username on GitHub. So for example (and this is what we’re about to do) I can publish packages in the @qmacro scope.
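As an aside, mapping a scope to a registry is a one-liner with the npm CLI – this writes exactly the kind of entry we’ll see in my ~/.npmrc later in the episode:
→ npm config set @qmacro:registry https://npm.pkg.github.com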
00:16:30 Mentioning another GitHub feature that appeared last year, and that’s GitHub Actions, which is what I’d also like to include in this melting pot of tech and ideas. It’s a way of automating software development workflows, and is super powerful. I use it every day in the management of my own activities tracking (which I do as issues in a project in a GitHub repo, displayed and managed in a Kanban style board), and at this point I demonstrate a version of what I’d previously built – a GitHub action to add an issue to a project’s column.
The idea is that we could potentially use the GitHub Actions feature to automatically build new versions of the npm flavour of the API, and publish a new version, when another episode was added to the OData service and appeared there. But I haven’t yet thought any of that through – I think it will be fun and educational to explore this together.
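None of that is thought through yet, but just to make the idea concrete, here’s a speculative sketch, in Node.js, of the kind of script such a workflow could run – the endpoint URL is a placeholder and the file layout is assumed:
const https = require('https')
const fs = require('fs')

// Hypothetical endpoint for the (not-yet-built) OData service
const url = 'https://example.com/handsonsapdev/Episodes'

// Fetch the Episodes entityset and regenerate index.js, the module's default export
https.get(url, res => {
  let body = ''
  res.on('data', chunk => (body += chunk))
  res.on('end', () => {
    const episodes = JSON.parse(body).value // OData wraps entitysets in a "value" property
    fs.writeFileSync('index.js', `module.exports = ${JSON.stringify(episodes, null, 2)}\n`)
  })
})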
00:21:20 Starting to go through the process of publishing a simple npm package (module) on the GitHub Package Registry, in my own namespace (scope) @qmacro.
First, we generate a Personal Access Token, with the appropriate authorisation scopes set, which we can then use to authenticate the npm CLI with the GitHub Package Registry endpoint.
I have to be careful here not to reveal the token (altho I’ve already deleted it since the live stream ended) which I do by juggling screen displays around … whereupon long-time #HandsOnSAPDev family member Roberdinho reminds us that I used to use a “super sekrit” scene in OBS.
00:23:20 To store the freshly generated token I use the trusty pass program to save it in a new node.
00:24:25 Next we create a new barebones npm package hosd (short for HandsOnSAPDev of course) thus:
→ mkdir hosd && cd $_ && npm init -y
00:25:15 Now we can authenticate our local npm environment (via the npm command) with the remote package registry at GitHub, which we do like this:
→ npm login --registry=https://npm.pkg.github.com
Username: qmacro
Password: <this is where we paste the token in>
Email: (this IS public): dj.adams@pobox.com
supplying our username, token, and email address.
This results in the authentication being stored in our home .npmrc file, which we have a brief look at shortly.
00:26:50 At this stage we’re ready to start building out our fledgling package, and so compose the “simplest thing that could possibly work” in terms of what we want to export by default, in index.js:
module.exports = 42
We can test this out immediately and locally like this:
→ node
> const life = require('./index')
undefined
> life
42
>
00:28:20 Now we create a project-local .npmrc file containing a reference to the user-specific npm package registry endpoint on GitHub, that reflects “my” packages:
registry=https://npm.pkg.github.com/qmacro
00:28:50 The last step in setting this package up for publishing is to ensure the right values are specified in the package.json file, paying particular attention to the value for the name (@qmacro/hosd) and repository:url (git://github.com/qmacro/hosd.git) properties. We end up with contents like this:
{
  "name": "@qmacro/hosd",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "repository": {
    "url": "git://github.com/qmacro/hosd.git"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
00:30:10 We’re almost ready to publish the package! But first, there’s something we need to do, of course, and that’s to get the repository set up locally (on my machine) and also remotely (on GitHub). So first we set everything up locally:
→ git init
→ git add .
→ git commit -m 'initial commit'
Now we have to set up the repo on GitHub. And for this, rather than use the native GUI in the browser (yes, kids, don’t forget, native means Web … anything else is “OS-specific”! ;-)) we use one of the two (two!) command line GitHub clients.
Announced this week was the new gh, but there’s also the existing hub, which I only discovered a week or so ago, and have been enjoying using ever since.
With hub we create the repo remotely, which still feels a little bit like magic:
→ hub create hosd
Updating origin
https://github.com/qmacro/hosd
This not only creates the repo on GitHub but also adds it as a remote to this local repo, as we can see when we check:
→ git remote -v
origin git@github.com:qmacro/hosd.git (fetch)
origin git@github.com:qmacro/hosd.git (push)
We make immediate use of this by pushing our changes:
→ git push -u origin master
00:34:25 Now is the time for publishing our fledgling package!
Just before we do, we take a look at what files we have, and the sight of the project-local .npmrc file reminds me to share a peek inside the main file in my home directory, i.e. ~/.npmrc. I use cut to only show a certain slice of characters, so as not to reveal the entire token, and with that we can see the configuration I have:
→ cut -c1-45 ~/.npmrc
@sap:registry=https://npm.sap.com
@qmacro:registry=https://npm.pkg.github.com
depth=0
//npm.pkg.github.com/:_authToken=0f868fab3c0b
This includes pointers to the registries for @sap scoped packages and for @qmacro scoped packages, and in the last line you can see how the token auth is stored.
00:36:42 So, let’s do it:
→ npm publish
npm notice
npm notice 📦 @qmacro/hosd@1.0.0
npm notice === Tarball Contents ===
npm notice 20B index.js
npm notice 295B package.json
npm notice === Tarball Details ===
npm notice name: @qmacro/hosd
npm notice version: 1.0.0
npm notice package size: 335 B
npm notice unpacked size: 315 B
npm notice shasum: ca4f0f7bbd3bda931e05d49899fae6e6376b4fd0
npm notice integrity: sha512-UdkPTdS4IUvMa[...]1Fch9aOqm1Chg==
npm notice total files: 2
npm notice
+ @qmacro/hosd@1.0.0
Success! And we see this package reflected in the repo on GitHub, in the “Package” section, with instructions on how to consume it (which is basically the same as everywhere else, i.e. npm install @qmacro/hosd@1.0.0). So we give it a quick try, first by installing it:
→ mkdir consume && cd $_ && npm init -y
→ npm install @qmacro/hosd
And then by using it in a simple script:
const life = require('@qmacro/hosd')
console.log(life)
Which happily (but unsurprisingly) produces the result we want:
→ node index.js
42
00:41:35 At this stage we can start to imagine what this could eventually look like, i.e. instead of the scalar value 42, we’d export a list of objects reflecting the metadata of the HandsOnSAPDev series of episodes, a bit like this:
module.exports = [
  {
    ID: 0,
    title: "Getting our feet wet",
    topics: "adventofcode,javascript,nodemon"
  }
]
This could be the basis of a local, “train-compatible” API, but what’s more, it could be sourced from the “upstream” online API which will be in the form of a RESTful endpoint (yes, the CAP Node.js OData service that we’re about to start building).
00:44:10 Now it’s time to turn our attention to kicking off this OData service, by having a look at the metadata I gathered just before going live on the stream. It’s (for now, to keep things simple) a list of episodes, with ID, title and some comma-separated topics, held in an online spreadsheet that I can easily export to CSV.
00:44:40 I’d originally experimented with storing the metadata as issues in a GitHub repo and pulling those out programmatically, but while GitHub repo issues are awesome, they didn’t quite stretch to the various bits of metadata I wanted to store about the episodes. I did however show how to pull issues, in the terminal, on the command line, from a repo, with the new gh command line tool mentioned earlier, which works out which repo is relevant based on the local repo you’re already in (in this instance, qmacro/test):
qmacro@penguin:~/local/projects/github.com/qmacro/test
→ gh issue list
Issues for qmacro/test
#35 handsonsapdev demo issue (bug)
#34 and another issue, hurray (episode, javascript, nodejs)
#33 new issue (cap, episode)
(I don’t particularly like the “Issues for qmacro/test” heading that is output; it reminds me of how the Cloud Foundry command line tool cf produces output, with headers that also get in the way, but anyway, what do I know?)
00:46:30 Starting a new CAP project to kick things off from the OData service perspective, and at least at the outset I intend to follow the “simplest thing that could possibly work” approach again. There are many ways to organise data like this (episodes, topics, and many-to-many relationships between them), including the use of link entities, as described in the Domain Models – Managed Many-To-Many Associations section of the CAP documentation.
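For the record, the link-entity flavour would look roughly like this – a sketch of the pattern from that documentation section, with assumed entity names, not something we build today:
entity Episodes {
  key ID : Integer;
  title  : String;
  topics : Composition of many EpisodeTopics on topics.episode = $self;
}

entity Topics {
  key name : String;
}

entity EpisodeTopics {
  key episode : Association to Episodes;
  key topic   : Association to Topics;
}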
But for now let’s keep it simple, and just have a single String type property with the topics in there, comma-separated. That’s not half as bad as it seems, as we’ll be able to use the power of OData querying, such as the contains function within a $filter system query option.
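Once the service is up and running later in the episode, that sort of query would look something like this (the filter syntax is standard OData V4; the endpoint is the one we define below):
→ curl "http://localhost:4004/handsonsapdev/Episodes?\$filter=contains(topics,'cap')" | jq -rc '.value'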
00:47:30 Checking we have the latest @sap/cds-dk installed globally (remember the depth=0 setting in ~/.npmrc is limiting the output here to something much more useful):
→ npm list -g
/home/qmacro/.config/nvm/versions/node/v10.17.0/lib
├── @sap/cds-dk@1.4.4
├── mbt@1.0.8
└── npm@6.13.7
we initiate a new CAP Node.js project thus:
→ cds init handsonsapdev
[cds] - creating new project in ./handsonsapdev
done.
...
and add the simplest combination of data model and service definition that we can get:
db/schema.cds:
namespace qmacro;

entity Episodes {
  key ID : Integer;
  title  : String;
  topics : String;
}
srv/service.cds:
using qmacro from '../db/schema';

service handsonsapdev {
  entity Episodes as projection on qmacro.Episodes;
}
We do take a look at the CodeList aspect as something we might use at some stage for the management of the topics, but save that up for another time.
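Just to capture the idea for later, a CodeList-based version of topic management might look something like this – a sketch, assuming the aspect from @sap/cds/common:
using { sap.common.CodeList } from '@sap/cds/common';

// Topics as a code list; the aspect contributes localized name and descr elements
entity Topics : CodeList {
  key code : String;
}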
00:52:10 We bring the downloaded CSV file into the right place, with the right name (db/csv/qmacro-Episodes.csv of course), install the npm sqlite3 package, and deploy. And lo & behold, we have our OData service. Thanks CAP!
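For reference, CAP picks up CSV files whose names match namespace and entity, with a header line matching the property names; semicolons as separators are handy here, since the topics values themselves contain commas. A sketch of the first couple of lines, reconstructed from the entityset data below:
ID;title;topics
0;Getting our feet wet;javascript,nodemon,nodejs,adventofcode
1;Setting up for the Node.js flavoured version of SAP’s Cloud Application Programming Model;nodejs,cap,npm,vscode,editorconfig,cf,cloudfoundry,fzf,jq,cds
And the install-and-deploy steps are the usual ones for a local SQLite-backed CAP service of this era, roughly:
→ npm install sqlite3
→ cds deploy --to sqlite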
00:56:20 At this stage we can grab the value of the value property in the JSON representation of the Episodes entityset, using the classic combo of curl and jq, like this:
→ curl http://localhost:4004/handsonsapdev/Episodes | jq -rc '.value'
[{"ID":0,"title":"Getting our feet wet","topics":"javascript,nodemon,nodejs,adventofcode"},{"ID":1,"title":"Setting up for the Node.js flavoured version of SAP’s Cloud Application Programming Model","topics":"nodejs,cap,npm,vscode,editorconfig,cf,cloudfoundry,fzf,jq,cds"},{"ID":2,"title":"Starting to build a bookshop backend service with CAPM","topics":"bookshop,cap,nodejs,cds"},{"ID":3,"title":"Special guest edition: Interview with core CAPM developers from the mothership!","topics":"guest,christiangeorgi,vscode,cap,cds"},{"ID":4,"title":"Debugging CAPM and a look at the cds REPL","topics":"nodejs,repl,debugging,cap,cds"},{"ID":5,"title":"Continuation ...","topics":"businessrules,cloudplatform,cloudfoundry,api,curl,netrc,ansible,raspberrypi"},{"ID":50,"title":"Calling our Business Rule viathe runtime API","topics":"businessrules,cloudplatform,cloudfoundry,api,curl,netrc,bash,oauth,cds,cap"},{"ID":51,"title":"More fun with Business Rules API and OAuth","topics":"businessrules,cloudplatform,cloudfoundry,api,curl,netrc,bash,oauth,apihub,javascript"},{"ID":52,"title":"Tidying up the JS script for Business Rules OAuth flow","topics":"ssh,javascript,businessrules,api,oauth,docker,dry"}]
(output redacted)
00:57:50 Adding this JSON to the @qmacro/hosd package’s index.js, i.e. setting the data to be the thing that is exported by default, we can now access that data with require. Lovely!
00:59:10 Incrementing the patch level of the @qmacro/hosd package (in package.json) and republishing the package, we then use npm update in our consumer project and see that it has indeed worked:
→ node
> const eps = require('@qmacro/hosd')
undefined
> eps.filter(x => x.topics.match(/businessrules/))
[ { ID: 40,
title: 'Catchup from SAP TechEd',
topics:
'cap,cds,composition,association,deepinsert,capcommunityrepo,teched,apihub,businessrules' },
{ ID: 41,
title: 'Business Rules API on the BTP',
topics: 'businessrules,cloudplatform,neo,bash,curl,api' },
{ ID: 44,
title: 'Business Rules in Cloud Foundry – setting things up',
topics: 'cloudplatform,cloudfoundry,businessrules' },
{ ID: 48,
title: 'Continuing with the Business Rules service on CF',
topics: 'businessrules,mta,cloudplatform,cloudfoundry,cf,mbt' },
{ ID: 49,
title: 'Defining our first Business Rules project',
topics:
'businessrules,cloudplatform,cloudfoundry,api,curl,netrc,ansible,raspberrypi' },
{ ID: 50,
title: 'Calling our Business Rule via the runtime API',
topics:
'businessrules,cloudplatform,cloudfoundry,api,curl,netrc,bash,oauth,cds,cap' },
{ ID: 51,
title: 'More fun with Business Rules API and OAuth',
topics:
'businessrules,cloudplatform,cloudfoundry,api,curl,netrc,bash,oauth,apihub,javascript' },
{ ID: 52,
title: 'Tidying up the JS script for Business Rules OAuth flow',
topics: 'ssh,javascript,businessrules,api,oauth,docker,dry' } ]
>
Wonderful!
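As a footnote on why npm update was all we needed in the consumer project: the earlier npm install @qmacro/hosd recorded a caret version range in that project’s package.json, and a patch-level bump still satisfies it. The relevant fragment looks something like this (assuming npm’s default save behaviour):
"dependencies": {
  "@qmacro/hosd": "^1.0.0"
}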
That’s it for this episode – a lot of stuff to take in, but hopefully it all fits together in our brains. And sets us up for some interesting learning and discovery together too.
Thanks for joining and taking part … and remember – #TheFutureIsTerminal!
Hi DJ Adams,
I was trying to create an application using the CAP and CDS framework provided by SAP. I am using Node.js for it. But in the process I have come across a problem. Can you please suggest what I should do about it?
The Problem:
I am returning a promise from one of srv.before’s event handlers. The promise will be resolved after getting some data from the database, and will be rejected if there is no data in the db table. I also have to do some processing on the fetched data before resolving the promise.
My code snippet uses ResourceModel.getById(); cds.transaction() is one of the methods provided by SAP’s cds framework. What I expect it to do is: after successfully fetching all the records, it should call the provided callback, where I have my processing logic, and call resolve() after the processing is done. I am receiving the data in the callback, and am able to print “Data available” and the received data object, and even “Promise Resolved”, on the console. But surprisingly the promise is not getting resolved. I can say “the promise is not getting resolved” because this returned promise will be collected in a Promise.all() by the framework (as stated in the official docs: https://cap.cloud.sap/docs/node.js/api#service-before), which returns a response after resolving. But I am not getting any response (neither success nor failure); Postman stays in the loading... state forever. The above works fine with a setTimeout() example. Where am I going wrong?
I have also posted the question on Stack Overflow; here is the link: https://stackoverflow.com/questions/60246014/promise-is-not-resolving-even-after-calling-resolve
Please help, if you can.
Hi Arunava - thanks for getting in touch. But a comment to a blog post is not the ideal place for asking a detailed question. I'd suggest you ask it in the Q&A community section - there's a specific tag for CAP there too: https://answers.sap.com/tags/9f13aee1-834c-4105-8e43-ee442775e5ce
Asking a question there will perhaps get you help quicker, and the question and answer will benefit others too. Cheers!
Hi Arunava, did you try to plug your code into an on handler instead? I think that one is better suited for your task. Also be aware of cds.transaction/cds.tx: if you don’t provide the req object as an argument, you’ll need to commit it yourself. So the better logic would be roughly as in the sketch below – can you try something like that? Btw: it would be better if you just return the Promise, as our framework will then return an empty array; that’s the standard behaviour according to the OData specification. Thanks a lot and best regards, David
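A minimal sketch of the sort of thing David is suggesting, assuming the CAP Node.js API of this era; the handler, entity and model names are hypothetical:
const cds = require('@sap/cds')
const { SELECT } = cds.ql

module.exports = srv => {
  // An on handler, with req passed to cds.transaction so the framework
  // manages commit/rollback for us (hypothetical "Resources" entity)
  srv.on('READ', 'Resources', async req => {
    const tx = cds.transaction(req)
    const rows = await tx.run(SELECT.from('my.app.Resources'))
    if (rows.length === 0) return req.reject(404, 'No data found')
    // ...any processing on the fetched rows goes here...
    return rows // returning the (promised) result is all that's needed
  })
}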