DJ Adams

Annotated links: Episode 28 of Hands-on SAP dev with qmacro

This is a searchable description of the content of a live stream recording, specifically “Episode 28 – Digging into Workflow user task UIs” in the “Hands-on SAP dev with qmacro” series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the “Catch the replays” section of the series blog post.

This episode, titled “Digging into Workflow user task UIs”, was streamed live on Wed 05 Jun 2019 and is approximately one hour in length. The stream recording is available on YouTube.

Below is a brief synopsis, and links to specific highlights – use these links to jump directly to particular places of interest in the recording, based on ‘hh:mm:ss’ style timestamps.

Brief synopsis

Continuing on from the previous episode, we dig a little deeper into user task UIs, looking at the Workflow API and task info, as well as the My Inbox API, to understand how the generic UI component actually works.

00:02:50: Mentioning Test-Driven Development with ABAP webinar which was to take place directly after this live stream episode – obviously it’s over now, but you should definitely search for the recording and check it out if you can.

00:03:55: Looking briefly at a tweet via John Patterson about a blog post that was doing what I’ve seen others do before, which is to treat an OData service as a plain JSON endpoint – technically you can do that, but you probably don’t really want to. Helpfully, community member Jakob Marius Kjær provides some hints and even some sample code to nudge the blog post author in the right direction. Nice work Jakob!

00:07:28: Talking about the recent news that Apple will replace the default Bash shell with the Z shell in upcoming releases of macOS, noting that they have different licences, which may be part of the reason for the move on Apple’s part.

00:09:45: Looking at where we left off last time, which is that we’re at the stage where we have a service task and a user task in the definition of the workflow that we’re building in the SAP Web IDE.

We have a running workflow instance, with a user task waiting for us in My Inbox, referencing product HT-1000, but (as we remember from the end of the previous episode), we note that the breakpoint we’d set wasn’t being reached.

00:14:13: Starting to figure out why the breakpoint wasn’t being reached, as well as looking at the bindings in the view, and what they are – there are bindings to the default model and to the “app” model.

00:16:10: In looking at the component startup, I mention that there’s a specific blog post on this topic in my “Discovering SCP Workflow” series – called Component Startup.

00:20:50: In the course of debugging, we look briefly at the documentation for the actual Workflow API we are using, which is available on the SAP API Business Hub: Workflow API for Neo – the actual resource we’re retrieving is this one:


00:23:21: It turns out that the breakpoint wasn’t being reached as there were some undefined values that were needed. So at this stage we add a script task to the workflow definition to specify those values.

Having inserted the script task before the user task, we comment out the sample code inside it and specify the following, based on analysis of what’s required:

$.context.genericusertask = {
  control: {
    source: "/productdata/d",
    properties: [ "Name", "StockQuantity", "AverageRating" ]
  }
};

00:28:28: We create a new instance of the workflow definition to test this out (obviously, because the existing instance doesn’t have this required data in its context).

This time we see that the breakpoints are indeed reached, and on continuing, the task UI correctly shows what we’re expecting, i.e. the data from the three product properties. Nice!

00:30:20: A quick digression into something we talked about in the previous episode, i.e. why JSON representations of OData start with ‘d’. Basically, the focus is not on the choice of ‘d’ for the property name, it’s on the choice of actually embedding the real data within an object (a map) in the resource returned, to foil attempts at cross-site scripting attacks.

For more information about this, see the section “JSON Representations” in the JSON Format (OData Version 2.0) documentation.
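To make that shape concrete, here’s a tiny sketch (the payload is made up, not taken from the episode) of a ‘d’-wrapped OData V2 JSON response, and how you’d reach through the wrapper with jq:

```shell
# A hypothetical OData V2 JSON response: the real payload sits inside the "d" wrapper
echo '{"d":{"Name":"Notebook Basic 19","StockQuantity":25}}' | jq -r '.d.Name'
# → Notebook Basic 19
```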

00:33:10: We now switch back to the Workflow API home on the SAP API Business Hub, to see if we can work out how to find the workflow instance that’s still running.

While there’s a sandbox API environment to try things out on generically, you can also define your own environments, and that’s what we do here, to reflect our SAP Cloud Platform trial subaccount.

00:36:43: We first make a more general call to the /v1/task-instances endpoint to list all of our task instances (including those already in “Completed” status, for example).

00:37:35: It’s worth at this stage trying the API out directly (i.e. not within the SAP API Business Hub), and what better way than to use curl. It turns out that curl supports the .netrc standard (for storing and using authentication details), which you can read about here:
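As a sketch of what such a file looks like (the hostname and credentials here are placeholders, not real values), a .netrc entry has three fields per machine; curl picks the file up from ~/.netrc when invoked with -n / --netrc:

```shell
# Write a sample .netrc-style file (placeholder host and credentials);
# in real use this content lives in ~/.netrc, with permissions set to 600
cat > netrc.sample <<'EOF'
machine mytrial.hana.ondemand.com
login myuser
password mysecret
EOF
chmod 600 netrc.sample
```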

00:39:48: We invoke curl against the /v1/task-instances endpoint thus (where <workflow-api-base-url> stands in for your own Workflow API root URL):

curl -n "<workflow-api-base-url>/v1/task-instances"

We get all sorts of data back, albeit in a way that’s hard to read as a human.

00:40:45: So, enter jq, which is described as a “lightweight and flexible command-line JSON processor”. I’ve mentioned jq before and even recorded a short video on it: “Using jq to parse out SCP destination info”. This will enable us to more easily read the JSON output and pull data from it.

We start first with “the simplest thing that could possibly work”* and pipe the output from curl into jq:

curl -n "<workflow-api-base-url>/v1/task-instances?\
  %24orderby=createdAt%20asc" | jq

*one of my favourite phrases, coined, or at least nurtured, by that hero of great technology and thinking, Ward Cunningham (who, amongst other things, is the father of the Wiki).

This gives us a much more readable layout:

[
  {
    "activityId": "usertask1",
    "claimedAt": null,
    "completedAt": "2019-05-30T10:15:29.119Z",
    "createdAt": "2019-05-30T09:28:32.853Z",
    "description": "Please review this request for 25 of stock item Notebook Basic 19.",
    "id": "4ae5e1b5-82bd-11e9-9df2-00163e8e2888",
    "processor": null,
    "recipientUsers": [ "..." ],
    "recipientGroups": [],
    "status": "CANCELED",
    "subject": "Request for Notebook Basic 19",
    "workflowDefinitionId": "orderprocess",
    "workflowInstanceId": "49075dac-82bd-11e9-9df2-00163e8e2888",
    "priority": "MEDIUM",
    "dueDate": null,
    "createdBy": "P2001351149",
    "definitionId": "usertask1@orderprocess",
    "lastChangedAt": "2019-05-30T10:15:29.119Z"
  },
  {
    "activityId": "usertask1",
    "claimedAt": null,
    "completedAt": "2019-05-30T10:15:21.280Z",
    "createdAt": "2019-05-30T09:34:57.591Z",
    "...": "..."
  }
]

00:43:20: As we’ve digressed slightly to look at jq, we might as well digress a little further and have a look at involving the excellent fzf, which in fact we looked at briefly in Ep.1!

First, we supply jq with some instructions, thus (noting the addition of the -r parameter to ask for “raw” output that will honour the “\t” tab characters):

jq -r '.[] | "\(.id)\t\(.status)\t\(.subject)"'

This gives us:

4ae5e1b5-82bd-11e9-9df2-00163e8e2888  CANCELED  Request for Notebook Basic 19
3037056b-82be-11e9-9df2-00163e8e2888  CANCELED  Request for Notebook Basic 19
bde70c73-82c3-11e9-9df2-00163e8e2888  CANCELED  Request for Notebook Basic 19
e05c9cd0-82c3-11e9-9df2-00163e8e2888  CANCELED  Request for Notebook Basic 19
2125625c-82cf-11e9-9df2-00163e8e2888  COMPLETED  Request for Notebook Basic 19
84c1b859-82cf-11e9-9df2-00163e8e2888  COMPLETED  Request for Notebook Basic 19
e54a0188-8377-11e9-a4b8-00163e8e2aef  COMPLETED  The Notebook Basic 15 !
b7d42a9d-8378-11e9-a4b8-00163e8e2aef  CANCELED  The Notebook Basic 15 !
0839d7ed-82c4-11e9-9df2-00163e8e2888  READY  Request for Notebook Basic 19

This allows us to search quickly for the items given, say, a status criterion.
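The same jq filter can be tried on a small inline sample (made-up data, just to show the shape of the tab-separated output that fzf then lets us filter):

```shell
# Run the id/status/subject filter over a one-element sample array
echo '[{"id":"abc-123","status":"READY","subject":"Request for Notebook Basic 19"}]' \
  | jq -r '.[] | "\(.id)\t\(.status)\t\(.subject)"'
# → abc-123	READY	Request for Notebook Basic 19
```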

00:46:40: We go one step further now and use cut to modify what fzf gives us, thus:

jq -r '.[] | "\(.id)\t\(.status)\t\(.subject)"' | fzf | cut -f1

to give us just the ID of the item selected. Lovely.
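Since fzf echoes the selected line to stdout, cut simply trims it down to the first tab-separated field; in isolation (with a made-up line standing in for the fzf selection) that looks like:

```shell
# cut -f1 keeps only the first tab-separated field, i.e. the task instance id
printf 'abc-123\tREADY\tRequest for Notebook Basic 19\n' | cut -f1
# → abc-123
```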

00:48:00: Moving on to what we really want to do, which is to modify the status of an instance, we look at the following API endpoint (and method):

PATCH /v1/task-instances/{taskInstanceId}

In order to make an API call that has side effects, we need a Cross Site Request Forgery (CSRF) token, and we can retrieve one with another Workflow API endpoint:

GET /v1/xsrf-token

00:51:15: So we do that, using curl in a script (called “fetch”) like this (we could just write the whole thing out on the command line, but having it in a script like this makes it easier to remember and modify):



curl \
        --netrc \
        --header "X-CSRF-Token: Fetch" \
        --cookie-jar cookiejar.dat \
        --verbose \
        "<workflow-api-base-url>/v1/xsrf-token"

Running this gives us back a token in the X-CSRF-Token header in the HTTP response, a token that we then save in an environment variable:

export CSRFTOKEN=...
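As an aside (my own embellishment here, not something we did in the episode), the copy-and-paste step could be scripted: have curl dump the response headers and pull out the token value with awk. The hypothetical helper below works on any stream of HTTP headers:

```shell
# Hypothetical helper: read HTTP response headers on stdin and print the
# X-CSRF-Token value (case-insensitive header match, trailing CR stripped)
extract_token() {
  awk -F': ' 'tolower($1) == "x-csrf-token" { print $2 }' | tr -d '\r'
}

# e.g. curl -sn -H "X-CSRF-Token: Fetch" -D - -o /dev/null <url> | extract_token
printf 'X-CSRF-Token: abc123def456\r\n' | extract_token
# → abc123def456
```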

00:52:15: This is then referenced in a second script “post”, that looks like this:



curl \
        --netrc \
        --header "Content-Type: application/json" \
        --header "X-CSRF-Token: $CSRFTOKEN" \
        --cookie cookiejar.dat \
        --request PATCH \
        --verbose \
        --data @data.json \
        "<workflow-api-base-url>/v1/task-instances/<task-instance-id>"

(where <task-instance-id> is the ID of the task instance for which we want to modify the priority).

The content of the data.json file referred to in the --data parameter looks like this:

{
  "priority": "VERY_HIGH"
}

Note that the default HTTP method used when the --data parameter is specified is “POST”, so we needed to explicitly specify the “PATCH” method with the --request parameter.

00:57:18: On execution, the response is returned quickly, and note the HTTP 204 status code, signifying a successful result with no content returned. Checking in the My Inbox app we see that the user task’s priority is now marked as “Very High” – success.

That brings us to the end of this episode, which I thought was quite exciting. Hope you did too!

      Jakob Marius Kjær

Thanks for the mention DJ Adams. We can all do our part to help out the community and learn together. Worst case, we can start a new series of “SAPUI5 gore”, similar to the “Share your ABAP gore” blog that we also discussed on the latest episode of the SAP Coffee Corner Radio podcast.