This is a searchable description of the content of a live stream recording, specifically “Episode 27 – Continuation of the Cloud Platform Workflow project” in the “Hands-on SAP dev with qmacro” series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the “Catch the replays” section of the series blog post.
This episode, titled “Continuation of the Cloud Platform Workflow project”, was streamed live on Fri 31 May 2019 and is approximately one hour in length. The stream recording is available on YouTube.
Below is a brief synopsis, and links to specific highlights – use these links to jump directly to particular places of interest in the recording, based on ‘hh:mm:ss’ style timestamps.
In this episode we continue where we left off in Ep.25, to build out the workflow scenario on SAP Cloud Platform, exploring features as we go for service and script tasks, and more.
Links to specific highlights
00:03:03: Looking at where we left off last time – in the workflow editor in the SAP Web IDE, in the middle of creating a service task to access an OData service on the ES5 system, via the SAP Cloud Connector that we’d set up, running in a Docker container. We have a brief look inside the Dockerfile that was used to create the image from which the container was instantiated.
00:08:40: Checking what we have via the SAP Cloud Connector, by looking at the “Cloud Connectors” page in the SAP Cloud Platform Cockpit – an exposed backend system with the virtual name “virtuales5” on port 8000, with all resources at /sap/opu/odata and below available.
00:10:00: Similarly, checking the destination “shopservice” that we set up, we see that it uses this backend system to point to a specific OData service; with the additional property sap-client=002, the destination URL translates to the full URL of that service on ES5. (Don’t forget, you’ll need a logon to ES5 to access this URL.)
00:12:45: Reviewing our “destinationproxy” test HTML5 app, specifically the details inside the application descriptor (neo-app.json), which include a route that points to the destination referred to above.
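For reference, a route of that shape in neo-app.json would look something like this – a minimal sketch, in which the path prefix is illustrative; only the destination name “shopservice” comes from the stream:

```json
{
  "routes": [
    {
      "path": "/backend",
      "target": {
        "type": "destination",
        "name": "shopservice"
      },
      "description": "Proxy requests to the shopservice destination"
    }
  ]
}
```

Requests to /backend/... in the app are then proxied through the destination (and so through the SAP Cloud Connector tunnel) to the backend service.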
00:15:03: Following an observant comment from Fabien about the scheme in a destination URL, we talk briefly about the use of HTTP (vs HTTPS) in an SAP Cloud Connector based situation, where the traffic is secured not at the protocol level (i.e. not with HTTPS) but at the tunnel level.
00:17:00: Looking briefly at the SAP Fiori launchpad site we set up with the SAP Cloud Platform Portal service last time – which contains the 4 tiles related to the Workflow service.
00:18:45: Now turning our attention to the workflow definition as we left it at the end of the previous episode, we start to add details to the service task’s properties to point it to where we want it to go.
00:22:20: Looking at the Products entityset, and changing from the default representation content type (which is XML) to our preferred JSON representation (with query parameter $format=json).
As a small digression, I have a bit of a rant about how people distinguish between OData and “REST APIs”, suggesting they’re different, and implying that OData is less RESTful. That’s nonsense. In fact, one of the lovely things about OData is that it’s a formalised approach to a REST API. Point me to something that’s called a “REST API” and I’ll either point out what that API does not do that is RESTful (or does do that is not RESTful) or point out 10 other APIs that are designated REST APIs that are in fact not. Or different in approach. The point is that REST is not a definition of a protocol, or how a protocol (or API) should work. REST is an architectural style, and OData is as much a REST API as any other so-called REST APIs. So there – stick that in your pipe and smoke it 😉
00:24:50: Using the relative path of the URL we just tried in the browser, i.e. Products('HT-1000'), and specifying it in the Path property for the service task definition, noting also that right now, the service task mechanism doesn’t support the sap-client additional property that one can define in the destination on the SAP Cloud Platform (so we end up appending ?sap-client=002 to the Path property’s value).
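Since the service task ignores the destination’s additional properties, the workaround is to append the client as a query parameter yourself. Here’s a small sketch of that composition in JavaScript (the helper name is mine, not from the stream):

```javascript
// Compose a service task Path value, appending sap-client explicitly,
// because the service task mechanism doesn't pick it up from the
// destination's additional properties.
function serviceTaskPath(relativePath, sapClient) {
  // Use '&' if the relative path already carries query parameters
  const separator = relativePath.includes("?") ? "&" : "?";
  return relativePath + separator + "sap-client=" + sapClient;
}

// The path used in the episode
console.log(serviceTaskPath("Products('HT-1000')", "002"));
// → Products('HT-1000')?sap-client=002
```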
00:26:25: Taking a small digression into the difference between OData operations ‘READ’ and ‘QUERY’, and noting that both map to the HTTP method ‘GET’. Going deeper into how OData maps onto familiar technologies, we look at the difference between the (XML) representations of responses to READ and QUERY operations, where the former have <entry> as the root element and the latter have <feed>, exposing the Atom and Atom Publishing Protocol ancestry of OData quite nicely.
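To make the distinction concrete, here are heavily abbreviated sketches of the two shapes (element content elided; the details are illustrative):

```xml
<!-- READ (single entity): the root element is <entry> -->
<entry xmlns="http://www.w3.org/2005/Atom">
  <title>Products('HT-1000')</title>
  <content type="application/xml"><!-- entity properties --></content>
</entry>

<!-- QUERY (entityset): the root element is <feed>, containing entries -->
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><!-- first product --></entry>
  <entry><!-- second product --></entry>
</feed>
```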
If you’re interested in learning more about this, I recommend the SAP Developers tutorial Learn about OData Fundamentals, which is part of the tutorial group Get an Introduction to OData, which in turn is part of the mission Take a Deep Dive into OData.
00:27:14: I don’t know what was in my coffee that day but I have a micro rant also about the major difference between “blog” and “post”. Goodness me!
00:29:25: Deploying the workflow definition to the SAP Cloud Platform Workflow service for the first time (where we see that the definition is given an incrementing ID, starting at 1).
We can see the definition using the “Workflow Monitor – Workflow Definitions” app. This is where we can start a new instance with sample data, which we do, and then jump to look at the instance, which of course is not found directly as the status filter omits instances in “Completed” status … and of course this is the status our newly created instance already has!
00:32:38: Looking at the details of the completed instance, including the Execution Log and the Workflow Context, which shows the data that was retrieved via the service task stored in the productdata property. Success!
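As an illustration of the shape of that context (the product field names and values here are illustrative; only productdata and the OData V2 ‘d’ wrapper come from the stream):

```json
{
  "productdata": {
    "d": {
      "Id": "HT-1000",
      "Name": "Notebook Basic 15"
    }
  }
}
```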
00:34:10: Wondering about why JSON representations of OData resources always start with a property called ‘d’. If you want to know why this is, look at the subsequent episode (https://bit.ly/handsonsapdev#ep28) where I explain (I had to research it between this episode and the next one).
00:35:40: Now defining a user task, and making use of the fact that we can include JUEL expressions in the values for some of the properties in this user task.
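The exact expressions from the stream aren’t captured in these notes, but a typical JUEL expression in a user task property references the workflow context like this (the property path assumes the productdata structure we saw earlier):

```
Approve order for ${context.productdata.d.Name}
```

At runtime the expression is evaluated against the workflow instance’s context, so the subject of the task reflects the data retrieved by the service task.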
00:38:05: Specifying the details of the user interface in this user task, and talking about the difference between specifying a custom-built UI5 Component for the user interface and using a UI based on the Forms feature, which is super easy to create. We opt for a Forms-based UI in this case to show some of the product data that’s been retrieved and stored in the context.
There was a question as to whether the texts and labels defined in Workflow forms can be translated; I asked internally after the stream, and found that it’s currently not supported, but is definitely on the backlog and being prioritised.
00:44:10: After deploying, we start a new workflow instance and examine it – it’s in the “Running” status as the user task is, of course, still pending.
00:45:10: In the last part of this live stream we now try to bring in a generic user task UI Component that I’d previously built (before the advent of Forms), just to show how one would do it.
Called “genericusertask”, it lives in another instance (and workspace) of the SAP Web IDE so I export it from there and bring it into my SAP Web IDE that I’m using for this live stream.
Looking at the application descriptor in this “genericusertask” project, we see that there’s a route reference to destination “bpmworkflowruntime”, which was added automatically when I enabled the Workflow service. Of course, we can guess from the name of this destination that it points to the Workflow API, which we’ll need in the user task UI Component to access the task instance data.
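A sketch of how a UI Component might address task instance data through that route – the v1/task-instances path shape is what I’d expect behind the bpmworkflowruntime destination, so treat it as an assumption, and the helper name is mine:

```javascript
// Build the URL for fetching a task instance's context via the
// bpmworkflowruntime destination route defined in neo-app.json
// (the path shape is assumed, not taken from the stream).
function taskContextUrl(taskInstanceId) {
  return "/bpmworkflowruntime/v1/task-instances/"
    + encodeURIComponent(taskInstanceId)
    + "/context";
}

console.log(taskContextUrl("abc-123"));
// → /bpmworkflowruntime/v1/task-instances/abc-123/context
```

The Component can then fetch that URL (for example with a JSONModel) to populate the task UI with the instance’s context data.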
00:56:30: We deploy the “genericusertask” app to the SAP Cloud Platform, so we can use it in the workflow user tasks, and then modify the user task specification in the workflow definition to use this UI Component instead of the form we created.
At this point we start to dig into how the UI Component starts up, by debugging the loading of the Component. But we run out of time, learning the lesson again that if you’re going to start a debugging session, the one thing (apart from coffee) that you need to ensure you have is … time.
So we leave the debugging until the next episode!