qmacro
Developer Advocate
This is a searchable description of the content of a live stream recording, specifically “Episode 25 - System reset” in the “Hands-on SAP dev with qmacro” series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the “Catch the replays” section of the series blog post.

This episode, titled “System reset”, was streamed live on Wed 24 May 2019 and is approximately one and a half hours in length. The stream recording is available on YouTube.



Below is a brief synopsis, and links to specific highlights - use these links to jump directly to particular places of interest in the recording, based on ‘hh:mm:ss’ style timestamps.

Brief synopsis


The past week has been a little bit hectic and distracting, so I use this episode as a sort of “system reset” to figure out where things are, what we want to work on, and share some items that have come up recently.

00:00:30: Coming to you live from the shared offices of Innov8ion + iQibt in Utrecht, just before the start of the SAP CodeJam on the SAP Cloud Application Programming Model that day. It turns out that one of the live stream family members, roberdinho, is due to attend the CodeJam too!

00:02:00: Saying hello to wim.snoep2, or at least trying to, as I fail to manage my audio equipment properly - doh!

00:06:05: Talking of the Elgato Stream Deck, which is a handy hardware device for managing activities while live streaming - for example, switching between scenes in OBS.

00:06:55: Starting with rsletta’s dotfiles which inspired me to get my act together and share some of my own dotfile stuff. I’ve tidied everything up and republished my dotfiles in the form of a repo in the new SAP-samples namespace on GitHub: https://github.com/SAP-samples/devenv-dotfiles-qmacro.

As you can tell from the name of this namespace, it’s primarily for us to be able to publish sample code (and for folks to find it).

My dotfiles are based upon Mathias Bynens’ dotfiles, which Ronnie drew my attention to via his own dotfiles repo. I liked and used the bootstrap script, which uses rsync, and which we look at briefly here.
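
The heart of that kind of bootstrap is a single rsync invocation that copies the dotfiles into your home directory while excluding the repo’s housekeeping files. As a rough sketch of the idea (not a copy of either repo’s actual script - the excludes and options will differ):

# copy everything except repo housekeeping into $HOME (illustrative excludes)
=> rsync --archive --verbose --exclude ".git/" --exclude "README.md" --exclude "bootstrap.sh" . ~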

00:10:50: A nice question from helmut.tammen2 on whether it’s possible to “follow” an entire namespace on GitHub. If you know the answer, please let us know in the comments to this post!

00:12:00: I’ve started to use a new plugin manager for Vim - vim-plug - which is working very nicely for me.
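
If you want to give vim-plug a go yourself, installation is just a matter of placing a single plug.vim file into Vim’s autoload directory; this is the one-liner from the vim-plug README (for Vim rather than Neovim):

=> curl -fLo ~/.vim/autoload/plug.vim --create-dirs \
     https://raw.githubusercontent.com/junegunn/vim-plug/master/plug.vim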

00:14:35: There’s a new version of the cds command line client - 3.10.0. The updates appear every 3 or 4 weeks.
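
If you want to pick up the new version, and assuming you have the SAP NPM registry configured for the @sap scope (as I do), updating the globally installed package looks something like this - a sketch, your setup may differ:

# point the @sap scope at the SAP NPM registry (one-time setup)
=> npm config set @sap:registry https://npm.sap.com
# update the globally installed cds command line client
=> npm install --global @sap/cds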

The CAP team have also started to make available one of the SAP CDS packages which until now wasn’t directly available in the SAP NPM registry (it was previously only available inside the VS Code extension archive). This is the @sap/cds-lsp package which, as you may know from previous episodes, is the code that provides the language server for CDS, speaking the Language Server Protocol. Hurray!
=> npm info @sap/cds-lsp

@sap/cds-lsp@2.1.4 | SEE LICENSE IN LICENSE.txt | deps: 8 | versions: 2
Language server for CDS

dist
.tarball: https://npm.sap.com/@sap/cds-lsp/-/cds-lsp-2.1.4.tgz
.shasum: 5d9a7720a6278cc4299271194229b9c4f6b616e9
.integrity: sha512-NAoXcRviGbFMHZZwg7dY4+VNoocS4KABIsxbPwMS7PhYCig1naT6xCKac+VCUQYpQOiatvQgHooFrZOjz7lo9g==

dependencies:
@sap/cds-compiler: 1.15.0
@types/antlr4: 4.7.0
fs.realpath: 1.0.0
ignore: 5.0.4
ts-md5: 1.2.4
vscode-languageserver-protocol: 3.14.1
vscode-languageserver: 5.2.1
vscode-uri: 1.0.6

maintainers:
- https-support.sap.com <do.not.reply@sap.com>

dist-tags:
latest: 2.1.4

published a week ago by https-support.sap.com <do.not.reply@sap.com>

Now we can start to think about how we best incorporate this package into the vim-cds Vim plugin.
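
As a first, exploratory step (just a sketch of how I might approach it, not a worked-out integration), I’d probably pull the package down locally and have a look at what it ships, to see how the language server is started:

# install the package locally and poke around in its contents
=> npm install @sap/cds-lsp
=> ls node_modules/@sap/cds-lsp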

00:19:48: Starting to think about sharing my preparation work for another (new) SAP CodeJam on SAP Cloud Platform in general, and in particular on connectivity and the SAP Cloud Platform Workflow service, and how I might share that preparation in combination with the live stream.

00:21:18: Looking at a tweet from Nabi pointing to a post in a series of posts that he’s been writing: “Installing SAP Cloud Connector into Docker and connecting it to SAP Cloud Platform”.

This is his repo on GitHub: https://github.com/nzamani/sap-cloud-connector-docker.

The idea of using Docker specifically, and a container-based approach in general, is great - it abstracts and neutralises any OS-specifics and is, in my opinion, very aligned with the cloud-first approach to computing.

00:23:40: We also look at the blog post describing the availability of the ES5 system which works well as a classic ABAP stack backend system to connect to with the SAP Cloud Connector. While it is actually on the public Internet, we can “pretend” it’s an on-prem system and consume information from it from a service task in the Workflow service via SAP Cloud Connector powered connectivity.

00:26:00: Cloning Nabi’s “sap-cloud-connector-docker” repo into my dockerbuilds/ directory.
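
For reference, that step is nothing more than this (the dockerbuilds/ directory is just my own local convention; yours may be called something else):

=> cd ~/dockerbuilds
=> git clone https://github.com/nzamani/sap-cloud-connector-docker.git
=> cd sap-cloud-connector-docker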

00:27:30: Looking at the details inside the Dockerfile, noting that the image is based on a CentOS distribution (which uses the yum package manager).

Noting how the software components are automatically downloaded from the SAP Development Tools website (via the use of Cookie headers in the wget requests) - I will get the CodeJam attendees to download these components manually so they can check and accept the licence agreements themselves.
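
The trick used for the automated download is to send the licence agreement cookie that the SAP Development Tools site expects along with each wget request. Roughly along these lines - note that the cookie value and archive filename here are from memory and may well not match the Dockerfile exactly:

# illustrative only - cookie value and filename are assumptions, not taken from the Dockerfile
=> wget --header "Cookie: eula_3_1_agreed=tools.hana.ondemand.com/developer-license-3_1.txt" \
     "https://tools.hana.ondemand.com/additional/sapcc-2.12.0.1-linux-x64.zip"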

00:28:25: Examining the use of the RUN instruction, which is used in Dockerfiles to run commands at image build time, each invocation adding a new layer to the Docker image.

The two components we need are the SAP Cloud Connector and a Java runtime, which are both available from the website.

00:30:10: Thinking about trying the Dockerfile out, as it stands, checking beforehand which versions of the components will be downloaded, before examining the rest of the commands in the Dockerfile, including those that switch to a bash shell and also expose port 8443 in containers that are created from the image.
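
A quick way of checking which component versions the Dockerfile will pull, before kicking off the build, is simply to look at the download lines, for example:

# show the wget-based download lines (and thus the component versions) in the Dockerfile
=> grep wget Dockerfile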

We build the image thus:
=> docker build -t sapcc:2.12.0.1 .

(don’t forget that final period!)

00:33:00: We note that the specific version of the SAP JVM requested, 8.1.053, was no longer available, so we fix it by changing the reference to 8.1.055, which is the version shown on the website.
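
That fix is just a one-line change to the Dockerfile; something like this does the job (a sketch using GNU sed, assuming the version string only appears in the SAP JVM download reference):

# bump the SAP JVM version referenced in the Dockerfile, then rebuild
=> sed -i 's/8.1.053/8.1.055/g' Dockerfile
=> docker build -t sapcc:2.12.0.1 .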

00:34:10: As we wait for the components to download, nabheet.madan3 shares some information in relation to a question I had earlier about how to restrict the number of columns in Docker command output.

We try this out immediately, starting with the base case, which gives us a whole load of columns in the output:
=> docker image ls
REPOSITORY   TAG      IMAGE ID       CREATED        SIZE
debiancap    latest   a8fd0c3eb330   2 months ago   293MB
node         10       64c810caf95a   3 months ago   899MB
node         lts      64c810caf95a   3 months ago   899MB
centos       7        9f38484d220f   3 months ago   202MB
debian       latest   a0bd3e1c8f9e   5 months ago   101MB

Now, we try various --format options, ending up with something like this:
=> docker image ls --format '{{.Repository}}\t{{.Size}}'
debiancap 293MB
node 899MB
node 899MB
centos 202MB
debian 101MB

Nice!
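
As a follow-up: if you want the column headings back while still restricting the columns, the format template also accepts a "table" prefix:

=> docker image ls --format 'table {{.Repository}}\t{{.Size}}'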

00:36:30: While we’ve been playing around with --format, the build of the sapcc Docker image has completed successfully! So it’s time to create an instance of this image, i.e. a container, thus:
=> docker run -p 8443:8443 -h mysapcc --name sapcc -d sapcc:2.12.0.1

00:39:20: We can see that the container has been successfully created, with the docker ps -a command, and check (with netstat -atn | grep LISTEN) that, from our Docker host machine (i.e. my laptop), we can reach port 8443, which is mapped through to port 8443 in the container.
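
In other words, the two checks on the Docker host look like this:

# list all containers, including the newly created sapcc one
=> docker ps -a
# confirm that something is listening locally on port 8443
=> netstat -atn | grep LISTEN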

00:39:50: Opening up the URL https://localhost:8443, we’re presented with the familiar login screen of the SAP Cloud Connector. Nice! Logging in as the administrator, we proceed to add basic configuration to connect it to the subaccount of a new trial account set up for the Workflow CodeJam.

00:44:00: We use “sapcc-docker” for the Location ID property, because it allows us to think about what the Location ID means, and where it’s useful. After that, the setup is done and the SAP Cloud Connector should already be connected to the subaccount.

00:44:50: We check in the SAP Cloud Platform Cockpit, and it is indeed connected, although we see that (of course) there are no backend systems configured.

00:45:38: We go to the “Cloud To On-Premise” settings to add the ES5 system as a backend ABAP stack system. We’re going to use a specific OData service available in this system, which is the Enterprise Procurement Model (EPM) Reference Apps “Shop” service, or EPM_REF_APPS_SHOP_SRV for short.

00:47:35: Talking about the fact that in the ES5 system, the default client is 002, not 000, which has caused some issues in the past with folks unable to authenticate, or rather, unable to remember that they have to authenticate with the right client.
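
You can see this for yourself by calling the service directly on ES5 (it is, after all, on the public Internet) and passing the client explicitly. This is just a sketch - the hostname, service path and credentials here are assumptions and placeholders, not taken from the recording:

# hostname, path and user are illustrative assumptions
=> curl -u "P1234567:yourpassword" \
     "https://sapes5.sapdevcenter.com/sap/opu/odata/sap/EPM_REF_APPS_SHOP_SRV/?sap-client=002"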

00:48:30: Setting up the “Cloud To On-Premise” connection to the ES5 system, using a virtual hostname and port number that are different from the real hostname and port number, for added security.

00:51:00: Once this is saved, we now see (in the Cockpit) that there’s a system available via the SAP Cloud Connector connection, but that there are no resources yet available. We address this now, by defining a “path and all sub-paths” resource for the main set of OData services available in that SAP system: /sap/opu/odata. We take a little digression looking at what “OPU” actually stands for (“occasional platform user”), and why.

00:53:02: The next thing we do is make sure we can access resources in this ES5 system via the SAP Cloud Connector connection. For this we use a pretend HTML5 application, stored in and served from the SAP Cloud Platform subaccount, which consists of nothing but an application descriptor file, otherwise known as the neo-app.json file.

00:53:00: Before we create this pretend app, we create the destination that will point to the remote (ES5-based) service. On the way, I briefly discuss the choice of destination definition - how specific, how narrow you should make it: pointing to a backend system in general, to a collection of OData (or other Web-based) services, or to a specific service endpoint. Of course it depends, but it’s an interesting question as to how one should best use destinations.

00:57:10: The destination name we create is “shopservice”, and we need to use the virtual hostname and port we specified earlier when we construct the URL to the destination endpoint (we also include the “sapcc-docker” Location ID of course, and add an additional property to specify that the “sap-client” should be 002).
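
Put together, the destination ends up looking roughly like this if you export it as a properties file - the URL here is an illustrative placeholder for whatever virtual hostname and port you chose in the SAP Cloud Connector configuration:

# illustrative sketch - virtual host, port and authentication are placeholders
Name=shopservice
Type=HTTP
ProxyType=OnPremise
URL=http://virtuales5:44300
Authentication=BasicAuthentication
CloudConnectorLocationId=sapcc-docker
sap-client=002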

01:03:15: Next we start up the SAP Web IDE Full-Stack to create a super-simple empty application (“destinationproxy”), with just an application descriptor file (neo-app.json) with the following content:

{
  "authenticationMethod": "none",
  "routes": [
    {
      "path": "/myshop",
      "target": {
        "type": "destination",
        "name": "shopservice"
      },
      "description": "Shop service backend"
    }
  ]
}


01:07:50: After deploying this application to SAP Cloud Platform, we see that there’s an application URL for us to try out. We do just that: first appending neo-app.json to have a look at the contents of the application descriptor file, and then appending myshop, which is correctly resolved by the Connectivity service, travelling along the SAP Cloud Connector connection down to the “on-premise” ABAP stack system ES5 and to the EPM_REF_APPS_SHOP_SRV service. I find this way of testing reverse proxy connectivity very useful.
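
Incidentally, this sort of check works just as well from the command line as from the browser; with a made-up application URL it might look like this (the hostname here is purely hypothetical - substitute the URL shown for your own deployed app):

# hypothetical app URL - use the one shown in the cockpit for your deployment
=> curl "https://destinationproxy-p1234567trial.dispatcher.hanatrial.ondemand.com/myshop/"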

01:15:10: Within the SAP Web IDE Full-Stack, we turn on the “Workflow Editor” extension. We also check that we have the Portal service enabled in SAP Cloud Platform, and that there’s a Fiori launchpad based website available to us. There is: “Cloud Platform Workflow CodeJam”, and it contains tiles for each of the four standard Workflow related activities:

  • My Inbox (All Tasks)

  • Monitor Workflows (Workflow Definitions)

  • Monitor Workflows (Workflow Instances)

  • My Inbox (Expert View)


01:19:20: A quick reminder that I wrote a series of Workflow service related deep-dive posts, called “Discovering SCP Workflow” … and we take a brief look at that series here.

01:20:40: Creating a very simple Workflow definition “shopstuff” to try out a Service Task that retrieves data from ES5 via our previously created destination. At this point I run out of time, and prompted by folks having to leave the stream, I decide to call time, leaving the specification of the values for the Service Task properties to next time - a nice little cliffhanger 🙂

Until next time!