SAP BTP, ABAP Environment Pipeline: Introducing ABAP Unit
In mid-2020, the first iteration of the “ABAP Environment Pipeline” was released. It opened the CI/CD (“Continuous Integration / Continuous Delivery”) space for the SAP BTP, ABAP Environment, also known as Steampunk. Many customers started to adopt CI/CD practices, and we received a lot of positive feedback. On the flip side, there were missing building blocks, such as ABAP Unit, and a few hurdles for newcomers to the technology: the pipeline configuration was seen as too complicated. Over the past one and a half years, we (continuously) addressed these issues, and with the addition of ABAP Unit in late 2021, now is a good time to recap and look ahead.
New to the topic? My previous blog post “CI/CD Tools for SAP BTP ABAP Environment” covers the basics.
So, let’s get started:
ABAP Unit in the ABAP Environment Pipeline
ABAP Unit (AUnit) is the standard tool in the ABAP world for executing unit tests and thus ensuring the functional correctness of your software.
With the communication scenario “ABAP Unit Test Integration”, a service was introduced that enables the execution of ABAP Unit Tests via HTTPS. Of course, we implemented this API in our “ABAP Environment Pipeline”, so you can use it in your CI/CD processes. The newly created AUnit stage runs in parallel to the ATC stage:
What you gain from this stage is a result file containing the ABAP Unit findings in the JUnit XML format. This format was chosen so that you can use the available JUnit plugins of, for example, Jenkins. In the following picture, you can see the test results on a Jenkins server with all the details necessary to analyze a failed test, including the “Error Details” and the “Stack Trace”.
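To give you an idea of the format, here is a minimal sketch of what such a JUnit-style result file could look like. The class and method names are made up, and the exact attributes may differ depending on the pipeline version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <!-- one testsuite per test class, one testcase per test method -->
  <testsuite name="ZCL_EXAMPLE_TESTS" tests="2" failures="1">
    <testcase classname="ZCL_EXAMPLE_TESTS" name="SUM_IS_CORRECT"/>
    <testcase classname="ZCL_EXAMPLE_TESTS" name="DIVISION_BY_ZERO">
      <failure message="Critical Assertion Error">...</failure>
    </testcase>
  </testsuite>
</testsuites>
```

Because this is the same format that Java tooling produces, JUnit-aware plugins can render it without knowing anything about ABAP.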
The responsible JUnit plugin also creates a “Trend” diagram that lets you see the number of failed, skipped and passed tests of past Jenkins jobs:
So overall, you get a nice representation of the results of ABAP Unit runs. After looking at the results of the ABAP Unit stage, let’s have a look at how to configure it. Of course, you must specify which objects (e.g. packages or software components) should be checked, and there are two options to do this.
The first option is: you create a configuration file. Here is an example:
```yaml
title: My AUnit Run
context: abapEnvironmentPipeline
options:
  scope:
    ownTests: true
    foreignTests: true
  riskLevel:
    harmless: true
    dangerous: true
    critical: true
  duration:
    short: true
    medium: true
    long: true
objectSet:
  softwarecomponents:
    - name: /DMO/SWC
    - name: /DMO/REPO
```
You can fill out “title” and “context” according to your own preference. The “options” allow a more detailed configuration.
- You can specify the “scope” – most notably, you can execute tests via “test relations” with the option “foreignTests”
- You can limit the “risk level” to specific values
- You can limit the “duration” to specific values
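The “riskLevel” and “duration” filters match the attributes that an ABAP Unit test class declares about itself. As a reminder, a hypothetical local test class could look like this (the class and method names are made up):

```abap
" A test declared with DURATION SHORT and RISK LEVEL HARMLESS is only
" executed if the corresponding options are enabled in the AUnit config.
CLASS ltc_calculator DEFINITION FINAL FOR TESTING
  DURATION SHORT
  RISK LEVEL HARMLESS.
  PRIVATE SECTION.
    METHODS sum_is_correct FOR TESTING.
ENDCLASS.

CLASS ltc_calculator IMPLEMENTATION.
  METHOD sum_is_correct.
    cl_abap_unit_assert=>assert_equals( act = 1 + 1
                                        exp = 2 ).
  ENDMETHOD.
ENDCLASS.
```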
The “options” are, however, optional; the values default to “true” if not specified. Finally, you need to define the “objectSet”. Here, you can list either packages or software components. If you want to know all the details, please have a look at the documentation of the AUnit API. That covers the first option: creating a configuration file.
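For illustration, an “objectSet” listing packages instead of software components might look like the sketch below. The package name is made up, and you should check the AUnit API documentation for the exact key names, as this assumes a structure analogous to the “softwarecomponents” list shown above:

```yaml
objectSet:
  packages:
    - name: ZEXAMPLE_PACKAGE
```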
The second option is: you don’t
And this ties directly into the next section.
With yet another configuration file, the initial setup of the abapEnvironmentPipeline became more and more complicated. Therefore, we decided to make those configuration files for ATC, AUnit and the required service keys optional. With this step, only three mandatory files remain: the “Jenkinsfile”, the technical configuration file “.pipeline/config.yml” and the “repositories.yml” (or “addon.yml”), where you define your software components / repositories. With the latter, the pipeline generates a default configuration for AUnit and ATC. Sure, if you need to configure one of the tools in more detail, setting up dedicated configuration files is the way to go.
A minimal configuration looks like this (taken from this sample repository):
```yaml
repositories:
  - name: '/DMO/SWC'
    branch: 'main'
```
```yaml
general:
  cfApiEndpoint: 'https://api.cf.sap.hana.ondemand.com'
  cfOrg: 'myOrg'
  cfSpace: 'mySpace'
  cfCredentialsId: 'cfUser'
  cfServiceInstance: 'qSystem'
stages:
  Clone Repositories:
    repositories: 'repositories.yml'
    strategy: 'Pull'
  ATC:
    # In order to be executed, the ATC stage needs at least one configuration entry
    # If the ATC stage should not be executed, delete the whole section
    execute: stage
  AUnit:
    # In order to be executed, the AUnit stage needs at least one configuration entry
    # If the AUnit stage should not be executed, delete the whole section
    execute: stage
```
Apart from the mentioned software component / repository definition, it is only required to define the strategy for the Clone Repositories stage and to provide the connection details. Those can be found in the SAP BTP Cockpit: the organization, space and service instance name of the ABAP Environment system as well as a user with access to the space. The credentials for the user must be saved in the Jenkins Credentials store using an identifier – ‘cfUser’ in this example.
With this change, the entry barrier is lower than before. We hope you are encouraged to take your first steps in the CI/CD world, while we continue to work on extending the CI/CD possibilities.
In fact, there is one area where we want to improve next. Currently, it is only possible to create scheduled Jenkins pipelines using the ABAP Environment Pipeline. That means the pipeline runs during a predefined timeslot. Ideally, however, the Continuous Integration process would start as soon as a new commit is registered in the repository, allowing for more direct feedback to the developers. Please stay tuned for more updates on this topic.
If you have feedback or questions, please let me know, and feel free to post in the comment section below (or head to the Answer Hub for general questions).
- My previous blog post, covering the basics of the ABAP Environment Pipeline
- Current roadmap for the SAP BTP, ABAP Environment
- CI/CD in ABAP – An Outside-in View
Notice: images are my own.
Having a scheduled job to report on unit tests is good, and this pipeline helps, as it's probably difficult to set it up in Steampunk.
However, I'll challenge which parts of such a setup relate to CI? See SAP's CI principles.
It might be a building block for a CI/CD setup along with the prepare system step, but as discussed before, I'd guess it becomes a Continuous Waiting setup instead of Continuous Integration.
Thanks for your comment. Here are some thoughts:
In the pipeline, a specific commit is pulled to the ABAP Environment system before executing the tests. So you can at least determine the commit that caused the error. How many development objects are part of the commit is a different question...
With your second comment you refer to a Pull Request workflow. And yes, typically you don't have many issues in the mainline. This setup is, however, currently not possible with the ABAP Environment (as mentioned in "What's Next"). In different setups someone may find a Trend diagram helpful.
The "Prepare System" step can dynamically create an ABAP Environment system - this takes some time. It is still possible to use a static ABAP Environment system. Then, you receive ATC and AUnit results quite fast. In the end, it is a time vs cost discussion.
Thanks for the very useful blog to supplement the Project Piper documentation. I have a few observations/questions.
Thank you for your feedback.
The configuration you described should, in principle, work. Could you provide some more details regarding your configuration? I would suggest that you open an issue in the GitHub repository. Alternatively, you can also open an incident.
Ideally, you would add the parameter "verbose: true" in the general section of the "config.yml", so that the Jenkins log contains more details. Please include, at least, the content of the Jenkins log and the pipeline configuration (Jenkinsfile, config.yml) in your issue or incident.
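Placed in the general section of the "config.yml", this looks like the following (only the new parameter is shown here; your existing general settings stay as they are):

```yaml
general:
  verbose: true  # produces a more detailed Jenkins log for troubleshooting
```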
In this SAP-Samples repository, you can find an example configuration that also includes the PullGitRepo step.
Thanks for the quick reply. I've added "verbose: true" and therefore have a bit more info in the log.
I will open an incident so that I can share the log files confidentially. In the meantime, here is an extract.
Using Postman I have managed to successfully read the header data of this MANAGE_GIT_REPOSITORIES/Pull API but I did have to pass in the username and password via Basic Auth - which I suspect is not happening here. How are these credentials provided to the API?
Jenkins is using a credentials store. You have to save your username/password in the credentials store and provide an ID for it. This ID is then used as "abapCredentialsId" (or "cfCredentialsId") in the config.yml file.
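For example, with a username/password entry saved in the Jenkins credentials store under the ID "abapUser" (the ID is made up; depending on your setup, the parameter can also be placed in a stage-specific section), the "config.yml" would reference it like this:

```yaml
general:
  abapCredentialsId: 'abapUser'  # ID of the username/password entry in the Jenkins credentials store
```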
During the pipeline execution, the credentials are fetched and used for basic auth against the ABAP system (or the SAP BTP).
I am using the credentials store to save both the cfCredentialsId and the abapCredentialsId. If I use the cfCredentialsId (which holds my cf CLI login details), the login is fine up to the point mentioned in my earlier comment. If I use the abapCredentialsId (which holds the comms user/password of my ABAP instance), this fails the cf login (naturally). I understand from the documentation that the clientid/secret for the ABAP instance comes from the specified service key, though?
As that hasn't worked, I have been trying to add the abapCredentialsId in the stages/Clone Repositories section, but it doesn't work.
I've added my config.yml below (with actual details changed). Anything obviously wrong?
As discussed offline, I have now had a successful run after creating the service key manually with the same parameters shown in the cfServiceKeyConfig entry above.
Also useful to know that any repository I choose in my repositories.yml file, must be linked to a cloned software component.