Integrate a CI/CD Pipeline on Azure DevOps with DevOps-Related Services from SAP BTP, Cloud Foundry
For the sake of readability, we often use SAP BTP as a short form of the complete “SAP Business Technology Platform”, Alert Notification for “SAP Alert Notification service for SAP BTP”, CTMS for “SAP Cloud Transport Management”, and Automation Pilot for “SAP Automation Pilot”.
Introduction
In this blog post, you will learn how to create an SAP BTP development CI/CD pipeline integrated with Azure DevOps, allowing you to run pipelines on Azure DevOps that continuously deploy new features to the SAP BTP, Cloud Foundry environment while still benefiting from complementary DevOps-related SAP BTP services around transport, alerting, and technical operations automation. This gives you an example of how to perform agile development on SAP BTP using a third-party CI/CD infrastructure, complementing the option to start simple with SAP Continuous Integration and Delivery, where no separate CI/CD infrastructure is required.
Let's assume, for example, that you have an SAP Fiori application in a GitHub repository, SAP BTP QA and PROD subaccounts, and some automated tests to execute. This gives us the opportunity to verify the application, fix any inconsistencies, and retrigger the pipeline without interfering with the productive version. At the same time, we want to receive alerts whenever the status of the pipeline changes.
To complete the picture, we will add several SAP BTP services:
- CTMS, which allows us to transport our changes between the different subaccounts with additional control (such as for auditing reasons, and so that you can define who is allowed to perform changes in which subaccount),
- Automation Pilot, which gives us the opportunity to execute smoke tests, for example,
- and Alert Notification, which sends us alerts whenever defined conditions are matched during our pipeline execution.
Prerequisites:
In order to complete the setup, you will need the following:
- An Azure DevOps account (a trial account is also an option)
- An SAP BTP account (a trial account is also an option)
- A CTMS subscription, plan ‘standard’
- An Alert Notification instance, plan ‘standard’
- An Automation Pilot subscription, plan ‘free’
- A GitHub repository (any other Git repository also works)
Configuration
This section is separated into two parts. The first part is about the configuration in SAP BTP; most of these steps are covered by linking to detailed information. The second part is about the configuration in Azure DevOps. Let’s start with the configuration.
Configuration in SAP BTP
We will start the configuration by listing all the SAP BTP DevOps services that have to be enabled to get the whole process working. For detailed information and examples of how the services must be configured, follow this blog post: https://blogs.sap.com/2021/11/24/establish-an-automated-continuous-end-to-end-development-process-on-sap-business-technology-platform/
In CTMS, you have to create and configure the nodes needed for the transport route. You also need to enable the alerts and create the required destinations in the SAP BTP subaccount. As described in the blog post above, follow steps 9 and 10.
For the Alert Notification configuration, create two separate subscriptions for the alerts from CTMS and SAP BTP. Additionally, another subscription is needed for the Automation Pilot alerts. For the Automation Pilot part, we have to create a command and an execution trigger event. Follow steps 11, 12, and 13 of the blog post above.
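To give you an idea of what such a subscription is built from: in Alert Notification, a subscription combines conditions (which events to match) with actions (where to send them). The following sketch only illustrates the shape of a condition; the configuration API endpoint and the concrete event-type value are assumptions on my side, so take both from the Alert Notification documentation and the linked blog post:

# Sketch: create a condition matching CTMS events via the Alert Notification
# configuration API. The URL is a placeholder, and $TOKEN is an OAuth token
# obtained with the credentials from your Alert Notification service key.
curl -s -X POST "<Alert Notification configuration API URL for conditions>" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "ctms-import-events",
        "description": "Match import events coming from CTMS",
        "propertyKey": "eventType",
        "predicate": "CONTAINS",
        "propertyValue": "<CTMS event type>"
      }'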
Configuration steps in Azure DevOps
The following configuration part covers only the required steps in Azure DevOps and how you can set up the pipeline with the SAP BTP services. Once you have all the SAP BTP services configured and set up, you can follow the steps described here:
In order to start with Azure DevOps, you should have an Azure account and navigate to the DevOps organization. Create an organization if you don't have one, and create a project:
1. From there, select ‘Pipelines’ on the left and then click ‘Create New Pipeline’ in the top right.
2. In my case, the code is in GitHub, so I select the third option.
3. Now, you are prompted to enter your username and password for the GitHub authorization.
4. Select the branch you want to use for this test.
5. Select the type of application you are using. In my case, it is a Node.js app, so I select the first option:
6. In this step, we have to create our .yml file, which is later also called the “script” for the pipeline. I have modified a template provided in this blog.
Here is the code snippet of my extension for the deployment to CF and the upload to CTMS:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

name: cd-openui5-sample-app

resources:
  containers:
  - container: mta
    image: 'ghcr.io/sap/mbtci-java11-node14:latest'
    options: --user 0:0
  - container: cfcli
    image: 'ppiper/cf-cli'
    options: --user 0:0 --privileged
  - container: node
    image: 'geekykaran/headless-chrome-node-docker:latest'
    options: --privileged

trigger:
- master

stages:
- stage: build
  displayName: Build MTA for SAP Cloud Platform
  jobs:
  - job: build
    container: mta
    steps:
    # Build the MTA archive from the project sources
    - bash: 'mbt --platform cf --mtar MySampleApp.mtar -t . build'
    - publish: $(System.DefaultWorkingDirectory)/.
      artifact: WebApp

- stage: test
  displayName: Run Karma Test Suite
  jobs:
  - job: test
    pool:
      vmImage: 'ubuntu-latest'
    container: node
    steps:
    # Install dependencies and run the Karma tests in the headless Chrome container
    - bash: 'npm config set @sap:registry "https://npm.sap.com" && npm install && npm run-script test'
    - publish: $(System.DefaultWorkingDirectory)/.
      artifact: TestResult

- stage: deploy
  displayName: Deployment to SAP Cloud Platform (cf)
  jobs:
  - job: deploy
    pool:
      vmImage: 'ubuntu-latest'
    container: cfcli
    steps:
    - download: current
      artifact: WebApp
    # Log in to Cloud Foundry with the pipeline variables and deploy the MTA archive
    - bash: 'cf login -u "$(CF-USER)" -p "$(CF-PASSWORD)" -a "$(CF-API)" -o "$(CF-ORG)" -s "$(CF-SPACE)" && cf deploy $(Pipeline.Workspace)/WebApp/MySampleApp.mtar -f'

- stage: upload
  displayName: Upload to CTMS
  jobs:
  - job: upload
    pool:
      vmImage: 'ubuntu-latest'
    container: cfcli
    steps:
    - download: current
      artifact: WebApp
    - bash: |
        # Fetch an OAuth token from the UAA endpoint of the CTMS service key
        TOKEN=$(curl -s -X POST -u "<TMS Service Key uaa.clientid>:<TMS Service Key uaa.clientsecret>" -d "grant_type=client_credentials&response_type=token" <TMS Service Key uaa.url>/oauth/token | sed -n '/ *"access_token": *"/ {s///; s/{//g ;s/".*//; p; }')
        # Upload the MTA archive to CTMS and extract the numeric file ID from the response
        BODY=$(curl -s --location --request POST '<TMS Service Key uri>/v2/files/upload' --header "Authorization: Bearer $TOKEN" --form 'file=@"$(Pipeline.Workspace)/WebApp/MySampleApp.mtar"' | awk -F ":" '{print $2}' | grep -Po "\\d+")
        # Attach the uploaded file to the import queue of the target node
        curl --location --request POST '<TMS Service Key uri>/v2/nodes/upload' --header 'Content-Type: application/json' --header "Authorization: Bearer $TOKEN" --data-raw '{ "nodeName": "<Name of your TMS QA node>", "contentType": "MTA", "storageType": "FILE", "entries": [ { "uri": '"$BODY"' } ], "description": "<Description optional>", "namedUser": "<User name optional>" }'
Note: Some adjustments have to be made in the bash lines of the “Upload to CTMS” stage to get a working script for you: fill in the <> placeholders with the corresponding values from your CTMS service key and node.
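If the upload does not work right away, it can help to run the three calls locally before wiring them into the pipeline. The following is a minimal sketch assuming bash with curl and jq available; the environment variable names are my own choice, and the fileId field in the upload response is an assumption derived from the numeric ID that the pipeline script extracts:

set -euo pipefail

# Values from the CTMS service key (variable names are illustrative)
# export TMS_CLIENT_ID='...' TMS_CLIENT_SECRET='...' TMS_TOKEN_URL='...' TMS_URI='...'

# 1. Fetch an OAuth token via the client-credentials flow
TOKEN=$(curl -s -X POST -u "$TMS_CLIENT_ID:$TMS_CLIENT_SECRET" \
  -d "grant_type=client_credentials" "$TMS_TOKEN_URL/oauth/token" | jq -r '.access_token')
echo "Token length: ${#TOKEN}"   # 0 means the authentication step already failed

# 2. Upload the MTA archive; the response should contain a numeric file ID
FILE_ID=$(curl -s -X POST "$TMS_URI/v2/files/upload" \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@MySampleApp.mtar" | jq -r '.fileId')   # field name assumed, check your response
echo "File ID: $FILE_ID"

# 3. Add the uploaded file to the import queue of the QA node
curl -s -X POST "$TMS_URI/v2/nodes/upload" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d "{ \"nodeName\": \"<Name of your TMS QA node>\", \"contentType\": \"MTA\", \"storageType\": \"FILE\", \"entries\": [ { \"uri\": $FILE_ID } ] }"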
7. From the top right, select ‘Variables’ → ‘Create new Variable’, and you should see this window. Now you have to enter all the variables we use in the script:
CF-API – In the SAP BTP subaccount → Overview → API Endpoint
CF-ORG – In the SAP BTP subaccount → Overview → Org Name
CF-SPACE – In the SAP BTP subaccount → Cloud Foundry → Name of the Space
CF-USER – The username used for authentication in the SAP BTP cockpit
CF-PASSWORD – The password used for authentication in the SAP BTP cockpit
8. Select ‘Save Pipeline’. If “commit to the master branch” (or the branch of your choice) was checked during pipeline creation, the pipeline will be triggered, because this creates a new commit in the GitHub repository.
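As an alternative to the UI wizard, both the pipeline and its variables can be created from the command line. Here is a sketch using the Azure DevOps CLI extension, with organization, project, and repository as placeholders:

# One-time setup: install the extension and set defaults
az extension add --name azure-devops
az devops configure --defaults organization=https://dev.azure.com/<your-org> project=<your-project>

# Create the pipeline from the YAML file committed to the repository
az pipelines create --name cd-openui5-sample-app \
  --repository https://github.com/<your-user>/<your-repo> --repository-type github \
  --branch master --yml-path azure-pipelines.yml

# Define the variables used by the script; mark credentials as secret so they are masked in logs
az pipelines variable create --pipeline-name cd-openui5-sample-app --name CF-API --value "<API endpoint>"
az pipelines variable create --pipeline-name cd-openui5-sample-app --name CF-ORG --value "<org name>"
az pipelines variable create --pipeline-name cd-openui5-sample-app --name CF-SPACE --value "<space name>"
az pipelines variable create --pipeline-name cd-openui5-sample-app --name CF-USER --value "<user>" --secret true
az pipelines variable create --pipeline-name cd-openui5-sample-app --name CF-PASSWORD --value "<password>" --secret true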
Test the setup
Once you have completed all the configuration steps, let's see if the process works as expected.
First, we make a change in the code using the code editor and push it to the central GitHub repository.
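For example, from the command line (the file path is just an illustration):

git checkout master
# Any committed change works -- the pipeline triggers on commits to master
git add webapp/controller/App.controller.js
git commit -m "Small change to trigger the CI/CD pipeline"
git push origin master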
Second, open Pipelines in Azure DevOps. You should see the running pipeline with the corresponding commit ID and the master branch. You can observe the different stages while the process is running.
Third, after the “Deploy to CF” stage is finished, your application should be running in the SAP BTP Dev subaccount.
Fourth, after the “Upload to CTMS” stage is completed, a new transport request should show up on the QA node in CTMS, waiting for import.
Fifth, start the import in CTMS and wait for the success status. You will also get a notification in the configured channel when the import starts and finishes.
When the import is finished, the application with the new changes should be running in the SAP BTP QA subaccount. At the same time, the new transport request should show up on the Prod node in CTMS, waiting for its import.
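Starting the import is deliberately a manual step here, but for completeness it can also be scripted against the TMS API. The endpoint shape below is an assumption on my side, so please verify it against the Cloud Transport Management API documentation before use:

# Assumed endpoint shape -- verify against the official TMS API documentation.
# $TMS_URI and $TOKEN are obtained as in the local upload sketch above.
curl -s -X POST "$TMS_URI/v2/nodes/<QA node ID>/transportRequests/import" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{ "transportRequests": [ <transport request ID> ], "namedUser": "<optional user>" }'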
These are the notifications you will receive from Alert Notification for starting and finishing the execution:
Automation Pilot sends alerts whenever the status of the execution changes, so we expect two more alerts:
Note: The following notifications will be delivered only if you have configured Automation Pilot.
The last step is to trigger the import into the Prod node and, after it is finished, to check that the new application version is running in the SAP BTP Prod subaccount. Again, we expect to get four notifications once the second import is started.
That’s all!
Try it out on your own using the explained configurations, or adapt them to match your landscape specifics.
Hi Angel Tonchev,
interesting post. It would be great to learn why you favor TMS over zero-downtime deployments such as blue-green or canary. Have a look here for my perspective.
Looking forward to the thought exchange.
KR
Martin
Hi Martin Pankraz,
thanks a lot for your immediate feedback 😊
I think we are talking about two different use cases here: when it comes to real blue-green deployment like you describe in your great blog post (where two versions of the application exist in one environment and can be tested against each other), the approach described by Angel does not work, because SAP Cloud Transport Management (currently) only supports a limited version of blue-green deployment. Here we immediately switch to the new version after deployment, thus minimizing the application downtime, but without the testing option.
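For reference, such an immediate-switch blue-green deployment corresponds roughly to what the MultiApps CF CLI plugin offers directly (a sketch, reusing the MTAR name from the blog post):

# Deploy the new version alongside the old one and switch routes without a confirmation pause
cf bg-deploy MySampleApp.mtar --no-confirm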
However, CTMS has some features which might be required in other use cases. Just to name a few:
I think it is not so much a question of using blue-green deployment OR SAP Cloud Transport Management, but rather when to use what.
Therefore, it is great that Angel Tonchev has shown that it is possible to combine Azure DevOps with CTMS and the other tools mentioned in his blog post.
Kind regards
Harald
Hi Harald,
you made some interesting points. A couple additional thoughts:
Orchestrating the release process through the CF CLI from Jenkins, Azure DevOps, or GitHub Actions also gets you a fine level of granularity to restrict access, auditing capabilities, and a nice visual of where your artifacts went (dev, staging, prod, etc.). I described deploying to prod only, because that is the cloud-native way of things 😉 No one is keeping you from applying the process to staging approaches too.
The other bullet points are SAP-opinionated and create tight coupling with SAP tooling. It is up to the customer to weigh ease of use with SAP-native software against vendor lock-in.
Looking forward to additional content in this area!
KR
Martin
Hi Martin,
I can't really follow your argumentation about vendor lock-in. On the contrary, I see this blog post as a very good example of how to overcome vendor lock-in by combining tools from two different vendors using open APIs...
Kind regards
Harald
For the orchestration part, yes. That is a good step forward to highlight different integration patterns.
My argument was about the dependency on CTMS and ChaRM being required to move artifacts in a consistent way. It could be addressed, for instance, with "shadow releases", to avoid the need to move backend changes in sequence and synchronously with the frontend. Especially because backend changes are very often much slower and less frequent. Any thoughts on that?
KR
Martin
Hi,
As you might be aware, the retention period for transports in CTMS is 30 days, so we want to integrate CTMS with GitHub to solve this issue. Is there any blog or documentation on that?
Also, can you confirm whether we can use ChaRM to move the transports in CTMS after 30 days? Let's assume we assign one transport request today; would it be possible to import the transport into quality after 30 days?
Thanks
Neha
Hi Neha,
let me clarify the 30-day retention time for transport requests in CTMS: as long as the transport is waiting for import in any import queue, it will not be deleted. The 30-day counter only starts after the transport request has been imported into the last target account (normally the productive account).
This means for your example above: when you assign the transport request to a change document in ChaRM, it still waits in the queue of the quality account (the assignment doesn't change that). When you then trigger the import into quality for testing, the transport request gets automatically forwarded to the import queue of the following account (for example, preprod or prod). So it is now waiting for import in this queue, and therefore the deletion timer still doesn't start. Only after the import into production has been done does the transport request (or, to be more exact, the attached content) get deleted after another 30 days.
So don't worry that your transport requests get deleted before you are done with them...
Please also check the detailed documentation of the deletion process:
Kind regards
Harald
Hi Angel,
Thanks for the interesting post. I have followed the same steps, and I am able to build the application and deploy it to the dev subaccount in Cloud Foundry. However, the upload to the CTMS node is not happening in the pipeline. I can't see the transport in SAP Cloud Transport Management. I am not getting any error during the run. I have made the changes for the TMS service key and the other values as per the TMS service. Kindly assist.
Hello Vinay,
I'm glad you liked the blog post.
I'm sorry to tell you that I don't have access to the demo environment anymore, but at the time of writing this post, everything was working fine.
Please also keep in mind that this is only an example and not the source for the official SAP documentation.
You can try debugging the issue on your side, for example:
- As the problem is with the upload to CTMS, check whether the node name is correct
- Run the script in the CLI line by line to check which part causes the problem
- Add some intermediate output to see if the responses are correct
- Check for typos, spaces, or special characters
Kind Regards,
Angel