Working with Integration Suite Piper commands and Microsoft Azure DevOps
Dear community,
Some days ago, I started working on a PoC for a CI/CD Azure pipeline for Integration Suite content. The very first step was to understand how the Piper commands work.
For more information about project “Piper” and the other CI and CD offerings by SAP, see Overview of SAP Offerings for CI and CD. For more about CI/CD for SAP Integration Suite, see CI/CD for SAP Integration Suite? Here you go!
And, of course, see the blog post that inspired this one: Working with Integration Suite Piper commands.
Let’s go through the following steps:
- Creating Azure DevOps Project
- Configure Cloud Integration API service key in Azure DevOps as security credentials
- Creating a new pipeline project in Azure DevOps
- Running pipeline project and verifying results
- Conclusion
Creating Azure DevOps Project
Let’s create the project “SCP-Pipeline” and a new repo “Garage.SAPCI.PoC”, and finally clone it in VS Code.
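If you prefer the command line for the clone step, a minimal sketch looks like the following; the organization name “my-org” is a hypothetical placeholder, while the project and repository names come from the step above.
# "my-org" is a placeholder; replace it with your own Azure DevOps organization.
git clone https://dev.azure.com/my-org/SCP-Pipeline/_git/Garage.SAPCI.PoC
cd Garage.SAPCI.PoC
code .   # open the cloned repository in VS Code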
Configure Cloud Integration API service key in Azure DevOps as security credentials
Get the service key, as explained in step 2 of the blog post mentioned above.
Below are the two steps to perform with the service key file.
- Convert the service key payload from JSON to a JSON string. I found this online tool useful for the task (a command-line alternative is sketched after this list).
- Adjust the JSON string by replacing the double quote (“) at the beginning and the end with an apostrophe (‘) and removing each backslash (\). The adjusted JSON string should look like the one below. With this small change, Piper sends the complete client secret in the HTTP request, not just the part before the $ character, which could otherwise cause an HTTP 401 error.
'{"oauth":{"clientid":"xx-xxxxxxx-85yy-zzz-a56b-xxxxxx99!a99999|it!a99999","clientsecret":"x1x1x1x1x-9z9z9z-9y9y9-9x9x9-9x9x9x9x9x$Mi_xxxxYYYYYYzzzzzzzzzz-xxxxxxYZ=","tokenurl":"https://xxxxxxyztrial.authentication.us10.hana.ondemand.com/oauth/token","url":"https://xxxxxxyztrial.it-cpitrial05.cfapps.us10-001.hana.ondemand.com"}}'
Create a variable “CREDENTIALS” in a variable group (the deploy job below references a group named “development”) and add the modified JSON string payload as its value.
Creating a new pipeline project in Azure DevOps
The important part is to download Piper from GitHub; you can then use the Piper executable in the jobs. In the example below, the first job downloads Piper and puts the executable into a cache. The next job takes Piper from the cache and performs an action on Cloud Integration content.
a.- Get piper and put the executable into a cache.
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- main
jobs:
- job: downloadPiper
  pool:
    vmImage: ubuntu-latest
  steps:
  - checkout: none
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
      cacheHitVar: FOUND_PIPER
    displayName: Cache piper go binary
  - script: |
      mkdir -p bin
      curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.199.0/piper
      chmod +x bin/piper
    condition: ne(variables.FOUND_PIPER, 'true')
    displayName: 'Download Piper'
  - script: bin/piper version
    displayName: 'Piper Version'
b.- Piper command to deploy an iFlow
- job: deployiFlow
  dependsOn: downloadPiper
  variables:
  - group: development
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: deploy iflow
  - script: |
      bin/piper integrationArtifactDeploy --verbose --apiServiceKey $(CREDENTIALS) --integrationFlowId "TestCICD"
Running pipeline project and verifying results
Below are the results of the pipeline execution.
The deployment status can also be checked in the SAP CI Web UI under Manage Integration Content.
You can combine these Piper commands to build more complex scenarios; a rough sketch follows below.
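For example, the deploy step could be extended with a status check. The sketch below assumes that integrationArtifactGetMplStatus from the SAP jenkins-library is available in the downloaded Piper release and that its flags match those of integrationArtifactDeploy; please verify both against the Piper documentation before using it. The lines would go into the script step of the deployiFlow job.
# Sketch: deploy the iFlow and then query its message processing log (MPL) status.
# Both commands reuse the service key stored in the CREDENTIALS variable.
bin/piper integrationArtifactDeploy --apiServiceKey $(CREDENTIALS) --integrationFlowId "TestCICD"
bin/piper integrationArtifactGetMplStatus --apiServiceKey $(CREDENTIALS) --integrationFlowId "TestCICD"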
Conclusion
Finally, with the above instructions, we can run Piper commands from a Microsoft Azure DevOps pipeline. Building on this foundation, we can create more complex scenarios.
I hope you find this blog post useful. You are very welcome to provide feedback or thoughts in the comment section. And thanks to Mayur Belur Mohan for supporting me on this journey!
Related to this topic, you can also find Q&A and post questions under the tags DevOps, SAP Integration Suite, SAP BTP, and Cloud Foundry environment.
Hi Alex, Thanks for the blog.
We are looking to download the iFlow and commit it to a repository. How do you do that? I mean, where would you provide the branch name and commit message?
I have tried this configuration, but I cannot see any files getting committed to the folder.
I tried repository/root dir/sub folder and rootdir/subfolder; it did not work.
If we are downloading, what is the file extension? Is it a zip or a folder?