Technical Articles
Working with Integration Suite Piper commands
SAP implements tooling for continuous delivery in the open-source project “Piper”. The goal of project “Piper” is to substantially ease setting up continuous delivery in your project using SAP technologies.
For more information about project “Piper” and the other CI and CD offerings by SAP, see Overview of SAP Offerings for CI and CD.
Configuring project “Piper” in your Jenkins server is explained in this blog.
SAP Integration Suite has contributed not only many Jenkins pipelines but also many piper library steps, which help you create your own CI/CD pipeline and automate various tasks.
For example, in the case of the SAP Integration Suite capability “Cloud Integration”, we can automate the following scenario:
- Update an integration flow's design-time configuration parameters.
- Deploy an integration flow.
- Get the service endpoint of the deployed integration flow.
- Invoke the service endpoint with an HTTP request.
- Get the message processing log (MPL) status of the integration flow.
- If the MPL status from the previous step is "completed", download the integration flow artifact from design time.
- Store the integration flow artifact in a GitHub repository.
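The scenario above can be sketched as a Jenkins scripted pipeline that chains the corresponding piper library steps. This is a minimal sketch, not a complete pipeline: the library alias and the stage layout are assumptions, and each step reads its parameters from .pipeline/config.yml.

```groovy
@Library('piper-lib-os') _  // project "Piper" shared library (alias is an assumption)

node {
    stage('Configure') {
        // update a design-time configuration parameter of the integration flow
        integrationArtifactUpdateConfiguration script: this
    }
    stage('Deploy') {
        integrationArtifactDeploy script: this
        integrationArtifactGetServiceEndpoint script: this
    }
    stage('Verify') {
        // check the message processing log status of the deployed flow
        integrationArtifactGetMplStatus script: this
    }
    stage('Store') {
        // download the design-time artifact, e.g. to commit it to Git
        integrationArtifactDownload script: this
    }
}
```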
Let’s go through the Piper commands provided for the SAP Integration Suite capabilities, mainly Cloud Integration and API Management.
SAP Cloud Integration Piper Steps
Piper Step Name | Description | Documentation Link | Pipeline Example |
integrationArtifactDeploy | Deploy an integration flow into the SAP Cloud Integration runtime | Link | Link |
integrationArtifactDownload | Download an integration flow runtime artifact | Link | Link |
integrationArtifactGetMplStatus | Get the MPL status of an integration flow | Link | Link |
integrationArtifactGetServiceEndpoint | Get the service endpoint of a deployed integration flow | Link | Link |
integrationArtifactResource | Add, delete, or update a resource file of an integration flow design-time artifact | Link | Link |
integrationArtifactUnDeploy | Undeploy an integration flow | Link | Link |
integrationArtifactUpdateConfiguration | Update an integration flow configuration parameter | Link | Link |
integrationArtifactUpload | Upload or update an integration flow design-time artifact | Link | Link |
integrationArtifactTransport | Transport an integration package using the SAP Content Agent service | Link | Link |
SAP API Management Piper Steps
Piper Step Name | Description | Documentation Link | Pipeline Example |
apiProxyDownload | Download a specific API proxy from the API Portal | Link | Link |
apiKeyValueMapDownload | Download a specific key value map from the API Portal | Link | Link |
apiProxyUpload | Upload an API proxy artifact to the API Portal | Link | Link |
Going forward, more piper steps will be contributed for both Cloud Integration and API Management.
Now let’s take an example of consuming the Cloud Integration piper command “integrationArtifactDeploy” in the Jenkins server.
This involves the following steps:
- Creating a Jenkins pipeline project in GitHub that consumes the Cloud Integration piper command
- Configuring the Cloud Integration API service key in the Jenkins server as security credentials
- Configuring the piper library in the Global Pipeline Libraries
- Creating a new pipeline project in Jenkins based on the "Pipeline script from SCM" approach
- Running the pipeline project and verifying the results
Creating a Jenkins pipeline project in GitHub that consumes the Cloud Integration piper command
Let’s consume the integrationArtifactDeploy piper command, which is responsible for deploying an integration flow into the Cloud Integration runtime.
The first step is to create a GitHub repository as shown below.
The repository has a directory called .pipeline and a file named Jenkinsfile. The Jenkinsfile contains the Groovy script with the logic to invoke the integrationArtifactDeploy piper command.
Here, the line “integrationArtifactDeploy script: this” executes the Cloud Integration piper command.
.pipeline contains the config.yml file, which provides the input arguments for the integrationArtifactDeploy piper command.
Here we pass the ID of the integration flow to be deployed and the API service key details, which are configured in the Jenkins system as a security credential.
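For illustration, a minimal .pipeline/config.yml could look like the following. This is a sketch: the flow ID and credentials ID are placeholders, and the parameter names should be checked against the integrationArtifactDeploy documentation linked above.

```yaml
steps:
  integrationArtifactDeploy:
    cpiApiServiceKeyCredentialsId: 'cpi-api-key'  # ID of the Jenkins secret-text credential
    integrationFlowId: 'MyIntegrationFlow'        # ID of the integration flow to deploy
```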
Configuring the Cloud Integration API service key in the Jenkins server as security credentials
The API service key needs to be configured as Jenkins credentials so that all pipeline projects can use it.
Select the Manage Jenkins configuration option on the Jenkins server home page.
Select the Manage Credentials option.
Select the global credentials.
Copy the service key JSON text from the SAP BTP cockpit; it can be found in the following location:
SAP BTP cockpit subaccount home page -> Instances and Subscriptions -> instance name (API plan) -> Service Keys (view and copy the JSON text)
Create new credentials of type "Secret text" under the Add Credentials option, paste the service key text into the Secret text input box, and save.
The same ID used here needs to be passed as the value of the cpiApiServiceKeyCredentialsId configuration parameter in the config.yml file, as shown below.
Configure the piper library in the Global pipeline libraries
Provide the piper library runtime configuration in the Jenkins configuration -> Global pipeline libraries section.
Creating new pipeline project in Jenkins based on pipeline script from SCM approach.
Select New Item -> Pipeline project and provide the project name.
Configure the repository URL, the branch to pull, and the script file name as shown below.
Click Save/Apply to save the project. This creates a Jenkins pipeline project that pulls the SCM repository configured in GitHub and executes the Jenkinsfile, which contains the logic to execute the SAP Cloud Integration piper command.
Running pipeline project and verifying results
This step involves building and running the Jenkins pipeline project and verifying the SAP Cloud Integration piper command execution results.
Click Build Now and see the latest build results.
If you select a specific build and check the console output, you can validate whether the piper command executed successfully.
You can combine these piper commands to build a complex scenario in which you manage the end-to-end lifecycle of an integration flow artifact for CI/CD: configure, deploy, check the execution status, download, store in Git, and so on.
If you want to build your own custom piper command for Integration Suite, you can contribute to the open-source SAP project “Piper” (https://www.project-piper.io/) and follow the developer guide for building your own custom shared library steps.
Below is a hello-world piper command sample, which you can use as a reference to build a command in your own piper GitHub repository forked from the piper master repository (i.e. https://github.com/SAP/jenkins-library) and test in a Jenkins server.
Hello-world example step development commits, which show the workflow: https://github.com/marcusholl/jenkins-library/commits/helloWorld
Pipeline to build and run the code in Jenkins: https://github.com/marcusholl/helloWorld/blob/main/Jenkinsfile#L10
Example of a Groovy wrapper with username/password credentials: https://github.com/SAP/jenkins-library/blob/master/vars/cloudFoundryCreateSpace.groovy
Great Post @Mayur!
Any idea how I can implement the Piper commands for SAP Integration Suite-Cloud Integration and Azure DevOps?
Thank you @Alexander Clen Riva
There seem to be two options:
Option 1
Use the piper Docker image and run the Integration Suite piper commands as explained for other commands in the blog https://blogs.sap.com/2019/10/24/how-to-use-project-piper-docker-images-for-cicd-with-azure-devops/
Option 2
Wrap a Jenkins CI job inside an Azure pipeline. In this approach, a build definition is configured in Azure Pipelines to use the Jenkins tasks to invoke a CI job in Jenkins, then download and publish the artifacts produced by Jenkins.
Best Regards,
Mayur
The important part is to download piper from GitHub.
Then you can use the piper executable in the jobs.
In the example, the first job gets piper and puts the executable into a cache.
The second job gets piper from the cache and performs some action.
Here is an example (snippet):
jobs:
- job: downloadPiper
  pool:
    vmImage: 'ubuntu-latest'
  container:
    image: <your docker image>
    options: -u 0
  steps:
  - checkout: none
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
      cacheHitVar: FOUND_PIPER
    displayName: Cache piper go binary
  - script: |
      mkdir -p bin
      curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
      chmod +x bin/piper
    condition: ne(variables.FOUND_PIPER, 'true')
    displayName: 'Download Piper'
  - script: bin/piper version
    displayName: 'Piper Version'
- job: piperDoesSomeAction
  dependsOn: downloadPiper
  pool:
    vmImage: 'ubuntu-latest'
  container:
    image: <your docker image>
    options: -u 0
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: resolve piper go binary from cache
  - script: |
      bin/piper version
    displayName: 'some action'
Thank you for your answer, Mayur Belur Mohan.
I am working on option 1. I guess I can use the Docker image mentioned in the blog https://blogs.sap.com/2019/10/24/how-to-use-project-piper-docker-images-for-cicd-with-azure-devops/.
That's great. Let me know how it goes, and also about any enhancements you need to the Integration Suite piper commands.
Hi Mayur Belur Mohan
I am using the image from "https://github.com/SAP/devops-docker-images", but I am not sure if it is the right one for Cloud Integration artifacts.
I have the below pipeline.
But I am getting the below error. Any idea? 🙂
https://github.com/SAP/devops-docker-images is a GitHub repository where we describe which images we provide and where to find them. As you are just calling APIs from piper, please remove the images and run directly on the VM.
Hi Alex,
Were you able to resolve it and make it work end to end with Azure DevOps for integration artifacts?
Regards
Vijay
Is the issue resolved, Alex?
Best Regards,
Mayur
Hi Mayur Belur Mohan, the below issue is still a blocker.
I am not sure why it is asking for .pipeline/config.yml. I am working with Azure DevOps - azure-pipelines.yml.
The issue is that the service key JSON document is invalid. When you use it in a secret environment variable, I suggest using the JSON document as a string without CR or CR/LF.
Any luck, Alex? Is the issue solved?
Hi Mayur,
Now the error is different. It looks like something is missing in the conversion from the JSON file to a string.
I verified from Postman that the client ID and client secret are working, and the permission to deploy an iFlow using the API is also validated. I mean, from Postman I can deploy an iFlow.
Below are the logs (I changed the client ID and client secret values a little but kept the format).
There are two service keys: one for the API plan and another for the integration-flow plan. Which one have you used? You have to use the API plan service key.
Hi Mayur,
I am using the API plan service key, and I verified from Postman that deploying the iFlows works fine.
Hi Mayur Belur Mohan The root cause is that the client secret contains the character $. This character cuts off the client secret value, and the response code is always 401.
I encoded the client secret, but it is still not working.
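As an aside on the $ problem: in shell-interpolated pipeline variables, an unquoted $ starts a variable expansion, which truncates the secret, and in a URL-encoded OAuth token request body the character must be percent-encoded. A small illustration with a hypothetical secret (this is not the specific fix applied in this thread, which came down to how the variable was quoted):

```python
from urllib.parse import quote

# Hypothetical client secret containing '$', like the one discussed above
client_secret = "af770xx-ab05$cLPOilWdzF2hpNt"

# Percent-encode it for use in an application/x-www-form-urlencoded body;
# '$' becomes '%24', so it can no longer be mistaken for a shell variable.
encoded = quote(client_secret, safe="")
print(encoded)  # af770xx-ab05%24cLPOilWdzF2hpNt
```

HTTP client libraries that build the form body for you (e.g. Postman) do this encoding automatically, which is why the same credentials can work there but fail in a hand-built shell pipeline.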
Please schedule a call.
Sure, I sent you the meeting invite.
Hi, Alexander Clen Riva and Mayur Belur Mohan :
I'm glad that you were able to connect, but please note that sharing personal email addresses publicly violates our rules of engagement (https://community.sap.com/resources/rules-of-engagement). I've removed that from the comment. When members want to connect, we recommend following a member (via his or her profile), then leaving a comment asking that member to follow back. When members follow each other, they then have the ability to connect via the community's private messages.
Kind regards,
--Jerry
Moderation Lead
When is SAP expecting to support management of new artifact types, such as SOAP and REST APIs, via the OData API so it can be leveraged with project “Piper”?
Is there any expectation that all assets, such as Script Collections and Value Mappings, will also be supportable via the same processes used for Integration Flows?
Additionally, the https://api.sap.com/api/IntegrationContent/resource with the "New API Hub" option selected does not support environment configurations for Cloud Foundry.
Hi Mayur,
Thank you for sharing an insightful and well-explained blog.
For a project requirement, we are trying to use GitLab instead of Jenkins and GitHub, hosting both the source code and the CI/CD there.
Can you please guide us on how to go about it, and whether it is possible in the first place? We tried using similar piper code, but it doesn't build.
If not, what would be the next option? Perhaps source code in GitLab and CI/CD in SAP BTP.
Please guide us on how to go about it.
Thanks,
Ramya
Please refer to the YouTube video https://www.youtube.com/watch?v=jUiKi6FWYrg, which explains how to create a basic CI/CD pipeline in GitLab.
After successfully completing it, replace that YAML with a build YAML like the one below.
Take the credentials from the service key JSON of your Process Integration runtime instance in the subaccount.
Please note that:
GitLab pipelines are based on YAML, as are Azure DevOps pipelines. I suggest preparing the GitLab pipeline YAML by looking into the working Azure DevOps pipeline YAML showcased in Working with Integration Suite Piper commands and Microsoft Azure DevOps | SAP Blogs.
There is an example of interoperability between both YAMLs explained in "continuous integration - Gitlab yaml to azure pipelines yaml file" on Stack Overflow.
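Translating the Azure DevOps snippet from the comments above, a GitLab job could look roughly like this. It is a sketch with assumptions: the job name, the base image, and the pinned piper release are placeholders, and the piper step still expects its parameters in .pipeline/config.yml or on the command line.

```yaml
deploy-iflow:
  image: ubuntu:22.04  # assumption: install curl first, since the base image lacks it
  script:
    - apt-get update -qq && apt-get install -y -qq curl
    - mkdir -p bin
    - curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
    - chmod +x bin/piper
    - bin/piper version
    - bin/piper integrationArtifactDeploy
```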
Hi Mayur,
Thank you for replying. I did try this and got the below error while building:
I have followed the exact procedure. Please help me understand where I could have made a mistake.
Additionally, please confirm whether we can automate the deployment of SAP CPI iFlows using GitLab alone.
Thanks,
Ramya
Maybe the curl command didn't work and didn't download the piper binary. If the curl command doesn't exist, the binary needs to be downloaded using native OS commands.
Please share the GitLab logging for the job you executed; from it we will know the status of each command execution in the script.
Maybe you can check in the below order
Hi Mayur,
Attaching the job screenshots. While the job for downloading piper shows as successful, I can't see the directory in the filesystem.
Can you please help me out there?
Attaching the GitLab logging for the job.
Do both jobs use separate VMs when they are executed? Then the binary downloaded in the first job may not be available to the second job.
Can you combine both into a single job and print the downloaded binary (using echo or ls) before executing any command from the piper binary?
Hi Mayur,
Both jobs use a single VM. I combined both into a single job and ended up with the following:
Thanks,
Ramya
Now I see the piper binary is executed and the POST request is made as well. The only issue I am seeing here is that the iFlow used for the deployment is not found; that's what the error at line 65 specifies, i.e. the 404 error code means the resource (iFlow) was not found.
Please check.
Hi Mayur,
The flow ID is correct. Attaching a screenshot of how I am getting the iFlow ID.
Also, I tried different iFlow IDs and I still get a 404 error, even in Postman:
The actual endpoint from CPI looks like this; maybe that is why it is unable to find the right path.
Please let me know if I am configuring the flow ID wrong or where else I could be at fault.
Thanks ,
Ramya
Please take the CPI host URL from the service key created for the Process Integration runtime instance (API plan). Please refer to the script https://github.com/SAP/apibusinesshub-integration-recipes/blob/master/Recipes/for/CICD-DeployIntegrationArtefactGetEndpoint/Jenkinsfile on how to use the iFlow deploy API.
Hi Mayur,
Thank you so much for all the guidance. I am able to do it successfully now.
While I can deploy an integration artifact using piper commands in GitLab, I was wondering if I could implement this scenario:
complete syncing of SAP CPI with GitLab, where any change to an integration artifact also makes a change in GitLab.
Will that be possible?
Any leads would be appreciated.
Thanks,
Ramya
Is the issue resolved, Ramya?
Hi Mayur,
The previous issue was resolved, and I am now trying version management of iFlows in GitLab/GitHub using Azure DevOps.
Can we do a version comparison and only store the latest version of the artifact in the repository on GitLab/GitHub, using Azure DevOps for CI/CD with the piper commands?
Thanks,
Ramya
I get the below error while trying to use CAS and TMS to deploy integration artifacts.
+ ./piper integrationArtifactTransport
info  integrationArtifactTransport - Using stageName 'integrationArtifactTransport Command' from env variable
info  integrationArtifactTransport - Project config: '.pipeline/config.yml'
info  integrationArtifactTransport - CPI serviceKey read successfully
info  integrationArtifactTransport - fatal error: errorDetails{"category":"undefined","correlationId":"http://35.240.x.x:8080/job/MyfirstTransport/6/","error":"failed to fetch Bearer Token: parse \":///oauth/token?grant_type=client_credentials\u0026response_type=token\": missing protocol scheme","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactTransport","time":"2023-02-12T15:51:27.393681667Z"}
fatal integrationArtifactTransport - step execution failed - failed to fetch Bearer Token: parse ":///oauth/token?grant_type=client_credentials&response_type=token": missing protocol scheme
info  integrationArtifactTransport - Step telemetry data: {"StepStartTime":"2023-02-12 15:51:27.390897599 +0000 UTC","PipelineURLHash":"a356f5cb09f600cb0f7b0fb2a239b558680599c9","BuildURLHash":"e3e09af87e68ce44b9b48cbf6c0a37f9a098b57a","StageName":"integrationArtifactTransport Command","StepName":"integrationArtifactTransport","ErrorCode":"1","StepDuration":"3","ErrorCategory":"undefined","CorrelationID":"http://35.240.x.x:8080//job/MyfirstTransport/6/","PiperCommitHash":"54d0c68feb5e9fc92b5567c3701ccecf302fbe49","ErrorDetail":{"category":"undefined","correlationId":"http://35.240.137.76:8080/job/MyfirstTransport/6/","error":"failed to fetch Bearer Token: parse \":///oauth/token?grant_type=client_credentials\u0026response_type=token\": missing protocol scheme","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactTransport","time":"2023-02-12T15:51:27.393681667Z"}}
I have added the service key JSON text of the CAS service instance with the alias caspocServiceKeyCred, as described in integrationArtifactTransport/config.yml at main · mayurmohan/integrationArtifactTransport (github.com).
Regards,
Senthil
Format the service key like below before storing it as secret text in the Jenkins server:
{
  "oauth": {
    "createdate": "2022-09-12T10:51:55.474Z",
    "clientid": "sb-64f36979-xxx-468b-b36e-a5a9e19c6a9b!b1509|it!b68",
    "url": "https://content-agent-engine.cfapps.xxx.hana.ondemand.com",
    "clientsecret": "af770xx-ab05-4868-a52c-9b87f8aaafd5$cLPOilWdzF2hpNt_9Lq140AH4v6c35RItT5zr9YeD6I=",
    "tokenurl": "https://xxxxx.authentication.xxx.hana.ondemand.com/oauth/token"
  }
}
You can get this from the CAS instance (application service plan), which has more fields; format the JSON as shown above and store it as secret text. From the error, it is clear that the piper step is not able to read the URL details properly.
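The single-line requirement mentioned earlier in the thread can be enforced programmatically: parse the service key JSON copied from the cockpit and re-serialize it compactly before pasting it as the secret text. A minimal sketch with placeholder values:

```python
import json

# Service key JSON as copied from the BTP cockpit (placeholder values),
# typically pretty-printed across several lines
raw = """
{
  "oauth": {
    "clientid": "sb-example-client",
    "clientsecret": "example-secret",
    "url": "https://content-agent-engine.cfapps.example.hana.ondemand.com",
    "tokenurl": "https://example.authentication.example.hana.ondemand.com/oauth/token"
  }
}
"""

# Re-serialize on a single line with no CR/LF, safe to store as secret text
compact = json.dumps(json.loads(raw), separators=(",", ":"))
assert "\n" not in compact and "\r" not in compact
print(compact)
```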
Hi Mayur,
Thanks! I got it working now. I am now able to run a pipeline job to trigger a transport request that lands in SAP TMS!
One question though: what is the easiest way to get the "resourceID"? I don't see it in clear text anywhere in the package.
Regards,
Senthil
Use the OData API, mainly /api/v1/IntegrationPackages?$format=json.
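That call returns OData v2 JSON, where the entries sit under a "d"/"results" envelope. A small parsing sketch against a canned response (the field name "Id" is an assumption based on the IntegrationContent API; verify it against your actual payload):

```python
# Canned response shaped like /api/v1/IntegrationPackages?$format=json
# (field names are assumptions; check your actual payload)
response = {
    "d": {
        "results": [
            {"Id": "MyIntegrationPackage", "Name": "My Integration Package"},
            {"Id": "AnotherPackage", "Name": "Another Package"},
        ]
    }
}

def package_ids(odata_json):
    """Extract package IDs from an OData v2 package-list response."""
    return [entry["Id"] for entry in odata_json["d"]["results"]]

print(package_ids(response))  # ['MyIntegrationPackage', 'AnotherPackage']
```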
Hi Mayur,
I have followed your blog and the SAP TechEd 2022 session IN180 to configure the steps to deploy an integration flow using the piper commands via Jenkins. In the session I couldn't find the configuration of the shared library path details. I followed the shared library path provided in the blog and ran the Build Now option. I received the below error message. Could you please help me configure the shared library path?
It seems it is fetching the wrong repo; you are fetching the wrong Git repository.
Hi Mayur,
In your TechEd session I couldn't see any steps regarding the shared library. From your blog, I have mapped the Git repository below in my Jenkins global configuration. Could you tell me which shared libraries need to be added for artifact upload, deploy, and getting the MPL status?
A shared library repo sample is already given in the above blog, for example:
integrationArtifactDeploy -- https://github.com/mayurmohan/IntegrationArtifactDeploy
This command may be the solution to "long paths" if you are running Jenkins on Windows.
Hi Mayur Belur Mohan
Great blog!
I've been trying to get this working in my own VM.
Does this work on a Jenkins instance running on Windows?
I get the following error when deploying:
I've been reading about this error, and it looks like under the hood my Jenkins pipeline may be issuing "sh" commands.
Any clues?
Thanks in advance.
Hi Mayur Belur Mohan
Great post. Would you mind telling me how you created the
apimApiServiceKeyCredentialsId
for working with proxies? Thank you in advance.