Mayur Belur Mohan

Working with Integration Suite Piper commands

SAP implements tooling for continuous delivery in the open-source project “Piper”. The goal of project “Piper” is to substantially ease setting up continuous delivery in your project using SAP technologies.

For more information about project “Piper” and the other CI and CD offerings by SAP, see Overview of SAP Offerings for CI and CD.

Configuring project “Piper” in your Jenkins server is explained in this blog.

SAP Integration Suite has contributed not only many Jenkins pipelines, but also many Piper library steps, which help you create your own CI/CD pipelines, automating various tasks.

For example, in the case of the SAP Integration Suite capability “Cloud Integration”, we can automate the following scenario:

  1. Update an integration flow's design-time configuration parameter.
  2. Deploy the integration flow.
  3. Get the service endpoint of the deployed integration flow.
  4. Invoke the service endpoint with an HTTP request.
  5. Get the message processing log (MPL) status of the integration flow.
  6. If the MPL status from the previous step is "completed", download the integration flow artifact from design time.
  7. Store the integration flow artifact in a GitHub repository.
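The Piper steps involved in this scenario can be chained in a Jenkinsfile. The following is only a sketch, assuming the Piper shared library is registered in Jenkins under the (conventional) name piper-lib-os and each step reads its parameters from .pipeline/config.yml; steps 4 and 7 (HTTP call and Git push) would be plain script stages and are omitted here:

```groovy
@Library('piper-lib-os') _

node() {
    stage('Update configuration') {
        integrationArtifactUpdateConfiguration script: this
    }
    stage('Deploy') {
        integrationArtifactDeploy script: this
    }
    stage('Get service endpoint') {
        integrationArtifactGetServiceEndpoint script: this
    }
    stage('Check MPL status') {
        integrationArtifactGetMplStatus script: this
    }
    stage('Download artifact') {
        integrationArtifactDownload script: this
    }
}
```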

Let’s understand all the Piper commands provided for SAP Integration Suite capabilities, mainly Cloud Integration and API Management.

SAP Cloud Integration Piper Steps

  - integrationArtifactDeploy: Deploys an integration flow into the SAP Cloud Integration runtime (Documentation: Link; Pipeline example: Link)
  - integrationArtifactDownload: Downloads an integration flow runtime artifact (Documentation: Link; Pipeline example: Link)
  - integrationArtifactGetMplStatus: Gets the MPL status of an integration flow (Documentation: Link; Pipeline example: Link)
  - integrationArtifactGetServiceEndpoint: Gets the service endpoint of a deployed integration flow (Documentation: Link; Pipeline example: Link)
  - integrationArtifactResource: Adds, deletes, or updates a resource file of an integration flow design-time artifact (Documentation: Link; Pipeline example: Link)
  - integrationArtifactUnDeploy: Undeploys an integration flow (Documentation: Link; Pipeline example: Link)
  - integrationArtifactUpdateConfiguration: Updates an integration flow configuration parameter (Documentation: Link; Pipeline example: Link)
  - integrationArtifactUpload: Uploads or updates an integration flow design-time artifact (Documentation: Link; Pipeline example: Link)

SAP API Management Piper Steps

  - apiProxyDownload: Downloads a specific API proxy from the API Portal (Documentation: Link; Pipeline example: Link)
  - apiKeyValueMapDownload: Downloads a specific key value map from the API Portal (Documentation: Link; Pipeline example: Link)
  - apiProxyUpload: Uploads an API proxy artifact to the API Portal (Documentation: Link; Pipeline example: Link)

Going forward, more Piper steps will be contributed for both Cloud Integration and API Management.

Now let’s take an example of consuming the Cloud Integration Piper command “integrationArtifactDeploy” in the Jenkins server.

This involves the following steps:

  1. Creating a Jenkins pipeline project in GitHub which consumes the Cloud Integration Piper command
  2. Configuring the Cloud Integration API service key in the Jenkins server as security credentials
  3. Configuring the Piper library in the Global Pipeline Libraries
  4. Creating a new pipeline project in Jenkins based on the "pipeline script from SCM" approach
  5. Running the pipeline project and verifying the results

 

Creating Jenkins pipeline project in GitHub which consumes Cloud Integration piper command

Let’s consume the integrationArtifactDeploy Piper command, which is responsible for deploying an integration flow into the Cloud Integration runtime.

The first step is to create a GitHub repository as shown below.

The repository has a directory called .pipeline and a file named Jenkinsfile. The Jenkinsfile contains the Groovy script code with the logic to invoke the integrationArtifactDeploy Piper command.

Here, the “integrationArtifactDeploy script: this” line executes the Cloud Integration Piper command.
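A minimal Jenkinsfile of that shape could look like the following sketch (assuming the Piper shared library is registered in Jenkins under the name piper-lib-os):

```groovy
@Library('piper-lib-os') _

node() {
    stage('Deploy integration flow') {
        // Reads integrationFlowId and the credentials ID from .pipeline/config.yml
        integrationArtifactDeploy script: this
    }
}
```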

.pipeline contains the config.yml file, which provides the input arguments for the integrationArtifactDeploy Piper command.

Here we pass the ID of the integration flow that has to be deployed, and the API service key details, which are configured in the Jenkins system as security credentials.
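A .pipeline/config.yml along these lines would provide those inputs; the flow ID and the credentials ID below are illustrative placeholders, not values from the original post:

```yaml
steps:
  integrationArtifactDeploy:
    integrationFlowId: 'MyIntegrationFlow'        # design-time ID of the flow to deploy (illustrative)
    cpiApiServiceKeyCredentialsId: 'CPI_API_KEY'  # Jenkins credentials ID holding the service key (illustrative)
```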

Configure Cloud Integration API service key in the Jenkins server as security credentials

The API service key needs to be configured as Jenkins credentials so that all pipeline projects can use it.

Select the Manage Jenkins configuration option in the Jenkins server home page.

 

Select the Manage Credentials option

 

Select the Global credentials

Copy the service key JSON text from the SAP BTP account cockpit; it can be found in the below location:

SAP BTP Cockpit subaccount home page -> Instances and Subscriptions -> instance name (api plan) -> Service Keys (view and copy the JSON text)

Create new credentials of type "Secret text" under the Add Credentials option, paste the service key text into the Secret text input box, and save it.

 

The same ID used here needs to be passed as input for the cpiApiServiceKeyCredentialsId configuration parameter in the config.yml file as shown below.
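For example, if the credentials were saved in Jenkins under the (illustrative) ID CPI_API_KEY, the corresponding config.yml entry would be:

```yaml
steps:
  integrationArtifactDeploy:
    cpiApiServiceKeyCredentialsId: 'CPI_API_KEY'  # must match the Jenkins credentials ID
```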

Configure the piper library in the Global pipeline libraries

Provide the piper library runtime configuration in the Jenkins configuration -> Global pipeline libraries section.

Creating new pipeline project in Jenkins based on pipeline script from SCM approach.

Select New Item -> Pipeline project and provide the project name.

 

Configure the repository URL, branch to pull and script file name as shown below

 

Click on Save/Apply to save the project. This step creates a Jenkins pipeline project that pulls the SCM repository configured in GitHub and executes the Jenkinsfile, which has the logic to execute the SAP Cloud Integration Piper command.

Running pipeline project and verifying results

This step involves building and running the Jenkins pipeline project and verifying the SAP Cloud Integration Piper command execution results.

Click on Build Now and see the latest build results.

If you select the specific build and check the console output, you can validate whether the Piper command executed successfully or not.

You can combine these Piper commands to build a complex scenario in which you manage the end-to-end lifecycle of an integration flow artifact for CI/CD: starting from configuration, through deployment, checking the execution status, downloading, and storing in Git.

If you want to build your own custom Piper command for Integration Suite, you can contribute to the open-source SAP Project Piper (https://www.project-piper.io/) and follow the developer guide for building your own custom shared library steps.

Below is a hello world Piper command sample, which you can use as a reference to build your own step in a Piper GitHub repository forked from the Piper master repository (i.e. https://github.com/SAP/jenkins-library) and test in a Jenkins server.

Hello world example step development commits, which show the workflow: https://github.com/marcusholl/jenkins-library/commits/helloWorld

Pipeline to build and run the code in a Jenkins: https://github.com/marcusholl/helloWorld/blob/main/Jenkinsfile#L10

Example of a groovy wrapper with username, Password credentials: https://github.com/SAP/jenkins-library/blob/master/vars/cloudFoundryCreateSpace.groovy

 

 

Comments
      Alexander Clen Riva

      Great Post @Mayur!

      Any idea how I can implement the Piper commands for SAP Integration Suite-Cloud Integration and Azure DevOps?

       

      Mayur Belur Mohan (Blog Post Author)

      Thank you @Alexander Clen Riva

      There seem to be two options.

      Option 1

      Use the Piper Docker image and run the Integration Suite Piper commands, as explained for other commands in the blog https://blogs.sap.com/2019/10/24/how-to-use-project-piper-docker-images-for-cicd-with-azure-devops/

      Option 2

      Wrap a Jenkins CI job inside an Azure pipeline. In this approach, a build definition is configured in Azure Pipelines to use the Jenkins tasks to invoke a CI job in Jenkins, then download and publish the artifacts produced by Jenkins.

      Best Regards,
      Mayur

      Mayur Belur Mohan (Blog Post Author)

      The important part is to download Piper from GitHub.
      Then you can use the piper executable in the jobs.
      In the example, the first job gets Piper and puts the executable into a cache.
      The second job gets Piper from the cache and performs some action.

      Here is an example (snippet):
      jobs:
      - job: downloadPiper
        pool:
          vmImage: 'ubuntu-latest'
        container:
          image: <your docker image>
          options: -u 0
        steps:
        - checkout: none
        - task: Cache@2
          inputs:
            key: piper-go-official
            path: bin
            cacheHitVar: FOUND_PIPER
          displayName: Cache piper go binary
        - script: |
            mkdir -p bin
            curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
            chmod +x bin/piper
          condition: ne(variables.FOUND_PIPER, 'true')
          displayName: 'Download Piper'
        - script: bin/piper version
          displayName: 'Piper Version'

      - job: piperDoesSomeAction
        dependsOn: downloadPiper
        pool:
          vmImage: 'ubuntu-latest'
        container:
          image: <your docker image>
          options: -u 0
        steps:
        - task: Cache@2
          inputs:
            key: piper-go-official
            path: bin
          displayName: resolve piper go binary from cache
        - script: |
            bin/piper version
          displayName: 'some action'

      Alexander Clen Riva

      Thank you for your answer Mayur Belur Mohan

      I am working on the option 1. I guess I can use the docker image mentioned in the blog https://blogs.sap.com/2019/10/24/how-to-use-project-piper-docker-images-for-cicd-with-azure-devops/.

      Mayur Belur Mohan (Blog Post Author)

      That's great, let me know how it goes and also any enhancements you need to Integration Suite piper commands

      Alexander Clen Riva

      Hi Mayur Belur Mohan

      I am using the image "https://github.com/SAP/devops-docker-images" but I am not sure if it is the right one for Cloud Integration artifacts.

      I have the below pipeline.

      # Starter pipeline
      # Start with a minimal pipeline that you can customize to build and deploy your code.
      # Add steps that build, run tests, deploy, and more:
      # https://aka.ms/yaml
      
      trigger:
      - main
      
      jobs:
        - job: downloadPiper
          pool:
            vmImage: ubuntu-latest
            container:
              image: https://github.com/SAP/devops-docker-images 
              options: -u 0
          steps:
          - checkout: none
          - task: Cache@2
            inputs:
              key: piper-go-official
              path: bin
              cacheHitVar: FOUND_PIPER
            displayName: Cache piper go binary
          - script: |
                mkdir -p bin
                curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
                chmod +x bin/piper
            condition: ne(variables.FOUND_PIPER, 'true')
            displayName: 'Download Piper'
          - script: bin/piper version
            displayName: 'Piper Version'
      
        - job: piperDoesSomeAction
          dependsOn: downloadPiper
          pool:
            vmImage: 'ubuntu-latest'
            container:
              image: https://github.com/SAP/devops-docker-images 
              options: -u 0
          steps:
          - task: Cache@2
            inputs:
              key: piper-go-official
              path: bin
            displayName: resolve piper go binary from cache
          - script: |
              bin/piper version
            displayName: 'some action'
      
        - job: deployiFlow
          dependsOn: downloadPiper
          variables:
          - group: development
          pool:
            vmImage: 'ubuntu-latest'
            container:
              image: https://github.com/SAP/devops-docker-images 
              options: -u 0
          steps:
          - task: Cache@2
            inputs:
              key: piper-go-official
              path: bin 
            displayName: deploy iflow
          - script: |
              bin/piper integrationArtifactDeploy --apiServiceKey $(CREDENTIALS) --integrationFlowId $(IFLOWID)

      But I am getting the below error. Any idea? 🙂

      Starting: CmdLine
      ==============================================================================
      Task         : Command line
      Description  : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
      Version      : 2.198.0
      Author       : Microsoft Corporation
      Help         : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
      ==============================================================================
      Generating script.
      Script contents:
      bin/piper integrationArtifactDeploy --apiServiceKey {     "oauth": {         "clientid": "xx-xxxxxxxxx-xxx-xxxxxxxx|it!xxx",         "clientsecret": "xxxxx-xxx-xxxxxxxxxxxxxxxxxxxxxxxxxx=",         "url": "https://xxxxxxtrial.it-cpitrial05.cfapps.us10-001.hana.ondemand.com",         "tokenurl": "https://xxxxxxtrial.authentication.us10.hana.ondemand.com/oauth/token"     } } --integrationFlowId CostCenter_Replication_from_X_to_Y
      ========================== Starting Command Output ===========================
      /usr/bin/bash --noprofile --norc /home/vsts/work/_temp/f57a3325-1a48-416a-bcc0-887986890779.sh
      info  integrationArtifactDeploy - Using stageName '__default' from env variable
      info  integrationArtifactDeploy - Project config: NONE ('.pipeline/config.yml' does not exist)
      info  integrationArtifactDeploy - fatal error: errorDetails****"category":"undefined","correlationId":"https://dev.azure.com/xxxxxxx/SCP-Pipeline/_build/results?buildId=38","error":"error unmarshalling serviceKey: unexpected end of JSON input","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactDeploy"}
      fatal integrationArtifactDeploy - step execution failed - error unmarshalling serviceKey: unexpected end of JSON input
      ##[error]Bash exited with code '1'.
      Finishing: CmdLine
      Mayur Belur Mohan (Blog Post Author)

      This https://github.com/SAP/devops-docker-images is a GitHub repository where we describe which images we provide and where to find them. As you are just calling APIs from Piper, please remove the container images and run directly on the VM.

      Vijay Gonuguntla

      Hi Alex,

      Were you able to resolve it and make it work end to end with Azure DevOps for integration artifacts?

      Regards

      Vijay

      Mayur Belur Mohan (Blog Post Author)

      Is the issue resolved, Alex?

      Best Regards,

      Mayur

      Alexander Clen Riva

      Hi Mayur Belur Mohan the below issue is still a blocker.

      info  integrationArtifactDeploy - Project config: NONE ('.pipeline/config.yml' does not exist)
      info  integrationArtifactDeploy - fatal error: errorDetails****"category":"undefined","correlationId":"https://dev.azure.com/xxxxxxx/SCP-Pipeline/_build/results?buildId=38","error":"error unmarshalling serviceKey: unexpected end of JSON input","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactDeploy"}
      fatal integrationArtifactDeploy - step execution failed - error unmarshalling serviceKey: unexpected end of JSON input
      ##[error]Bash exited with code '1'.

       

      Not sure why it is asking for .pipeline/config.yml. I am working with Azure DevOps - azure-pipelines.yml

      Mayur Belur Mohan (Blog Post Author)

      The issue is that the service key JSON document is invalid. When you use it in a secret environment variable, I suggest using the JSON document as a string without CR or CR/LF.
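      One way to produce such a single-line string — a sketch assuming the service key was saved locally to a file named servicekey.json and Python 3 is available — is to re-serialize the JSON:

```shell
# Collapse the multi-line service key JSON to a single line,
# so no CR/LF ends up in the secret variable.
python3 -c 'import json,sys; print(json.dumps(json.load(sys.stdin)))' < servicekey.json
```

      The resulting one-line string can then be pasted into the secret variable as-is.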

       

      Mayur Belur Mohan (Blog Post Author)

      Any luck, Alex? Is the issue solved?

      Alexander Clen Riva

      Hi Mayur,

      Now the error is different. It looks like something is missing in the conversion from JSON file to string.
      I verified from Postman that the client ID and client secret are working, and the permission to deploy iflows using the API is also validated; I mean, from Postman I can deploy an iFlow.

      Below are the logs (I changed the client ID and client secret values a little but kept the format):

      Starting: CmdLine
      ==============================================================================
      Task         : Command line
      Description  : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
      Version      : 2.198.0
      Author       : Microsoft Corporation
      Help         : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
      ==============================================================================
      Generating script.
      Script contents:
      bin/piper integrationArtifactDeploy --verbose --apiServiceKey "{  \"oauth\": {    \"clientid\": \"sb-aaaaaaa-85ad-zzzz-yyyy-xxxxxxxxxxxx|it!b99999\",    \"clientsecret\": \"rrrrr-zzzzzzz-89be-pppppppp$Mi_uuuuuuuuuuuuuu-ooooooo=\",    \"tokenurl\": \"https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token\",    \"url\": \"https://16072687trial.it-cpitrial05.cfapps.us10-001.hana.ondemand.com\"  }}" integrationFlowId 'TestCICD'
      ========================== Starting Command Output ===========================
      /usr/bin/bash --noprofile --norc /home/vsts/work/_temp/dabe8869-117f-4fb4-8550-b4acecf71ec4.sh
      info  integrationArtifactDeploy - Using stageName '__default' from env variable
      debug integrationArtifactDeploy - Reading file from disk: .pipeline/commonPipelineEnvironment/custom/gcsFolderPath.json
      info  integrationArtifactDeploy - Project config: NONE ('.pipeline/config.yml' does not exist)
      debug integrationArtifactDeploy - Skipping fetching secrets from Vault since it is not configured
      info  integrationArtifactDeploy - CPI serviceKey read successfully
      debug integrationArtifactDeploy - Using Basic Authentication ****/****
      debug integrationArtifactDeploy - no trusted certs found / using default transport / insecure skip set to true / : continuing with existing tls config
      debug integrationArtifactDeploy - Transport timeout: 3m0s, max request duration: 0s
      info  integrationArtifactDeploy - [DEBUG] POST https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token?grant_type=client_credentials
      debug integrationArtifactDeploy - --------------------------------
      debug integrationArtifactDeploy - --> POST request to https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token?grant_type=client_credentials
      debug integrationArtifactDeploy - headers: map[Accept:[application/json] Authorization:[<set>]]
      debug integrationArtifactDeploy - cookies: 
      debug integrationArtifactDeploy - --------------------------------
      debug integrationArtifactDeploy - <-- response 401 https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token?grant_type=client_credentials (441.58ms)
      debug integrationArtifactDeploy - --------------------------------
      
      Mayur Belur Mohan (Blog Post Author)

      There are two service keys: one for the api plan, another for the integration-flow plan. Which one have you used? You have to use the api plan service key.

      Alexander Clen Riva

      Hi Mayur,
      I am using the api plan service key, and I verified from Postman that it works fine to deploy the iflows.

       

      Alexander Clen Riva

      Hi Mayur Belur Mohan The root cause is that the client secret contains the character $. This character cuts off the client secret value, so the call always returns 401 as the response code.
      I encoded the client secret but it is still not working.

      Mayur Belur Mohan (Blog Post Author)

      please schedule a call

      Alexander Clen Riva

      Sure, I sent you the meeting invite.

      Jerry Janda

      Hi, Alexander Clen Riva and Mayur Belur Mohan :

      I'm glad that you were able to connect, but please note that sharing personal email addresses publicly violates our rules of engagement (https://community.sap.com/resources/rules-of-engagement). I've removed that from the comment. When members want to connect, we recommend following a member (via his or her profile), then leaving a comment asking that member to follow back. When members follow each other, they then have the ability to connect via the community's private messages.

      Kind regards,

      --Jerry

      Moderation Lead

      Geoff Beglau

      When is SAP expecting to support management of new artifact types, such as SOAP and REST API, via the OData API so it can be leveraged with Project Piper?

      Is there any expectation that all assets, such as Script Collections and Value Mappings, will also be supportable via the same processes used for Integration Flows?

      Additionally, the https://api.sap.com/api/IntegrationContent/resource with the "New API Hub" option selected does not support environment configurations for Cloud Foundry. 

      Ramya P

      Hi Mayur,

      Thank you for sharing an insightful and well-explained blog.

       

      For a project requirement, we are trying to use GitLab instead of Jenkins and GitHub, hosting the source code there and establishing CI/CD.

      Can you please guide us on how to go about it, and whether it is possible in the first place? We tried using similar Piper code but it doesn't build.

      If not, what would be the next option, perhaps, source code in GitLab and CI/CD in BTP.

      Please guide us on how to go about it.

       

       

      Thanks,

      Ramya

       

       

      Mayur Belur Mohan (Blog Post Author)

      Please refer to the YouTube video https://www.youtube.com/watch?v=jUiKi6FWYrg, which explains how to create a basic CI/CD pipeline in GitLab.

      After successfully completing it, replace that YAML and build a YAML like the one below.

       

      Take the credentials from the service key JSON of your Process Integration runtime instance in the subaccount.
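      A .gitlab-ci.yml job could look roughly like the following sketch; it reuses the Piper binary download shown in the Azure DevOps example above, and the variable names CPI_SERVICE_KEY and IFLOW_ID are illustrative masked CI/CD variables, not names from the original post:

```yaml
deploy-iflow:
  image: ubuntu:22.04
  script:
    # curl is not in the base image; install it first
    - apt-get update && apt-get install -y curl ca-certificates
    - mkdir -p bin
    - curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
    - chmod +x bin/piper
    # Service key JSON (single line) and flow ID come from masked CI/CD variables
    - bin/piper integrationArtifactDeploy --apiServiceKey "$CPI_SERVICE_KEY" --integrationFlowId "$IFLOW_ID"
```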

      Please note that:

      GitLab pipelines are based on YAML, as are Azure DevOps pipelines. I suggest preparing the GitLab pipeline YAML by looking into the working Azure DevOps pipeline YAML showcased here: Working with Integration Suite Piper commands and Microsoft Azure DevOps | SAP Blogs

      There is an example of interoperability between both YAMLs explained here: continuous integration - Gitlab yaml to azure pipelines yaml file - Stack Overflow

      Ramya P

      Hi Mayur,

      Thank you for replying back. I did try this and got the below error while building:

      I have followed the exact procedure. Please help me understand where I could have made a blunder.

       

      Also, please confirm whether we can automate the deployment of SAP CPI iflows using GitLab alone.

       

      Thanks,

      Ramya

       

      Mayur Belur Mohan (Blog Post Author)

      Maybe the curl command didn't work and didn't download the Piper binary. If the curl command doesn't exist, the binary needs to be downloaded using native OS commands.

      Please share the GitLab logging for the job you executed; then we will know the status of each command execution under the script.

      Maybe you can check in the below order:

      1. Does the curl command exist in the OS?
      2. Whether curl downloaded the Piper binary or not
      3. Whether the bin/piper path was created properly in the filesystem
      4. The output of the Piper command execution

       

      Ramya P

      Hi Mayur,

       

      Attaching the job screenshots. While the download piper job shows as successful, I can't see the directory in the filesystem.

      Can you please help me out there?

      Attaching the gitlab logging for the job .

       

       

       

       

       

       

      Mayur Belur Mohan (Blog Post Author)

      Do both jobs use separate VMs when getting executed? Then the first job's downloaded binary may not be available to the second job.

      Can you club both into a single job and print the downloaded binary's signature using echo or ls before executing any command from the Piper binary?

      Ramya P

      Hi Mayur,

      Both JOBS use a single VM. I clubbed both in a single job and ended up with the following :

       

       

       

      Thanks,

      Ramya

      Mayur Belur Mohan (Blog Post Author)

      Now I see the Piper binary is executed and made the POST request as well. The only issue I am seeing here is that the iflow being used for deploy is not found; that's what the error at line no 65 specifies, i.e. a 404 error code means the resource (iflow) was not found.

      Please check:

      1. The iflow ID is valid
      2. Try to deploy it using Postman once to cross-check that you can use that iflow ID
      3. Once the API works in Postman, provide the same iflow ID to the Piper command

       

       

       

      Ramya P

      Hi Mayur,

      The flow id is correct. Attaching the screenshot of how I am getting the Iflow ID.

       

      Also, I tried different iflow ids and yet I get a 404 error even in postman:

       

      The actual endpoint from CPI looks like this, maybe that is why it is unable to find the right path.

       

       

      Please let me know if I am configuring the flow id wrong or where else I could be at fault.

       

       

      Thanks ,

      Ramya

      Mayur Belur Mohan (Blog Post Author)

      Please take the CPI host URL from the service key created for the Process Integration runtime instance with the api plan. Please refer to the script https://github.com/SAP/apibusinesshub-integration-recipes/blob/master/Recipes/for/CICD-DeployIntegrationArtefactGetEndpoint/Jenkinsfile on how to use the iflow deploy API.

      Ramya P

      Hi Mayur,

       

      Thank you so much for all the guidance. I am able to do it successfully now.

       

      While I can deploy integration artefact using piper commands in GitLab, I was wondering if I could implement this scenario:

      Complete syncing of SAP CPI with GitLab, where any changes in the integration artefact should be making changes in GitLab as well.

      Will that be possible?

       

      Any leads would be appreciated.

       

      Thanks,

      Ramya

       

       

      Mayur Belur Mohan (Blog Post Author)

      Is the issue resolved, Ramya?

      Ramya P

      Hi Mayur,

      The previous issue was resolved, and I was trying version management of iflows in GitLab/GitHub using Azure DevOps.

      Can we do a version comparison and only store the latest version of the artifact in the repository on GitLab/GitHub, using Azure DevOps for CI/CD with the Piper commands?

       

       

      Thanks,

      Ramya