Irina Kirilova

SAP Continuous Integration and Delivery for SAP Integration Suite Artifacts

With its latest release, SAP Continuous Integration and Delivery provides a predefined pipeline for SAP Integration Suite artifacts. In this blog post, I would like to show you how to use this new functionality to configure a CI/CD pipeline for the development of integration content in SAP Cloud Integration.

Prerequisites

Procedure

  • First, you have to transfer your integration artifacts from SAP Cloud Integration to a repository of your source code management system. To do this, perform the following steps:
    • In SAP Cloud Integration, open your integration package and choose Artifacts.

    • Choose Actions → Download.

    • Save the *.zip file to a destination of your choice.

    • Extract the file and upload its contents to your repository.

      Note: Repeat this procedure whenever you change your integration flow in SAP Cloud Integration.

  • In SAP Continuous Integration and Delivery, configure a new job as described in Create a Job. As Pipeline, choose SAP Integration Suite Artifacts.

  • In this demo, I will use the Job Editor from the Configuration Mode dropdown list.
  • In the Stages tab, specify the general parameters. Then switch on and configure the stages you want to execute.
  • Don’t forget to save your job.
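The first step above, moving the downloaded package content into your repository, can also be scripted. Here is a minimal Python sketch; the file and directory names are placeholders, and committing and pushing is left to your usual Git tooling:

```python
import zipfile
from pathlib import Path

def extract_package(zip_path: str, repo_dir: str) -> list[str]:
    """Extract a downloaded integration-package zip into a local clone
    of the source repository and return the extracted member names."""
    target = Path(repo_dir)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(target)
        return archive.namelist()

# Example (paths are placeholders):
# extract_package("MyPackage.zip", "my-repo/")
# Then commit and push the extracted files with your Git client.
```

This simply automates the "extract and upload" part of the procedure; it does not replace the download from SAP Cloud Integration itself.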

Of course, if you prefer, you can use the Source Repository option to configure your job. In that case, follow our documentation instead.

After you have created the job, you can either run it manually or create a webhook to automate your builds.
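If your repository is on GitHub, the webhook registration can also be done through GitHub's REST API (POST /repos/{owner}/{repo}/hooks). The sketch below only builds the request body; the payload URL and secret come from the webhook credentials that the CI/CD service generates, and are shown here as placeholders:

```python
import json

def webhook_payload(payload_url: str, secret: str) -> dict:
    """Build the JSON body for GitHub's create-webhook endpoint.
    payload_url and secret come from your CI/CD webhook credentials."""
    return {
        "name": "web",            # fixed value required by the GitHub API
        "active": True,
        "events": ["push"],       # trigger a build on every push
        "config": {
            "url": payload_url,
            "content_type": "json",
            "secret": secret,
        },
    }

# The dict can then be POSTed to
# https://api.github.com/repos/{owner}/{repo}/hooks with an auth token.
print(json.dumps(webhook_payload("https://example.com/hook", "<secret>"), indent=2))
```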

The picture below shows the result of a job run. In this view, you can monitor its outcome.

 

Documentation & Further Readings

If you have found this article useful and want to learn more about using SAP Continuous Integration and Delivery to manage your SAP Integration Suite artifacts, take a look at the following resources:

Not what you expected?

If you think that we’ve missed important use cases, please help us improve our offering by leaving a comment on this blog post or submitting your idea in the SAP Business Technology Platform Continuous Influence program.

      15 Comments
      Srini Reddy

      Hello Irina Kirilova

      I have a few questions after going through the documentation.

      1. Does this use the integration artifacts from the source code repository, or does it pull the artifact from the Integration Suite instance?
      2. Do I need to create multiple pipelines to deploy multiple iflows, or can I deploy multiple iflows from the same pipeline?
      3. How do I handle iflow configuration parameters in the pipeline?
      4. Any plans to support Azure Repos for the webhooks?

      Thanks,

      Srini

      Irina Kirilova (Blog Post Author)

      Hi Srini,

      1. The service takes the artifacts from a Git repository. In the procedure above, I have described one possible way to do that.
      2. You can deploy one flow per pipeline. In case you have multiple flows in one package, you have to create multiple pipelines for them.
      3. Currently not supported in the service. In case you need more flexibility in your pipeline, you could consider using Project "Piper" which covers this feature already - https://www.project-piper.io/steps/integrationArtifactUpdateConfiguration/
      4. We get frequent customer requests for this, so we are considering the feature for 2022.

      Best regards,

      Irina
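For readers who go the Piper route mentioned in point 3: updating an externalized parameter with the integrationArtifactUpdateConfiguration step is configured in Piper's .pipeline/config.yml roughly as follows. All IDs and values below are placeholders, and the exact parameter names should be verified against the linked Piper documentation:

```yaml
steps:
  integrationArtifactUpdateConfiguration:
    integrationFlowId: "MyIntegrationFlow"      # placeholder artifact ID
    integrationFlowVersion: "1.0.1"             # placeholder version
    parameterKey: "receiverEndpoint"            # placeholder externalized parameter
    parameterValue: "https://example.com/receiver"
```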

      Ramoji Sabnekar

      Hi,

      I have one question related to SAP CPI DS.

      Question: Is CPI-DS covered by SAP BTP Integration Suite?

      Please clarify.

       

      Thanks, Ramoji

      Mayur Belur Mohan

      No, CPI-DS is not part of SAP BTP Integration Suite.

      Mani Rajendran

      Hi Irina,

      The Upload step fails when I include it in the job. If I only include the Deploy step, it runs successfully.

      I tried with and without the integration flow in the package in my trial CF account. In both cases, the Upload step fails:

      <error><code>Bad Request</code><message xml:lang="en">INVALID_INTEGRATION_PROJECT_NO_MANIFEST</message></error>

      The log looks as follows:

      fatal integrationArtifactUpload - step execution failed - HTTP PUT request to https://****/api/v1/IntegrationDesigntimeArtifacts(Id='testflow',Version='Active') failed with error: <?xml version='1.0' encoding='UTF-8'?><error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"><code>Bad Request</code><message xml:lang="en">INVALID_INTEGRATION_PROJECT_NO_MANIFEST</message></error>: Request to https://****/api/v1/IntegrationDesigntimeArtifacts(Id='testflow',Version='Active') returned with response 400 Bad Request
      [2021-10-13T09:46:41.471Z] + ./piper readPipelineEnv
      
      But the zip in GitHub has the correct manifest file.
      Mayur Belur Mohan

      Hi Mani,

      The Upload step works only with an integration flow archive, not with an integration package archive. Are you sure you are using the right archive (i.e., the integration flow)?

      Best Regards,

      Mayur
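The INVALID_INTEGRATION_PROJECT_NO_MANIFEST error above is consistent with this explanation: an integration flow archive carries META-INF/MANIFEST.MF at its root, while a package archive nests one zip per artifact. A quick local check before uploading could look like this; it is a heuristic sketch, not an official tool:

```python
import zipfile

def looks_like_iflow_archive(zip_path: str) -> bool:
    """Heuristic: an integration flow archive has its OSGi manifest
    (META-INF/MANIFEST.MF) at the archive root; a package archive
    nests one zip per artifact instead."""
    with zipfile.ZipFile(zip_path) as archive:
        return "META-INF/MANIFEST.MF" in archive.namelist()
```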

      Mani Rajendran

      Hi Mayur,

      Yes, that's the integration flow archive, not the package archive. Still, it fails as above:

      fatal integrationArtifactUpload - step execution failed - HTTP PUT request to https://****/api/v1/IntegrationDesigntimeArtifacts(Id='testflow',Version='Active') failed with error: <?xml version='1.0' encoding='UTF-8'?><error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"><code>Bad Request</code><message xml:lang="en">INVALID_INTEGRATION_PROJECT_NO_MANIFEST</message></error>: Request to https://****/api/v1/IntegrationDesigntimeArtifacts(Id='testflow',Version='Active') returned with response 400 Bad Request
      [2021-10-13T09:46:41.471Z] + ./piper readPipelineEnv
      
      But the zip in GitHub has the correct manifest file.
      Mayur Belur Mohan

      Is this issue happening only for this iflow archive, or does the same issue exist for all iflow archives?

      Virender Singh Rana

      Hello Irina,

      As discussed on our call last month, do we have a video handy for this? The steps are not very clear, and I am having problems replicating them.

      Thanks in advance !

      Virender Singh

      Peter Schleinitz

      The "create Webhook" link is broken.

       

      Thanks, Peter

      Linda Siebert

      Hi Peter,

      Here is the correct link to the documentation for creating a Webhook:

      https://help.sap.com/docs/CONTINUOUS_DELIVERY/f3d64e9188f242ffb7873da5dfad4278/a273cffe863b4663b23942a9bb73071d.html?locale=en-US 

      I will talk to Irina about fixing the link in the blog post. 🙂

      Best regards,

      Linda

      Irina Kirilova (Blog Post Author)

      Updated. Thanks, Linda!

      Rashmi Joshi

      Hi,

      Thanks for the blog. I wanted to know how to pass an input file for testing.

      Also, how does the upload feature work? Does it import the iflow to the production tenant?

      Is there any method to upload iflows to the GitHub repository rather than manual download and upload?

      Regards,

      RJ

      Jeswani Jitendra

      Hello Irina Kirilova, Mayur Belur Mohan, Linda Siebert. Please advise.

      Pt. 1: I understand that behind the CI/CD service, SAP is calling Piper library steps.

      Pt. 2: My assumption is that the input file for the Integration Test step needs to be maintained in the configured repository itself for the test to succeed.

      Pt. 3: I understand we need to create multiple pipelines (jobs) per iflow and per environment.

      Example: If we have one iflow and we need to execute the pipeline in three environments, we need to create three jobs (pipelines), one for each environment, as the service key (/authentication) would be different for each tenant. Similarly, additional jobs for additional iflows.

      Pt. 4: My understanding is that there is no out-of-the-box feature provided as of today to push the tenant artifacts to the repository directly, and we need to upload/push any changes from the tenant into the repository manually.

      Looking forward to your response. Thx.

      Linda Siebert

      Hi Rashmi and Jeswani,

      I can't answer all of these questions but I can answer a few! 🙂

      To pass the input file to the test, you can use the "Message Body File Path" parameter in the Job Editor. You can put the path to your file there; this file needs to be in your source repository.

      Yes, one job is needed per integration flow. And right now, you have to manually download the integration flow and upload it to GitHub; there are no separate tools for this.

      Hope that helps!

      Linda Siebert