SAP Continuous Integration and Delivery for SAP Integration Suite Artifacts
With its latest release, SAP Continuous Integration and Delivery provides a predefined pipeline for SAP Integration Suite artifacts. In this blog post, I would like to show you how to use this new functionality and configure a CI/CD pipeline for the development of integration content in SAP Cloud Integration.
Prerequisites
- You have enabled SAP Continuous Integration and Delivery. See Enabling the Service.
- You are an administrator of SAP Continuous Integration and Delivery. See Assigning Roles and Permissions.
- You have provisioned the Cloud Integration capability as part of SAP Integration Suite. See Initial Setup of SAP Cloud Integration in the Cloud Foundry Environment.
- You’ve created the Process Integration Runtime service instance of plan api and a service key to authenticate your pipeline against SAP Cloud Integration. See Setting Up OAuth Inbound Authentication with Client Credentials Grant for API Clients, Cloud Foundry Environment.
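For orientation, here is a minimal sketch of the client-credentials authentication that such a service key enables, which is what the pipeline uses to talk to SAP Cloud Integration. It assumes the usual Cloud Foundry service-key layout (an oauth section with tokenurl, clientid, and clientsecret) and a local copy of the key in a file named service-key.json; adjust both to your actual key.

```python
# Sketch: obtain an OAuth token with the client-credentials grant, using the
# values from a Process Integration Runtime (plan "api") service key.
# The field names (oauth.tokenurl, oauth.clientid, oauth.clientsecret) follow
# the usual Cloud Foundry service-key layout; verify them against your key.
import json
import requests

with open("service-key.json") as f:   # hypothetical local copy of the service key
    key = json.load(f)

response = requests.post(
    key["oauth"]["tokenurl"],          # token endpoint of your subaccount
    data={"grant_type": "client_credentials"},
    auth=(key["oauth"]["clientid"], key["oauth"]["clientsecret"]),
)
response.raise_for_status()
access_token = response.json()["access_token"]
print("Token acquired, expires in", response.json()["expires_in"], "seconds")
```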
Procedure
- First, you have to transfer your integration artifacts from SAP Cloud Integration to a repository of your source code management system. To do this, perform the following steps:
  - In SAP Cloud Integration, open your integration package and choose Artifacts.
  - Choose Actions → Download and save the *.zip file to a destination of your choice.
  - Note: Repeat this procedure whenever you change your integration flow in SAP Cloud Integration.
- Next, create a job in SAP Continuous Integration and Delivery. In this demo, I use the Job Editor from the Configuration Mode dropdown list.
- In the Stages tab, specify the general parameters. Then switch on and configure the stages you want to execute (see the sketch after this list for the kind of API call the Deploy stage performs).
- Don’t forget to save your job.
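For context, the following is a hedged sketch of a deploy request against the public SAP Cloud Integration OData API, the kind of operation the Deploy stage triggers for your integration flow. The host, artifact Id, and token value are placeholders, and this is an illustration rather than the service's actual implementation; depending on your tenant settings, a prior GET to fetch an x-csrf-token may also be required.

```python
# Sketch only: trigger deployment of an integration flow via the public
# SAP Cloud Integration OData API (DeployIntegrationDesigntimeArtifact).
# HOST, ARTIFACT_ID, and ACCESS_TOKEN are placeholders for your tenant,
# your integration flow, and a token from the client-credentials call above.
import requests

HOST = "https://<tenant>.it-cpi0xx.cfapps.<region>.hana.ondemand.com"
ARTIFACT_ID = "MyIntegrationFlow"
ACCESS_TOKEN = "<token from the client-credentials call>"

resp = requests.post(
    f"{HOST}/api/v1/DeployIntegrationDesigntimeArtifact",
    params={"Id": f"'{ARTIFACT_ID}'", "Version": "'active'"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print("Deployment triggered, status code:", resp.status_code)
```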
Of course, if you prefer, you can use the Source Repository option to configure your job. In that case, you can follow our documentation instead.
After you have created the job, you can either run it manually or create a webhook to automate your builds.
The picture below shows the result of a job run. In this view, you can monitor its outcome.
Documentation & Further Readings
If you found this article useful and want to learn more about using SAP Continuous Integration and Delivery to manage your SAP Integration Suite artifacts, take a look at the following resources:
- Our Product Documentation – SAP Continuous Integration and Delivery Documentation
- A very useful blog post containing information about the different scenarios – CI/CD for SAP Integration Suite? Here you go!
Not what you expected?
If you think that we’ve missed important use cases, please help us improve our offering by leaving a comment on this blog or submitting your idea in the SAP Business Technology Platform Continuous Influence Program.
Hello Irina Kirilova
I have a few questions after going through the documentation.
Thanks,
Srini
Hi Srini,
Best regards,
Irina
Hi,
I have one question related to SAP CPI DS.
Question: Is CPI-DS covered as part of SAP BTP Integration Suite?
Please clarify.
Thanks, Ramoji
No, CPI-DS is not part of SAP BTP Integration Suite.
Hi Irina,
The Upload step fails when I include it in the job. If I only include the Deploy step, the job runs successfully.
I tried with and without the integration flow in the package in my trial CF account. In both cases, the Upload step fails.
<error>
<code>Bad Request</code>
<message xml:lang="en">INVALID_INTEGRATION_PROJECT_NO_MANIFEST</message></error>
The log looks as follows:
Hi Mani,
The Upload step works only with an integration flow archive, not an integration package archive. Are you sure you are using the right archive (i.e., an integration flow)?
Best Regards,
Mayur
Hi Mayur,
Yes, it's the integration flow archive, not the package archive. It still fails as above.
Is this issue happening only with this particular iflow archive, or does the same issue exist for all iflow archives?
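For readers hitting the INVALID_INTEGRATION_PROJECT_NO_MANIFEST error discussed above: the message suggests the uploaded archive has no manifest. Assuming the usual layout in which an exported integration flow project carries a META-INF/MANIFEST.MF at its root (a package export bundles several artifacts instead), a quick local check could look like the following sketch; the layout assumption may differ on your tenant, so verify against your own export.

```python
# Sketch: check whether a .zip looks like a single integration flow project
# (contains META-INF/MANIFEST.MF at its root) or something else, such as a
# whole package export. Layout assumptions may differ on your tenant.
import sys
import zipfile

def looks_like_iflow_archive(path: str) -> bool:
    with zipfile.ZipFile(path) as zf:
        return "META-INF/MANIFEST.MF" in zf.namelist()

if __name__ == "__main__":
    archive = sys.argv[1]
    if looks_like_iflow_archive(archive):
        print(f"{archive}: contains META-INF/MANIFEST.MF, looks like an integration flow archive")
    else:
        print(f"{archive}: no META-INF/MANIFEST.MF found; this may be a package export or a wrapped zip")
```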
Hello Irina,
As discussed on our call last month, do we have a video handy for this? The steps aren't very clear, and I'm having problems replicating them.
Thanks in advance!
Virender Singh
The "create Webhook" link is broken.
Thanks, Peter
Hi Peter,
Here is the correct link to the documentation for creating a Webhook:
https://help.sap.com/docs/CONTINUOUS_DELIVERY/f3d64e9188f242ffb7873da5dfad4278/a273cffe863b4663b23942a9bb73071d.html?locale=en-US
I will talk to Irina about fixing the link in the blog post.
Best regards,
Linda
Updated. Thanks, Linda!
Hi,
Thanks for the blog. I wanted to know how to pass an input file for testing.
Also, how does the upload feature work? Does it import the iflow to the production tenant?
Is there any way to upload iflows to a GitHub repository other than manual download and upload?
Regards,
RJ
Hello Irina Kirilova, Mayur Belur Mohan, Linda Siebert. Please advise.
Pt. 1: I understand that behind the CI/CD service, SAP is calling Piper library steps.
Pt. 2: My assumption is that the input file for the Integration Test step needs to be maintained in the configured repository itself for the test to be successful.
Pt. 3: I understand we need to create multiple pipelines (jobs), one per iflow and per environment.
Example: If we have 1 iflow and need to execute the pipeline in 3 environments, we need to create 3 jobs (pipelines), one for each environment, as the service key (authentication) would be different for each tenant. Similarly, additional jobs are needed for additional iflows.
Pt. 4: My understanding is that there is no out-of-the-box feature provided as of today to push the tenant artifacts to the repository directly, and we need to upload/push any changes from the tenant into the repository manually.
Looking forward to your response. Thx.
Hi Rashmi and Jeswani,
I can't answer all of these questions but I can answer a few! 🙂
To pass the input file to the test you can use the "Message Body File Path" parameter in the Job Editor. You can put the path to your file there. This file needs to be in your source repository.
Yes, one job is needed per integration flow. And right now, we have to manually download the integration flow and upload it to GitHub; there are no separate tools for this.
Hope that helps!
Linda Siebert
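As noted above, there is no official tool for this step today. Purely as an illustration, a download could be scripted against the public SAP Cloud Integration OData API and the result committed to the repository; the host, artifact Id, and token below are placeholders (see the earlier sketches), and this is a sketch, not a supported replacement for the manual export.

```python
# Sketch only (not an official tool): download the design-time artifact of an
# integration flow via the public OData API so it can be committed to the Git
# repository instead of being exported manually from the Web UI.
import requests

HOST = "https://<tenant>.it-cpi0xx.cfapps.<region>.hana.ondemand.com"
ARTIFACT_ID = "MyIntegrationFlow"
ACCESS_TOKEN = "<token from the client-credentials call>"

resp = requests.get(
    f"{HOST}/api/v1/IntegrationDesigntimeArtifacts(Id='{ARTIFACT_ID}',Version='active')/$value",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

with open(f"{ARTIFACT_ID}.zip", "wb") as f:
    f.write(resp.content)
print(f"Saved {ARTIFACT_ID}.zip - commit and push it to your source repository.")
```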
Hello Linda Abell, Irina Kirilova, MANI RAJENDRAN: Could you please help here? I am trying to follow the same steps; however, I'm getting the below issue in the upload step:
Hello Ankush,
Please use the support channel to request official support for your issue.
Thanks and regards,
Irina
Check the service key type.
You may be using an "integration-flow" SK instead of an "api" SK.
Also check that the SK has the following roles:
"MonitoringArtifactsDeploy",
"WorkspaceArtifactLocksDelete",
"WorkspaceArtifactLocksRead",
"WorkspaceArtifactsDeploy",
"WorkspacePackagesConfigure",
"WorkspacePackagesEdit",
"WorkspacePackagesRead",
"WorkspacePackagesTransport",
"MonitoringDataRead"
Some of them may be unnecessary, but this set gets the pipeline working.
Best regards
Mikel