Nuno Pereira

SAP CPI: CI/CD from zero to hero

The purpose of this blog series is to describe possible approaches to CI/CD for SAP Cloud Integration (aka CPI), addressing some of what I consider to be pitfalls or limitations. If you’re aware of SAP standard mechanisms that already deal with these, just let me know. Each of the topics below will be linked once the corresponding blog part is available. After each section I try to highlight the motivation and the value added by each feature/development done.

Just a disclaimer: every observation I make in this series is my personal opinion and is of course very debatable, so I encourage you to comment with alternatives or possible issues with the approaches I took.

When I first joined my current employer, we were just starting to use SAP CPI, so I had the chance to influence, propose and implement some ideas regarding CI/CD and also to establish some ground rules for future processes. Nowadays we have around 400 integration flows running, so there’s definitely some governance needed.

Some context information:

  • Our company uses a four-system landscape (dev, test, preprod and prod)
  • We use Cloud Foundry instances only (no Neo)

I’ll now present how we addressed the following topics:

Each of these areas will get an in-depth explanation with most of the steps needed to help you build a similar platform. When we started this, Project Piper was still in its early stages; from my understanding it would only run on Linux, and we had a Windows VM. There was always the option to use Docker, but using Docker completely free of charge would mean running it without Docker Desktop, which we assumed (perhaps wrongly) would be a big configuration effort. So although we use some of the same platforms as Piper (like Jenkins), we’re not using the official Project Piper implementation.

We started this initiative as a kind of PoC to play a bit with CI/CD and see how we could benefit from it, so we tried to stick to open source and zero cost whenever possible. Crucible came later to the game, and we decided the licensing cost was totally worth it, so we ordered it.

  • Jenkins – We can think of Jenkins as just a scheduler, a trigger for our automations
  • Gitea – Our on-premise source control repository. If you have no issue storing your code in the cloud, GitHub may be a better option for you since it supports GitHub Actions
  • Crucible – A tool from Atlassian that allows you to create and manage code reviews on your source code

Special thanks to Antonio Vaz, who contributed to this solution in most of these topics.

Backup Binaries and Source Code

Cloud Integration is a great tool that allows you to create integrations using a simple UI. It has the concept of versions, but those versions are freely created by developers using whatever naming convention they like. While this is great in terms of flexibility, it is not so great in terms of consistency, since each developer will follow their own versioning convention. We decided to use the semver model, but even then there is nothing stopping a developer from creating the same semver version all over again, or even worse, creating outdated semver versions, which ends up in a big mess. On top of that, all of your code is only “saved” on SAP servers, so if you delete your Integration Suite tenant by mistake, there’s no way for you to recover any of your work (trust me, I’ve been through that…).

Our landscape currently has 4 environments (DEV, TEST, PREPROD and PROD).

Ever since my early days at the company, I wanted to have backups of our binaries (packages) as well as of the iflow source code. We asked for a new on-premise server (Windows) and installed on it:

  • Jenkins
  • Gitea
  • Crucible


Main general architecture

Binaries Backup

The idea was to have a pipeline scheduled in Jenkins that synchronizes on a daily basis (per environment). The code uses the CPI API to retrieve all package binaries and stores them in git. What we realized then was that if a package has iflows in Draft status, you can’t download the full package, so we instead download every binary possible within the package (at the time we developed this, only value mappings and integration flows were supported).
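To make this concrete, here is a minimal sketch of that download logic in Groovy, as it could run inside a Jenkins script block. The host and credentials are placeholders, and OAuth/token handling, value mappings and error handling are omitted; this is not our actual implementation.

import groovy.json.JsonSlurper

def host = "https://my-tenant.it-cpi001.cfapps.eu10.hana.ondemand.com"   // placeholder
def auth = "user:password".bytes.encodeBase64().toString()               // placeholder

def open = { String path ->
    def conn = new URL(host + path).openConnection()
    conn.setRequestProperty("Authorization", "Basic " + auth)
    return conn
}
def getJson = { String path ->
    def conn = open(path)
    conn.setRequestProperty("Accept", "application/json")
    new JsonSlurper().parse(conn.inputStream)
}

// 1. List all integration packages
getJson("/api/v1/IntegrationPackages").d.results.each { pkg ->
    def pkgDir = new File("IntegrationPackages/${pkg.Id}")
    pkgDir.mkdirs()
    // 2. List the iflows of the package and download each one as a zip ($value)
    getJson("/api/v1/IntegrationPackages('${pkg.Id}')/IntegrationDesigntimeArtifacts").d.results.each { art ->
        def zip = open("/api/v1/IntegrationDesigntimeArtifacts(Id='${art.Id}',Version='${art.Version}')/\$value")
        new File(pkgDir, "${art.Id}.zip").bytes = zip.inputStream.bytes
    }
}
// The resulting folder structure is then committed and pushed to the git repository;
// value mappings are downloaded in a similar way.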


Package binary file stored on git as a zip file


Iflows and value mappings zip files inside the package

As the end result, all our CPI binaries are stored as zip files inside git. The advantage is that if you ever need to restore, it is quite easy to import the zip files again. The disadvantage is that, for source control itself and for tracking the history of changes, binary files are not optimal for analysis. Therefore we thought it would be interesting to have not only a single backup pipeline for everything, but also a pipeline per CPI package running package-specific checks and logic.

The next step was about security: we wanted to be able to version the keystore and security materials, so using the CPI API we downloaded all the certificates from all environments and also synchronized all our security materials with a KeePass file. We then “gitted” all of these into our binaries repository, per environment.
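As an illustration, here is a minimal sketch of listing the keystore entries and saving the result for versioning. The resource path, authentication and target folder are assumptions made for the example and may differ on your tenant and API version.

import groovy.json.JsonOutput
import groovy.json.JsonSlurper

def host = "https://my-tenant.it-cpi001.cfapps.eu10.hana.ondemand.com"   // placeholder
def auth = "user:password".bytes.encodeBase64().toString()               // placeholder

// List the keystore entries and store the JSON listing in the Keystore folder
def conn = new URL(host + "/api/v1/KeystoreEntries").openConnection()
conn.setRequestProperty("Authorization", "Basic " + auth)
conn.setRequestProperty("Accept", "application/json")

def entries = new JsonSlurper().parse(conn.inputStream)
new File("Keystore").mkdirs()
new File("Keystore/keystore_entries.json").text = JsonOutput.prettyPrint(JsonOutput.toJson(entries))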

Some technical details

In order to save some disk space, we only keep the latest 5 builds. We run the job daily at 2 AM. For confidentiality reasons I changed our internal URLs into dummy ones, but you get the idea.
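In declarative pipeline terms, that retention and schedule can be expressed like this (a generic Jenkins snippet, not a copy of our file):

pipeline {
	agent any
	options {
		// keep only the latest 5 builds to save disk space
		buildDiscarder(logRotator(numToKeepStr: '5'))
	}
	triggers {
		// run every day at 2 AM
		cron('0 2 * * *')
	}
	stages {
		stage('Backup') {
			steps {
				echo 'backup logic goes here'
			}
		}
	}
}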

Jenkins pipeline details

We’re basically instructing Jenkins to execute the file admin_SAPCPIBackup_DEV, which is located in a Jenkins repository stored in git. This file defines a coded pipeline with multiple stages (only the beginning of the file is shown below).

import groovy.json.JsonSlurper
def GitBranch  = "master"
def GitComment = "Backup"
def GitFolder  = "IntegrationPackages"
def CertificatesFolder = "Keystore"
def packageResponsible = "testdummy@domain.com"

pipeline {
	agent any

	options {
		skipDefaultCheckout()
	}

	stages {
		stage('Download integration artefacts and store it in git for CPI DEV') {
			steps {
				script {
					// ... the actual download of packages, iflows, value mappings and
					// keystore entries via the CPI API, plus the git commit, goes here
				}
			}
		}
	}
}
We based our code on the following basic recipe from Axel Albrecht (@axel.albrecht).

Link to his complete blog post here.

Value added

Easy restore in case we lose an instance

Synchronization of packages to pipelines

The next logical step was to find a way to automatically create a Jenkins pipeline for each package and have them all execute similar checks. Each of our pipelines would reference a git repository and its respective Jenkinsfile containing the logic to run for that package.

So the idea of this job was to fetch the list of all CPI packages and check whether a new Jenkinsfile needed to be created in our git, together with a Jenkins pipeline named after the CPI package.
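A simplified sketch of that sync logic (Groovy, as it could run inside a Jenkins script block). The Jenkins URL, credentials, CSRF handling and config.xml template are placeholder assumptions, not our actual job.

import groovy.json.JsonSlurper

def cpiHost     = "https://my-tenant.it-cpi001.cfapps.eu10.hana.ondemand.com"   // placeholder
def jenkinsHost = "https://jenkins.internal.local"                              // placeholder
def auth        = "user:password".bytes.encodeBase64().toString()               // placeholder

// 1. Fetch the list of all CPI packages
def conn = new URL(cpiHost + "/api/v1/IntegrationPackages").openConnection()
conn.setRequestProperty("Authorization", "Basic " + auth)
conn.setRequestProperty("Accept", "application/json")
def packages = new JsonSlurper().parse(conn.inputStream).d.results

packages.each { pkg ->
    // 2. Check whether a Jenkins job with the package name already exists
    def check = new URL("${jenkinsHost}/job/${pkg.Id}/api/json").openConnection()
    check.setRequestProperty("Authorization", "Basic " + auth)
    if (check.responseCode == 404) {
        // 3. Create the pipeline job from a config.xml template that points to the
        //    package-specific Jenkinsfile stored in git (CSRF crumb handling omitted)
        def configXml = new File("templates/pipeline-config.xml").text
                            .replace("@@PACKAGE_ID@@", pkg.Id)
        def create = new URL("${jenkinsHost}/createItem?name=${pkg.Id}").openConnection()
        create.setRequestMethod("POST")
        create.setRequestProperty("Authorization", "Basic " + auth)
        create.setRequestProperty("Content-Type", "application/xml")
        create.doOutput = true
        create.outputStream.withWriter("UTF-8") { it << configXml }
        println "Created pipeline for package ${pkg.Id}: HTTP ${create.responseCode}"
    }
}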


Sync all packages on Cloud Integration creating a pipeline per package

<Package specific pipeline>


Pipeline configuration retrieving the code to execute from a git repository

All our generated Jenkinsfiles followed the same structure:

  • Backing up the source code of that particular CPI package into a git repository. If no git repository existed yet, a new one was created using the Gitea API
  • Creating a Crucible repository, connecting it to the Gitea git repository, and creating a Crucible project and a Crucible code review to make sure we keep track of unreviewed files
  • Crawling through the source iflows looking for message mappings. When found, they are submitted to an extraction tool that generates an HTML report with syntax coloring for the mappings done there (this serves documentation purposes)
  • Automatic markdown generation for the git repository, containing all package information, all iflows in it, a description per iflow, a screenshot of each iflow, as well as the message mapping documentation table
  • Running CPILint (discussed in detail later)
  • Automatically running unit tests for the regular Groovy scripts of your iflows, XSpec for XSLT unit testing, as well as unit tests for message mappings (discussed in detail later)
  • Sending email notifications to the package responsible in case any issues are found in any of the steps above

The Jenkinsfiles were generated from a template containing placeholders that are replaced while the synchronization job runs. If the template changes because we want to introduce new functionality, all we need to do is delete all the Jenkinsfiles from git and this job regenerates them and commits them to git again.
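For illustration, this is roughly how such a placeholder replacement can look; the placeholder names, file paths and sample values are made up for the example, not our actual template.

// Sample values for illustration; in the sync job these come from the CPI package list
def packageId = "MyCpiPackage"
def packageResponsible = "testdummy@domain.com"

// Read the template from git and replace the placeholders for this package
def template = new File("templates/Jenkinsfile.template").text
def jenkinsfile = template
        .replace("@@PACKAGE_ID@@", packageId)
        .replace("@@PACKAGE_RESPONSIBLE@@", packageResponsible)

// The generated Jenkinsfile is then committed back to git under the package name
new File("jenkinsfiles/${packageId}.Jenkinsfile").text = jenkinsfile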

Value added

If a developer starts working on a new CPI package, without them even noticing there is a job collecting the source code and binaries of the new package and storing them in our git repository. If, for instance, the source code does not follow our development guidelines, the package responsible is notified to check it.

Package Responsible

You may have noticed that for each package we identify a package responsible. This is needed because we want to have the concept of ownership and responsibility. Ultimately it is the whole team’s goal to make sure all packages are OK, but having one responsible per package helps a lot.

The package responsible is the person who receives email notifications in case the package isn’t built successfully. How do we calculate it?

We get the most recent date from the following evaluations:

  • Last modified date on the package level
  • Last deploy made on one of the artifacts of the package (iflow, value mappings, …)
  • Package creation date

Each of the dates above has a user associated with it, and we consider the user behind the most recent date to be the package responsible. We also implemented a delegation table keyed by a package regular expression, with begin and end dates, so that we could accommodate absences of the package responsible.
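A simplified sketch of that logic in Groovy, with hypothetical sample values; the real dates and users come from the CPI OData API and from our delegation table.

// Candidate (date, user) pairs: package last modified, last artifact deployment, package creation
def candidates = [
    [date: Date.parse('yyyy-MM-dd', '2023-05-02'), user: 'dev1@domain.com'],
    [date: Date.parse('yyyy-MM-dd', '2023-05-10'), user: 'dev2@domain.com'],
    [date: Date.parse('yyyy-MM-dd', '2023-01-15'), user: 'dev3@domain.com']
]
def responsible = candidates.max { it.date }.user   // the most recent date wins

// Delegation table keyed by a package-name regex, valid between begin and end dates
def delegations = [
    [pattern: ~/HR_.*/,
     from: Date.parse('yyyy-MM-dd', '2023-05-01'),
     to:   Date.parse('yyyy-MM-dd', '2023-05-31'),
     delegate: 'backup.colleague@domain.com']
]
def packageName = 'HR_EmployeeReplication'
def today = new Date()
delegations.each { d ->
    if (packageName ==~ d.pattern && today >= d.from && today <= d.to) {
        responsible = d.delegate   // delegation is active for this package and period
    }
}
println responsible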

Another failsafe mechanism we added: if a package build fails for 10 consecutive days, we send the email not only to the package responsible but also cc the whole team distribution list, so that someone can act on it.
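A minimal sketch of how such a check can be done inside the pipeline (for example in a post/failure block), assuming the job runs once per day; currentBuild.previousBuild is the standard Jenkins run chain, and the recipient addresses are placeholders.

// The current build has just failed, so start counting at 1
def consecutiveFailures = 1
def build = currentBuild.previousBuild
while (build != null && build.result != 'SUCCESS') {
    consecutiveFailures++
    build = build.previousBuild
}

def recipients = packageResponsible   // defined at the top of the Jenkinsfile
if (consecutiveFailures >= 10) {
    recipients += ", team-dl@domain.com"   // escalate to the whole team DL
}
emailext(to: recipients, subject: "CPI package build failed", body: "Please check the pipeline logs.")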

Value added

We always have a main contact per package. This may not be the person most familiar with the package, but at least it’s the person who last interacted with it.

 

Summary

In this first part, we introduced the CI/CD tools we use, highlighted the importance of backups and how to get them, which quality measures we enforce on all CPI packages, what is currently being documented, and how we calculate a package responsible. In the next part we’ll discuss in more detail what we did in regards to quality control.

I invite you to share feedback or thoughts in the comments section. You can always get more information about Cloud Integration on the topic page for the product.


      17 Comments
      Pedro Aires

      Nice article. Please continue.

      Nuno Pereira
      Blog Post Author

      Thanks Pedro

      Rob Hofman

      Very cool approach! Looking forward to the next chapters!

      Nuno Pereira
      Blog Post Author

      Thanks Rob, the next one is ready and should be published soon

      Morten Wittrock

      Psyched to see that you are using CPILint in your pipeline - with some cool extensions even!

      Nuno Pereira
      Blog Post Author

      When you have structured and well-written code to learn from and extend, it is quite straightforward to implement extensions for my needs, so kudos to you for your excellent job developing the tool. The next article is precisely on CPILint and was just submitted for review, so it should be available in a few days. I hope you read that one as well and provide your valuable feedback.

      Morten Wittrock

      I shall look forward to it!

      Naresh Dasika

      Hello Nuno Pereira,

      Nice blog. Very Informative.

      I don't see the CI/CD service available in my BTP login to activate/enable.

      Is it not a free service from SAP?

      Below screenshot from SAP BTP:

       

      Regards,

      Naresh

      Nuno Pereira
      Blog Post Author

      Hi Naresh,

      We're not using the CI/CD BTP service, but if you're starting with it at your company, the service provides a straightforward alternative where you just configure the connections to your git, and you don't need to code any of the pipelines or host them on your own servers.

      If you want to setup that standard service I strongly advise you to follow the step by step process described here https://developers.sap.com/tutorials/btp-app-ci-cd-btp.html

      Thanks,

      Nuno

      Tom Bendrath

      Hi Nuno,

      very well-structured blog. I also loved seeing the "integration" of CI/CD pipeline with SAP Cloud Integration artifacts!

      To complement Naresh's question, I would like to point out that there is even a pipeline template for Integration Suite artifacts in the SAP BTP Continuous Integration & Delivery Service. Irina Kirilova has a great blog on this as well. In addition to the configuration steps, which are kept pretty simple via the "Job Editor", the pipeline can also be adapted to a greater extent in the BTP service if the pipeline configuration is made in the source repository.

      Also, I would like to ask you if you have an updated Jenkins pipeline configuration that already takes into account the Integration Suite-related steps available in the Piper Library as outlined by Mayur?

      Thanks,
      Tom

      Nuno Pereira
      Blog Post Author

      Hi Tom,

      Thanks for bringing up extra good reading for the community. As I mentioned in the blog, we're not using anything from the Piper library. I have to check the migration effort and the advantages of using it, because right now, with the approach we have, the pipelines are totally self-sustained, meaning that when you have a new CPI package, you don't need any manual step (such as creating a git repo and uploading the package there, or creating webhooks to connect it to the CI/CD service) to have your new package synchronized and covered by the checks. Also, at this point in time I haven't investigated whether GitHub Actions is the only supported way for Piper or just the most comfortable/chosen way to do it. The reason I'm stating this is that there are several git implementations (GitHub, GitLab, Bitbucket, Gitea to name a few) and each one seems to follow its own CI/CD approach, so if you rely too much on the inner tools of your git implementation you're likely to become dependent on it, but that's just my view. Can you share your thoughts on that?

       

      Thanks

      Nuno

       

      Tom Bendrath

      Hi Nuno,

      Thank you for your detailed answer. You have highlighted a very important added value of your approach, which I would like to emphasise again. You call it "self sustained", with the description that no manual steps are necessary to export the SAP Cloud Integration (formerly CPI) artifacts first and upload them to any Git repository. From my perspective, this is extremely practical in the IT administrator's workflow and saves a significant amount of time. The pipeline template I referenced from the SAP BTP Continuous Integration & Delivery Service unfortunately requires a manual export of SAP Cloud Integration artifacts and subsequent manual pushing into a Git repository.

      The interested reader is usually faced with the comparison between the simplicity of the technical setup via the SAP BTP CI/CD service vs. the more functionalities of a custom Jenkins approach, which in turn requires more effort for the initial setup and maintainability.

      Regarding your question about the possible uses of the Piper library, I would like to add that the SAP BTP CI/CD service is based on the Piper library from a technical perspective and that Project Piper is therefore a very fundamental component of SAP's DevOps strategy. The possibility you mentioned of integrating the Piper library steps in a GitHub action is a great thing for customers who are looking for a CI setup that can be set up quickly and is of manageable complexity similar to the SAP BTP CI/CD service. I am happy to refer you to the What you Get section on the Project Piper landing page for more details on the different facets of Project Piper.

      SAP is also aware that there are various tools and different preferences on the market. In order to mitigate the dependency on one provider, Docker images have existed as part of Project Piper for several years. However, these should only be used if the other options have functional gaps. To underline this platform independence, the Project Piper library steps were implemented in Go, which allows them to be used not only in a Jenkins CI/CD server but also in Azure DevOps pipelines, for example.

      Now I really hope that I could make Project Piper more appealing to you and that you might consider it in future pipelines.

      Best regards,
      Tom

      Nuno Pereira
      Blog Post Author

      Hi Tom,

      Many thanks for such good input, this was exactly what I was looking for when I started to write this. I would definitely do a PoC on Piper on a personal level, and if I see it's very easy to migrate, I would do it on what we have for the sake of standardization and perhaps leverage any existing standard pipeline. To be honest, in order to fully fulfil customer needs I think native git integration embedded into Integration Suite is a must. Having either to manually upload artifacts into git or to have a scheduled job on a daily basis, as we do, sounds more like a workaround than a definitive solution for such a professional and successful product as Integration Suite. Even BAS or Web IDE bring native git integration as developer tools.

      I still see value in the CI/CD BTP service as the simplest way to get these builds running with little effort, knowledge and infrastructure needs. But if you have 200 packages, having to create all these git repos by hand doesn't seem like an optimal solution for such cases.

      Regarding piper, I see that a merge between the standard pipelines for integration and some of the ideas we implemented would be the right compromise for us and to the community.

      From what I've seen, the standard Piper Jenkins shared libraries for Cloud Integration are more like a facade over the Cloud Integration OData APIs. Since our shared libraries do more steps at once (for instance, our binaries backup shared library downloads both packages and individual artifacts in one go, and our documentation shared library takes care of message mapping documentation as well as markdown generation), we would have to break them into individually reusable pieces to make them consistent with the approach you're following with Piper.

      If time allows I would definitely try piper to have both visions and take the best of both worlds.

      Once again, many thanks for your feedback, it's really nice to exchange ideas on these topics.

      Nuno

       

      Nuno Pereira
      Blog Post Author

      Hi Tom Bendrath ,

      as a follow-up on this discussion, I created this blog with my journey on installing Piper on Kyma. At this time, after understanding how Piper works, I still don't see a big potential for us. I mean, if you're starting from scratch, having a shared library with most of the steps already available would be a big advantage and definitely worth it. When you already have custom shared libraries doing what you need, the migration would be more a standardization effort than real added value on top of what we had.

      Also, I'm thinking that we'll always have specific processes with little potential for reuse (usage of Crucible or Gitea), so most likely, sooner or later, we would have to deviate from the standard shared libraries anyway, or pollute them via pull requests with these uncommon scenarios.

      Anyway, I just wanted to thank you for your comments; they were so valuable that I wanted to try out Piper and share my experience with others.

      Nuno Pereira

      Martin Pankraz

      Interesting effort Nuno Pereira! Liked your thoughts on Project Piper.

      The community profits from referencing existing publications and prototypes, so we can build on and learn from each other. See here additional contributions with a focus on Groovy scripts in CPI.

      KR

      Martin

      Nuno Pereira
      Blog Post Author

      Hi Martin,

      Great blog you linked. I also took inspiration from Vadim Klimov's posts to get access to the JARs on our container, so that we can run our JUnit tests offline and also code them with content assist in Eclipse, having the JARs as dependencies on the classpath.

      Interesting that you've used Azure DevOps, we also have Azure so that would have been an interesting approach as well.

      Also, regarding reuse of shared script code: although script collections are nowadays visible across packages, last time I checked you can only link them via the UI, which doesn't allow you to call them from another local script, for instance. I still have Groovy code that allows you to call any function from any script collection from another script in your local project, so if you're interested just let me know.

      Thanks,

      Nuno

      Martin Pankraz

      Always worth linking Vadim's materials 🙂