Technical Articles

The day my iFlows graduated from DevOps-High – adding automated Unit Testing for CI with Groovy

Dear community,

Last time we spoke about applying some nice DevOps practices to your iFlow developments on SAP CPI. We covered agile development with Azure Boards integration, linking code changes to user stories or issues, releasing the Groovy scripts to SAP CPI, and finally deploying the updated iFlow in your tenant automatically. You can find the corresponding blog post here.

However, the continuous integration process in the first blog was incomplete. I would like to complete the example today with automated unit testing using JUnit and the Spock framework, including code coverage. Once that is done, I can release my precious little iFlows into the wild to fend for themselves.

An evolution of testing your SAP CPI Groovy scripts

Most examples on this awesome CPI community space refer to testing your Groovy scripts manually on your local development machine. Below is an evolution of the matter as I perceived it.

All these options increase the quality of your SAP CPI development practices. However, they don’t allow you to go all in with DevOps for continuous integration and continuous deployment (CI/CD). I know you are eager to finally see some code and pictures, so let’s dive into it. 😊

The moving parts to make it happen

As I said in the beginning, unit testing is the missing piece in my example for a more complete continuous integration DevOps practice.


Fig.1 Integration project overview

The integration journey starts at the bottom right of Fig.1 with your IDE. In my case that is Eclipse, because I wanted to be able to debug my CPI Groovy scripts locally. Not all IDEs fully support Groovy for that. There are other examples on the community, for IntelliJ for instance.

  1. Once you do a Git pull request or push, you trigger the whole CI/CD chain. There are some keywords for Azure Boards and Jira Cloud to automatically update the status of the user story or issue to complete/done. In my example with Azure Boards this is “Fixes #AB414”.

Fig.2 Item status transition based on GitHub commit

  2. The GitHub integration triggers the update of the user story based on the request comment linking the ID, and starts the build process on Azure Pipelines.

Fig.3 Screenshot from GitHub request

  3. To be able to execute unit tests from Azure Pipelines runners, I configured my project with the build-management tool Gradle. Please note that I used specific versions that produce the desired results now; it might need some refactoring going forward. At this point it was most important to me to have repeatable results locally and on the Azure DevOps runner. Gradle developers would probably polish a couple of my setups 😉. The unit tests are contained in “src/test/groovy/”. The actual source of the iFlow and the Groovy scripts can be found under “src/main/groovy/”. That way I can keep using my Python script “Templates/download-package.py” to pull updates from the CPI tenant for changes made on the web UI, while keeping the ability to run unit tests at the same time.

Fig.4 Gradle project structure
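To give an idea of what such a test could look like, here is a minimal Spock specification sketch. The spec name `MyUtilsSpec` and the assumption that the shared script `myUtils.groovy` ends up as a class named `myUtils` on the test classpath are mine for illustration, not taken from the actual repository:

```groovy
// src/test/groovy/MyUtilsSpec.groovy – minimal Spock sketch (hypothetical names)
import spock.lang.Specification

class MyUtilsSpec extends Specification {

    def "addPrefix joins prefix, delimiter and value"() {
        expect:
        // myUtils is the class Groovy generates from myUtils.groovy
        myUtils.addPrefix('SAP', '-', 'CPI') == 'SAP-CPI'
    }

    def "getCurrentOdataTime wraps epoch millis in the OData date format"() {
        expect:
        myUtils.getCurrentOdataTime('1600000000000') == '/Date(1600000000000)/'
    }
}
```

Running `gradle test` executes these specs locally exactly as the Azure DevOps runner does later in the pipeline.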

  4. To make sure that a decent level of unit testing is applied by the integration developers, a code coverage scan is performed. Azure Pipelines has built-in support for JaCoCo and Cobertura; I configured Gradle to use JaCoCo. The code coverage threshold is 75%, meaning the build fails in case the unit tests don’t cover enough “ground” on my Groovy scripts. You can configure various flavours of coverage types, like “by line”, code blocks, or branches (e.g. created by if-conditions). In software engineering there are many metrics to measure the “completeness” of your testing. That quickly becomes a mathematical problem, or even a stochastic one, if you have many branches and conditions. With our boxed use cases in iFlow development it shouldn’t matter much whether you go for straightforward line or branch coverage, for instance. I went for “C0” coverage. Here is a nice post with a simple overview of the different magnitudes of testing completeness, describing the metrics from the software-testing discipline. My project contains examples for message headers, properties, and whole XML payloads.

Fig.5 Screenshot from Azure build pipe and code coverage settings

Fig.6 Screenshot from JaCoCo output for script1.groovy

Fig.7 Screenshot from unit-test run on eclipse
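For orientation, a 75% line-coverage gate with Gradle’s JaCoCo plugin can be sketched roughly like this. Treat it as an illustrative config fragment rather than the project’s actual build file, since the exact task names and plugin versions in my repository may differ:

```groovy
// build.gradle – sketch of a 75% line-coverage gate with the JaCoCo plugin
apply plugin: 'groovy'
apply plugin: 'jacoco'

jacocoTestCoverageVerification {
    violationRules {
        rule {
            limit {
                counter = 'LINE'
                value   = 'COVEREDRATIO'
                minimum = 0.75   // build fails below 75% line coverage
            }
        }
    }
}

// make "gradle check" enforce the gate after the tests ran
check.dependsOn jacocoTestCoverageVerification
```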

  5. Once the build completes, it passes the project structure on to the release pipeline automatically. We don’t have any compiled software parts, because we only need the Groovy scripts to update the code on SAP CPI. The release pipe performs the actual update for the continuous deployment part.

Fig.8 Screenshot from release pipe on Azure DevOps

  6. For steps 6-7, I integrated with SAP CPI’s API using a Python script to update the Groovy scripts of the targeted iFlows. After the update, the corresponding iFlow is automatically deployed. That way you get true CD 😊

Ok, we have established how to do proper continuous integration with unit testing and how it fits into the overall picture of DevOps practices. Looks great, doesn’t it? I agree, but there is a coding practice where the approach really starts to shine compared to local testing.

Sharing scripts across iFlows

Two of the well-established design principles of software engineering are “separation of concerns” and “reusability”. Often, with larger integration projects in CPI, there is a chance that you will need the same method to create or modify a message multiple times in different places. An example could be the need to create an OData timestamp.

I’d like to show you two options for re-using such a function in any iFlow on your tenant. With that, future enhancements or bug fixes need to happen only in one place.

First, you need a way to host the shared script.

Option 1: The mighty but unsupported utils iFlow

A great way to do that is a separate iFlow. That way you have only one instance of the script, and deployment is straightforward. To load the Groovy script from the utils iFlow during runtime, we need to rely on the findings from @Vadim Klimov. He investigated CPI and found that SAP is using OSGi as the service runtime.

Vadim also helped me with the snippet for the request:

import com.sap.gateway.ip.core.customdev.util.Message
import org.osgi.framework.Bundle
import org.osgi.framework.BundleContext
import org.osgi.framework.FrameworkUtil

String flowName = 'iFlowUtils'
String scriptName = 'myUtils.groovy'

// Get the bundle context and, from it, access the reusable iFlow bundle
BundleContext context = FrameworkUtil.getBundle(Message).bundleContext
Bundle utilsBundle = context.bundles.find { it.symbolicName == flowName }

// Within the bundle, access the reusable script and read its content
String customStringUtilsScriptContent = utilsBundle.getEntry("script/$scriptName").text

// Parse the script content and execute its function
// (prefix, delimiter and value come from the calling script's context)
Script customStringUtilsScript = new GroovyShell().parse(customStringUtilsScriptContent)
def result = customStringUtilsScript.addPrefix(prefix, delimiter, value)

The shared method looks like this:

package iFlowUtils.src.main.resources.script

static String addPrefix(String prefix, String delimiter, String value) {
    return "$prefix$delimiter$value"
}

// See https://blogs.sap.com/2017/01/05/date-and-time-in-sap-gateway-foundation/
static String getCurrentOdataTime(String timeMillis) {
    return "/Date($timeMillis)/"
}

Unfortunately, this approach is prone to breaking changes by SAP, as it relies on the current way SAP deploys the CPI runtime. In case SAP drops OSGi, for instance, it will stop working.

Option 2: Attached JAR files with a single build step

I am leveraging SAP’s iFlow feature to add custom libraries for your Groovy scripts as JAR files.

This means every iFlow that wants to call the “shared” method needs a copy of the JAR file.

Fig.9 Screenshot from SAP CPI web-ui iFlow resources

But how does this work from a single source file? Well, Gradle can create JAR files, and SAP CPI’s API offers not only the option to upload Groovy scripts but also JARs. With the JAR attached, you can simply import the class into your Groovy script.

Fig.10 Screenshot of gradle build script

This build step adds the JAR to the lib folder of the target iFlow “TriggerError”. That doesn’t matter much for the deployment, but it resembles the internal iFlow structure. My goal was that you could always zip the folder of an iFlow and be able to upload it immediately to SAP CPI if you want.
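As a sketch of what that build step could look like in Gradle (the task name and target path are illustrative, modelled on the folder structure described here, not copied from my build file):

```groovy
// build.gradle – sketch: package the shared script into a JAR and place it in
// the lib folder of the target iFlow (names and paths are illustrative)
task packSharedUtils(type: Jar) {
    archiveBaseName = 'iFlowUtils'
    from('src/main/groovy') {
        include 'myUtils*'
    }
    destinationDirectory = file('TriggerError/src/main/resources/lib')
}
```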

Fig.11 GitHub folder structure for iFlow sources

Fig.12 Screenshot from python-script to update files on CPI

Just alter the Python script to distribute the JAR file with the shared Groovy script to as many iFlows as you want. Cool, right? This way you have a single source but can still re-use the functions in multiple iFlows. One of the downsides is that you cannot look at the Groovy script inside the JAR file at runtime. On the upside, there is a lower risk of breaking changes by SAP, because this is a supported approach for adding libraries to your iFlows.

Thoughts on pipelining-strategy and iFlow development

In my example the CI/CD process updates all the mentioned iFlows using the Python script shown in Fig.12, even though you may have updated only one particular script. To become more flexible, you could do one of the following things:

Create a whole Azure DevOps project per iFlow

I believe this is a little overkill, because you won’t have different governance tasks for organizing the developer team or for the SCRUM process in Azure Boards.

Create a pipeline per iFlow with individual git repos

From a transparency perspective this is very clear and code-wise cleanly separated. It creates Git repo maintenance overhead, though.

Create a pipeline per iFlow with a single Git repo and path filters for build triggers

This approach gets the best of both options mentioned before. The setup is easy to understand, and you save the maintenance effort of multiple Git repos. The filter setup to run the trigger only for your target iFlow looks like this:
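In Azure Pipelines YAML, such a trigger with a path filter could look like the following sketch (branch and folder names are placeholders for your setup):

```yaml
# azure-pipelines.yml – run this pipeline only when the target iFlow changes
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - TriggerError/*
```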

Getting test messages for your unit-tests

A straightforward way to get CPI messages for testing your Groovy developments is the CPI trace feature. From there you can download the payload.

Fig.13 Screenshot of message trace view on CPI web-ui

In the spirit of this post on CI/CD automation, I wanted to use the CPI API to retrieve the last five messages for test runs. However, it seems that this part of the API is not yet implemented by SAP. I will investigate some more and let you know once the Python script to download the message traces becomes available 🙂

Fig.14 Screenshot of Postman for missing implementation of TraceMessages interface

Hints for replicating the environment

You will need to get the artifacts to enable debugging and unit-test execution for the Message class. You can find guidance on the structure in Cloud Foundry here:

GitHub Repos: https://github.com/MartinPankraz/SAPCPI-Az-DevOps

Azure DevOps project: https://dev.azure.com/mapankra/SAP-CPI-Integration-Az-DevOps

Final Words

Aaaaand done! A full-cycle DevOps process for your SAP CPI iFlows with Azure DevOps. We have agile planning support with Azure Boards (or Jira Cloud), continuous integration with unit testing (JUnit + Spock) and a code coverage quality gate. In addition, I showed how you can re-use Groovy script methods across multiple iFlows. In such a setup CI/CD becomes very valuable, because changes to the shared methods impact all dependent Groovy scripts in the other iFlows, and you need to update only one script for feature enhancements or bug fixes 😊

Ready to enhance your iFlow development practices? Are you mostly using the web UI? I’d like to hear from you how you ensure deployment quality.

#KUDOS to Vadim Klimov and Eng Swee Yeoh for their invaluable feedback and guidance on the tricky parts with OSGi and on this post in general.

As always feel free to leave or ask lots of follow-up questions.

 

Best Regards

Martin

4 Comments
  • Thank you for the mention Martin! It is a very interesting and detailed read. For the record, I love local development and working with bare text files without UI 🙂

    Regards,
    Fatih

  • Hi Martin, thanks for this material, I can tell there was a great effort to put it all together. Kudos!

    I have some doubts as maybe I got a little confused or didn’t fully understand the material, I hope you can give me your opinion because this topic interests me a lot.

    With this approach, what would happen if you make a change directly in CPI, for whatever reason? Your repository will become outdated, right? My doubt comes from the fact that scripts are only a part of the development in CPI; much development happens in the web UI shaping the iFlow.
    Please correct me here if I’m wrong, but I believe this approach would have some minor limitations, such as when the scripts you are developing access internal CPI components like security materials. Also, does this leave aside JavaScript? (although I don’t know anyone who uses it 😅). In that sense, wouldn’t it be better to deploy and execute the tests directly in CPI? Something similar to what the simulation does.

    Thanks again for such a great material.

     

    Ariel

    Btw… I appreciate you currently work for MS… but have you considered Jenkins?  😁

    • Hi Ariel Bravo Ayala,

      we can extend that discussion on a Teams call maybe. In general my approach focuses on all the scripts of an iFlow, which profit from actual DevOps practices. If you alter anything on the iFlow in the web UI, you can use my provided Python script to update your Git repo with the recent changes. Just make sure you committed any changes before, to avoid an override when the script executes.

      Given an external developer for CPI scripts, you could even revoke rights to update on the web UI and have them push changes only through the CD pipeline.

      For accessing CPI specific things like security materials you need mock services indeed to mimic that in your dev environment.

      JavaScript is not at all out of the picture; all of the concepts described in the post apply to it too. It was simply a matter of the broader distribution of Groovy in the integration space.

      The simulation option is nice, but it is not at all a dev environment. So it boils down to the design effort: is my integration case complex enough, or am I done with four lines of Groovy?

      In terms of Jenkins I have some demos too but I favour the managed solution in Azure DevOps or GitHub Actions compared to a VM solution with Jenkins. Generally speaking the same could be achieved with Jenkins.

      KR

      Martin

  • Thanks for bringing us down memory lane. I can’t believe it’s been three years since Vadim Klimov’s “bare it all” post trailblazed a new path for us in Groovy development.

     

    The great thing about this community is how we continually learn from and share with one another, and in that way we can “stand on the shoulders of giants”. It’s amazing how much has evolved over the past three years, and now that the foundations of Groovy development are in place, we can switch our focus to finding the holy grail of DevOps – CI/CD 🙂

     

    Thank you for showcasing to us how CI/CD is possible, a big kudos to the effort you have put into this 🙂 Keep it going.