Nuno Pereira

SAP CPI: CI/CD from zero to hero – Automated Testing

This blog is part of a blog series; you can find the first post here. This is the agenda we're following:

Automated Testing

Coming from a Java development background, automated testing was always part of development for me. I was very surprised to realize that there are no standard tools for automated testing of your iflows. In the scope of integration, I'm sure there are many types of tests you could run, but I can immediately think of:

  1. E2E testing of your integration using mocked scenarios
  2. Performance Tests
  3. Unit testing (the main focus of this post)

E2E Testing

General Tools

Regarding E2E testing, there are several general testing products (e.g. Tricentis Tosca, HP ALM) that do a good job of letting you supply your interfaces with predetermined data and assert some outputs from your API. These tools work perfectly fine if your interface is isolated, with no dependencies on external adapters, meaning totally under your control. However, if your interface uses an external API inside of it and you need to mock it in your test, you run into a problem, since these products can't interfere with the logic inside your iflows by mocking the adapters you're using.

Figaf Testing Tool

While doing some investigation, I found the Figaf Testing tool for CPI, which seems to be an awesome alternative that allows you to mock your adapters. If I reflect on it, this is likely a copy of your original iflow with the adapter endpoints replaced by endpoints to a fixed, controlled URL where you can inject a mocked body simulating a response from that adapter, but I hope someone from Figaf (Daniel Graversen) can comment on that. It has a UI5 cockpit that lets you manage your test cases and record them by leveraging trace mode on your iflow. This looks very interesting, as you don't need to manually create your test case data. You can then run the tests later and assert for each step that no regressions were introduced. More details on the following link. I never tried it, but it looks promising. I invite Daniel Graversen or anyone else to provide more details about it in the comments.

INT4 Shield

Int4 also has a similar SAP-certified solution, Int4 Shield (part of their Int4 Suite), to test CPI packages, but as part of a broader product that also lets you test SAP PI/PO as well as the integration with the backend systems (ESR proxies, IDocs, etc.). It also has eCATT, Tricentis, Worksoft integrations and so on. The cockpit is installed and configured on the SAP S/4 backend (as an SAP-certified add-on) and, in the end, it also leverages trace mode to capture payloads between the start and end points configured on your automation object. You can then run the test cases from the cockpit and evaluate the results there. More details in week 3 unit 5 of this openSAP course. I also invite @Michal.Krawczyk, @Andrzej.Halicki or anyone else that tried the tool to provide more details about it in the comments.

We're currently evaluating Tosca, since we already have licenses that were acquired in the scope of other products, but we haven't focused on E2E testing so far in the Integration team.

Performance Tests

I have no experience with performance testing specifically for integration. I've used HP LoadRunner and JMeter in the past for general web testing, and I think they are suitable to test/simulate multiple requests at the same time. So far, we haven't had the chance to do performance tests on our interfaces. If you have done something in this area, please share your insights in the comments.

Unit Testing – Groovy and XSpec

I found nothing on this topic, but it was the one I thought could bring the most value to the team, since it's fully controlled by the developers: with unit testing you shouldn't have false positives or any external dependencies. Our per-package pipeline was already doing some checks (like cpilint checks), so it seemed like the perfect pipeline to enhance with automated unit testing as well. More details about the steps below. One key aspect I wanted to guarantee is that the unit tests reside inside the iflow artifacts. If you move/copy/deploy/git your iflows, it makes perfect sense that your tests go with them as a unit. Below you can find the diagram of the steps we follow in our pipeline.


Unit testing steps

1. Extract and 4. Execute groovy unit tests

The pipeline starts by downloading all your groovy files. For unit tests, we decided to have dedicated groovy test classes, and as a naming convention we use the UnitTest.groovy suffix, e.g. for a StringUtils.groovy you could have a StringUtilsUnitTest.groovy file. This was only imposed so that we can clearly see which class is testing what; from a technical perspective it is not mandatory, since JUnit is clever enough to go through a path and execute all classes with methods annotated with @Test.
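To illustrate that discovery mechanism, here is a minimal plain-Java sketch of what JUnit essentially does: scan a class for methods annotated with @Test and invoke each one. The annotation and class names below are stand-ins for illustration, not our actual pipeline code.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

class TestRunnerSketch {

    // Minimal stand-in for org.junit.Test, only to illustrate the discovery mechanism
    @Retention(RetentionPolicy.RUNTIME)
    @interface Test {}

    // A hypothetical compiled test class, mirroring the StringUtilsUnitTest.groovy convention
    static class StringUtilsUnitTest {
        @Test
        public void testTrim() { /* assertions would go here */ }
        public void helper() { /* not annotated, so never executed as a test */ }
    }

    // Invoke every method annotated with @Test, which is essentially what JUnit
    // does for each class it finds on the scanned path
    static List<String> runTests(Class<?> testClass) throws Exception {
        List<String> executed = new ArrayList<>();
        Object instance = testClass.getDeclaredConstructor().newInstance();
        for (Method m : testClass.getDeclaredMethods()) {
            if (m.isAnnotationPresent(Test.class)) {
                m.invoke(instance);
                executed.add(m.getName());
            }
        }
        return executed;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runTests(StringUtilsUnitTest.class)); // prints [testTrim]
    }
}
```

This is why the UnitTest.groovy suffix is just a readability convention: the runner keys on the annotation, not on the file name.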

Value added

Allows you to add JUnit tests for your groovy classes, making sure they are executed upon pipeline builds. If you have a mapping that relies on a groovy script, you can unit test it and make sure it returns the expected values.

2. Extract and 3. Execute XSpec

A colleague mentioned that it would be pretty cool to also be able to execute unit tests for XSLT, since we also use it in our mappings. After some investigation, I found XSpec and incorporated it into our pipeline. Unfortunately, you cannot add arbitrary files to your iflow via the Cloud Integration UI (only some extensions are allowed), so I had to resort to the workaround of looking for files ending in either *.xspec or *.xspec.xsl, so that you can add them via the UI and they are stored in your project.
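The file matching for that workaround can be sketched like this (plain Java). The canonicalName renaming step is my assumption about how the .xsl suffix would be undone before handing the file to the XSpec runner:

```java
class XspecResourceFilter {

    // Accept both the native extension and the ".xspec.xsl" workaround name
    // that the Cloud Integration UI allows to be uploaded
    static boolean isXspecResource(String name) {
        return name.endsWith(".xspec") || name.endsWith(".xspec.xsl");
    }

    // Assumption: the pipeline restores the canonical extension before
    // handing the file to the XSpec runner
    static String canonicalName(String name) {
        return name.endsWith(".xspec.xsl")
                ? name.substring(0, name.length() - ".xsl".length())
                : name;
    }
}
```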

We execute all XSpec unit tests inside the iflows from the package we're evaluating and report the results via JUnit reporting on Jenkins.

Value added

Allows you to unit test your XSLTs and report failures in your pipeline per project on Jenkins.

5. Generate jenkins reports

If any test fails, we report it in the Jenkins reports and the build fails.

Value added

Allows you to track your unit test executions, see how many tests were executed and how many failed, and lets you drill down into the details from the Jenkins UI.
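For reference, the Jenkins junit step consumes standard JUnit-format XML result files, roughly of this shape (class and test names here are illustrative):

```xml
<testsuite name="StringUtilsUnitTest" tests="2" failures="1">
  <testcase classname="StringUtilsUnitTest" name="testFormatNullableNumberNotNull"/>
  <testcase classname="StringUtilsUnitTest" name="testFormatNullableNumberNull">
    <failure message="expected:&lt;0.000&gt; but was:&lt;null&gt;"/>
  </testcase>
</testsuite>
```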

6. Email notifications for the responsible developer

If any test fails, we also report it via email to the responsible developer, together with the details in the email body.

Unit Testing – Message Mappings

In our integrations, message mapping is our preferred mapping mechanism. It allows you to map from a graphical UI editor, and it is a mature technology that, AFAIK, dates back to the PI development days. Moreover, it allows you to interact with custom groovy scripts for more sophisticated mappings, or with standard functions for the most common scenarios. Your interface can fail at many points, but mapping is definitely one of the areas where we can have more regressions. Therefore, we had to think about testing the message mappings.

1st approach – Use offline testing

With the help of the blogs from Vadim Klimov, I downloaded the Karaf libraries from our tenant. You can download the files from your tenant either using a groovy script, as Vadim shows in his blogs, or via a reverse shell that connects your local PC to your tenant through netcat. The idea was to identify the SAP jar in your tenant container responsible for executing a message mapping and invoke it via a custom program. This idea was great, since you could invoke it offline without needing CPI; on the other hand, it was quite dangerous, since this is not an official approach from SAP, so SAP could change the jar or the approach later on and your unit tests would stop working. Also, I believe Java code is generated for every mapping you design, and this Java generator class is not part of our tenant's list of jars, but if someone has investigated that, please let me know in the comments.

2nd approach – Use Cloud Integration runtime to test your message mappings

The idea was to have a dedicated iflow deployed containing only your message mappings, as well as all their dependencies, and to execute this dedicated iflow to assert test results. Here's how it works:


Message mapping unit tests steps

1. Extract message mappings

The package pipeline loops over each iflow, extracting the list of all the message mappings contained inside it, as well as all the resources that are potentially used by them.
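Since an iflow artifact is distributed as a zip archive, the extraction step can be sketched as below (plain Java). The src/main/resources/mapping/ path and the .mmap extension follow the usual iflow project layout, but treat both as assumptions rather than a documented contract:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

class MappingExtractor {

    // Scan an iflow artifact (a zip archive) for message mapping resources
    static List<String> listMessageMappings(InputStream iflowZip) throws IOException {
        List<String> mappings = new ArrayList<>();
        try (ZipInputStream zip = new ZipInputStream(iflowZip)) {
            for (ZipEntry e; (e = zip.getNextEntry()) != null; ) {
                String name = e.getName();
                if (name.startsWith("src/main/resources/mapping/") && name.endsWith(".mmap")) {
                    mappings.add(name);
                }
            }
        }
        return mappings;
    }
}
```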

2. Deploy dynamic iflow

We have a template iflow containing only a message mapping example (which is replaced upon pipeline execution) and a step to translate headers back into properties.


Dynamic iflow

The idea is to replace the message mappings and dependencies, invoke this dynamically created iflow with the headers, properties and body, and get a result from it. Since HTTP doesn't support the concept of "properties", we need to translate them from the supplied headers when doing the invocation. The name of the generated iflow contains a prefix, the name of the original iflow and the name of the message mapping, so that we guarantee there are no name clashes.
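A sketch of both conventions (plain Java). The property_ header prefix and the underscore-separated name scheme are illustrative assumptions, not the exact conventions of our pipeline:

```java
import java.util.LinkedHashMap;
import java.util.Map;

class DynamicIflowConventions {

    // Hypothetical convention: headers prefixed "property_" are turned back
    // into exchange properties by the first step of the dynamic iflow
    static Map<String, String> extractProperties(Map<String, String> headers) {
        Map<String, String> properties = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : headers.entrySet()) {
            if (e.getKey().startsWith("property_")) {
                properties.put(e.getKey().substring("property_".length()), e.getValue());
            }
        }
        return properties;
    }

    // Clash-free artifact name: prefix + original iflow + mapping name
    // (the underscore separator is illustrative)
    static String generatedIflowName(String prefix, String iflow, String mapping) {
        return prefix + "_" + iflow + "_" + mapping;
    }
}
```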

3. Change endpoint

Again, to avoid name clashes, we change the endpoint to a unique name in the iflow design object. After this, we deploy the iflow.

4. Call the dynamically generated endpoint

We call the dynamically generated endpoint with the properties, headers and body supplied by the developer in the groovy unit test.
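A hedged sketch of building such a call with Java's built-in HTTP client, again assuming a hypothetical property_ header prefix for passing properties over HTTP:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.Map;

class MappingTestCall {

    // Build the POST request: the test body becomes the payload, headers are
    // passed as-is, and properties are passed as "property_"-prefixed headers
    // so the dynamic iflow can translate them back (the prefix is illustrative)
    static HttpRequest buildRequest(String endpoint, String body,
                                    Map<String, String> headers, Map<String, String> properties) {
        HttpRequest.Builder b = HttpRequest.newBuilder(URI.create(endpoint))
                .POST(HttpRequest.BodyPublishers.ofString(body));
        headers.forEach(b::header);
        properties.forEach((k, v) -> b.header("property_" + k, v));
        return b.build();
    }
}
```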

5. and 6. Evaluate results

We use XMLUnit to make sure the XML results are not jeopardized by whitespace, and we finally compare the expected XML with the actual XML result. A limitation here is that we don't support JSON in your message mappings; we only support XML so far.
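XMLUnit handles this out of the box; purely to illustrate the whitespace-normalization idea, here is a stdlib-only sketch that strips whitespace-only text nodes before comparing two documents:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

class XmlCompareSketch {

    static Document parse(String xml) throws Exception {
        Document d = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        stripWhitespace(d.getDocumentElement());
        return d;
    }

    // Remove text nodes that contain only whitespace, so that indentation and
    // line breaks do not influence the comparison
    static void stripWhitespace(Node node) {
        NodeList children = node.getChildNodes();
        for (int i = children.getLength() - 1; i >= 0; i--) {
            Node child = children.item(i);
            if (child.getNodeType() == Node.TEXT_NODE && child.getTextContent().trim().isEmpty()) {
                node.removeChild(child);
            } else {
                stripWhitespace(child);
            }
        }
    }

    static boolean sameXml(String expected, String actual) throws Exception {
        return parse(expected).getDocumentElement().isEqualNode(parse(actual).getDocumentElement());
    }
}
```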

7. and 8. Cleanup

After execution, we can undeploy and remove the dynamically created iflow. We later optimized this step so that the prepare and teardown phases are only executed when the iflow under test has changed: we have around 160 iflows containing message mapping unit tests, and running this setup/teardown process every day doesn't make sense if the iflows under test don't change.
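That change detection can be sketched as a simple content fingerprint (a hypothetical illustration; our pipeline's actual mechanism may differ):

```java
import java.security.MessageDigest;

class ChangeDetector {

    // Fingerprint the iflow artifact content; redeploy only when it has
    // changed since the last build
    static String fingerprint(byte[] artifact) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(artifact);
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    static boolean needsRedeploy(byte[] artifact, String lastFingerprint) throws Exception {
        return !fingerprint(artifact).equals(lastFingerprint);
    }
}
```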

Example of a groovy test, testing message mappings

All the steps described above are part of the package pipeline that is executed automatically, so all this logic is hidden from the developer. From the developer's perspective, the only thing to know is that unit tests placed in the iflow resources get automatically executed on a daily basis. So for message mappings, for instance, the only thing the developer needs is a class such as the one below inside the iflow resources.

import FormatNumber;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

class Test001_SystemSubscribeMM_MessageMappingUnitTest implements Serializable {

	public String getBody() {
		def body = """<?xml version="1.0" encoding="utf-8"?>
						<Created>20211103 10:20</Created>"""
		return body
	}

	public Map getProperties() {
		def result = [:]
		return result
	}

	public Map getHeaders() {
		def headers = [SQL_Action_Command: "INSERT", SQL_Table: "FER_Table_Target"]
		return headers
	}

	public Map getExpectedHeaders() {
		def headers = [SQL_Action_Command: "INSERT", SQL_Table: "FER_Table_Target"]
		return headers
	}

	@Test
	public void testFormatNullableNumberNotNull() {
		def numberTestFormat = new FormatNumber()
		def result = numberTestFormat.FormatNullableNumber("4.321")
		assertEquals("Result in this case should not be empty since argument is a valid number", "4.321", result)
	}

	@Test
	public void testFormatNullableNumberNull() {
		def numberTestFormat = new FormatNumber()
		def result = numberTestFormat.FormatNullableNumber(null)
		assertEquals("Result in this case should be 0 since argument is null", "0.000", result)
	}

	public String getExpectedBody() {
		def expectedBody = """<?xml version="1.0" encoding="UTF-8"?>
<root><StatementName1><dbTableName action="INSERT"><table>FER_Table_Target</table><access><DUMMYCATEGORY>123</DUMMYCATEGORY><FAKE_CATEGORY>111</FAKE_CATEGORY><CUSTOMER>FakeCustomer</CUSTOMER><ENTERED_DATE>2021-11-03 00:00:00.000</ENTERED_DATE><QUANTITY>3.0</QUANTITY><CREATION_DATE>2021-11-03 10:20:00.000</CREATION_DATE><NOTES>FakeNote</NOTES><CREATION_SOURCE>FakeCreator</CREATION_SOURCE></access></dbTableName></StatementName1></root>"""
		return expectedBody
	}
}

  • getBody contains the body payload that will be injected into the dynamically created iflow
  • getProperties contains a map of key/value properties that might be expected by your message mapping and that will be provided when calling it
  • getHeaders contains a map of key/value headers that might be expected by your message mapping and that will be provided when calling it
  • getExpectedHeaders contains a map of expected headers that shall be present in the end. If this list is empty, no check on headers is applied
  • getExpectedBody contains the expected payload body resulting from calling the dynamic iflow
  • @Test methods are just there to illustrate that you can also add other regular unit tests as part of the same class if you want.


In this post, I've introduced the topic of automated testing, presenting some of the tools we've researched, as well as the solution we chose to perform automated unit testing via Jenkins.

I would also invite you to share feedback or thoughts in the comments section. I'm sure there are still improvements or new ideas that would benefit the whole community. You can always get more information about Cloud Integration on the topic page for the product.

Comments

      Daniel Graversen

      Hi Nuno

      Thanks for the mention. I had been fearing that you had found some good, easy-to-use solution that did not require a separate tool.

      At Figaf we support both running of the original iflow and then also using a mocked copy.

      A shorter way to create and run test cases can be seen in the video on creating test cases.

      I have created a post on performance testing Cloud Integration via JMeter, with some examples. But I guess the benefit is if you can also test backend performance in the process, because that is likely where your performance problems will be.

      Some are using Postman to create test cases, but that would in most cases only work for sync iflows, unless you build in a lot of logic.


      For running JUnit tests for Groovy scripts and debugging them, you can reuse your existing test cases, and they will be used when a JUnit test case runs; see the video.

      I guess you can also add the Simulation mode, but that is UI-only at the moment and will only allow you to test some parts of the flow.


      Daniel, Figaf

      Nuno Pereira (Blog Post Author)

      Hi Daniel,


      Thanks for answering, it would be great if the standard tool already supported this out of the box.

      Some questions:

      • Can you detail how the "mocked copy" process works? Do you need to manually copy and adjust the URLs of your iflow to match the ones in your cockpit? Or is this process done automatically, meaning that during the copy you register somewhere in your cockpit all the extracted endpoints to mock, where you can inject your mock data?
      • Do you also support creating test cases manually, or executing just part of your iflow? I'm asking because you may not be able to execute your iflow in trace mode with real test data, since it can damage your targets. I acknowledge that we're talking about DEV environments where this should not be the case, but it is still something to think about.
      • Really great video with JMeter. I was curious not only to see how CPI behaves in terms of memory usage, response times, etc., but also to see the side effects of running so many executions at the same time and the potential issues that could arise from it (like getting 503 errors, MPLs that do not complete and get stalled forever, and other abnormal, unexpected behaviors). I'm not saying that it will occur, just that there's potential for such issues. Have you seen any abnormal issues during your tests resulting from these massive parallel calls?
      • For editing and debugging our unit tests, we use Eclipse together with the dependencies (groovy engine and some CPI jars) so we can edit the code with content assist, but then we just copy the end result to the iflow once we're happy. Again, IMO, these are not optimal processes, and most of them sound like workarounds to get the experience we're used to having in other development environments such as Eclipse or VS Code (native Git integration, offline IDE editing, debugging, content assist, etc.)

      But thanks for sharing your thoughts with me and the community. I can see from your links that this is a well-known area that you have explored and that we can all learn from.



      Nuno Pereira

      Daniel Graversen


      Good questions.

      1) If you decide to mock a given test case, the Figaf Tool will create a copy of that iflow and change the URLs to point to Figaf. Once a message is run and it reaches the endpoint, Figaf will return the expected payload. If the iflow is updated, a new mock is automatically created.

      The response will be the one the original execution had after the request-reply step. So if you want to, you can change the payload here.

      2) We normally execute the full iflow in trace mode, because then we don't need to have persist steps where we can perform the testing. If you call "create employee", then you probably need to mock that iflow, but if it is "send to FTP" or "query OData", you don't need to mock such endpoints.

      3) No, we have not scaled up our JMeter testing. And of course, if you have built an iflow wrongly, you can have a bottleneck there where you run into memory errors. It is important to test.

      4) Agreed, the debug setup in Figaf does take a few steps to complete. But you will essentially have all your code in a Git repository, where you can use your favorite IDE. Each time you update the iflow, it will be updated in the code.

      We use our Git more as a distributed file system with versioning. We would not expect you to commit anything other than small changes into it. We do have the Gradle plugins that allow you to upload and download iflows directly to and from CPI, but we do have some limitations with the end user, so we recommend you upload the artifact in CPI.

      It is pretty easy to try out the Figaf tool, either running it locally or in SAP BTP; it takes 30 minutes to get started for free.


      Morten Wittrock
      Very interesting read, Nuno. I'm glad you focus on unit testing. Regression testing is useful, for sure, but regression testing can never confirm that your, say, mapping actually works in the first place; it can only tell you that the mapping works the same as it did prior to some change. But regression testing is obviously easier to implement and requires less work to be done by the developer.
      It's not great that you currently have to jump through some hoops to automate the unit testing of your iflows. There is a very nice solution to this problem on the SAP Cloud Integration roadmap (this one), but it's going to be a while before we can actually use it. But as I see it, native support for unit testing is the only solution that makes sense in the long term.
      Nuno Pereira (Blog Post Author)

      Hi Morten,

      Thank you for your comment. I fully agree that only native support for unit testing would be the definitive standard solution; I just wanted to emphasize that we wanted to make it as easy as possible for the developer. The developer can use simulation mode to get the necessary payloads and then create the groovy test class (the example I show above). All the other process steps described here do not need to be known by the developer, so although I agree there are a couple of steps performed automatically by Jenkins, from the developer's perspective I think it's quite easy to use, since the developer only needs to place the groovy test file inside the iflow resources.

      I'm not sure if this was clear to the readers, so I just edited that last section to mention that all of this is done automatically for all the packages.