Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!
valivanov

Introduction


In one of my previous articles, I talked about the option of using Postman to support testing SAP Cloud Integration scenarios by storing the test cases as Postman requests and using the embedded tools to run the tests manually or automatically. In this article, we will extend the scenario by also integrating the Newman CLI and Jenkins, which will help us include the required test steps in a CI/CD pipeline.


SAP DevOps and SAP BTP CI/CD topics have received quite a few updates recently. I strongly recommend checking the openSAP course, Efficient DevOps with SAP, to get the latest updates and the SAP roadmap in terms of SAP DevOps. More specifically, for CI/CD with Cloud Integration, follow this very good blog by Axel Albrecht.

It is not the purpose of this blog to explain again what DevOps is and what benefits a CI/CD pipeline can bring. At the same time, we can all agree that testing is a crucial step in any CI/CD pipeline, and that is what this article is all about: an approach to including testing in a Jenkins pipeline that automates the testing scenarios.

Whenever you are starting with SAP DevOps topics, there are three main SAP solutions: CI/CD Service, Project "Piper" and the CI/CD Best Practices Guide. For our scenario, we do not start from zero: Project "Piper" is our starting point. So, combine the SAP documentation with the details that Axel Albrecht presents in his blog to understand how the CI/CD solution can be a companion for SAP Cloud Integration Suite, and then explore the following solution approach for adding testing to your pipeline.


Solution overview



Fig.1 - General architecture


The diagram describes the general architecture of the solution: it starts from SAP Cloud Integration, where the flows are executed, continues in the microservice component, responsible for scenario definition and test case generation, and in the last stage uses Jenkins and Newman to automate the testing step as part of a CI/CD pipeline. Sequentially, the main message flows are:

  1. Inbound Message will be sent to SAP Cloud Integration

  2. Message transformation / mapping / orchestration logic will be triggered in SAP Cloud Integration

  3. Outbound Message is triggered from SAP Cloud Integration

  4. Call SAP Cloud Integration OData APIs to extract the input and output test data: body, headers and properties

  5. Call Postman API to create the test case as Postman request including request body, headers, pre-request script, testing script

  6. Call Jenkins API to create pipeline job and to build/run the job

  7. Newman CLI gets the Postman collection details and runs the requests. Results are stored as custom HTML reporter output
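The extraction in step 4 can be sketched as a call to the Cloud Integration OData API, which exposes message processing logs and their stored payloads. The host and message GUID below are hypothetical placeholders:

```shell
# Hypothetical tenant host and message GUID -- replace with your own values
CPI_HOST="https://my-tenant.it-cpi.example.com"
MSG_GUID="AGXbkEXAMPLE"

# OData URL for one message processing log entry (step 4); its Attachments
# navigation property exposes the payloads stored during the flow execution
LOG_URL="$CPI_HOST/api/v1/MessageProcessingLogs('$MSG_GUID')?\$format=json"
echo "$LOG_URL"

# Fetch it with basic auth (commented out -- needs real tenant credentials):
# curl -s -u "$CPI_USER:$CPI_PASS" "$LOG_URL"
# curl -s -u "$CPI_USER:$CPI_PASS" \
#   "$CPI_HOST/api/v1/MessageProcessingLogs('$MSG_GUID')/Attachments?\$format=json"
```

The JSON responses carry the input/output bodies, headers and properties that the microservice turns into Postman test cases in step 5.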



Solution details


Create test cases as Postman requests


Use a specific microservice/app to call the SAP Cloud Integration OData APIs and build the Postman requests. Check my previous article on how you can do this. Also create/synchronize a Postman environment for each Cloud Integration tenant. You can organize your test cases in different collections or folders based on your requirements.
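As a minimal sketch of what the microservice generates, here is one such test case in Postman's collection format (all names, URLs and payloads below are hypothetical). The object would be appended to the collection's "item" array and the collection uploaded back via a PUT to the Postman API:

```shell
# Write a sample generated Postman test request to a file; the "event" entry
# holds the test script that asserts on the response
cat > test_request.json <<'EOF'
{
  "name": "IFlow_OrderCreate - happy path",
  "request": {
    "method": "POST",
    "header": [ { "key": "Content-Type", "value": "application/xml" } ],
    "url": "{{cpi_host}}/http/order/create",
    "body": { "mode": "raw", "raw": "<Order><Id>1</Id></Order>" }
  },
  "event": [ {
    "listen": "test",
    "script": { "exec": [
      "pm.test('status is 200', function () {",
      "  pm.response.to.have.status(200);",
      "});"
    ] }
  } ]
}
EOF

# Upload by merging into the full collection JSON and writing it back
# (commented out -- needs a real API key and collection uid):
# curl -s -X PUT -H "X-Api-Key: $API_KEY" -H "Content-Type: application/json" \
#   --data @collection.json "https://api.getpostman.com/collections/$COLLECTION_UID"
```

Keeping tenant details such as `{{cpi_host}}` as environment variables is what makes the same test case reusable across Cloud Integration tenants.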

Depending on how the test scenario is defined and how the result is evaluated, one Postman request could look like this:


Fig.2 - Sample Postman request


 

Prepare Jenkins instance


Download a Jenkins docker image, either the official image or the Piper image. Then:

Install Node.js, Newman and the HTML Extra Reporter on the Jenkins docker image. The last two are npm modules, so the entire installation process could look like this (depending on your docker image):
# start docker
docker run -u 0 -it ppiper/jenkins-master /bin/sh

 
# update 
apt-get update
# install curl
apt-get install curl
# get install script and pass it to execute:
curl -sL https://deb.nodesource.com/setup_15.x | bash
# and install node
apt-get install nodejs


# confirm that it was successful
node -v
# npm installs automatically
npm -v

# install newman and html extra reporter
npm install -g newman
npm install -g newman-reporter-htmlextra

 

Automate the Jenkins pipeline job: create and build


In order to automate the Jenkins pipeline and to use the pipeline-as-code approach, we can use the Jenkins Remote API. You can do things like:


  1. retrieve information from Jenkins for programmatic consumption.

  2. trigger a new build

  3. create/copy jobs



a. Create Jenkins job

Just start from your own pipeline logic and add the testing step on top of it. For simplicity, in our case the pipeline will contain just the testing step.

In the build area of the job include this script to execute the Newman command:
newman run "https://api.getpostman.com/collections/${collectionUid}?apikey=${apiKey}" -e "https://api.getpostman.com/environments/${envId}?apikey=${apiKey}" --folder "${folder}" --iteration-count "${iterationCount}" --reporters "cli,htmlextra" --reporter-htmlextra-export "/var/jenkins_home/workspace/${JOB_NAME}/newman/newman_result.html"


Fig.3 - Jenkins job: Build section with Newman command


As you can see, the Newman command includes this set of parameters:



    • collectionUid - the Uid of the Postman collection where the test requests are saved. The collection Uid can be retrieved from Postman using a GET HTTP request to https://api.getpostman.com/collections/

    • apiKey - the API key used to call the Postman API; it has to be created in advance in the Postman account

    • envId - the Postman environment id, where all the required tenant details are stored; you can use this API to get it: https://api.getpostman.com/environments

    • folder - in case only a subset of the entire collection has to be tested, a folder name can be sent as a parameter as well

    • iterationCount - to control the number of execution runs
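Retrieving the collectionUid can be sketched as below; the API response shown is an abridged, hypothetical example of what GET https://api.getpostman.com/collections returns:

```shell
# Sample (abridged, hypothetical) response body from GET /collections
RESPONSE='{"collections":[{"id":"abcd","uid":"1234567-abcd","name":"CPI regression tests"}]}'

# The real call (commented out -- needs a valid Postman API key):
# RESPONSE=$(curl -s -H "X-Api-Key: $API_KEY" https://api.getpostman.com/collections)

# Pull the "uid" field out of the JSON (a jq-free sketch using sed)
COLLECTION_UID=$(printf '%s' "$RESPONSE" | sed -n 's/.*"uid":"\([^"]*\)".*/\1/p')
echo "$COLLECTION_UID"
```

The envId can be extracted the same way from the https://api.getpostman.com/environments response.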




Also, add the required configuration in the Post-build Actions area to generate the HTML report at the end of the test run (we used the HTML Extra custom reporter, but there are multiple options, and a new reporter can also be developed):


Fig.4 - Jenkins job: Post-build Actions with HTML report output configuration


Tip: Check the Jenkins API documentation on how to automatically create a Jenkins job. Use a sample job created in advance in Jenkins as a template. With our automated tool and the pipeline-as-code approach, we can create the job with just a click.

The create-job functionality in our scenario will:

  • get the template job details by calling the Jenkins API:
    ${Jenkins.host}/job/${templateJob}/config.xml​


  • change the config xml based on the specific requirements

  • create a new job using this API and posting the xml config file:
    ${Jenkins.host}/createItem?name=${newJobName}​
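The two API calls above can be sketched with curl as follows. Host, credentials and job names are hypothetical; note that depending on your Jenkins security settings you may also need to pass a CSRF crumb with the POST:

```shell
# Hypothetical Jenkins host and job names -- replace with your own
JENKINS_HOST="http://jenkins.example.com:8080"
TEMPLATE_JOB="cpi-test-template"
NEW_JOB="cpi-test-orders"

# 1. URL to read the template job's configuration
CONFIG_URL="$JENKINS_HOST/job/$TEMPLATE_JOB/config.xml"
echo "$CONFIG_URL"

# 2. URL to create the new job by POSTing the adjusted XML
CREATE_URL="$JENKINS_HOST/createItem?name=$NEW_JOB"
echo "$CREATE_URL"

# The calls themselves (commented out -- need real credentials):
# curl -s -u "$JENKINS_USER:$JENKINS_TOKEN" "$CONFIG_URL" > config.xml
# ...adjust config.xml for the new scenario...
# curl -s -X POST -u "$JENKINS_USER:$JENKINS_TOKEN" \
#   -H "Content-Type: application/xml" --data-binary @config.xml "$CREATE_URL"
```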



(Consider this user interface just as an example).


Fig.5 - Microservice UI: create job section


b. Trigger Build Jenkins job

As the Jenkins job is built with parameters, the Postman collection details will be sent as parameters to the API.


Fig.6 - Microservice UI: build job section


Tip: The same Jenkins API can be used in the build phase to completely automate the process:
${Jenkins.host}/job/${jobName}/buildWithParameters?collectionUid=${collectionUid}&apiKey=${apiKey}&envId=${envID}&folder=${folder}&iterationCount=${iteration}
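A minimal sketch of this trigger call, with hypothetical values substituted for the placeholders:

```shell
# Hypothetical values -- substitute your own
JENKINS_HOST="http://jenkins.example.com:8080"
JOB_NAME="cpi-test-orders"
COLLECTION_UID="1234567-abcd"
ENV_ID="7654321-efgh"
FOLDER="Orders"

# Build the parameterized trigger URL
BUILD_URL="$JENKINS_HOST/job/$JOB_NAME/buildWithParameters?collectionUid=$COLLECTION_UID&envId=$ENV_ID&folder=$FOLDER&iterationCount=1"
echo "$BUILD_URL"

# Trigger the build (commented out -- needs real credentials; the apiKey
# parameter is appended separately here so it does not sit in the script):
# curl -s -X POST -u "$JENKINS_USER:$JENKINS_TOKEN" "$BUILD_URL&apiKey=$API_KEY"
```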

 

Jenkins output with custom Newman reporters


a. Check that the build run started in the Build History section, then check the HTML Report in the left menu:


Fig.7 - Jenkins job: Build run executed and HTML Report generated


b. Open the reporter output, in this case the HTML report:


Fig.8 - Jenkins job: Newman Run Dashboard


The custom HTML Extra reporter includes different views: a summary with all the requests and iterations, and a view at request level where you can see the request body and headers, the response body and headers, and the test results. Check this video to learn more about the reporter's look and feel.


Summary


Even though many extensions are possible, the solution presented in this article is meant to lay the foundation for a complete cloud integration pipeline, where testing is recognized as a mandatory step. Everyone has a clear view of the "WHY"; in terms of "HOW", out of the multitude of possible solutions, this one has the advantage of reusing open-source components and being built on top of existing SAP investments and developments.

 
