ASUTOSH MAHARANA

OSGI Events in SAP Integration Suite

Disclaimer

The content provided in this blog post is intended solely for informational and technical demonstration purposes. Any use of the described techniques is at your own discretion and risk, as they rely on functionality that is not officially documented by SAP.

Intro

Approximately four years ago, Raffael Herrmann shared a blog post detailing how to listen to internal SAP Cloud Integration events. While developing a CI/CD approach for the SAP Integration Suite, the idea emerged to establish a more efficient method than periodically polling the Integration Suite API. This led to exploring how events from SAP Cloud Integration could be leveraged to seamlessly trigger Jenkins pipelines. The following outlines an approach to subscribing to OSGi bundle framework events and integrating them into Jenkins pipelines in order to enhance the CI/CD capabilities within the SAP Integration Suite.

Acknowledgement

Insights from Vadim Klimov’s blog on OSGi Shell Commands also significantly contributed to understanding and implementing the OSGi functionality used in this context.

Table of Contents

  1. Overview
  2. Understanding OSGi Events and Event Handling
  3. Integration Flow Logic
  4. Setting Up Jenkins for OSGi Event Triggers
  5. Test Run

Overview

This blog demonstrates how deploying an iFlow in SAP Cloud Integration can trigger a Jenkins pipeline by leveraging OSGi bundle framework events. The pipeline performs two key actions: backing up the iFlow binaries to GitHub and executing essential prerequisite checks with Morten Wittrock’s cpilint program. All the code is available in my GitHub repo.

System Architecture Diagram

Understanding OSGi Events and Event Handling

OSGi, short for Open Service Gateway Initiative, serves as a dynamic module system for Java, enabling the creation of modular applications through the concept of bundles. In the realm of SAP Cloud Integration, OSGi plays a pivotal role in the modularization of artifacts into bundle JARs. These bundles encapsulate discrete units of functionality, allowing for efficient management and deployment of components within the landscape.

When an iFlow is deployed within the SAP Cloud Integration environment, OSGi’s lifecycle mechanisms are invoked. This triggers a series of lifecycle events, commencing with bundle activation, where the iFlow’s associated bundle JARs are initialized and made operational.

Throughout this lifecycle, various OSGi framework events are generated, including bundle-specific events. Among these, bundle framework events are significant, as they capture critical occurrences such as bundle installation, starting, stopping, and uninstallation, providing insights into the state changes and interactions within the OSGi environment. These events, stemming from the OSGi lifecycle, serve as essential triggers that can be harnessed to drive subsequent actions or workflows within the SAP Cloud Integration.
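
For reference, the OSGi Event Admin service republishes these bundle framework events on event topics of the form org/osgi/framework/BundleEvent/<STATE>. This is standard OSGi Event Admin behaviour rather than anything SAP-specific, so treat the list below as a reference sketch. The iFlow built later in this blog receives the topic it should listen on via the eventListenType property.

// Topics under which the OSGi Event Admin service republishes bundle framework events
// (topic names taken from the OSGi Event Admin specification):
def bundleEventTopics = [
        "org/osgi/framework/BundleEvent/INSTALLED",
        "org/osgi/framework/BundleEvent/STARTED",
        "org/osgi/framework/BundleEvent/STOPPED",
        "org/osgi/framework/BundleEvent/UPDATED",
        "org/osgi/framework/BundleEvent/UNINSTALLED"
]
// A wildcard subscribes to all bundle events at once:
def allBundleEvents = "org/osgi/framework/BundleEvent/*"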

OSGI Bundle Lifecycle and Bundle Framework Event

Event handlers in OSGi intercept and respond to specific events within the framework. The Event Admin service manages event communication. To register custom event handlers, we implement handlers, define event types of interest, and specify corresponding actions. By attaching these handlers to relevant events, functionalities can be triggered based on OSGi events within SAP Cloud Integration.
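
Stripped of the iFlow specifics, the registration pattern boils down to the following minimal sketch using plain OSGi APIs (the class and topic names here are illustrative; the full Groovy script used in the iFlow follows in the next section):

import org.osgi.framework.BundleContext
import org.osgi.framework.FrameworkUtil
import org.osgi.service.event.Event
import org.osgi.service.event.EventConstants
import org.osgi.service.event.EventHandler

//1. Implement EventHandler: handleEvent() is invoked for every event whose topic matches
class LoggingHandler implements EventHandler {
    void handleEvent(Event event) {
        println "Received event on topic ${event.topic}"
    }
}

//2. Register the handler as an OSGi service and declare the topics of interest as a service property.
//   In CPI the bundle context is obtained via a class that is known to live in an OSGi bundle
//   (the script below uses the Message class for that purpose).
BundleContext ctx = FrameworkUtil.getBundle(LoggingHandler).getBundleContext()
def props = new Hashtable()
props.put(EventConstants.EVENT_TOPIC, "org/osgi/framework/BundleEvent/*")
ctx.registerService(EventHandler.class.getName(), new LoggingHandler(), props)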

Integration Flow Logic

We created an iFlow to manage event handler registration and deregistration, following the steps outlined in Raffael’s blog while adjusting some of the logic. Our main modifications centered on the event handler logic: we integrated a custom implementation that stores events using the data store APIs (drawing inspiration from an undocumented approach found in another blog), added exception handling within try-catch blocks, and removed the code that triggered REST endpoints.

Integration Flow for Event Handler

You can download the zip file from the link.

import com.sap.it.api.asdk.datastore.*
import com.sap.it.api.asdk.runtime.*
import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonOutput
import org.osgi.framework.*
import org.osgi.service.event.*

def Message processData(Message message) {
    //Get message properties, which were set in ContentModifier (1)
    def eventHandlerId = message.getProperty("eventHandlerId")
    def eventListenType = message.getProperty("eventListenType")
    //Register our custom IFlow event
    def res = registerOSGiEvent(eventHandlerId, eventListenType)
    return message
}
/***********************************************
 * This function helps register OSGi events    *
 ***********************************************/
private registerOSGiEvent(def eventHandlerId, def eventListenType) {
    //Get general bundle context
    def bundleCtx = FrameworkUtil.getBundle(Class.forName("com.sap.gateway.ip.core.customdev.util.Message")).getBundleContext()

    //Define the topics we like to listen to
    def topics = eventListenType
    //Configure our event listener
    def props = new Hashtable();
    props.put(EventConstants.EVENT_TOPIC, topics)
    props.put("custom.id", eventHandlerId)

    //Register custom EventHandler as a service and pass the properties
    bundleCtx.registerService(EventHandler.class.getName(), new DeployEventHandler(), props)
    return [successful: true]
}
/***************************************************************
 * This is the custom event handler class we want to register  *
 ***************************************************************/
public class DeployEventHandler implements EventHandler {
    //This function will be called every time an event with a
    //matching topic passes by. Everything which should happen
    //on an event must be implemented here.
    public void handleEvent(Event event) {
        //The complete code is called as "async" Runnable in a different thread,
        //because OSGi has a default timeout of 5000ms for events. If an event-
        //handler takes more time, it will be blacklisted. By use of Runnable,
        //we can bypass this limit.
        Runnable runnable = {
            //Build event information
            def evntMsg = [:]
            try {
                evntMsg = [topic: event.getTopic(), bundleName: event.getProperty("bundle").getSymbolicName(), TimeStamp: event.getProperty("timestamp")]
                def bundle = event.getProperty("bundle").getSymbolicName();
                def pattern = ~/^Test_[a-fA-F0-9]{32}$/
                if (!(bundle ==~ pattern)) {
                    def service = new Factory(DataStoreService.class).getService()
                    //Check if valid service instance was retrieved
                    if (service != null) {
                        def dBean = new DataBean()
                        dBean.setDataAsArray(JsonOutput.toJson(evntMsg).getBytes("UTF-8"))
                        //Define data store name and entry id
                        def dConfig = new DataConfig()
                        dConfig.setStoreName("osgiEvents")
                        dConfig.setId(bundle)
                        dConfig.doOverwrite()
                        //Write to data store
                        def result = service.put(dBean, dConfig)
                    }
                }
            } catch (Exception e) {
                def pattern = ~/An entry with id \w+ does already exist in data store \w+/
                if (!(e instanceof com.sap.it.api.asdk.exception.DataStoreException &&
                        e.message ==~ pattern)) {
                    evntMsg = e.toString();
                    def service = new Factory(DataStoreService.class).getService()
                    //Check if valid service instance was retrieved
                    if (service != null) {
                        def dBean = new DataBean()
                        dBean.setDataAsArray(evntMsg.getBytes("UTF-8"))
                        //Define data store name and entry id
                        def dConfig = new DataConfig()
                        dConfig.setStoreName("osgiEventsError")
                        dConfig.setId("error")
                        dConfig.doOverwrite()
                        //Write to data store
                        def result = service.put(dBean, dConfig)
                    }
                }
            }
        }
        //Call the Runnable
        def thread = new Thread(runnable);
        thread.start();
    }
}
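
To make the downstream processing easier to follow: for a deployed iFlow, the handler above writes an entry to the osgiEvents data store, keyed by the bundle’s symbolic name. The field names below are taken from the code above, while the values are purely illustrative:

// Illustrative content of an "osgiEvents" data store entry written by DeployEventHandler
// (field names from the handler code above; the values are made up):
def exampleEntry = [
        topic     : "org/osgi/framework/BundleEvent/STARTED", // which lifecycle event fired
        bundleName: "explorer",                                // symbolic name of the iFlow bundle, also used as the entry id
        TimeStamp : 1706000000000L                             // "timestamp" property of the framework event
]
// Serialized with JsonOutput.toJson(exampleEntry), i.e.:
// {"topic":"org/osgi/framework/BundleEvent/STARTED","bundleName":"explorer","TimeStamp":1706000000000}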

Setting Up Jenkins for OSGi Event Triggers

I’ve created a Dockerfile based on the Jenkins Alpine image, preconfigured with the required plugins. You can either build and push this image to Docker Hub yourself or directly use my Docker Hub image to deploy the container into an SAP Cloud Foundry space using the command below.

cf push <NameForYourContainer> --docker-image asutoshmaharana23/jenkins-light-plugin:latest --docker-username <YourDockerHubUser>

You can review the Dockerfile provided below to create your own image via docker build and subsequently push it to Docker Hub. Afterwards, use cf push to deploy it to SAP Cloud Foundry, or run it locally with docker run. This setup establishes a Jenkins server with preconfigured plugins and an admin user (password set to “admin”, which you can reset as needed).

FROM jenkins/jenkins:alpine

ENV JENKINS_USER admin
ENV JENKINS_PASS admin

# Skip initial setup
ENV JAVA_OPTS -Djenkins.install.runSetupWizard=false

USER root
RUN apk add docker
RUN apk add py-pip

RUN jenkins-plugin-cli \
    --plugins \
    bouncycastle-api \
    instance-identity \
    javax-activation-api \
    javax-mail-api \
    structs \
    workflow-step-api \
    scm-api \
    workflow-api \
    pipeline-milestone-step \
    caffeine-api \
    script-security \
    workflow-support \
    pipeline-build-step \
    workflow-scm-step \
    ionicons-api \
    cloudbees-folder \
    variant \
    workflow-cps \
    pipeline-groovy-lib \
    credentials \
    plain-credentials \
    trilead-api \
    ssh-credentials \
    credentials-binding \
    pipeline-stage-step \
    jaxb \
    snakeyaml-api \
    jackson2-api \
    pipeline-model-api \
    workflow-job \
    pipeline-model-extensions \
    jakarta-activation-api \
    jakarta-mail-api \
    display-url-api \
    mailer \
    branch-api \
    workflow-multibranch \
    durable-task \
    workflow-durable-task-step \
    pipeline-stage-tags-metadata \
    mina-sshd-api-common \
    mina-sshd-api-core \
    apache-httpcomponents-client-4-api \
    git-client \
    pipeline-input-step \
    workflow-basic-steps \
    pipeline-model-definition \
    workflow-aggregator \
    generic-webhook-trigger \
    git \
    okhttp-api \
    commons-lang3-api \
    github-api \
    token-macro \
    github \
    jjwt-api \
    github-branch-source \
    http_request \
    commons-text-api \
    pipeline-utility-steps \
    file-operations \
    pipeline-graph-analysis \
    pipeline-rest-api \
    pipeline-stage-view 

RUN echo $'import jenkins.model.Jenkins \n\
    import hudson.security.* \n\
    import jenkins.security.s2m.AdminWhitelistRule \n\
    def jenkinsUser = System.getenv("JENKINS_USER") ?: "admin" \n\
    def jenkinsPass = System.getenv("JENKINS_PASS") ?: "admin" \n\
    def instance = Jenkins.getInstance() \n\
    def hudsonRealm = new HudsonPrivateSecurityRealm(false) \n\
    hudsonRealm.createAccount(jenkinsUser, jenkinsPass) \n\
    instance.setSecurityRealm(hudsonRealm) \n\
    def strategy = new FullControlOnceLoggedInAuthorizationStrategy() \n\
    strategy.setAllowAnonymousRead(false) \n\
    instance.setAuthorizationStrategy(strategy) \n\
    instance.save() \n\
    ' > /usr/share/jenkins/ref/init.groovy.d/init-security.groovy

USER jenkins

Jenkins Basic Configuration

First, create credentials for CPI and GitHub on the Manage Jenkins -> Credentials page.

  • Use the CPI service key’s clientid and clientsecret as username and password, and set the ID to “CPIOAuthCredentials” as shown below.

SAP Cloud Integration Credentials

  • Create GitHub credentials the same way: use your GitHub user and a personal access token (PAT) as username and password, and set the ID to “GIT_Credentials”.

GitHub Credentials

  • Then go to Manage Jenkins -> Configure System and configure the environment variables below.

    Name                 Value
    CPI_HOST             {{url value from the service key, without https://}}, e.g. xxxxxxxxxxx.it-cpi002.cfapps.ap10.hana.ondemand.com
    CPI_OAUTH_CRED       CPIOAuthCredentials
    CPI_OAUTH_HOST       {{tokenurl value from the service key, without https://}}, e.g. xxxxxxxxxxx.authentication.ap10.hana.ondemand.com
    GIT_BRANCH_NAME      master
    GIT_CRED             GIT_Credentials
    GIT_REPOSITORY_URL   github.com/Asutosh-Integration/Jenkins.git (use your own forked repo)

Then create a pipeline with the configuration below.
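
The actual pipeline script is available in the GitHub repo linked above. As a rough sketch (not the repo’s exact pipeline), a declarative pipeline using the generic-webhook-trigger plugin installed via the Dockerfile could declare its trigger as follows; the variable names and token value are illustrative, the name header is set by the webhook iFlow shown below, and the token must match the Bearer token that iFlow sends:

// Sketch of a declarative pipeline triggered by the Generic Webhook Trigger plugin
pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericHeaderVariables: [
                [key: 'name', regexpFilter: '']   // exposes the 'name' HTTP header as an environment variable
            ],
            token: 'my-secret-token',             // must match the Bearer token sent by the webhook iFlow
            causeString: 'Triggered by SAP Cloud Integration OSGi event',
            printContributedVariables: true
        )
    }
    stages {
        stage('Backup and lint') {
            steps {
                echo "Deployed artifact: ${env.name}"
                //Here the real pipeline downloads the artifact from CPI_HOST using the CPIOAuthCredentials,
                //pushes it to GIT_REPOSITORY_URL and runs cpilint against it (see the GitHub repo).
            }
        }
    }
}

The webhook iFlow then calls the plugin’s endpoint, typically https://<jenkins-host>/generic-webhook-trigger/invoke, which is the address the HTTP receiver channel shown further below would point at.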

You will then be able to trigger the Jenkins pipeline via the webhook trigger. Next, we will configure another iFlow that makes the webhook call based on the data store entries.

Integration flow for webhook trigger

Data Store Sender Channel Configuration

Add the Groovy script below.

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.*

def Message processData(Message message) {
    //The body is the data store entry written by the event handler iFlow,
    //e.g. {"topic":"...","bundleName":"...","TimeStamp":...}
    def json = new JsonSlurper().parseText(message.getBody(String))
    //The bundle name is passed as an HTTP header so the Jenkins webhook trigger can pick it up
    message.setHeader('name', json.bundleName)
    message.setHeader('Authorization', 'Bearer <Token Mentioned in Jenkins Webhook Trigger>')
    return message
}

HTTP Receiver Channel Configuration

Test Run

We will deploy any iFlow in our tenant.

explorer iFlow Deployed

We can then see a corresponding entry in the data store.

Data Store Entry for the event

This entry is picked up by our webhook iFlow, and the Jenkins pipeline is triggered.

Messages got processed

Jenkins Pipeline got completed

The binary files are archived in GitHub.

Binary files got archived in GitHub

Code analysis is performed by cpilint.

Code Analysis was done by cpilint

Conclusion

This approach captures internal SAP Cloud Integration events and is potentially applicable across various use cases. However, the undocumented nature of these methods may present implementation challenges. Special thanks to Raffael Herrmann, Vadim Klimov and Morten Wittrock for the invaluable insights shared in their blogs.
