
Disclaimer


The content provided in this blog post is intended solely for informational and technical demonstration purposes. Any use of the described techniques is at your own discretion and risk, as they rely on functionality that is not officially documented by SAP.

Intro


Approximately four years ago, r_herrmann shared a blog post detailing how to listen to internal SAP Cloud Integration events. While developing a CI/CD approach for the SAP Integration Suite, the idea emerged to find a more efficient method than periodically querying the Integration Suite API. This led to exploring how events from SAP Cloud Integration could be leveraged to seamlessly trigger Jenkins pipelines. The following outlines an approach to subscribing to OSGi bundle framework events and integrating them into Jenkins pipelines in order to enhance the CI/CD capabilities within the SAP Integration Suite.

Acknowledgement


Insights from vadim.klimov's blog on OSGi Shell Commands also contributed significantly to understanding and implementing the OSGi functionality used in this context.

Table of Contents



  1. Overview

  2. Understanding OSGi Events and Event Handling

  3. Integration Flow Logic

  4. Setting Up Jenkins for OSGi Event Triggers

  5. Test Run


Overview


This blog demonstrates how deploying an iFlow in SAP Cloud Integration can trigger a Jenkins pipeline by leveraging OSGi bundle framework events. The pipeline performs two key actions: backing up the iFlow binaries to GitHub and running essential prerequisite checks with 7a519509aed84a2c9e6f627841825b5a's cpilint program. All the code is available in my GitHub repo.


System Architecture Diagram



Understanding OSGi Events and Event Handling


OSGi, short for Open Service Gateway Initiative, serves as a dynamic module system for Java, enabling the creation of modular applications through the concept of bundles. In the realm of SAP Cloud Integration, OSGi plays a pivotal role in the modularization of artifacts into bundle JARs. These bundles encapsulate discrete units of functionality, allowing for efficient management and deployment of components within the landscape.

When an iFlow is deployed within the SAP Cloud Integration environment, OSGi's lifecycle mechanisms are invoked. This initiation triggers a series of lifecycle events, commencing with bundle activation, where the iFlow's associated bundle JARs are initialized and made operational.

Throughout this lifecycle, various OSGi framework events are generated, including bundle-specific events. Among these, bundle framework events are significant, as they capture critical occurrences such as bundle installation, starting, stopping, and uninstallation, providing insights into the state changes and interactions within the OSGi environment. These events, stemming from the OSGi lifecycle, serve as essential triggers that can be harnessed to drive subsequent actions or workflows within the SAP Cloud Integration.


OSGi Bundle Lifecycle and Bundle Framework Event


Event handlers in OSGi intercept and respond to specific events within the framework. The Event Admin service manages event communication. To register custom event handlers, we implement handlers, define event types of interest, and specify corresponding actions. By attaching these handlers to relevant events, functionalities can be triggered based on OSGi events within SAP Cloud Integration.
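
The "event types of interest" are expressed as Event Admin topics. For bundle framework events, the topics under which the Event Admin service republishes them follow the OSGi Event Admin specification. The following minimal Groovy sketch shows typical values that could be passed as the eventListenType property used in the script below; the variable names are only illustrative:

// Typical Event Admin topics for OSGi bundle framework events
// (topic names as defined by the OSGi Event Admin specification).
def allBundleEvents = "org/osgi/framework/BundleEvent/*"           // wildcard: every lifecycle phase
def installedEvents = "org/osgi/framework/BundleEvent/INSTALLED"   // bundle (iFlow) installed
def startedEvents   = "org/osgi/framework/BundleEvent/STARTED"     // bundle started, i.e. iFlow deployed and running
def uninstalledEvts = "org/osgi/framework/BundleEvent/UNINSTALLED" // bundle removed, i.e. iFlow undeployed

// One of these values (or an array of them) is what the iFlow passes as the
// "eventListenType" property when registering the custom event handler.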

Integration Flow Logic


We created an iFlow to manage event handler registration and deregistration, following the steps outlined in r_herrmann's blog while adjusting some of the logic. Our primary modifications centered on enhancing the event handler logic: we integrated a custom implementation that stores events using the data store APIs, drawing on an undocumented approach found in another blog. We also added exception handling within try-catch blocks and removed the code related to triggering REST endpoints.


Integration Flow for Event Handler


You can download the zip file from the link.
import com.sap.it.api.asdk.datastore.*
import com.sap.it.api.asdk.runtime.*
import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonOutput
import org.osgi.framework.*
import org.osgi.service.event.*

def Message processData(Message message) {
    // Get message properties, which were set in Content Modifier (1)
    def eventHandlerId = message.getProperty("eventHandlerId")
    def eventListenType = message.getProperty("eventListenType")
    // Register our custom iFlow event handler
    def res = registerOSGiEvent(eventHandlerId, eventListenType)
    return message
}

/***********************************************
 * This function helps registering OSGi events *
 ***********************************************/
private registerOSGiEvent(def eventHandlerId, def eventListenType) {
    // Get the general bundle context
    def bundleCtx = FrameworkUtil.getBundle(Class.forName("com.sap.gateway.ip.core.customdev.util.Message")).getBundleContext()

    // Define the topics we would like to listen to
    def topics = eventListenType
    // Configure our event listener
    def props = new Hashtable()
    props.put(EventConstants.EVENT_TOPIC, topics)
    props.put("custom.id", eventHandlerId)

    // Register the custom EventHandler as a service and pass the properties
    bundleCtx.registerService(EventHandler.class.getName(), new DeployEventHandler(), props)
    return [successful: true]
}

/**************************************************************
 * This is the custom event handler class we want to register *
 **************************************************************/
public class DeployEventHandler implements EventHandler {
    // This function is called every time an event with a matching topic
    // passes by. Everything that should happen on an event must be
    // implemented here.
    public void handleEvent(Event event) {
        // The complete code is executed as an "async" Runnable in a separate thread,
        // because OSGi has a default timeout of 5000 ms for events. If an event
        // handler takes more time, it gets blacklisted. By using a Runnable,
        // we can bypass this limit.
        Runnable runnable = {
            // Build event information
            def evntMsg = []
            try {
                evntMsg = [topic     : event.getTopic(),
                           bundleName: event.getProperty("bundle").getSymbolicName(),
                           TimeStamp : event.getProperty("timestamp")]
                def bundle = event.getProperty("bundle").getSymbolicName()
                // Skip temporary bundles named Test_<32 hex characters>
                def pattern = ~/^Test_[a-fA-F0-9]{32}$/
                if (!(bundle ==~ pattern)) {
                    def service = new Factory(DataStoreService.class).getService()
                    // Check if a valid service instance was retrieved
                    if (service != null) {
                        def dBean = new DataBean()
                        dBean.setDataAsArray(JsonOutput.toJson(evntMsg).getBytes("UTF-8"))
                        // Define data store name and entry id
                        def dConfig = new DataConfig()
                        dConfig.setStoreName("osgiEvents")
                        dConfig.setId(bundle)
                        dConfig.doOverwrite()
                        // Write to the data store
                        def result = service.put(dBean, dConfig)
                    }
                }
            } catch (Exception e) {
                // Ignore "entry already exists" data store errors; persist everything else
                def pattern = ~/An entry with id \w+ does already exist in data store \w+/
                if (!(e instanceof com.sap.it.api.asdk.exception.DataStoreException &&
                      e.message ==~ pattern)) {
                    evntMsg = e.toString()
                    def service = new Factory(DataStoreService.class).getService()
                    // Check if a valid service instance was retrieved
                    if (service != null) {
                        def dBean = new DataBean()
                        dBean.setDataAsArray(evntMsg.getBytes("UTF-8"))
                        // Define data store name and entry id
                        def dConfig = new DataConfig()
                        dConfig.setStoreName("osgiEventsError")
                        dConfig.setId("error")
                        dConfig.doOverwrite()
                        // Write to the data store
                        def result = service.put(dBean, dConfig)
                    }
                }
            }
        }
        // Run the Runnable in its own thread
        def thread = new Thread(runnable)
        thread.start()
    }
}

Setting Up Jenkins for OSGi Event Triggers


I've created a Dockerfile based on the Jenkins Alpine image with the required plugins preconfigured. You can either push this image to Docker Hub yourself or directly use my Docker Hub image to deploy the container into an SAP Cloud Foundry space using the command below.
cf push <NameForYourContainer> --docker-image asutoshmaharana23/jenkins-light-plugin:latest --docker-username <YourDockerHubUser>

You can review the Dockerfile provided below to create your own image via the docker build command and subsequently push it to Docker Hub. Afterwards, use cf push to deploy it to SAP Cloud Foundry, or run it locally with docker run. This setup establishes a Jenkins server with preconfigured plugins and an admin user (password set to "admin", which you can reset as needed).
FROM jenkins/jenkins:alpine

ENV JENKINS_USER admin
ENV JENKINS_PASS admin

# Skip initial setup
ENV JAVA_OPTS -Djenkins.install.runSetupWizard=false

USER root
RUN apk add docker
RUN apk add py-pip

RUN jenkins-plugin-cli \
--plugins \
bouncycastle-api \
instance-identity \
javax-activation-api \
javax-mail-api \
structs \
workflow-step-api \
scm-api \
workflow-api \
pipeline-milestone-step \
caffeine-api \
script-security \
workflow-support \
pipeline-build-step \
workflow-scm-step \
ionicons-api \
cloudbees-folder \
variant \
workflow-cps \
pipeline-groovy-lib \
credentials \
plain-credentials \
trilead-api \
ssh-credentials \
credentials-binding \
pipeline-stage-step \
jaxb \
snakeyaml-api \
jackson2-api \
pipeline-model-api \
workflow-job \
pipeline-model-extensions \
jakarta-activation-api \
jakarta-mail-api \
display-url-api \
mailer \
branch-api \
workflow-multibranch \
durable-task \
workflow-durable-task-step \
pipeline-stage-tags-metadata \
mina-sshd-api-common \
mina-sshd-api-core \
apache-httpcomponents-client-4-api \
git-client \
pipeline-input-step \
workflow-basic-steps \
pipeline-model-definition \
workflow-aggregator \
generic-webhook-trigger \
git \
okhttp-api \
commons-lang3-api \
github-api \
token-macro \
github \
jjwt-api \
github-branch-source \
http_request \
commons-text-api \
pipeline-utility-steps \
file-operations \
pipeline-graph-analysis \
pipeline-rest-api \
pipeline-stage-view

RUN echo $'import jenkins.model.Jenkins \n\
import hudson.security.* \n\
import jenkins.security.s2m.AdminWhitelistRule \n\
def jenkinsUser = System.getenv("JENKINS_USER") ?: "admin" \n\
def jenkinsPass = System.getenv("JENKINS_PASS") ?: "admin" \n\
def instance = Jenkins.getInstance() \n\
def hudsonRealm = new HudsonPrivateSecurityRealm(false) \n\
hudsonRealm.createAccount(jenkinsUser, jenkinsPass) \n\
instance.setSecurityRealm(hudsonRealm) \n\
def strategy = new FullControlOnceLoggedInAuthorizationStrategy() \n\
strategy.setAllowAnonymousRead(false) \n\
instance.setAuthorizationStrategy(strategy) \n\
instance.save() \n\
' > /usr/share/jenkins/ref/init.groovy.d/init-security.groovy

USER jenkins
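
If you build the image yourself, the standard Docker workflow applies. The image and repository names below are placeholders; replace them with your own:

# Build the image from the Dockerfile above (run inside the folder containing it)
docker build -t <YourDockerHubUser>/jenkins-light-plugin:latest .

# Push the image to Docker Hub so Cloud Foundry can pull it
docker push <YourDockerHubUser>/jenkins-light-plugin:latest

# Optionally try it locally first; Jenkins becomes reachable on http://localhost:8080
docker run -d -p 8080:8080 <YourDockerHubUser>/jenkins-light-plugin:latest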

Jenkins Basic Configuration


First, create credentials for CPI and GitHub on the Manage Jenkins -> Credentials page.

  • Use the clientid and clientsecret from the CPI service key as username and password, and set the ID to "CPIOAuthCredentials" as shown below.



SAP Cloud Integration Credentials




  • Create the GitHub credentials the same way. Use your GitHub user and PAT token as username and password, and set the ID to "GIT_Credentials".



GitHub Credentials




  • Then go to Manage Jenkins -> Configure System and configure the environment variables below.

Name               | Value
CPI_HOST           | {{url value from Service Key, without https://}} e.g. xxxxxxxxxxx.it-cpi002.cfapps.ap10.hana.ondemand.com
CPI_OAUTH_CRED     | CPIOAuthCredentials
CPI_OAUTH_HOST     | {{tokenurl value from Service Key, without https://}} e.g. xxxxxxxxxxx.authentication.ap10.hana.ondemand.com
GIT_BRANCH_NAME    | master
GIT_CRED           | GIT_Credentials
GIT_REPOSITORY_URL | github.com/Asutosh-Integration/Jenkins.git (use your own forked repo)

Then create a pipeline with the configuration below.
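
For orientation, here is a minimal sketch of what such a webhook-triggered pipeline could look like. It is not the actual Jenkinsfile from the repository linked above: the token value is a placeholder, and the stage bodies are reduced to echo steps where the real pipeline downloads the artifact via the CPI OData API, pushes it to GitHub, and runs cpilint. The environment variables are the ones configured above.

pipeline {
    agent any
    triggers {
        // Generic Webhook Trigger plugin: the 'name' HTTP header set by the
        // webhook iFlow is exposed to the build as the variable $name.
        GenericTrigger(
            genericHeaderVariables: [[key: 'name', regexpFilter: '']],
            token: 'myCpiEventToken', // placeholder; must match the token sent by the webhook iFlow
            causeString: 'CPI deployment event for $name',
            printContributedVariables: true
        )
    }
    stages {
        stage('Download iFlow from CPI') {
            steps {
                // Real pipeline: fetch the artifact zip from
                // /api/v1/IntegrationDesigntimeArtifacts(Id='<iFlowId>',Version='active')/$value
                // on CPI_HOST, authenticating with CPIOAuthCredentials via CPI_OAUTH_HOST.
                echo "Downloading ${env.name} from ${env.CPI_HOST}"
            }
        }
        stage('Archive to GitHub') {
            steps {
                echo "Pushing ${env.name} to ${env.GIT_REPOSITORY_URL} (${env.GIT_BRANCH_NAME}) with ${env.GIT_CRED}"
            }
        }
        stage('Run cpilint') {
            steps {
                echo "Running cpilint checks on ${env.name}"
            }
        }
    }
}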




You will then be able to trigger the Jenkins pipeline via its webhook. Next, we will configure another iFlow that triggers the webhook call from the data store entries.


Integration flow for webhook trigger



Data Store Sender Channel Configuration


Add the Groovy script below.
import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.*

def Message processData(Message message) {
    // Read the event entry written by the OSGi event handler and extract the bundle name
    def json = new JsonSlurper().parseText(message.getBody(String))
    // Pass the iFlow / bundle name to Jenkins as an HTTP header
    message.setHeader('name', json.bundleName)
    // Authenticate against the Generic Webhook Trigger endpoint
    message.setHeader('Authorization', 'Bearer <Token Mentioned in Jenkins Webhook Trigger>')
    return message
}


HTTP Receiver Channel Configuration
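
The address configured in the HTTP receiver channel is the Generic Webhook Trigger endpoint of the Jenkins instance pushed to Cloud Foundry earlier, for example (the host name is a placeholder and depends on the route assigned to your container):

https://<NameForYourContainer>.cfapps.<region>.hana.ondemand.com/generic-webhook-trigger/invoke

The bearer token set in the Authorization header by the Groovy script above tells the plugin which pipeline to trigger.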



Test Run


For a test run, we deploy an iFlow in our tenant.


explorer iFlow Deployed


We can see a corresponding entry in the data store.
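
Based on the event handler script above, the stored payload for such an event looks roughly like this (topic, bundle name, and timestamp are examples):

{"topic":"org/osgi/framework/BundleEvent/STARTED","bundleName":"<YourIFlowId>","TimeStamp":1706189655000}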


Data Store Entry for the event


This message is picked up by our webhook iFlow, and the Jenkins pipeline is triggered.


Messages got processed



Jenkins Pipeline got completed


The binary files are archived in GitHub.


Binary files got archived in GitHub


Code analysis is performed by cpilint.
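
The pipeline invokes cpilint on the downloaded artifact roughly as follows (a sketch based on cpilint's command-line interface; rules.xml and the zip file name are placeholders for your own rules file and artifact):

cpilint -rules rules.xml -files <YourIFlowId>.zip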


Code Analysis was done by cpilint



Conclusion


This approach captures internal SAP Cloud Integration events and can potentially be applied across various use cases. However, the undocumented nature of these methods may present implementation challenges. Special thanks to r_herrmann, vadim.klimov and 7a519509aed84a2c9e6f627841825b5a for their invaluable insights shared in their blogs.