
SAP CPI – Event Driven Architecture (Z Event) with S4HANA and SAP EM

Hello Folks,

One more interesting blog to share knowledge and experience with you, based on test cases. Basically, let's follow the topics and instructions during the reading.


  1. Introduction

  2. Event Driven Architecture (EDA)

  3. Scenario and Integration Perspective

  4. Iflow – SAP CPI

  5. Important detail – Don’t be afraid, GROOVY it !!!

    1. XSLT Details

What will I not cover in this blog?

  • The whole setup of S4HANA (connections and ABAP code).
  • The configuration of SAP EM (queue, webhook and further details).
  • The whole setup of the Iflow – just the important parts.


What I would like to share is how Event Driven Architecture works and how it can be a game changer in the new perspective of integration design, with a new model focused on simple integrations based on microservices.

SAP Integration Suite (EM and CPI) can support your business in this perspective, and I will fully explore one scenario that I described in my book – SAP Enterprise Message – which I wrote together with co-author Andrzej Halicki.

The focus of this blog is to continue my previous blog about using SAP Enterprise Message with the webhook mechanism (API PUSH), which you can see here – SAP EM – Webhook mechanism – SAP CPI.

Basically, whenever the material data suffers a change of state, such as Creation or Modification (Update/Delete), the event will be triggered automatically from S4HANA using a Z Event. (This is not a custom event from the Event Enablement ADD-ON, because the VDI test system that I used to develop it is on version 1809, which does not support that customization.)

Event Driven Architecture (EDA):

In very simple terms, event driven architecture (EDA) defines an event as a “significant change in state”.

Event-driven architecture (EDA) is a software design pattern in which decoupled applications can asynchronously publish and subscribe to events via an event broker (modern messaging-oriented-middleware).

SAP provides the event enablement Add-On in ECC and S4HANA (Cloud and On-Premise) to support you with that.

In case you want to explore and understand more about this architecture, I recommend buying the book Event-Driven Architecture with SAP.

Scenario and Integration Perspective

As you already understand the potential of SAP Integration Suite (EM and CPI), let’s briefly recap both services.

SAP EM is a cloud event broker that handles the exchange of events and messages.

SAP CPI is an iPaaS tool, offering a wide range of capabilities: security, connectivity, various messaging protocols, and rapid development and deployment through industry-standard prepackaged integration content.

Integration Perspective:

In the scenario proposed in this blog, as mentioned above, whenever a “significant change in state” of the material happens in the backend S4HANA system, the Z EVENT is triggered automatically to SAP EM. Via webhook (push), the message is delivered to SAP CPI, which is responsible for routing based on the STATUS of the material (M or C): it reads the API – API_PRODUCT_SRV, retrieves the values, applies a filter using XSLT, and sends the result to a single receiver or a multicast of receivers based on the PLANT this material belongs to.

Iflow – SAP CPI

Basically, I will now present the Iflow for routing and explain some details of this perspective of sharing the material’s change of state with other systems.

As you can see, there are many local processes. Yes, I decided to structure it like this to simplify error handling in case of connection problems and to keep the IFLOW cleaner.

  1. Process Direct
  2. Get the product from the JSON event and save it as a property
    1. The ID will be used for the API GET read
  3. Local process that calls API_PRODUCT_SRV
  4. Exception subprocess, in case of errors
  5. Call the ODATA API
  6. ODATA exception subprocess, in case of errors
  7. XSLT to filter language and plant
  8. Groovy to set up the routing details
  9. Routing – single call or multicast delivery, in case the material is registered in more than one plant
  10. In case of single delivery or multicast
  11. In case of delivery to External System 1
    1. 14 – Call the local process for the token first
    2. 15 – Exception, in case of errors
    3. 16 – Retrieve from memory the result of the ODATA API call to build the final message, with the access token in the header, in the next local process
    4. 17 – Groovy mapping – XML to JSON
    5. 18 – API call
    6. 19 – Exception, in case of errors
  12. Local process for System 2
    1. 14 – Groovy mapping XML to JSON
    2. 15 – Call the web service
    3. 16 – Exception, in case of errors
  13. Local process for System 3
    1. 14 – Groovy mapping XML to JSON
    2. 15 – Call the web service
    3. 16 – Exception, in case of errors
  14. Multicast for Systems 2 and 3
    1. Because the material belongs to the same plant in the backend system
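The routing decision of steps 8–9 can be sketched in plain Java (outside CPI; in the real Iflow this is a Groovy script reading exchange properties, and the plant codes and system names below are purely hypothetical illustrations):

```java
import java.util.ArrayList;
import java.util.List;

public class RoutingSketch {
    // Decide the receivers for a material based on the comma-separated
    // plant list saved as a property earlier in the flow.
    // The plant-to-system mapping here is illustrative only.
    static List<String> receivers(String plants) {
        List<String> targets = new ArrayList<>();
        for (String plant : plants.split(",")) {
            switch (plant.trim()) {
                case "T191": targets.add("ExternalSystem1"); break;
                case "M016": targets.add("ExternalSystem2"); break;
                default:     break; // unknown plants are filtered out
            }
        }
        return targets;
    }

    public static void main(String[] args) {
        // More than one receiver means the Iflow takes the multicast branch
        System.out.println(receivers("T191,M016"));
    }
}
```

If the returned list has a single entry, the router takes the single-delivery branch; with several entries it goes to the multicast step.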

The XML result from the API:

As you can see, the API does a good job but returns more than is actually needed. You could try $expand and $filter, but the filter value here is a list, which is a problem: because I’m expanding some ODATA entities, combining $expand with $filter in this case generates the famous error:

Because of that, I decided to go for an XSLT filter, which also uses the property related to PLANT saved in the first groovy of this Iflow.
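For illustration, a read on API_PRODUCT_SRV with expanded navigation entities might look like this (host, key and navigation names are placeholders based on the standard API; verify them against your own gateway):

```
GET https://<s4-host>/sap/opu/odata/sap/API_PRODUCT_SRV/A_Product('15')?$expand=to_Description,to_Plant&$format=json
```

It is adding a $filter on fields of the expanded entities (for example, the plant list) on top of this $expand that produces the error, which is why the filtering was moved to XSLT.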

Important detail – Don’t be afraid, GROOVY it !!!

I highly recommend that you GROOVY your life, and for that you should order and read the SAP Press E-Bite from @engswee.yeoh and @vadim.klimov – Developing Groovy Scripts for SAP Cloud Platform Integration.

As you can see, for all the mapping details and for getting properties and headers, I decided to use Groovy.

XSLT Details and first groovy:

The first groovy in the flow is responsible for parsing the JSON, removing the leading zeros of PRODUCT, and creating a property CENTRO:

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonSlurper

def Message processData(Message message) {
    Reader reader = message.getBody(Reader)
    def json = new JsonSlurper().parse(reader)
    String product = json.product
    String test1 = json.ThirdParty.test1
    String test2 = json.ThirdParty.test2
    String test3 = json.ThirdParty.test3

    // Remove the leading zeros so the ID can be used for the API GET read
    message.setProperty("CodeProduct", product.replaceFirst("^0+(?!\$)", ""))

    // Flag each third-party receiver that should get this event
    if (test1 != null && !test1.isEmpty()) {
        message.setHeader("test1", "X")
    }
    if (test2 != null && !test2.isEmpty()) {
        message.setHeader("test2", "X")
    }
    if (test3 != null && !test3.isEmpty()) {
        message.setHeader("test3", "X")
    }

    // Save the plant list as the property CENTRO for the XSLT filter later in the flow
    message.setProperty("CENTRO", json.Plant)
    return message
}

Standard Event and ZEvent

The standard event payload:

{
  "eventType": "BO.Product.Changed",
  "cloudEventsVersion": "0.1",
  "source": "https://sap.corp",
  "eventID": "Aop3xCdEHtuvrK3F3Izrug==",
  "eventTime": "2021-05-25T14:28:10Z",
  "schemaURL": "https://sap.corp/sap/opu/odata/IWXBE/BROWSER_SRV/",
  "contentType": "application/json",
  "data": {
    "KEY": [
      { "PRODUCT": "000000000000000015" }
    ]
  }
}

The ZEvent payload:

{
  "Product": "000000000000000015",
  "Status": "M",
  "Thirdparty": {
    "Test1": "X",
    "Test2": "X"
  },
  "Plants": "T191,S039"
}

The values inside KEY come as an array; if you don’t handle that, the parser will extract the values as:

  • 000000000000000015

  • Status – M or C

Regex in Groovy to remove leading zeros:

  • You need to import the lib: import java.util.regex.*;

  • Code: .replaceFirst("^0+(?!\$)", "") – note that the $ must be escaped inside a Groovy double-quoted string.
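The same regex works in plain Java, shown here as a quick self-contained check of what the expression does (the negative lookahead keeps one final zero when the value is all zeros):

```java
public class LeadingZeros {
    // Remove leading zeros, but keep a single zero if the value is all zeros
    static String strip(String s) {
        return s.replaceFirst("^0+(?!$)", "");
    }

    public static void main(String[] args) {
        System.out.println(strip("000000000000000015")); // 15
        System.out.println(strip("0000"));               // 0
    }
}
```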

The param – Plants – is read from the property saved earlier.

Now let’s discuss the XSLT. It trims the generic result of the call to API_PRODUCT_SRV, which returns every detail of the material; the important part is to filter by language and by the plants this material belongs to, so we don’t send the wrong message to a system that should not receive those details.

I decided to implement this logic on the ABAP side, as you can see in the ZEvent:

  • ThirdParty – who should receive this ZEvent.
  • Plants – used to filter and exclude entries from the API call result.
<?xml version="1.0" encoding="UTF-8" ?>
<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
 <xsl:output method="xml" omit-xml-declaration="yes" encoding="UTF-8" indent="yes" />
 <xsl:param name="Plants"/>
 <xsl:strip-space elements="*"/>
 <!-- Identity template: copy everything not removed by a more specific rule -->
 <xsl:template match="@*|node()">
   <xsl:copy>
     <xsl:apply-templates select="@*|node()"/>
   </xsl:copy>
 </xsl:template>
 <!-- Drop product descriptions in any language other than EN -->
 <xsl:template match="A_ProductDescriptionType[Language!='EN']"/>
 <!-- Drop plant entries whose Plant is not in the Plants property -->
 <xsl:template match="A_ProductPlantType[not(contains($Plants,Plant))]"/>
</xsl:transform>
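To illustrate what the two filter templates do, assume the Plants property is "M016,T191" and a simplified input (element contents shortened, sample data hypothetical):

```xml
<!-- Input (excerpt) -->
<A_ProductDescriptionType><Language>EN</Language></A_ProductDescriptionType>
<A_ProductDescriptionType><Language>DE</Language></A_ProductDescriptionType>
<A_ProductPlantType><Plant>T191</Plant></A_ProductPlantType>
<A_ProductPlantType><Plant>S039</Plant></A_ProductPlantType>

<!-- Output: only the EN description and plant T191 remain,
     because contains("M016,T191", "S039") is false -->
<A_ProductDescriptionType><Language>EN</Language></A_ProductDescriptionType>
<A_ProductPlantType><Plant>T191</Plant></A_ProductPlantType>
```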


Why add this check logic in ABAP?

The point is that when you call API_PRODUCT_SRV with the product code that comes in the JSON ZEvent, the API returns all values related to this material, and that is a problem: it comes with all languages installed in the system and all plants the material belongs to. THIS IS NOT WRONG, but it is not functional if SAP CPI is to really determine which systems must receive this update of the material (Create or Change).

Because of that, I decided to push this logic to the ABAP side: whenever the material data suffers any change of state, it provides the third-party systems as a list that will be used for routing.

Take a look at a sample from the API result:

<?xml version="1.0" encoding="UTF-8"?>
<!-- excerpt: one description per language installed in the system -->
            <ProductDescription>Handelsware 14, PD, Zukauf, H14</ProductDescription>
            <ProductDescription>Trad.Good 14,PD,Bought-In,H14</ProductDescription>
            <ProductDescription>TESTE OPERADOR 2021</ProductDescription>
            <ProductDescription>Mercadería 14, PD, comprado, H14</ProductDescription>

I believe that, seeing the result, you are able to understand the problem.

With the XSLT filter described above, we select only language EN and the specific plants – M016 and T191 – from the ZEvent; as you can see in the result, plants S039 and T161 are out.


To solve this issue, we created the Z custom event in S4HANA.

The solution was, before sending the ZEvent, to check the tables MARA, MARM and MARC, even if the material only suffered a change in its description. This logic guarantees that SAP CPI will always know which third-party systems must receive the data, independently of which material master view was affected.

I really hope that you enjoyed the read and that you start thinking forward about event driven architecture with SAP products, together with the SAP ODATA APIs from S4, to support you better and change the classic integration approach.


Kind regards,



  • How about sending all the relevant product data in the event notification? You could avoid doing lookup callbacks to the source system, as the lookups will have a negative impact on the system, especially with mass changes to the materials.

      • OK, that was helpful.

        I was thinking of event-carried state vs notification. Notifications place an additional load on producing system and it won't be easy to scale when the number of consumers increases. If you were to stick with small notification messages though, you would be better off using claim-check with dedicated store for event payloads.

        BTW you definitely could use IDoc as an event message e.g. triggered from BTE, but you might want to map it in middleware before sending it to the broker.

        • Igor,

          Of course you can scale it; the scenario is sending to 3 operators, and I just added 6 more at the multicast.


          So whenever the state of the material changes in the backend system, the ZEvent will be sent, the Product API will be checked to extract the values, and the result is sent to 9 operators in one Iflow.




  • You are right Ricardo, in your scenario you have just one sink (SAP CPI). The problem would be if you want to add many additional consumers of this event.

    • Surely to centralize.


      If you don't want to follow this approach, you can use a PULL mode, where consumers must read the data from the queues and call the API directly or via APIM to check what has changed.

    • Hello great Tobias Griebe,

      First of all, thank you for all the support with SAP EM in 2019; without your support to deeply understand all the functionalities, maybe I would not have been able to produce any book, even the SAP Press E-Bite.

      I decided to use the webhook exactly to use the PUSH API instead of PULL mode (AMQP). Surely AMQP is more secure and better than a simple HTTPS call, but this way I don't "overload" the CPI node with pulling the data from the queue.

      Kind regards,