SAP CPI – Event Driven Architecture (Z Event) with S4HANA and SAP EM
Hello Folks,
One more interesting blog to share knowledge and experience with you, based on test cases. Let's follow the topics and instructions as you read.
Agenda:
- Introduction
- Event Driven Architecture (EDA)
- Scenario and Integration Perspective
- Iflow – SAP CPI
- Import detail – Don’t be afraid, GROOVY it !!!
- XSLT Details
What will I not cover in this blog?
- The whole setup of S4HANA (connections and ABAP code).
- The configuration of SAP EM (queue, webhook and further details).
- The whole setup of the Iflow – just the important parts.
Introduction:
What I would like to share is how Event Driven Architecture works and how it can be a game changer in the new perspective of integration design, with a new model focused on simple integrations based on microservices.
SAP Integration Suite (EM and CPI) can support your business in this perspective, and here we fully explore one scenario that I described in my book – SAP Enterprise Messaging – which I wrote together with co-author Former Member.
The focus of this blog is to continue my previous blog on using SAP Enterprise Messaging with the webhook mechanism (API PUSH), as you can see here – SAP EM – Webhook mechanism – SAP CPI.
Basically, whenever the material suffers any change of state in its data, such as creation or modification (update/delete), the event is triggered automatically from S4HANA using a Z Event. (This is not a custom event from the Event Enablement add-on, because the VDI test system I used for development is on version 1809, which does not support that customization.)
Event Driven Architecture (EDA):
Put very simply, event driven architecture (EDA) defines an event as a “significant change in state”.
Event-driven architecture (EDA) is a software design pattern in which decoupled applications can asynchronously publish and subscribe to events via an event broker (modern messaging-oriented-middleware).
SAP provides the event enablement Add-On in ECC and S4HANA (Cloud and On-Premise) to support you with that.
In case you want to explore and understand more about this architecture, I recommend the book Event-Driven Architecture with SAP.
Scenario and Integration Perspective
As you already understand the potential of SAP Integration Suite (EM and CPI), let's briefly recap both services.
SAP EM is a cloud event broker that handles the exchange of events and messages.
SAP CPI is an iPaaS tool that offers a wide range of capabilities: security, connectivity, various messaging protocols, and rapid development and deployment with the use of industry-standard prepackaged integration content.
Integration Perspective:
The scenario proposed in this blog, as mentioned above: whenever a “significant change in state” of the material happens in the backend S4HANA system, the Z EVENT is triggered automatically to SAP EM, and via webhook (push) the message is delivered to SAP CPI. CPI is then responsible for routing based on the STATUS of the material (M or C), reading the API – API_PRODUCT_SRV – to retrieve the values, filtering them with XSLT, and sending to one receiver or multicasting to several, based on the plants this material belongs to.
Iflow – SAP CPI
Basically, I will not present the whole Iflow for routing, but I will explain some details of this approach of sharing the material's change of state with other systems.
As you can see, there are many local processes. Yes, I decided to build it like this to make error handling easier in case of connection problems and to keep the Iflow cleaner.
- Process Direct
- Get the product from the JSON event and save it as a property
- The ID will be used for the API GET read
- Local process to call API_PRODUCT_SRV
- Exception subprocess, just in case
- Call the OData API
- OData exception subprocess, just in case
- XSLT to filter language and plant
- Groovy to set up the routing details (see the sketch after this list)
- Routing – single call or multicast delivery in case the material is registered in more than one plant
- In case of single delivery or multicast
- In case of delivery to External System 1
- 14 – Call the local process to get the token first
- 15 – Exception subprocess, just in case
- 16 – Retrieve from memory the result of the OData API call to build the final message, with the access token in the header, in the next local process
- 17 – Groovy mapping – XML to JSON
- 18 – API call
- 19 – Exception subprocess, just in case
- Local process System 2
- 14 – Groovy mapping XML to JSON
- 15 – Call web service
- 16 – Exception subprocess, just in case
- Local process System 3
- 14 – Groovy mapping XML to JSON
- 15 – Call web service
- 16 – Exception subprocess, just in case
- Multicast for process systems 2 and 3
- Because the material is registered in plants that belong to both systems in the backend
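The "Groovy to set up the routing details" step is not shown in the blog. Just to give an idea, a minimal sketch of that kind of script could look like the one below, assuming the comma-separated Plants property set earlier in the flow and a header (here called DeliveryMode, an illustrative name) that the Router evaluates to choose between a single call and the multicast:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // The Plants property arrives as a comma-separated list, e.g. "T191,S039"
    def plants = (message.getProperty("Plants") ?: "").toString()
    def plantList = plants.tokenize(",")*.trim()

    // More than one relevant plant means more than one receiver,
    // so the Router can branch into the multicast path.
    message.setHeader("DeliveryMode", plantList.size() > 1 ? "MULTICAST" : "SINGLE")
    return message
}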
The XML result from the API: as you can see further down in the full sample, the API does a good job, but it returns more than is actually needed. You might try to solve this with $expand and $filter, but filtering on the expanded list is a problem – because I am expanding some entities of the OData service, adding a $filter on top of them generates the famous error.
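To make this concrete, the read against API_PRODUCT_SRV is essentially a GET on the product with the navigation properties expanded. The snippet below only illustrates the shape of that request (the values and paths are assumptions, not the exact OData receiver channel configuration); it is exactly the expanded lists that cannot be narrowed down with $filter, which is why the filtering is done afterwards in the XSLT step.

// Illustrative only – the real call is configured in the OData receiver channel.
def productId    = "15"                                  // value of the CodeProduct property
def resourcePath = "A_Product('" + productId + "')"
def queryOptions = '$expand=to_Description,to_Plant'
// Adding something like $filter=to_Plant/Plant eq 'M016' on top of the
// expanded navigation is what triggers the error mentioned above.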
Import detail – Don’t be afraid, GROOVY it !!!
I highly recommend you GROOVY your life, and for that you should order and read the SAP Press E-Bite from @engswee.yeoh and @vadim.klimov – Developing Groovy Scripts for SAP Cloud Platform Integration.
As you can see, for all the mapping details and for getting properties and headers, I decided to go with Groovy.
XSLT Details and first groovy:
The first Groovy script in the flow is responsible for parsing the JSON, removing the leading zeros of the PRODUCT and creating the plants property (CENTRO).
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
import groovy.json.*;
import java.util.regex.*;

def Message processData(Message message) {
    def map = message.getProperties()

    // Parse the incoming ZEvent JSON body.
    // Note: JsonSlurper paths are case-sensitive, so they must match the
    // field names of the ZEvent payload exactly.
    Reader reader = message.getBody(Reader)
    def json = new JsonSlurper().parse(reader)

    String product = json.product
    String test1 = json.ThirdParty.test1
    String test2 = json.ThirdParty.test2
    String test3 = json.ThirdParty.test3

    // Remove the leading zeros of the product code and keep it as a property
    // for the API_PRODUCT_SRV read.
    message.setProperty("CodeProduct", product.replaceFirst("^0+(?!${0})", ""))

    // Set one header per third-party flag; the Router uses these headers
    // to decide which receivers get the message.
    if (test1 != null && !test1.isEmpty() && test2 != null && !test2.isEmpty()) {
        message.setHeader("test1&test2", "X")
    } else {
        if (test1 != null && !test1.isEmpty()) {
            message.setHeader("test1", "X")
        }
        if (test2 != null && !test2.isEmpty()) {
            message.setHeader("test2", "X")
        }
        if (test3 != null && !test3.isEmpty()) {
            message.setHeader("test3", "X")
        }
    }

    // Keep the comma-separated plant list as a property; it feeds the
    // XSLT filter ($Plants) later in the flow.
    def stringPlant = json.Plant
    message.setProperty("Plants", stringPlant)

    return message
}
Standard Event and ZEvent
Standard:
{
"eventType": "BO.Product.Changed",
"cloudEventsVersion": "0.1",
"source": "https://sap.corp",
"eventID": "Aop3xCdEHtuvrK3F3Izrug==",
"eventTime": "2021-05-25T14:28:10Z",
"schemaURL": "https://sap.corp/sap/opu/odata/IWXBE/BROWSER_SRV/",
"contentType": "application/json",
"data": {
"KEY": [
{
"PRODUCT": "000000000000000015"
}
]
}
}
ZEvent:
{
"Product": "000000000000000015",
"Status": "M",
"Thirdparty": {
"Test1": "X",
"Test2": "X",
},
"Plants": "T191,S039"
}
In the standard event, the values inside KEY come as an array, so keep that in mind when parsing it. In the ZEvent, the values are extracted as flat fields:
- Product – 000000000000000015
- Status – M or C
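A small plain-Groovy illustration of the difference when reading the product code from each payload (values taken from the samples above):

import groovy.json.JsonSlurper

// Standard event: the product code sits inside the KEY array under "data"
def std = new JsonSlurper().parseText('{"data":{"KEY":[{"PRODUCT":"000000000000000015"}]}}')
assert std.data.KEY[0].PRODUCT == "000000000000000015"

// ZEvent: the fields are flat at the top level
def zev = new JsonSlurper().parseText('{"Product":"000000000000000015","Status":"M"}')
assert zev.Product == "000000000000000015" && zev.Status == "M"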
Regex in Groovy to remove leading zeros:
- The script imports java.util.regex.*, although String.replaceFirst() actually accepts a regex pattern string directly.
- Code: .replaceFirst("^0+(?!${0})", "")
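Just to illustrate what that expression does (plain Groovy, using the sample product code from the payloads above):

// In a double-quoted GString, ${0} evaluates to the literal 0, so the
// effective pattern is ^0+(?!0), which strips the run of leading zeros.
assert "000000000000000015".replaceFirst("^0+(?!${0})", "") == "15"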
The Plants param is read from the property set previously.
Now let's discuss the XSLT that trims the generic result of the API_PRODUCT_SRV call. The API returns every detail of the material, but the important part is to filter by language and by the plants this material belongs to, so that we don't send a wrong message to a system that should not receive those details.
I decided to implement part of this logic on the ABAP side, as you can see in the ZEvent:
- ThirdParty – who should receive this ZEvent
- Plants – used to filter and exclude entries from the API call result
<?xml version="1.0" encoding="UTF-8" ?>
<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
<xsl:output method="xml" omit-xml-declaration="yes" encoding="UTF-8" indent="yes" />
<xsl:param name="Plants"/>
<xsl:strip-space elements="*"/>
<xsl:template match="@*|node()">
<xsl:copy>
<xsl:apply-templates select="@*|node()"/>
</xsl:copy>
</xsl:template>
<xsl:template match="A_ProductDescriptionType[Language!='EN']"/>
<xsl:template match="A_ProductPlantType[not(contains($Plants,Plant))]"/>
</xsl:transform>
Why add this check logic on the ABAP side?
The point is that when you call API_PRODUCT_SRV based on the product code that comes in the ZEvent JSON, the API returns all values related to this material, and that is a problem: it comes with all languages installed in the system and all plants this material belongs to. THIS IS NOT WRONG, but it is not enough for SAP CPI to really determine which systems must receive this update of the material (create or change).
Because of that, I decided to push this logic to the ABAP side: whenever the material suffers any change of state in its data, it provides me the third-party systems as a list that is then used for routing.
Take a look at the full sample result from the API:
<?xml version="1.0" encoding="UTF-8"?>
<A_Product>
<A_ProductType>
<ProductGroup>L001</ProductGroup>
<to_Description>
<A_ProductDescriptionType>
<Language>DE</Language>
<ProductDescription>Handelsware 14, PD, Zukauf, H14</ProductDescription>
</A_ProductDescriptionType>
<A_ProductDescriptionType>
<Language>EN</Language>
<ProductDescription>Trad.Good 14,PD,Bought-In,H14</ProductDescription>
</A_ProductDescriptionType>
<A_ProductDescriptionType>
<Language>PT</Language>
<ProductDescription>TESTE OPERADOR 2021</ProductDescription>
</A_ProductDescriptionType>
<A_ProductDescriptionType>
<Language>ES</Language>
<ProductDescription>Mercadería 14, PD, comprado, H14</ProductDescription>
</A_ProductDescriptionType>
</to_Description>
<to_Plant>
<A_ProductPlantType>
<Plant>M016</Plant>
<SerialNumberProfile>0001</SerialNumberProfile>
</A_ProductPlantType>
<A_ProductPlantType>
<Plant>S039</Plant>
<SerialNumberProfile>0001</SerialNumberProfile>
</A_ProductPlantType>
<A_ProductPlantType>
<Plant>T161</Plant>
<SerialNumberProfile>0001</SerialNumberProfile>
</A_ProductPlantType>
<A_ProductPlantType>
<Plant>T191</Plant>
<SerialNumberProfile>0001</SerialNumberProfile>
</A_ProductPlantType>
</to_Plant>
</A_ProductType>
</A_Product>
I believe that, seeing this result, you are able to understand the problem.
With the XSLT filter described above, only language EN and the specific plants from the ZEvent – M016 and T191 – are selected, so as you can see in the result, plants S039 and T161 are out.
<A_Product>
<A_ProductType>
<to_Description>
<A_ProductDescriptionType>
<Language>EN</Language>
<ProductDescription>TEST</ProductDescription>
</A_ProductDescriptionType>
</to_Description>
<to_Plant>
<A_ProductPlantType>
<Plant>M016</Plant>
<SerialNumberProfile>0001</SerialNumberProfile>
</A_ProductPlantType>
<A_ProductPlantType>
<Plant>T191</Plant>
<SerialNumberProfile>0001</SerialNumberProfile>
</A_ProductPlantType>
</to_Plant>
</A_ProductType>
</A_Product>
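The "Groovy Mapping – XML to JSON" steps from the Iflow are not shown in the blog either. A minimal sketch of that kind of mapping, taking the filtered XML above as input (the target JSON layout here is purely illustrative), could look like this:

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonOutput

def Message processData(Message message) {
    // Parse the filtered XML produced by the XSLT step
    def xml = new XmlSlurper().parse(message.getBody(Reader))

    // Build a simple JSON structure for the receiving system
    def payload = [
        product     : message.getProperty("CodeProduct"),
        descriptions: xml.A_ProductType.to_Description.A_ProductDescriptionType.collect {
                          [language: it.Language.text(), text: it.ProductDescription.text()]
                      },
        plants      : xml.A_ProductType.to_Plant.A_ProductPlantType.collect { it.Plant.text() }
    ]

    message.setBody(JsonOutput.toJson(payload))
    message.setHeader("Content-Type", "application/json")
    return message
}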
To solve the problem of determining the receivers, we created the Z custom event in S4HANA.
Before the ZEvent is sent, the ABAP logic checks the tables MARA, MARM and MARC, even if the material only had a change in its description. This guarantees that SAP CPI always knows which third-party systems must receive the data, independently of which material master view was affected.
I really hope that you enjoyed the read and that you also start to think forward about event driven architecture with SAP products, together with the SAP OData APIs from S4, to support you better and to change the classic integration approach.
Kind regards,
Viana.
Hi Ricardo,
Great Blog...Thanks for sharing!!
Thank you,
Syam
How about sending all the relevant product data in the event notification? You could avoid doing lookup callbacks to the source system, as the lookups will have a negative impact on the system, especially with mass changes to the materials.
So I believe you don't fully understand EDA.
In such a case you would use an IDoc; it makes no sense to create an event with the whole details.
OK, that was helpful.
I was thinking of event-carried state vs. notification. Notifications place an additional load on the producing system, and it won't be easy to scale when the number of consumers increases. If you were to stick with small notification messages though, you would be better off using a claim-check pattern with a dedicated store for the event payloads.
BTW you definitely could use IDoc as an event message e.g. triggered from BTE, but you might want to map it in middleware before sending it to the broker.
Igor,
Of course you can scale it; the scenario is sending to 3 operators, and I just added 6 more to the multicast.
So whenever the state of the material changes in the backend system, the ZEvent is sent, the Product API is called to extract the values, and the result is sent to 9 operators in one Iflow.
Regards,
Viana.
You are right Ricardo, in your scenario you have just one sink (SAP CPI). The problem would be if you want to add many additional consumers of this event.
Surely, to centralize.
If you don't want to follow this approach, you can use a PULL mode, where consumers must read the data from the queues and call the API directly or via APIM to check what has been changed.
Hi Ricardo Viana,
great blog. Always nice to see your blogs about EDA and Event Mesh!
Just one question: why are you using the webhook functionality to connect EM and CPI? There is a native AMQP adapter available for CPI: https://blogs.sap.com/2019/11/20/cloud-integration-connecting-to-external-messaging-systems-using-the-amqp-adapter/
Best regards,
Tobias
Hello great Tobias Griebe,
First of all, thank you for all the support with SAP EM in 2019; without your help to deeply understand all the functionalities and more, maybe I would not have been able to produce any book, or even the SAP Press E-Bite.
I decided to use the webhook exactly to use the PUSH API instead of the PULL mode (AMQP). Surely AMQP is more secure and better than a simple HTTPS-based call, but this way I don't "overload" the CPI node with polling the data from the queue.
Kind regards,
Viana.