[Blog Post] SAP Event Mesh – Enable SAP Cloud Integration Suite to Consume Messages from SAP Event Mesh Service
Authors: Vipul Khullar & Ayush Kumar
Previous blog posts in this series:
- [Blog Series] SAP Event Mesh – Deep Dive | SAP Blogs
- [Blog Post] SAP Event Mesh – Event Driven Architecture Explained | SAP Blogs
- [Blog Post] SAP Event Mesh – Single Tenancy & Multi-Tenancy Explained | SAP Blogs
- [Blog Post] SAP Event Mesh – CAP based implementation of SAP Event Mesh in a Single-Tenant Scenario
In the previous blog posts, we saw how CAP-based microservices can communicate asynchronously with each other through the SAP Event Mesh service, and how the CAP framework facilitates this.
In this blog post, we will achieve the same result by using SAP Cloud Integration Suite as middleware between the two microservices. This is especially useful when specific event handling is not desired within the second microservice (i.e., it should be reached via API only).
To execute the following scenario, you need:
- A BTP account (a trial account also works).
- An SAP Event Mesh subscription for your subaccount.
- A service key for the Event Mesh instance.
- An SAP Cloud Integration Suite subscription for your subaccount.
- A local setup for CAP Java (if you already have a CAP application initialized, upgrade the CDS service version to 1.22.1 or higher; otherwise you might face runtime errors complaining about a bean error).
In this scenario, we have created two CAP-based microservices on BTP and set up event-based communication between them with the help of SAP Event Mesh and SAP Cloud Integration Suite. We use the AMQP adapter in a Cloud Integration flow to listen for the event raised by the first microservice, and then call an API endpoint of the second microservice to replicate the data to it.
Steps in SAP Event Mesh:
- Open the SAP Event Mesh Message Client UI.
- Select your instance (Message Client).
- Create a queue with any name of your choice.
- Note: you can also use the default queue, which is generated by the CAP application at runtime.
Note: For demo purposes, we will use emDataMappingQueue as the queue name and subscribe to the same topic as used in the previous blog posts.
- Now run the event-producer application which we created in the previous blog post. In the Event Mesh UI, you will see the message count increase in both queues: the queue created by CAP and the queue we created.
Steps in SAP Cloud Integration:
- Open the SAP Cloud Integration Suite and navigate to the security material. Create an OAuth2 Client Credentials artifact using the SAP Event Mesh service key created in the previous blog posts. (We will use this credential configuration to connect to Event Mesh.)
- Create a new integration flow artifact.
- Connect the Sender to the Start event using the AMQP adapter with the WebSocket transport.
- Open the service key again; under its AMQP section you will find all the values required to configure the adapter.
- In the Processing section, enter the name of the queue to consume from (emDataMappingQueue).
- In the Connection section, enter the host and path from the AMQP endpoint of the service key, and select the OAuth2 credential created above.
- Create an XSLT mapping to transform the data into the format expected by the consumer.
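The security-material step above pulls its values from the Event Mesh service key. As an illustration, the field names below follow a typical Event Mesh service key, but the values are placeholders, not real credentials; this sketch shows how those fields map to the CPI OAuth2 Client Credentials settings:

```python
import json

# Placeholder service key; "uaa" field names follow a typical
# SAP Event Mesh service key, but the values here are fake.
service_key = json.loads("""
{
  "uaa": {
    "clientid": "sb-demo-client",
    "clientsecret": "demo-secret",
    "url": "https://mytenant.authentication.eu10.hana.ondemand.com"
  }
}
""")

def to_oauth2_credential(key):
    """Map service-key fields to the fields asked for by the CPI
    'OAuth2 Client Credentials' security material."""
    uaa = key["uaa"]
    return {
        # The XSUAA token endpoint is the uaa url plus /oauth/token
        "token_service_url": uaa["url"] + "/oauth/token",
        "client_id": uaa["clientid"],
        "client_secret": uaa["clientsecret"],
    }

print(to_oauth2_credential(service_key)["token_service_url"])
# -> https://mytenant.authentication.eu10.hana.ondemand.com/oauth/token
```

In other words, the client ID and secret come straight from the key's uaa block, and the token service URL is the uaa URL with /oauth/token appended.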
Now run the Event Producer application and create a POST request to post student details, as done in the previous blog post.
Now, let's check the trace corresponding to the raised event in Cloud Integration Suite.
We observe that the data has now been mapped to the structure of the consumer microservice, which can consume this information without knowing the data model of the event producer.
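The actual XSLT is not reproduced here, but the kind of transformation it performs can be sketched in a few lines. The field names below are hypothetical (the real data models are defined in the previous blog posts); the point is simply that producer-side fields are renamed into the structure the consumer API expects:

```python
def map_student(producer_event):
    """Rename producer-side fields to the structure the consumer API
    expects -- the same job the XSLT mapping does inside the iFlow.
    Field names here are hypothetical."""
    return {
        "name": producer_event["studentName"],
        "email": producer_event["studentEmail"],
    }

print(map_student({"studentName": "Jane", "studentEmail": "jane@example.com"}))
# -> {'name': 'Jane', 'email': 'jane@example.com'}
```

Because this renaming happens inside the iFlow, neither microservice needs to know the other's data model.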
Conclusion: In this blog post, we presented a use case where multiple microservices, which may or may not share a similar data model, can communicate with each other without any modification to the microservices themselves. We achieved this with the help of SAP Event Mesh and SAP Cloud Integration Suite.
In the next few blog posts, we will cover how to raise business events from an SAP S/4HANA system using SAP Event Mesh, and how to consume them in SAP Cloud Integration Suite or any other application on BTP.
Please do like the blog post if you find the content helpful. Also, do share your comments and inputs, if any.
Next blog post in the series: [Blog Post] SAP Event Mesh – S/4HANA On Premise integration with Event Mesh | SAP Blogs
Hi Vipul Khullar,
Here you are trying to send the messages consumed in an iFlow to a CAP-based microservice. Is it possible to consume the messages from the iFlow directly in a UI5 application, without using any microservice, whether a CAP service or a Node.js application?
Thanks for the question. Yes, it is possible to consume the messages directly via an event handler or via a webhook in any Java/Node.js/other application.
We covered one example in our previous blog post, where we showcased an event-handler-based implementation in Java.
For a webhook-based implementation in Node.js, you can refer to this blog post.
Hope this helps.
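To make the webhook option a bit more concrete, here is a minimal, framework-free sketch (Python standard library only, purely illustrative): an Event Mesh webhook subscription POSTs the message to your endpoint, and replying with a 2xx status acknowledges it.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_event(body):
    """Parse the posted event and return the HTTP status to reply with.
    A 2xx status acknowledges the message to the broker."""
    event = json.loads(body or b"{}")
    # ... process the event here (e.g. update your own store) ...
    return 204, event

class EventWebhook(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        status, _event = handle_event(self.rfile.read(length))
        self.send_response(status)
        self.end_headers()

# To run locally: HTTPServer(("", 8080), EventWebhook).serve_forever()
```

The same shape works in Node.js or Java; only the HTTP server boilerplate changes.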
Hi Vipul Khullar
Thanks for your reply. I wanted to understand the importance of using CPI to consume messages from Event Mesh. Should it be used only when we need to change the data structure, or also for better error handling?
If we use CPI, can we eliminate the microservices and connect SAP UI5 directly to SAP Integration Suite?
Hi Keerthana Jayathran
CPI can be used in scenarios where multiple systems/microservices need to integrate with each other via APIs/events, especially when complex data mapping is required. It also helps in scenarios where two systems/microservices need to be agnostic of each other's presence, as CPI can act as middleware with built-in adapters that ease the integration between them.
I don't think connecting a SAP UI5 app directly to CPI is a valid use case. You would always need a backend microservice for this.
Hi Vipul Khullar
Great blog! I have a question for you: is it possible to keep the message in the queue after consuming it?
Hi Frankie Huang, yes, you can keep the message in the queue.
But that will be a design overhead for you.
By default, all event brokers work on the same principle: as soon as a successful acknowledgement for the message is received, the message is removed from the queue.
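That acknowledgement principle can be illustrated with a toy queue (illustrative only; this is not the Event Mesh API):

```python
from collections import deque

class ToyQueue:
    """Toy broker queue: a delivered message is removed only after the
    consumer acknowledges it; otherwise it stays for redelivery."""
    def __init__(self):
        self._messages = deque()

    def publish(self, msg):
        self._messages.append(msg)

    def deliver(self, consumer):
        msg = self._messages[0]        # delivered, but not yet removed
        if consumer(msg):              # True = acknowledgement
            self._messages.popleft()   # ack received -> remove message
        return len(self._messages)     # remaining message count

q = ToyQueue()
q.publish("student-created")
print(q.deliver(lambda m: False))  # no ack: message stays   -> 1
print(q.deliver(lambda m: True))   # ack: message is removed -> 0
```

Keeping messages around after a successful acknowledgement would mean building your own retention on top of this behavior, which is the overhead mentioned above.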
Hi Vipul Khullar,
Thanks for sharing.
I followed the blog and successfully created an integration flow.
But after the iFlow process ends, I found that the consumed message is still kept in Event Mesh (in the Event Mesh UI, the message count has not decreased).
Is there any step I haven't set?
Hi TIANHAO REN, this might occur if your iFlow is not successful, i.e., it went into a retry or error state.
If the iFlow completes successfully, the messages are removed from the queue.