Baris Buyuktanir

Advanced Event Mesh and BTP : Getting Events to Work

Some readers may remember one of my childhood favorites, the animated TV series “Voltron”. It was a science-fiction cartoon featuring a team of pilots who control lion-shaped robots that combine to form a larger, more powerful robot called Voltron.

This is how I see BTP: “Voltron” (BTP and the BTP services / Voltron and the lions). In fact, BTP is and was already a very effective PaaS, but it is now even more effective with the addition of Advanced Event Mesh, a very powerful event broker that has joined the team. (Advanced Event Mesh in fact promises more, but we'll discuss that later.)

While formulating the architecture and design of your integrations, you have to know and make use of the best combination of the solutions you have at hand. Keeping in mind the pros and cons of each solution, and making use of the strong parts of each one, is the ultimate target. Today in the cloud with SAP BTP, we have a very powerful platform, which can combine the power of API-based service-oriented architecture with the relatively new kid on the block: event-driven architecture.

By combining services and solutions like Cloud Integration, Advanced Event Mesh and CAP (and other options I won't mention here), BTP forms its own Voltron, surpassing the sum of its underlying components.

I won't cover some of these in detail, as there are lots of good technical articles and blogs where the benefits and implementations of Cloud Integration and CAP development are already presented. There are also various blogs covering the advantages of event-driven architecture which you can check for an introduction.

Instead, I will focus on a scenario where you can combine the power of these three (BTP CI, CAP and AEM), while concentrating technically on Advanced Event Mesh, assuming that the other parts are already covered and readers know them well.

Note that the sample payload data has been modified for privacy reasons, without affecting the functionality.



Imagine a use case where you have a sales system (C4C) integrated with multiple target/consumer applications that you need to notify about what is happening to “Accounts”. (Obviously you can pick other objects as well; in this use case, Account is picked because it is master data that is requested often in many scenarios.)

One very traditional way to handle this would be a file-based async approach, or having your target applications query the upstream application (C4C) in a pull-based approach. This means multiple calls to your application (high cost) because you have multiple consumers. Another side effect is tightly coupling your consumers to the source application.

What if not all your consumers are interested in all the Account changes you produce? How is this situation handled?

This would mean different queries, different routing rules or intermediate data storage problems .. and much more.


What if our source system notified the world about changes to “Accounts” via events once, and multiple possible consumers could pick whatever they are interested in? Producers and consumers would not be tightly coupled (in fact, they would not even be aware of each other), so adding and removing each one becomes easier.


Very basically, the main steps of the use case are as follows:

  1. Operations in C4C trigger events, and C4C publishes the new/changed information via these events (mostly via what are called notification events).
  2. BTP Cloud Integration (as the orchestrator not only of C4C events today, but also of possible future events from different source systems) manages the flow and makes the necessary calls.
  3. BTP Advanced Event Mesh (as an event broker and router) manages these event streams and communicates with multiple targets, routing the correct information to interested consumers.
  4. BTP CAP services implement smart logic, rules for data, filtering etc., and expose this logic to CI as microservices.



Below is the high-level architecture, where three third-party applications are the targets (interested in the same type of data, but from different perspectives) and C4C is the source system. The good thing about this scenario is that all parties are loosely coupled; in fact, the source system doesn't even know who the consumers are.






  • Cloud For Customer (C4C)

As the source of the events, in C4C, you first need to register a system to publish the events to and register the specific events to be published via “Subscriptions”.

Administrator > General Settings >  System Administration > Event Notification


This is for C4C to publish events to other applications. ( in our case it’s the BTP, Cloud Integration iFlow Endpoint)


Target for Event Notification and Subscriptions of Events

In this case I will only be publishing changes to the Account object's Root node (all create, update and delete operations). Therefore, whenever an insert, update or delete happens on this particular object, an event is triggered.

Below you can see some of the many subscription options from C4C (you can pick among many, at whichever level of detail).


Registration of the target system and events contd.

When you make a change to the object, as per this design, the event below (a sample) is triggered from C4C and published.

{
    "specversion": "1.0",
    "type": "/sap.c4c/Account.Root.Updated",
    "source": "/XAF/sap.c4c/000000000123456788",
    "id": "bb19819d-66xx-1eed-bdc5-0c04a378e286",
    "event-type": "Account.Root.Updated",
    "event-type-version": "v1",
    "event-id": "bb19819d-66xx-1eed-bdc5-0c04a378e286",
    "event-time": "1981-13-04T03:30:30Z",
    "data": {
        "entity-id": "001981D0E1M2E3T4CA1EEC8AF58A7D84"
    }
}

This event is published to the BTP CI iFlow, which acts as the orchestrator. For C4C, it is simply an endpoint (an external consumer), configured as below.

 !! Replace the hostname and endpoint URL according to your requirements (whatever you configured in the iFlow)

With these settings, the publishing (source part) is finished.


  • BTP, Cloud Integration (formerly CPI) iFlow


To simplify the scenario, a basic version of the BTP CI iFlow is demonstrated (many steps for logging, security and configuration scripting for specific requirements have been removed from the iFlow), as the purpose of this blog is to present the overall architecture and the event broker (Advanced Event Mesh) portion.


Our iFlow “IFLOW_EVENTRECEIVER_TO_CONSUMERS” orchestrates the scenario and receives the current events and, as an extension, possible future events from current or new source systems.

The main integration process consists simply of:

  1. Receiving the event information via the HTTP adapter.
  2. Setting some headers/properties to be used later (depending on the complexity of the scenario).
  3. Calling the CAP service to receive the data: the full enriched object and the topics to publish to. (The notification event payload only contains the changed data, but we want to publish more detailed information than that.)
  4. Iterating over the topics (0..n topics are possible in this scenario) and publishing to each of them.
  5. A local integration process publishing to Advanced Event Mesh for each topic received, via the AMQP adapter.
  6. Handling exceptions and notifying related parties via another generic process.


The http endpoint to receive the event data


Http Endpoint (start of iFlow)


The main and the sub-process


Main Integration Process


Sub-process sending the message to AEM

Advanced Event Mesh Connection Details (AMQP Adapter)

The sub-process “Process Topics one by one” does the Advanced Event Mesh publishing, where pCurrentTopicToBePublished is the property we set on each iteration as the topic name.

This topic is published to Advanced Event Mesh together with the message payload. (We receive both from the CAP service (API) described below.)

Bear in mind that there can be a single topic, or even zero if we decide not to publish the information at all. It's all up to you; in this case I prefer 0..n topics and iterate over them. In the last part we'll discuss the fun part: how topics can be used to route messages in Advanced Event Mesh in a very flexible manner.
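Conceptually, the sub-process above is just a loop over the topic list. Here is a minimal Python sketch of that behavior (illustrative only; in the iFlow the actual send is done by the AMQP adapter, which the caller-supplied `publish` function stands in for, and all names here are hypothetical):

```python
def publish_event(payload: str, topics: list[str], publish) -> int:
    """Publish the same payload once per topic (0..n topics).

    `publish` stands in for the AMQP adapter call made by the
    Cloud Integration sub-process; with zero topics, nothing is
    published at all.
    """
    for topic in topics:  # each iteration sets the current topic name
        publish(topic, payload)
    return len(topics)

sent = []
count = publish_event('{"AccountId": "20000001"}',
                      ["c4c/account/v1/ECOM1/TR/DIV1"],
                      lambda topic, body: sent.append(topic))
print(count, sent)  # 1 ['c4c/account/v1/ECOM1/TR/DIV1']
```

The point of the loop shape is that "do not publish" is just the empty-list case, so the iFlow needs no separate branch for it.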



AMQP Adapter Connection and Processing Details



In this scenario, the responsibilities of the CAP service are:

  • to formulate the data to be published to the target applications
  • to determine which topics to publish this data to


We won't look at the details of the CAP service, but the logic is simply:

  • The CAP service API receives the event data payload (JSON).
  • The CAP service queries the C4C system to enrich the data (via OData services).
  • As per the logic, it returns one or multiple topics to publish to, along with the enriched object, to the caller (the CI iFlow).
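The topic-derivation step itself can be tiny. Below is a hypothetical sketch in Python (for brevity only; a real CAP service would be Node.js or Java, and the real rules live in the service — the field names follow the sample response payload shown below):

```python
def derive_topics(entity: dict) -> list[str]:
    """Derive 0..n topics of the form
    c4c/account/v1/{SalesOrg}/{Country}/{Division}
    from an enriched Account entity.

    Returning an empty list means "do not publish" -- the iFlow
    simply iterates over whatever comes back.
    """
    if not all(entity.get(k) for k in ("SalesOrg", "Country", "Division")):
        return []
    return [f"c4c/account/v1/{entity['SalesOrg']}/{entity['Country']}/{entity['Division']}"]

account = {"SalesOrg": "ECOM1", "Country": "TR", "Division": "DIV1"}
print(derive_topics(account))        # ['c4c/account/v1/ECOM1/TR/DIV1']
print(derive_topics({"SalesOrg": "ECOM1"}))  # [] -> nothing to publish
```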

Below is a sample request and response for the CAP Service:




Request:

{
    "specversion": "1.0",
    "type": "/sap.c4c/Account.Root.Updated",
    "source": "/XAF/sap.c4c/000000000123456788",
    "id": "bb19819d-66xx-1eed-bdc5-0c04a378e286",
    "event-type": "Account.Root.Updated",
    "event-type-version": "v1",
    "event-id": "bb19819d-66xx-1eed-bdc5-0c04a378e286",
    "event-time": "1981-13-04T03:30:30Z",
    "data": {
        "root-entity-id": "001981D0E1M2E3T4CA1EEC8AF58A7D84",
        "entity-id": "001981D0E1M2E3T4CA1EEC8AF58A7D84"
    }
}

Response:

{
    "EventTriggeredOn": "1981-04-13T03:30:00Z",
    "Topics": [
        "c4c/account/v1/ECOM1/TR/DIV1"
    ],
    "Entity": {
        "AccountId": "20000001",
        "Role": "CRM000",
        "ERPAccountID": "358424",
        "AccountName": "BB Test Account",
        "AccountStatus": "2",
        "SalesOrg": "ECOM1",
        "DeliveryPostalCode": "34000",
        "DeliveryCity": "Istanbul",
        "Country": "TR",
        "DistributionChannel": "01",
        "Division": "DIV1",
        "SalesRepCode": "1000132",
        "TaxId": "TAX00001",
        "CompanyID": "ABCD1234"
    }
}

Now it comes to the fun part, Advanced Event Mesh, event broker configuration and topic structure.



Advanced Event Mesh is an enterprise-grade event broker, in many cases compared to other event brokers such as Kafka and RabbitMQ.

The name overlaps with SAP's other solution, Event Mesh, causing confusion, as if Advanced Event Mesh were the same solution as Event Mesh but with more capabilities. In fact, only the last part of that sentence is correct. Obviously both EM and AEM are event brokers, providing messaging and eventing services using queues, topics and subscriptions. But when it comes to capabilities like “meshing” broker services, easily combining async and sync scenarios, and filtering/routing via topics in a very effective way (which we will use in this blog and scenario), Advanced Event Mesh behaves very differently and shines with what it additionally offers as an enterprise-level event broker (and more).

Some of these capabilities have already been demonstrated in events, blogs and webinars. Along with the features presented today, I will also try to demonstrate them in upcoming blog posts.

The fun part starts right after Cloud Integration (the iFlow) publishes the message to the smart topic(s) described below.

Assuming you have already provisioned your Advanced Event Mesh broker service within your BTP tenant, you will see user interfaces like the ones below when you log in for the first time (subject to change with new versions, but the idea is the same).


Let's return to our scenario, where C4C publishes “notifications” about Account data to BTP.

In the scenario we have three different downstream applications expecting new/changed/deleted Account data:





  • E-commerce application: interested in accounts belonging to the e-commerce sales organizations (ECOM1, ECOM2)

  • Regional store application (Türkiye region): interested in accounts for the country Türkiye (TR)

  • Global application: interested in every account, regardless of country or sales organization



Topic / Topic Subscriptions

In event-driven architecture (EDA), topics serve as a way to categorize the data conveyed in event messages. Events are sent to one or multiple topics, and endpoints (destinations) can subscribe to one or more topics to receive events from publishers. Practically, a topic is nothing but additional information (a free-format string) attached to a message, like an attribute or header. However, when efficiently designed, topics are very powerful: you can organize the routing of messages (events) to multiple different consumers in a very flexible manner.

We make use of topics via topic subscriptions. Topic subscriptions attract messages; they tell the broker which messages you, as a consuming endpoint, are interested in.

Advanced Event Mesh supports wildcards in topic subscriptions, which gives you a lot of flexibility with the “*” wildcard and the special “>” character. Our scenario is a small demonstration of this usage. (In fact AEM offers more, which might be a “topic” of another article about EDA/AEM.)

  • * at a level matches everything at that level (a level is everything between “/” characters)
  • > after the last level matches every level after your last one; it's like XYZ/*/*/* forever

Usage will become clearer with our example below.
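These two rules can be expressed in a few lines. The sketch below is a simplified model of this kind of matching, not the broker's actual implementation (for instance, AEM also supports prefix forms like `abc*`, which are omitted here):

```python
def topic_matches(subscription: str, topic: str) -> bool:
    """Return True if `topic` matches `subscription`.

    '*' matches exactly one whole level; '>' as the last level
    matches one or more remaining levels. A simplified model of
    the broker's matching rules, for illustration only.
    """
    subs = subscription.split("/")
    tops = topic.split("/")
    for i, level in enumerate(subs):
        if level == ">":
            # valid only as the last level; needs at least one more topic level
            return i == len(subs) - 1 and len(tops) > i
        if i >= len(tops) or (level != "*" and level != tops[i]):
            return False
    return len(subs) == len(tops)

print(topic_matches("c4c/account/v1/*/TR/>", "c4c/account/v1/STR1/TR/DIV1"))   # True
print(topic_matches("c4c/account/v1/*/TR/>", "c4c/account/v1/ECOM2/UK/DIV2"))  # False
print(topic_matches("c4c/account/v1/>", "c4c/account/v1/ECOM1/TR/DIV1"))       # True
```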

Below is the sample topic structure I used (the placeholder attributes in {{XYZ}} are replaced by the real values):

Topic Structure


Let's assume that four events are triggered from C4C. Based on the data within each event, our smart CAP service fills in the topics as below:

  1. The first event is for a change to an account from sales organization ECOM1 || country: Turkey (TR) || division DIV1
  2. The second is for an account from sales organization Stores1 (STR1) || country: Turkey (TR) || division DIV1
  3. The third is for an account from sales organization E-Commerce (ECOM2) || country: United Kingdom (UK) || division DIV2
  4. The fourth is for an account from sales organization Stores2 (STR2) || country: Germany (DE) || division DIV2

So the topics formulated by the CAP service will be:

  • Message-1 Topic: c4c/account/v1/ECOM1/TR/DIV1
  • Message-2 Topic: c4c/account/v1/STR1/TR/DIV1
  • Message-3 Topic: c4c/account/v1/ECOM2/UK/DIV2
  • Message-4 Topic: c4c/account/v1/STR2/DE/DIV2


Now this is where the magic happens.

We have three consumers, each expecting only the messages they are interested in.

Based on the topics, Advanced Event Mesh routes these messages to zero or more subscribers.


Advanced Event Mesh uses wildcards and special symbols to make this filtering possible:

  • * at a level matches everything at that level
  • > after the last level matches everything after it


How does this happen?

  • All of these consumer applications listen to their own queues (there are different options, but this is the widely used one)


  • APP_1 is a consumer of QUEUE_1. Subscription: accounts for the e-commerce sales organizations (ECOM1, ECOM2)

  • APP_2 is a consumer of QUEUE_2. Subscription: accounts for the country Türkiye (TR), regardless of other information

  • APP_3 is a consumer of QUEUE_3. Subscription: all accounts, regardless of sales org, country or division


Below are our queues in the beginning:


3 new Queues are created


And the topic subscriptions for these queues:


Queues are subscribed to the topics very easily

With this setup:


  • APP_1 -> QUEUE_1 will receive Message 1 and Message 3
  • APP_2 -> QUEUE_2 will receive Message 1 and Message 2
  • APP_3 (generic) -> QUEUE_3 will receive all messages: Message 1, Message 2, Message 3 and Message 4

as per their subscriptions.
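This delivery table can be double-checked with a small script. The subscription strings below are assumptions reconstructed from the scenario description (the exact ones are in the screenshots), and the matcher is a simplified model of the `*` and `>` rules, not the broker's implementation:

```python
def matches(sub: str, topic: str) -> bool:
    # Simplified matching: '*' = exactly one level, '>' = one or more trailing levels.
    s, t = sub.split("/"), topic.split("/")
    if s[-1] == ">":
        return len(t) >= len(s) and all(a in ("*", b) for a, b in zip(s[:-1], t))
    return len(s) == len(t) and all(a in ("*", b) for a, b in zip(s, t))

topics = {
    1: "c4c/account/v1/ECOM1/TR/DIV1",
    2: "c4c/account/v1/STR1/TR/DIV1",
    3: "c4c/account/v1/ECOM2/UK/DIV2",
    4: "c4c/account/v1/STR2/DE/DIV2",
}
queues = {  # assumed subscription strings per queue
    "QUEUE_1": ["c4c/account/v1/ECOM1/>", "c4c/account/v1/ECOM2/>"],
    "QUEUE_2": ["c4c/account/v1/*/TR/>"],
    "QUEUE_3": ["c4c/account/v1/>"],
}
for queue, subs in queues.items():
    received = [n for n, t in topics.items() if any(matches(s, t) for s in subs)]
    print(queue, received)
# QUEUE_1 [1, 3]
# QUEUE_2 [1, 2]
# QUEUE_3 [1, 2, 3, 4]
```

Extending coverage later (for example to French accounts) would then simply mean appending another pattern such as `c4c/account/v1/*/FR/>` to the relevant queue's subscription list.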

Once your design is in place, future extensions become much easier, such as onboarding additional source and target systems and/or meeting new requirements from current subscribers.

Let's say a new requirement asks APP_2 to extend its coverage to accounts from France as well. The only thing you need to do in this scenario is add a subscription to APP_2's queue, as below. That's it.


Once changes are made to accounts for FR, they are published to APP_2 as well, along with APP_3, which already receives all countries' accounts.

Final status of the queues after new subscription and events publishing

Onboarding a new consumer:

Let's say a fourth application (APP_4) would like to receive account-related information for a particular sales organization (stores with code STR1). This is easily done in two steps: assign the app a queue, and adjust the subscription for this queue as below.


And voilà! Start enjoying how easily the related data is routed to the application.


What we have done ..

Although we simplified the scenario to demonstrate the intended points:

  • We enabled the C4C system as an event source, publishing data to loosely coupled consumers in a controlled manner.
  • With Advanced Event Mesh and BTP, we enabled multiple consumers to be notified about the changes they are interested in.
  • We designed a flexible architecture that lets future subscribers (and possibly new sources/producers) easily join the scenario.


Follow-up Plans..

It is now easier to consider other combinations of sync/async, API-based and event-driven scenarios, depending on the target and source systems' capabilities.

Advanced Event Mesh and BTP have many more capabilities to add on top; in the next couple of articles and blogs, my intention is to demonstrate scenarios with some “advanced” features of Advanced Event Mesh and show how you can modernize integrations using event-driven architecture.

Please do not hesitate to contact me in case of questions / comments / recommended use-cases. 


      Sebastian Schuck

      Thanks for this great blog post demonstrating how to utilize the SAP BTP technologies to setup an "event pipeline" step by step while clearly outlining the advantages of an event-driven approach.

      Muni M

      Hi Baris,

      Thank you for this wonderful blog.

      I understand the pub-sub model can scale better as the number of producers and consumers grows and the volume gets higher. If I just ignore the scalability part, can we handle this scenario (one sender and three receivers in your case) using CPI itself? Because you can route these messages to receivers; CPI can work as a webhook and push the messages to all receivers.

      If you look at the message exchange pattern in Solace, I think this is similar to CPI's message exchange pattern.

      Why do we have to use CPI between C4C and Event Mesh? Can we just let C4C send the data to Event Mesh directly? In plain terms, C4C is the producer here.

      I have another question about the queues part. Do we have to create a queue in Solace manually every time a new consumer comes in, or can it be created via a Java/REST API, just like Kafka consumers subscribing to a topic?




      Baris Buyuktanir
      Blog Post Author

      Hi Muni,

      Thank you for your comment.


      Multiple questions and different variations to your questions..

      Regarding CI (CPI), you are right, it's a choice but not a must-have. As indicated in the blog, I use it for flexibility in orchestrating the calls and the other operations mentioned (for the future as well). You have multiple options: call AEM first from C4C (not via AMQP, but REST) and then route to the downstream applications, or alternatively create a REST endpoint in CAP and call it directly. You need to consider the use case to give the most proper answer, as they all have pros and cons. It also depends on the upstream application and the next step of your flow. For instance, if C4C calls AEM and you then need to do some payload-specific operation, you have to handle that elsewhere. Or you may want to enrich the object, because the event sometimes doesn't send you the whole information at once. You can do each of these programmatically in the CAP service + AEM, but again it's a choice you need to make considering the restrictions on each side.

      What you mentioned is correct, it's the pub-sub pattern with extensions, and AEM is indeed powerful at processing these requests unless you do something extra "inaccurate".

      For the queues part, the answer is again: AEM gives you multiple options / it depends on your use case. There are different message delivery modes in AEM/Solace. If you want direct messaging, you don't need to create a queue; queues are for persistence/guaranteed delivery. The question is: what do you expect if your consumer is not connected for some reason? Are you OK with losing messages while disconnected, or not? These endpoints can be durable (like this one) or non-durable, if that's what you meant. However, non-durable queues only live until the consumer disconnects, so you have to choose according to your case.

      Lastly, you have multiple options for creating the queue, if that's what you meant by "creating". You can create a queue from the management UI, through code, or through SEMP.

      Hope this helps,

      Maximiliano Colman

      Hi Baris,

      Have you checked this?



      Baris Buyuktanir
      Blog Post Author

      Hi Max,

      I have checked it, another nice one with a different approach. Is it a blog post that you converted to a presentation?

      What I think is that the order of the components and their responsibilities may differ depending on the conditions; both approaches serve a "similar" purpose with dynamic topic generation and payload manipulation (if required). In our case, strategically, we decided to have the logic in a service (CAP) rather than in CI itself, exposed as an API doing the "similar" mapping / decision matrix.

      Thanks for the different perspective..

      Just a quick question: in your case I saw that the S/4HANA event is first published to Advanced Event Mesh. Which mechanism did you use to publish this information from S/4HANA to the "Advanced" Event Mesh?




      Maximiliano Colman

      Hi Baris,

      It was a word document to pdf.

      About publishing the events from S/4, it's a tricky question. The standard solution from SAP only supports SAP Event Mesh, and the official reply from SAP is to push them via webhooks to SAP Advanced Event Mesh:



      There are add-ons on the market for SAP that allow you to connect directly to SAP Advanced Event Mesh, like ASAPIO, via REST APIs:




      Baris Buyuktanir
      Blog Post Author

      Hi Max,

      This is exactly why I asked, hoping that I was missing something (a third option beyond the ones below).

      We have implemented the S/4 -> EM -> AEM scenario; however, it's nothing but using EM as a pass-through, meaning two event brokers for the same purpose, which doesn't make sense other than as a temporary solution.

      The ASAPIO commercial package seems capable; we are investigating it.

      In fact, SAP also has an enablement package on their roadmap (pointing to Q3 2023), but the content and pricing are not yet clear.

      I was also thinking about capturing the event from S/4HANA and publishing it with a REST call, but that requires manual effort and development, which these add-ons already handle, with additional features.


      Maximiliano Colman

      Well, that sounds really interesting. It seems a new capability called "Event broker", supporting Solace / Advanced Event Mesh, will be added to Integration Suite.