Robert Eijpe

Lessons learned implementing an Event-Driven Cloud Integration Architecture with S/4HANA Cloud, SAP Cloud Integration, and SAP Event Mesh

My company was challenged when a client asked us to support their integration strategy to connect S/4HANA public cloud with their production plant equipment. The client planned to replace their Microsoft ERP application with S/4HANA Cloud, and to do so with minimal impact on the existing connected systems.

S/4HANA would support the selling, planning, purchasing, and invoicing processes and the master data for products and customers. Other applications would cover the recipe-based bill of materials for (semi-)finished products, the control of the production process, quality checks, and the needed transport documents. We had to integrate these different systems to support the production process, resulting in many integration flows in which data must be exchanged.

With S/4HANA on-premises and ‘classic’ ERP systems, we mostly fall back to the traditional way of integrating and modifying the backend ERP. But in S/4HANA Cloud, interfering with existing processes is limited to the extension and integration points provided by SAP. Another challenge was the client’s wish to eliminate any delay in the production process caused by the integration with their ERP system, which had been an issue in their ‘old’ running process.

The challenge

We reframed our challenge: find an architecture that integrates all systems, cloud and on-premises, as if they were one application working almost in real time. We called it an Event-Driven Cloud Integration Architecture.

We knew that SAP S/4HANA supports events. By choosing SAP Event Mesh and SAP Cloud Integration, together with SAP ALM and SAP HANA Cloud on SAP BTP, we had all the pieces we needed to implement our event-driven architecture for integrated systems.

Technology alone was not enough for a successful implementation. We also needed an implementation strategy and guidelines, which wasn’t as easy as it sounds. The SAP API Business Hub provides information about the S/4HANA events but does not explain how to integrate them with SAP Cloud Integration. The Discover section of SAP Cloud Integration didn’t provide any guidelines or examples connected to S/4HANA events either. In short, we did not find an end-to-end implementation guide for an Event-Driven Cloud Integration Architecture from SAP.

During our preparation for the implementation phase, we found out why, and the answer is simple. The events in S/4HANA are raised at such a high level that you don’t know in detail what happened or why, and the possible follow-up actions are so diverse that they differ per industry. In most cases, an S/4HANA event must be followed by an API call to get the details, and only based on that detailed information can its relevance for the receiving system be determined. Even when an event is raised, it can mean different things for S/4HANA and for the connected systems. In other words, a change of a record in one system can result in the creation of a record in another system, or the change will be ignored when it is not relevant to the receiving system.

Let’s explain this with an example from our project. S/4HANA raises a product-created event when we create a material in the SAP system. But for one of our interfaces, we are only interested in finished products. To find out whether a raised product event belongs to a finished product, we first need to call an SAP API for the product details and use that information to filter out events unrelated to finished products. And when materials are changed in S/4HANA, we have to check not only whether the material is a finished product but also whether the changed fields are relevant for our interface. When those fields aren’t touched, the raised S/4HANA product event is irrelevant and can be ignored. So we found that an event in S/4HANA does not mean the same thing for the connected systems.
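The relevance check described above can be sketched as follows. This is a minimal sketch: the event payload shape and the `get_product_details` stub are assumptions standing in for the real S/4HANA event payload and the product API call.

```python
# Minimal sketch of the relevance check. The payload shape and
# get_product_details() are assumptions standing in for the real
# S/4HANA event payload and the Product Master API call.

FINISHED_PRODUCT_TYPE = "FERT"   # SAP's product type code for finished goods
RELEVANT_FIELDS = {"GrossWeight", "BaseUnit"}   # fields our interface uses

def get_product_details(product_id: str) -> dict:
    """Stub for an authenticated OData call to the product API."""
    return {"Product": product_id,
            "ProductType": "FERT",
            "ChangedFields": {"GrossWeight"}}

def is_relevant(event: dict) -> bool:
    """Keep only finished-product events whose changed fields matter."""
    details = get_product_details(event["Product"])
    if details["ProductType"] != FINISHED_PRODUCT_TYPE:
        return False                    # not a finished product: ignore
    if event["EventType"] == "ProductChanged":
        # Ignore changes that touch none of the fields our interface uses.
        return bool(details["ChangedFields"] & RELEVANT_FIELDS)
    return True                         # e.g. ProductCreated

print(is_relevant({"Product": "FG-100", "EventType": "ProductChanged"}))  # True
```

In a real flow, the stub would be an HTTP call from SAP Cloud Integration, and only events passing this filter would continue.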

Common Data Events

To overcome this problem, we decided to introduce our Common Data Events (CDE) concept. In this concept, we define a CDE as an event, together with its data, that is relevant only to our organization. We also split our integration into two phases:

  • the enrich CDE phase
  • the dispatch CDE phase

In the first phase, we transform an application event into an agnostic, application-independent common data event, enrich it with applicable data based on a common data model (CDM) specific to our organization, and raise it. In the second phase, we react to this CDE and process the event.
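The first phase can be sketched as a small transformation function. The CDM field names and the event shape below are illustrative assumptions, not the project’s actual common data model.

```python
# Sketch of phase one: turning an application-specific event into an
# application-agnostic Common Data Event. The CDM field names are
# illustrative assumptions, not the project's actual common data model.

def to_common_data_event(s4_event: dict, details: dict) -> dict:
    """Enrich a raw S/4HANA event into a CDE based on the CDM."""
    action = "create" if s4_event["EventType"].endswith("Created") else "change"
    return {
        "cdeType": "product",        # the CDM entity, not the raw event name
        "action": action,
        "source": "S4HANA",
        "data": {                    # only CDM fields, enriched via API calls
            "productId": details["Product"],
            "productType": details["ProductType"],
        },
    }

cde = to_common_data_event(
    {"EventType": "ProductCreated", "Product": "FG-100"},
    {"Product": "FG-100", "ProductType": "FERT"},
)
print(cde["action"])   # -> create
```

The receiving side only ever sees this CDE shape, never the raw S/4HANA event.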


Enrich CDE phase

In our project, we designed Common Data Events for all our entities by finding all integration scenarios that use the same S/4HANA events and identifying the data they need.

We ensure data consistency by collecting all necessary raised S/4HANA events in an SAP Event Mesh queue. A webhook on this queue starts a Cloud Integration flow that enriches the event with detailed data by calling one or more APIs. The flow also filters events on their relevance. Only enriched, relevant events are raised again as Common Data Events on SAP Event Mesh for further processing.
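The enrich phase can be sketched as a loop over a queue. Plain Python deques stand in for the SAP Event Mesh queues, and the `enrich` and `is_relevant` stubs stand in for the API calls and the relevance check; all names here are assumptions.

```python
# Sketch of the enrich phase as a queue-processing loop. The deques stand
# in for SAP Event Mesh queues; enrich() and is_relevant() stand in for
# the API calls and the relevance check. All names are assumptions.
from collections import deque

raw_queue = deque([{"EventType": "ProductCreated", "Product": "FG-100"},
                   {"EventType": "ProductChanged", "Product": "RAW-7"}])
cde_queue = deque()

def enrich(event: dict) -> dict:
    """Stub for calling one or more S/4HANA APIs for detailed data."""
    is_finished = event["Product"].startswith("FG")
    return {**event, "ProductType": "FERT" if is_finished else "ROH"}

def is_relevant(enriched: dict) -> bool:
    """Stub for the relevance check done by the CAP application."""
    return enriched["ProductType"] == "FERT"

while raw_queue:
    enriched = enrich(raw_queue.popleft())
    if is_relevant(enriched):           # irrelevant events are dropped here
        cde_queue.append({"cdeType": "product", "data": enriched})

print(len(cde_queue))   # -> 1 (only the finished product survives)
```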

We developed an SAP CAP application to support this process. It checks the relevance and meaning of S/4HANA events and limits both the processing needed in SAP Cloud Integration flows and the irrelevant traffic between SAP Cloud Integration and external systems. This app was the missing glue for our event-driven integration solution.

Event-Driven Integration Architecture

Now that we have implemented this enrich Common Data Event phase, we have become flexible: all relevant data belonging to our event action type is available in an Event Mesh queue. From there, we can dispatch Common Data Events to multiple receiving applications, in both push and pull variants of notification and integration scenarios.

In notification scenarios, applications handle the Common Data Event by themselves. These are primarily custom-built applications, such as SAP CAP applications, or other messaging systems like Azure Event Grid or Amazon Simple Queue Service, which take care of the follow-up processes.

In our project, we focused on integration scenarios that fulfill a data exchange process between applications. Depending on the scenario, we need to translate the CDE into a predefined action, primarily an API call or the creation of a flat file. We map the CDM data to the structure of the receiving system and map the content to the expected values. Again, we use our SAP CAP application to determine the needed action in the receiving system. This helps us limit the number of calls (license costs) and the modifications of the receiving systems (development and testing costs).
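The mapping step of the dispatch phase can be sketched with two lookup tables: one for field names and one for content values. The target field names and both mapping tables below are hypothetical.

```python
# Sketch of the dispatch-phase mapping: CDM fields and values are mapped
# to the structure and content the receiving system expects. The target
# field names and both mapping tables are hypothetical.

FIELD_MAP = {"productId": "ItemCode", "productType": "ItemGroup"}
VALUE_MAP = {"ItemGroup": {"FERT": "FinishedGoods"}}

def map_cde_to_target(cde: dict) -> dict:
    """Translate a CDE payload into the receiving system's structure."""
    target = {}
    for cdm_field, value in cde["data"].items():
        target_field = FIELD_MAP.get(cdm_field)
        if target_field is None:
            continue                  # field not used by this receiver
        # Map the content to the values the receiver expects.
        target[target_field] = VALUE_MAP.get(target_field, {}).get(value, value)
    return target

payload = map_cde_to_target(
    {"cdeType": "product",
     "data": {"productId": "FG-100", "productType": "FERT"}})
print(payload)   # -> {'ItemCode': 'FG-100', 'ItemGroup': 'FinishedGoods'}
```

Each receiving system gets its own pair of tables, so adding a receiver does not touch the enrich phase or the other flows.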


Based on our experience with this client, we can conclude that SAP S/4HANA and the SAP BTP Integration Suite are mature enough to run an Event-Driven Cloud Integration Architecture. But it also brings challenges for our client. Every action in a related S/4HANA transaction results in a small message, which must be handled in real time and in the proper context. Together with our client, we rethought how to approach the implementation and how they can support integration processes in the future, because it is entirely different from traditional integration.

But it brings much more value to our clients. They can now choose best-of-breed software and custom-developed applications, which run like one integrated application without boundaries. Our client can now run and scale up their production process with higher quality in real time, without limitations imposed by the applications used. And it brings them many advantages in terms of cost, risk, implementation time, and support:

  • The CDE concept limits the number of calls to backend systems: the backend is called only once per event, to enrich the event data.
  • It improves security: receiving applications never call S/4HANA APIs directly, and only relevant CDEs with limited data are made available to the receiving systems.
  • S/4HANA events and APIs need to be configured only once, not for every integration flow.
  • The CDE concept is agnostic and also fits non-S/4HANA events, making it easy to attach new applications and extend the integrated solution.
  • With the help of our custom-built application, events depend on the relevant data rather than on the predefined high-level event types of S/4HANA. This simplifies the design of integration flows (development and support costs) and limits the number of calls to the receiving systems (license costs).
  • The two phases in our CDE concept decouple the event-raising part (the S/4HANA integration) from the event-handling part (the integration flow connecting the receiving system). This makes it easy to add new integration flows or change existing ones without interfering with other flows or the event-raising part.

We would like to hear from you about our Common Data Events concept and the Event-Driven Cloud Integration Architecture we chose. As you can see, a lot of lessons were learned. There is much more to tell and explain, but I hope this gives you a first insight into the value of an Event-Driven Cloud Integration Architecture.

      Martin Stenzig

      Robert, great article. Did you account for any of the technical limits Event Mesh has at the moment (as per the documentation)?

      • The maximum message size is 1 MB for all messaging protocols.
      • The maximum storage space for all messages in all the queues per subaccount is 10 GB. If messages are above 1 MB, the AMQP 1.0 over WebSocket and MQTT 3.1.1 over WebSocket connections are closed. It’s applicable for applications running on Cloud Foundry or on other platforms.
      1. Did you put anything into the Integration Suite to check against the 1 MB message size limit?
      2. Depending on how many messages you are firing, how big the enriched messages are, and what system outages you are accommodating, the 10 GB limit is probably not a concern?



      Robert Eijpe
      Blog Post Author

      Hi Martin,
      Thanks for your positive feedback.

      In our case, we didn't take the maximum message size into account, because the messages aren't that big: mostly bytes, not even kilobytes. Events mostly relate to single objects, and interfaces mostly need a delta of the available SAP fields. And even though we have a high throughput of messages, we didn't face any problem with the maximum storage space, because most messages are processed directly and removed from the queue.

      But even if the message size becomes too big, the CDE approach will still work. In that case, you shouldn't send the complete message to Event Mesh; instead, you persist the message in the Cloud Integration flow of the enrich CDE phase and send only the key fields of the stored message to Event Mesh.

      In the cloud integration flow of the CDE dispatch phase, you can then retrieve the data and process it further.
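As a sketch of this claim-check approach, with in-memory stand-ins for the Cloud Integration data store and the Event Mesh queue (all names here are illustrative):

```python
# Sketch of the claim-check approach: persist the large payload, put only
# a small key message on the queue, and fetch the payload back in the
# dispatch flow. The dict and list are in-memory stand-ins for the Cloud
# Integration data store and the SAP Event Mesh queue.
import uuid

payload_store = {}   # stands in for persistence in the enrich CDE flow
queue = []           # stands in for the SAP Event Mesh queue

def publish_large_cde(payload: dict) -> str:
    """Store the payload; enqueue only a small claim-check message."""
    key = str(uuid.uuid4())
    payload_store[key] = payload
    queue.append({"cdeType": "product", "claimCheck": key})  # well under 1 MB
    return key

def dispatch_next() -> dict:
    """In the dispatch flow, retrieve the full payload by its key."""
    message = queue.pop(0)
    return payload_store.pop(message["claimCheck"])

publish_large_cde({"productId": "FG-100", "spec": "x" * 2_000_000})
print(len(dispatch_next()["spec"]))   # -> 2000000
```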

      Hope this will answer your questions.


      Vinod Patil

      Hello Robert,

      This is a really nice blog. I liked the idea of coming up with the CDE framework.

      I think events with minimal information are not a sustainable solution from SAP.

      Aren't IDocs with immediate processing a better way to send data out of SAP than an event followed by an extra API call? 🙂

      Did you explore Advanced Event Mesh from BTP? I believe it solves this problem.