What is an Advanced Event Mesh (Solace) Connector

"Advanced Event Mesh(AEM) Connectors are assets that enable integration between AEM event brokers and various external systems, applications, and cloud services."

Acting as a bridge between the broker and other systems, connectors facilitate the integration of AEM messaging capabilities into different environments, such as cloud-native applications, IoT platforms, microservices architectures, and traditional enterprise systems. They provide a standardized way to connect the messaging platform with these environments, ensuring interoperability and ease of use. This role is crucial: it enables seamless integration and communication between AEM event brokers and diverse environments when developing and building scalable, event-driven architectures and distributed systems.

If you're familiar with SAP BTP Cloud Integration, you can think of them as similar in concept to SAP BTP Integration Suite Open Connectors.

There are different types of connectors, distinguished by where they operate and by who develops and distributes them (Solace or third parties).

You also have the flexibility to develop your own connectors.

Roughly, there is a rich set of connectors, including:

  • connectors integrated within the broker itself, communicating to and from target systems via standard protocols;
  • connectors for other event brokers;
  • connectors to/from analytics and stream-processing platforms;
  • connectors for iPaaS providers beyond SAP BTP Cloud Integration, such as MuleSoft Anypoint and Dell Boomi;
  • connectors deployed to the source/target systems themselves, such as ASAPIO.

For the growing list of connectors and their details, you can visit:

https://solace.com/integration-hub/


SCENARIO

In this blog post, we are going to build a scenario using Advanced Event Mesh Connectors, particularly the Amazon S3 Producer, to showcase how easy it is to connect the AEM event broker to an Amazon S3 bucket, with S/4HANA simulated as the source system.

For simplicity's sake and to focus on the main purpose of the blog post, we are going to refer to S/4HANA as the source system. However, we will simulate the S/4HANA event publishing part using Postman. This simulation will yield the same inbound results from an Advanced Event Mesh perspective as if they were published from S/4HANA. (Different variations of S/4HANA's event publishing to Advanced Event Mesh are described in different blog posts, which you can find in the community.)

 

ARCHITECTURE

The architecture below will be used end-to-end.

Architecture and Flow

Along with showcasing AEM Connectors, this architecture is practical and could potentially serve as a solution for modernizing legacy architectures characterized by point-to-point or file-based integrations; for example, use cases where S/4HANA needs to integrate with third-party applications via SFTP/file-based interfaces, which often come with problems.

With the implementation of event-driven architecture via AEM and connectors like this, and by leveraging the numerous advantages offered by cloud-based object stores (e.g., AWS S3), customers can realize immediate benefits as well as strategic ones for future use cases, directly and indirectly.

As a result, this kind of architecture, with tools like connectors, provides much greater flexibility and simplicity.

At a very high level, the architecture represents the following flow:

  • A source system (like S/4HANA) publishes new customers via events.
  • SAP BTP, Advanced Event Mesh connects via connectors to the AWS S3 bucket and puts the content (file) into the bucket.

Then a third-party system of your preference can consume this object (file) in a standardized and secure way.

 

CONFIGURATION STEPS

To configure this scenario, the following steps will be applied.

Configuration of AWS S3 Bucket

Quite simply, you need the AWS S3 service enabled and an S3 bucket configured for this purpose.

I have created one and named it bb-s4-store-s4hana.

Amazon S3 Bucket

You also need to configure the AWS IAM service: a user with access to the bucket, and an access key for that user.

This key will later be used in the configuration of Advanced Event Mesh S3 Bucket Connector.

Access Key for the Bucket
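Before moving on, a quick way to confirm that the access key can actually reach the bucket is a small check from your own machine. Below is a minimal sketch in Python with boto3; the key values and the test object name are placeholders for the ones created above:

  import boto3

  # assumption: the access key pair created for the IAM user above
  s3 = boto3.client("s3",
                    aws_access_key_id="<access-key-id>",
                    aws_secret_access_key="<secret-access-key>")

  # Write a test object, then count objects to verify that access works
  s3.put_object(Bucket="bb-s4-store-s4hana",
                Key="connectivity-test.txt",
                Body=b"access key check")
  print(s3.list_objects_v2(Bucket="bb-s4-store-s4hana").get("KeyCount"))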

Configuration of Advanced Event Mesh Connector: S3 Bucket Provider

From the left-side menu of Broker Manager (the Connectors link), you can access the connectors that Advanced Event Mesh (Solace) provides out of the box.

Initial Settings

Add a new one and choose Amazon AWS > AWS S3.

Connectors for AWS, Azure, Google Cloud

AWS Connectors

Use the access key and bucket information configured in the first step.

Wizard - Authentication

The next step covers the client profile and the Amazon S3 host (you can use the defaults for a demo).

Wizard - Connection Information

Configuring the file name and subscription

File Name Mapping Function

This part is the core of the configuration.

The file name mapping function determines the object name in the S3 bucket (which eventually becomes your file name). What the Advanced Event Mesh connector (ultimately the RDP) conveniently allows here is the use of functions and variables, so that you can make your file name dynamic and informative. You can use Substitution Expressions, a Solace-specific expression language used to replace specific text attributes (request targets, request headers, etc.) with system-generated output.

More information can be found at the link below.

https://docs.solace.com/Messaging/Substitution-Expressions-Overview.htm

As I wanted my file name to be

customer-created-XXXXXXXXXXXX.csv

with a timestamp, I utilized the expressions/functions below in the naming of the object (a sketch of the combined expression follows this list):

  • unixTime(): Returns the number of seconds since midnight, January 1, 1970 UTC for the specified timestamp, ignoring leap seconds; this is known as Unix time or epoch time. It keeps the file name unique and informative about the file's creation time.
  • topic(N): Returns part or all of the message topic. I used the first and second segments of the topic in the file name, which are “customer” and “created”.
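Putting the two together, a mapping expression along the following lines should yield the pattern above. This is a sketch only, assuming the ${...} wrapper syntax described in the Substitution Expressions documentation linked above:

  ${topic(1)}-${topic(2)}-${unixTime()}.csv

For a customer/created event, topic(1) resolves to “customer”, topic(2) to “created”, and unixTime() to the epoch seconds, producing a name like customer-created-1718000000.csv.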

Subscriptions

These are the topics to which my (generated) queue is subscribed, triggering the creation of objects to be pushed to S3 when an event on these topics is published. Specifically, I want the object (file) to be created when a 'customer/created' event is published.

My final File Name and Subscriptions settings are as follows.

Wizard - File Name Mapping and Subscriptions

As a summary, the AEM connector wizard shows the following objects to be created.

Result – Assets (Artifacts) created

Technically, behind the scenes, the connector's quick 2-3 step process creates an RDP client with a REST consumer, a queue, and a queue binding, and attaches the related subscriptions, everything we would otherwise have to set up when configuring manually. While auto-configuring these assets, it also takes care of the AWS authentication, schemes, and remote call details.

As mentioned, this could also be done through manual steps. The details of how you can manually set up the RDP and substitution expressions for this and other purposes can be found in my other post below:

https://community.sap.com/t5/technology-blogs-by-members/advanced-event-mesh-dynamically-publishing-...
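To give an idea of what the wizard automates, here is a minimal sketch (Python, against the SEMP v2 config REST API) of how the queue and its topic subscription could be created by hand; the host, VPN name, credentials, and queue name are placeholder assumptions:

  import requests

  SEMP = "https://<broker-host>:<semp-port>/SEMP/v2/config"  # assumption: your SEMP base URL
  VPN = "<message-vpn>"                                      # assumption: your message VPN name
  AUTH = ("<admin-user>", "<admin-password>")                # assumption: SEMP credentials

  # Create the queue that buffers events for the RDP
  requests.post(f"{SEMP}/msgVpns/{VPN}/queues", auth=AUTH, json={
      "queueName": "s3-producer-queue",  # hypothetical name
      "accessType": "exclusive",
      "ingressEnabled": True,
      "egressEnabled": True,
  }).raise_for_status()

  # Subscribe the queue to the event topic
  requests.post(f"{SEMP}/msgVpns/{VPN}/queues/s3-producer-queue/subscriptions",
                auth=AUTH, json={"subscriptionTopic": "customer/created"}).raise_for_status()

  # The RDP itself, its REST consumer, and the queue binding live under
  # /msgVpns/{VPN}/restDeliveryPoints and are created in the same fashion.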

"As a result; all this setup is done within seconds, it's up and running.."

 

Resulting AEM Artifacts Created

TESTING

We will be using Postman to simulate S/4HANA event creation, making a direct REST call to Advanced Event Mesh.

Payload

The message payload will be a CSV with a customer name and a GUID on each line, using the header line below:

"Customer Name";"GUID"

Topic

Our topic will be customer/created for simplicity.

Postman REST Call
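If you prefer a script over Postman, a minimal sketch in Python can make the same call. The broker host, port, and client credentials are placeholder assumptions; POSTing the payload to the topic path follows Solace's REST messaging convention:

  import requests

  BROKER = "https://<broker-host>:9443"       # assumption: AEM REST messaging endpoint
  AUTH = ("<client-username>", "<password>")  # assumption: broker client credentials

  # Illustrative CSV payload with the header line from above
  payload = ('"Customer Name";"GUID"\n'
             '"ACME Corp";"0a1b2c3d-0000-0000-0000-000000000001"\n')

  # POSTing to /customer/created publishes the payload as an event on that topic
  resp = requests.post(f"{BROKER}/customer/created",
                       data=payload.encode("utf-8"),
                       headers={"Content-Type": "text/csv"},
                       auth=AUTH)
  resp.raise_for_status()  # 200 means the broker accepted the message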

And voilà! When we check the AWS S3 bucket after publishing two events, there are two objects (files).

Result - Files (objects) in the bucket for consumer / target system

And the content of a sample file, which is basically the event message payload, is as follows:

File Content

Once the file is in the S3 bucket, you can reach and download it securely from a third-party application.
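For example, a third-party consumer could fetch the newest object with a few lines of Python and boto3; this is a minimal sketch, assuming an IAM user with read access to the bucket:

  import boto3

  # assumption: credentials of an IAM user with read access to the bucket
  s3 = boto3.client("s3",
                    aws_access_key_id="<access-key-id>",
                    aws_secret_access_key="<secret-access-key>")

  # List the objects created by the connector and download the newest one
  objects = s3.list_objects_v2(Bucket="bb-s4-store-s4hana")["Contents"]
  latest = max(objects, key=lambda o: o["LastModified"])
  body = s3.get_object(Bucket="bb-s4-store-s4hana", Key=latest["Key"])["Body"].read()
  print(latest["Key"])
  print(body.decode("utf-8"))  # the CSV event payload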

 

FINAL WORDS & TAKEAWAYS

With a very quick and simple scenario, and with the help of Advanced Event Mesh Connectors, we demonstrated how easy it is to set up an end-to-end use case with a modern event-driven implementation.

At the same time as event-enabling our scenario, we seamlessly connected our broker to Amazon S3.

Imagine needing to publish the same information to Azure Functions as well. All that is required is configuring the Azure Functions connector for that purpose; nothing else needs to change to extend the overall scenario. The use case can be extended further by calling other custom REST endpoints (RDPs), or by subscribing to the same topic with different types of consumers that receive and process the same event, still without affecting the publisher or the other subscribers.

The possibilities with these kinds of setups are almost “limitless”. They are easy to maintain once properly established and straightforward to configure with the connectors and event-driven architecture.
