The growing demand for rapid digital interactions and an expanding ecosystem means IT departments need to spend more time eliminating silos and bridging services, while handling ever more complex, immense volumes of data in real time, bringing new services to market faster, and keeping cost and resource management in check. 

This drives a drastic change in how systems are interconnected: from plain API calls to a more scalable event-driven architecture. The development style has shifted from heavyweight, service-oriented applications to small, quick-turnaround snippets of code. A serverless platform helps lower operating cost and reduces packaging and deployment complexity. 

SAP is a world-leading enterprise business processing solution, so there will always be a need to connect an organization's core to other SaaS offerings, partners, or even another SAP solution. Red Hat Integration offers the flexibility, adaptability, and speed, with the framework and software to build an event-driven integration architecture that not only connects the systems but also maintains data consistency across platforms. 

 

An Architectural Overview


This is how Red Hat Integration can help achieve modernized integration with SAP. 

SAP exposes business functionality through the NetWeaver Gateway. Developers can use Camel K or Camel in Red Hat Integration (RHI) to integrate bi-directionally with these functionalities. Camel K/Camel not only connects the dots but also provides a set of built-in patterns and data transformation components that make customized integration easy. Integrations can be deployed as a serverless function, a serverless source/sink, or a long-running microservice. 

At the time of writing, SAP's event enhancements are not yet fully supported, so developers still need to retrieve the actual data through other methods such as OData and APIs. To implement a true event-driven architecture, AMQ Streams (Kafka) can be used as the event stream store to handle the streaming of events, decoupling the systems and achieving near real-time latency. 
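Since full event support is not yet available, one common pattern is to poll SAP through OData on a schedule and publish the results as events to an AMQ Streams topic, from where other consumers can pick them up. Below is a minimal sketch; the SAP URL, topic name and broker address are placeholder assumptions.

Example (sketch):
import org.apache.camel.builder.RouteBuilder;

public class SapToKafkaRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Poll the SAP NetWeaver Gateway endpoint every 30 seconds and publish
        // each payload to an AMQ Streams (Kafka) topic.
        // URL, topic and broker address are placeholders.
        from("timer:sap-poll?period=30000")
            .to("http://demo.sap.io/SalesOrder")
            .to("kafka:sap-sales-orders?brokers=my-cluster-kafka-bootstrap:9092");
    }
}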

Since the system is based on events, we can also capture changes of data state in databases using Debezium, keeping all data consistent by passing the updated state back to SAP. 
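As a sketch of that change-data-capture piece, Camel ships Debezium components (camel-debezium-postgres is used here) that stream row-level database changes into a route, which can then push the updated state back to an SAP endpoint. The connection details below are placeholders, and the exact option names vary by Camel/Debezium version.

Example (sketch):
import org.apache.camel.builder.RouteBuilder;

public class CdcToSapRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Capture row-level changes from a PostgreSQL database with Debezium and
        // forward the change to SAP to keep both sides consistent.
        // Hostname, credentials, database name and the SAP URL are placeholders.
        from("debezium-postgres:sales-cdc"
                + "?databaseHostname=postgres.example.com"
                + "&databasePort=5432"
                + "&databaseUser={{db.user}}"
                + "&databasePassword={{db.password}}"
                + "&databaseDbname=sales"
                + "&offsetStorageFileName=/tmp/cdc-offsets.dat")
            // Map the Debezium change event into the payload SAP expects
            // (transformation omitted in this sketch).
            .log("change captured: ${body}")
            .to("http://demo.sap.io/SalesOrder");
    }
}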

When there is a need to expose functions or services as API endpoints, we can easily implement them with Camel using the OpenAPI specification, and have the APIs managed and secured by the 3scale API Management platform. 
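For example, Camel's REST DSL can expose a route as an HTTP API and serve an OpenAPI document for it (with the camel-openapi-java module on the classpath), which 3scale can then import to manage and secure. The paths, titles and SAP service URI below are placeholders.

Example (sketch):
import org.apache.camel.builder.RouteBuilder;

public class SalesOrderApiRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Expose a REST API and publish its OpenAPI document at /openapi.json,
        // ready to be imported into 3scale API Management.
        restConfiguration()
            .component("platform-http")
            .apiContextPath("/openapi.json")
            .apiProperty("api.title", "Sales Order API")   // placeholder title
            .apiProperty("api.version", "1.0");

        rest("/salesorders")
            .get("/{id}")
                .to("direct:getSalesOrder");

        // Look up the order in SAP through OData; the service URI is a placeholder.
        from("direct:getSalesOrder")
            .to("olingo4://read/SalesOrder?serviceUri=https://demo.sap.io/odata");
    }
}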

OpenShift is a platform that runs on the major cloud vendors as well as on-premises, so it is truly cloud agnostic. It provides the serverless platform to deploy and manage all the functions, and with Interconnect we can broadcast events to the closest data center to optimize traffic. 

As a result, the architecture is ready to connect to endless third-party and partner services, stream large amounts of edge signals, and provide real-time processing for edge devices. Legacy and mainframe systems can also be part of the ecosystem. Lastly, this is a good approach for SAP-to-SAP integration too.



Technical Dive


Connect with Camel


SAP offers HTTP-based interfaces such as OData v4, OData v2, RESTful APIs and SOAP, or you can use the classic RFC (Remote Function Call) and IDoc messages. There is also a recent event enablement add-on that offers the AMQP and MQTT protocols. 

The Camel feature in RHI allows you to seamlessly connect with any of these protocols. Developers simply configure the connection to the endpoint with its address, credentials and/or SSL settings. 

Example: 

Camel code connecting to OData v4 (Olingo4 component)
.to("olingo4://read/SalesOrder")

[NOTE]: https://camel.apache.org/components/latest/olingo4-component.html

 

Camel code connecting to a RESTful API
.to("http://demo.sap.io/SalesOrder")

[NOTE]: https://camel.apache.org/components/latest/http-component.html 
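The address, credentials and SSL settings themselves are usually externalized rather than hard-coded. One way to do this (a sketch with placeholder property names) is Camel property placeholders, supplied at kamel run time with -p or mounted from a secret.

Example (sketch):
// "sap.base.url", "sap.user" and "sap.password" are placeholder property names,
// e.g. supplied with: kamel run integration.java -p sap.base.url=https://demo.sap.io
from("timer:poll?period=60000")
    .to("{{sap.base.url}}/SalesOrder"
        + "?authMethod=Basic"
        + "&authUsername={{sap.user}}"
        + "&authPassword={{sap.password}}");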

 

OData v4, RESTful APIs and the event protocols are better suited for serverless, as OData v4 has significantly better performance than the older version and supports analytical applications, whereas OData v2, RFC and IDoc are better used in a traditional Camel project. 

 

What components to use for SAP endpoints

Serverless (Camel K / Camel Quarkus):  OData V4, RESTful API, Events
Traditional Camel:                     OData V2, SOAP, RFC/IDoc
 

After receiving the payload, Camel can use its built-in data format components to transform it. In the serverless case, the data is mostly JSON. By marshalling and unmarshalling the incoming payload, we can easily access the values and retrieve the content we need. 

 

Example: 
.marshal().json()
.to("kafka:mytopic")

[NOTE]: https://camel.apache.org/manual/latest/json.html
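The reverse direction works the same way: unmarshalling an incoming JSON payload turns it into a map-like structure whose values the route can read, for example to filter on a field before sending the message on. The topic, field and channel names below are placeholders.

Example (sketch):
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.JsonLibrary;

public class UnmarshalRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Unmarshal the JSON payload so individual values can be accessed with
        // the simple language. Topic, field and channel names are placeholders.
        from("kafka:sap-sales-orders?brokers=my-cluster-kafka-bootstrap:9092")
            .unmarshal().json(JsonLibrary.Jackson)
            .filter(simple("${body[Status]} == 'OPEN'"))
            .to("knative:channel/open-orders");
    }
}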

 

For mapping between the incoming and outgoing payloads, there is visual tooling (AtlasMap) that lets you design the mapping and then run that data mapping via the Camel engine. 


[NOTE]: https://www.atlasmap.io/


 

Make use of the useful patterns in Camel; they are a direct implementation of the Enterprise Integration Patterns, which organize and define the most commonly used patterns and behaviours of integration applications. 

Example:

Split the stream with the token "," and process it in streaming mode:
.split(body().tokenize(",")).streaming()
.to("knative:mychannel")

[NOTE]: https://camel.apache.org/components/latest/eips/split-eip.html

 

Flexible workload with Red Hat Serverless


There isn’t much Camel K developers need to worry about when turning a long-running application “serverless”. The first thing to do is make sure Red Hat Serverless is installed on the OpenShift platform (this should be done by the platform admin), and you are ready to go. 

Camel K detects whether Serverless is available and creates the services needed to run serverless. There are, however, two aspects of Serverless that you might want to take a closer look at. 
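As a minimal sketch (the endpoint and service URI are placeholders), an integration that exposes an HTTP endpoint is all Camel K needs to materialize a Knative Service on a cluster where Serverless is installed, so the integration scales to zero when idle.

Example (sketch):
import org.apache.camel.builder.RouteBuilder;

// Deployed with: kamel run SalesOrderLookup.java
// When OpenShift Serverless (Knative) is detected, Camel K creates a Knative
// Service for this integration instead of a plain Deployment.
public class SalesOrderLookup extends RouteBuilder {
    @Override
    public void configure() {
        from("platform-http:/salesorder")
            .to("olingo4://read/SalesOrder?serviceUri=https://demo.sap.io/odata"); // placeholder URI
    }
}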

 

AutoScaling setup 

You can scale the replicas of an application/function to closely match incoming demand. When the administrator sets up the cluster for Serverless, they will already have configured the autoscaler that applies globally to the cluster; it watches the traffic flow and scales accordingly. You can, however, override the settings by using the “knative-service” trait in Camel K. 

Example:
kamel run --trait knative-service.autoscaling-class=hpa.autoscaling.knative.dev --trait knative-service.autoscaling-metric=concurrency integration.java
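The same trait can also bound how far an integration scales; the property names below are assumed from the knative-service trait.

kamel run --trait knative-service.min-scale=1 --trait knative-service.max-scale=10 integration.java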

Eventing setup

Eventing enables late binding of event sources and event consumers. The cluster admin should have already set up the underlying layer that stores the events; note that some options are not persistent and may perish when nodes restart. 
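A sketch of what late binding looks like from the integration side (the channel, chat id and token are placeholders): one integration publishes SAP changes to a Knative channel, and consumers such as the one below subscribe to it independently, for example to forward events to Telegram as in the demo that follows.

Example (sketch):
import org.apache.camel.builder.RouteBuilder;

public class OrderEventConsumer extends RouteBuilder {
    @Override
    public void configure() {
        // Late-bound consumer: subscribes to a Knative channel fed with SAP order
        // events by another integration. Channel name, chat id and token are placeholders.
        from("knative:channel/sap-orders")
            .log("received SAP order event: ${body}")
            .to("telegram:bots?chatId={{telegram.chat.id}}&authorizationToken={{telegram.token}}");
    }
}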

 

Here is a quick demo showing how to integrate SAP with a third-party service (Telegram) using Camel K with Knative Eventing.


Summary


This is a quick overview of how we can use Camel K to build a true event-driven serverless integration from SAP to third-party services or other SAP modules, using Red Hat Integration Camel's OData v4 connector to interact with the exposed SAP functions. Deploying on the OpenShift Serverless platform automatically turns the integration application serverless.