Recently I had the opportunity to work with Sanket Taur (IBM UK) and his team on a demo showcasing how Red Hat products can help speed up innovation with SAP landscapes. To be honest, I was shocked at how little time we were given to create the entire demo from scratch: less than a week, while still doing our day jobs, with only a couple of hours per day to work on it. I can't think of stronger proof of how agile and fast a cloud solution can be, from development to production.

I STRONGLY encourage you to attend Sanket’s session for more details; this post is JUST my view on the demo and the things I did to get it running. The demo was a simple approval process for Sales Orders. The SOs are created in the core SAP platform (in this case ES5), so we needed to create an application that talks to the core SAP platform and retrieves all the data required.


First things first, we need a Kubernetes (k8s) platform. I then used Camel K -- an enhanced framework based on Camel (part of the Red Hat Integration product) -- to create the application. There was some mix-up during setup: instead of an OData v4 endpoint from ES5 for the SO, line items and customer details, I was given an OData v2 endpoint. (Needless to say, OData v4 is much more efficient than v2, so please do update it when you have the chance.) Note that Camel K only supports OData v4. HOWEVER, we can still get the results using plain REST API calls, so you are still covered.

This is how Camel helps you retrieve all the information needed. As you can see, I make several requests to get the data and do some transformation to extract the results to return.

 
from("direct:getSO")

   .setHeader("Authorization").constant("Basic XXXX")
   .setHeader("Accept").constant("application/json")
   .toD("https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/SalesOrderSet('${header.SalesOrderID}')?bridgeEndpoint=true")
   .unmarshal().json()
   .setHeader("CustomerID").simple("${body[d][CustomerID]}")
   .marshal().json()
   .bean(this, "setSO(\"${body}\",\"${headers.CustomerID}\")")

;

 from("direct:getItems")
     .setHeader("Authorization").constant("Basic XXXX")
     .setHeader("Accept").constant("application/json")
.toD("https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/SalesOrderSet('${header.SalesOrderID}')/ToLineItems?bridgeEndpoint=true")
     .unmarshal().json()
     .marshal().json()
     .bean(this, "setPO(\"${body}\")")
;

 from("direct:getCustomer")
     .setHeader("Authorization").constant("Basic XXXX")
     .setHeader("Accept").constant("application/json")
.toD("https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/BusinessPartnerSet('${header.CustomerID}')?bridgeEndpoint=true")
.
unmarshal().json()
.marshal().json()    
.
bean(this, "setCust(\"${body}\")")

;

 

The endpoints that trigger the calls to SAP are exposed as an API. Here I use Apicurio Studio to define the API contract, with two endpoints: fetch and fetchall. One returns SO, PO and Customer data for a single Sales Order, while the other returns a collection of them.
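For illustration, a minimal version of such a contract might look like the sketch below. The paths, operationIds and parameter names here are my assumptions for this post, not the exact contract used in the demo:

openapi: 3.0.2
info:
  title: ibm-sap
  version: 1.0.0
paths:
  /fetch/{SalesOrderID}:
    get:
      operationId: fetch            # returns SO, line items and customer data for one Sales Order
      parameters:
        - name: SalesOrderID
          in: path
          required: true
          schema:
            type: string
      responses:
        '200':
          description: SO, PO and Customer data for the given Sales Order
  /fetchall:
    get:
      operationId: fetchall         # returns a collection of Sales Orders with their details
      responses:
        '200':
          description: A collection of Sales Orders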


 

We can now export the definition as an OpenAPI Specification contract in YAML form (Link to see the yaml). Save the file in the folder where your Camel application is, add the API YAML file name to your Camel K application modeline, and Camel K will automatically map your code to this contract.
   // camel-k: language=java dependency=camel-openapi-java open-api=ibm-sap.yaml dependency=camel-jackson
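Under the hood, Camel K generates the REST endpoints from the contract and, as far as I understand, routes each operation to a direct endpoint named after its operationId. So, assuming the operationIds are fetch and fetchall as above, the implementation only needs routes like this sketch (direct:getSOList is a hypothetical helper route for the collection case):

// Sketch: Camel K's open-api support binds each operationId to a direct: endpoint
from("direct:fetch")
    .to("direct:getSO");      // reuse the Sales Order retrieval route shown earlier

from("direct:fetchall")
    .to("direct:getSOList");  // assumed helper route returning a collection of SOs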

 

Using the Camel K CLI tool, run the following command to deploy the code to the OpenShift platform.
   kamel run SapOdata.java
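While it builds, you can watch the integration come up with the CLI; a quick sketch (kamel run also has a --dev flag if you want streamed logs and live reload while developing):

# List integrations and follow the logs of the new one
kamel get
kamel log sap-odata    # the integration name is derived from the file name; adjust if yours differs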

 

And you should now see a microservice running. Notice how Camel K helps you: not only does it detect and load the libraries needed, it also containerises your code and runs it as an instance on the platform.


Go to my git repo to see the full code and running instructions.

 

Kafka sits in the middle to set up the event-driven architecture, so the SO approval application can notify the shopping cart client when an order has been approved.
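On OpenShift the Kafka cluster is typically managed by the AMQ Streams (Strimzi) operator, so declaring a topic for the approval events is just another custom resource. A minimal sketch, assuming a cluster named my-cluster and a topic called so-approved (both hypothetical names):

apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: so-approved                    # hypothetical topic name for approval events
  labels:
    strimzi.io/cluster: my-cluster     # must match the name of your Kafka cluster resource
spec:
  partitions: 1
  replicas: 1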


Since everything was put together in a week, with everyone in different timezones, miscommunication was bound to happen. What I did not realise was that the client applications (SO approval and shopping cart) were all written in JavaScript and must communicate via HTTP. But Kafka only speaks the Kafka protocol!!! Therefore, I set up an HTTP Bridge in front of the Kafka cluster to translate between HTTP and the Kafka protocol.
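With AMQ Streams (Strimzi), the HTTP Bridge itself is deployed declaratively as a custom resource. Something along these lines, where the bridge name, bootstrap address and port are my assumptions:

apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaBridge
metadata:
  name: sap-demo-bridge                                # hypothetical name
spec:
  replicas: 1
  bootstrapServers: my-cluster-kafka-bootstrap:9092    # must match your Kafka cluster's bootstrap service
  http:
    port: 8080                                         # the HTTP port the bridge listens on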


And now clients can access the topic via HTTP endpoints. For more detailed setup instructions, go to my GitHub repo.
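For example, once the bridge service is exposed, a JavaScript client (or a quick curl test) can publish an approval event with a plain HTTP POST against the bridge's topics endpoint. A sketch, where the bridge route, topic name and payload are assumptions:

# Publish a record to the (hypothetical) so-approved topic through the HTTP Bridge
curl -X POST http://<BRIDGE_ROUTE>/topics/so-approved \
  -H 'Content-Type: application/vnd.kafka.json.v2+json' \
  -d '{"records":[{"value":{"SalesOrderID":"0500000001","status":"APPROVED"}}]}'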


 

Last but not least, we need to migrate all the SAP UI5 web applications to OpenShift. A UI5 app is basically a Node.js app, so we first create a Dockerfile to containerise it, build the image, and push it to a container registry.
  docker push quay.io/<YOUR_REPO>/socreate
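The Dockerfile itself can be very small. A minimal sketch for a UI5/Node.js app, built before the push above with something like docker build -t quay.io/<YOUR_REPO>/socreate . (the base image, port and start command are my assumptions; check the repo for the exact file used):

# Hypothetical minimal Dockerfile for a UI5 (Node.js) application
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD ["npm", "start"]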

 

And deploy the application to OpenShift.
  oc new-app quay.io/<YOUR_REPO>/socreate:latest --as-deployment-config

 

BUT WAIT!! Since the UI5 server only binds to *localhost* (weird..), we need a proxy that can tunnel traffic to it. Therefore, I added a sidecar proxy running right next to the Node.js application, by adding the following container to the deployment configuration.
spec:
  containers:
    - name: nginx
      image: quay.io/weimei79/nginx-sidecar
      ports:
        - containerPort: 8081
          protocol: TCP
      resources:
        limits:
          cpu: 500m
          memory: 1Gi
      terminationMessagePath: /dev/termination-log
      terminationMessagePolicy: File
      imagePullPolicy: Always
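For reference, the proxy inside that sidecar image only needs to forward traffic to the UI5 server on localhost. A minimal nginx.conf sketch: the listen port comes from the container spec above, while the UI5 app port is an assumption; the actual image is the one on quay.io referenced above.

# Hypothetical nginx.conf for the sidecar: listen on 8081 and
# forward everything to the UI5/Node.js server bound to localhost
server {
    listen 8081;
    location / {
        proxy_pass http://127.0.0.1:8080;   # assumed UI5 app port
        proxy_set_header Host $host;
    }
}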



 

This starts the proxy. Since the NGINX proxy listens on port 8081, make sure you update the related service and route settings on OpenShift.
  oc expose dc socreate --port=8081
  oc expose svc socreate


 

And this is how you migrate the UI5 application from a local SAP instance onto OpenShift. For more detailed migration instructions, check out my GitHub repo.

 

Once it’s done, you can see all the applications running as containers in the cloud, ready to approve the SOs.


 

This is the actual developer view of our demo OpenShift platform.



 

Thank you Sanket for this fun ride and all the nail-biting moments, but that is the fun in IT, right? We work through problems, tackle issues and ultimately get everything done! 🙂 If you are a SAPer and want to explore the world of clouds and containers, what are you still waiting for? Join the ride! This is the story of how we made SAP Cloud Native and Event Driven in 4 days.

To see the full version, be sure to attend Sanket’s session (a free virtual event):

SAP & OpenShift: From classic ABAP development to cloud native applications: Use cases and reference architecture to implement with SAP Landscapes to unlock innovation enabled by Hybrid Cloud and Red Hat OpenShift.

Register here:

https://events.redhat.com/profile/form/index.cfm?PKformID=0x373237abcd&sc_cid=7013a000002w6W7AAI

Refer to Sanket's awesome post for more details on the concept and why we created this demo.

https://blogs.sap.com/2021/06/28/safeguard-your-investments-with-red-hat-openshift/