
Building S/4HANA on-Premise Extensions using Enterprise Messaging and Node

In my previous blog, I described how we could use the Enterprise Messaging Service on SCP to orchestrate communication and events between applications. You can find it here: Using Enterprise Messaging for Extensions with Node Part 1

It also gives a peek at how we could use it to extend on-premise S/4HANA with custom extensions.

In this blog I will describe how this can be done.

Let me start with the below picture, which shows the architecture for such a scenario.

 

 

So, the main components we would need are:

  • Enterprise Messaging Service on SCP
  • An orchestrator service (similar to my previous blog)
  • Cloud Connector: responsible for exposing S/4HANA services to the outside world
  • The custom app which ultimately performs the extension business logic that you need

Now, let us quickly check the flow of data between these components.

 

As you will notice, all components work in a “fire and forget” mode. The S/4HANA system, the EM service and the orchestrator all simply inform the next component and don’t wait for any acknowledgement. This way you can split your applications in a logical manner without creating bottlenecks in the process.

So, when a user changes a business partner in the BP transaction, an event is triggered using MQTT protocol.

The Enterprise Messaging (EM) Service puts this in a queue to which the Orchestrator is subscribed. The Orchestrator knows whom to forward the request to.

Finally, the message only contains the BP ID and not the full information. Hence, if the final BUPA application needs more information, it is the application’s responsibility to request it.
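For illustration, based on how the orchestrator parses the message later in Part 3, such an event carries little more than the key of the changed Business Partner. A rough sketch of the payload (the exact structure and any additional header fields may vary by release):

{
  "KEY": [
    { "BUSINESSPARTNER": "0000001000" }
  ]
}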

Let’s begin developing the extension as described above.

Pre-Requisites

  • S/4HANA 1909 or above (I have used 1909 FPS 01)
  • SCP trial account
  • Cloud Connector (If you want to perform GET for more information)
  • Visual Studio Code
  • Node v12 installed
  • CF CLI
  • CDS CLI

Part 1: Prepare S/4HANA system

In order to enable S/4HANA to send messages to SCP, some initial setup needs to be performed.

I will broadly list the steps; the details of each step are very nicely documented in the blog below, which you can simply follow for the initial setup.

https://blogs.sap.com/2019/09/16/sap-enterprise-messaging-for-s4hana-on-premises/

  1. Set up user roles – Make sure your user has the roles /IWXBE/RT_XBE_ADM, /IWXBE/RT_XBE_BUSI and /IWXBE/RT_XBE_MDT. These are role templates, so you would have to create a role from each and assign it to yourself.
  2. Set up RFC destination – This step has sub-steps (shown below) as well, since you need to provide the SCP credentials and service key.
    1. Creating the Enterprise Messaging service instance on SCP.
    2. Configuring the RFC destination on the S/4 system.

The detailed steps are documented well at the above link and here as well: https://help.sap.com/viewer/810dfd34f2cc4f39aa8d946b5204fd9c/1809.000/en-US/cd8a2607096c4c8ab2aa154abe05fd98.html

  3. Account setup – You will need your client ID and secret key from the service instance here.
  4. Manage Channel and Partners – A channel is a connection through which a service instance is connected, so there is a 1:1 relation between a channel and a service instance.

This is also the place where you need to specify the topic space; all the messages sent through this channel will be prefixed with this topic space.

Below is a screenshot of the configuration I have done.

  5. Maintain events – You need to explicitly define which events/topics should be triggered. SAP provides multiple standard topics which can be configured. For our scenario I have configured BO/BusinessPartner/* .

This means any event (Create, Change, Delete) related to the Business Partner BO will trigger a message through the channel we configured in step 4. With the topic space I configured (S4HANABP), a change, for example, is published to the topic S4HANABP/BO/BusinessPartner/Changed, which is the topic our orchestrator subscribes to in Part 3.

  6. Create queue – We now need to set up a queue on the Enterprise Messaging service instance. Please follow the steps below.
    1. Open the Enterprise Messaging instance and click on View Dashboard.
    2. Click on Queues and create a queue with a name of your choice.
    3. Now let’s create a subscription. Click on Queue Subscriptions and create a topic name that matches the Topic Space you entered in step 4. It should look something like below.

And that’s it, actually. These 6 simple steps are all you require for the initial setup of the S/4HANA on-premise system.

Let’s quickly test the setup.

Try to change a Business Partner in S/4HANA by using transaction BP.

After saving, open the Enterprise Messaging Service Dashboard and you should be able to see a new message in the BP queue that you created.

 

Part 2: The BUPA extension application using Node JS

Ok, so the next part is to create a custom application and host it on the Cloud Platform.

For this example, the scenario is to store more information about the Business Partner once it is changed. Often, the SAP-provided Business Partner fields are not sufficient and we end up extending the BO.

Instead, here we extend the Business Partner on SAP Cloud Platform. This is a simple but typical side-by-side extension.

I developed this application locally in Visual Studio Code, but you could use any IDE of your choice.

I will be using the same application which I had described in the blog Using Enterprise Messaging for Extensions with Node Part 2

There is a small change, however: the customer entity which was read-only is now changeable. Also, I’m now calling it customerProfiles.

You can still follow the below steps, which describe how to create the application.

Step 1: Generate package.json

In the project root folder open the command prompt in administrator mode and execute the below command.

npm init --yes

This will generate the initial package.json file in the root folder.
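The generated file should look roughly like the below (the name is derived from your folder, and the exact fields depend on your npm version):

{
  "name": "customerprofiles",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}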

Step 2: Install the CDS dependencies

In the same folder run the below command

npm i -g @sap/cds-dk

This will install the SAP CDS development kit, which will help us create a DB module and a service module and also deploy the project as an MTA on Cloud Foundry.

Special Note: At the time this blog was written, CDS worked only on Node v12 or below. So, make sure the Node version you have installed is 12, 10 or 8.

Step 3: Initialize the project

Run the command cds init customerProfiles

Step 4: Define Data Model

Open the project in Visual Studio Code.

Right-click on the db folder, create a file called data-model.cds and add the below code.

namespace my.customerProfiles;

entity customerBasic {
    key ID : Integer;
    fname : String;
    lname : String;
    vat : Integer;

}

entity profilesValue {
    key ID : UUID;
    key profiles : Association to profiles;
    key customer : Association to customerBasic;
    value : String; 
}

entity profiles {
    key ID : Integer;
    name : String;
}

This defines the entities and attributes which will ultimately be deployed to the HANA service on SAP Cloud Platform.

Step 5: Let’s add some mock data

CDS automatically scans for a folder called csv under the db folder. If it finds a file named after an entity, it automatically loads the data into the bound database and the respective table.

So, create a file named my.customerProfiles.customerBasic.csv in that folder and enter the data as below.

ID;fname;lname;vat
100;Chaith;Mirle;987766
101;Banana;Slim;76543
200;Apple;Jobs;1232334
210;Grapes;Sour;123455
215;Hotel;California;54433

Create another file my.customerProfiles.profiles.csv and enter the data as below.

ID;name
10;shop Floor Size
11;No. of Employees
14;Brand Active
16;Location
17;Outlet
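For orientation, the relevant part of the project should now look roughly like this (a sketch, assuming the csv folder convention described above):

customerProfiles/
├── db/
│   ├── data-model.cds
│   └── csv/
│       ├── my.customerProfiles.customerBasic.csv
│       └── my.customerProfiles.profiles.csv
├── srv/
└── package.json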

Step 6: Create a service to expose data

Let’s create a service to expose this data.

Create a file customer-service.cds under the srv folder. Paste the below code in the file.

using my.customerProfiles as my from '../db/data-model';

service CatalogService {
    entity Customers as projection on my.customerBasic;
    entity Profiles @readonly as projection on my.profiles;
    entity ProfilesValue as projection on my.profilesValue;
}

Step 7: Let’s add SQLite3 for testing locally

Open the command prompt and run the command npm i sqlite3 -D

This will install sqlite3 and also add it as a devDependency in package.json.

Step 8: Test Locally

To test the CDS application locally, run the command cds watch in the project root folder.

You should see Node loading the data into SQLite, and the services should be generated.

By default, you will see that our Application is listening on localhost:4004.

Special Note: cds watch is similar to nodemon; it watches for any changes in your project folder and automatically restarts the server. So you can actually run cds watch right when you begin the project.

Now, if you open a browser and go to http://localhost:4004/ you should be able to see resources available.
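For example, querying the Customers entity set should return the mock data we loaded earlier. Below is a sketch of the request and an abbreviated response; the exact OData envelope depends on your CDS version.

curl http://localhost:4004/catalog/Customers

{
  "@odata.context": "$metadata#Customers",
  "value": [
    { "ID": 100, "fname": "Chaith", "lname": "Mirle", "vat": 987766 },
    { "ID": 101, "fname": "Banana", "lname": "Slim", "vat": 76543 }
  ]
}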

Step 9: Deploy to SAP Cloud Platform

This step requires CF CLI to be installed. I hope you have already done it.

Before deploying, let’s add the below cds configuration to package.json.

  "cds": {
    "requires": {
      "db": {
        "kind": "sql"
      }
    }
  }

 

Since we will be using HANA and not SQLite when we deploy, we need to add that dependency to our project.

Run the command npm add @sap/hana-client --save

Finally, we are now ready to deploy the application.

Log in to Cloud Foundry using the command cf login

Create a HANA service instance using the below command:

cf create-service hanatrial hdi-shared customerprofile-db

Now, run the below commands

cds build --production

cf push -f gen/db

cf push -f gen/srv --random-route

The first command builds your application. This generates a separate module each for db and srv.

The second command creates an application which binds to the hana service we created above.

The third command creates an application which exposes the services similar to what we tested earlier.

Special Note: Log in to your CF account and make sure these applications are bound properly and started. If not, you may have to bind the service instance manually and restart the application.
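If you do need to bind manually, the standard CF CLI commands below will do it (a sketch; replace the app name with whatever cf apps shows for your srv application):

cf bind-service <your srv app name> customerprofile-db
cf restart <your srv app name>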

Let’s now create the Orchestrator application, which will call the above POST API when it receives a message from S/4HANA.

Part 3: The Orchestrator Service

The Orchestrator service is another Node application which we will deploy; it holds the knowledge of who is interested when a message arrives in the queue.

I will develop this in SAP Business Application Studio, simply because it’s easier to test with the EM service binding on my Cloud Foundry space.

Step 1: Login to SAP Cloud Platform Trial Account

Login to your SAP Cloud Platform trial account and click on SAP Business Application Studio.

If you are launching it after a day, you will notice that your dev space has been stopped. Start it again and then click on the space.

Step 2: Create a CAP Application

In the welcome tab, click on Create project from template

Select CAP Project and click on Start. Then enter bupaReceiver as the project name.

This should create an empty project with the usual CAP modules such as db and srv.

Step 3: Edit the project

Click on the file package.json.

CAP will have already generated some mandatory entries. Please add the below configuration.

  "cds": {
    "requires": {
      "messaging": {
        "kind": "enterprise-messaging"
      }
    }
  }

This application needs an additional dependency called “axios”, which we will use to call the REST APIs of the customer profile application we created above.

Axios is an HTTP client which provides APIs to call REST services in an easy and standard manner.

You will find the complete code sample of package.json below.

{
  "name": "bupareceiver",
  "version": "1.0.0",
  "description": "Business Partner Orchestrator",
  "repository": "<Add your repository here>",
  "license": "UNLICENSED",
  "private": true,
  "dependencies": {
    "@sap/cds": "^3",
    "@sap/xb-msg-amqp-v100": "latest",
    "axios": "^0.21.0",
    "express": "^4"
  },
  "cds": {
    "requires": {
      "messaging": {
        "kind": "enterprise-messaging"
      }
    }
  },
  "devDependencies": {
    "sqlite3": "^4.2.0"
  },
  "scripts": {
    "start": "npx cds run"
  }
}

Create Services

Since we are only creating a service which will in turn call APIs, we don’t need any persistence, and hence the db module is not required in this application.

Create a new file under the srv folder called bupaReceiver.cds

Add the below code to it.

service ReceiverService {
    entity DummyEntity {
        key dummyID : Integer;
    }; 
}

This file creates a service called ReceiverService. The dummy entity is only there so that CAP generates a service to which we can attach our event handler implementation.

Create a new file under the srv folder called bupaReceiver.js and add the below code to it.

const cds = require('@sap/cds');
const axios = require('axios');

module.exports = cds.service.impl((srv) => {
  srv.on('S4HANABP/BO/BusinessPartner/Changed', async (msg) => {
    // The event payload only carries the key of the changed Business Partner
    const messagePayload = JSON.stringify(msg.data);
    const bupaID = JSON.parse(messagePayload);
    const ID = parseInt(bupaID['KEY'][0]['BUSINESSPARTNER']);
    console.log(ID);

    // Check whether a customer with this ID already exists
    axios.get('<your domain>/catalog/Customers(' + ID + ')')
      .then(function (response) {
        // Handle success: the Business Partner already exists as a customer
        console.log("Business Partner Found", ID);
      })
      .catch(function (error) {
        // Not found: post a new Business Partner
        return axios.post('<your domain>/catalog/Customers', {
          "ID": ID,
          "fname": "S4 Fname",
          "lname": "S4 lname",
          "vat": 12344
        })
          .then((response) => {
            console.log("Business Partner Created", ID);
          }, (error) => {
            console.log(error);
          });
      })
      .then(function () {
        // Always executed: update the Profiles
        return axios.post('<your domain>/catalog/ProfilesValue', {
          profiles_ID: 16,
          customer_ID: ID,
          value: "Munich"
        })
          .then((response) => {
            console.log("Profiles Updated");
          }, (error) => {
            console.log(error);
          });
      });
  });
});

This handler gets called whenever a message with the topic S4HANABP/BO/BusinessPartner/Changed arrives in the queue we created earlier.

As I said initially, S/4HANA only sends the ID of the business partner; if we need more information, the Business Partner application has to ask for it. This orchestrator only extracts the BUPA number and passes it on.
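If your extension does need more than the ID, one option (also suggested in the comments below) is to read the changed data from the standard API_BUSINESS_PARTNER OData service of the S/4HANA system, exposed through the Cloud Connector and a destination or proxy of your choice. The snippet below is only a minimal sketch with axios: the host placeholder, the basic-authentication details and the connectivity setup are all assumptions and not part of the code above.

// Hypothetical lookup of the full Business Partner data for the ID extracted above.
// The URL, credentials and connectivity setup (Cloud Connector/destination) are assumptions.
const bpUrl = '<exposed S/4HANA host>/sap/opu/odata/sap/API_BUSINESS_PARTNER/A_BusinessPartner(\'' + ID + '\')';
axios.get(bpUrl, {
  auth: { username: '<user>', password: '<password>' },
  params: { '$format': 'json' }
})
  .then((response) => {
    // OData V2 responses wrap the entity in response.data.d
    console.log(response.data.d.BusinessPartnerFullName);
  })
  .catch((error) => console.log(error));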

Please note: Replace <your domain> with your application’s host name in the axios calls.

Once the message is received, a call is made to check if there is a customer with the same ID. If yes, it simply adds the profile values; if not, it first creates a customer and then adds profiles to it. All of these calls are made using axios against the customer profile application.

Based on the queue or topic, you can make calls to several different applications and orchestrate your whole integration scenario in this manner.
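As a sketch of that idea, you could simply register additional handlers in the same implementation file. The second topic and target URL below are made up purely for illustration.

module.exports = cds.service.impl((srv) => {
  // Business Partner changes go to the customer profile application (as shown above)
  srv.on('S4HANABP/BO/BusinessPartner/Changed', async (msg) => { /* ... */ });

  // A hypothetical second subscription, forwarded to a different application
  srv.on('S4HANABP/BO/BusinessPartner/Created', async (msg) => {
    await axios.post('<another application>/notify', msg.data)
      .catch((error) => console.log(error));
  });
});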

This is not a true orchestrator, because there is still some business logic here. In reality, it would be better if the logic to find the customer were also put in our BUPA application. But I will leave this up to you 🙂

Add MTA File

Let us add the mta.yaml file and build it.

To add the MTA file, right click on the project bupaReceiver and click on Open in Terminal.

Enter the command cds add mta

This will add an mta.yaml file which can be used to build our application.
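For reference, the shape of the deployment descriptor looks roughly like the sketch below. The exact content generated by cds add mta will differ, and the module and resource names here are placeholders; in particular, make sure the srv module is bound to the same Enterprise Messaging instance that S/4HANA publishes to (referenced here as an existing service, with the instance name being an assumption you need to replace).

ID: bupaReceiver
_schema-version: '3.1'
version: 1.0.0
modules:
  - name: bupaReceiver-srv
    type: nodejs
    path: gen/srv
    requires:
      # Bind the Enterprise Messaging instance from Part 1
      - name: bupaReceiver-messaging
resources:
  # Reference the existing Enterprise Messaging service instance
  - name: bupaReceiver-messaging
    type: org.cloudfoundry.existing-service
    parameters:
      service-name: <your enterprise messaging instance>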

Now right-click on the mta.yaml file and select Build MTA.

This will generate the mta_archives folder with the .mtar files which will be used to deploy on cloud foundry.

Deploy to Cloud Foundry

Expand the folder mta_archives and right click on bupaReceiver-srv.mtar file and select Deploy MTA Archive.

Verify the cloud foundry space to make sure the instance is running.

Step 4: Test the full scenario

Now let’s try to test the full scenario.

In order to test this, make sure the applications we deployed (the customerProfiles service and the bupaReceiver service) are running with at least 1 instance.

Open a Business Partner in your S/4HANA system and change an attribute.

Now open the bupaReceiver service and go to Logs. Here you should see the Business Partner number that was changed and also the message that the Profiles were updated.

And if you open the customerProfiles application and go to Logs, you should see the POST calls that were made.

Summary

So this blog was meant to give you a kickstart on how S/4HANA can be extended using Enterprise Messaging on SAP Cloud Platform.

We could extrapolate this pattern to more complex scenarios, which would enable our customers to break down large applications into smaller chunks and use Enterprise Messaging to connect each chunk.

In my opinion, this is a very powerful architecture pattern which can be used to develop applications in a fast and scalable manner, and I would definitely recommend it to my customers.

 


      Comments
      Gregor Wolf

      Hi Chaithanya Mirle Kantharaju,

      thank you for this extensive post. I like that you spread the word about event-based systems. One suggestion I have regarding the handling of the data after you've received the event: I would suggest that you switch from the plain axios request to the CAP way and follow the documentation on Sending Requests. That way you can use a destination instead of a hardcoded URL. Also, I would suggest using the API_BUSINESS_PARTNER_SRV to read the changed data from the S/4HANA backend.

      Best regards
      Gregor

      Chaithanya Mirle Kantharaju (Blog Post Author)

      Thank You.

      That's true, we should never use hardcoded requests; I did it just for demonstration purposes. In the real world I would in fact recommend using something even better than CAP: the NGINX Controller.

       

      Thanks,

      Chaithanya

      Gregor Wolf

      Can you explain the purpose of the Nginx Controller in this scenario?

      Chaithanya Mirle Kantharaju (Blog Post Author)

      Well, it does something similar to what CAP does but with more features. For example, we can measure statistics of API calls as well.

      The API management capabilities of NGINX can be used as an orchestrator instead of hard-coding the APIs.

      https://www.nginx.com/products/nginx-controller/

      https://www.nginx.com/solutions/api-management-gateway/

      It will be quite useful when we have a very large number of calls and services to be managed.