Ok, so now that we have the sample sender application that we built in Part 1, let's continue further. You can go to Part 1 here.

In this part, let's create the customer update application and the orchestrator service which calls this update application.

Ok, before we create the orchestrator, let's create an application that can update some data about customers.

Customer Update Service


Imagine a scenario where, when a Business Partner in S/4HANA is updated, you also want to update some information about that business partner on SAP Cloud Platform (a simple side-by-side extension).

So again I have created this application using Node, but this time let's build it in Visual Studio Code.

Step 1: Generate package.json


Open the command prompt in administrator mode and go to the project root folder. Execute the below command.

npm init --yes

This will generate the initial package.json file in the root folder.

Step 2: Let’s install the required dependencies


In the same folder, run the below command.

npm i -g @sap/cds-dk

This will install the SAP CDS development kit, which will help us create a db module and a service module, and also deploy the application as an MTA on Cloud Foundry.

Run the command cds v to make sure it has been installed correctly; you should get a result similar to the screenshot below.


 

Special Note: At the time this blog was written, CDS works only on Node v12 or below, so make sure the Node version you have installed is 12, 10, or 8.

 

Step 3: Initialize project


Run the command cds init customerInfo

This command creates a project customerInfo, similar to the CAP project that you created for the sender service.

 

Step 4: Define Data Model


Open the project in Visual Studio Code.

Right-click on the db folder, create a file data-model.cds, and add the below code.
namespace my.customerInfo;

entity customerBasic {
  key ID : Integer;
  fname  : String;
  lname  : String;
  vat    : Integer;
}

entity profilesValue {
  key ID       : UUID;
  key profiles : Association to profiles;
  key customer : Association to customerBasic;
  value        : String;
}

entity profiles {
  key ID : Integer;
  name   : String;
}

This defines the entities and attributes which will ultimately be deployed to the HANA service on SAP Cloud Platform.

Step 5: Let’s add some mock data


CDS automatically scans for a folder called csv under the db folder. If it finds a file with the same name as an entity, it automatically loads that data into the bound database and the respective tables.

So create a file named my.customerInfo.customerBasic.csv and enter the data as below.
ID;fname;lname;vat
100;Chaith;Mirle;987766
101;Banana;Slim;76543
200;Apple;Jobs;1232334
210;Grapes;Sour;123455
215;Hotel;California;54433

 

Create another file, my.customerInfo.profiles.csv, and enter the data as below.
ID;name
10;shop Floor Size
11;No. of Employees
14;Brand Active
16;Location
17;Outlet

 

Step 6: Create a service to expose data


Let’s create a service to expose this data.

So create a file customer-service.cds under the srv folder. Paste the below code in the file.
using my.customerInfo as my from '../db/data-model';

service CatalogService {
  entity Customers @readonly as projection on my.customerBasic;
  entity Profiles  @readonly as projection on my.profiles;
  entity ProfilesValue as projection on my.profilesValue;
}
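
Since the service is named CatalogService, CAP serves it under the path /catalog by default (the same path the orchestrator will call later), so the entity sets will be available at, for example:

/catalog/Customers
/catalog/Profiles
/catalog/ProfilesValue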

Step 7: Let’s add SQLite3 for testing locally


Open the command prompt and run the command npm i sqlite3 -D

This will install sqlite3 and also add it as a devDependency in package.json.

Step 8: Test Locally


To test the CDS application locally, run the command cds watch in the project root folder.

You should see the data being loaded into SQLite and the services being generated.


By default, you will see that our application is listening on localhost:4004.

Special Note: cds watch is similar to nodemon; it watches for any changes in your project folder and automatically restarts the server. So you can actually run cds watch right when you begin the project.

Now, if you open a browser and go to http://localhost:4004 you should be able to see the available resources.


Click on Customers; the data we entered in the CSV file should be shown.


 

Now let’s test a POST method using Postman.


Launch the Postman application or the Chrome add-on.


Test a POST request as shown below.



You can also do a GET on ProfilesValue to make sure the data has been created.
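
If you prefer testing from code rather than Postman, here is a minimal Node sketch of the same POST and GET. It is a hypothetical test script, assuming the service is running via cds watch on localhost:4004 with the default /catalog path and that axios is available (npm i axios in a scratch folder):

// test-profilesvalue.js - quick check against the locally running service
const axios = require('axios');

async function testProfilesValue() {
  // Create a profile value for customer 101 (the same payload the orchestrator will send later)
  await axios.post('http://localhost:4004/catalog/ProfilesValue', {
    profiles_ID: 16,
    customer_ID: 101,
    value: 'Munich'
  });

  // Read everything back to confirm the record was created
  const response = await axios.get('http://localhost:4004/catalog/ProfilesValue');
  console.log(response.data.value);
}

testProfilesValue().catch(console.error);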

Step 9: Deploy to SAP Cloud Platform


This step requires CF CLI to be installed. I hope you have already done it.

Before deploying, let's add the below configuration to package.json.
"cds": {
"requires": {
"db": {
"kind": "sql"
}
}
}

Since we will be using HANA and not SQLite when we deploy, we need to add that dependency to our project.

Run the command npm add @sap/hana-client --save

Finally, we are now ready to deploy the application.

Login to cf using command cf login

Create a hana service instance using command

cf create-service hanatrial hdi-shared customerinfo-db

Now, run the below commands

cds build --production

cf push -f gen/db

cf push -f gen/srv --random-route

The first command builds your application. This generates separate modules for db and srv.

The second command pushes an application that binds to the HANA service we created above.

The third command pushes the application that exposes the services, similar to what we tested earlier.

Special Note: Log in to your CF account and make sure these applications are bound properly and started. If not, you may sometimes have to bind the service instance manually and restart the application.
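
For reference, a manual bind from the CF CLI looks roughly like this (the application name below is a placeholder; check cf apps for the actual name generated in your space):

cf apps
cf bind-service <srv-app-name> customerinfo-db
cf restart <srv-app-name>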

 

Let's now create the orchestrator application, which will call the above POST API when it receives a message from the sender application.

The Orchestrator


If you check our architecture again, you will notice that one of the critical services we need is the orchestrator. The orchestrator service is responsible for directing the calls from the message broker to the respective applications.

So let’s begin creating this service. As with the sender service, I have used Node.js because of the simplicity of service creation in Node.

Step 1: Login to SAP Cloud Platform Trial Account


Login to your SAP Cloud Platform trial account and click on SAP Business Application Studio.


If you are launching it after a day, you will notice that your dev space has stopped. Start the dev space again and then click on it.

Step 2: Create a CAP Application


In the welcome tab, click on Create project from template


Select CAP Project and click Start. Then enter capreceiver as the project name.

This should create an empty project with the different CAP modules such as db and srv.



Step 3: Edit the project


Just like in the sender application, prepare the project dependencies. Click on the file package.json.

CAP would’ve already generated some mandatory tags. Please add the below cds configuration.
  "cds": {
"requires": {
"messaging": {
"kind": "enterprise-messaging"
}
}
}

And the application dependencies:
"@sap/xb-msg-amqp-v100": "latest",
"axios": "^0.21.0",

This application needs an additional dependency called "axios", which we will use to call the REST APIs of the customer update application we created above.

Axios is an HTTP client which provides us APIs to call REST services in an easy and standard manner.

You will find the complete code sample of package.json below.
{
  "name": "capreceiver",
  "version": "1.0.0",
  "description": "A simple CAP project.",
  "repository": "<Add your repository here>",
  "license": "UNLICENSED",
  "private": true,
  "dependencies": {
    "@sap/cds": "^3",
    "@sap/xb-msg-amqp-v100": "latest",
    "axios": "^0.21.0",
    "express": "^4"
  },
  "cds": {
    "requires": {
      "messaging": {
        "kind": "enterprise-messaging"
      }
    }
  },
  "devDependencies": {
    "sqlite3": "^5"
  },
  "scripts": {
    "start": "npx cds run"
  }
}

Create Services


Since we are only creating a service which will in turn call APIs, we don't need any persistence, and hence the db module is not required in this application.

Create a new file under folder srv called receiver-service.cds

Add the below code to it.
service ReceiverService {
  entity DummyEntity {
    key dummyID : Integer;
  };
}

This file creates a service called ReceiverService

Create a new file under the srv folder called receiver-service.js and add the below code.
const cds = require('@sap/cds');
const axios = require('axios');

module.exports = cds.service.impl((srv) => {
  // This handler fires whenever a message arrives on customer/created
  srv.on('customer/created', async (msg) => {
    const messagePayload = JSON.stringify(msg.data);
    console.log('Received message:', messagePayload);

    // Call the customer update application deployed earlier
    try {
      const response = await axios.post('<Customer Update Host Name>/catalog/ProfilesValue', {
        profiles_ID: 16,
        customer_ID: 101,
        value: 'Munich'
      });
      console.log(response.status);
    } catch (error) {
      console.log(error);
    }
  });
});

This code gets called when a message is received on the queue customer/created.

Please note: replace <Customer Update Host Name> with your host name in the axios.post call.

Once the message is received, a call is made using axios.post to the customer profile update application.

Based on the queue, you can make several calls to different applications and orchestrate your whole integration scenario in this manner, as sketched below.
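
For illustration only, a hypothetical extension of receiver-service.js with additional handlers could look like the below. The topics customer/deleted and order/created and the target host names are made-up examples, not part of this scenario:

const cds = require('@sap/cds');
const axios = require('axios');

module.exports = cds.service.impl((srv) => {
  // Route each event to the application that should react to it
  srv.on('customer/created', async (msg) => {
    await axios.post('<Customer Update Host Name>/catalog/ProfilesValue', msg.data);
  });

  srv.on('customer/deleted', async (msg) => {
    // Hypothetical cleanup service
    await axios.post('<Cleanup Service Host Name>/catalog/Deletions', msg.data);
  });

  srv.on('order/created', async (msg) => {
    // Hypothetical notification service
    await axios.post('<Notification Service Host Name>/catalog/Notifications', msg.data);
  });
});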

Add MTA File


Let us add the mta.yaml file and build it.

You can follow the same steps as in the sender service to create the MTA file.

Now right-click on the mta.yaml file and select Build MTA.

This will generate the mta_archives folder with the .mtar file, which will be used to deploy to Cloud Foundry.

Deploy to Cloud Foundry


Expand the folder mta_archives, right-click on the generated capreceiver .mtar file, and select Deploy MTA Archive.

Verify the cloud foundry space to make sure the instance is running.


Make sure the Enterprise Messaging instance is bound to the application.


 

Final Testing of the whole scenario


Ok, so now that we have all the applications and their instances ready, let's test the full scenario.

Make sure all the instances of your applications are running.


 

Open the capsender-srv application and call the sender URL.


Now open Postman and do a GET on the ProfilesValue resource. You should be able to find the entry for customer_ID 101 with the profile value "Munich".
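
Assuming everything is wired up correctly, the new entry in the response should look roughly like this (the ID is a generated UUID, so yours will differ):

{
  "ID": "…",
  "profiles_ID": 16,
  "customer_ID": 101,
  "value": "Munich"
}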


 

Ok, this sounds very simple! But here is what happened in the background: when you called the send() function, it sent a message to the broker (you can check the messaging dashboard), and our orchestrator, which was listening to the queue, received the message.

Once it received the message, it made a POST request to our customer update application, which finally updated the profile value.

As you can see, all the applications we developed talk to each other asynchronously and don't require much interface mapping at all.

Now, imagine if we could orchestrate all the events in an enterprise and break up applications to talk to each other in this way. Not only would we reduce the footprint of fat applications, but we would also make our development teams more agile.

This, in my opinion, is the power of Enterprise Messaging!

Additionally, you could also configure S/4HANA to be your sender application, to orchestrate the business events that happen in your core ERP.

I will try to write a blog on that part soon, too.