former_member185511


Hello everyone,
I am going to write about my experience with a simplified way of implementing an OData service and re-using that same OData service for all of my UI5 projects (around 8+ big projects) over the last 5 years.

First of all, if you are implementing standard FIORI applications, smart controls or a very simple run-of-the-mill app targeting a standard SAP GUI user, this article might not be for you.

Back in 2015, when I was learning Node.js and React, I noticed that sending data from the backend and consuming it on the frontend was super easy. In 2016, I wrote a HANA procedure backup tool with Node.js and React (here is my article) and really loved the data communication approach there.

Once I started learning SAPUI5, I noticed something was not right with the entire data flow. Selecting fields in SEGW, regenerating the service, updating structures for new requirements, function imports, annotations, deep inserts: the whole SEGW cycle was a real burden for the project. There was no way I could enjoy implementing anything. Compared to a simple REST API data flow, the OData configuration, settings and handling of nested data were unnecessarily complicated. Once content became dynamic, things got even uglier.

I decided to simplify things in a more generic way which would change a lot of things in my career and in the company.

Lean OData


By lean OData, I mean a generic OData service which you can use for all of your projects. Below you can see that only the GET and POST methods are used; we just need these two basic methods. I will explain below how filters flow to the endpoints.


Our OData service should contain fixed columns, but it has to cater to all requirements easily and without any change. I created the OData service below with only 4 fields, based on a flat structure. Once a request comes in from UI5, a custom BADI layer is triggered based on the parameters.


TMPL_ID (template ID): This column tells us what the service call is all about. It can relate to a specific report module, a workflow, a calculation, file generation or anything else. It refers to the BADI filter, which in turn points to the BADI implementation containing all the methods related to that module. A sample value would be FINANCIAL_REPORT.

Generating a report, downloading a generated report (XLSX, PDF or live report), context selections before running the pivot, previously generated reports: these all belong to the same TMPL_ID, and all related methods exist in the same BADI.

DETAIL: This specifies exactly what the service call will do. It could be a calculation that needs to be run, an approval process, template generation or chart generation. It is the method name inside the BADI. Examples would be GET_REPORT, RUN_CONVERSION, GET_SALES_CHART, GET_SELECTIONS etc.

CONTEXT: This is the key column where I send parameters/filters for POST requests in the body. It is also used to return data as JSON for both GET and POST requests. The JSON content flows here.

MSG: If any error is raised during a GET/POST operation, it is sent here. Having a standard way of error handling makes things easier on the frontend. I could send the same error in CONTEXT, but a dedicated MSG field is easier to handle.
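
To make the field list concrete, the whole entity boils down to one flat structure. Below is a minimal sketch of it in ABAP; the type name and the string typing are my assumptions, and the snippets further down also use an ACTION key field which I leave out here.

* Hypothetical structure behind the single entity type of the lean service
TYPES: BEGIN OF ty_lean_entity,
         tmpl_id TYPE string, " which BADI filter (module) to trigger
         detail  TYPE string, " which method inside that BADI to run
         context TYPE string, " free-form JSON payload, request and response
         msg     TYPE string, " serialized messages, if any were raised
       END OF ty_lean_entity.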

 

For the GET method:
GET BADI LR_BADI
  FILTERS
    TMPL_ID = LV_TMPL_ID.

CALL BADI LR_BADI->GET
  EXPORTING
    IT_FILTER = IT_FILTER " filter coming from the URL
    IS_DATA   = IS_DATA   " contains info related to the query
  IMPORTING
    ET_MSG    = LT_MSG
    ER_DATA   = LR_DATA.

* RESPONSE
ER_ENTITY-TMPL_ID = IS_DATA-TMPL_ID.
ER_ENTITY-ACTION  = IS_DATA-ACTION.
ER_ENTITY-DETAIL  = IS_DATA-DETAIL.
ER_ENTITY-CONTEXT = /ui2/cl_json=>serialize( data = LR_DATA ). " dynamic JSON content
ER_ENTITY-MSG     = /ui2/cl_json=>serialize( data = LT_MSG ).  " any message if raised

 

For the POST method:
CALL BADI LR_BADI->SAVE
  EXPORTING
    IS_DATA = IS_DATA " contains all info related to the query
  IMPORTING
    ET_MSG  = LT_MSG
    ER_DATA = LR_DATA.

* RESPONSE
ER_ENTITY-CONTEXT = /ui2/cl_json=>serialize( data = LR_DATA ).
ER_ENTITY-MSG     = /ui2/cl_json=>serialize( data = LT_MSG ).

Design Pattern:


Here we are heavily using the strategy design pattern, which you can easily see in the majority of SAP products. Just to give you an example, the entire BPC architecture (standard mode) heavily uses this approach.

The SAP BADI layer is just a wrapper around the pattern. Below is the class diagram of the strategy design pattern.

The interface is our BADI interface and the concrete classes are the BADI implementations.



The screenshot below is from my current system. There are 2 main methods, one for GET and one for SAVE. I could have combined them into a single interface method, but I wanted a separation between the HTTP method types.
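
To make that concrete, the interface behind those two methods could look roughly like the sketch below. The interface name and the parameter types are my assumptions; the parameter names follow the GET/SAVE snippets above, and ty_lean_entity is the structure sketched earlier (in a real system it would live in the interface or in the DDIC).

* Hypothetical BADI interface with one method per HTTP verb
INTERFACE zif_lean_odata PUBLIC.
  INTERFACES if_badi_interface. " marks the interface as usable by the enhancement framework

  METHODS get
    IMPORTING it_filter TYPE /iwbep/t_mgw_select_option " filters coming from the URL
              is_data   TYPE ty_lean_entity             " key fields of the request
    EXPORTING et_msg    TYPE bapiret2_t                 " messages, serialized into MSG
              er_data   TYPE REF TO data.               " payload, serialized into CONTEXT

  METHODS save
    IMPORTING is_data   TYPE ty_lean_entity
    EXPORTING et_msg    TYPE bapiret2_t
              er_data   TYPE REF TO data.
ENDINTERFACE.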


Each BADI implementation is registered with a FILTER value, which defines which class gets triggered.



 

I have around 114 BADI implementations right now, all based on the same interface. They cover around 5 big projects, and since 2016 we have never visited the SEGW t-code again. This gave us the flexibility and speed we needed for project implementation. Rather than spending time on SEGW modelling, we spend it on better architecture, innovative functionality, and better UI/UX for our end users. Honestly, the standard FIORI screens were very boring.



 

 

Welcoming any change in any phase of the project:


The biggest advantage of the above architecture was that we could hire any ABAP developer, who could pick up the logic easily and continue development within a couple of days. They don't need to know about SEGW configuration, deep inserts, annotations or any other checkbox functionality there. It is all about HTTP methods, JSON data and performance-oriented clean ABAP.

This gave us lightning speed. Once we implemented the first project successfully, the majority of users' non-SAP implementation ideas turned towards SAP. We were delivering very fast and very robustly. New projects kept coming, and we had to reject the majority of them.

We were accepting late requests and new ideas in any phase of the project, based on our SCRUM planning. Everything was smooth and successful. A win-win for both sides.

 

Switching between the GET and POST methods:


There are several cases where the FILTER might get too long (especially in analytic reports) and you might hit the URL length limit. For these cases, switching from a GET method to a POST method is very easy; it is just a matter of moving the parameters from the URL to the request body.

Again SEGW is not impacted.

 

For GET:
ls_param-param1 = VALUE #( dt-it_filter[ property = `PARAM1` ]-select_options[ 1 ]-low OPTIONAL ).
ls_param-param2 = VALUE #( dt-it_filter[ property = `PARAM2` ]-select_options[ 1 ]-low OPTIONAL ).

For POST:
DATA: BEGIN OF ls_param,
        param1 TYPE t_member,
        param2 TYPE tt_context,
      END OF ls_param.

/ui2/cl_json=>deserialize( EXPORTING json = <params_from_odata_body>
                           CHANGING  data = ls_param ).

 

Distribution of the load:


There are several cases where we export multiple tables and chart data at the same time. This architecture gives you an immediate option to split the load whenever a sequential process runs into a performance issue. For example, 2 charts on the screen can be triggered in parallel (or all of them at once). It all depends on the scenario.

The 4 methods below are triggered at the same time once the user opens the homepage. I could combine them into 1 method exporting 4 internal tables, or split them into 4 methods where each method exports its own internal table. It is a design decision, and a performance decision for heavy services.
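
To give an idea of what such a split looks like on the ABAP side, the GET method of the homepage BADI can simply dispatch on DETAIL to one internal method per widget. This is only a sketch; the DETAIL values and the four helper methods are made-up private methods of the implementing class.

* Hypothetical GET implementation: DETAIL decides which method fills ER_DATA
METHOD zif_lean_odata~get.
  CASE is_data-detail.
    WHEN 'GET_ANNOUNCEMENTS'.
      get_announcements( IMPORTING er_data = er_data et_msg = et_msg ).
    WHEN 'GET_CALENDAR'.
      get_calendar( IMPORTING er_data = er_data et_msg = et_msg ).
    WHEN 'GET_CHART_DATA'.
      get_chart_data( IMPORTING er_data = er_data et_msg = et_msg ).
    WHEN 'GET_MENU'.
      get_menu( IMPORTING er_data = er_data et_msg = et_msg ).
  ENDCASE.
ENDMETHOD.

Splitting it this way lets the frontend fire the four calls independently and in parallel, which is exactly the load distribution described above.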


 

Logging everything:


The API handler layer below is the place where the BADI is found and the specific class/method is triggered based on the BADI filters. This gives us the advantage of logging everything about the service call: the body/parameters/URL, who triggered it, how many seconds it took, whether an error was raised, and anything else related to it.

At the end of the year we can produce some good statistical reports for users and prepare a usage report: which pages are used the most, average response time per endpoint, errors, failed requests and some other details.



 
logger->timer_start( ).

<BADI is triggered here>

logger->timer_end( ).

* log service details: user, time, etc.
logger->log_service_data( ).
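
For completeness, the kind of record that log_service_data( ) persists could look like the structure below. This is only a sketch based on the description above; the field names and types are my assumptions.

* Hypothetical log record: enough to build the usage statistics mentioned above
TYPES: BEGIN OF ty_service_log,
         uname     TYPE syuname,    " who triggered the call
         tmpl_id   TYPE string,     " which BADI filter
         detail    TYPE string,     " which method
         http_verb TYPE string,     " GET or POST
         payload   TYPE string,     " URL filters or request body
         duration  TYPE p LENGTH 10 DECIMALS 3, " seconds between timer_start/timer_end
         msg       TYPE string,     " serialized error messages, if any
         timestamp TYPE timestampl, " when the call happened
       END OF ty_service_log.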

 

Sample call using async/await


As you can see below, we just set TMPL_ID to define which BADI to trigger and DETAIL to define which method to trigger. The announcement ID is sent as a URL filter.
// SERVICE.JS
getAnnouncementDetail: function (iv_annoid) {
    var sPath = "/DATASet(TMPL_ID='APP1_ANNO',ACTION='GET',DETAIL='GET_ANNO_DETAIL')";
    var aFilterValues = ["ANNO_ID=" + iv_annoid];
    var params = {
        sModelName: LOCAL_MODELS.ANNOUNCEMENT_EDIT,
        aFilterValues: aFilterValues
    };

    return this.getODataAPI().read(sPath, params);
},

// controller.js
_GetAnnouncementDetail: async function (iv_annoid) {
    sap.ui.core.BusyIndicator.show(0);
    var oResponse = await this.Service.getAnnouncementDetail(iv_annoid);
    // some bindings here
    sap.ui.core.BusyIndicator.hide();
},

Conclusion:


I implemented this architecture on my first SAPUI5 project, improved it a bit later on, and started using it at my latest client as well. Since 2016 the service itself has not been touched at all, but new BADI implementations keep being added on the same engine. We saw the biggest advantage in 3 areas:

  • Development speed

  • Flexibility

  • Third party library integration


Similar to the above, I also implemented a file upload solution which works in a similar way, but I can write about that later.

Since 2018 I had been planning to write about this approach, and I am finally able to do it. I know many developers might disagree, but we loved it and are still using it without any issue.

Below is a sample screenshot from the latest project. I had to blur it, sorry.




  • Announcement section (where the user can upload and crop images and add HTML content)

  • Calendar control where they can add events from the admin screen

  • Some chart data related to activities.

  • Left menu (also coming as JSON from backend based on auth)


 


 

UPDATE:
Just a small update: Palantir, a big data company, recently published an article about REST API design. They basically say:

First, many clients don't need to differentiate between Create, Update, and Delete — they want an omnibus "set" operation.

which is similar to this approach. Yes, their article is about REST and this one is about OData, but with the approach above the result is very close to what Palantir describes. You can find the article below.

https://blog.palantir.com/rethinking-crud-for-rest-api-designs-a2a8287dc2af

 