SAP Fiori performance: fetching 20,000 records in 22 seconds
I would like to share my recent SAP Fiori app implementation experience.
I came across a scenario where the inventory analysts wanted to fetch all materials, around 20,000 of them, along with their stock figures.
It sounds crazy, but there were business reasons behind this design, so we had to work with it.
SAP provides a standard app for this kind of scenario, but it required UI changes and tuning for such a huge volume of data, as described below.
We achieved fetching 20,000 records in 22 seconds in both the quality and production systems.
The UI had to support two column headers for total quantities, multi-column filters and sorting, variants, and so on.
It also had to open a popup for each table cell showing the unique values of that column.
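The unique-values popup can be fed by a small helper that deduplicates one column of the loaded rows. This is a minimal sketch in plain JavaScript; the helper name `getUniqueColumnValues` and the sample rows are illustrative, not part of the standard service.

```javascript
// Collect the distinct values of one column, e.g. to feed a
// filter popup opened from a table cell. Hypothetical helper.
function getUniqueColumnValues(aRows, sColumn) {
  const oSeen = new Set();
  const aUnique = [];
  for (const oRow of aRows) {
    const vValue = oRow[sColumn];
    if (!oSeen.has(vValue)) {
      oSeen.add(vValue);
      aUnique.push(vValue);
    }
  }
  return aUnique;
}

// Example rows, shaped like the entities the OData model returns:
const aRows = [
  { Material: "MAT-001", Plant: "1000" },
  { Material: "MAT-002", Plant: "1000" },
  { Material: "MAT-003", Plant: "2000" }
];
console.log(getUniqueColumnValues(aRows, "Plant")); // → ["1000", "2000"]
```

Because the values are computed from the rows already on the client, the popup opens without another round trip to the backend.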
When I first looked at this requirement, I had doubts about the performance.
I searched Google, but nothing was mentioned anywhere about performance when dealing with such a huge volume of data.
Most blogs say to minimize the data as much as possible.
I thought of using CDS views, but converting the logic of the standard OData service to CDS views was a big challenge given the time constraints.
I achieved this with small changes to the OData service:
- Created a new entity type with the required properties by extending the standard OData service.
- Maintained proper data types for the custom fields in the entity type.
There are three ways of calling an OData service from the app:
- A single OData call via oDataModel.read with $skip and $top, where $top is set to the maximum number of records.
- A batch read operation, where all OData calls are sent to the backend together and the callback function is triggered after all of them have executed.
- An initial OData call to $count to get the number of records, then OData calls in a loop, each fetching around 3,000 records. The ideal chunk size depends on the data; for us, 3,000 records per call worked well.
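The third option can be sketched as follows. The chunking itself is plain JavaScript; the entity set name `/StockSet`, the callback `fnAppendToTable`, and the exact read wiring are assumptions shown only in comments, since they depend on the concrete service.

```javascript
// Split a total record count into $skip/$top windows of a given
// chunk size (3,000 worked for our data; tune per scenario).
function buildChunks(iTotalCount, iChunkSize) {
  const aChunks = [];
  for (let iSkip = 0; iSkip < iTotalCount; iSkip += iChunkSize) {
    aChunks.push({
      skip: iSkip,
      top: Math.min(iChunkSize, iTotalCount - iSkip)
    });
  }
  return aChunks;
}

// 20,000 records in windows of 3,000 → 7 calls, the last fetching 2,000.

// Hypothetical wiring with sap.ui.model.odata.v2.ODataModel:
// read $count first, then fire one read per window.
/*
oDataModel.read("/StockSet/$count", {
  success: function (sCount) {
    buildChunks(Number(sCount), 3000).forEach(function (oChunk) {
      oDataModel.read("/StockSet", {
        urlParameters: { "$skip": String(oChunk.skip), "$top": String(oChunk.top) },
        success: fnAppendToTable
      });
    });
  }
});
*/
```

Because each window is an independent read, the browser can keep several requests in flight, and the table can render rows as each chunk arrives instead of waiting for all 20,000 records.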
I tried all three options, and the third one is the best fit for this kind of scenario.
So, friends, SAP Fiori can undoubtedly handle a huge volume of data.