How to Optimize Reporting Performance
We face many challenges in our BI projects when it comes to reporting performance. Users always expect outstanding performance from BI reports, no matter what we do in the backend (modeling and Query Designer). I am going to share some important tips and techniques that I have learned through my experience, including some standard rules of thumb.
1. It is recommended to use inclusions instead of exclusions wherever possible, as exclusions cannot use database indexes and will hurt performance.
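The effect is easy to demonstrate outside BW with any indexed database. A minimal sketch using Python's built-in sqlite3 (the table and column names are invented for illustration): an inclusion (`IN`) lets the optimizer seek the index, while an exclusion (`NOT IN`) forces a full table scan.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
con.execute("CREATE INDEX idx_region ON sales (region)")

# Inclusion: the optimizer can probe the index once per listed value.
inc_plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM sales WHERE region IN ('EU', 'US')"
).fetchall()

# Exclusion: every row must be checked, so the whole table is scanned.
exc_plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM sales WHERE region NOT IN ('EU', 'US')"
).fetchall()

print(inc_plan[0][3])  # e.g. "SEARCH sales USING INDEX idx_region (region=?)"
print(exc_plan[0][3])  # e.g. "SCAN sales"
```

The same principle applies to the BW database layer: a restriction the database can turn into index lookups is far cheaper than one it must evaluate row by row.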
2. It is recommended to suppress result rows wherever possible.
3. It is recommended to use SAP Exits wherever possible and to keep Customer Exits to a minimum.
4. SAP suggests that free characteristics in reports be limited to 8-10.
In RSRV checks, heavy free characteristic usage is flagged in red, indicating a very high impact on reports.
5. It is recommended to reduce the number of RKFs and CKFs in a query to as few as possible.
a. When a large number of RKFs and CKFs are included in a query, restriction, conditioning, and computation are performed for each of them during query execution.
b. This is very time consuming, and a high number of RKFs and CKFs can seriously hurt query performance.
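To see why the count matters, here is a toy model (plain Python, with invented data and RKF names) of what the OLAP processor conceptually does: each restricted key figure is its own filter-plus-aggregate pass over the result set, so the restriction work grows roughly linearly with the number of RKFs in the query.

```python
# Toy model: each "restricted key figure" = a restriction plus an aggregation.
rows = [
    {"region": "EU", "year": 2013, "amount": 100},
    {"region": "US", "year": 2013, "amount": 250},
    {"region": "EU", "year": 2012, "amount": 80},
]

# Hypothetical RKF definitions: name -> restriction predicate.
rkfs = {
    "sales_eu":   lambda r: r["region"] == "EU",
    "sales_2013": lambda r: r["year"] == 2013,
    "sales_us":   lambda r: r["region"] == "US",
}

# One filtered aggregation per RKF: ten RKFs mean roughly ten times
# the restriction work of one.
result = {name: sum(r["amount"] for r in rows if pred(r))
          for name, pred in rkfs.items()}

print(result)  # {'sales_eu': 180, 'sales_2013': 350, 'sales_us': 250}
```

This is only a sketch of the cost model, not of the actual OLAP implementation, but it shows why trimming unused RKFs and CKFs pays off directly at query runtime.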
6. It is recommended to define your aggregates in an optimized way, taking statistics (the BI Admin Cockpit should be in place) and query definitions into consideration.
a. That is, open your query definition and pick the fields actually used in the query when defining aggregates.
b. Delete unused aggregates.
c. If aggregates are too large, they degrade not only query performance but also loading performance, since roll-ups and attribute change runs take longer.
d. Make sure your aggregates show good valuation indicators, not poor ones.
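Conceptually, an aggregate is just a pre-summarized copy of the cube at the granularity a query needs. A minimal sketch (plain Python, invented data): rolling the detail up by region once, at load time, means a query by region never touches the detail records at all.

```python
from collections import defaultdict

# Detail-level "cube" data (invented for illustration):
# (region, material, amount)
detail = [
    ("EU", "M01", 100), ("EU", "M02", 50),
    ("US", "M01", 200), ("US", "M02", 25),
]

# Build the "aggregate": roll up to region level once, at load time.
aggregate = defaultdict(int)
for region, material, amount in detail:
    aggregate[region] += amount

# A query by region is now answered from 2 aggregate rows
# instead of 4 detail rows.
print(dict(aggregate))  # {'EU': 150, 'US': 225}
```

The trade-off in point c follows directly: the bigger the aggregate, the more rows must be rebuilt on every roll-up and attribute change run, which is why aggregates should contain only the characteristics the queries actually drill on.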
7. Archiving unused data with Near-Line Storage (NLS) is recommended. Benefits include:
a. Reduced online disk storage
b. Improved BW query performance
c. Increased data availability, as roll-ups, change runs, and backups take less time
d. Reduced hardware consumption during loading and querying
8. Reports should be designed on MultiProviders wherever possible.
a. You can use the 0INFOPROVIDER field to restrict queries to specific InfoProviders.
b. You can also maintain partitioning criteria in table RRKMULTIPROVHINT.
9. Logical and physical partitioning is recommended for better performance.
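The payoff of partitioning is pruning: a filtered query reads only the partitions that can possibly match. A rough sketch of logical partitioning by year (plain Python, invented data; in BW the "partitions" would be separate InfoCubes under a MultiProvider):

```python
# Logical partitioning: one "InfoCube" (here, a list of rows) per year.
partitions = {
    2012: [("EU", 80), ("US", 120)],
    2013: [("EU", 100), ("US", 250)],
    2014: [("EU", 90)],
}

def query_total(years):
    """Read only the partitions matching the filter; skip the rest."""
    scanned = [row for y in years for row in partitions.get(y, [])]
    return sum(amount for _, amount in scanned), len(scanned)

total, rows_read = query_total([2013])
print(total, rows_read)  # 350 2 -- the 2012 and 2014 partitions were never touched
```

Physical (database-level) partitioning achieves the same pruning inside a single InfoCube's fact table, typically on 0CALMONTH or 0FISCPER.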
10. Extensive use of Filters at Query level is recommended.
11. Select appropriate read mode settings: “H” for MultiProviders and “X” for InfoSets, in transaction RSDIPROP.
12. When reporting on InfoSets, consider the tips below:
a. Do not select all the fields that are part of the InfoSet definition; select only the fields you want to use in the query.
b. Select “Use Selection of Structure Elements” in RSRT–>Properties.
c. Do not use too many joins, as they cause long runtimes when fetching data.
d. You can set a “Limit value for large WHERE conditions” in RSCUSTV19, e.g. 300.
13. The performance of statistical reports can be improved by broadcasting query results to the cache, i.e. pre-filling it.
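Pre-filling works because the first (slow) execution pays the full OLAP cost and later requests are served from the cache. A minimal sketch of the idea (plain Python; the "broadcaster" here is just a loop over hypothetical query names, standing in for a scheduled Information Broadcasting job):

```python
import time

cache = {}

def run_query(name):
    """Simulate an expensive query; serve repeat requests from the cache."""
    if name in cache:
        return cache[name]          # warm: answered instantly
    time.sleep(0.05)                # cold: simulated OLAP work
    cache[name] = f"result of {name}"
    return cache[name]

# "Broadcast" step: pre-fill the cache off-hours, before users arrive.
for q in ["sales_overview", "stats_report"]:
    run_query(q)

t0 = time.perf_counter()
run_query("stats_report")           # user request: cache hit
elapsed = time.perf_counter() - t0
print(f"warm request took {elapsed * 1000:.2f} ms")
```

In BW the same pattern is set up by scheduling an Information Broadcasting setting with the "fill OLAP cache" distribution type after the nightly loads.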
14. Deletion of unused queries is recommended.
15. Deletion of temporary query views is recommended.
16. It is recommended to be careful when creating cell structures, as they cause long query runtimes and lead to performance degradation.
17. Program RSR_CACHE_RSRV_CHECK_ENTRIES can be scheduled to run on a regular basis to remove unused cache entries.
18. Make proper query read mode and cache mode settings in RSRT–>Properties. The recommended cache mode is 1 or 5.
It is always better to keep an eye on the above points while developing our business models and queries. I hope this blog helps you satisfy your users with good reporting performance.
Good points made......Suman.
Thanks for the feedback. 🙂
I guess Suman I did it already, it was 4 Stars. You know always there is something to work on.
You know always there is something to work on.
What do you mean by this? Thanks for the rating.
Nice information Suman...
Thank you Mishra 🙂
Very well documented, thanks for sharing 🙂 🙂
Thank you SG 🙂
I just found this.. This is really helpful. Thanks!
Amazed to know it is really helpful 🙂
very good info..! Thanks
Thank you Ravikanth Reddy Vinta for your feedback 🙂
very good document about improving the performance of a report.
Thank you so much for your kind feedback Verma 🙂
Thanks for sharing a significant document. It'll be very useful in real time..
Certainly it will be useful. Thanks for the feedback 🙂
Very helpful Information ...
Thanks for sharing
Thank you Ganesh for your valuable feedback 🙂
Very good information.... It will be useful while creating queries.. thanks for sharing 🙂
Thanks a lot for your cheerful comments Ganesh 🙂
Very nice blog.
Thank you Koti Rama 🙂
Suman sir, it is very useful to me.
Thank you, sir.
Please don't call me sir. Thanks for the feedback 🙂
It will really be very helpful for anyone in BI, as performance is one of the major issues everyone faces...
Thanks for sharing. Great effort. 🙂
Thanks for your noble words 🙂 Welcome..
Very Informative Suman, Keep up the good work.
Thanks for appreciating it 🙂 🙂
Thanks for sharing..
But my query has exception aggregation at the material level, and the volume of data is huge, so performance is very poor even for one day of data. A quick reply would be appreciated.
Hi Chandresh Patel ,
Exception aggregation is handled purely by the OLAP processor. Keep the material InfoObject in your aggregate; this should improve your report performance 😉
Very helpful information Suman. Thank you.
Thank you Gopi 🙂
Thanks Suman for the nice blog. Very much helpful.
Thank you vinod for your valuable feedback. 🙂
Thank you Suman. While I was working on a performance issue, I came across this document.
It's very useful for us.
This is my objective. Whoever comes across this blog can happily use it 🙂
Thanks for sharing your knowledge out of experience, really helpful 🙂
True most of the key points are focused.
Thanks for your cheerful comments 🙂
Performance of BEx queries is very important. Very handy document. Happy to read 🙂.
Glad to receive your wonderful feedback Saikat Pal 🙂 🙂 🙂 🙂 . You may check my other blogs as well in Bex..
Very useful information 🙂
Thank you Chandra Sekhar for your valuable feedback.
A very good & detailed explaination. Keep it up Suman....
Glad to receive your valuable feedback Aravind Nag M 🙂 🙂 🙂
Excellent man 🙂 .
Thanks Giri Prasad 🙂 for your wonderful feedback.
I have reports on huge MultiProviders, which include around 8 to 9 cubes. I thought the concept was to provide lots of query fields to the user and then let the user create their own reports, since with BOBJ 4.0, WebI has query stripping, which only retrieves fields when they are brought in, and that increases performance.
"SAP suggests free characteristics in reports should be limited to 8-10."
I was informed to keep everything out of the rows and put it in free characteristics, as this allows for better reporting instead of WebI trying to understand what belongs in rows and what belongs in columns.
Can you shed some light on this?
SAP suggests the best practices, but you can still keep more than 10 free characteristics in your queries. In one of our reports, I have put 60 free characteristics and the report performs nicely with aggregates. Make sure you keep the most important drill-down characteristics in your aggregates; this can improve performance.
Thanks for sharing such wonderful document. I feel its like "ALL IN ONE" dictionary for performance tuning.
Hi p bansi ,
Really amazed by your splendid comment 🙂 🙂 . Thank you!!
It's tough to touch on all the points in one blog... It's a really great effort....
Hi Rajeev Parimi ,
Yep!! It's true! 😎 Thanks for your comments.
Thanks a lot for the KT. All the points are so important.
It's helping me a lot, as I have just started with report development in SAP BEx.
Would you be willing to give me your inputs on an issue in preparing an inventory report?
Glad to receive your comment on my blog. 🙂 BEx is the most interesting and brainstorming tool for achieving wonders and gaining users' confidence. Your BEx reports should speak to the user community on your behalf. Sorry, I have not worked on MM reporting, but I will definitely come up with exciting inventory reporting in BEx whenever I get a chance to work on it.
I have read the document maybe a hundred times by now and just realized that did not thank you for sharing 🙂
I need to thank you too along with Suman for patiently helping me always
Rama has always been an inspiration as well.
Hi Yasemin ULUTURK ,
Thank you for the wonderful gesture 🙂 Really delighted to receive a comment from MoM 🙂 .
Being appreciated by the community is really nice and motivating, you deserve much more Suman...
Hi Yasemin ULUTURK,
Exactly!! I too feel the same. 🙂 Appreciation/Applause is the key driving factor to continue sharing our experiences. Thanks for mentioning/recognizing my efforts 🙂
Thanks for sharing useful information..!
Very useful, for beginners and more advanced users.
Thanks for sharing.
You are welcome bysani balaji 🙂 .
It's my pleasure to share my knowledge with the BEx Community Leslaw Piwowarski.
This document will also help when preparing for interviews 😉
This article had actually helped me for my presentation. Great work, Suman Chakravarthy K! Would be looking for more such articles in the future, as beginners like me need them dearly.
Thank you Sayan for your wonderful comments on my blog. You may check out my other blogs in SCN
Yes, I will. In fact, I already have. Thanks again. Keep composing such great articles, as it is really helpful for us.
Thanks for sharing Suman. Very useful information.
Good blog Suman!
I am curious about the second half of point #3 though:
Customer exits are normally used to calculate the variable values to be passed to queries. They do require a certain amount of time to calculate, but the code for these is usually not data-intensive. Also they're calculated once-per-query-run, not once-per-record. So the time spent on this would be negligible compared to the time taken by the system to query the InfoProviders and actually retrieve and process the data for the front-end. The only exceptions I can think of are if your query is trivially small, or the cust exit code is data-intensive, both of which are rare scenarios.
So again, I'm curious why Customer Exits would be a significant point in query optimization.
Good compilation of reporting performance.
Useful article. Thank you for sharing.