Have you seen the John Cusack movie High Fidelity?  (or even better, read Nick Hornby's book)  Like his character in the movie, I'm a fan of Top 5 lists.  Top 5 Dream Jobs; Top 5 Artists; Top 5 Side 1, Track 1 Songs; Top 5 Most Painful SQL Statements...  OK, maybe that last one wasn't in the movie, but if he had been a DBA or Basis Administrator, it would surely have made it onto the DVD release.

If you are trying to improve database performance for your SAP system, the approach that has been most successful for me over the years is to maintain a running Top 5 list, and always be looking for what you can knock off that list.  Since I'm running ERP 6.0 EhP4 on a DB2 LUW 9.1 platform, I'll show how to start building such a list there, using the DBA Cockpit.  Very similar procedures exist for all other databases.

Open transaction ST04, expand Performance (it is probably already expanded) and double-click on SQL Cache.  You can leave the default selection criteria in place and just click the green check mark.  This shows you the SQL statements that have been executing in your database, along with some key statistics about them.  (Note that there are multiple ways to reach each of these items; I'm only showing one typical way.)

By default, this view is sorted by Total Execution Time, which is not very helpful.  Often it matters less how much total time a SQL statement has spent executing than how much time it spends each time it executes.  So I typically sort descending by Avg Execution Time.  What I look for are the Top 5 worst offenders - these are the SQL statements that use the most time, and thus the most resources, on each execution.  These are the ones I'd like to address.
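The ranking itself is simple arithmetic: average execution time is total execution time divided by the number of executions.  Here is a minimal sketch of that Top 5 logic in Python (the statement texts and numbers below are invented for illustration, not real SQL cache data):

```python
# Rank cached SQL statements by average execution time, descending.
# The sample data is made up; in practice these figures come from the
# SQL Cache view in the DBA Cockpit.
statements = [
    {"stmt": "SELECT ... FROM BKPF ...", "total_s": 589_000, "execs": 7_451},
    {"stmt": "SELECT ... FROM VBAK ...", "total_s": 31_000,  "execs": 212},
    {"stmt": "UPDATE MARC ...",          "total_s": 48_000,  "execs": 96_000},
]

# Average time per execution is what actually identifies the offenders.
for s in statements:
    s["avg_s"] = s["total_s"] / s["execs"]

top5 = sorted(statements, key=lambda s: s["avg_s"], reverse=True)[:5]
for s in top5:
    print(f'{s["avg_s"]:8.2f}s avg  {s["execs"]:>7} execs  {s["stmt"]}')
```

Note how the UPDATE, despite a large total time, falls to the bottom: its per-execution cost is tiny, which is exactly why sorting by total time alone is misleading.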
 
In this example, I see that the first two SQL statements listed have average execution times of over 100 seconds each.  However, they have each been executed only a few hundred times (since the last restart).  So I'd like to look at the next one.  It has a slightly lower average execution time of 79 seconds, but it has been executed almost 7,500 times since startup, for a total of almost 590,000 seconds (more than 163 hours) of database time.  So, if I highlight that SQL statement and click on EXPLAIN, I can see exactly how the database is executing it.
 
As you can see, this SQL statement is doing an Index Scan of the primary key index, BKPF~0.  However, the details of the explain plan show me that the index scan is using only the MANDT column for range delimiting, and of course every record in this index contains the same value for MANDT.  The index scan then applies BELNR as a sargable predicate, which basically means that the query scans all rows matching the MANDT value and then filters out those that don't match BELNR, as opposed to using BELNR to narrow the index search itself.
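This behavior is not DB2-specific: a composite index can only be used for range delimiting on a contiguous leading prefix of its columns.  The sketch below reproduces the situation in miniature with SQLite from Python's standard library (the table is a simplified, invented stand-in borrowing the BKPF column names, not the real table).  The query plan shows that only the leading column is usable as a seek predicate when the second key column is missing from the WHERE clause:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Simplified stand-in for BKPF: the primary key mimics BKPF~0
# (MANDT, BUKRS, BELNR, GJAHR).
con.execute("""
    CREATE TABLE bkpf (
        mandt TEXT, bukrs TEXT, belnr TEXT, gjahr TEXT,
        PRIMARY KEY (mandt, bukrs, belnr, gjahr)
    )
""")

# The query supplies MANDT and BELNR but not BUKRS, so the index can
# only seek on mandt; belnr is merely filtered during the scan.
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM bkpf WHERE mandt = '100' AND belnr = '0000000001'"
).fetchall()
print(plan[-1][-1])  # the seek predicate list contains mandt only
```

The gap at BUKRS breaks the prefix, so BELNR cannot participate in delimiting the scan range, which is exactly the pattern the DB2 explain plan revealed above.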

So, what can we do?  Well, we have this nifty tool called the Index Advisor.  So, going back to the previous screenshot, highlight the SQL statement, then click on the Index Advisor button, and then click the button for "Recommend Indexes."  What you will see is something like this:
 
This has now come up with a suggestion for a new index.  It is a non-unique index based on 14 columns (wow!), of which you can see the first two are BELNR and MANDT.  The nice thing about the Index Advisor is that we can also see what effect this index would have on our query.  To do this, we click the EXPLAIN button below, making sure that recommended indexes are evaluated.
 
Now it chugs for a minute, then comes up with a new explain plan, based on this new simulated index:
 
And we can see that it has chosen to use the new index; it doesn't need to go to the table at all, because all of the selected column data is in the index, and, according to the explain plan, our cost has been reduced considerably.  Although this does not directly predict the effect on the runtime of this particular query, it does give an idea of the improvement, which can be validated with testing.
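Both effects - seeking on the filter columns and covering the select list so the table is never touched - can again be demonstrated in miniature with SQLite.  The index below is a hypothetical two-column version of the advisor's suggestion, leading with BELNR, on the same simplified, invented schema as before:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE bkpf (
        mandt TEXT, bukrs TEXT, belnr TEXT, gjahr TEXT,
        PRIMARY KEY (mandt, bukrs, belnr, gjahr)
    )
""")
# Hypothetical advisor-style index, leading with the selective column.
con.execute("CREATE INDEX bkpf_z1 ON bkpf (belnr, mandt)")

plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT belnr FROM bkpf WHERE mandt = '100' AND belnr = '0000000001'"
).fetchall()
detail = plan[-1][-1]
# Both columns are now seek predicates, and since every column the query
# needs is in the index, the plan reports a covering index access.
print(detail)
```

With both WHERE columns forming a contiguous prefix of the new index, the scan range collapses to the matching entries, and because the select list is satisfied from the index alone there are no table fetches at all - the same two wins the simulated DB2 plan showed.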

There are other considerations, of course.  For example, you want to know ahead of time how many indexes a table already has, as each new index will, to some degree, slow down insert, update and delete operations.  And, as always, the recommendations from the Index Advisor should be evaluated and tested by experienced database professionals.  But at least this gives you a starting point to evaluate and target database performance issues.
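Checking how many indexes a table already carries is easy to script.  In DB2 you would query the SYSCAT.INDEXES catalog view; the sketch below shows the same idea against the SQLite stand-in table, where PRAGMA index_list enumerates a table's indexes (including the implicit primary key index):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE bkpf (
        mandt TEXT, bukrs TEXT, belnr TEXT, gjahr TEXT,
        PRIMARY KEY (mandt, bukrs, belnr, gjahr)
    )
""")
con.execute("CREATE INDEX bkpf_z1 ON bkpf (belnr, mandt)")

# Every one of these structures must be maintained on each INSERT,
# UPDATE and DELETE against the table - that is the write-side cost
# of adding indexes.
indexes = con.execute("PRAGMA index_list('bkpf')").fetchall()
names = [row[1] for row in indexes]
for name in names:
    print(name)
print(f"{len(names)} indexes on bkpf")
```

Taking this inventory before accepting an advisor recommendation keeps you from quietly stacking a sixth or seventh index onto a heavily written table.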

I can tell you from past experience that this can make a large impact.  I had one personal experience where adding two indexes at once reduced overall database reads by 50%.  While those results are not typical (your mileage may vary, as they say), tuning is a continual process and is virtually guaranteed to net positive results if done in a logical, methodical manner.

So get started on your Top 5 list today!