
Enhancing Standard Transactions efficiently!

For most ABAPers, enhancing SAP standard transactions usually involves three steps:

  1. Find the right place in the standard application to write the code.
  2. Develop the code to meet the requirement.
  3. Test the functionality.

Step 1 has become easier with the inception of the new enhancement framework. But be careful while using it: it is like an open wire circuit, and if you do not play safe you may end up with a short circuit down the line. My intention is in no way to discourage you from using implicit enhancements, but to remind you of what Uncle Ben said to Peter Parker: “With great power comes great responsibility”.

Now, coming back to the agenda of this blog: let’s assume you have chosen the place for your custom code very carefully, and that it could be any of a user exit, a BAdI, a BTE, or an implicit enhancement. Now you just have to write the code (step 2) to meet the requirement and you are done.

Now take a pause here and ask yourself: “Am I really done, or am I missing something?” Most ABAPers would say they just need to complete the third and last step, testing the functionality, and they are done. They might even be right, if there are no database fetches in their custom code. But what if your enhancement does contain database fetches? A common reaction would be: “How does it matter? I have written an optimal select query using a primary/secondary index, I am fetching only the required fields, and I have no duplicate fetches.” That would be a job well done for a standalone program, but for coding an enhancement you really need to walk an extra mile: this extra bit of functionality loads the transaction with additional database fetches, and if your enhancement is triggered multiple times during a single transaction, the situation gets even worse.

For example, suppose you are enhancing sales order processing (VA01/VA02) and your enhancement sits at line-item level, meaning it is triggered on every line-item change. Now imagine a production scenario in a large enterprise system where the average order size is 50–60 line items, and over a period of time you have coded 20 select statements to meet different business requirements. Even though none of your selects are duplicates, they get triggered approximately 1,200 times (20 selects × 60 line items) in a single business transaction, which is a huge load on the system and is going to hurt it badly in a production environment.

I have seen projects where approximately 70 enhancements were added to a business transaction for different business requirements over a period of time, with selects on various master data tables, configuration tables, and custom tables, and they slowed down transaction performance by 60% to 80%, which is not acceptable in any way.

So now the question is: how do we take care of this performance issue? You really need to analyze whether the select you coded on a master data or customizing table is required at all; based on my experience, I can tell you that in more than 95% of cases you will find the data is already available for you in some function group’s global memory or in the SAP table buffer. Perform a runtime analysis (transaction SE30) of the standard transaction you want to enhance and identify the standard SAP FMs used for the master data / customizing table fetches.

These FMs use the function group’s global memory to store previously fetched values, so when you call them in your enhancement you are not doing a database fetch but just reading from that memory. In most cases such a buffer FM has the table name in its own name, so you can easily spot it in the SE30 hit list. Each such FM also comes in two flavors belonging to the same function group: one for fetching a single record and one for fetching multiple records. For example, to read material master data from MARA in sales order processing, you can use buffer FM MARA_SINGLE_READ for a single record or MARA_ARRAY_READ to read multiple materials at a time.
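As a sketch, calling the buffer FM instead of a direct SELECT on MARA could look like the snippet below. The data declarations and the source of lv_matnr are illustrative, and FM interfaces can vary by release, so verify the exact signature in SE37 first:

```abap
DATA: lv_matnr TYPE matnr,      " material number from the enhancement context
      ls_mara  TYPE mara.

* Instead of: SELECT SINGLE * FROM mara INTO ls_mara WHERE matnr = lv_matnr.
CALL FUNCTION 'MARA_SINGLE_READ'
  EXPORTING
    matnr             = lv_matnr
  IMPORTING
    wmara             = ls_mara
  EXCEPTIONS
    lock_on_material  = 1
    lock_system_error = 2
    wrong_call        = 3
    not_found         = 4
    OTHERS            = 5.

IF sy-subrc = 0.
  " ls_mara is served from the function group's global memory if the
  " material was already read earlier in the transaction - no DB fetch.
ENDIF.
```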

Based on my experience, below is a list of master / config tables and what can be done to avoid the DB fetch in an enhancement during sales order processing:

Master / config table → how to avoid the DB fetch:

  • MARA (material master, general data): use FM MARA_SINGLE_READ for a single record or MARA_ARRAY_READ for multiple records.
  • MARC (material plant data): use FM MARC_SINGLE_READ for a single record or MARC_ARRAY_READ for multiple records.
  • T001W (plant master): use FM T001W_SINGLE_READ.
  • ADRC (address master data): use FM ADDR_SELECT_ADRC_SINGLE.
  • KNVV (customer sales area data): use FM KNVV_SINGLE_READ.
  • KNA1 (customer master, general data): use FM V_KNA1_SINGLE_READ.
  • KNVP (customer master partner functions): use FM SD_KNVP_READ.
  • TVEP (sales document: schedule line categories): generic-area buffered table, so use CLIENT SPECIFIED and include MANDT along with the key fields in the WHERE clause to utilize the SAP buffer.
  • TVEPZ (sales document: schedule line category determination): generic-area buffered table; same approach as TVEP.
  • TPRIO (customers: delivery priorities): generic-area buffered table; same approach as TVEP.
  • KNMT (customer-material info record data): use FM RV_CUSTOMER_MATERIAL_READ.
  • MARM (units of measure for material): use FM MARM_GENERIC_READ_WITH_MATNR.
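For the generically buffered tables in the list, a select along the lines below should be served from the SAP table buffer. It is shown for TVEP; the key field ETTYP and the variable lv_ettyp are assumptions on my part, so verify the table key in SE11 first:

```abap
DATA: ls_tvep  TYPE tvep,
      lv_ettyp TYPE vbep-ettyp.  " schedule line category from the context

* CLIENT SPECIFIED with MANDT in the WHERE clause, plus the remaining
* key fields, as described above for generic-area buffered tables.
SELECT SINGLE * FROM tvep CLIENT SPECIFIED
  INTO ls_tvep
  WHERE mandt = sy-mandt
    AND ettyp = lv_ettyp.
```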

It is evident from the above list that for most master data and config tables you will either find a call to a buffer FM in the transaction processing or the table itself is buffered. Now it’s up to you to use this information to write a DB-fetch-free enhancement.

So far I have discussed standard tables only, but there can be instances where you need to write a select on a custom table in an enhancement. In that case, you should consider the points below:

  1. If the custom table can be categorized as a config table, it makes a lot of sense to switch on table buffering and utilize the SAP buffer while fetching data.
  2. If the custom table can be categorized as a master data table, it is a good idea to create buffer FMs for single and multiple record reads, following the standard SAP buffer FM approach, and to call these FMs instead of coding direct selects in the enhancement.
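A minimal sketch of point 2, modeled on the SAP buffer FM pattern: a function-group global internal table caches records already fetched during the transaction. The table ZCUST_MASTER, its key field KUNNR, and the FM name are all hypothetical:

```abap
* Global data in the function group's TOP include: the buffer itself.
DATA: gt_zcust_buffer TYPE HASHED TABLE OF zcust_master
                      WITH UNIQUE KEY kunnr.

FUNCTION z_zcust_master_single_read.
*"  IMPORTING iv_kunnr TYPE kunnr
*"  EXPORTING es_data  TYPE zcust_master
*"  EXCEPTIONS not_found

  " Serve the record from the function group buffer if already read.
  READ TABLE gt_zcust_buffer INTO es_data
       WITH TABLE KEY kunnr = iv_kunnr.
  IF sy-subrc <> 0.
    " First access in this transaction: one DB fetch, then cache it.
    SELECT SINGLE * FROM zcust_master
      INTO es_data
      WHERE kunnr = iv_kunnr.
    IF sy-subrc <> 0.
      RAISE not_found.
    ENDIF.
    INSERT es_data INTO TABLE gt_zcust_buffer.
  ENDIF.
ENDFUNCTION.
```

The same pattern extends to an array-read FM that collects the not-yet-buffered keys, fetches them with one FOR ALL ENTRIES select, and merges the result into the buffer.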

So from now onwards, enhancing the SAP standard should be a four-step process for you:

  1. Find the right place in the standard application to write the code.
  2. Analyze the business transaction’s SE30 hit list to identify buffer FMs that avoid direct DB fetches, and check the buffer settings of the tables involved to properly utilize the SAP buffer.
  3. Develop the code to meet the requirement.
  4. Test the functionality.

All the views provided here are my personal opinions, based on my experience developing enhancements in ABAP, and do not necessarily reflect my employer’s. Please treat the inputs in this blog as just opinions; you should definitely seek professional expertise before making your business decisions.

  • In my company, years before the enhancement framework came along, we worked out that using the standard SAP read modules in user exits and even in IDoc function modules would remove database access totally. The last step, which is never going to happen, is for SAP to standardise the usage of such function modules in standard SAP code. For example, do a trace on VA01 and you will see different “buffering” function modules used to get data from KNA1, plus a direct read or two, all from standard SAP code.
    • Agree with you on this; quite often we find different access methods used in standard code for the same table at different places in a transaction, but I think we will have to live with this limitation until SAP does something about it.
    • Yes, you are right, Paul, it will access the table buffer. The reason I mentioned buffer FM T001W_SINGLE_READ is that VA01 calls this FM, and using it avoids multiple reads from the table buffer for the same plant value (which is better for line-item enhancements).