Expert Routine – why not (part 1)
This is a simple test to check whether it is worth using an expert routine or not.
In my example I’m using 2 DSOs, one a copy of the other, so the mapping is 1:1 for every field. First, let’s check the standard transformation:
1 minute 7 seconds for over 2.2 million records is not a bad result.
But let’s check if we can improve on it using an expert routine:
Simple ABAP code:
FIELD-SYMBOLS <source_fields> LIKE LINE OF SOURCE_PACKAGE.
DATA: RESULT_FIELDS LIKE LINE OF RESULT_PACKAGE,
      record TYPE rsarecord.

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
  MOVE-CORRESPONDING <source_fields> TO RESULT_FIELDS.
  record = record + 1.
  RESULT_FIELDS-record = record.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.
ENDLOOP.
16 seconds quicker…
To make the result more reliable, I repeated the loading a few times. Below you can find the final result:
The results are quite similar, so at this point it is hard to say whether an expert routine is worth using. So let’s try something more complex: the same approach, but for a DSO with 254 characteristics and over 4 million records.
To make the test more reliable, the number of parallel processes in the DTP settings is set to 1.
Loading using the standard transformation:
And with the expert routine:
As you can see, even with a 1:1 mapping you can save around 25 minutes, which is an improvement of over 30%.
But there are some disadvantages of using an expert routine:
1. Aggregation of key figures (Summation, Maximum, Minimum) is not maintained.
2. You cannot use the initial value for characteristics and key figures; overwrite mode is always applied to all involved InfoObjects.
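As a rough idea for the first limitation, key-figure summation can be emulated manually inside the routine. This is a hedged sketch only, not the author’s method: it relies on ABAP’s COLLECT statement, which treats the character-like fields of the work area as the key and sums the numeric fields into any existing line with the same key. The RECORD field has to be excluded from the summation and renumbered afterwards.

FIELD-SYMBOLS: <source_fields> LIKE LINE OF SOURCE_PACKAGE,
               <result>        LIKE LINE OF RESULT_PACKAGE.
DATA RESULT_FIELDS LIKE LINE OF RESULT_PACKAGE.

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
  MOVE-CORRESPONDING <source_fields> TO RESULT_FIELDS.
  CLEAR RESULT_FIELDS-record.      "otherwise RECORD would be summed too
  COLLECT RESULT_FIELDS INTO RESULT_PACKAGE.
ENDLOOP.

* Renumber the records, since RECORD was excluded from the COLLECT.
LOOP AT RESULT_PACKAGE ASSIGNING <result>.
  <result>-record = sy-tabix.
ENDLOOP.

Note that COLLECT only works if all non-key fields are purely numeric, which is normally the case for key figures in a DSO target structure.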
In the next part I’ll show you how you can work around this.