
Expert Routine – why not? (1/2)


This is a simple test of whether it is worth using an expert routine or not.
In my example I'm using two DSOs, one a copy of the other, so the mapping is 1:1 for every field. First, let's check the standard transformation:


And then the loading time (screenshot: standardload_122109.jpg):

1 minute 7 seconds for over 2.2 million records is not a bad result.
But let's check whether we can improve it using an expert routine:


Simple ABAP code (shown here with the loop closed, the source fields copied over, and the result row appended):

clear record.
loop at SOURCE_PACKAGE ASSIGNING <source_fields>.
  record = record + 1.
  move-corresponding <source_fields> to RESULT_FIELDS.
  RESULT_FIELDS-record = record.
  append RESULT_FIELDS to RESULT_PACKAGE.
endloop.

And the test result (screenshot: Expertload.jpg):

16 seconds quicker…

To make the result more reliable I repeated the load a few times. Below you can find the final results:

Counter  Standard  Expert routine
1        1:07      0:51
2        1:08      1:01
3        1:01      0:57
4        1:04      0:54

The results are quite similar, so at this point it's really hard to say whether it is worth using an expert routine. So let's try something more complex. The same approach, but for a DSO with 254 characteristics and over 4 million records:

To make the test more reliable, the Parallel Processing setting in the DTP is set to 1 process.

Test result:
Loading using standard transformation:


And with Expert routine:


As you can see, even with 1:1 mapping you can save around 25 minutes, which is over a 30% improvement.

But there are some disadvantages of using an expert routine:
1. Aggregation of key figures (summation, maximum, minimum) is not maintained.
2. You cannot use the initial value for characteristics and key figures; all involved InfoObjects are always in overwrite mode.
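
To illustrate the first point: since aggregation is not maintained, an expert routine has to sum key figures itself if summation is needed. The sketch below is one possible way to mimic it, not a finished implementation; it assumes that the character-type fields of the result structure are exactly the semantic key of the target, and the handling of the record counter is my own addition. COLLECT adds the numeric fields of the work area to an existing table row whose character-type fields match, or appends a new row otherwise:

```abap
" Sketch only: mimic key-figure summation inside an expert routine.
data lt_sum like standard table of RESULT_FIELDS.

loop at SOURCE_PACKAGE ASSIGNING <source_fields>.
  move-corresponding <source_fields> to RESULT_FIELDS.
  " clear the numeric record counter so COLLECT does not sum it too
  clear RESULT_FIELDS-record.
  " COLLECT sums all numeric fields over identical character-type fields
  collect RESULT_FIELDS into lt_sum.
endloop.

" renumber the aggregated rows and hand them over
loop at lt_sum into RESULT_FIELDS.
  RESULT_FIELDS-record = sy-tabix.
  append RESULT_FIELDS to RESULT_PACKAGE.
endloop.
```

Note that COLLECT treats every character-type component of the structure as part of the key, so this only behaves like standard summation when those fields really are the semantic key.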

In the next part I'll show you how to work around this.

  • Sorry, but I must disagree. I would never use any routine in BW if you can accomplish it with straight field mappings. I don't care if it saves a few seconds or minutes of performance. I'm speaking from a support perspective.

    The problem with ABAP routines in BW is the documentation and maintenance.  You are needlessly complicating any future problem analysis on the transformation when you have to dig through lines of code to see what is being done in the transformation, especially when it can be displayed easily in the field mappings.

    It is difficult enough to have to dig through some of the complex start and end routines I've seen done by other developers, without also having to hunt for the field mappings in the code.

    Only use routines when necessary; for expert routines, that would only be when you want to access both the SOURCE_PACKAGE and RESULT_PACKAGE fields in the same routine.

    Just my opinion.

    • I am not trying to convince anybody to use an expert routine instead of a standard transformation for 1:1 mapping. I would never use it in practice either. I have used expert routines in many cases, and for really complex ABAP I was able to reduce loading time by as much as 70-80%; here I was wondering if I could make any improvement even in simple 1:1 mapping, and I can... And if you get better performance for straight mapping, you can save much, much more time with complex logic.
      Even from a support point of view, it is much easier when debugging to have the whole code in one place than when the code is split between a start routine, rules for every separate InfoObject, and an end routine.
      People are afraid of expert routines and hardly ever use them. One of the reasons is, as you mentioned, "support", and the explanation that it is too difficult for support to understand. In fact it is not, if you know the basics of ABAP... This is also what I was trying to show in my blog.
      But I respect your opinion, and as I already mentioned, I am not trying to convince anybody to use an expert routine for 1:1 mapping.
  • Hello,

    it's pretty clear that standard routines normally have an overhead, BUT in your showcase you show a transformation without any transformation. I can imagine NO context in which this makes ANY sense.



    • I don't want to convince anybody to use an expert routine for straight mapping. I wanted to show in my blog how simple it is and what a huge impact it has on performance during data loading, especially for large amounts of data and complex ABAP.
      • Hello,

        maybe I was too fast.

        One improvement to the coding: instead of MOVE-CORRESPONDING, use direct assignment of the fields.

        Another thought I want to share is the following:

        encapsulation in transformations

        In BW you often have code snippets which do pretty much the same thing. Encapsulating these code snippets in global classes and re-using them would boost the maintainability and reliability of your system.

        All the best

        All the best


      • Hello,

        in order to improve performance, make a direct assignment between source and result fields, because of the cost of MOVE-CORRESPONDING.

        Additionally, I have often seen similar code snippets across BW. I personally believe that encapsulation into global classes would significantly improve maintainability.

        All the best,


  • First, I develop ABAP code (and other code); second, our BI/BW analysts rely on us only when they have long-running jobs. ONLY long-running jobs.

    Our BI/BW team does write ABAP code. The reason is that our mappings can become very complex. Now I'll have a BI/BW person read this blog when he gets back from vacation. He could probably explain why something similar to this is used when doing loads.

    He writes the code. When the code has issues, the developers help with maintenance. So maintenance is more complex: there is a need for another resource, things can change, and no, he doesn't use it for simple mappings.

    Maybe we are not a normal shop, but even half an hour is a GREAT saving. We have a lot of jobs running overnight. We need them to complete for the reports in the morning.

    This example is simple.  But it is easy to follow, and perhaps a start for a person with no development experience.

    In other words - nice blog from my point of view.  I'll have our BI guy check it out so I can get his point of view too!

    Looking forward to part 2.


  • Hi,

    Nice blog and indeed there are some advantages/disadvantages that people have noticed.

    However, I recently heard from SAP that Expert Routines do not work with SAP HANA... That would be the biggest obstacle to using expert routines (which I do use as often as I can/need).


    • It's really interesting what you are saying. What will happen to expert routines during an upgrade from 7.0 to 7.3? Do you know?


      • Hi Dariusz,

        I have not worked on 7.3 yet. I got the information from SAP Switzerland.

        I guess that expert routines still work for data loading (legal reasons with upgrades to release+1 without regression) but seem not to be eligible for HANA.

        Too bad, but to be confirmed... Anyone from SAP, please?


  • You do not need to waste time proving that a lean expert routine is faster than a transformation which has more additional code. I would not compromise transformation clarity for the sake of a minor performance gain. Can you share something more practical and helpful from your Senior BW Consultant experience?
  • Good insight into how useful an Expert Routine can be... I've never used one before. I will keep this in mind and use it for one of our many daily FULL loads when the data grows substantially and affects loading performance.

    Don't worry about the insecure guys who posted pessimistic comments, who are afraid of knowing/learning or trying something different if the need strikes..

    • Ahhhh...

      But that is what SCN is all about.  Writing your own opinion.  Being open enough to debate the comments.  Sadly some of the comments here are not debating comments.  They are simply negative comments.

      So I agree, don't let that stop you. But don't stop commenting either. Adding examples and reasons would make your comment more substantial, and would help the blogger learn. Help others reading this learn...

      It is a circle. We all have different thoughts we can add. There is a difference between constructive negative feedback and simply negative feedback.

      Enjoy!  I learn a lot from the comments to my blogs.  And you were brave enough to write one.  They were brave enough to comment.


  • I think this is quite a nice thing.

    I had never thought about expert routines until I stumbled upon one made by a consultant and said to myself: nice, a good overview without clicking around.

    Then I forgot about it and started to build up the prop. layer for 2LIS. We decided to fully expand the structure in order not to have to initialize it with every new field.

    Now I ask all you sceptics:
    - where is the clarity advantage of 250 1:1 connections?
    - who wants to maintain them, reeling up and down, searching?

    This fricking mapping table is a complete brainchild of developers who never had to work with their products on an operational basis. Like so many functionalities in BW.

    Having that neat little piece of code is the ultimate clarity and maintainability. Added to that, when loading a full-blown EDWH from FI over MM to SD, 30-minute savings per DataSource are a gift, not peanuts.


  • Dariusz, in your 2nd set of tests using the standard transformation, you have Error Handling turned on (the run where you obtained 1H 21m 55s), whereas for the expert routine you have Error Handling turned off.

    The error handling causes an overhead and slows down performance.

    So your test sets are biased and cannot be used for comparison.

    • You are saying that 20% of the upload time is error handling?

      I no longer have access to that system, so I couldn't repeat the test. But I did test this in a live system, running the same DTP with and without error handling: there is not a big difference with a 1:1 mapping transformation. Please let me know if you have a different experience.