Sriram Vijay R

When and how to write Start Routine

Every now and then we see a thread posted in the BW space seeking help with transformation routines.

Start routine, end routine, field routine, expert routine: which one to use, how to use it, when to use it? These questions can only be answered with respect to the logic we need to apply in a particular case.

I would like to share here how I see, approach and write a start routine.

Start Routine:

In a data transfer process (DTP), the start routine runs first, followed by the transformation rules and then the end routine.


In medium and complex transformations, we usually have a set of logic to implement. This includes exclusions, lookups, conversions, calculations and so on.

We should have a plan for what to write in the start routine, and that plan is shaped by the fact that the start routine runs first.

When to write:

Scenarios that are good candidates for a start routine are:

1. Deleting unwanted data.

Ex: You want to delete a record if its delivery flag is not set to 'X'. In this case you have to use a start routine.

2. Populating an internal table by selecting data from a DB table, to be used later in lookups.

Ex: In a schedule line DataSource, the currency field is not filled for some records, and you want to fill it with the company code currency. For this you have to look up the company code master. In the start routine you can fill an internal table with all the company codes and their currency details. The same can be done with transaction data as well.



3. Sorting the records. Subsequent transformation rules can then rely on the sort order.

Ex: In a goods movement DataSource, if you want to process your inward deliveries against the PO number chronologically, you can sort the source package in the start routine so that the transformation rules process the records in that order.

How to write:

Simple filter


DELETE SOURCE_PACKAGE WHERE /BIC/OIFLAG NE 'X'.

It is better to delete unwanted records in the start routine: they are then not processed unnecessarily in the subsequent steps, which reduces the data loading time.


Populating an internal table


* Guard the FOR ALL ENTRIES select: with an empty source
* package it would read the whole table.
IF SOURCE_PACKAGE IS NOT INITIAL.
  SELECT comp_code country currency
    FROM /bi0/pcomp_code
    INTO CORRESPONDING FIELDS OF TABLE it_compcd  " it_compcd: internal table
    FOR ALL ENTRIES IN SOURCE_PACKAGE
    WHERE comp_code = SOURCE_PACKAGE-bukrs.
ENDIF.


When you write a SELECT in a field routine, you are writing a SELECT inside a loop: for every iteration of the loop, the SELECT statement hits the DB table, which leads to performance problems. So it is better to write the SELECT statement in the start routine and fetch all potentially needed records from the DB table into an internal table.

This internal table can then be looked up with a READ TABLE statement.
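
A minimal sketch of such a lookup, assuming it_compcd was filled in the start routine as shown above (the work area wa_compcd and the currency target are illustrative):

* In the start routine, once per package, after the SELECT:
SORT it_compcd BY comp_code.

* In a field routine: in-memory lookup, no database access.
READ TABLE it_compcd INTO wa_compcd
     WITH KEY comp_code = SOURCE_FIELDS-bukrs
     BINARY SEARCH.
IF sy-subrc = 0.
  RESULT = wa_compcd-currency.
ENDIF.

Note that BINARY SEARCH is only correct on a table sorted by the key fields, which is why the SORT belongs in the start routine, done once rather than per record.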

Sort


SORT SOURCE_PACKAGE BY vendor createdon po_number.

This code sorts the source package by vendor, date and PO number, which means your oldest PO is processed first in the transformation rules.








Thanks!!!


      8 Comments
      Ethan Jewett

      Nice blog. I don't disagree with anything here, but I'll just add my pet issue 🙂 That is, please be careful to never use a routine if the goal can be accomplished using standard functionality. Using routines unnecessarily makes systems less maintainable, often creates performance problems, complicates the use of analysis tools (like data lineage tools), and makes it so that your transformation can't make use of new functionality like execution of transformations in HANA.

      In the case of the examples in this blog, there is a place for each of them, theoretically, but in the general case I'd say that

      1. Filters should be applied in DTPs using the standard filter functionality. These filters get pushed down to the database layer. If a routine is used to filter, all records in the source object are read out of the database and transferred to the application server. There might be a place for filters in the start routine in some scenarios, but it would be very rare.

      2. Generally, lookups should no longer be coded in routines. BW now supports both master data and DSO lookups as standard, and that functionality should be used. If you need to do some other type of lookup, then yes, use a routine, but be mindful of what you are giving up. Make sure to use a sorted table and binary search!
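
      For example (type and field names illustrative), declaring the lookup table as a SORTED table with a unique key makes every keyed READ an implicit binary search:

      TYPES: BEGIN OF ty_comp,
               comp_code TYPE /bi0/oicomp_code,
               currency  TYPE /bi0/oicurrency,
             END OF ty_comp.
      * SORTED + UNIQUE KEY: READ TABLE ... WITH TABLE KEY is
      * optimized automatically, no explicit BINARY SEARCH needed.
      DATA it_comp TYPE SORTED TABLE OF ty_comp
           WITH UNIQUE KEY comp_code.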

      3. Sorting is probably the best example here and the most common place where routines are truly needed. The BW team should look into standard logic for some of these processes so that I can start to complain about people using routines for them 😉

      Cheers,

      Ethan

      Suhas Karnik

      Let me add my caveats to this as well 🙂

      Agreed that routines must never be used if standard functionality is available due to maintainability and performance problems you mention. Having said that there are still several gaps in the standard offering where these routines still fit in.

      For instance, DTPs do provide a filtering functionality, but in most production systems any support person can change or remove these filters entirely. This can be a serious problem when the filter is required by the logic or if the target must only contain data fulfilling the filter criteria. We can't de-authorize support people from changing the filters because there are legitimate reasons for filter changes as well. In such cases, implementing these in TRFNs provides an additional layer of safety which is not as easily bypassable as DTP filters.

      Also, routine coded lookups can be replaced by the standard lookup feature only if we're talking about the trivial "get field X from DSO Y and fill up field X in the target" scenarios. But all too often there are scenarios where you need to get the latest/oldest X value, or some more complex lookup where the standard will not work.
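
      For instance (names hypothetical), the "latest value" case often boils down to a sort plus de-duplication that the standard lookup cannot express:

      * Keep only the most recent row per material:
      SORT it_prices BY material ASCENDING
                        valid_from DESCENDING.
      DELETE ADJACENT DUPLICATES FROM it_prices
             COMPARING material.
      * it_prices now has one (latest) row per material and can
      * be read like any simple lookup table.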

      Ethan Jewett

      Completely agreed. In fact, I was trying to figure out how to explain the DTP filter vs. Start routine filter "as a backup" scenario you just described and gave up. You described it much better. I'd hope that SAP will eventually add filter settings to transformations like they did for semantic group functionality, but until then we are stuck with routines as backups in many situations. Unfortunately, this is really bad in a BW-on-HANA system as IIRC, it makes it so none of the transformation logic can get pushed down to HANA. Makes me sad that there is not a better option. 🙂

      I would say though: Always consider if it is possible to remove the need for a routine through a different architecture approach. For example, if we need very complex lookup logic, maybe it makes sense to apply that logic somewhere else (like in another transformation) and then keep the lookup simple. This approach won't always make sense, but sometimes it can result in simpler, more maintainable BW systems.

      Cheers,

      Ethan

      Suhas Karnik

      Agree to that. Over-reliance on routines can create technical debt which the support guys end up paying.

      Sriram Vijay R
      Blog Post Author

      Hi Suhas/Ethan,

      Thanks for reading and spending time to write comments.

      Scenarios explained here are just examples.

      In any developed BW system, you will definitely see routines everywhere.

      We cannot avoid routines at least for now.

      I am not sure whether the DSO lookup gets data into an internal table or hits the DB every time. I have to check this.

      -Sriram

      Ethan Jewett

      My understanding is that it looks up into an internal table sorted by the lookup key (like the master data lookup), but I haven't checked it personally.


      Definitely understand that the scenarios are just examples. And it's a good blog, I think! Please continue sharing your knowledge and experience 🙂

      Deepak Salokhe

      Hi

      I have one query here. As mentioned in the blog above, the start routine can be used to populate an internal table from some transparent table, which is then read in the individual routines.

      However, I came across a How To document which specifically talks about using the end routine in such cases: populating the internal table and then copying the values to the individual fields in a loop.

      So my question here is whether the end routine should be preferred over the start routine for performance reasons. Can anyone provide some info about this?

      Regards,

      Deepak Salokhe

      Suhas Karnik

      Deepak, from a performance standpoint, there is no rule that says that populating the itab in Start routine (or the End routine) will be faster. If you fire the same select statement from the Start and the End routine, both will have similar performance.

      Generally, I try to do all my lookups (including the select) and business logic in the End routine and use the Start routine primarily for filtering out unnecessary data, deleting duplicates etc. By doing so, the lookups and logic work on smaller data sets. My intention is to ensure that the For-All-Entries table used in the WHERE clause is as small as possible - often that is in End routines, because the Start routine might have removed unwanted data. By doing that I also end up with all the lookup code in the same routine, so it's easy to eyeball the select and read statements together, which is nice.
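
      A sketch of that pattern (the DSO and field names are hypothetical):

      * End routine: FOR ALL ENTRIES runs on the already-filtered set.
      IF RESULT_PACKAGE IS NOT INITIAL.
        SELECT doc_number status
          FROM /bic/azstatus00
          INTO TABLE it_status
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE doc_number = RESULT_PACKAGE-doc_number.
        SORT it_status BY doc_number.
      ENDIF.

      LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
        READ TABLE it_status INTO wa_status
             WITH KEY doc_number = <result_fields>-doc_number
             BINARY SEARCH.
        IF sy-subrc = 0.
          <result_fields>-status = wa_status-status.
        ENDIF.
      ENDLOOP.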

      That is only a general guideline and a stylistic preference though, not a silver bullet of performance. In many cases I'm also unable to do the select in the End routine, for instance because the lookup fields are only present in the source package and not in the result package. Obviously the select must be done in the start routine in such cases.

      Ultimately it depends on what logic you have and on your stylistic preference. Just don't do SELECTs in a field routine though.