
Usage of generic extraction

In every BW installation there is always a requirement to get custom fields into SAP BW, and the tool people often resort to is the generic extractor. Here are some of the things I found useful when creating generic extractors.

I have tried to summarize the main points - or gotchas, as they are usually called. In many cases, because considerations like these are overlooked, generic extraction becomes something of a stepchild, and the ETL and transformation logic in SAP BW becomes more complicated than it needs to be.

Which type of generic extractor to use?

This is a vital question. The options available to you are, of course, the usual suspects: view, InfoSet query, or function module.

First, examine the data you want to extract and understand both its nature and that of the data target in SAP BW. Some of the questions to ask:

1. Load to a DSO or a cube? (Master data can be thought of as a DSO for now.)
This is vital since it tells you the nature of the records being extracted, and future reconciliation also has to be taken into account.

2. Data volumes being extracted
If your data volumes are very high - for example, if you want to extract from the document flow table VBFA in your R/3 system - make sure that you have selection conditions on the keys; otherwise your extractor is going to bring in huge data quantities on the first load, which you will have to anticipate and prepare for.
Also, if the data volumes are high but you need only a small part of the data, consider putting selection conditions into the view for better performance and ease of use (a sketch of what such a restriction amounts to follows after this list).


3. Can this also be achieved by enhancing an existing extractor?
This is a persistent question which always merits some attention. It is generally better to enhance an existing Business Content extractor instead of creating your own, because the coding involved is, IMHO, easier than building another generic extractor from scratch (a rough sketch of the typical enhancement pattern also follows below). However, some Business Content extractors carry notes which clearly state that they should not be enhanced - I have come across some CRM extractors with this restriction.
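
Regarding point 2, here is a minimal sketch of what such a restriction amounts to at the database level. The date cutoff, document category and package size are assumptions purely for illustration, and in practice you would maintain the restriction as a selection condition in the view or as selection fields on the DataSource rather than code it yourself.

* Illustration only: the effect of restricting a VBFA extraction by
* creation date and document category instead of pulling the whole
* document flow table in one go.
DATA: lt_vbfa TYPE STANDARD TABLE OF vbfa.

SELECT * FROM vbfa
         INTO TABLE lt_vbfa
         PACKAGE SIZE 50000
         WHERE erdat   >= '20080101'    "assumed creation-date cutoff
           AND vbtyp_n  = 'M'.          "assumed: only follow-on invoices
  "each package of 50,000 records would be passed on for processing here
ENDSELECT.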
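
And for point 3, a rough sketch of why enhancing Business Content is often the easier coding exercise: the classic pattern in the customer exit include ZXRSAU01 (function exit EXIT_SAPLRSAP_001 for transaction data). The DataSource chosen, the appended field ZZCUSTFIELD and the lookup table ZCUSTTAB are assumptions for illustration only.

* Sketch of a Business Content enhancement in include ZXRSAU01.
* ZZCUSTFIELD / ZCUSTTAB are hypothetical names; the custom field is
* assumed to be appended to the extract structure MC11VA0ITM.
DATA: l_s_item TYPE mc11va0itm.        "extract structure of 2LIS_11_VAITM

CASE i_datasource.
  WHEN '2LIS_11_VAITM'.
    LOOP AT c_t_data INTO l_s_item.
      "look up the custom field for the sales document item
      SELECT SINGLE zzcustfield FROM zcusttab
             INTO l_s_item-zzcustfield
             WHERE vbeln = l_s_item-vbeln
               AND posnr = l_s_item-posnr.
      MODIFY c_t_data FROM l_s_item.
    ENDLOOP.
ENDCASE.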


Delta Functionality:


It is always better to test the delta functionality thoroughly. The usual suspects for the delta field are the created-on and changed-on dates, but in many cases the changed-on field is not filled when a record is newly created. Make sure that the delta logic is complete and that all new and changed records are pulled into BW.
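
To make the gotcha concrete, here is a minimal sketch of the fallback this usually boils down to; the field and structure names are assumptions, and, as discussed further down under complex delta logic, combining two date fields like this is only possible with a function module extractor.

* Derive the delta-relevant date per record: take changed-on if it is
* filled, otherwise fall back to created-on. Names are illustrative.
TYPES: BEGIN OF ty_doc,
         docno      TYPE vbeln,
         erdat      TYPE erdat,    "created on
         aedat      TYPE aedat,    "changed on
         delta_date TYPE d,        "date the delta selection runs on
       END OF ty_doc.

DATA: lt_docs TYPE STANDARD TABLE OF ty_doc.

FIELD-SYMBOLS: <ls_doc> TYPE ty_doc.

LOOP AT lt_docs ASSIGNING <ls_doc>.
  IF <ls_doc>-aedat IS NOT INITIAL.
    <ls_doc>-delta_date = <ls_doc>-aedat.
  ELSE.
    <ls_doc>-delta_date = <ls_doc>-erdat.
  ENDIF.
ENDLOOP.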


Safety Limits:

The safety limits determine the amount of overlap between consecutive deltas. The upper and lower safety intervals should be taken advantage of, especially if the extractor loads to a DSO, where the re-extracted records simply arrive with the status new and are overwritten. For a cube, however, if you have additive deltas, test the delta fields before relying on them, since overlapping records would be counted twice.

Other delta fields, such as a timestamp or a record (numeric) pointer, also have to be tested thoroughly before use. The record pointer works in many cases - the CATS table for timesheets in ESS, for example, is handled this way, among others - but be sure of the application logic before proceeding.
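
To make the overlap concrete, here is a rough sketch of how the selection window of a timestamp-based generic delta is derived from the safety intervals; the interval values are assumptions, and in a real DataSource they are simply maintained in the generic delta settings in RSO2 rather than coded.

* Illustration of the delta window with safety intervals (values assumed).
DATA: lv_last_delta TYPE timestamp,
      lv_now        TYPE timestamp,
      lv_from       TYPE timestamp,
      lv_to         TYPE timestamp,
      lv_lower      TYPE i VALUE 300,   "lower safety interval: 5 minutes
      lv_upper      TYPE i VALUE 60.    "upper safety interval: 1 minute

lv_last_delta = '20080101120000'.       "pointer saved by the previous delta
GET TIME STAMP FIELD lv_now.

* The next delta reads records changed between
*   <last pointer> - <lower interval>  and  <now> - <upper interval>.
* The lower interval re-reads records already extracted last time: harmless
* for a DSO (overwrite), but double counting for a cube with additive deltas.
lv_from = cl_abap_tstmp=>subtractsecs( tstmp = lv_last_delta secs = lv_lower ).
lv_to   = cl_abap_tstmp=>subtractsecs( tstmp = lv_now        secs = lv_upper ).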

Testing the extractor:
Testing the extractor for delta is tough, since RSA3 is hard to use for delta testing - the delta is specific to the target system. Provide the target system details before testing a delta load in RSA3.

Complex delta logic:

This is very common when you have to identify changed records. A typical example is a table that has both changed-on and created-on fields, and the delta logic is:

<Delta field> = <Changed on> if <Changed on> is not blank, else <Created on>

This, however, cannot be achieved using standard delta management. For any delta that uses multiple fields to determine changed records, you would have to use a function module extractor. I will explain function module extractors, and an easy way of writing them, in a separate blog.

I know that generic extractors have been explained in some detail in the forums and in the SAP Help documentation. I am just trying to add to the body of knowledge for anyone who is looking at simplifying their ETL by using generic extractors better and building the complex logic into the extractors, as opposed to bringing the data into BW first and then imposing the lookups and logic within SAP BW.
