By Arun Varadarajan

Generic Data Extraction in SAP BW Decoded – Part 1

Usage of generic extraction

In every BW installation there is always a requirement to bring custom fields into SAP BW, and the tool of choice is often the generic extractor. Here are some of the things I have found useful when creating generic extractors.

I have tried to summarize the main points, or gotchas as they are usually called. In many cases, because parameters like these are overlooked, generic extraction becomes something of a stepchild, and the ETL and transformation logic in SAP BW becomes more complicated than it needs to be.

Which type of generic extractors to use?

This is a vital question – the options available to you are the usual suspects: view, InfoSet query, or function module.

First examine the data that you want to extract and understand the nature of the data and the data target in SAP BW. Some of the questions would be:

1. Load to a DSO or a cube? (Master data can be thought of as a DSO for now.)
This is vital since it tells you the nature of the records being extracted – and future reconciliation also has to be taken into account.

2. Data volumes being extracted
If your data volumes are very high – for example, if you want to extract from the document flow table VBFA in your R/3 system – make sure that you have selection conditions on the keys; otherwise your extractor is going to bring in huge data quantities on the first load, which you will have to anticipate and prepare for.
Also, if the data volumes are high but you need only a small part of them, consider having selection conditions in the view for better performance and ease of use.

3. Can this also be obtained by enhancing an existing extractor?
This is a persistent question which always merits some attention. It is usually better to enhance an existing BI Content extractor instead of creating your own, because the coding involved is IMHO easier than writing another generic extractor. However, some Business Content extractors have notes which clearly state that they should not be enhanced – I have come across some CRM extractors with this restriction.

Delta Functionality:

It is always better to test the delta functionality. The usual suspects for delta are the created on and changed on fields, and in many cases, when a record is newly created, the changed on field is not filled – make sure that the delta logic is complete and that all changed records are pulled into BW.
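To illustrate the gotcha, here is a minimal sketch (plain Python with made-up rows and field names, not actual extractor code): a delta that filters on the changed on field alone silently misses newly created records whose changed on is still blank.

```python
from datetime import date

# Hypothetical extracted rows; changed_on stays empty (None) until the first change.
rows = [
    {"doc": "A", "created_on": date(2024, 1, 10), "changed_on": None},              # newly created
    {"doc": "B", "created_on": date(2024, 1, 2), "changed_on": date(2024, 1, 10)},  # changed
    {"doc": "C", "created_on": date(2024, 1, 2), "changed_on": None},               # old, untouched
]
last_delta = date(2024, 1, 9)  # start of the current delta window

# Incomplete delta: selects on changed_on only -- misses the new document "A".
incomplete = [r["doc"] for r in rows
              if r["changed_on"] and r["changed_on"] >= last_delta]

# Complete delta: also selects records created inside the window.
complete = [r["doc"] for r in rows
            if (r["changed_on"] and r["changed_on"] >= last_delta)
            or (r["changed_on"] is None and r["created_on"] >= last_delta)]

print(incomplete)  # ['B']
print(complete)    # ['A', 'B']
```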

Safety Limits:

The safety limits determine the amount of overlap between deltas. Take advantage of the upper and lower safety limits, especially if your extractor loads to a DSO where records arrive with the status "new". For a cube, however, if you have additive deltas, test the delta fields before using them.
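As a rough sketch of how the safety limits shift the selection window (illustrative Python with made-up margin values, not SAP's internal implementation): the lower limit re-reads the tail of the previous delta, while the upper limit holds back the newest records, which may still sit in open transactions.

```python
from datetime import datetime, timedelta

def delta_window(last_upper, now, lower_safety=300, upper_safety=60):
    """Selection window for a timestamp-based generic delta (margins in seconds).

    Re-reading the slice between low and last_upper creates overlap between
    deltas: harmless when the target is a DSO that overwrites by key, but it
    would double the key figures in a cube fed by an additive delta.
    """
    low = last_upper - timedelta(seconds=lower_safety)   # lower safety limit
    high = now - timedelta(seconds=upper_safety)         # upper safety limit
    return low, high

low, high = delta_window(datetime(2024, 1, 1, 12, 0), datetime(2024, 1, 1, 13, 0))
print(low, high)  # 2024-01-01 11:55:00  2024-01-01 12:59:00
```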

Other delta fields, such as the timestamp and the record pointer, also have to be tested thoroughly before use. Many applications work with a record pointer – the CATS table for timesheets in ESS, for example – but be sure of the application logic before proceeding.

Testing the extractor:
Testing the extractor for delta is tricky, since it is hard to use RSA3 for delta testing – the delta is specific to the target system. Provide the details of the target system before testing a delta load in RSA3.

Complex delta logic:
This is very common when you have to identify changed records. A typical example is a table with both changed on and created on fields, where the delta logic is:

<Delta field> = <changed on> if <changed on> is not blank, else <created on>

This, however, cannot be achieved using standard delta management. For any delta that uses multiple fields to determine changed records, you have to use a function module extractor. I will explain function module extractors, and an easy way of writing them, in a separate blog.

I know that generic extractors have been explained in some detail in the forums and in the SAP Help documentation. I am just trying to add to the body of knowledge for anyone looking to simplify their ETL by using generic extractors better – building the complex ETL logic into the extractors instead of bringing the data into BW first and then imposing the lookups and logic within SAP BW.
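As a small preview of the function module approach, the two-field delta rule above can be sketched as follows (illustrative Python, not ABAP; the packet handling mimics the package-wise FETCH loop of a function module extractor built from a template such as RSAX_BIW_GET_DATA_SIMPLE – row layout and names here are assumptions):

```python
from datetime import date

def extract_packets(rows, delta_from, package_size=2):
    """Apply the two-field delta rule (changed_on if filled, else created_on)
    and hand the result back in packages, mimicking the package-wise FETCH
    loop of a function module extractor."""
    packet = []
    for row in rows:
        effective = row["changed_on"] or row["created_on"]  # coalesce delta field
        if effective >= delta_from:
            packet.append(row)
            if len(packet) == package_size:
                yield packet
                packet = []
    if packet:  # final, possibly short package
        yield packet

rows = [
    {"doc": "A", "created_on": date(2024, 1, 10), "changed_on": None},
    {"doc": "B", "created_on": date(2024, 1, 2), "changed_on": date(2024, 1, 10)},
    {"doc": "C", "created_on": date(2024, 1, 2), "changed_on": None},
]
packets = list(extract_packets(rows, date(2024, 1, 9)))
print(packets)  # one package containing documents "A" and "B"
```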

      Former Member
hey Arun, thanks – this was a good overview of generic extraction
      Former Member
      Hi Arun,

      Good Blog!

Could you please suggest some good material which gives the ins and outs of LO delta extraction?

      Thanks in advance

      Arun Varadarajan
      Blog Post Author
The definitive source of information for LO extraction is the series of blogs by Roberto Negro. I think there are some problems with searching for blogs at the moment, but that is the source of information to go to for LO extraction.
      Shanthi Bhaskar
      here is the link


      Former Member

Thank you Arun Varadarajan, I am very new to SAP BW and I find your posts very helpful. Your pointer to another expert, Roberto Negro, and his work on LO extraction is a true eye-opener.

      Former Member
      looking forward to Part II.
      Arun Varadarajan
      Blog Post Author
      Generic Data Extraction in SAP BW Decoded - Part 2

All that you wanted to know about generic extraction using function modules.