What do you feel after having a glance at the snapshot above?

Wait a minute, I know what you guys are thinking...

A few of us will say immediately, without a second thought, that it is a graphical representation of the various extractors involved in data acquisition.

Great! We got the answer.

But does the image still dwell on your mind? Does it motivate you to know more about how the various extractors behave, or how the extraction happens?

This blog was created to answer exactly that.

 

Note: Please refer to "One stage stop to know all about BW Extractors-Part2" to know how customer-generated extractors behave.

 

An extractor, in simple terms, is used for extracting data from various source systems into BW.

For this purpose we have SAP pre-defined extractors (LO extraction, etc.) and customized extractors (generic extractors).

  

Application-specific BW content extractors:

LO Extraction:

Logistics refers to the process of getting a product or service to its desired location upon request, which involves transportation, purchasing, warehousing, etc.

Main areas in logistics are:

Sales and Distribution (SD): applications 11, 13, 08 (in the LBWE T-code)

Materials Management (MM): applications 03, 02

Logistics Execution (LE): application 12

Quality Management (QM): application 05

Plant Maintenance (PM): applications 04, 17

Customer Service (CS): application 18

Project System (PS): application 20

SAP Retail: applications 40, 43, 44, 45

 

 

How does the data extraction happen?

Extraction can be done using either a full update or a delta update.

 

Full load: In the case of logistics applications, a full load/initialization extracts the data from the setup tables (which contain only historical data).

So if you have decided to go for a full load, wait a minute, there is a roadblock.

For a full update the data is taken from the setup tables, so in order to capture changes you would need to refill the setup tables every time, which is a laborious task.

So it is always advisable to go for delta loads, which make loading life easier.

Read the note below to get the details on delta loads:

 

Initialization: Data is fetched from the application tables into the setup tables (in LO extraction, the extractor does not allow direct communication with the application tables), and from there the data finally reaches the target (InfoCube/ODS). Remember, this is a one-time process.

Prerequisites: Prior to initialization, make sure the following steps are completed:

  1. Maintain extract structure

  2. Maintain DataSources

  3. Activate extract structure

  4. Delete setup tables

  5. Fill setup tables

 

Delta load: After a successful initialization, we can use the delta update to capture changed/new records.

Once a new transaction happens or an existing record is modified, it goes to the respective application table upon saving. From there, depending on the update mode (see "LOGISTIC COCKPIT DELTA MECHANISM - Episode three: the new update methods"), the data is populated into the delta queue (RSA7) and finally reaches BW.
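To picture how a change travels from the application table to BW, here is a minimal Python sketch of the queued-delta flow. It is my own simplification with made-up names and values, not SAP code:

  application_table = {}   # stands in for an application table such as VBAK
  extraction_queue = []    # stands in for the extraction queue (LBWQ)
  delta_queue = []         # stands in for the BW delta queue (RSA7)

  def save_document(doc_no, fields):
      """Posting/changing a document updates the application table and
      records the change for BW in the extraction queue."""
      application_table[doc_no] = fields
      extraction_queue.append({"doc_no": doc_no, **fields})

  def collective_run():
      """Periodic V3/collective update job: moves queued records into the delta queue."""
      while extraction_queue:
          delta_queue.append(extraction_queue.pop(0))

  def bw_delta_request():
      """Delta InfoPackage: reads and empties the delta queue."""
      batch = list(delta_queue)
      delta_queue.clear()
      return batch

  save_document("0000004711", {"material": "M-01", "qty": 10})
  collective_run()
  print(bw_delta_request())   # the one delta record created above reaches BW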

 

 

 

Prerequisites: Prior to delta loads, make sure the following steps are completed:

  1. Define periodic V3 update jobs
  2. Set up the update mode (direct / queued / unserialized V3 update)

 

LO Delta Mode:

The InfoObject 0RECORDMODE helps in identifying the delta.

Check the field "Delta" in the ROOSOURCE/RODELTAM tables.

In the case of LO extraction it is "ABR".

ABR: An after image shows the status after the change, a before image shows the status before the change with a negative sign, and the reverse image also shows a negative sign next to the record while flagging it for deletion. This serializes the delta packets. This process supports an update into an ODS object as well as into an InfoCube.
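As a toy illustration of ABR in plain Python (the numbers and field names are assumed example values, not anything SAP delivers): an order quantity changes from 10 to 15, so the extractor delivers a before image and an after image, and an additive target such as an InfoCube nets out to the correct +5.

  # Before image: old values negated (0RECORDMODE = 'X')
  before_image = {"doc_no": "4711", "recordmode": "X", "qty": -10}
  # After image: new values (0RECORDMODE = ' ')
  after_image  = {"doc_no": "4711", "recordmode": " ", "qty": 15}

  net_change = before_image["qty"] + after_image["qty"]
  print(net_change)   # 5 -> the change the InfoCube key figure receives

  # A deletion would arrive as a reverse image (0RECORDMODE = 'R') carrying
  # negated key figures, which removes the document's contribution again.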

  

FI extraction:

The FI module deals with the accounting and financial needs of an organization.

Financial Accounting is broken down into the following sub-modules:

 

  • Accounts Receivables
  • Accounts Payable
  • Asset Accounting
  • Bank Accounting
  • Consolidation
  • Funds Management
  • General Ledger
  • Special Purpose Ledger
  • Travel Management

Note: Only the key areas (AP/AR/GL/SL) are discussed briefly, because of the complexity of the area.

 

We can extract the financial data at the totals level or at the line-item level.

In general, we use the R/3 line-item tables as the DataSource, to allow drill-down from summarized data to line-item details.

Financial Accounting data can be extracted directly from the tables.

Depending on the business requirement, we can use either FI-SL or the standard BW content extractors (FI-AR, FI-AP, and FI-GL) to fetch FI data.

 

Note: FI-SL will be discussed in "One stage stop to know all about BW Extractors-Part2", which explains application-specific customer-generated extractors.

 

FI-AR, FI-AP, and FI-GL:

General Ledger: All accounting postings are recorded in the General Ledger. These postings happen in real time to provide up-to-date visibility of the financial accounts.

Accounts Receivable: Accounts Receivable records all account postings generated as a result of customer sales activity. These postings are automatically updated in the General Ledger.

Accounts Payable: Accounts Payable records all account postings generated as a result of vendor purchasing activity. Automatic postings are generated in the General Ledger as well.

 

Standard FI DataSources:

0FI_GL_4 (G/L Accounts - line items)
Takes the data from the FI document tables (BKPF/BSEG) that are relevant to general ledger accounting (compare table BSIS).

0FI_AP_4 (AP - line items) and 0FI_AR_4 (AR - line items)
Selections are made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable).

 

How does the data extraction happen?

In FI extraction, 0FI_AR_4 and 0FI_AP_4 are linked with 0FI_GL_4 in order to maintain a consistent data transfer from the OLTP system (this is called coupled data extraction; see OSS note 428571).

Note: "Uncoupled" extraction is possible with Plug-In PI 2002.2; see OSS note 551044.

 

0FI_GL_4 writes its entries into the time stamp table BWOM2_TIMEST in the SAP R/3 system with a new upper limit for the time stamp selection.

0FI_AP_4 and 0FI_AR_4 then copy this new upper limit for the time stamp selection during their next data extraction in the SAP R/3 system. This ensures the proper synchronization of accounts payable and accounts receivable accounting with respect to G/L accounting.
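A rough Python sketch of this coupling (the data structures and function names are my own assumptions for illustration, not the actual extractor logic):

  from datetime import date

  bwom2_timest = []   # stands in for the time stamp table, one row per extraction run

  def run_gl_extraction(new_upper_limit):
      """0FI_GL_4 establishes the upper limit of the selection interval."""
      last_upper = bwom2_timest[-1]["ts_high"] if bwom2_timest else date(1990, 1, 1)
      row = {"source": "0FI_GL_4", "ts_low": last_upper, "ts_high": new_upper_limit}
      bwom2_timest.append(row)
      return row

  def run_ap_or_ar_extraction(source):
      """0FI_AP_4 / 0FI_AR_4 reuse the upper limit most recently written by 0FI_GL_4."""
      gl_rows = [r for r in bwom2_timest if r["source"] == "0FI_GL_4"]
      row = {"source": source, "ts_high": gl_rows[-1]["ts_high"]}
      bwom2_timest.append(row)
      return row

  run_gl_extraction(date(2007, 5, 15))
  print(run_ap_or_ar_extraction("0FI_AP_4"))   # ts_high = 2007-05-15, same as G/L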

Full load: Not a valid choice, because of the large volume of detailed R/3 transaction data.

 

Delta load:

Note: Here the delta identification process works differently for new financial records and for changed financial records.

New Financial Accounting line items posted in the SAP R/3 system are identified by the extractor using the time stamp in the document header (table BKPF, field CPUDT).

By scheduling an initialization InfoPackage, all the historical data can be loaded into BW from the application tables. The system also maintains the "X" indicator in the field LAST_TS (flag: 'X' = last time stamp interval of the delta extraction), which marks the most recent interval extracted, as the example below shows.

 

OLTPSOURCE | AEDAT/AETIM        | UPD   | DATE_LOW     | DATE_HIGH    | LAST_TS
0FI_GL_4   | 16 May 2007/20:15  | Init  | 01 Jan 1990  | 15 May 2007  |
0FI_GL_4   | 24 May 2007/16:59  | Delta | 16 May 2007  | 23 May 2007  |
0FI_GL_4   | 21 June 2007/18:12 | Delta | 15 June 2007 | 20 June 2007 | X
0FI_AP_4   | 18 May 2007/21:23  | Init  | 01 Jan 1990  | 15 May 2007  |

After this, daily delta loads can be carried out based on the time stamp by scheduling delta InfoPackages.

During the delta load, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (fields TS_LOW and TS_HIGH).

In the case of changed FI documents, selections are based on the tables BWFI_AEDAT and BWOM2_TIMEST (the time stamp table); see OSS note 401646 for more details.
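A simplified Python sketch of this selection logic (the sample data and names are assumptions for illustration only): new documents are picked up via the entry date in the document header, changed documents via the change log table, both restricted to the current time stamp interval.

  from datetime import date

  bkpf = [  # document headers: number + entry date (CPUDT)
      {"belnr": "100001", "cpudt": date(2007, 5, 20)},
      {"belnr": "100002", "cpudt": date(2007, 5, 10)},
  ]
  bwfi_aedat = [  # change log: document changed on a given date
      {"belnr": "100002", "aedat": date(2007, 5, 22)},
  ]

  def fi_delta_selection(ts_low, ts_high):
      new_docs = {d["belnr"] for d in bkpf if ts_low <= d["cpudt"] <= ts_high}
      changed_docs = {c["belnr"] for c in bwfi_aedat if ts_low <= c["aedat"] <= ts_high}
      return new_docs | changed_docs

  # Interval taken from BWOM2_TIMEST (TS_LOW / TS_HIGH)
  print(fi_delta_selection(date(2007, 5, 16), date(2007, 5, 23)))   # {'100001', '100002'}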

Delta extraction using the delta queue method is also possible, in case we want:

  •  Serialization of the records
  •  To distribute delta records to multiple BW systems.

 

FI Delta Mode:

A time stamp on the line items serves to identify the status of the delta. Time stamp intervals that have already been read are then stored in a time stamp table (BWOM2_TIMEST).

(The InfoObject 0RECORDMODE plays a vital role in deciding deltas. Check the field "Delta" in the ROOSOURCE/RODELTAM tables to identify the image.)

The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method).

AIE: This delta method is not suitable for filling InfoCubes directly in the BW system. Therefore, the line items must first be loaded into an ODS object in the BW system, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.

It uses delta type E (pull), meaning the delta data records are determined during the delta update by the DataSource extractor, written to the delta queue, and passed on to BI directly from there.
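To see why the after image forces this detour through an ODS object with overwrite logic, here is a small Python sketch (invented values, not SAP code): the delta carries only the new state of the line item, so adding it straight to a cube would double-count, whereas the ODS overwrites by key and can derive the true change for downstream InfoCubes.

  ods = {"100001": {"amount": 100.0}}                  # current state, keyed by document
  after_image = {"belnr": "100001", "amount": 120.0}   # new state delivered by the extractor

  old = ods.get(after_image["belnr"], {"amount": 0.0})
  change_for_cube = after_image["amount"] - old["amount"]        # +20.0, the true delta
  ods[after_image["belnr"]] = {"amount": after_image["amount"]}  # overwrite in the ODS

  print(change_for_cube)   # 20.0 -> what an additive InfoCube should receive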



  

CRM extraction:

Customer relationship management (CRM) is broadly about managing the relationships with customers, and is useful for analyzing customer, vendor, partner, and internal process information.

 

How does the data extraction happen?

We can do both full loads and delta loads, depending on the CRM extractor's behavior.

Initialization:

During the initialization, all data that can be extracted using a DataSource is transferred from SAP CRM into SAP BW.

  • Execute the initialization of the delta process in SAP BW by creating and scheduling an InfoPackage.
  • SAP BW calls up the BW Adapter using the Service API.
  • The BW Adapter reads the data from the respective database.
  • The selected BDoc data is converted into the extract structure by a mapping module that is also registered in the BW Adapter metadata.
  • The type of Business Add-In (BAdI) that is called up by the BW Adapter depends on the BDoc type.
  • The requested data package is transferred to SAP BW using the Service API.

Delta load:

 

 

  • Any new posting or update of an old posting on the source system (CRM) side is communicated via the Middleware in the form of a BDoc.
  • The flow controller of the Middleware calls up the BW Adapter.
  • The BW Adapter first checks whether the change communicated via the BDoc is relevant for SAP BW. A change is relevant if a DataSource for the BDoc is active.
  • If the change is not relevant, it is not transferred to SAP BW and the process is complete.
  • If it is relevant, the BW Adapter calls up the corresponding mapping module and BAdI (the type of BAdI that needs to be called up depends in turn on the type of BDoc).
  • These finally convert the BDoc data into the extract structure (see the sketch below).

Note: The mapping module and the BAdIs that are called up during the delta upload are the same as those called up during the initialization of the delta process.

The change is transferred to SAP BW using the Service API.
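A rough Python sketch of this delta flow (the BDoc type, DataSource name, and functions below are assumptions for illustration, not the real BW Adapter interface):

  active_datasources = {"BUPA_MAIN": "0BPARTNER_ATTR"}   # BDoc type -> active DataSource

  def mapping_module(bdoc):
      # stand-in for the BDoc-type-specific mapping module / BAdI
      return {"partner": bdoc["data"]["partner_id"], "name": bdoc["data"]["name"]}

  def bw_adapter_on_bdoc(bdoc):
      if bdoc["type"] not in active_datasources:
          return None                      # not relevant for BW, processing ends here
      extract_record = mapping_module(bdoc)                       # convert to extract structure
      return (active_datasources[bdoc["type"]], extract_record)   # handed over via the Service API

  bdoc = {"type": "BUPA_MAIN", "data": {"partner_id": "4711", "name": "ACME"}}
  print(bw_adapter_on_bdoc(bdoc))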

CRM Delta Mode:

The delta is identified and communicated via the Middleware, in the form of a BDoc, to the BW Adapter.

CRM standard DataSources support AIMD (after images with deletion indicator, via the delta queue).

  

HR extraction:

The HR module enables customers to effectively manage information about the people in their organization, and to integrate that information with other SAP modules and external systems.

HR broadly has the following modules:

PA (Personnel Administration) and Organization Management

Personnel Development

Payroll Accounting

Time Management

Compensation

Benefits

Training and Events

The Personnel Administration (PA) sub-module helps employers track employee master data, work schedules, salary, and benefits information. Personnel Development (PD) functionality focuses on employees' skills, qualifications, and career plans. Finally, the Time Evaluation and Payroll sub-modules process attendance and absences, gross salary and tax calculations, and payments to employees and third-party vendors.

HR delivers a rich set of business content objects that covers all HR sub-functional areas.

 

How does the data extraction happen?

Before getting into how the data gets populated into the HR InfoCubes, let's understand the term infotype:

"An infotype is a collection of logical and/or business-related characteristics of an object or person."

Here the data is extracted from infotypes (PA, PD, time management, etc.), and for a few other applications from cluster tables (payroll, compensation, etc.).

HR is basically master-data centric because it always relates to people-related InfoObjects such as Employee and Person. In most cases HR master data is defined as time dependent to enable historical evaluation; the HR R/3 system records a specific validity period for each infotype.

 

Procedure to extract the HR data:

  • Activate DataSources in the source system (R/3).
  • Replicate DataSources in the BW system.
  • Activate the business content in BW.
  • Populate the HR cubes with data by scheduling InfoPackages.

Note: Master data should be loaded first.

Except for payroll and time management, all other sub-functional areas support only full loads.

In the case of full loads, the old data needs to be deleted first to avoid duplicate records in the target.
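A tiny Python sketch of that full-load pattern (illustrative only): every full load brings the complete data set again, so the previous load must be removed from the target first, otherwise records are doubled.

  target = []   # stands in for the contents of the HR InfoCube / ODS

  def full_load(extracted_rows):
      target.clear()            # delete the previous full load first
      target.extend(extracted_rows)

  full_load([{"employee": "1001", "headcount": 1}])
  full_load([{"employee": "1001", "headcount": 1},
             {"employee": "1002", "headcount": 1}])
  print(len(target))   # 2 -- without the clear() this would be 3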


To know about customer-generated extractors, please refer to "One stage stop to know all about BW Extractors-Part2".
