
Datasources

Datasource: A Datasource is not only a structure in which source-system fields are logically grouped together, but also an object that contains the related ETL information.

There are two types of Datasources:

  1. Application-specific: These are SAP-delivered datasources. They are further divided into:
  • BI Content Datasources
  • Customer-generated Datasources
  2. Cross-application: User-created Datasources
  • Generic Datasources

/wp-content/uploads/2013/06/1_1_236570.png

Business Content: It is a complete set of BW objects developed by SAP to support OLAP tasks. It contains pre-defined roles, workbooks, queries, InfoCubes, key figures, characteristics, update rules, InfoSources, transformations, DTPs etc.

SAP delivers Business Content in the 'D' version. In order to use it, we need to activate it, after which the version changes to 'A'. Changes are saved in the 'M' version.

Version   | Meaning           | Description
D version | Delivered version | SAP delivers Business Content in the new 'D' version
M version | Modified version  | Changes are saved in the 'M' version
A version | Active version    | You need to activate Business Content before working with it
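The D/M/A lifecycle above can be pictured as a small state machine. The sketch below is purely conceptual (it is not an SAP API); the transition names are illustrative:

```python
# Conceptual sketch of the Business Content version lifecycle (not SAP code).
TRANSITIONS = {
    ("D", "activate"): "A",  # delivered content must be activated before use
    ("A", "modify"):   "M",  # changes to an active object are saved as 'M'
    ("M", "activate"): "A",  # re-activating the modified version makes it active
}

def next_version(current, action):
    """Return the new version after applying an action, or raise if invalid."""
    try:
        return TRANSITIONS[(current, action)]
    except KeyError:
        raise ValueError(f"cannot {action!r} an object in version {current!r}")

version = "D"
for action in ("activate", "modify", "activate"):
    version = next_version(version, action)
print(version)  # A
```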

Installing Business Content:

We install Business Content:

  • After a content release upgrade
  • After installing a content support package

  1. Go to RSA1 –> in Modeling, select the BI Content tab.

/wp-content/uploads/2013/06/1_236571.png

In the BI content tab, you can see that the window is divided into three parts.

  • On the left-hand side of the window, you determine how the objects are displayed in the middle window.
  • In the middle of the window, you select the objects that you want to activate.
  • On the right-hand side of the window, you simulate and install the Business Content.

Grouping:

  1. Only necessary objects: if you select this option, only the necessary objects are installed.
  2. In data flow before: if you select this option, the system automatically adds the objects that lie before the selected object in the dataflow.
  3. In data flow afterwards: if you select this option, the system automatically adds the objects that lie after the selected object in the dataflow.
  4. In data flow before and afterwards: for the given object, the system adds the objects that lie both before and after it in the dataflow. E.g. if you select the cube 0IC_C03, the system adds all the objects below the cube, such as Datasources, transformations and DTPs, as well as the objects that come afterwards, such as queries and workbooks.
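The four grouping modes can be thought of as traversals over the dataflow graph. Below is a hedged sketch in Python; the object names (2LIS_03_BF, TRFN_1, QUERY_1, ...) are illustrative placeholders, not an actual installed dataflow:

```python
# Conceptual sketch, not SAP code: grouping modes as graph traversals over a
# simplified dataflow. Edges point in the direction the data flows.
from collections import deque

FLOW = {
    "2LIS_03_BF": ["TRFN_1"],           # datasource -> transformation
    "TRFN_1":     ["0IC_C03"],          # transformation -> cube
    "0IC_C03":    ["QUERY_1", "QUERY_2"],  # cube -> queries
}

def _reach(start, edges):
    """All objects reachable from start following the given edge map."""
    seen, todo = set(), deque([start])
    while todo:
        node = todo.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return seen

def collect(obj, grouping):
    reverse = {}
    for src, targets in FLOW.items():
        for t in targets:
            reverse.setdefault(t, []).append(src)
    before = _reach(obj, reverse)   # objects the data flows FROM (below)
    after = _reach(obj, FLOW)       # objects the data flows TO (afterwards)
    return {
        "only_necessary": {obj},
        "before": {obj} | before,
        "afterwards": {obj} | after,
        "before_and_afterwards": {obj} | before | after,
    }[grouping]

print(sorted(collect("0IC_C03", "before_and_afterwards")))
# ['0IC_C03', '2LIS_03_BF', 'QUERY_1', 'QUERY_2', 'TRFN_1']
```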

Collection mode:

  • Collect automatically (default setting): objects are collected automatically.
  • Start manual collection: objects are collected only when you click on 'Gather dependent objects'.

After collection, check the objects that you want to install.

/wp-content/uploads/2013/06/2_236572.png

Install:

While installing, check the 'Install' boxes available for the objects.

Install all below: the object and its dependent objects are checked for installation.

Do not install all below: the checkboxes are removed when you select this option.

Match (X) or copy:

When the object is already available in the active version, you have to decide whether to keep the active version or to install the latest SAP-delivered version of the object.

If you check 'Match', the customer version of the object is merged with the new SAP version and a new customer version is created. If you do not check 'Match', the active customer version is overwritten by the SAP-delivered version.
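The difference between matching and copying can be sketched with plain Python dicts. This is only a conceptual model of the merge-versus-overwrite behaviour; the object fields and names below are invented for illustration:

```python
# Conceptual sketch only: "Match (X)" merges, "Copy" overwrites. Not an SAP API.
customer_version = {"infoobjects": {"0MATERIAL", "ZCUSTFLD"}, "description": "our cube"}
new_sap_version  = {"infoobjects": {"0MATERIAL", "0PLANT"},   "description": "SAP cube"}

def install(customer, delivered, match):
    if match:
        # Match: keep customer additions AND pick up the new SAP parts,
        # producing a new customer version.
        return {
            "infoobjects": customer["infoobjects"] | delivered["infoobjects"],
            "description": customer["description"],
        }
    # Copy: the delivered version simply replaces the active version.
    return dict(delivered)

merged = install(customer_version, new_sap_version, match=True)
print(sorted(merged["infoobjects"]))  # ['0MATERIAL', '0PLANT', 'ZCUSTFLD']
```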

There are four options in Install:

  1. Simulate installation: the system runs a test to check whether there are any errors.
  2. Install: installation happens in the foreground.
  3. Install in background: installation happens in the background; you can monitor the job in SM37.
  4. Install and transport: all the objects are installed and then written to a transport request (TR) automatically.

FI Datasources:

Finance is the backbone of any organisation. For that reason, SAP provides the FI module, which records, collects and processes financial transactions and information in real time to fulfil reporting requirements. It is tightly integrated with other modules such as MM, SD, PP, HR and CO.

The sub-modules of FI are:

  1. FI-AA (Asset Accounting): deals with the financial side of fixed assets, such as depreciation and insurance, from procurement of the asset through to scrapping or sale.
  2. FI-AP (Accounts Payable): deals with vendor transactions, with a payment program for making payments to vendors.
  3. FI-AR (Accounts Receivable): deals with customers and receivables, with credit management functionality and the 'dunning' program.
  4. FI-GL (General Ledger): receives simultaneous postings from FI-AA, FI-AR and FI-AP. All accounting postings are recorded here to provide up-to-date visibility of the financial accounts.
  5. FI-SL (Special Purpose Ledger): provides summary information from multiple applications at a level of detail that the user defines.
  6. FI-LC (Legal Consolidation): deals with the financial operating results of the companies within a group to provide overall results for the group.
  7. FI-BL (Bank Accounting): deals with bank master data and the processing of incoming and outgoing payments.
  8. FI-FM (Funds Management): deals with budgets for revenues and expenditures and monitors the use of funds.
  9. FI-TM (Travel Management): deals with the planning, approval and settlement of business trips and travel expenses.

We can extract financial data at totals level or at line-item level. In general, we use the R/3 line-item tables as the datasource for extraction, to allow drill-down from summarized data to line-item data.

Example of Installation:

We are going to install the cube 0FIAA_C01 in the Business content.

Steps:

1. To check the dataflow for the cube 0FIAA_C01, go to the Metadata Repository and click Business Content. Select 'InfoCube' and search for the InfoCube 0FIAA_C01.

FI.png

2. Double-click the InfoCube, then double-click the option 'Network display of data flow'; here you can see the whole dataflow for this InfoCube.

FI2.1.png

3. Go to RSA5 on the ECC side and activate the three datasources shown in the figure above. Once activated, those datasources are available in RSA6.

4. Now go to the BI Content tab, select the cube 0FIAA_C01, group the objects according to your requirement, and install.

FI3.png

5. After the Business Content installation is completed, replicate the datasources. Once you have done that, go to the InfoCube 0FIAA_C01, right-click it and select the 'Show data flow' option. Below is the dataflow for the cube.

FI5.png

6. Now that the dataflow is completely installed, create InfoPackages for the datasources and load the data.

FI4.png

7. Once the data is loaded successfully, you can click 'Manage' on the InfoCube and monitor the requests. Similarly, you can go to LISTCUBE and check the data in the cube.

FI6.png

Logistics Datasources:

LO extraction gives better performance with a reduced volume of data, and provides a single solution for all logistics applications. LO extraction is a detailed extraction, updated with a batch process (V3 update). Only data that has changed from the BW perspective is added to the delta queue.

LO data extraction: customizing cockpit:

LO1.png

The document flow for LO data extraction is shown below:

  • Whenever a document is created, it is stored in the respective database table via the communication structure.
  • The structure of the datasource is called the extract structure.
  • The data from the DB table is passed on to the datasource through the extract structure.

LO2.png

Datasource naming convention:

SAP standard datasources follow a specific naming convention. For example: 2LIS_11_VAHDR.

In this datasource,

2 indicates logistics; all datasources belonging to logistics start with 2.

LIS stands for Logistics Information Structures.

11 is the application number.

VA further specifies that the datasource relates to sales orders.

HDR means header data; it specifies the type of data:

HDR – Header data

ITM – Item data

SCL – Schedule line

Application | Description
02          | Purchasing
03          | Inventory
04          | Shop floor
05          | Quality Management
06          | Invoice verification
11          | Sales order
12          | Shipping
13          | Billing
17          | Plant Maintenance
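The naming convention and the application table above can be combined into a tiny decoder. This is a hedged sketch that mirrors the convention only; it is not an SAP utility, and the regex assumes the common 2LIS_nn_XXYYY shape:

```python
# A sketch decoding an LO datasource name such as 2LIS_11_VAHDR into its parts.
import re

APPLICATIONS = {
    "02": "Purchasing", "03": "Inventory", "04": "Shop floor",
    "05": "Quality Management", "06": "Invoice verification",
    "11": "Sales order", "12": "Shipping", "13": "Billing",
    "17": "Plant Maintenance",
}
DATA_TYPES = {"HDR": "Header data", "ITM": "Item data", "SCL": "Schedule line"}

def parse_lo_datasource(name):
    # 2 = logistics, LIS = Logistics Information Structures,
    # two digits = application number, two letters = document (e.g. VA),
    # three letters = data type (HDR / ITM / SCL).
    m = re.fullmatch(r"2LIS_(\d{2})_([A-Z]{2})([A-Z]{3})", name)
    if not m:
        raise ValueError(f"{name!r} is not a standard 2LIS_* datasource name")
    app, event, dtype = m.groups()
    return {
        "application": APPLICATIONS.get(app, f"application {app}"),
        "document": event,
        "data type": DATA_TYPES.get(dtype, dtype),
    }

print(parse_lo_datasource("2LIS_11_VAHDR"))
# {'application': 'Sales order', 'document': 'VA', 'data type': 'Header data'}
```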

LO3.png

Different types of Updates are available:

  • V1 update: called the synchronous update. Whenever a document is created or posted, it is written to the DB table and the extraction/update tables at the same time, in the same transaction.
  • V2 update: called the asynchronous update. Whenever a document is posted, the data is stored in the DB tables first, and then in a separate step in the update/extraction tables.
  • V3 update: an asynchronous update with background scheduling. When a document is posted, the data reaches the BW delta queue via the update tables. You can schedule a background job for the V3 update.

LBWQ is the transaction for the qRFC monitor (extraction queue).

RSA7 is the transaction for the BW delta queue.

SM13 or SMQ1 are the transactions to see the update tables or extraction queues.

There are three types of delta methods in LO data extraction:

  • Direct delta: data is transferred to the BW delta queue directly with each document posting.
  • Queued delta: data is first collected in the extraction queue (qRFC monitor) and then moved to the BW delta queue by a collective run. Good for higher volumes of data and frequent updates.
  • Non-serialized V3 update: data reaches the BW delta queue via the update tables, but the sequence of documents is not guaranteed.
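The three delta methods can be sketched as different hops through staging queues. The model below is purely illustrative; the queue attributes are named after the transactions mentioned above (LBWQ, SM13, RSA7), but the code does not reflect actual SAP internals:

```python
# Conceptual sketch of the three LO delta methods as queue hops (not SAP code).
from collections import deque

class DeltaPipeline:
    def __init__(self, method):
        self.method = method
        self.extraction_queue = deque()  # LBWQ (used by queued delta)
        self.update_tables = deque()     # SM13 (used by non-serialized V3)
        self.delta_queue = deque()       # RSA7 (read by BW)

    def post_document(self, doc):
        if self.method == "direct":
            self.delta_queue.append(doc)       # straight to the delta queue
        elif self.method == "queued":
            self.extraction_queue.append(doc)  # parked in the extraction queue
        elif self.method == "unserialized_v3":
            self.update_tables.append(doc)     # parked in the update tables

    def collective_run(self):
        """The scheduled background job that drains the staging queues."""
        while self.extraction_queue:
            self.delta_queue.append(self.extraction_queue.popleft())
        while self.update_tables:
            # No order guarantee for unserialized V3; this pop order is just
            # one possible sequence.
            self.delta_queue.append(self.update_tables.popleft())

p = DeltaPipeline("queued")
p.post_document("sales order 4711")
print(len(p.delta_queue))  # 0 - nothing in the delta queue yet
p.collective_run()
print(len(p.delta_queue))  # 1 - moved by the collective run
```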

CO-PA:

Controlling and Profitability Analysis (CO-PA) is mainly used for sales and contribution-margin reporting. It helps you determine how profitable your market segments are. With CO-PA, you define which segments of the market, such as customer, product, geography or sales organization, are required for analyzing operating results/profits. CO-PA mainly focuses on the profitability of external market segments.

There are two types of reports in CO-PA

  1. Costing-based: sales orders and billing documents are valuated to determine the deductions and costs.
  2. Account-based: using cost and revenue elements, it provides a profitability report that is reconciled with FI.

The transaction code for creating CO-PA datasource is KEB0.

CO-PA.png

Here, we will create a costing-based CO-PA datasource as shown in the figure above.

To do that, first go to transaction code KEB0 and enter the datasource name; it should start with 1_. Here our datasource name is 1_CO_PA_Profit.

CO-PA1.png

Give the operating concern, as shown in figure, and execute.

CO-PA2.png

Now, give the descriptions and the field name for partitioning as shown above, and click the InfoCatalog symbol. You will be asked for the developer key; enter the key and continue.

CO-PA3.png

Now you need to select the selection fields. Once selected, these fields are available as selection parameters in RSA3 and in the InfoPackage.

CO-PA4.png

  • Selection: if you check the Selection checkbox, the field can be used for selection.
  • Hide field: if you check Hide field, the field is hidden in the BI system.
  • Inversion: it is used for data inversion.
  • Field only: when you enhance the datasource, this option is available for the added 'Z' fields. When you select this option for a field, that field cannot be used for selection.

Now save the datasource. The CO-PA datasource is created on the ECC side; you can go to RSA6 and check it.

In RSA3, you can check the number of records this datasource contains, as shown in the figure below.

CO-PA5.png

As you can see from the figure, this datasource contains 10,264 records. You can extract these records by creating dataflow in BW.

Steps in BW:

  1. Replicate the datasource: right-click the CO-PA application component and choose Replicate.
  2. Take the field names and search for the appropriate InfoObjects in RSOSFIELDMAP (database table).
  3. Create a DSO; create a transformation and DTP with the DSO as target.
  4. Create a cube; create a transformation and DTP with the cube as target.
  5. Load the data first to the DSO, then to the cube.

Note:

This document may contain definitions and architecture descriptions from the book:

Data Warehousing with SAP BW 7, by Christine Mehrwald and Sabine Morlock.

Please visit my website for my blogs on SAP BI.

http://www.shaikshahidimam.com/
