
Applies to:

SAP BW 3.5x / Business Intelligence 7.x, Time-Dependent Master Data, Human Resources

Summary:

This document addresses a common requirement: merging time-dependent master data attributes that come from different sources. The usual practice is to load each source's data directly into the master data object, but performance is very slow in such a case. Instead, we can merge the data before loading it into master data. Because all data is processed together before the master data load, this approach also avoids locks on the master data tables during loading, and the loads from the individual sources can run in parallel.

Author(s)     :  Ganesh Wathare

Created on  :  10 January 2014

Author Bio:

Ganesh Wathare works as a Technology Lead at Infosys Limited. He has 7+ years of experience in the IT consulting industry, during which he has worked mainly on SAP NetWeaver BW and relational databases together with various BI reporting tools, on both support and implementation projects.


1. Summary

We often need to load data from multiple sources (multiple DataSources). If the target is time-dependent master data, this causes a problem because all loads write to the same master data tables. This kind of issue occurs mainly with HR-related master data, where data arrives from several sources; for example, 0EMPLOYEE is fed by six different DataSources in Business Content. The solution is to change the data flow: merge the data first, then load it into the master data object once. The ABAP PROVIDE statement can identify the validity intervals across all sources and then pick the correct values from each source for every interval.

I have tried to keep this document generic so that it can be reused in other, similar scenarios as well.

There are limitations as well: among all the sources we must identify one source that carries values for every master data key. That source drives the records written to master data, while the validity intervals themselves are derived from all sources feeding the target.

2. How to implement

1) First, consider the normal data flow, in which time-dependent master data is loaded from six different DataSources. The flow diagram below shows one time-dependent master data object being loaded directly from six DataSources.

2) In this approach we must load from the first DataSource, then the second, and so on, because the time-interval split happens at the InfoObject level during each individual load, so the loads cannot run in parallel.

3) We can change the design slightly to make the data load faster; the new design for this flow is shown below.

4) In the new design we load data from DataSource1 into <DSO1>, and likewise for all six sources. Once data has been loaded into all six DSOs, we merge it into a consolidated DSO and then load it into the master data object. For the merge step we use an expert routine in the transformation.

In this design the consolidated DSO takes care of the time-interval split across all six DSOs and assigns the values to the respective fields.

In the code below, data comes from six DSOs named DSO1 through DSO6. The routine is written in the transformation from DSO3 to the consolidated DSO; DSO3 is therefore the DSO that contains all the records that need to be consolidated. The default time interval considered is '19000101' to '99991231'. <Key>, <Date From> and <Date To> are the keys of all underlying DSOs as well as of the consolidated DSO.
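Before looking at the ABAP, it may help to see the interval-splitting idea in isolation. The following Python sketch is purely illustrative and is not part of the routine; all names (`split_and_merge`, `next_day`, the sample sources) are invented for this example. It mimics what PROVIDE does below: cut the validity intervals of all sources at every boundary date, and for each resulting sub-interval take the attribute values of whichever sources are valid there, just as the flag-guarded IF blocks in the routine do.

```python
from datetime import date, timedelta

def next_day(d):
    """Day after d ('YYYYMMDD'); '99991231' maps to a sentinel string that
    sorts after every real date, avoiding datetime overflow."""
    if d == "99991231":
        return "99999999"
    return (date(int(d[:4]), int(d[4:6]), int(d[6:]))
            + timedelta(days=1)).strftime("%Y%m%d")

def prev_day(d):
    """Day before d ('YYYYMMDD'); the sentinel maps back to '99991231'."""
    if d == "99999999":
        return "99991231"
    return (date(int(d[:4]), int(d[4:6]), int(d[6:]))
            - timedelta(days=1)).strftime("%Y%m%d")

def split_and_merge(sources, key, lo="19000101", hi="99991231"):
    """Cut the validity intervals of all sources at every boundary date and,
    for each resulting sub-interval, merge the attributes of every source
    that is valid there (later sources override earlier ones)."""
    # Every interval start, plus the day after every interval end, is a cut.
    bounds = {lo, next_day(hi)}
    for recs in sources:
        for date_from, date_to, _ in recs.get(key, []):
            bounds.add(date_from)
            bounds.add(next_day(date_to))
    cuts = sorted(bounds)

    result = []
    for start, nxt in zip(cuts, cuts[1:]):
        end = prev_day(nxt)                       # back to inclusive bounds
        merged, valid = {}, False
        for recs in sources:                      # like the flag-guarded IFs
            for date_from, date_to, attrs in recs.get(key, []):
                if date_from <= start and end <= date_to:  # valid here
                    merged.update(attrs)
                    valid = True
        if valid:                                 # skip gaps with no source
            result.append((start, end, merged))
    return result
```

For example, a source valid for all of 1900–9999 combined with a second source valid only during 2013 yields three sub-intervals: before 2013, during 2013 (with both sources' attributes), and after 2013. The ABAP PROVIDE statement performs exactly this splitting on sorted internal tables, with the VALID flags playing the role of the `valid` check above.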

3. Source Code

Below is the code using the PROVIDE statement.

Declarations:

TABLES: /BIC/A<DSO1>00,
        /BIC/A<DSO2>00,
        /BIC/A<DSO3>00,
        /BIC/A<DSO4>00,
        /BIC/A<DSO5>00,
        /BIC/A<DSO6>00.

" Internal table and work area for <DSO1>
DATA: i_<DSO1>  TYPE SORTED TABLE OF /BIC/A<DSO1>00
                WITH UNIQUE KEY <Key> <Date From> <Date To>,
      ls_<DSO1> LIKE LINE OF i_<DSO1>.

" Internal table and work area for <DSO2>
DATA: i_<DSO2>  TYPE SORTED TABLE OF /BIC/A<DSO2>00
                WITH UNIQUE KEY <Key> <Date From> <Date To>,
      ls_<DSO2> LIKE LINE OF i_<DSO2>.

" Internal table and work area for <DSO3>
DATA: i_<DSO3>  TYPE SORTED TABLE OF /BIC/A<DSO3>00
                WITH UNIQUE KEY <Key> <Date From> <Date To>,
      ls_<DSO3> LIKE LINE OF i_<DSO3>.

" Internal table and work area for <DSO4>
DATA: i_<DSO4>  TYPE SORTED TABLE OF /BIC/A<DSO4>00
                WITH UNIQUE KEY <Key> <Date From> <Date To>,
      ls_<DSO4> LIKE LINE OF i_<DSO4>.

" Internal table and work area for <DSO5>
DATA: i_<DSO5>  TYPE SORTED TABLE OF /BIC/A<DSO5>00
                WITH UNIQUE KEY <Key> <Date From> <Date To>,
      ls_<DSO5> LIKE LINE OF i_<DSO5>.

" Internal table and work area for <DSO6>
DATA: i_<DSO6>  TYPE SORTED TABLE OF /BIC/A<DSO6>00
                WITH UNIQUE KEY <Key> <Date From> <Date To>,
      ls_<DSO6> LIKE LINE OF i_<DSO6>.

" Other declarations
DATA: flag1(1) TYPE c,
      flag2(1) TYPE c,
      flag3(1) TYPE c,
      flag4(1) TYPE c,
      flag5(1) TYPE c,
      flag6(1) TYPE c,
      counter  TYPE i VALUE 1.

Expert Routine:


DATA: i_package TYPE _ty_t_SC_1.

IF NOT SOURCE_PACKAGE[] IS INITIAL.

  i_package[] = SOURCE_PACKAGE[].

  " Select for DSO1
  SELECT * FROM /BIC/A<DSO1>00
    INTO TABLE i_<DSO1>
    FOR ALL ENTRIES IN i_package
    WHERE <Key> = i_package-<Key>.

  " Select for DSO2
  SELECT * FROM /BIC/A<DSO2>00
    INTO TABLE i_<DSO2>
    FOR ALL ENTRIES IN i_package
    WHERE <Key> = i_package-<Key>.

  " Select for DSO3
  SELECT * FROM /BIC/A<DSO3>00
    INTO TABLE i_<DSO3>
    FOR ALL ENTRIES IN i_package
    WHERE <Key> = i_package-<Key>.

  " Select for DSO4
  SELECT * FROM /BIC/A<DSO4>00
    INTO TABLE i_<DSO4>
    FOR ALL ENTRIES IN i_package
    WHERE <Key> = i_package-<Key>.

  " Select for DSO5
  SELECT * FROM /BIC/A<DSO5>00
    INTO TABLE i_<DSO5>
    FOR ALL ENTRIES IN i_package
    WHERE <Key> = i_package-<Key>.

  " Select for DSO6
  SELECT * FROM /BIC/A<DSO6>00
    INTO TABLE i_<DSO6>
    FOR ALL ENTRIES IN i_package
    WHERE <Key> = i_package-<Key>.

  LOOP AT i_package ASSIGNING <source_fields>.

    " Clear all work areas and flag variables
    CLEAR: ls_<DSO1>, ls_<DSO2>, ls_<DSO3>,
           ls_<DSO4>, ls_<DSO5>, ls_<DSO6>,
           flag1, flag2, flag3, flag4, flag5, flag6.

    " Split the validity intervals of all six DSOs for the current key
    PROVIDE FIELDS * FROM i_<DSO1> INTO ls_<DSO1> VALID flag1
                     BOUNDS <Date From> AND <Date To>
                     WHERE <Key> = <source_fields>-<Key>
            FIELDS * FROM i_<DSO2> INTO ls_<DSO2> VALID flag2
                     BOUNDS <Date From> AND <Date To>
                     WHERE <Key> = <source_fields>-<Key>
            FIELDS * FROM i_<DSO3> INTO ls_<DSO3> VALID flag3
                     BOUNDS <Date From> AND <Date To>
                     WHERE <Key> = <source_fields>-<Key>
            FIELDS * FROM i_<DSO4> INTO ls_<DSO4> VALID flag4
                     BOUNDS <Date From> AND <Date To>
                     WHERE <Key> = <source_fields>-<Key>
            FIELDS * FROM i_<DSO5> INTO ls_<DSO5> VALID flag5
                     BOUNDS <Date From> AND <Date To>
                     WHERE <Key> = <source_fields>-<Key>
            FIELDS * FROM i_<DSO6> INTO ls_<DSO6> VALID flag6
                     BOUNDS <Date From> AND <Date To>
                     WHERE <Key> = <source_fields>-<Key>
            BETWEEN '19000101' AND '99991231'.

      CLEAR RESULT_FIELDS.
*     MOVE-CORRESPONDING <source_fields> TO RESULT_FIELDS.

      RESULT_FIELDS-<Key>       = <source_fields>-<Key>.
      RESULT_FIELDS-RECORDMODE  = <source_fields>-RECORDMODE.
      RESULT_FIELDS-<Date From> = ls_<DSO3>-<Date From>.
      RESULT_FIELDS-<Date To>   = ls_<DSO3>-<Date To>.

      " Attributes from <DSO1>, if it is valid in this interval
      IF flag1 IS NOT INITIAL.
        RESULT_FIELDS-/BIC/<Object1> = ls_<DSO1>-/BIC/<Object1>.
        " ...
        RESULT_FIELDS-/BIC/<Objectn> = ls_<DSO1>-/BIC/<Objectn>.
      ENDIF.

      " Attributes from <DSO2>, if it is valid in this interval
      IF flag2 IS NOT INITIAL.
        RESULT_FIELDS-/BIC/<Object1> = ls_<DSO2>-/BIC/<Object1>.
        " ...
        RESULT_FIELDS-/BIC/<Objectn> = ls_<DSO2>-/BIC/<Objectn>.
      ENDIF.

      " Attributes from <DSO3>, if it is valid in this interval
      IF flag3 IS NOT INITIAL.
        RESULT_FIELDS-/BIC/<Object1> = ls_<DSO3>-/BIC/<Object1>.
        " ...
        RESULT_FIELDS-/BIC/<Objectn> = ls_<DSO3>-/BIC/<Objectn>.
      ENDIF.

      " Attributes from <DSO4>, if it is valid in this interval
      IF flag4 IS NOT INITIAL.
        RESULT_FIELDS-/BIC/<Object1> = ls_<DSO4>-/BIC/<Object1>.
        " ...
        RESULT_FIELDS-/BIC/<Objectn> = ls_<DSO4>-/BIC/<Objectn>.
      ENDIF.

      " Attributes from <DSO5>, if it is valid in this interval
      IF flag5 IS NOT INITIAL.
        RESULT_FIELDS-/BIC/<Object1> = ls_<DSO5>-/BIC/<Object1>.
        " ...
        RESULT_FIELDS-/BIC/<Objectn> = ls_<DSO5>-/BIC/<Objectn>.
      ENDIF.

      " Attributes from <DSO6>, if it is valid in this interval
      IF flag6 IS NOT INITIAL.
        RESULT_FIELDS-/BIC/<Object1> = ls_<DSO6>-/BIC/<Object1>.
        " ...
        RESULT_FIELDS-/BIC/<Objectn> = ls_<DSO6>-/BIC/<Objectn>.
      ENDIF.

      RESULT_FIELDS-RECORD = counter.
      counter = counter + 1.
      APPEND RESULT_FIELDS TO RESULT_PACKAGE.

    ENDPROVIDE.

  ENDLOOP.

ENDIF.
*$*$ end of routine - insert your code only before this line         *-*





4. Points to be Noted
    1. The data loads for all source DSOs and for the consolidated DSO should always be full loads preceded by a full deletion.
    2. If the keys differ between the sources, there is a chance of a data mismatch; this should be considered before applying a similar solution.
    3. The approach is best suited for HR master data such as Employee, Organizational Unit and Person.