
Summary

Some DataSources (both BI Content and custom) do not support delta updates. Usually the change-log table can be used to generate an additive delta within SAP BW, but this does not work for all DataSources. Several scenarios are conceivable:

1. BI Content Data Sources

Some BI Content DataSources, such as 0CO_OM_CCA_30, do not support a delta method even if a standard DSO is used.

2. Custom Data Sources

Sometimes the source system delivers only the result of an operation (delete, change, create) but not the change that led to that result. This is especially problematic when the source system deletes data and SAP BW receives no delta information about the deletion.

3. Other scenarios, such as the use of complex routines, are conceivable as well.

A standard solution is needed to create a truly clean delta for reporting.

Solution

The customer wants to create reporting based on the DataSource 0CO_OM_CCA_30 (the approach in this paper also works for the BI Content DataSources 0CO_OM_CCA_20, 0EC_PCA_4, etc.). According to SAP Help, the data has to be loaded in full mode every time; no delta generation is possible using a DataStore Object between the InfoCube and the DataSource.

This how-to paper, however, describes a solution that overcomes this limitation with the technical architecture illustrated below:

/wp-content/uploads/2012/12/architecture_169136.jpg

Step 1: Create DSO Delta


This DSO will be used to generate the delta. As a first step, create a simple 1:1 transformation from the DataSource 0CO_OM_CCA_30.

dso design.jpg

In addition, please add the following three InfoObjects:

– 0TCTTIMSTMP

– 0DM_UDATE

– 0DM_UTIME

Step 2: Create DSO STD


This DataStore Object will later store a clean additive delta. Create it with the same structure as the DSO Delta, but without the InfoObjects 0TCTTIMSTMP, 0DM_UDATE and 0DM_UTIME. The transformation between the two DSOs is 1:1 and the aggregation behaviour is summation.
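Summation means that an incoming record's key figures are added onto the values already stored under the same key. A minimal Python sketch of this aggregation behaviour (an illustration only, not ABAP or the actual DSO activation logic):

```python
def load_with_summation(target, package):
    """Simulate the 'summation' aggregation of a DSO key figure:
    incoming values are added onto the stored value per key."""
    for key, value in package:
        target[key] = target.get(key, 0) + value
    return target

dso_std = {}
load_with_summation(dso_std, [("12345", 100)])   # initial load
load_with_summation(dso_std, [("12345", -100)])  # reversal posted later
# key "12345" now nets out to 0
```

This is exactly why the realignment built in the later steps works: a negative change-log record simply cancels the previously loaded value.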

Step 3: Create transparent table ZBW_TIMES


This table is used to store the global timestamp per DataStore Object and data load.

timestamp table.jpg

Step 4: Create ABAP-Program to generate and store Timestamp


Create an ABAP program with the code below and a parameter DSONAME. It generates a timestamp for the DataStore Object DSONAME and stores it in the table ZBW_TIMES.

DATA: lv_timestamp TYPE timestampl,
      ls_zbw_times TYPE zbw_times,
      date         TYPE umc_y_ichaval,
      date_calc    TYPE sydatum,
      from         TYPE umc_y_fiscper,
      to           TYPE umc_y_fiscper.

**********************************************************************
* Check whether a row for this DSO already exists
SELECT SINGLE * FROM zbw_times INTO ls_zbw_times
  WHERE dso = dsoname.

**********************************************************************
* Delete the existing entry
IF sy-subrc EQ 0.
  DELETE FROM zbw_times WHERE dso EQ dsoname.
ENDIF.

**********************************************************************
* Get the current timestamp
GET TIME STAMP FIELD lv_timestamp.

**********************************************************************
* Update the database table
ls_zbw_times-dso       = dsoname.
ls_zbw_times-timestamp = lv_timestamp.

MODIFY zbw_times FROM ls_zbw_times.

Step 5: Read Timestamp within the Transformation Data Source -> DSO Delta


Now a routine must be implemented that reads the timestamp and saves it in the DataStore Object DSO Delta. In this example an end routine is used, but this is up to you.

It is very important to use a global timestamp that is stored in the database. Otherwise there is the danger that one timestamp per data package is created, even if the timestamp is generated in the global part of a start/end routine.
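The danger can be illustrated with a small Python simulation (hypothetical names: a counter stands in for GET TIME STAMP, a dict stands in for ZBW_TIMES):

```python
import itertools

# A fake clock that advances on every call, standing in for GET TIME STAMP.
_clock = itertools.count(20121225120000)

def get_timestamp():
    return next(_clock)

# Wrong: every data package generates its own timestamp.
per_package = [get_timestamp() for _package in range(3)]

# Right: the timestamp is generated once, stored globally, and every
# data package reads the same stored value.
zbw_times = {"DSO Delta": get_timestamp()}
global_ts = [zbw_times["DSO Delta"] for _package in range(3)]

# per_package contains three different values; global_ts contains one.
```

With per-package timestamps, records of one and the same load would later fall on different sides of the realignment filter.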

DATA: lv_timestamp TYPE timestampl,
      lv_dm_udate  TYPE d,
      lv_dm_utime  TYPE t,
      ls_zbw_times TYPE zbw_times.

SELECT SINGLE * FROM zbw_times INTO ls_zbw_times
  WHERE dso = 'Delta DSO'.

IF sy-subrc EQ 0.

  CONVERT TIME STAMP ls_zbw_times-timestamp TIME ZONE sy-zonlo
    INTO DATE lv_dm_udate TIME lv_dm_utime.

ENDIF.

DATA: date  TYPE /bi0/oidate,
      month TYPE /bi0/oicalmonth.

LOOP AT result_package ASSIGNING <result_fields>.

*   Set the upload date and time on each record
  <result_fields>-tcttimstmp = ls_zbw_times-timestamp.
  <result_fields>-dm_udate   = lv_dm_udate.
  <result_fields>-dm_utime   = lv_dm_utime.

ENDLOOP.


Step 6: Create Realignment – Transformation

Please create a transformation for the realignment (source and target: DSO Delta) and set a constant of zero for all key figures. The aggregation behaviour must be overwrite.

When this transformation is executed, the data in the data target is effectively deleted: all key figures of the selected records are overwritten with zero.
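In a Python sketch (illustrative only), the transformation rule looks like this: key and timestamp pass through 1:1, while every key figure is replaced by the constant zero. With the aggregation behaviour "overwrite", the DSO change log then records the jump from the old value to zero, i.e. a negative delta:

```python
def realignment_rule(record):
    """Pass key and timestamp through 1:1, overwrite the key figure with 0."""
    key, timestamp, _key_figure = record
    return (key, timestamp, 0)

def change_log_delta(old_value, new_value):
    """Delta produced by 'overwrite' aggregation: new value minus old value."""
    return new_value - old_value

record = ("12345", 20121225, 100)
realigned = realignment_rule(record)         # key figure becomes 0
delta = change_log_delta(100, realigned[2])  # change log records -100
```

That negative change-log record is what later cancels the value in the DSO Std via summation.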

Step 7: Create Realignment – Data Transfer Package


The DTP should load only the old data and 'delete' it. For this reason, all records older than the maximum timestamp in the DSO Delta are selected. This is done with a filter routine:

DATA: lv_timestamp     TYPE /bi0/oitcttimstmp,
      low_lv_timestamp TYPE /bi0/oitcttimstmp.

SELECT MAX( tcttimstmp ) FROM <Delta DSO>    "active table of the Delta DSO
  INTO lv_timestamp.

READ TABLE l_t_range WITH KEY
  fieldname = 'TCTTIMSTMP'.

l_idx = sy-tabix.

* Exclude the current load: select only records older than the maximum
lv_timestamp = lv_timestamp - 1.

l_s_range-iobjnm    = '/BI0/OITCTTIMSTMP'.
l_s_range-fieldname = 'TCTTIMSTMP'.
l_s_range-sign      = 'I'.
l_s_range-option    = 'BT'.
l_s_range-low       = low_lv_timestamp.
l_s_range-high      = lv_timestamp.

IF l_idx <> 0.
  MODIFY l_t_range FROM l_s_range INDEX l_idx.
ELSE.
  APPEND l_s_range TO l_t_range.
ENDIF.
p_subrc = 0.
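The effect of the routine can be sketched in Python (illustrative, with simplified integer timestamps): the filter keeps everything in the range [initial low value, max - 1], i.e. all records strictly older than the newest load:

```python
def realignment_filter(records, low=0):
    """Keep records whose timestamp lies in [low, max - 1], i.e. all
    records strictly older than the maximum timestamp in the DSO."""
    if not records:
        return []
    max_ts = max(ts for ts, _key in records)
    return [(ts, key) for ts, key in records if low <= ts <= max_ts - 1]

records = [(20121225, "12345"), (20121226, "23465")]
old_records = realignment_filter(records)
# only the 2012-12-25 record is selected for 'deletion'
```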

Step 8: Create Process chain


The process chain must now combine all elements we developed earlier.

  1. Run the ABAP program from Step 4 with a variant for your DSO (here: DSO Delta) to set the global timestamp
  2. Load data into DSO Delta from the DataSource
  3. Activate the data in DSO Delta
  4. Load the realignment with the timestamp selection
  5. Activate the data in DSO Delta
  6. Load data from DSO Delta into DSO Std.

When the chain is executed, it generates a clean additive delta in the DSO Std. It is important that the DSO Std does not carry the timestamp in its data fields: since the timestamp changes with every data load, it would produce a change log in which every record counts as changed, and every load from such a DSO would then extract all data.

Fictional Example


The data record with key 12345 is deleted in the source system without any delta information being delivered.

Data Load 1: 25-12-2012

DSO Delta (before and after realignment)

Key   | Timestamp | Key Figure
------|-----------|-----------
12345 | 20121225… | 100

DSO Standard

Key   | Key Figure
------|-----------
12345 | 100

Data Load 2: 26-12-2012 (key 12345 deleted, key 23465 created)


DSO Delta (before realignment)

Key   | Timestamp | Key Figure
------|-----------|-----------
12345 | 20121225… | 100
23465 | 20121226… | 333

DSO Delta (after realignment)

All data records whose timestamp is older than the maximum are set to zero.

Key   | Timestamp | Key Figure
------|-----------|-----------
12345 | 20121225… | 0
23465 | 20121226… | 333

DSO Standard

Key   | Key Figure
------|-----------
12345 | 0
23465 | 333
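The fictional example can be replayed end to end with a short Python simulation (hypothetical and simplified: integer timestamps, a dict per DSO, and the change log derived as new minus old value per key, which mimics the overwrite DSO plus summation load):

```python
def run_chain(dso_delta, dso_std, source_package, ts):
    """Replay one run of the Step 8 process chain (simplified):
    load into DSO Delta with a global timestamp, realign old records
    to zero, then push the change log additively into DSO Std."""
    # Snapshot of DSO Delta before the load, to derive the change log.
    before = dict(dso_delta)

    # 2./3. Load the full extract into DSO Delta (overwrite per key).
    for key, value in source_package:
        dso_delta[key] = (ts, value)

    # 4./5. Realignment: records older than the newest timestamp
    # get their key figure overwritten with zero.
    for key, (rec_ts, _value) in dso_delta.items():
        if rec_ts < ts:
            dso_delta[key] = (rec_ts, 0)

    # 6. Load from DSO Delta into DSO Std: the change log is additive,
    # so only the difference per key is applied (summation).
    for key, (_rec_ts, value) in dso_delta.items():
        old_value = before.get(key, (None, 0))[1]
        dso_std[key] = dso_std.get(key, 0) + (value - old_value)

dso_delta, dso_std = {}, {}
run_chain(dso_delta, dso_std, [("12345", 100)], ts=20121225)  # Data Load 1
run_chain(dso_delta, dso_std, [("23465", 333)], ts=20121226)  # Data Load 2
# DSO Std now shows 12345 -> 0 and 23465 -> 333, matching the tables above
```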

Result


The solution described in this document is a generic approach to achieving a delta for many applications.

For such full-load DataSources it makes sense to enhance scenarios like the one above with dynamic time-period selections in the InfoPackage of the DataSource, to restrict the amount of data transferred into SAP BW.

In my projects a good approach was to load only the last three periods every day, since older periods can no longer be changed. If you restrict the data by filters, remember the DTP of the realignment: it must have the same filter as the InfoPackage.
