COMPLETE REFERENCE FOR DSOs IN SAP BI
Contents
1 Business Scenario
2 Types of DSO
3 Motivation for DSO
4 Architecture of Standard ODS / DSO (7.x)
5 The Transition: ODS Objects (3.X) to DSO (BI 7.0)
- 5.1 Structure of ODS
6 Sample Scenario for a Standard DSO
- 6.1 Naming Conventions
- 6.2 Change Log
- 6.3 Detailed Study on Change Logs
- 6.4 Record Modes
7 Work Scenario
8 Step-by-Step Guide
- 8.1 Troubleshooting
- 8.2 Handling Summation & Delta Updates
9 Summary
1 Business Scenario
Sometimes it is desirable to combine data from different Data Sources before it is stored in the Info Cubes. Also, some analyses need access to more detailed data than that found in the cubes.
2 Types of DSO
- Standard Data Store Object (Ref. Fig. A)
- Data Store Object with Direct Update (called Transactional ODS in 3.x)
- Write-Optimized Data Store Object (BI 7.0)
3 Motivation for DSO
- Consolidation & Cleansing
- To store data on document level
- Overwrite capability of characteristics
- Reporting
4 Architecture of Standard ODS / DSO (7.x)
“ODS Objects consist of three tables as shown in the architecture” – Source: SAP Docs
- Fig. A – ODS Architecture – Extracted from SAP Docs
5 The Transition: ODS Objects (3.X) to DSO (BI 7.0)
The ODS consists of consolidated data from several Info Sources on a detailed (document) level, in order to support document-level analysis. In the DSO context, the PSA makes up the first level and the DSO tables make up the second level. The first level therefore holds the transaction data from the source system, and the second level holds the consolidated data from several source systems and Info Sources. You can run the analysis directly on the contents of the table, or jump to it from an Info Cube query by means of a drilldown.
- Fig. B. Sample schema for Reporting using ODS Objects (using Update Rules & Transfer Rules)
* Note: UR refers to Update Rules
Prior to the existence of the DSO, decisions on granularity were based solely on the data in the Info Cube. Now the Info Cube can be less granular, with data held for a longer period of time, while the DSO can be very granular but hold data for a shorter period of time. Data from the ODS can be updated into appropriate Info Cubes or other ODS objects. Reporting on the ODS can be done with the OLAP processor or directly with an ODS query.
In Fig. B, data from Data Source A and Data Source B is uploaded to a PSA. The PSA (Persistent Staging Area) corresponds to the first level described above. From the PSA we have the possibility, via transfer rules, to upload data to the DSO. The DSO is represented here as one layer, but depending on the business scenario, the BI DSO can be structured with multiple levels. Thus, ODS objects offer data that is subject-oriented, consolidated, and integrated with respect to the same process on different source systems. After data has been stored, or while it is being updated in the ODS, we have the option of making technical changes as well as data changes. In the ODS, data is stored in a denormalized data structure.
5.1 Structure of ODS
While transferring data from PSA to ODS objects, rules (Transfer Rules) can be applied to clean records and transform them to company-wide standards for characteristic values. If it is meaningful at this stage, business logic may also be applied (Update Rules).
6 Sample Scenario for a Standard DSO
Consider an example involving a Standard DSO in SAP BI 7.0.
Let’s check the flat file records: the key fields are Customer and Material, and we have a duplicate record (check record 2). The ‘Unique Data Records’ option is unchecked, which means the DSO can expect duplicate records.
Figure C explains how records are captured in a DSO (refer to the selected options below).
After the update rules, record 2 from the PSA is overwritten, as it has the same key; it is overwritten with the most recent record. The key here is [M1000 | Customer A].
If we note the monitor entries, 3 records are transferred to the update rules & two records are loaded into the activation queue table. This is because we haven’t activated the request yet & the duplicate record for the key gets overwritten in the DSO. Key figures have the overwrite option by default; additionally, we have the summation option to suit certain scenarios, and characteristics are always overwritten.
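A minimal sketch (plain Python, hypothetical field names and values) of how records sharing the same semantic key collapse before activation, with the most recent record overwriting earlier ones; key figures flagged for summation would accumulate instead:

# Illustrative sketch only: collapses incoming records that share the same
# semantic key, the way a Standard DSO treats duplicates within one request.
def collapse_by_key(records, key_fields, summation_fields=()):
    """Keep one record per semantic key: later records overwrite earlier
    ones, except key figures listed in summation_fields, which accumulate."""
    result = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key in result:
            for f in summation_fields:
                rec = dict(rec, **{f: result[key][f] + rec[f]})
        result[key] = rec  # most recent record wins for all other fields
    return list(result.values())

psa_records = [
    {"MATERIAL": "M1000", "CUSTOMER": "Customer A", "STATUS": "O", "QTY": 100},
    {"MATERIAL": "M1000", "CUSTOMER": "Customer A", "STATUS": "C", "QTY": 150},  # duplicate key
    {"MATERIAL": "M2000", "CUSTOMER": "Customer B", "STATUS": "O", "QTY": 200},
]

# With overwrite (the default), 3 PSA records yield 2 activation-queue records.
print(collapse_by_key(psa_records, key_fields=("MATERIAL", "CUSTOMER")))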
6.1 Naming Conventions
- Technical name of the New Data / Activation Queue table: for customer objects it is /BIC/A<DSO_Name>40, and for SAP objects /BI0/A<DSO_Name>40.
- Name of the Active Data table: /BIC/A<DSO_Name>00 (and /BI0/ for SAP objects).
- Name of the Change Log table: the technical name is always /BIC/B<internally generated number>.
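As a quick sanity check on these conventions, here is a tiny helper (illustrative Python only, not an SAP API) that derives the table names for a given DSO:

# Illustrative only: derives the technical table names of a Standard DSO
# following the conventions above (customer namespace /BIC/, SAP namespace /BI0/).
def dso_table_names(dso_name, customer=True):
    ns = "/BIC/" if customer else "/BI0/"
    return {
        "activation_queue": f"{ns}A{dso_name}40",
        "active_data": f"{ns}A{dso_name}00",
        # The change log name (/BIC/B<number>) is generated internally,
        # so it cannot be derived from the DSO name alone.
    }

print(dso_table_names("ZDSOTEST"))
# {'activation_queue': '/BIC/AZDSOTEST40', 'active_data': '/BIC/AZDSOTEST00'}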
Once we activate, we will have two records in the DSO’s Active Data table. The Active Data table always contains the semantic key (e.g., Customer & Material).
6.2 Change Log
The Change Log table has 2 entries with the image ‘N’ (which stands for ‘New’). The technical key (REQID, DATAPACKETID, RECORDNUMBER) is part of the change log table. (Refer Fig. D)
- Fig. D – Data is loaded to CL & ADT (please refer to Fig. A for more details)
Introducing a few changes, we get the following result as in Fig. E.
- Fig. E – Changes introduced from the flat file are reflected on PSA → ADT & PSA → CL
6.3 Detailed Study on Change Logs
We will check the change log table to see how the deltas are handled. The records from the first request are uniquely identified by the technical key (Request Number, Data Packet Number, Partition Value of PSA, and Data Record Number). With the second request, the change log table holds the before and after images for the relevant records.
- Fig. F – Study on the Change Log on how the Deltas are handled
In this example, the record for Customer and Material has a before image with record mode “X”. Also note that all key figures in the before image carry a “-” sign when the overwrite option is used, & characteristics are always overwritten.
A new record (last row in Fig. F) is added with the status “N”, as it’s a new record.
- Fig. G – Final Change Log Output
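To make the activation mechanics concrete, here is a minimal sketch (plain Python, hypothetical field names and values) of how activation compares the activation queue with the active data table and derives the change log images:

# Illustrative sketch of Standard DSO activation with overwrite.
# A new key yields a new image 'N'; a changed record yields a before image
# 'X' (key figures sign-reversed) followed by an after image '' with the
# new values, and the active data table is overwritten.
def activate(active_table, queue, key_fields, key_figures):
    change_log = []
    for rec in queue:
        key = tuple(rec[f] for f in key_fields)
        old = active_table.get(key)
        if old is None:
            change_log.append({**rec, "RECORDMODE": "N"})    # new image
        else:
            before = {**old, "RECORDMODE": "X"}              # before image
            for kf in key_figures:
                before[kf] = -old[kf]                        # reversed sign
            change_log.append(before)
            change_log.append({**rec, "RECORDMODE": ""})     # after image
        active_table[key] = rec                              # overwrite
    return change_log

active = {("M1000", "Customer A"): {"MATERIAL": "M1000", "CUSTOMER": "Customer A", "QTY": 100}}
queue = [{"MATERIAL": "M1000", "CUSTOMER": "Customer A", "QTY": 150}]
for row in activate(active, queue, ("MATERIAL", "CUSTOMER"), ("QTY",)):
    print(row)  # first the 'X' image with QTY -100, then the '' image with QTY 150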
6.4 Record modes
The record mode(s) that a particular data source uses for the delta mechanism largely depend on the type of the extractor. Refer to OSS note 399739 for more details.
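For quick reference, the commonly documented 0RECORDMODE values are summarized below (see the OSS note above for the authoritative list):

# Common 0RECORDMODE values (reference mapping; see OSS note 399739).
RECORD_MODES = {
    "":  "After image: state of the record after the change",
    "X": "Before image: prior state, key figures sign-reversed",
    "N": "New image: the record did not exist before",
    "A": "Additive image: key figures carry the delta to be added",
    "D": "Delete: the record identified by its key is deleted",
    "R": "Reverse image: cancels an existing record",
}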
7 Work Scenario
Let’s go through a sample real-time scenario. In this example we take the master data objects Customer and Material with a few attributes for demonstration purposes. Here we define an ODS / DSO as below, where Material and Customer form the key and the corresponding attributes are data fields.
- ODS / DSO definition
- Definition of the transformation
- Flat file Loading
- Monitoring the Entries
- Monitoring Activation Queue
- Monitoring PSA data for comparison
- Checking Active Data Table
- Monitoring Change Log Table
- Displaying data in a suitable Info Provider (e.g., Flat File → PSA → DSO → Info Provider)
Note: In SAP BI 7.0 the status data is written to the active data table in parallel while writing to the change log. This is an advantage of parallel processing, which can be customized globally or at the object level in the system.
8 Step-by-Step Guide
Start transaction RSA1 in SAP BI 7.0 and follow the steps as illustrated below.
- Fig. 1.1 — Creation of an Info Area in RSA1 — An Info Area is very similar to a folder for files (in this context, the ‘files’ are Info Objects & catalogs). Go to ‘Info Objects’ → right-click & choose ‘Create Info Area’.
- Fig. 1.2 – Naming the Info area: Give the short & long description for the Info Area.
- Fig. 1.3 — Creation of Info Object Catalog: Right-click on the Info area & choose “Create Info object catalog”
- Fig. 1.6 — After creating the Info object catalog, we create IOs like Char Info Objects: [Customer ID, Material ID, Status] & Key Figures: [Qty]
- Fig. 1.7 — Now, we must create Application Component areas. For this, go to ‘Data Sources’ & right-click on the top bar, choosing ‘Create Application Component…’. This creates an Application Component area for the Data Source.
- Fig. 1.8 — The technical name & long description for the Application Component area are entered here.
- Fig. 1.9 — Now, right-click the Application Component area & choose ‘Create Data Source’, which creates the DS for holding object components for transfer.
- Fig. 1.12 — Now that we have created the Data Source, we need to set the parameters for the ‘Extraction’. Here we have settings like ‘Full upload’, 1 header line to be ignored, comma as the data separator, double hyphen as the escape sign, etc. Here we need to specify the name of the flat file which holds the data. The input file can be loaded either from an AL11 common server or directly from the desktop / local NAS / local workstation, etc.
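To see what these extraction settings amount to, here is a small sketch (plain Python; the file name is hypothetical) that reads a flat file with one header line to ignore and a comma data separator, as configured above:

# Illustrative only: reads a flat file the way the extraction settings above
# describe it (1 header line to ignore, comma as data separator).
import csv

def read_flat_file(path, header_lines=1, separator=","):
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=separator)
        rows = list(reader)
    return rows[header_lines:]  # skip the configured number of header lines

# Hypothetical file name for demonstration:
# for row in read_flat_file("dso_test_data.csv"):
#     print(row)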
- Fig. 1.13 — The next tab covers the ‘Proposal’ settings that need to be taken care of for the transformation. Here we can key in the fields one by one (editable) or generate the field proposal (the default). We can also load some example data here.
- Fig. 1.14 — The next tab is ‘Fields’, which asks whether we want to copy the field properties from the Info Object or not. Once done, we can move to the next screen tab, ‘Preview’, which generates a sample output.
- Fig. 1.15 — The last tab is ‘Preview’; the data from the CSV is displayed here. We can easily generate a small sample snapshot of the raw data from the source system at this level.
- Fig. 1.16 — Creation of DSO
Now go to ‘Info Provider’ & right-click on it to choose “Create DataStore Object”.
- Fig. 1.19 — At the DSO level, we need to specify the Key Fields and Data Fields that constitute the DSO. Here we specify [Customer ID, Mat ID] as the Key Fields.
- Fig. 1.20 — Note that [Stats] is assigned as a Data Field — we now need to save this and activate.
- Fig. 1.21 — Before activating, we must also not forget the key figure IO [Quantity]. We need this as a Data Field along with its unit of measure [0UNIT]. Now we can save & activate the DSO.
- Fig. 1.22 — After this step, we need to create a transformation for loading data. We now right-click the DSO and choose “Create Transformation”.
- Fig. 1.23 — At this step, we need to specify the source & target of the transformation. It’s evident that every transformation necessitates a source-to-target loading of data. Here the DSO is the target & the Data Source [DS_TEST_MINE] forms the source. Please note that the source system, being a CSV, is declared as Flat File.
- Fig. 1.24 — This is the full view of the transformation. We can see the whole view of the source DS and its target DSO. Note that the DSO carries the technical characteristic 0RECORDMODE, which holds the record mode used for delta handling.
- Fig. 1.25 — Once we collapse the structure, we can see the actual view without the rule groups.
- Fig. 1.26 — Now we must create a DTP. Right-click on the DSO and choose the option “Create Data Transfer Process”. A DTP is a very important interface / bridge for the transfer of data from the PSA to the Info Provider.
- Fig. 1.27 — As with the transformation, here also we need to specify the source & the target. We have the Data Source [DS_TEST_MINE] as the source & the DSO created in the previous steps as the target. Once done, click OK & proceed to the next step.
- Fig. 1.29 — Data flows from the source to the PSA via the Info Package, which triggers the transfer. Here, just specify the description of the IP and proceed further.
- Fig. 1.30 — In the IP, clicking ‘Start’ with the option “Start Data Load Immediately” will execute the data transfer from the flat file CSV to the Persistent Staging Area. This can also be scheduled as a background job, but for the time being we’ll use the ‘Immediate’ start.
- Fig. 1.31 — Once done, we now click on the “Call Monitor for BI object”.
- Fig. 1.36 — Click & drag the objects as dimensions. We have created Material, Customer & Status dimensions in the Info cube.
- Fig. 1.37 — Saving & activating the Info cube
- Fig. 1.40 — At the transformation level, the fields in the Info Cube group appear aggregated as one for Quantity. But this is only the collapsed view, not what the mapping really looks like.
- Fig. 1.41 — Now, expanding this group, we can note the bifurcation as explained for the Info Cubes.
- Fig. 1.47 — As we know, the DSO has three tables to hold data: the activation queue, the active data table & the change log. Now click on ‘Display Data’ to view the data in the data browser.
- Fig. 1.49 — Notice the activation queue data:
[SID, DATAPID, REC, /BIC/CUSTID, /BIC/MATID, /BIC/STATS, /BIC/QTY, 0UNIT, 0RECMODE]
- Fig. 1.50 — Now click on the “Change Log” (Display Change Log Tab) in the Info provider administration.
- Fig. 1.51 — Now, click on ‘Activate’, which will activate the DataStore object data.
8.1 Troubleshooting
- Fig. 1.53 — The ‘Refresh’ option here will enable the activation of data in the DSO. But note that here we have a small error: the request ID is yellow. We must troubleshoot this error.
- Fig. 1.54 — The error is that we have status [o] (lowercase), but we need status [O] (uppercase).
(P.S. Kindly note that the status value is easily mistaken, as it is case sensitive.)
It should be corrected in the CSV as above.
- Fig. 1.55 — Now that we have changed the status, “Request available for reporting” shows green.
- Fig. 1.56 — After this, we display the data in the data browser.
- Fig. 1.57 — Here, as we click the ‘Active Data’ button, we can view the data in the data browser.
- Fig. 1.58 — Now, note that the data here is [M1000, C1000, ”0″, 100000, KG].
- Please check the status here; note that it is [0].
- Fig. 1.59 — Now we click on CL (Change Log) & display the data in the data browser
- Fig. 1.60 — It is to be noted that the change log registers more parameters than the active data table.
- Fig. 1.61 — Right click on the DSO to display the data.
- Fig. 1.62 — Here we must choose the options necessary for the data view. Click on ‘Fld. Selection for Output’ for more detailed options. In the field selection output, choose the basic details necessary for the user view, like [Qty], [Cust. ID], [Material ID].
- Fig. 1.65 — Once the data is loaded, we check the ‘Display Data’ option in the Info Cube. Right-click on the Info Provider and check the transferred data, choosing the necessary options.
8.2 Handling Summation & Delta Updates
- Fig. 1.67 — Note: Summation is chosen in the Rule Type under Aggregation: Summation (see the sketch after this list).
- Fig. 1.68 — Note: To perform full or delta updates, we need to check the following options.
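To illustrate the difference the aggregation setting makes, here is a minimal sketch (plain Python, illustrative values only) of what happens to a key figure when a record arriving with an existing key is activated:

# Illustrative arithmetic only: effect of the key-figure aggregation setting
# when a record with an existing key is activated.
active_qty, incoming_qty = 100, 50

overwritten = incoming_qty               # Overwrite: new value replaces old -> 50
summed      = active_qty + incoming_qty  # Summation: new value is added     -> 150

print(overwritten, summed)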
9 Summary
A DataStore Object (DSO / ODS object) is used to store consolidated & cleansed data (both master & transaction data) at a document level (i.e., atomic level). It can be perceived as a data set consolidated from one or more Info Sources or Transformations (BI 7.0).