Technology Blogs by Members
sander_vanwilligen
Active Contributor

The challenging conditions of BI and data warehousing are well known: it is a highly dynamic, volatile and incremental world. From a data warehousing perspective, this means steady growth of data and metadata, driven by new BI applications and newly incorporated organizational units. There is also ongoing pressure to adapt and expand existing BI applications. Sooner or later it becomes obvious to everyone that without standardization of data warehouse management, described by an architecture blueprint, the entire system becomes increasingly difficult and costly to administer and operate. The same applies to the development and maintainability of BI applications. Keeping our service level agreements with respect to reporting availability, on-time delivery (time-to-market), overall stability and performance becomes extremely difficult.

The most popular conceptual standardization of data warehousing is known under the term Enterprise Data Warehouse (EDW), introduced by Bill Inmon. The following key phrases describe the EDW concept: single version of the truth; extract once, deploy many; support the unknown; controlled redundancy; corporate memory; historically complete, comprehensive and granular; integrated view.

SAP’s BW Layered, Scalable Architecture (LSA) describes the design principles of service-level-oriented, scalable, best-practice BW architectures founded on generally accepted EDW principles. The godfather behind LSA is juergen.haupt.

During the past years I have seen many clients struggling with the implementation of their Customer LSA. In my opinion there is an urgent need for sharing implementation best practices: working, easy-to-use, generic and reusable building blocks that standardize and considerably accelerate your LSA implementation.

You might have noticed that I have contributed to this area over the past years, and my mission is to continue publishing in the future. That is why I decided to create an overview blog with links to all my publications, categorized per topic and updated regularly.

Last but not least, I hope that you will continue reading and enjoying my work. I look forward to receiving your experiences, feedback and suggestions. Stay tuned!

Extraction Enhancement

Extraction enhancement can be defined as supplementing an extract structure with additional fields. One part is extending (enhancing) the extract structure with one or more fields. The other part is the program logic that fills these additional fields. The best practice for applying the program logic is the Business Add-In (BAdI) technique, i.e. implementing BAdI RSU5_SAPI_BADI. To make it even more generic, you can create a single “generic” BAdI implementation that dynamically determines and calls the DataSource-specific class. The document below presents such a generic approach.

Implementing Extraction Enhancement using SAPI BAdI - Encapsulation via Classes
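A minimal sketch of such a generic dispatcher is shown below. The naming convention ZCL_EXTR_&lt;DataSource&gt; and the method name ENHANCE are my illustrative assumptions, not part of the standard; only the BAdI method IF_EX_RSU5_SAPI_BADI~DATA_TRANSFORM itself is given by SAP.

```abap
METHOD if_ex_rsu5_sapi_badi~data_transform.
* Generic dispatcher: derive a DataSource-specific class name and,
* if such a class exists, delegate the enhancement logic to it.
* Class naming convention ZCL_EXTR_<DataSource> and method ENHANCE
* are illustrative assumptions.
  DATA lv_clsname TYPE seoclsname.

* e.g. DataSource 2LIS_11_VAITM -> class ZCL_EXTR_2LIS_11_VAITM
  CONCATENATE 'ZCL_EXTR_' i_datasource INTO lv_clsname.

  TRY.
*     Call the DataSource-specific class dynamically
      CALL METHOD (lv_clsname)=>('ENHANCE')
        CHANGING
          c_t_data = c_t_data.
    CATCH cx_sy_dyn_call_error.
*     No specific implementation for this DataSource: leave data unchanged
  ENDTRY.
ENDMETHOD.
```

The dynamic call means a new DataSource only needs its own class following the convention; the BAdI implementation itself never has to be touched again.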

An alternative approach is creating a single “generic” BAdI implementation in combination with an Enhancement Spot. All implementation steps are presented in the following document.

Implementing Extraction Enhancement using SAPI BAdI - Encapsulation via Enhancement Spot

Data Warehouse InfoObjects

In an Enterprise Data Warehousing context, InfoObjects often play an ambiguous double role: they are used both for modeling the Data Warehouse Layer and for multi-dimensional modeling of the Reporting Layer. I advise a segregation of duties by introducing a dedicated, independent set of InfoObjects: Data Warehouse InfoObjects. The following blogs introduce the topic of Data Warehouse InfoObjects.

Introducing Data Warehouse InfoObjects - Part 1: Conceptual Overview

Introducing Data Warehouse InfoObjects - Part 2: Technical Details

I created an ABAP program to generate Data Warehouse InfoObjects for the DataSources of SAP source systems. The following blog series explains how to use the program.

Generating Data Warehouse InfoObjects - Part 1: Introduction

Generating Data Warehouse InfoObjects - Part 2: Metadata Repository

Generating Data Warehouse InfoObjects - Part 3: Customizing

Generating Data Warehouse InfoObjects - Part 4: Optimizing Results

The following three documents provide detailed technical instructions on how to create the ABAP program and all related ABAP Workbench objects.

Implementing Data Warehouse InfoObjects - Part 1: ABAP Dictionary Objects

Implementing Data Warehouse InfoObjects - Part 2: ABAP Programming & Other Objects

Implementing Data Warehouse InfoObjects - Part 3: ABAP Developments in SAP Source System

Mapping Services

In every data flow we have to deal with the first inbound mapping of DataSource fields to InfoObjects within the transformation. It therefore makes sense to identify this process as a modeling pattern and use an ABAP building block to facilitate it. The following blog introduces the topic of Mapping Services.

Mapping Services - Inbound Mapping of DataSource Fields to InfoObjects
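To illustrate the idea, here is a hedged sketch of a reusable mapping class that moves every DataSource field to an InfoObject field derived by a simple naming convention. The class name, method name and the /BIC/Z&lt;field&gt; convention are my assumptions for this example only; the published building block may differ in its details.

```abap
* Hypothetical mapping class: move each source field to the target
* field derived by an assumed naming convention.
CLASS zcl_mapping_services DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS map_record
      IMPORTING is_source TYPE any
      CHANGING  cs_result TYPE any.
ENDCLASS.

CLASS zcl_mapping_services IMPLEMENTATION.
  METHOD map_record.
    DATA lo_struct TYPE REF TO cl_abap_structdescr.
    FIELD-SYMBOLS: <ls_comp> TYPE abap_compdescr,
                   <lv_src>  TYPE any,
                   <lv_tgt>  TYPE any.

*   Describe the source structure and process it field by field
    lo_struct ?= cl_abap_typedescr=>describe_by_data( is_source ).
    LOOP AT lo_struct->components ASSIGNING <ls_comp>.
      ASSIGN COMPONENT <ls_comp>-name OF STRUCTURE is_source TO <lv_src>.
*     Illustrative convention: field MATNR maps to /BIC/ZMATNR
      ASSIGN COMPONENT |/BIC/Z{ <ls_comp>-name }| OF STRUCTURE cs_result
        TO <lv_tgt>.
      IF sy-subrc = 0.
        <lv_tgt> = <lv_src>.
      ENDIF.
    ENDLOOP.
  ENDMETHOD.
ENDCLASS.
```

Because the mapping is driven by runtime type information, one class serves every transformation instead of hand-coding the same field assignments over and over.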

The document covers all details on creating the ABAP Objects classes, including the source code and all necessary ABAP Workbench objects.

Implementing Mapping Services - Inbound Mapping of DataSource Fields to InfoObjects

Data Unification Services

Data Unification in an LSA context is all about supplementing data records with administrative characteristics and can be classified as a modeling pattern. The Data Unification Services ABAP building block and its central control tables are designed to standardize and facilitate this process. In the following blog I discuss Data Unification Services, the concept behind it and how to use it in the transformation.

Data Unification Services - Supplementing Administrative Characteristics
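As a rough sketch of the pattern, an end routine could read the administrative values from a central control table and stamp them on every record. The control table ZUNIFY_CTRL, its key, and the InfoObjects ZSRCSYS (source system) and ZLOADDAT (load date) are all hypothetical names for illustration; the generated type _ty_s_TG_1 follows the usual BW transformation routine convention.

```abap
* Sketch of an end routine supplementing administrative characteristics.
* Control table ZUNIFY_CTRL and InfoObjects ZSRCSYS/ZLOADDAT are assumed.
DATA ls_ctrl TYPE zunify_ctrl.
FIELD-SYMBOLS <result_fields> TYPE _ty_s_tg_1.

* Read the administrative values for this DataSource once per package
SELECT SINGLE * FROM zunify_ctrl INTO ls_ctrl
  WHERE datasource = 'ZSD_DEMO_DS'.          "hypothetical key

LOOP AT result_package ASSIGNING <result_fields>.
  <result_fields>-/bic/zsrcsys  = ls_ctrl-logsys.  "source system
  <result_fields>-/bic/zloaddat = sy-datum.        "load date
ENDLOOP.
```

Centralizing the values in a control table keeps every data flow consistent and makes changes a matter of table maintenance instead of transports.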

The document covers all details on creating the ABAP Objects classes, including the source code and all necessary ABAP Workbench objects.

Implementing Data Unification Services - Supplementing Administrative Characteristics

Reporting InfoObjects

In an Enterprise Data Warehousing context, InfoObjects often play an ambiguous double role: they are used both for modeling the Data Warehouse Layer and for multi-dimensional modeling of the Reporting Layer. I advised a segregation of duties by introducing a dedicated, independent set of InfoObjects: Data Warehouse InfoObjects.

But how about those Reporting InfoObjects? Should we simply activate all the Business Content InfoObjects we need? Or do we have to introduce our own set of InfoObjects, customized and tailored to the business users’ requirements? Or a combination of both? I would like to propose an alternative approach: using a program to generate Reporting InfoObjects in the customer namespace based on Business Content InfoObjects.
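The core of such a generation is a simple name derivation, sketched here purely for illustration (the real program's conventions may differ):

```abap
* Illustrative name derivation: Business Content InfoObject 0MATERIAL
* becomes customer-namespace Reporting InfoObject ZMATERIAL.
DATA: lv_content TYPE rsiobjnm VALUE '0MATERIAL',
      lv_target  TYPE rsiobjnm.

IF lv_content(1) = '0'.
* Replace the Business Content prefix '0' with customer prefix 'Z'
  CONCATENATE 'Z' lv_content+1 INTO lv_target.   "-> ZMATERIAL
ENDIF.
```

The generated InfoObject keeps the Business Content definition as its starting point but lives in your own namespace, so it can be adapted freely without being overwritten by content activation.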

I created an ABAP program to generate Reporting InfoObjects based on Business Content. The blog series explains how to use the program.

Generating Reporting InfoObjects based on Business Content - Part 1: Introduction

Generating Reporting InfoObjects based on Business Content - Part 2: Metadata Repository & Customizi...

Generating Reporting InfoObjects based on Business Content - Part 3: Optimizing Results

The following document provides detailed technical instructions on how to create the ABAP program and all related ABAP Workbench objects.

Implementing Reporting InfoObjects based on Business Content

Pattern-based Partitioning

In this blog series I would like to share how to realize central definition and maintenance of partitioning patterns. The standard Semantically Partitioned Object (SPO) functionality of SAP NetWeaver BW 7.3 is enhanced by implementing Business Add-In (BAdI) RSLPO_BADI_PARTITIONING and a few control tables. This facilitates automated (re)partitioning, either individually or collectively, and enables partitions, criteria, texts and Data Transfer Processes (DTPs) to be generated automatically and consistently.
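To give a feel for pattern-based derivation, the sketch below expands a yearly pattern into a list of partitions with numbers, criteria values and texts. This is deliberately not the real BAdI interface; the structure, the pattern boundaries (2010..2015) and the characteristic 0CALYEAR are assumptions for illustration only.

```abap
* Sketch (not the real BAdI interface): derive one partition per year
* from an assumed pattern 'YEARLY' on 0CALYEAR, range 2010..2015.
TYPES: BEGIN OF ty_partition,
         partno  TYPE n LENGTH 2,
         calyear TYPE n LENGTH 4,
         text    TYPE string,
       END OF ty_partition.

DATA: lt_partitions TYPE STANDARD TABLE OF ty_partition,
      ls_partition  TYPE ty_partition,
      lv_year       TYPE n LENGTH 4 VALUE '2010'.

WHILE lv_year <= '2015'.
  ls_partition-partno  = sy-index.
  ls_partition-calyear = lv_year.
  CONCATENATE 'Year' lv_year INTO ls_partition-text SEPARATED BY space.
  APPEND ls_partition TO lt_partitions.
  lv_year = lv_year + 1.
ENDWHILE.
```

In the actual solution this derivation would be driven by entries in the control tables, so that repartitioning a whole landscape is reduced to maintaining the pattern once.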

Pattern-based Partitioning using the SPO BAdI - Part 1: Introduction

Pattern-based Partitioning using the SPO BAdI - Part 2: BAdI Managed Maintenance

Pattern-based Partitioning using the SPO BAdI - Part 3: Partitioning Patterns

Pattern-based Partitioning using the SPO BAdI - Part 4: Use Cases

The following documents contain all technical details of creating the control tables and the related ABAP Dictionary objects, implementing BAdI RSLPO_BADI_PARTITIONING and the necessary object-oriented ABAP programming in the implementing class.

Implementing Pattern-based Partitioning using the SPO BAdI - Part 1: Control Tables

Implementing Pattern-based Partitioning using the SPO BAdI - Part 2: BAdI Implementation

Data Flows and Data Flow Templates

SAP BW release 7.3 introduced a new modeling object called the Data Flow. A Data Flow acts as a container for storing the data modeling objects of a data flow, e.g. InfoProviders, Transformations, DTPs, etc. It can also be used to incorporate documentation belonging to its data modeling objects. Furthermore, it is possible to define tailor-made Data Flow Templates to facilitate standardization of data flow patterns in the context of your SAP BW implementation and architecture guidelines.

In the following blog I discuss standardizing data flow patterns using Data Flow Templates, creating new Data Flows based on such a template, and the advantages of this approach.

Standardizing Data Flow Patterns using Data Flow Templates
