Generating Reporting InfoObjects based on Business Content – Part 1: Introduction
In an Enterprise Data Warehousing context, InfoObjects often play an ambiguous double role: they are used both for modeling the Data Warehouse Layer and for multi-dimensional modeling of the Reporting Layer. In my blog Introducing Data Warehouse InfoObjects – Part 1: Conceptual Overview I advised segregation of duties by introducing a dedicated, independent set of InfoObjects: Data Warehouse InfoObjects.
But what about those Reporting InfoObjects? Should we simply activate all the Business Content InfoObjects we need? Or do we have to introduce our own set of InfoObjects, customized and fit to the Business Users’ requirements? Or a combination of both? In this blog I would like to present an alternative approach.
I created an ABAP program to generate Reporting InfoObjects based on Business Content. This blog series explains how to use the program. In Part 1 we will have a look at the rationale, the program, the application log, the generated InfoObjects and Template InfoProvider.
The blog series consists of the following blogs in addition to this blog:
- Generating Reporting InfoObjects based on Business Content – Part 2: Metadata Repository & Customizing;
- Generating Reporting InfoObjects based on Business Content – Part 3: Optimizing Results.
The document Implementing Reporting InfoObjects based on Business Content provides detailed technical instructions on how to create the ABAP program and all related ABAP Workbench objects.
Since the earliest SAP NetWeaver BW releases, SAP delivers so-called Business Content (a.k.a. BI Content): a multitude of BW data modeling objects, amongst others InfoObjects. Strong advantages can be materialized in pure SAP implementations. The Business Content is developed in sync with the SAP source system and perfectly complements standard business processes with analytical scenarios.
However, in my opinion there are some drawbacks to take into account. Activation of Business Content can lead to a massive number of new InfoObjects: all dependencies are considered and can go many levels deep. This can result in an extensive data model which might also include unused SAP modules, business processes and even industry solutions. Such a data model becomes increasingly difficult to understand and won’t make any sense from a Business User’s perspective.
The installation of Business Content in a productive system can even be dangerous. There are many cases where previously activated Business Content is enhanced. These enhancements can be overwritten by an inappropriate activation. No matter how experienced you are, one day it can happen to all of us.
I would like to propose an alternative approach: generating Reporting InfoObjects in the customer namespace based on Business Content InfoObjects using a program. All mandatory dependencies will be respected (i.e. compounding InfoObjects and reference InfoObjects). For Characteristics however, generation of attributes will be restricted to the highest level. This will prevent an uncontrolled expansion of the data model as we can observe with the Business Content activation.
Starting the Program
You can start the program by using t/code YRIOBJ.
Figure 1: Selection Screen
There are 3 ways to run the program:
- For one or more single Business Content InfoObjects;
- For one single Business Content InfoCube;
- For one single Business Content DataStore Object.
Make the appropriate selection on the selection screen; you can use the F4 search help functionality. The program validates the input and issues an error message in case of any incorrect entry. Press the Execute push button to start processing.
Note that the program will check on authorization object YBWREPIOBJ. Please make sure that an appropriate authorization role is assigned to your user-id. This will be explained in Part 2 of the blog series.
Analyzing the Application Log
As the last processing step the program will display an application log.
Figure 2: Application Log
The program collects all messages issued during processing and adds them to the application log. Here you can obtain an overview of all InfoObjects that have been generated, as well as the Template InfoProvider. Any error messages can also be found here. The various processing blocks can be identified by the “Start of processing” and “End of processing” messages.
Note that you can always review previous application logs retrospectively via t/code SLG1.
Make sure to fill in appropriate selection criteria such as Object YBW, Sub Object YBWREPIOBJ, date/time and user-id to narrow down the search results.
Figure 3: Analyze Application Log
The program checks the Metadata Repository to see if an InfoObject already exists for the respective Business Content InfoObject. If so, it skips this InfoObject. Otherwise it generates a new InfoObject, appends it to the central Metadata Repository table and adds it to the appropriate InfoObject Catalog.
Figure 4: InfoObject Catalogs for Generated InfoObjects
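For illustration, the skip-or-generate decision described above could be sketched in ABAP along these lines. Note that the structure of the custom Metadata Repository table YBWREPIOBJ is only covered in Part 2; the column names IOBJNM_BC and IOBJNM_NEW used here are assumptions for the sake of the sketch.

```abap
* Sketch of the skip-or-generate check. The columns IOBJNM_BC and
* IOBJNM_NEW of the custom table YBWREPIOBJ are assumed names.
DATA: lv_iobjnm_bc  TYPE rsiobjnm,
      lv_iobjnm_new TYPE rsiobjnm.

SELECT SINGLE iobjnm_new FROM ybwrepiobj
  INTO lv_iobjnm_new
  WHERE iobjnm_bc = lv_iobjnm_bc.

IF sy-subrc = 0.
* A Reporting InfoObject was generated before: skip it
ELSE.
* Generate the new InfoObject, append it to YBWREPIOBJ and
* assign it to the appropriate InfoObject Catalog
ENDIF.
```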
Please note that there is a restricted set of “special” InfoObjects which is excluded from the generation process. It concerns InfoObjects in table RSDIOBJFIX which have a special purpose in the system. One can think of Time Characteristics but also Characteristics like 0LANGU, 0UNIT and 0CURRENCY.
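The exclusion can be implemented as a simple lookup on RSDIOBJFIX. The key field name IOBJNM used below is my assumption; please verify the actual field name in SE11 on your release.

```abap
* Skip "special" InfoObjects such as 0LANGU, 0UNIT and 0CURRENCY.
* The key field IOBJNM of table RSDIOBJFIX is an assumed name.
DATA: lv_fix TYPE rsiobjnm.

SELECT SINGLE iobjnm FROM rsdiobjfix
  INTO lv_fix
  WHERE iobjnm = lv_iobjnm_bc.

IF sy-subrc = 0.
* InfoObject has a special purpose: exclude it from generation
  RETURN.
ENDIF.
```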
Generated Template InfoProvider
After processing the InfoObjects, the program generates a so-called Template InfoProvider, depending on the processing mode. In my example the processing mode was InfoCube and the program generated a Template InfoCube.
Figure 5: Generated Template InfoProviders
Such a Template InfoProvider acts as a container for all InfoObjects and can be used as a starting point for creating your own DataMart.
In this blog we discussed the rationale of generating Reporting InfoObjects based on Business Content, the program, the application log, the generated Reporting InfoObjects and the Template InfoProvider. In Generating Reporting InfoObjects based on Business Content – Part 2: Metadata Repository & Customizing we will have a look at the Metadata Repository and Customizing. In Generating Reporting InfoObjects based on Business Content – Part 3: Optimizing Results we will discuss several ways to optimize the results.
Unfortunately, when activating Business Content, not only does a large number of unnecessary objects get activated, but the InfoObject Catalogs and InfoAreas into which they are activated also become very cluttered. This method might make everything more manageable.
I think that while Business Content is very helpful for speeding up the development process, I somehow wish we could use it as a building block without even activating it, for the reasons above and because we will be maintaining two namespace standards, which won't be good for documentation purposes.
I hope the implementation guide will come soon enough.
First of all, thanks a lot for your positive feedback. I totally agree with the remarks you made, this drove me into the direction of developing this program.
I am convinced that the program can contribute to maintaining a transparent and sustainable data warehouse architecture. You can rely on your own InfoArea hierarchy and InfoObject Catalogs. The program does not activate any Business Content, it only reads the metadata of version 'D' which is available in the standard SAP tables.
I have already prepared the implementation guide and I am about to publish it. You can be sure that it will come soon.
Last but not least, I hope that you have the opportunity to try out the program. Please share your findings.
Thanks for the response.
There is also another thing which might hamper the use of this program: the routines within transformations.
I read Part 3 on this topic and I see that the program is not yet able to utilize the transfer routines within InfoObjects, which I think also means it cannot utilize the start/end routines and field routines within transformations from Business Content. For InfoObjects the problem is minor, since the number of activated content objects is very small, but that is not the case for routines within transformations.
Maybe the only way around this is to do it manually, but the number of objects involved in such a manual activation is nearly the same as with the normal method; e.g. we need to activate the source and target InfoProviders, which will also activate the dependent content InfoObjects.
Unless there is a way to also copy these transformation routines, we will need to copy the content transformations to the generated InfoProviders manually, and then perhaps delete the activated content objects again to keep everything clean.
I will still look forward to the program though.
Thanks again for your remarks. I am glad to announce that the implementation guide is available now. Please see Implementing Reporting InfoObjects based on Business Content.
I can confirm that the program will not generate any transformation routines; that was never the intention. The purpose of the program is to generate InfoObjects, with all dependent InfoObjects and with attributes restricted to the highest level. Additionally, it can generate a Template InfoProvider depending on the choice on the selection screen.
Transformations are another topic and, in my opinion, hard or maybe even impossible to automate. Let me explain my point of view and how I deal with it in practice.
First of all, the Business Content data flow might be quite different from your custom data flow. It will be dependent on your data warehouse architecture with particular mandatory layers. In other words, the transformations can be positioned in a totally different place in your own data flow (in an LSA context usually in the Harmonization Layer and Business Transformation Layer).
In transformations with routines, you will nowadays normally encounter migrated routines, i.e. routines which were developed using 3.x data staging technology and "technically migrated" to 7.x transformations; refer to SAP Note 1052648 - Migrating transfer rules and update rules for BW 7.x for more information. From a maintenance perspective, I would really advise against this. Let's invest a bit more time to understand what is going on in the routines and recreate them in the proper way, without all the overhead of the migration. As a positive side effect, you will understand much better what the purpose of the routine is.
Be aware that different field names will be used in Business Content transformations. Here the field names will correspond to Business Content InfoObjects. You will have to translate them to field names which correspond to your own Reporting InfoObjects.
As a work-around to cover the Business Content transformations, I use a 2-step approach in practice. First of all, I try to cover all one-to-one/inbound transformations by analyzing tables RSTRAN and RSTRANFIELD. There you can obtain the assignment of DataSource fields to Business Content InfoObjects. Using the central Metadata Repository table YBWREPIOBJ you can complete the mapping.
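As an illustration of this first step, the assignments could be read along the following lines. The column names of RSTRANFIELD referenced here are assumptions on my side, so please verify the actual field names in SE11 on your release.

```abap
* Sketch: find Business Content transformations with the DataSource
* as source and read their field assignments. The column names of
* RSTRANFIELD are assumed; verify them in SE11.
DATA: lv_datasource TYPE rstran-sourcename,
      lt_tran       TYPE STANDARD TABLE OF rstran,
      lt_fields     TYPE STANDARD TABLE OF rstranfield.

SELECT * FROM rstran INTO TABLE lt_tran
  WHERE sourcename = lv_datasource
    AND objvers    = 'D'.              "delivered version

IF lt_tran IS NOT INITIAL.
  SELECT * FROM rstranfield INTO TABLE lt_fields
    FOR ALL ENTRIES IN lt_tran
    WHERE tranid  = lt_tran-tranid
      AND objvers = lt_tran-objvers.
ENDIF.

* Next: translate each Business Content InfoObject to the generated
* Reporting InfoObject via the central table YBWREPIOBJ.
```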
Please note that with Data Warehouse InfoObjects, you have a second mapping step to take into account (DataSource fields to Data Warehouse InfoObjects - table YBWMAPPING). Refer to my blog Generating Data Warehouse InfoObjects - Part 1: Introduction for more information.
As the next step, I normally activate in a Sandbox system the Business Content data flows which you are working on. Here you can analyze in detail all transformations and make proposals for recreating them in the Development system, especially if routines are used.
My conclusion is that only a human being can make the appropriate decisions. In a way you have to re-engineer the transformations. Hopefully you will have a lot of one-to-one transformations which are relatively easy to recreate. Routines are a different story and can turn out to be time-consuming to recreate (depending on your approach and ambition level).
Maybe a future development could be a program to generate transformation proposals. Again, the one-to-one/inbound transformations will be the easy ones. There are certainly ways to indicate where routines are used in transformations. At this point-in-time, I am not able to offer any better solution than the work-around which I described.
Let's stay in touch and please share your findings.