BW Analysis of Obsolete Data for Performance Optimization and System Stability
Analysis of Obsolete Data
In large SAP BW implementations that have evolved over time, we find many solutions that are either not used at all or not used efficiently. This calls for optimization to make the system leaner, and deletion or archival of the unused data becomes a necessity.
Companies and vendors follow different approaches to achieve this. In this paper I discuss the basic criteria to consider when identifying data that can be removed from an SAP BW system.
When discussing data reduction, we need to focus primarily on the physical data stores (InfoCubes, DSOs, master data). The transformations that alter data volume between layers are purely driven by business logic; there could be good scope for improvement by re-engineering the entire solution, but since that is specific to the business requirement, we cannot generalize an approach for optimizing transformations. I shall therefore restrict the approach to a purely technical, generalized plan that can be applied across any SAP BW landscape irrespective of the business vertical.
We shall now discuss the main points to consider when analyzing the physical data stores and their usage.
The physical data stores, as mentioned earlier, can be InfoCubes, master data, or DSOs. The following are the first questions to ask when analyzing whether the data in these objects is still relevant:
- Is the data being loaded to the Object?
- Is the object sourcing data to further targets? Direct loads or Look-ups.
- Is the object being used for reporting?
We shall discuss each of these three points in detail:
1. Is the data being loaded to the Object?
In the Manage screen of the InfoProvider, check the date of the last loaded request.
If there are multiple objects to analyze, use table RSSMONICDP, which gives the details at the individual request level.
Based on this analysis, you can determine when data was last loaded into the InfoProvider.
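The request-level check above can be sketched in ABAP as a single aggregation over the request table. Note that the field names (ICUBE, TIMESTAMP) are assumptions for illustration; verify the actual columns of RSSMONICDP in SE11 before running this.

```abap
* Sketch: most recent load per InfoProvider from the request table.
* Field names ICUBE and TIMESTAMP are assumed - check them in SE11.
DATA lr_provider TYPE RANGE OF rsinfocube.

SELECT icube,
       MAX( timestamp ) AS last_load
  FROM rssmonicdp
  WHERE icube IN @lr_provider
  GROUP BY icube
  INTO TABLE @DATA(lt_last_load).

* Providers whose last_load is older than your retention threshold
* are candidates for the deeper analysis in the next two steps.
```

Sorting lt_last_load by last_load ascending gives you a worklist of the longest-untouched InfoProviders first.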
2. Is the object sourcing data to further targets? Direct loads or Look-ups
The next phase of the analysis is to determine whether the object is sourcing data to further data targets, either through direct transformations or through code look-ups. If there are structure references, I would suggest not deleting the object completely; rather, move it under an "Obsolete" InfoArea for better categorization (discussed further below).
Dataflow through Transformations:
Whether the object being analyzed sources data to other objects through transformations can be checked quickly by displaying its data flow upwards in the Data Warehousing Workbench (RSA1).
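For a list-based check across many objects, the transformation metadata table RSTRAN can be queried directly instead of navigating the data flow display. The exact field names (SOURCENAME, TARGETTYPE, TARGETNAME, OBJVERS) should be verified in SE11; this is a sketch, not a definitive report.

```abap
* Sketch: active transformations that read from a given object.
* Assumed RSTRAN columns: TRANID, SOURCENAME, TARGETTYPE, TARGETNAME,
* OBJVERS - verify in SE11 before use.
DATA lv_object TYPE c LENGTH 30 VALUE 'ZSALES_DSO'.  "hypothetical name

SELECT tranid, targettype, targetname
  FROM rstran
  WHERE sourcename = @lv_object
    AND objvers    = 'A'          "active version only
  INTO TABLE @DATA(lt_targets).

* An empty lt_targets means no active transformation sources data
* from this object - it may still be read via code look-ups, though.
```

An empty result here is necessary but not sufficient: look-ups inside routines (next point) bypass the transformation metadata entirely.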
Typically, look-ups read from DSO active tables and master data P tables. It is very important to check for any look-ups when analyzing an object before deleting its data: even if no data loads are happening upstream, the data may still be read by look-ups elsewhere. The easiest way to find active look-ups is to run a where-used list on the active table throughout the system.
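As a complement to the where-used list, the generated routine source can be scanned for the active table's name. The sketch below assumes routine code is stored line by line in table RSAABAP (fields CODEID, OBJVERS, LINE); both the table layout and the /BIC/A&lt;dso&gt;00 naming pattern for the active table should be confirmed for your release before relying on this.

```abap
* Sketch: find transformation routines whose source code mentions
* the DSO's active table /BIC/A<dso>00.
* Assumed RSAABAP columns: CODEID, OBJVERS, LINE - verify in SE11.
DATA lv_dso     TYPE c LENGTH 30 VALUE 'ZSALES'.   "hypothetical DSO
DATA lv_pattern TYPE string.

lv_pattern = '%/BIC/A' && lv_dso && '00%'.

SELECT DISTINCT codeid
  FROM rsaabap
  WHERE objvers = 'A'
    AND line LIKE @lv_pattern
  INTO TABLE @DATA(lt_routine_hits).

* Each CODEID hit points to a routine that references the active
* table - trace it back to its transformation before deleting data.
```

This catches hard-coded SELECTs in routines that a metadata-only check (RSTRAN) would miss, though dynamically built table names would still escape it.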
3. Is the object being used for reporting?
Instead of analyzing every report built on the object, you can simply check the standard statistics table RSDDSTAT_DM.
Enter the object name in the selection field INFOCUBE. The table lists all the reports that fetch data from the given object, along with the users who ran them and how often. However, please note that this table stores data for the past 30 days only.
You may also refer to the Technical Content statistics InfoProviders (0TCT_*) to analyze object usage over a longer horizon.
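The reporting-usage check can likewise be done as a table read rather than through SE16. Only the INFOCUBE field is named by the text; the remaining columns (query name, user, timestamp) vary by release and should be inspected in SE11 first.

```abap
* Sketch: statistics rows for one InfoProvider from RSDDSTAT_DM.
* Only INFOCUBE is a confirmed column; inspect the query/user/time
* columns for your release in SE11 before aggregating on them.
DATA lv_provider TYPE c LENGTH 30 VALUE 'ZSALES_C01'.  "hypothetical

SELECT *
  FROM rsddstat_dm
  WHERE infocube = @lv_provider
  INTO TABLE @DATA(lt_stats).

* No rows within the ~30-day retention window suggests the provider
* is not being reported on - cross-check with 0TCT_* content for a
* longer history before concluding it is obsolete.
```

Because the statistics tables are periodically purged, an empty result proves only recent inactivity; the 0TCT_* InfoProviders give the longer-term picture.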
Based on the above technical analysis, conclusions can be drawn on whether the data in each InfoProvider needs to be retained, deleted, or archived. This can yield significant space savings on the server, better performance, a leaner system, and reduced maintenance effort.