
Housekeeping Activities for Archiving in a BW System

Applies to:

SAP NetWeaver Business Warehouse (formerly SAP BI). This also works on SAP BI 3.5 and BI 7.0.

Summary:

This document explains the various housekeeping activities that need to be performed before implementing archiving in a BW system.

Author: Arpit Khandelwal.

Company: Accenture Services Pvt. Ltd.

Author Bio:


Arpit Khandelwal is an SAP BI consultant currently working with Accenture Services Private Limited. He has around two years of experience in BW/BI implementation and support projects and has been particularly involved in archiving activities in SAP BW systems.

Methodology:

The scope needs to be defined at the start of an archiving project.

For this, you need to identify the DSOs and InfoCubes where most of the data has been retained for many years.

The next part of the project scope covers the basic housekeeping activities for the system/log tables. Apart from the system/log tables, we also need to do housekeeping for the data targets, covering data that may be obsolete for reporting or legal retention purposes. This is where we will particularly focus in this article.

Eventually, on the basis of the DB size and the growth of data in the system/log tables, we need to define the housekeeping activities.

These might include some of the activities below:

1) Deletion of DTP Error Logs

DTP error logs should be deleted using standard SAP programs, so no further check on DTP integrity is required.

DTP error logs are stored in the system log table RSBERRORLOG.

The program RSBM_ERRORLOG_DELETE is used to delete the DTP error logs.

Steps to do the same:

a) Run program RSBM_ERRORLOG_DELETE. The screen below will appear.


b) Enter the Start Date and End Date as required. Alternatively, use the "but always older than … days" field to set the retention period for these logs; everything older than that will be deleted.


c) Execute the program in background.
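The steps above can also be scripted. Below is a minimal, hypothetical ABAP sketch that schedules RSBM_ERRORLOG_DELETE as a background job via the standard function modules JOB_OPEN and JOB_CLOSE. The selection-screen parameter name `p_days` is an assumption for illustration only; check the program's actual selection screen in SE38 before using SUBMIT WITH.

```abap
* Hypothetical sketch: schedule RSBM_ERRORLOG_DELETE in the background.
* The parameter name p_days is an assumption -- verify the real
* selection-screen names in SE38 first.
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_DTP_ERRORLOG_CLEANUP',
      lv_jobcount TYPE tbtcjob-jobcount.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

SUBMIT rsbm_errorlog_delete
  WITH p_days = 90               "keep only the last 90 days of logs (assumed name)
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = 'X'.             "start the job immediately
```

Wrapping the deletion in a named job this way also makes the run easy to find and verify in SM37.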

2) Compression of InfoCube Data

An InfoCube cannot be archived until its requests are compressed. This is a mandatory condition.

Compressing requests will probably be familiar to most of you; the steps are nevertheless provided below.

Steps to do the same:

a) Go to transaction RSA1 and then go to the Manage option of the InfoCube.


b) Go to the Collapse tab and enter the request ID up to which you want to compress the requests.


c) Click Release. The collapse symbol will appear against the cube's requests.
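Before archiving, it can help to confirm programmatically whether a cube still holds uncompressed requests. The hypothetical sketch below reads the standard status table RSMDATASTATE; the field interpretation used here (DMALL as the highest loaded request SID, COMPR as the highest compressed request SID) is an assumption and should be verified in SE11 for your release. The cube name is an example.

```abap
* Hypothetical check: compare the highest loaded request with the
* highest compressed request for one cube. The field names DMALL and
* COMPR are assumptions -- verify table RSMDATASTATE in SE11.
DATA ls_state TYPE rsmdatastate.

SELECT SINGLE * FROM rsmdatastate
  INTO ls_state
  WHERE infocube = 'ZSALES_C01'.   "example cube name

IF ls_state-compr < ls_state-dmall.
  WRITE: / 'Cube still has uncompressed requests - compress before archiving.'.
ELSE.
  WRITE: / 'All requests compressed.'.
ENDIF.
```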


3) Deletion of Unused Aggregates

Aggregates need to be shortlisted for this. The criteria should be that the aggregate is not used in query execution and that it is almost as large as the parent object from which it was created.

Steps to do the same:

1) The function module RSDDK_AGGREGATES_DEACTIVATE can be used to delete the unused aggregates.

2) If this function module is not available, we need to go through the manual process of aggregate deletion:

a) Go to the Maintain Aggregates option of the cube whose aggregate has already been marked for deletion.


b) Select the aggregate to be deleted and click the delete symbol.


PS: Kindly note that you should be very sure and careful about what you are doing at this stage.
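To support the shortlisting, the aggregates of a cube can be listed from the aggregate directory before anything is deleted. This is a hypothetical sketch: RSDDAGGRDIR is the standard aggregate directory table, but the field names INFOCUBE, AGGRCUBE, and OBJVERS used here are assumptions to be verified in SE11, and the cube name is an example.

```abap
* Hypothetical sketch: list a cube's aggregates for review before any
* deactivation/deletion. Field names are assumptions -- check
* RSDDAGGRDIR in SE11 before use.
DATA: lt_aggr TYPE STANDARD TABLE OF rsddaggrdir,
      ls_aggr TYPE rsddaggrdir.

SELECT * FROM rsddaggrdir
  INTO TABLE lt_aggr
  WHERE infocube = 'ZSALES_C01'      "example cube name
    AND objvers  = 'A'.              "active version only

LOOP AT lt_aggr INTO ls_aggr.
  WRITE: / ls_aggr-aggrcube.         "technical name of the aggregate
ENDLOOP.
```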

4) Cleanup of BI Statistics Tables

Old BI statistics data can be deleted through the ABAP program RSDDSTAT_DATA_DELETE. On the selection screen below, we can choose the specific statistics tables to delete from and an "older than" date criterion.
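For recurring cleanup, the program can be submitted with a computed cutoff date instead of filling the selection screen by hand. This is a hypothetical sketch: the parameter name `p_date` and its meaning as the "older than" criterion are assumptions; check the actual selection screen of RSDDSTAT_DATA_DELETE in SE38 before relying on it.

```abap
* Hypothetical sketch: delete BI statistics data older than ~6 months.
* The parameter name p_date is an assumption -- verify the real
* selection-screen names of RSDDSTAT_DATA_DELETE in SE38.
DATA lv_cutoff TYPE sy-datum.

lv_cutoff = sy-datum - 180.      "keep roughly the last 6 months

SUBMIT rsddstat_data_delete
  WITH p_date = lv_cutoff        "assumed 'older than' date parameter
  AND RETURN.
```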



5) Deletion of BW Background Management (RSBATCH) Information

Delete the messages from BI background management as well as the internal parameters of the background processes that are executed by background management on a regular basis. This prevents table RSBATCHDATA from overflowing.

Steps to do the same:

a) Run transaction RSBATCH to open the BI Background Management, Logs and Tools screen.


b) On the next screen, define after how many days the internal messages of BI background management and the internal parameters of the background processes should be deleted. This setting should normally prevent table RSBATCHDATA from being overfilled.

When defining the deletion selections, make sure that you keep the data as long as necessary in order to track any problems that might occur.


c) To define the start conditions for the deletion job, choose Schedule. Select Immediate or an appropriate time on the job scheduling screen.


d) The batch manager log deletion program is then executed in the background.

Verify that the job is executed successfully in SM37.
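Besides checking the job in SM37, the effect of the cleanup can be gauged by counting the rows of RSBATCHDATA (the table named in the text) before and after the run. A minimal sketch:

```abap
* Quick check on RSBATCHDATA growth: run before and after the RSBATCH
* cleanup and compare the counts.
DATA lv_rows TYPE i.

SELECT COUNT(*) FROM rsbatchdata INTO lv_rows.
WRITE: / 'RSBATCHDATA rows:', lv_rows.
```

On very large tables a full count can be slow; running it in a background job avoids dialog timeouts.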


      prashant songara

      This will provide a good overlook to a BW team starting with archiving project for their client.

      I have one question: other than the five points mentioned above, what other points can one think of before starting an archiving project, from a BW perspective? Simple points might help me a lot. Thanks,



      Former Member (Blog Post Author)


      You need to look at your BW system and identify the DSOs and cubes necessary for archiving. In short, you have to carry out an analysis of the data growth rate, which will be highest for some DSOs/cubes.

      Once this is identified, you need to check with your business whether data from older years is still required. For example, say your cube/DSO has data from 2005, but this may not be required by the business for its current purposes or legal retention requirements. Such old data will be obsolete, and hence data deletion for that specific year can also be included in the housekeeping activities.

      Best Regards,


      Anshu Lilhori

      This is really helpful. A lot of information in one shot.



      Former Member (Blog Post Author)

      Thanks a lot for your comments.



      Former Member

      Dear Arpit..

      It is really good and informative. 🙂

      Keep it up..


      Kiran N

      Former Member (Blog Post Author)

      Thanks a lot for your kind remarks.



      SAI V

      Hi Arpit,

      A very good, informative doc on the archiving process. We are currently planning to go for data archiving; before starting, I would like to know how to identify the objects (DSO/cube) that hold huge amounts of data.



      Former Member (Blog Post Author)

      Hi Sai,

      Thanks a lot for your observation, and sorry for coming back late on your query; I am stuck in a heavy DataSource implementation right now 😎

      Yes, how to select the DSOs/cubes is a very important question for us. You need to check in DB02 to find the size pattern of the heavier DSOs/cubes. You also need to take care of the version of BW you are working in; every BW release comes with its own restrictions for archiving objects, e.g. you cannot archive providers containing non-cumulative key figures. You also need to analyse the period of data available in the cubes/DSOs and the business expectation on the same: whether they want to keep the data for the last five years, more, or less.

      This is the summary. I will come out with an SCN document covering the identification points as soon as I am free.



      Former Member

      thanks for sharing your knowledge ...

      happy new year ...


      Former Member (Blog Post Author)

      Hi Marcio,

      Thanks for your comments.

      Best Regards,


      Ashok Babu Kumili


      Very useful info. Thank you.

      Colm Boyle

      Hi there,

      I don't think you have to compress data on the later releases of BW before you archive; I believe this is new functionality in 7.3 SP05 or thereabouts. You can test.

      Also, for analysing large tables, you can check table RSMDATASTATE_EXT for cubes etc. and RSICCONT for DSOs containing a large number of requests/records. Combine this with your expected growth to help determine what housekeeping tasks are required.

      Very Good document,



      Former Member

      Good stuff! This is very useful for me.

      I have another question.

      When I tried to reorganise the database, I found a huge table, RSBMREQ_DTP; the row count is over 190 million.

      Could you guide me on how to purge it?