  1. Introduction

The purpose of this process is to move data from the original cubes to new “archiving” cubes, reducing the amount of data stored in the original cubes and making the most recent data more accessible.

  • Why archive data:

Year over year the BW database grows, and we can find cubes with huge volumes of data whose size impacts BW activities such as loads, BIA re-builds, etc., even though we are not always accessing the historical data.

  • How archive data:

     We’ll move data from our current cubes to new “Archiving” cubes (copies of the originals), reducing the original cubes’ size and keeping only the commonly accessed data (the last 2 fiscal years).

AR_1.PNG

  • Objective:
    • Restrict the current size of the SAP BI database
    • Reduce risks on the BIA server
    • Reduce backup time
    • Increase performance

2. Archiving Process:

    – > Move data from the original cubes to new “archiving” cubes, reducing the amount of data stored in the original cubes and making the most recent data more accessible.

AR_2.PNG

Common back-end architecture:

Ar_3.PNG

In the back end, a new MultiProvider is created that includes both the original InfoCube and the archiving cube; the data is then moved from the original InfoCube to the archiving InfoCube.

Four different steps in the Archiving Process

1. Copy data

    –  Movement of data from the original cube that we want to archive to the “Archiving” cube that will receive that data.

2. Validation of copied data.

  –   Check that the data sent to the “Archiving” cube is the same as the data in the “Original” cube to be archived.

3. Drop data from origin.

  –  Delete the archived data from the “Original” Cube.

4. Validation of dropped data

–   Check that data was deleted (ensuring we only drop archived data).
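The four steps above can be sketched as a simple gated pipeline: each validation must succeed before the next step runs, so archived data is never dropped from the original cube unless the copy has been verified. This is an illustrative sketch, not the tool’s actual code; the step callables are placeholders.

```python
# Illustrative sketch of the four-step archiving process: each step runs
# only if the previous one succeeded, so the drop never happens on
# unvalidated data. Steps are placeholder callables returning True/False.

def run_archiving(copy, validate_copy, drop, validate_drop):
    """Run the four steps in order; stop at the first failure."""
    steps = [("copy", copy), ("validate_copy", validate_copy),
             ("drop", drop), ("validate_drop", validate_drop)]
    completed = []
    for name, step in steps:
        if not step():
            return completed, name   # report which step failed
        completed.append(name)
    return completed, None

# Example: if copy validation fails, the drop step is never reached.
done, failed = run_archiving(lambda: True, lambda: False,
                             lambda: True, lambda: True)
```

The key design point is that validation failures halt the pipeline, which is what makes the process safe to run against production cubes.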

3. Archiving tool:

We created the Archiving Tool to manage our archiving process and avoid risks during its execution. To access this tool we created a customized T-Code: Z_ARCHIVING.

Archiving Tool shows an interface like this:

ar4.PNG

We can identify two different areas in the screen: the Process Monitor and the Archiving Process.

3.1. BR Archiving Tool – Process Monitor area

The Process Monitor area helps us identify which archiving processes are running or were executed, and the latest status of any process in progress.

You can see a grid with the information of each process.

ar5.PNG

Here is what each column means:

Infoprovider: Contains the technical name of the cube that we are archiving.

Closing Period: Contains the period (snapshot) that we are archiving. Currently there are three types of snapshots: Fiscal Year Period, Calendar Year Month and Forecast Period.

Start Date: Contains the date when the archiving process started.

End Date: Contains the date when the archiving process is completed. While process is running this field is empty.

Last Update Time Stamp: Contains the date and time when the status was last modified. This value is updated at the beginning and at the end of each sub-process.

User Name: Contains the user who executed the process. If a process is executed by more than one person (one executing one sub-process and another the rest), it is updated with the user who executed the last sub-process.

Sub Status: Contains the status of the process, showing what is happening at that moment (which sub-process is done or running). You can find three lights: yellow when a sub-process is running, green when a sub-process is complete and red when a sub-process failed.

Main Status: Contains the status of the whole process. When the process starts, the light turns yellow and stays yellow until the whole process completes, when it turns green.
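A minimal model of how the monitor’s two lights could be derived from the sub-process states is sketched below. The class and field names are invented for the example; the real grid is an ALV in the tool.

```python
# Hypothetical sketch of the monitor row: the Sub Status light reflects the
# most recent sub-process, while Main Status stays yellow until all four
# sub-processes have completed. Names are illustrative, not the tool's own.
from dataclasses import dataclass, field

LIGHT = {"running": "yellow", "complete": "green", "failed": "red"}

@dataclass
class MonitorRow:
    infoprovider: str
    closing_period: str
    sub_process_states: list = field(default_factory=list)  # execution order

    def sub_status(self):
        # Light of the most recently started sub-process.
        if not self.sub_process_states:
            return None
        return LIGHT[self.sub_process_states[-1]]

    def main_status(self):
        # Green only when all four sub-processes completed successfully.
        done = (len(self.sub_process_states) == 4 and
                all(s == "complete" for s in self.sub_process_states))
        return "green" if done else "yellow"

row = MonitorRow("Z_CEME101", "012.2005", ["complete", "running"])
```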

3.2. BR Archiving Tool – Archiving Process area

The Archiving Process area allows you to execute the archiving process and provides direct access to the Drop BIA Index and BIA Index Re-build programs used to refresh the BIA.

ar6.PNG

Drop BIA button: Opens a new session with the T-Code used by Operations to delete the BIA index tables, the first step in refreshing the BIA.

BIA Re-build button: Opens a new session with the T-Code used by Operations to generate and fill the BIA index tables.

BIA Status light: Shows the current status of the BIA index for the selected InfoProvider. There are two statuses: green and yellow.

Green means that the BIA index for the selected cube was updated after the last completed archiving process for that cube.

Yellow means that the BIA index for the selected cube has not yet been updated after completing the archiving process. This works as a reminder that you should refresh the BIA index after finishing the archiving process; otherwise, data could be shown twice in reports.
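The light’s logic amounts to a timestamp comparison. This is an assumed, simplified sketch of that rule; the tool’s actual check is not shown in the post.

```python
# Sketch of the BIA status light rule described above (assumed, simplified):
# green if the BIA index was refreshed after the last completed archiving
# run for the cube, yellow otherwise.
from datetime import datetime

def bia_status(last_archiving_end, last_bia_refresh):
    """last_bia_refresh may be None if the BIA was never refreshed."""
    if last_bia_refresh is not None and last_bia_refresh >= last_archiving_end:
        return "green"
    return "yellow"

# Refresh happened before the archiving run finished -> stale index.
stale = bia_status(datetime(2011, 3, 1, 10, 0), datetime(2011, 3, 1, 9, 0))
# Refresh happened after the archiving run finished -> index is current.
fresh = bia_status(datetime(2011, 3, 1, 10, 0), datetime(2011, 3, 1, 11, 0))
```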

4. BW ARCHIVING – EXECUTING THE PROCESS

4.1. Data Copy Process:

To start the process, enter a valid From/To range of periods that you want to archive (valid values are covered in the next chapter) and press the Execute button.

After you press Execute, the tool validates the input fields and, if everything is correct, shows a confirmation message:

ar7.PNG
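The input validation could look something like the sketch below. The exact checks performed by Z_ARCHIVING are not shown in the post; this only illustrates the idea of validating a fiscal-period range in the PPP.YYYY form used in the examples (periods restricted to 1–12 here for simplicity).

```python
# Hedged sketch of a From/To period validation for values like "001.2005".
# The real tool's rules are not documented here; this is illustrative only.
import re

PERIOD_RE = re.compile(r"^(\d{3})\.(\d{4})$")

def parse_period(value):
    """Parse 'PPP.YYYY' into a sortable (year, period) tuple."""
    m = PERIOD_RE.match(value)
    if not m:
        raise ValueError(f"invalid period format: {value!r}")
    period, year = int(m.group(1)), int(m.group(2))
    if not 1 <= period <= 12:
        raise ValueError(f"period out of range: {value!r}")
    return (year, period)

def validate_range(from_value, to_value):
    """True when both inputs parse and From does not exceed To."""
    return parse_period(from_value) <= parse_period(to_value)

ok = validate_range("001.2005", "012.2005")
```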

In this process, we trigger a process chain that uses a DTP and follows a standard BW data load.

Ex: Z_CEME101_ARCHI_COPY

ar8.PNG

Based on the DTP filter, the data is copied from the original cube to the archiving cube.

If the DTP filter is set to fiscal year 001.2005–12.2005, all the data in the cube for fiscal year 2005 is copied to the archiving cube. Once the data is copied to the archiving cube, we perform the roll-up for the archiving cube.
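Logically, the copy DTP selects the rows whose fiscal period falls inside the filter range and loads them into the archiving cube. The sketch below shows that selection; the record layout and the YYYYPPP period format (SAP’s internal FISCPER representation) are assumptions for the example.

```python
# Minimal sketch of what the copy DTP does logically: keep only rows whose
# fiscal period (internal format YYYYPPP, e.g. "2005001") lies inside the
# filter range. The row layout is invented for this example.

def copy_by_filter(original_rows, from_period, to_period):
    """Return the rows to load into the archiving cube."""
    # Same-length YYYYPPP strings compare correctly lexicographically.
    return [r for r in original_rows
            if from_period <= r["fiscper"] <= to_period]

original = [
    {"fiscper": "2005001", "amount": 100.0},
    {"fiscper": "2005012", "amount": 50.0},
    {"fiscper": "2006001", "amount": 75.0},   # outside the filter, stays put
]
archived = copy_by_filter(original, "2005001", "2005012")
```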

4.2. Copy Validation Process:

Copy Data Validation executes a report in the background to check whether there is any variance between the data in the original cube and the data copied to the archiving cube. The whole process is triggered by one job, so you can use the Job Overview T-Code (SM37) to check the status of the validation.

When you execute the Copy Validation process, a job called Z_VAL_COPY is created, so you can search for it to check the status. Keep in mind that the job is created and triggered under your user, to make the search easier.

First, the report is executed on the original cube for the fiscal year whose data was copied to the archiving cube, and the results are stored in a table. Then the same report is executed on the archiving cube, and the result obtained is compared with the original cube’s data stored in the table.

If there are no variances in the comparison, the validation is successful; otherwise, it fails.

All of this is handled by the job Z_VAL_COPY.

ar9.PNG
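The comparison itself can be thought of as aggregating the key figures per period on both cubes and requiring an exact match. The real Z_VAL_COPY report is not shown in the post; this is an illustrative equivalent with an invented row layout.

```python
# Sketch of the copy-validation idea: total the key figures per period on
# both cubes and compare. Illustrative only; not the actual Z_VAL_COPY code.
from collections import defaultdict

def totals_by_period(rows):
    acc = defaultdict(float)
    for r in rows:
        acc[r["fiscper"]] += r["amount"]
    return dict(acc)

def copy_is_valid(original_rows, archived_rows):
    # Validation succeeds only if every key-figure total matches exactly.
    return totals_by_period(original_rows) == totals_by_period(archived_rows)

orig = [{"fiscper": "2005001", "amount": 100.0},
        {"fiscper": "2005002", "amount": 40.0}]
arch = [{"fiscper": "2005001", "amount": 100.0},
        {"fiscper": "2005002", "amount": 40.0}]
valid = copy_is_valid(orig, arch)
```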

4.3. Drop Process:

In this process, the data for the fiscal year that was copied to the archiving cube is deleted from the original cube. Basically, we delete the request IDs from the cube for that particular fiscal year.

The job Z_DROP_PROCESS is always created when we execute the Drop process, and it is the main job. It contains the different phases of the process, including the creation of the expected-results data before deletion (used in the Drop Validation process) and the selective deletion of data when necessary.

ar10.PNG

Jobs triggered for deletion by Request ID: When the program identifies that a set of data can be deleted from the cube by Request ID (possible when a Request ID only contains data for periods within the range being archived), it creates as many jobs as there are Request IDs usable for deletion. The created jobs follow the naming convention XXXXXXXXX_REQU_YYYYYYYYYYY, where X is the InfoProvider’s technical name and Y is the request ID (first characters). For example, Z_CEME101_REQU_0C5GH32DXBM.

Points to keep in mind:

– The Drop Process does not always create jobs to delete data by request ID, so it can be perfectly correct not to find this type of job.

– Sometimes the main job (Z_DROP_PROCESS) finishes long after the request-ID jobs have finished. This can happen because the process is also executing a selective deletion.
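The decision described above, whole-request deletion where possible and selective deletion for the rest, can be sketched as follows. The data structures are invented for the example; the real program works on actual request metadata.

```python
# Illustrative logic for the drop phase: a request ID can be deleted whole
# only when every period it contains lies inside the archived range;
# requests that straddle the range need a selective deletion instead.

def plan_drop(requests, from_period, to_period):
    """Split request IDs into whole-request deletions and selective ones."""
    def in_range(p):
        return from_period <= p <= to_period

    whole, selective = [], []
    for req_id, periods in requests.items():
        if all(in_range(p) for p in periods):
            whole.append(req_id)       # gets its own XXXX_REQU_<id> job
        elif any(in_range(p) for p in periods):
            selective.append(req_id)   # covered by the selective deletion
    return sorted(whole), sorted(selective)

requests = {
    "0C5GH32DXBM": ["2005001", "2005002"],   # fully inside the range
    "0C5GH39XYZK": ["2005012", "2006001"],   # straddles the range boundary
}
whole, selective = plan_drop(requests, "2005001", "2005012")
```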

4.4. Drop Data Validation process:

Drop Data Validation executes a report in the background to check whether there is any variance after the delete process. The whole process is triggered by one job, so you can use the Job Overview T-Code (SM37) to check the status of the validation.

When you execute the Drop Validation process, a job called Z_VAL_DROP is created, so you can search for it to check the status. Keep in mind that the job is created and triggered under your user, to make the search easier.

ar11.PNG

During the drop process, the expected results for the drop validation are created: the report is executed excluding the fiscal year being dropped, and the results obtained are stored in a table. After the data for that fiscal year is dropped from the original cube, the same report is executed on the original cube without any filter. The results obtained are compared with the expected results stored in the table; basically, we compare the key-figure values of both report runs.

If there are no variances, the drop validation is successful; otherwise, it fails.
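In other words: the expected result is the pre-drop report run that excludes the archived fiscal year, and the post-drop unfiltered run must match it exactly. A sketch of that comparison, with an invented row layout:

```python
# Sketch of the drop validation: run the report excluding the archived year
# before the drop, store that total, then run the unfiltered report after
# the drop and compare. Illustrative only; not the actual Z_VAL_DROP code.

def report_totals(rows, exclude_year=None):
    """Sum the key figure, optionally skipping one fiscal year (YYYYPPP)."""
    total = 0.0
    for r in rows:
        if exclude_year and r["fiscper"].startswith(exclude_year):
            continue
        total += r["amount"]
    return total

before = [{"fiscper": "2005001", "amount": 100.0},   # year being dropped
          {"fiscper": "2006001", "amount": 75.0}]
expected = report_totals(before, exclude_year="2005")  # stored pre-drop

after = [{"fiscper": "2006001", "amount": 75.0}]       # cube after the drop
drop_valid = report_totals(after) == expected
```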

5. ADDITIONAL TASKS AFTER ARCHIVING: BIA Re-build

After the BR Archiving process, we should refresh the BIA server by updating the index for the original cubes that were archived. This action is very important to avoid data problems in reporting: if we don’t update the BIA index, data will appear twice in the reports.

The BIA server refresh is executed in two steps (which must be executed in this order):

  1. Drop BIA process.

This process deletes the index tables in the BIA Server.

You can access this process by clicking the Drop BIA button in the BR Archiving tool. A new transaction window opens showing the Drop BIA program. There you should:

  • Check the Really delete? option to confirm that you want to delete the index.
  • Enter the cube or list of cubes that you want to refresh. The program deletes the index in the BIA server for all of them.
  • Optionally, select the Display appl. log immediately? option; if selected, the log of each cube is shown when the process finishes.

ar12.PNG

2. BIA Re-build process.

You must execute this process only after the Drop BIA Index process has finished.

You can access this process by clicking the BIA Re-build button in the Archiving tool. A new transaction window appears showing the BIA Re-build program. There you should:

  • Enter the number of parallel processes (we suggest 3) in the Number of infocubes to process field.
  • Enter, in the Infocube selection field, the cube or list of cubes to re-create the BIA Index in the BIA Server.

ar13.PNG


4 Comments


  1. Harish Allachervu

    Hi,

    That’s a really handy blog, but I’ve a few queries here; it would be great if you can address them.

    as you are saying

    We’ll move data from our current cubes to new “Archiving” cubes (being a copy of the original) reducing the original cubes size and maintaining the commonly accessed data only (last 2 FYs).

    If you are doing this, redundant data will be maintained in our BW (prod) system; moreover, you are just splitting the data and reducing the burden on a single cube by keeping it in multiple cubes. Anyhow, it leads to the same memory consumption at the DB level!!!


    As per my knowledge, the main objective of archiving is to move the data from BW (prod) to some external DB system and delete the old historical data in the BW (prod) system; by this approach we can reduce DB space on the prod system.

    Please correct me if i am wrong 😕

    Regards,

    Harish  

  2. Antony Jerald J

    Hi,

    Nice doc 🙂

    What’s the Archiving tool here? Is it a client we have to install, or is it available as part of the SAP GUI client for BW?

    Where is the data stored after we archive? Will it use the existing BW database or a different one?

    Kindly suggest.

    Regards,

    Antony Jerald.

