
Scenario:

• A DSO or InfoCube with a very large number of entries (more than 10 million)
• The database is Oracle
• The customer is adding new fields to the DSO/InfoCube and transporting the changes to the production system

Background:

Whenever changes such as

a) adding new key figures
b) adding new characteristics and dimensions

are performed on an object (DSO or InfoCube) that already contains data, the activation process has to adjust the structure of the underlying table(s). This adjustment alters the table by adding the new column, declaring it NOT NULL, and initializing it with a default value. This works without problems provided there are enough database resources to perform the change.

Adding NOT NULL fields requires checking all existing data for null values, which consumes a large amount of space in the transaction log and in the database and involves many I/O operations. Depending on the I/O throughput of your storage, this can take hours.
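
To illustrate, this is roughly the DDL that activation issues per new field (a sketch using the active table /BIC/AMYODS00 from the examples below and a hypothetical key figure column /BIC/ZNEWKF):

-- On older Oracle releases this statement has to populate the default
-- value in every existing row and validate the NOT NULL constraint,
-- which is what makes it so expensive on tables with millions of rows:
ALTER TABLE "/BIC/AMYODS00"
  ADD ("/BIC/ZNEWKF" NUMBER(17,3) DEFAULT 0 NOT NULL);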

The reason for initializing the newly added column is that, from the BW reporting and extraction point of view, the column has to be NOT NULL and initialized with a default value, because BW cannot handle null values. The general recommendation when an object that already contains a lot of data must be remodeled by adding new fields is to follow the workaround below.

The general workaround involves making a copy of the existing data in the cube using the data mart interface or Open Hub services. For example:
1. Make a copy of the existing cube (Cube1) as Cube2
2. Add the new fields to Cube2
3. Load the data from Cube1 to Cube2 using the data mart interface (and use Cube2 for all further purposes)

If you have reporting queries defined on Cube1, you need to do the following instead:

1. Make a copy of the existing cube (Cube1) as Cube2
2. Load the data from Cube1 to Cube2 using the data mart interface
3. Delete the data from Cube1
4. Add the new fields to Cube1
5. Reload the data from Cube2 to Cube1 using the data mart interface

Alternative Solution:

An alternative to this workaround is described in SAP Note 1287382, which we observed to be much faster than the normal process.

1287382 Extending BW objects on Oracle with large tables

We have observed a significant performance improvement using this SAP Note, although implementing it requires good application knowledge.

Here are the steps for your reference:

1. Make a list of the large DSOs (identify the size of the active table and the change log table) and InfoCubes (identify the size of the E and F tables).
2. Archive or clear the change log tables, compress the InfoCubes, and minimize the number of records in the tables.
3. Make a list of the objects to be transported, identify their dependencies, create the change requests (try to group the large DSOs and InfoCubes into separate change requests), and determine the transport sequence.
4. Check the current SP level and implement SAP Note 1287382 together with the correction mentioned in SAP Note 1340922.
5. Run program SAP_RSADMIN_MAINTAIN and maintain the RSADMIN parameters as described in SAP Note 1287382.
6. Schedule a background job to run program SAP_NOT_NULL_REPAIR_ORACLE with suitable parameters (table name, degree of parallelism). More parallel jobs can be scheduled depending on your hardware capacity. For InfoCubes, the job has to be run separately for each DIM table and fact table.

The job SAP_NOT_NULL_REPAIR_ORACLE:
- Renames the table /BIC/AMYODS00 to /BIC/AMYODS00_OLD
- Creates a new table /BIC/AMYODS00 that already includes the new fields
- Copies the data from /BIC/AMYODS00_OLD into the newly created /BIC/AMYODS00
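
In database terms, the job performs roughly the following (a simplified sketch with a hypothetical two-column active table; the actual program also rebuilds indexes and statistics, maintains the data dictionary, and copies the data in parallel):

-- Hypothetical, simplified active table (real /BIC/A* tables have many
-- more columns):
--   CREATE TABLE "/BIC/AMYODS00" (
--     "DOC_NO"       VARCHAR2(10) NOT NULL,
--     "/BIC/ZAMOUNT" NUMBER(17,2) DEFAULT 0 NOT NULL);

-- 1. Put the existing table aside under the _OLD name:
ALTER TABLE "/BIC/AMYODS00" RENAME TO "/BIC/AMYODS00_OLD";

-- 2. Re-create the table with the new field already included, so no
--    ALTER of existing rows is ever needed:
CREATE TABLE "/BIC/AMYODS00" (
  "DOC_NO"       VARCHAR2(10) NOT NULL,
  "/BIC/ZAMOUNT" NUMBER(17,2) DEFAULT 0 NOT NULL,
  "/BIC/ZNEWKF"  NUMBER(17,3) DEFAULT 0 NOT NULL);

-- 3. Copy the data over, supplying the default for the new field;
--    a direct-path insert keeps undo generation low:
INSERT /*+ APPEND */ INTO "/BIC/AMYODS00"
  SELECT "DOC_NO", "/BIC/ZAMOUNT", 0 FROM "/BIC/AMYODS00_OLD";
COMMIT;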

7. Ensure that the job has executed successfully in SM37.
8. If any errors are observed in SM37:

If the program fails before copying: check whether /BIC/AMYODS00_OLD exists. Normally this should not be the case, because if errors occur, /BIC/AMYODS00_OLD is renamed back to /BIC/AMYODS00. If the table does exist, then before the program is restarted, the table /BIC/AMYODS00 must be dropped (it MUST NOT contain data) and /BIC/AMYODS00_OLD renamed back to /BIC/AMYODS00:
1. Drop the new, empty table /BIC/AMYODS00 (using SE14 or SQL: DROP TABLE "/BIC/AMYODS00")
2. Rename /BIC/AMYODS00_OLD to /BIC/AMYODS00 (must be done with native SQL)
3. In DD03L, update at least one column to nullable
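
In native SQL, these recovery steps might look as follows (a sketch only: the DD03L update assumes the dictionary flag NOTNULL and a hypothetical field name, so verify the exact rows on your system; use SE14 where possible):

-- 1. Drop the new, empty copy of the table:
DROP TABLE "/BIC/AMYODS00";

-- 2. Bring the original table back under its old name:
ALTER TABLE "/BIC/AMYODS00_OLD" RENAME TO "/BIC/AMYODS00";

-- 3. Mark at least one of the (hypothetical) new columns as nullable
--    in the ABAP dictionary so that activation adjusts it again:
UPDATE DD03L SET NOTNULL = ' '
 WHERE TABNAME = '/BIC/AMYODS00' AND FIELDNAME = '/BIC/ZNEWKF';
COMMIT;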

If the program fails after copying, that is, if the number of records in /BIC/AMYODS00_OLD and /BIC/AMYODS00 differ:

- Analyze and repeat in accordance with the previous step ("the program fails before copying"); /BIC/AMYODS00_OLD should still contain the complete data.
The program should not simply be restarted. Instead, the remaining steps (building the index and building the statistics) are carried out manually after the error analysis.
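
A quick way to check whether the copy completed is to compare the record counts directly (plain SQL):

-- If the counts differ, the copy is incomplete; keep the _OLD table
-- until the analysis is finished:
SELECT COUNT(*) FROM "/BIC/AMYODS00_OLD";
SELECT COUNT(*) FROM "/BIC/AMYODS00";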

9. Drop the *_OLD tables once the conversion has completed successfully.
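
For example (only after the record counts match and the new table has been verified):

-- The _OLD table is just the pre-conversion copy and can now be removed:
DROP TABLE "/BIC/AMYODS00_OLD";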

If the DSO/InfoProvider contains more than 10 million entries, the alternative solution above should reduce the required time significantly.
