After loading master data, you need to run the attribute change run to activate it.

This process checks whether the aggregates of your InfoCubes need to be adjusted.

If your aggregates contain navigational attributes and/or hierarchies whose master data has changed, those aggregates will be adjusted.

If your cubes are big and you have multiple aggregates, this process can be very time-consuming, but there are two effective ways to speed things up:

1. Note 780104 – DB6: Performance Impact of BLOCKSIZE on Aggregate Build

Before applying this note, the system took a long time to rebuild the aggregates. After we set the BLOCKSIZE parameter to 10 million (it was empty), the process now runs 5 to 6 times faster!


[Image: Block Size.jpg]

Figure 1: After setting the block size according to Note 780104, you can see that the system splits the query based on the MDC characteristic (DB2 only), making it much faster.
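To illustrate the idea behind BLOCKSIZE, here is a minimal Python sketch (not SAP code; the row layout and `key_fn` helper are invented for illustration): instead of aggregating the whole fact table in one huge statement, the build works through bounded blocks of rows, which keeps each sort/group step small.

```python
# Illustrative sketch of block-wise aggregate building.
# Assumption: fact rows are dicts with an "amount" key figure;
# key_fn extracts the aggregation characteristic from a row.
BLOCKSIZE = 10_000_000  # rows per block, analogous to the 10 million we set

def build_aggregate(fact_rows, key_fn, blocksize=BLOCKSIZE):
    """Sum the key figure per characteristic value, block by block."""
    totals = {}
    for start in range(0, len(fact_rows), blocksize):
        block = fact_rows[start:start + blocksize]  # one bounded chunk
        for row in block:
            key = key_fn(row)
            totals[key] = totals.get(key, 0) + row["amount"]
    return totals
```

The point of the sketch is only the chunking: each block fits comfortably in memory, so no single step has to handle the entire fact table at once.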


2. Note 903886 – Hierarchy and attribute change run (point D1, delta mode, parameter DELTALIMIT)

Before applying this note, every time a hierarchy used in an aggregate changed, the aggregate was rebuilt instead of adjusted. The DELTALIMIT parameter was empty, and we set it to 20. Now the system decides whether an aggregate needs to be adjusted (delta records, faster) or rebuilt, depending on the number of changes in the hierarchy.
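The decision the note describes can be sketched as a simple threshold check. This is a hypothetical Python illustration of the logic, not SAP's actual implementation; the function name and the percentage interpretation of DELTALIMIT are assumptions for the sake of the example.

```python
# Hypothetical sketch of the DELTALIMIT decision from Note 903886:
# if the share of changed hierarchy nodes stays at or under the limit,
# the aggregate is adjusted with delta records; otherwise it is rebuilt.
DELTALIMIT = 20  # the value we set (the parameter was empty before)

def change_run_mode(changed_nodes, total_nodes, deltalimit=DELTALIMIT):
    """Return which strategy the change run would pick for an aggregate."""
    changed_pct = 100.0 * changed_nodes / total_nodes
    return "adjust (delta)" if changed_pct <= deltalimit else "rebuild"

# A small hierarchy change stays in the cheap delta path:
change_run_mode(5, 100)   # -> "adjust (delta)"
# A massive restructuring triggers a full rebuild:
change_run_mode(50, 100)  # -> "rebuild"
```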


Do you have any other relevant information regarding aggregates and change runs?



Cheers!




