
Recently my team and I had performance issues with some packages in our project, so I wanted to share my small experience with you (we can read a lot of information, but a practical example is always the best way to understand the impact).

Scenario: running the /CPMB/LOAD_INFOPROVIDER_UI package to transfer data between two BPC InfoCubes (from different environments).

I can tell you that after we implemented the Light Optimize process chain (/CPMB/LIGHT_OPTIMIZE) in the source infocube, the package execution time decreased from 5 hours to 15 minutes.

The influence that this little “optimization” has on your BPC InfoCube performance is impressive. So I gathered all the information I found and wrote this blog.

Hope you enjoy it!

Characteristics

Light Optimization compresses and indexes the stored records and updates InfoCube statistics for system processing. It is one of the standard BW InfoCube performance maintenance tasks. It:

  • Rebuilds the indexes;
  • Rebuilds the database statistics;
  • Deletes the InfoCube indexes and moves records from the F fact table to the E fact table (compression);
  • Collapses multiple records with the same dimension combination into a single record; combinations that net out to zero remain as zero-value records;
  • Can suppress these zero-value records via a manual option (to suppress records with zero values, please refer to Problem removing zeros in a model – BPC 10 NW).
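To make the compression behavior above concrete, here is a toy sketch (plain Python, not SAP code; the table data and the `compress` helper are invented for illustration): records sharing the same dimension combination are summed into one record, and combinations that net out to zero can optionally be suppressed, like BPC’s manual zero-elimination option.

```python
from collections import defaultdict

# Hypothetical F-table records: (dimension combination, signed value).
# Several requests have written to the same dimension combination.
f_table = [
    (("2014.MAR", "ACTUAL", "REVENUE"), 100.0),
    (("2014.MAR", "ACTUAL", "REVENUE"), -60.0),
    (("2014.MAR", "ACTUAL", "COSTS"), 50.0),
    (("2014.APR", "ACTUAL", "REVENUE"), 80.0),
    (("2014.APR", "ACTUAL", "REVENUE"), -80.0),  # nets out to zero
]

def compress(records, suppress_zeros=False):
    """Collapse records with the same dimension combination into one,
    optionally dropping combinations whose values net out to zero."""
    e_table = defaultdict(float)
    for dims, value in records:
        e_table[dims] += value
    if suppress_zeros:
        return {d: v for d, v in e_table.items() if v != 0}
    return dict(e_table)

# Without zero suppression, the APR/REVENUE combination survives as a
# 0.0 record; with suppress_zeros=True it is removed.
print(compress(f_table))
print(compress(f_table, suppress_zeros=True))
```

This is only a mental model of what the process chain does at the fact-table level; the real compression runs inside BW, not in application code.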

Recommendations

  • When to Run?

          From SAP Note 1508929: “There is no rule of thumb for how often to run optimizations. The need can vary depending on the characteristics of your hardware environment and your application.” There is no set recommendation as to the frequency; however, it is generally recommended under some specific circumstances:

    • When a new model/application set is created, run a Full Optimization;
    • Whenever a large set of data is loaded via input schedules/reports;


  • Schedule the process chain during the night


    • Although it is claimed that the execution of this process chain should not affect the normal read/write behavior of the InfoCube, we have had cases where a custom logic process chain failed during the Light Optimize execution;


  • Light Optimize is not recommended for models with more than 12 dimensions.
    • In these cases you should perform a Full Optimize instead (be careful: Full Optimize “destroys” your InfoProvider and then rebuilds it, which can cause its technical name to change; however, it is now possible to perform a Full Optimize without this risk).


  • By SAP:

[Image: light_optimize1.png]


FAQs and Notes


You might have some problems during the configuration of this process chain if the model/application has too many records (in our case we only scheduled this process chain after a year and a half).

1. Symptom: Dump Time Out

[Image: dump_time_out.png]

Implement SAP Note 1947726 – UJ0_EXCEPTION during a Light Optimize with a large number of requests.


After a new execution of the process chain, you can always use the following note to verify that the Light Optimization was successful: 1626814 – How to determine if a BPC Lite Optimization was successful


Also related: Optimizing Performance


You can try to prevent or analyze some performance issues. Please read the following documentation, provided on our forum:


Business Planning and Consolidation 5.x Performance Tuning Guide (PDF 458 KB)

This guide updates a previously posted document on BPC 5.x performance tuning.

Performance Analysis and Tuning Guide for SBO Planning and Consolidation version for Netweaver 7.x (PDF 1 MB)

This guide describes different aspects of performance-relevant topics in the context of customer-defined functions of the SAP BusinessObjects Planning and Consolidation version for Netweaver application. It highlights some of the design aspects that can influence the performance of a specific application, as well as providing general guidelines, recommendations, and best practices.


Thank you for your attention!


Happy learning 🙂


Best regards,


Raquel Oliveira



1 Comment


  1. SAP User

    Generally we use the delta mode to load data from one model to another. If we trigger Light Optimization before the DMP is triggered, a delta load will not be possible. Please correct me if I’m missing anything.

