
Business Objects Data Services: Performance Tuning, it’s the little things… 

We all know that getting our jobs to meet their service levels can be hit and miss, especially in a new environment. We get requirements stating that you must transfer x rows in x hours, yet you have no benchmark against which to judge whether the required throughput is feasible. Unless you are one of the lucky ones, you will have no chance to performance tune your jobs during development; instead you will be given a subset of the data, 10% if you're lucky, on hardware that is undersized and shared among your peers. There are several small things you can do to secure real performance gains without the cost of overly complicated designs that delay time to market.

Performance tuning rarely comes down to a single tweak; typically it is a combination of changes that together achieve maximum throughput. It is more of an art than a science, and every scenario needs to be evaluated on its own merits, but the points below should apply to most situations.

1 – Push down: This cannot be stressed enough: getting your jobs pushed down to the database will give you the single biggest performance increase. If you find that a job is not being pushed down, work out why. Don't be afraid to split a data flow up in order to get the large data movement pushed down. For example, suppose you are transferring 10 million rows and your data flow has a validation object that checks for nulls, with the null rows following a different path than the not-null rows. Consider breaking this into two data flows and moving the logic into the WHERE clause instead of the validation object. Both data flows can then be fully pushed down, with the added bonus that they can potentially run in parallel (see the sketch below).
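
To make the split concrete, here is a minimal sketch of the idea outside of Data Services. The table names (SRC_ORDERS, TGT_ORDERS_VALID, TGT_ORDERS_NULL), the ORDER_DATE column, and the connection details are all hypothetical; in Data Services itself each statement corresponds to one data flow whose generated SQL is a single pushed-down INSERT … SELECT.

```python
from concurrent.futures import ThreadPoolExecutor

import pyodbc


def get_connection():
    # Placeholder DSN and credentials; substitute your own.
    return pyodbc.connect("DSN=my_dw;UID=etl_user;PWD=secret")


PUSHED_DOWN_SQL = [
    # Each statement is one INSERT ... SELECT the database runs entirely
    # on its own, which is what "fully pushed down" means here.
    """INSERT INTO TGT_ORDERS_VALID
       SELECT * FROM SRC_ORDERS WHERE ORDER_DATE IS NOT NULL""",
    """INSERT INTO TGT_ORDERS_NULL
       SELECT * FROM SRC_ORDERS WHERE ORDER_DATE IS NULL""",
]


def run_statement(sql: str) -> None:
    conn = get_connection()
    try:
        conn.cursor().execute(sql)  # no rows travel through the job server
        conn.commit()
    finally:
        conn.close()


# The two WHERE clauses cover disjoint row sets, so the statements can run
# in parallel, just as the two data flows could in Data Services.
with ThreadPoolExecutor(max_workers=2) as pool:
    list(pool.map(run_statement, PUSHED_DOWN_SQL))
```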

2 – Bulk loading: If you are loading into a target database that supports bulk loading, such as Oracle, SQL Server, or Teradata, consider implementing the bulk loading capabilities of the target database (a rough sense of the difference is sketched below).
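
The row-at-a-time versus bulk contrast is easy to see outside of Data Services. Below is a minimal sketch using pyodbc against SQL Server; the STAGE_ORDERS table, DSN, and credentials are hypothetical, and in Data Services the equivalent switch is the bulk loader option on the target table.

```python
import pyodbc

# Fabricated demo rows purely for illustration.
rows = [(i, f"customer_{i}") for i in range(100_000)]

conn = pyodbc.connect("DSN=my_dw;UID=etl_user;PWD=secret")
cur = conn.cursor()

# By default executemany sends each parameter row to the server
# individually, which behaves like a conventional row-by-row load.
# Enabling fast_executemany binds the whole parameter array and ships
# it in bulk, which is usually dramatically faster for large volumes.
cur.fast_executemany = True
cur.executemany("INSERT INTO STAGE_ORDERS (ID, NAME) VALUES (?, ?)", rows)
conn.commit()
conn.close()
```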

3 – Database links: Don't be afraid to ask the DBAs about database links to help with performance. Data Services can utilize database links to push down cross-schema data movement (see the sketch below).
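
For a sense of what that enables, here is a minimal sketch assuming an Oracle database link named SRC_LINK that the DBAs have created, plus the hypothetical tables and get_connection() helper from the push-down sketch above.

```python
conn = get_connection()
# With the link in place the whole statement executes inside the target
# database, pulling rows across the link instead of through the job server.
conn.cursor().execute(
    "INSERT INTO TGT_ORDERS SELECT * FROM SRC_ORDERS@SRC_LINK"
)
conn.commit()
conn.close()
```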

4 – Indexes: Review your SQL execution plans and verify that they take advantage of your indexes in the most efficient way (one way to check is sketched below).
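
One way to check is to explain the SQL that Data Services generates (visible via Display Optimized SQL or the trace log) directly against the database. Below is a minimal sketch for Oracle using the python-oracledb driver; the query, table, and credentials are placeholders, and any SQL client works just as well.

```python
import oracledb

conn = oracledb.connect(user="etl_user", password="secret", dsn="dwhost/dwsvc")
cur = conn.cursor()

# Ask the optimizer how it would execute the generated query.
cur.execute("""EXPLAIN PLAN FOR
               SELECT * FROM SRC_ORDERS
               WHERE ORDER_DATE >= DATE '2014-01-01'""")

# Read the plan back: on a large table you generally want to see an
# INDEX RANGE SCAN here rather than a TABLE ACCESS FULL.
for (line,) in cur.execute(
        "SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
    print(line)

conn.close()
```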

5 – Data conversions: Understand the supported Data Services data types and the implicit conversion rules. Ensure you don't needlessly convert fields back and forth; for example, mapping a varchar column to a date in one transform and back to a varchar in the next burns CPU on every row without adding anything.

6 – Monitor rate: The Data Services monitor sample rate defaults to an extremely low value, so the job writes monitor statistics very frequently. If you are processing large amounts of data, raise the monitor sample rate, to 50,000 rows for example, so resources are not needlessly wasted on monitoring.

7 – Audit: Using the Data Services audit functionality will prevent your data flows from being pushed down. If auditing is required, consider an alternative approach (one is sketched below).
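
One alternative, sketched below using the same hypothetical tables and get_connection() helper as the earlier examples, is to reconcile row counts with plain SQL after the load completes, so the data flow itself stays fully pushed down.

```python
def count_rows(table: str) -> int:
    conn = get_connection()
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # trusted names only
        return cur.fetchone()[0]
    finally:
        conn.close()


source_count = count_rows("SRC_ORDERS")
target_count = count_rows("TGT_ORDERS")

if source_count != target_count:
    # Fail loudly so the scheduler flags the job, much as a failed
    # audit rule would.
    raise RuntimeError(
        f"Audit failed: source={source_count} target={target_count}"
    )
```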

8 – Array fetch size: Review your array fetch size. If you know the production job servers and databases have capacity to spare, consider increasing it so that more rows are returned per network round trip (see the sketch below).
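
The setting maps onto the standard array-fetch idea that most database APIs expose. Below is a minimal sketch using the DB-API cursor.arraysize and the hypothetical get_connection() helper from the earlier examples; handle_batch is a stand-in for downstream processing, and in Data Services the knob is the array fetch size option on the source.

```python
conn = get_connection()
cur = conn.cursor()

# Fetch 5,000 rows per round trip instead of the driver default; bigger
# values trade job-server memory for fewer network round trips.
cur.arraysize = 5000
cur.execute("SELECT * FROM SRC_ORDERS")

while True:
    batch = cur.fetchmany()  # fetchmany honours cur.arraysize
    if not batch:
        break
    handle_batch(batch)      # hypothetical downstream processing

conn.close()
```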

These eight small things will help you send your best code forward with performance in mind.

For an in-depth review, including additional steps to help with performance, please see the Data Services Performance Optimization Guide.
