
Performance Tuning for Table-Comparison

Often you have an EIM process that receives source data containing both new and updated records. In the data flow, you can use the Table Comparison transformation to identify the changes.

If you have a large dimension or fact table but only a small amount of load data, the process often has a long processing time. This is caused by the large comparison table in the Table Comparison transformation.

You can tune this by reducing the comparison data to a minimum. For a customer dimension, this can be done as follows:

  • source table for new/changed customer data: imp_customer
  • target table for customer: dim_customer
  • for the comparison, create a view (an expanded sketch follows this list):

                create view comp_customer as
                select d.* from dim_customer d inner join imp_customer i on (d.customer_num = i.customer_num);

  • Change the Table Comparison transformation and set the comparison table to the view comp_customer
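For reference, here is a slightly expanded sketch of the same approach. The table and column names (dim_customer, imp_customer, customer_num) are only the example names from above; adjust them to your own schema, and treat the count queries as an optional sanity check rather than part of the required setup.

                -- The view restricts the comparison set to the dimension rows
                -- whose business key also appears in the incoming delta table.
                create view comp_customer as
                select d.*
                from   dim_customer d
                       inner join imp_customer i
                               on (d.customer_num = i.customer_num);

                -- Optional sanity check: the view should return at most as many
                -- rows as the delta table, never the full dimension.
                select count(*) from comp_customer;
                select count(*) from imp_customer;

With this setup, the database filters the dimension once via the join, so the Table Comparison transformation only reads the rows that can actually change instead of scanning the whole target table.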

Now, the processing time of the transformation is improved.


      1 Comment
      Former Member

      This post is fairly old, but I implemented this solution. I have a source table of 62109 delta records, of which 7674 are inserts and 54435 are updates. My target table has 5 million records. So, I created a view (PHN_PLACE_PRODUCTS_TCV) and use it as the comparison table as described above. The record count of the view is 62109, as expected.

      The ultimate target table PHN_PLACE_PRODUCTS, has the same layout and primary key as the input table.

      It took BODS almost 2 hours to process the 62109 records.  How can this be?

      The data is sorted; the view is obviously not sorted, but even so the performance is not reasonable.

      Thank you.