
Saving BODS Dataflow Audit statistics

My previous blog describes how to collect audit statistics on dataflows and display them as part of the job log during job execution. However, it would be more useful to insert these label values into a database table so that they can be used for analysis or for building reconciliation reports.

Plan 1:

[Image: save_2_201046.jpg]

A script is placed after the dataflow on which the audit functionality is implemented. The script contains an INSERT statement that writes the value of the audit label to a database table. However, this generates a validation error, because an audit label is not valid outside its dataflow.
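A minimal sketch of such a script, with hypothetical datastore, table, and label names:

```
# Script placed after the dataflow DF_Load_Customers (hypothetical names).
# $Count_CUSTOMERS is assumed to be the audit label defined on the
# dataflow's source table. Validation fails on this script, because
# the audit label is out of scope outside the dataflow.
sql('TARGET_DS', 'INSERT INTO AUDIT_STATS (DF_NAME, ROW_COUNT) VALUES (\'DF_Load_Customers\', [$Count_CUSTOMERS])');
```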


Plan 2:

The audit label values are saved by BODS in the repository table AL_AUDIT_INFO. A query can be written on this table to select the latest value for the specified label. The query would look something like this:

sql('<repo datastore>', 'SELECT * FROM AL_AUDIT_INFO WHERE LABEL = \'<label>\'')

This approach is not recommended, as it involves querying the BODS repository directly; the repository is usually not made available as a datastore to BODS developers.
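If the repository were exposed as a datastore, the value could be captured in a script variable. A sketch, in which the datastore name and the column holding the audited value are assumptions (the exact AL_AUDIT_INFO columns can vary by repository version):

```
# Hypothetical: REPO_DS is a datastore pointing at the BODS repository.
# MAX(VALUE) is an assumed way to pick the latest value for the label;
# verify the actual column names in your repository schema.
$AuditValue = sql('REPO_DS', 'SELECT MAX(VALUE) FROM AL_AUDIT_INFO WHERE LABEL = \'<label>\'');
```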

Plan 3: (Recommended)

The third plan is to write the INSERT script within the dataflow, where the audit label values are valid. Scripts cannot normally be placed inside a dataflow; however, the Audit window provides a way around this. The Audit window contains a second tab for defining audit rules. These rules can compare audit labels (for example, source label value = target label value); alternatively, a rule can be written as a custom script. This facility can be used to write the INSERT statement. Because the INSERT statement always returns NULL, the rule is always satisfied, and the insert operation is achieved through a dummy audit rule.
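Under these assumptions, such a dummy audit rule written as a custom script might look like this (the datastore, table, and label names are hypothetical):

```
# Custom audit rule defined on the rules tab of the Audit window.
# The sql() call performs the insert and returns NULL, so the rule
# never raises an audit failure while the statistics are persisted.
sql('TARGET_DS', 'INSERT INTO AUDIT_STATS (LABEL_NAME, LABEL_VALUE) VALUES (\'Count_CUSTOMERS\', [$Count_CUSTOMERS])') IS NULL
```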


1 Comment
  • Hi Priya,

    Thanks for sharing this information. I didn't know about the dataflow audit statistics before; now I understand the dataflow audit tool.

    I have one question: how do we migrate from Data Integrator to Data Services? I previously worked with the Data Integrator tool, and we are now moving to Data Services.

    I need to test the migration from Data Integrator to Data Services. Please let me know how to do that.