
How to capture the error log in a table in BODS

I will walk you through, step by step, how to capture error messages when a dataflow fails in a Job. I have taken a simple example with a few columns to demonstrate.

Step 1: Create a Job and name it ‘ERROR_LOG_JOB’.

Step 2: Declare the following four global variables at the Job level. Refer to the screenshot below for their names and data types.


Step 3: Drag a Try block, a Dataflow, and a Catch block into the work area and connect them as shown in the diagram below. Inside the dataflow you can use any existing table in your repository as a source and populate a few columns into a target table. Make sure the target table is a permanent table. This is just for the demo.


Step 4: Open the Catch block, drag one script inside it, and name it as shown in the diagram below.

[Screenshot: script placed inside the Catch block]

Step 5: Open the script and write the code below inside it, as shown in the diagram below.

[Screenshot: catch-block script]

The above script populates the global variables using some built-in BODS functions and calls a custom function to log the errors into a permanent table. This function does not exist at this moment; we will create it in a later step.
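Since the screenshot may not render, here is a minimal sketch of what the catch-block script typically looks like. The variable names ($G_ERROR_NUMBER, $G_ERROR_CONTEXT, $G_ERROR_MESSAGE) and the parameter order of the custom function are my assumptions based on the table columns in step 8, not taken from the original screenshot:

```
# Catch-block script (BODS script language) -- a sketch, names assumed.
# error_number(), error_context() and error_message() are built-in BODS
# functions that return meaningful values only inside a Catch block.
$G_ERROR_NUMBER  = error_number();
$G_ERROR_CONTEXT = error_context();
$G_ERROR_MESSAGE = error_message();

# Call the custom function (created in steps 6-7) that writes these
# values into the permanent ERROR_LOG table.
CF_ERROR_LOG_SCRIPT($G_ERROR_NUMBER, $G_ERROR_CONTEXT, $G_ERROR_MESSAGE);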

Step 6: Go to the Custom Functions section of your repository, create a new custom function, and name it as shown below.

[Screenshot: Custom Functions section]

Step 7: Click Next in the above dialog box and write the code below inside the function. You need to declare the parameters and local variables as shown in the editor below. Keep the data types of these parameters and local variables the same as those of the global variables from step 2. Validate the function and save it.
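The function body is only shown in a screenshot, so here is a sketch of what it could contain. The parameter names ($P_ERROR_NUMBER, $P_ERROR_CONTEXT, $P_ERROR_MESSAGE) are my assumption; ETL_CTRL is the datastore mentioned in step 8:

```
# Body of the custom function (sketch, parameter names assumed).
# sql() is the built-in BODS function that runs a SQL statement against
# a datastore; {$var} substitutes a variable's value into the string.
sql('ETL_CTRL',
    'INSERT INTO ERROR_LOG (ERROR_NUMBER, ERROR_CONTEXT, ERROR_MESSAGE)
     VALUES ({$P_ERROR_NUMBER}, \'{$P_ERROR_CONTEXT}\', \'{$P_ERROR_MESSAGE}\')');
Return 0;
```

Note that SEQ_NO is not in the column list: it is an IDENTITY column in SQL Server and is populated automatically on insert.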


Step 8: Now your function is ready to use. Assuming you have SQL Server as the database where you want to capture these errors, create a table to store the information:


      CREATE TABLE [dbo].[ERROR_LOG](

            [SEQ_NO] [int] IDENTITY(1,1) NOT NULL,

            [ERROR_NUMBER] [int] NULL,

            [ERROR_CONTEXT] [varchar](512) NULL,

            [ERROR_MESSAGE] [varchar](512) NULL

      )



You may change the datastore as per your requirements. I have used ETL_CTRL as the datastore in the above function; it is connected to the SQL Server database where the above table is created.

Step 9: Just to make sure the dataflow fails, we will force it to throw an error at run time. Inside your dataflow, use a permanent target table. Now double-click the target table and add one line of text below the existing comment under the Load Triggers tab. Refer to the screenshot below. This is one way to make a dataflow throw an error at run time.
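For illustration, the kind of line you add under the Load Triggers tab might look like the sketch below. The exact text is arbitrary (this example is my own, not from the original screenshot); any line that is not valid SQL will make the generated load statement fail at run time:

```
-- existing comment under the Load Triggers tab
this line is not valid SQL and will make the load fail
```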

[Screenshot: target table, Load Triggers tab]

Step 10: Now your Job is ready to execute. Save and execute your Job. You should get an error message in the monitor log. Open the table in your database and check whether the error log information is populated. The error log should look as shown below.

[Screenshot: monitor log]

The ERROR_LOG table should capture the same error message, as shown below.

[Screenshot: ERROR_LOG table contents]

Hope this helps. In case you face any issues, do let me know.

    • Hi Arun, thanks for providing the link. The two articles describe different methods of capturing the error log. I think we should not restrict ourselves to one particular method; we should explore more ways of doing things differently. 🙂

      • Hi Mohammad,

        In step 5, you mentioned "Writing the error information to ERROR_LOG table", but what is CF_ERROR_LOG_SCRIPT?
        Can you tell me what that is?

        I am very new to this platform..

        Thanks in Advance.

          • Thank you for sharing this.

            BTW, I think it would make more sense to put this kind of function into a script and wrap it in a workflow. That workflow could then be reused by each job's Catch block, and it would be easier for customers to read, understand, and modify. People like to see all the logic in one place instead of going here and there and piecing it together to get the idea. 🙂

  • Hello Muhammed,

    Very nice of you to share such useful information, thanks a lot.

    How can we replace the step of writing to the log table with sending an email?

    Should we use smtp_to or mail_to? And if so, should we create variables, for instance for the header/subject, the mail body, and the email address?



  • Hi Mohammad,

    Thanks for the very nice explanation. However, when I implemented this, the system threw an error saying it is not able to insert a NULL value into the column Seq_Num, and your code does not specify how the sequence number column gets its values.

    Could you please explain this, and also in what other scenarios this functionality can be used?



    • Seq_Num is an identity column in SQL Server, which generates a value automatically when a row is inserted into the table. It is mentioned in the draft already. The DDL is repeated below just in case someone else has the same issue. If you are using a different database, you need to change the code accordingly.



      CREATE TABLE [dbo].[ERROR_LOG](

            [SEQ_NO] [int] IDENTITY(1,1) NOT NULL,

            [ERROR_NUMBER] [int] NULL,

            [ERROR_CONTEXT] [varchar](512) NULL,

            [ERROR_MESSAGE] [varchar](512) NULL

      )



  • Hi Ansari

    Good explanation... I followed your document, but I didn't get an error like in your post and was not able to insert the error details into the error_log table.

    I followed these steps:

    1. Created the global variables
    2. Try ==> Dataflow ==> Catch (to force an error, applied a HAVING condition in the WHERE clause)
    3. Kept a script in the Catch block and wrote the steps as you mentioned
    4. Called the custom function in the script
    5. Executed the job


    • Former Member

      I would suggest you validate your function and check whether the script is working properly. Or you can share it here and I can take a look.


  • Hi Ansari

    The job executed successfully and the error data was also inserted into the error_table, but it is not inserting the same data that I see in the error log.

