Many a time we face a situation where we have to insert a huge number of records into a target database table. I have seen that whenever transactions happen over a WAN, and especially with an Oracle database, inserts into the table are very slow. We tried all sorts of permutations and combinations on the receiver channel, such as inserting in batch mode with an updated JDBC driver, but with no luck. While doing my share of R&D on the issue, I came across an interesting workaround. In one of the links I found while googling for a solution, I came across the following syntax for inserting multiple rows in a single statement in Oracle DB:

    insert into table (COL1, COL2, COL3, COL4)
    select 'RC11', 'RC12', 'RC13', 'RC14' from dual
    union all
    select 'RC21', 'RC22', 'RC23', 'RC24' from dual

The above statement inserts two rows in one go, in a single statement. I decided to leverage this syntax and use it in PI for inserting data. Here is how I went about it:

Firstly, create a data type conforming to the standard document type with action = "INSERT", as required by the JDBC receiver adapter. A sample data type structure is shown below. Make sure that the occurrence of the access node is unbounded and that of the Statement node is 1.

(Image: sample data type structure)
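To make that concrete, here is a minimal sketch of what a payload following that standard JDBC document format with action = "INSERT" could look like. The message type, namespace, table name MYTABLE and column names are illustrative assumptions, not the exact structure used in this scenario:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- ns0:InsertRequest and its namespace are made-up placeholders -->
    <ns0:InsertRequest xmlns:ns0="http://example.com/demo">
        <Statement>                               <!-- Statement node: occurrence 1 -->
            <MYTABLE action="INSERT">             <!-- structure element carrying the action -->
                <table>MYTABLE</table>            <!-- target database table -->
                <access>                          <!-- access node: occurrence 0..unbounded, one per row -->
                    <COL1>RC11</COL1>
                    <COL2>RC12</COL2>
                    <COL3>RC13</COL3>
                    <COL4>RC14</COL4>
                </access>
                <access>
                    <COL1>RC21</COL1>
                    <COL2>RC22</COL2>
                    <COL3>RC23</COL3>
                    <COL4>RC24</COL4>
                </access>
            </MYTABLE>
        </Statement>
    </ns0:InsertRequest>

With the adapter's plain INSERT handling, each access row is, roughly speaking, inserted as its own row-level INSERT, which is exactly what turned out to be slow over the WAN; the idea here is to turn those rows into one INSERT … SELECT … UNION ALL statement instead.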

13 Comments


  1. Daniel Graversen
    That looks like a strange solution, but I bet it works in improving the performance.
    1. Varun Mukund Post author
      Yeah it works big time:
      I’ve tested the scenario with 10000 records in normal mode and it was taking 30 minutes to insert.

      With this method, it got reduced to 9 seconds 🙂

      1. Badari Garre

        This is a nice blog, Varun. You can try the following XSLT mapping too, if you like.

        <?xml version="1.0" encoding="UTF-8"?>
        <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
            <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
            <xsl:template match="*">
                <ns0:GenericInsertRequest xmlns:ns0="http://aa.varun.com/ReusableObjects">
                    <StatementInsertData>
                        <DataTable action="SQL_DML">
                            <access>
                                INSERT INTO mytable (COL1,COL2,COL3,COL4,COL5,COL6,COL7,COL8)
                                <xsl:for-each select="recordset">
                                    <xsl:text>SELECT </xsl:text>
                                    <xsl:for-each select="child::*">
                                        <xsl:text>'</xsl:text>
                                        <xsl:value-of select="current()"/>
                                        <xsl:text>' </xsl:text>
                                        <xsl:if test="position() != last()">
                                            <xsl:text>, </xsl:text>
                                        </xsl:if>
                                    </xsl:for-each>
                                    <xsl:if test="position() != last()">
                                        <xsl:text> FROM dual UNION ALL </xsl:text>
                                        <xsl:text>&#10;</xsl:text>
                                    </xsl:if>
                                    <xsl:if test="position() = last()">
                                        <xsl:text> FROM dual</xsl:text>
                                    </xsl:if>
                                </xsl:for-each>
                            </access>
                        </DataTable>
                    </StatementInsertData>
                </ns0:GenericInsertRequest>
            </xsl:template>
        </xsl:stylesheet>
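
        For clarity, a source payload for this mapping could look roughly like the following; the root element and column names are illustrative, since the XSLT only expects recordset elements whose children carry the column values in order:

        <?xml version="1.0" encoding="UTF-8"?>
        <Records>                    <!-- root element name is illustrative; the template matches any root element -->
            <recordset>
                <COL1>RC11</COL1>
                <COL2>RC12</COL2>
                <COL3>RC13</COL3>
                <COL4>RC14</COL4>
                <COL5>RC15</COL5>
                <COL6>RC16</COL6>
                <COL7>RC17</COL7>
                <COL8>RC18</COL8>
            </recordset>
            <recordset>
                <COL1>RC21</COL1>
                <COL2>RC22</COL2>
                <COL3>RC23</COL3>
                <COL4>RC24</COL4>
                <COL5>RC25</COL5>
                <COL6>RC26</COL6>
                <COL7>RC27</COL7>
                <COL8>RC28</COL8>
            </recordset>
        </Records>

        Such an input would produce a single access element containing one statement of the form INSERT INTO mytable (COL1,…,COL8) SELECT 'RC11', … FROM dual UNION ALL SELECT 'RC21', … FROM dual.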

  2. Sudharshan Aravamudan
    Hi,

    Very useful blog. It seems to solve the mass insert issue. Need to try it :)

    Nice one. Keep blogging.

    Batch Mode in JDBC Adapter for Oracle DB:
    In the case of Oracle DB, to make use of Oracle batching, which is faster than standard batching, prepared statements must be used, and the JDBC adapter does not use prepared statements by default. I am really not sure how, or whether, the PI JDBC adapter implements Oracle batching.

    Blog: PreparedStatement with JDBC Receiver Adapter illustrates the settings for prepared statements.

    Best Regards,
    Sudharshan N A

  3. Inder G
    If I have a hundred records to insert, do I need to make 50 statements of 2 recordsets each, or can we send everything at once somehow?
          1. Varun Mukund Post author
            Yeahh!!! Also, I'm clubbing a group of 1000 records at a time, and the next set of 1000 goes to another Statement node. I used that logic after doing a whole lot of testing and found that it gave the best results.
            1. Inder G
              Do we need to repeat the access field for each recordset, as it is 0 to unbounded? It would be better if you could post the XML for 4 records. Thanks very much.
              1. Varun Mukund Post author

                Yeah… The access has to be repeated as many times as the number of rows you have to insert. The below XML is the result after the Java mapping for 4 access nodes.

  4. Edward Pelyavskyy
    When implementing this solution consider this:

    * If you need “… to insert a huge number of records into a target database table …”, is PI the right choice? Do you need a special tool for that? The most basic one would be Oracle SQL*Loader.

    * From what I understand you are sending a potentially very long INSERT … SELECT … UNION ALL … statement to Oracle for processing. Is there a limit on how long the statement can be?

    * What if one of your SELECT … FROM DUAL branches produces a duplicate primary key? The whole INSERT will fail. Will you be able to identify the SELECT that produced the duplicate record?

    * On a positive note: INSERT … SELECT … will work faster than INSERT … VALUES …, but consider the above points too.

