
Inserting Multiple Records into MaxDB Database


Inserting multiple records into a database is quite a bottleneck in most cases, and no one wants to enter a thousand records manually, one by one. This is useful functionality that almost everyone needs. I found a couple of threads in the MaxDB forum on this topic and worked out a simple solution for them:
How to insert multiple rows
Mass Data Upload to table in MaxDB

Some time back I posted a weblog on Batch Mode Operation in MaxDB using MaxDB Loader and Java, which uses the logic of executing multiple queries in one shot. Inserting multiple records is a little different, because the raw data needs a little preprocessing before it can be uploaded into the database. A simple technique I followed, which really did solve my problem, is given below. I also developed a simple Java application to download MaxDB data to XML format. This weblog discusses the technique using a simple table with a few records for test purposes.

Table Definition

I considered a simple table for the test, with the definition shown below.


Table Contents

The table contained only one record, for the ROOT user.


Records to insert

Now the task is to upload a set of records shown below into the database.


The JDBC Program

I used a JDBC program to insert the multiple records into MaxDB. The program is given below.
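Since the program itself is not reproduced here, the following is a minimal sketch of the approach: read the file line by line, split each record on the delimiter, and insert it through a prepared statement. The file name, table name, column list, connection URL, and credentials are assumptions for illustration only; the MaxDB JDBC driver class is `com.sap.dbtech.jdbc.DriverSapDB`.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class MassInsert {

    // Split one raw record into its fields
    // (assumption 1 below: comma is the delimiter).
    public static String[] parse(String line) {
        return line.split(",");
    }

    public static void main(String[] args) throws Exception {
        // Driver, URL, user/password, table and columns are examples only.
        Class.forName("com.sap.dbtech.jdbc.DriverSapDB");
        Connection con = DriverManager.getConnection(
                "jdbc:sapdb://localhost/TST", "DBADMIN", "SECRET");

        PreparedStatement stmt = con.prepareStatement(
                "INSERT INTO USERS (USERNAME, PASSWORD) VALUES (?, ?)");

        BufferedReader in = new BufferedReader(new FileReader("records.txt"));
        String line;
        while ((line = in.readLine()) != null) {   // assumption 2: no empty lines
            String[] fields = parse(line);
            stmt.setString(1, fields[0]);
            stmt.setString(2, fields[1]);
            stmt.executeUpdate();                  // one INSERT per record
        }
        in.close();
        stmt.close();
        con.close();
    }
}
```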

The Result of the Program

The program worked perfectly, and the test run results were displayed.



Did it really insert the data into MaxDB?


Yes, it did, and it took only a few lines of code to do the trick.

The assumptions I made for this are:
1. A comma (",") was used as the delimiter.
2. There are no empty records or empty lines after the last record.

Hope this was a useful tip.

  • You should not store the records in an array; for a larger number of records this will take a multiple of the memory of the input data. It is easier to read the file line by line.

    For larger data sets you might also want to split the inserts into chunks and commit them chunk-wise.

    The JDBC batch interface and prepared statements might also improve performance and stability.
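    For illustration, the line-by-line, chunk-wise approach suggested in this comment could be sketched as follows. The table, columns, file name, chunk size, and connection details are assumptions, not taken from the weblog:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ChunkedInsert {

    // Number of commits needed: full chunks plus one partial chunk, if any.
    public static int numChunks(int total, int chunkSize) {
        return (total + chunkSize - 1) / chunkSize;
    }

    public static void main(String[] args) throws Exception {
        final int CHUNK = 500;                      // commit every 500 rows
        Class.forName("com.sap.dbtech.jdbc.DriverSapDB");
        Connection con = DriverManager.getConnection(
                "jdbc:sapdb://localhost/TST", "DBADMIN", "SECRET");
        con.setAutoCommit(false);

        PreparedStatement stmt = con.prepareStatement(
                "INSERT INTO USERS (USERNAME, PASSWORD) VALUES (?, ?)");

        BufferedReader in = new BufferedReader(new FileReader("records.txt"));
        String line;
        int count = 0;
        while ((line = in.readLine()) != null) {    // read line by line, no array
            String[] fields = line.split(",");
            stmt.setString(1, fields[0]);
            stmt.setString(2, fields[1]);
            stmt.addBatch();                        // JDBC batch interface
            if (++count % CHUNK == 0) {
                stmt.executeBatch();                // flush one chunk ...
                con.commit();                       // ... and commit it
            }
        }
        stmt.executeBatch();                        // remaining partial chunk
        con.commit();
        in.close();
        stmt.close();
        con.close();
    }
}
```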


  • JDBC Driver 2.0 onward supports batch updates. With batch updates, instead of updating one row of a table at a time, you can direct JDBC to execute a group of updates at the same time, like what you are trying to convey in the weblog.
    Batch updates drastically reduce the time taken to update records in the database.
    Below is sample code showing how to use a batch update. The example shows an UPDATE statement that takes two input parameters and is then executed twice, once with each set of parameters.

    try {
      con.setAutoCommit(false);
      PreparedStatement prepStmt = con.prepareStatement(
          "UPDATE DEPT SET MGRNO=? WHERE DEPTNO=?");

      // First set of parameters.
      prepStmt.setString(1, "000010");
      prepStmt.setString(2, "A01");
      prepStmt.addBatch();

      // Second set of parameters.
      prepStmt.setString(1, "000020");
      prepStmt.setString(2, "B02");
      prepStmt.addBatch();

      int[] numUpdates = prepStmt.executeBatch();
      for (int i = 0; i < numUpdates.length; i++) {
        if (numUpdates[i] == -2)
          System.out.println("Execution " + i + ": unknown number of rows updated");
        else
          System.out.println("Execution " + i + " successful: " + numUpdates[i] + " rows updated");
      }
      con.commit();
    } catch (BatchUpdateException b) {
      // process BatchUpdateException
    }

    Thanks for the useful info.
    In case one wants to perform an operation other than UPDATE, I suppose that would not be possible with that sample alone. The approach I used is intended to perform any operation (DELETE, INSERT, etc.), and hence I opted for that method.

    But still, I thank you again for passing on this tip.

  • Hi Bernd,

    I intended to provide simple logic for the task and did not focus much on performance optimization; I left that to the users, as they can take it up on their own. I will work on it and update this weblog once completed.
    Thanks again for the suggestion.