
Inserting Multiple Records from XML files into MaxDB Database

In my last weblog, Inserting Multiple Records into MaxDB Database, I explained a simple technique for inserting multiple records into a MaxDB database server. I always prefer having a variety of options: in general, the data we receive can arrive in any format, so we cannot limit ourselves to a specific one. Some time back I posted a weblog on Creating an Extractor for downloading MaxDB Table Data to XML, and I felt it would be a good option to insert multiple records from XML files into MaxDB as well.
This minimizes the problems caused by delimiters, and since XML is a widely accepted format, both extraction and upload become much easier. I would recommend readers have a look at JAXP to understand the XML parsing, which is the main addition compared to my Inserting Multiple Records into MaxDB Database weblog. This also offers one more solution for the two forum threads below:
How to insert multiple rows
Mass Data Upload to table in MaxDB

Since everything here is fairly simple except for the piece of Java code, I will get into the solution straight away.

Table Definition

I used a simple table for this test, with the definition shown below.

image

Table Contents

It contained only one record.

image

Records to insert

Now the task is to upload the set of records shown below from an XML file into the database.

image

The JDBC Program

The JDBC program for performing the multiple-record insert into MaxDB is given below.
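The original listing is not preserved in this archive, so the following is only a minimal sketch of the approach described, not the author's exact program. It assumes a hypothetical table TEST_TABLE(ID, NAME), a file records.xml with rows shaped like `<row><id>…</id><name>…</name></row>`, and placeholder MaxDB connection details; the real table definition and XML layout appeared in the images above. It parses the file with JAXP's DOM API and inserts each record through JDBC:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XmlToMaxDB {

    /** Parse every <row> element into a [id, name] pair. */
    static List<String[]> parseRecords(InputSource src) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(src);
        NodeList rows = doc.getElementsByTagName("row");
        List<String[]> records = new ArrayList<>();
        for (int i = 0; i < rows.getLength(); i++) {
            Element row = (Element) rows.item(i);
            String id = row.getElementsByTagName("id").item(0).getTextContent();
            String name = row.getElementsByTagName("name").item(0).getTextContent();
            records.add(new String[] { id, name });
        }
        return records;
    }

    /** Insert each parsed record; table and column names are assumptions. */
    static void insertRecords(Connection con, List<String[]> records) throws Exception {
        String sql = "INSERT INTO TEST_TABLE (ID, NAME) VALUES (?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (String[] rec : records) {
                ps.setInt(1, Integer.parseInt(rec[0]));
                ps.setString(2, rec[1]);
                ps.executeUpdate();
            }
        }
        System.out.println(records.size() + " record(s) inserted.");
    }

    public static void main(String[] args) throws Exception {
        // URL, user, and password are placeholders for a local MaxDB instance.
        List<String[]> records = parseRecords(
                new InputSource(new java.io.FileReader("records.xml")));
        try (Connection con = DriverManager.getConnection(
                "jdbc:sapdb://localhost/TST", "user", "password")) {
            insertRecords(con, records);
        }
    }
}
```

Separating the parsing from the insertion keeps the XML-handling testable without a live database connection.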

The Result of the Program

The program worked perfectly, and the results of the test run are shown below.

image

Verification

The data was successfully inserted into MaxDB. The table entries are shown below.

image

Hope this was useful.


4 Comments


  1. Valery Silaev
    SUBJ:

    1. Ever heard about java.sql.PreparedStatement?
    2. Any thoughts about “batch updates”? MaxDB has JDBC 3.0 driver, AFAIK
    3. Loading the complete DOM tree into memory for this use case is overkill. Not only is resource consumption very high, but the task is by nature "forward-only per-record stream processing", so a custom SAX handler is ideal here. A SAX handler lets you execute the two steps in one. With DOM you first parse the complete tree into unnecessary memory structures (step 1), then scan the tree and execute the updates (step 2). With SAX you can execute the update (or add an entry to the batch update) as soon as you have one record parsed.

    VS

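The streaming approach described in the comment above could be sketched roughly as follows; the element names and record layout are assumptions, and the sink interface stands in for the PreparedStatement addBatch()/executeBatch() calls a real loader would make:

```java
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class SaxRecordLoader extends DefaultHandler {

    /** Invoked once per complete <row>; a real implementation would bind
     *  the values to a PreparedStatement and call addBatch() here. */
    public interface RecordSink { void record(String id, String name) throws Exception; }

    private final RecordSink sink;
    private final StringBuilder text = new StringBuilder();
    private String id, name;

    public SaxRecordLoader(RecordSink sink) { this.sink = sink; }

    @Override public void startElement(String uri, String local, String qName, Attributes a) {
        text.setLength(0);                 // collect fresh character data per element
    }

    @Override public void characters(char[] ch, int start, int len) {
        text.append(ch, start, len);
    }

    @Override public void endElement(String uri, String local, String qName) {
        switch (qName) {
            case "id":   id = text.toString(); break;
            case "name": name = text.toString(); break;
            case "row":                        // one record fully parsed: push it out
                try { sink.record(id, name); }
                catch (Exception e) { throw new RuntimeException(e); }
                break;
        }
    }

    public static void load(InputSource src, RecordSink sink) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(src, new SaxRecordLoader(sink));
        // with a PreparedStatement-backed sink: ps.executeBatch(); con.commit();
    }
}
```

Because records are handed to the sink as they are parsed, memory use stays flat regardless of how large the XML file is.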
