Raj Thukiwakam

JDBC Receiver scenarios best practices – INSERT operation design – Part 2

Design 1: INSERT operation

Let's start with the standard INSERT statement design. Everyone is familiar with the INSERT operation, and it works well for low and medium volume loads.

Many documents are already available on this concept, so I will not give a detailed description here.

The first and foremost step when working on a JDBC scenario is creating the JDBC receiver data type structure; SAP defines a standard XML format to use in JDBC receiver scenarios.

Refer to the SAP Help documentation to understand the XML document structures for the different operations.

The screenshots below show an example of inserting data into a single database table using the INSERT operation.
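As a sketch, a JDBC receiver message for a single-table INSERT follows SAP's documented XML SQL format. The message type, table, and column names below are hypothetical examples:

```xml
<!-- JDBC receiver message, XML SQL format.
     MT_Material, MATERIAL_BASIC, MATNR, and MAKTX are hypothetical names. -->
<MT_Material>
  <StatementName>
    <dbTableName action="INSERT">
      <table>MATERIAL_BASIC</table>
      <access>
        <MATNR>100001</MATNR>
        <MAKTX>Sample material</MAKTX>
      </access>
    </dbTableName>
  </StatementName>
</MT_Material>
```

From this message the adapter builds an SQL statement of the form `INSERT INTO MATERIAL_BASIC (MATNR, MAKTX) VALUES (...)`. Note that the `StatementName` and `dbTableName` element names are freely chosen in the data type; the `action` attribute and the `table`/`access` elements carry the meaning.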


Design 2: Inserting data into multiple tables without multi-mapping

In this case, create a data type with multiple statement objects, as shown below, and map accordingly.

The advantage of using multiple statement objects in one JDBC receiver data type is that PI executes the whole scenario in a single process call, so PI resource utilization is lower compared to a multi-mapping design.

The example below shows inserting MEDIA products IDoc data into two database tables: a material account table and a material basic data table.
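A minimal sketch of such a data type, with two statement objects in one message (all table and column names are hypothetical):

```xml
<!-- One JDBC receiver message writing to two tables in one call.
     Statement element names are arbitrary; statements execute in document order. -->
<MT_MediaProduct>
  <Statement_Account>
    <dbTableName action="INSERT">
      <table>MATERIAL_ACCOUNT</table>
      <access>
        <MATNR>100001</MATNR>
        <ACCOUNT>4711</ACCOUNT>
      </access>
    </dbTableName>
  </Statement_Account>
  <Statement_BasicData>
    <dbTableName action="INSERT">
      <table>MATERIAL_BASIC</table>
      <access>
        <MATNR>100001</MATNR>
        <MAKTX>Sample material</MAKTX>
      </access>
    </dbTableName>
  </Statement_BasicData>
</MT_MediaProduct>
```

Each top-level statement element produces its own SQL statement, so one message inserts into both tables without a multi-mapping.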


Use this design when you want to process a medium volume of records into the database (e.g., 20k records). The INSERT operation consumes significant resources, and the JDBC connection is released only after the INSERT operation has completed successfully.
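When one message carries many rows for the same table, the standard structure simply repeats the `<access>` element; each `<access>` block becomes one inserted row (hypothetical names again):

```xml
<!-- Multiple rows for one table: repeat <access> under a single statement. -->
<StatementName>
  <dbTableName action="INSERT">
    <table>MATERIAL_BASIC</table>
    <access>
      <MATNR>100001</MATNR>
    </access>
    <access>
      <MATNR>100002</MATNR>
    </access>
  </dbTableName>
</StatementName>
```

With Batch Mode enabled on the receiver channel, the adapter can send such repeated rows as a JDBC batch instead of one round trip per row, which helps at this volume.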


Advantages:

1) Error handling is easy: if the INSERT operation fails due to a data issue, an exceeded field length, unsupported data, or a network connectivity problem, we can track it in communication channel monitoring.

2) The ALERT framework covers JDBC adapter issues, so the support team can easily identify the root cause.

3) It is easy to develop, and the dependency on the database team is always minimal.


Disadvantages:

1) Processing high message volumes is not recommended with this design; I had serious problems with the INSERT statement design when dealing with high volumes.

2) The JDBC adapter got locked in many cases, with all connections shown as in use; eventually the adapter locked completely, and the problem was solved only after a Java stack restart.

      Former Member

Though it's been a long time since I touched the JDBC adapter, it is always challenging to design scenarios involving databases, since any design flaws can potentially affect the whole server and all interfaces.

Have you tried creating four messages (since you have to insert data into four tables) using batch processing mode, by tweaking the Java thread counts at the adapter level?

Can you share your result metrics?



      Raj Thukiwakam
      Blog Post Author

I am always against the INSERT statement. In my case I tried using 4 statements in one data type and inserted data into multiple tables within one message, but performance was not great; batch mode improved it only a little.

I increased the thread count to 20, after which performance improved, but the JDBC adapter locked many times. My requirement is to process 2 million messages per day. :)

Still, increasing receiver threads is one good option to improve performance.

      Prabhakar Amuri

      Hi Raj,

I have one issue: I was trying to insert multiple records into the table, but only the first record is inserted. Can you please let me know how to proceed?



      Prabhakar Amuri

      Hi Raj,

My issue is resolved now.



      Tang Lambert

Hello, I have the same problem: I want to insert multiple records into the database at a time, but from database monitoring I can see that only one record is inserted at a time, which gives poor performance.

Please comment: can you show me how to insert more than one record at a time?

This is an urgent need. Thank you very much.