Raj Thukiwakam

JDBC Receiver scenarios best practices – How to handle High volume load – Stored procedure design – Part 6

A stored procedure was developed in the database with 4 input parameters (header, line item, partner, and DTN). The stored procedure always expects the input parameters in the same order, and the last input parameter, DTN, is optional.

From the PI interface development point of view, "|" is used as the field separator and "$$" as the record separator. It is a really challenging task to send a default blank value for all fields when no values are coming from the IDoc.

If the PI mapping logic is unable to populate the default blank value, the mapping fails because all fields in the data type are mandatory.
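As a hypothetical sketch of this padding idea (the class and method names are mine, not from the actual PI mapping): every record must carry a value for every field, so any missing IDoc value is replaced by a blank before the fields are joined with "|" and the records with "$$".

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: mimics the blank-default logic the PI mapping needs,
// so the stored procedure always receives a fixed number of fields.
public class FlatRecordBuilder {
    static final String FIELD_SEP = "|";
    static final String RECORD_SEP = "$$";

    // Missing or null values become "" so the field count stays constant.
    static String buildRecord(String[] values, int fieldCount) {
        List<String> fields = new ArrayList<>();
        for (int i = 0; i < fieldCount; i++) {
            String v = (values != null && i < values.length && values[i] != null)
                    ? values[i] : "";
            fields.add(v);
        }
        return String.join(FIELD_SEP, fields);
    }

    // All records of one IDoc package, joined into a single payload string.
    static String buildPayload(List<String[]> records, int fieldCount) {
        List<String> out = new ArrayList<>();
        for (String[] r : records) out.add(buildRecord(r, fieldCount));
        return String.join(RECORD_SEP, out);
    }
}
```

The point is that a record with no data still produces the right number of separators, which is exactly the behavior the mandatory fields in the data type enforce.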

Number of JDBC calls:

The number of JDBC calls to the database is always one (covering header, line item, partner, and DTN), irrespective of the number of E1EDP01 and E1EDKA1 segments in the IDoc. This is the main advantage of this design.

IDoc packaging:

This design supports IDoc packaging; refer to the blog below by Shabarish: –sap-pi-71-ehp1-and-above

The main advantage of IDoc packaging is that the number of tRFC calls to the PI system is minimal: all collected IDocs are sent in one tRFC call. This packaging concept avoids many performance issues.

In the same way, my design supports IDoc packaging, sending the collected IDocs to the database in one JDBC call.

I tested my development with a package size of 1000; the results were positive and the performance was amazing.

As per my business requirement, 1000 IDocs required nearly 200k insert operations on the database with the INSERT statement design, and 1000 stored procedure calls with the standard design, but this design required only one database call.
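A minimal JDBC sketch of the single-call idea: however many IDocs are in the package, exactly one stored procedure call goes out. The procedure name SP_LOAD_ORDERS and its four IN parameters are assumptions for illustration; the real procedure name and signature are whatever the DB team implemented.

```java
// Illustrative only: builds the JDBC escape syntax for a procedure call,
// and shows (commented out) how one call would carry the whole package.
public class SingleCallSketch {

    // Produce e.g. "{call SP_LOAD_ORDERS(?, ?, ?, ?)}" for n parameters.
    static String callSyntax(String procName, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procName).append("(");
        for (int i = 0; i < paramCount; i++) {
            if (i > 0) sb.append(", ");
            sb.append("?");
        }
        return sb.append(")}").toString();
    }

    /*
    // With a live java.sql.Connection, the whole IDoc package is one round trip:
    try (java.sql.CallableStatement cs =
             conn.prepareCall(callSyntax("SP_LOAD_ORDERS", 4))) {
        cs.setString(1, headerPayload);    // all header records, "$$"-separated
        cs.setString(2, lineItemPayload);  // all E1EDP01-derived records
        cs.setString(3, partnerPayload);   // all E1EDKA1-derived records
        cs.setString(4, dtnPayload);       // optional DTN, may be blank
        cs.execute();                      // one DB call per package
    }
    */
}
```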

As per SAP recommendations, packaging more than 4000 IDocs is not a good practice.

Message Size:

The mapping output (stored procedure XML) is always smaller compared to the standard designs, so the memory consumed in the BC_MSG Java table is lower.

Challenges with this design:

1) Writing the PI mapping logic to send default blank values is really challenging when you have complex mapping logic.

2) If the mapping output stored procedure XML exceeds 32k characters, the database side needs to change the stored procedure coding to accept more than 32k. I faced this problem, and my database team fixed it.

3) Error-handling logic is required at the database level to validate the values in the stored procedure XML, because the JDBC adapter does not throw any error if you send incompatible values, unlike with the INSERT statement design.


I consider this one of the best designs to achieve the best performance in JDBC receiver scenarios; we performed 15 million DB operations with this design in 23 hours.

Earlier parts in this series: –insert-operation-design-part2, –multi-mapping-design-part3, –stored-procedure-design-part4, –high-volume-stored-procedure-design-part5

Comments
      Former Member

      Hi Raj,

      Nice series of blogs on JDBC. However, personally I would prefer two approaches to further handle heavy-volume DB operations in a small frame of time, yet in an open standard format.

      1. Zipped and then base64-encoded XML data suitable for the database schema(s) (columns/tables, etc.) as input to a Java stored procedure. The Java stored procedure in Oracle should be capable of unzipping this data and linking it to the database schema(s). The stored procedure can also have some kind of logging mechanism to handle errors. Since we are relying on XML, it is an open standard and we can use SAX or DOM in the Java stored procedure.

      2. As you mentioned, flat content (however, in zipped and then base64-encoded format) as input to the stored procedure. But this approach requires more customization on both ends, i.e. the XI side and the DB side, for handling flat content.
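      The zip-then-base64 step from approach 1 above can be sketched as follows (GZIP is my assumption, since the comment does not name a compression format; the DB-side Java stored procedure would run `decode` in reverse):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Illustrative only: compress the XML payload, then base64-encode it so it
// travels as plain text; the receiving side reverses both steps.
public class PayloadCodec {
    static String encode(String xml) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                gz.write(xml.getBytes(StandardCharsets.UTF_8));
            }
            return Base64.getEncoder().encodeToString(bos.toByteArray());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    static String decode(String encoded) {
        try {
            byte[] zipped = Base64.getDecoder().decode(encoded);
            try (GZIPInputStream gz =
                     new GZIPInputStream(new ByteArrayInputStream(zipped))) {
                return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```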

      Let me know your thoughts.


      Praveen Gujjeti

      Raj Thukiwakam (Blog Post Author)

      Hi Praveen,

      The mentioned approaches look promising. I have never tried them, but I definitely will.

      The DB team has to be flexible enough to change development at the DB level to try different designs, but most of the time that is not the case. I am lucky that my DB team supported me 🙂