
Optimizing System Performance using Parallel Processing of IDOCs

What is Parallel Processing?

Parallel processing means processing multiple IDocs at the same time in different dialog processes. This reduces the waiting time that occurs when multiple IDocs queue for the same dialog process.

Example of a typical scenario:

In Direct Store Delivery (DSD), a salesperson syncs multiple orders from a handheld device to SAP. In SAP these orders are processed as IDocs; they need to be processed in a very short span of time, and information must be sent back to the handheld. This activity is highly time-critical, so IDoc processing performance counts a lot.

Necessity for Parallel Processing:

While sending and processing mass IDocs in an ECC system, there can be performance issues: generally, IDocs get processed one by one in a single dialog process while the rest wait in a queue, which increases the overall lead time. For real-time applications where a mass update is highly time-critical, parallel processing helps optimize overall performance.
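As an illustration only (Python, not ABAP), the difference between queuing all work behind one process and fanning it out to several can be sketched. Here `post_idoc`, the 8 workers, and the status code are hypothetical stand-ins for the application posting logic, the dialog processes of a server group, and IDoc status 53:

```python
from concurrent.futures import ThreadPoolExecutor

def post_idoc(idoc_number):
    # hypothetical stand-in for posting one inbound IDoc
    return (idoc_number, "53")  # 53 = application document posted

idocs = list(range(1, 51))

# Serial: one dialog process posts every IDoc in turn; the rest wait in queue.
serial = [post_idoc(n) for n in idocs]

# Parallel: 8 workers stand in for 8 dialog processes of a server group.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(post_idoc, idocs))

# Both approaches post the same IDocs; only the elapsed time differs.
assert sorted(serial) == sorted(parallel)
```

The point of the sketch is only that the result set is identical either way; parallelism changes throughput, not the outcome.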

Steps to achieve parallel processing in ECC:

Step 1: Setting up a server group to handle parallel processes

How to achieve it: The Basis team needs to define a server group and allocate a certain number of dialog processes to it. These dialog processes will then be dedicated to parallel processing for this server group.

e.g.: Define a server group called Parallel Process and allocate 8 dialog processes to this group.

This allows mass IDocs to run in all dialog processes in parallel; the more processes available, the more IDocs are picked up at runtime, improving processing time.

Cons: System resources must be capable of dedicating the designated number of processes to parallel processing; otherwise it could slow down system performance for other workloads due to lack of resources. The total number of dialog processes allocated to the server group is entirely the Basis team’s call, depending on system load.

 

Step 2: BD51 configuration


In BD51, configure the inbound processing function module’s input type for mass processing (input type 0). With mass processing, packets of IDocs are transferred and the individual IDocs of a packet are posted in the same Logical Unit of Work (LUW). With the other input type options, only a single IDoc is processed per LUW.
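The effect of this setting on LUW boundaries can be sketched in Python (not ABAP); `commit_work` is a hypothetical stand-in for a COMMIT WORK, and the counts only illustrate the one-LUW-per-packet idea:

```python
commits = 0  # counts LUW boundaries (stand-in for COMMIT WORK)

def commit_work():
    global commits
    commits += 1

def post_individually(idocs):
    # non-mass input types: one LUW (one commit) per IDoc
    for _ in idocs:
        commit_work()

def post_in_packets(idocs, packet_size):
    # mass processing: all IDocs of one packet share a single LUW
    for start in range(0, len(idocs), packet_size):
        commit_work()

idocs = list(range(50))
post_individually(idocs)                  # 50 IDocs -> 50 commits
individual_commits, commits = commits, 0
post_in_packets(idocs, 5)                 # 50 IDocs -> 10 commits
```

Fewer LUW boundaries means less commit overhead per IDoc, which is why mass processing pairs well with packetized RBDAPP01 runs.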

Step 3: Setting up a job with RBDAPP01

We need to take care of two options while setting up the variant for RBDAPP01:

 

1. Packet size: Packet processing enables one program or one function module to process batches of data of the same type. This reduces the number of dialog processes called and improves system performance. Packet processing and parallelism complement one another, although in some situations they may compete with each other: if there are too many master data objects in each packet, possibly not all available dialog processes will be used.

 

Ideally we should set the packet size between 20 and 100.

2. Specify the server group under the Parallel Processing option.


     

3. Execution of the program: We can set up a background job with the variant and execute it. SAP recommends scheduling the next RBDAPP01 job only once the existing one has finished; otherwise it could lead to locking issues.
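To see why packet size and parallelism can compete (the point made under option 1), a small back-of-the-envelope helper; the function name and the numbers are purely illustrative:

```python
import math

def processes_used(total_idocs, packet_size, available_processes):
    """How many dialog processes a single run can keep busy at once."""
    packets = math.ceil(total_idocs / packet_size)
    return min(packets, available_processes)

# 50 IDocs, 8 dialog processes in the server group:
small = processes_used(50, 5, 8)    # 10 packets -> all 8 processes busy
large = processes_used(50, 25, 8)   # 2 packets  -> 6 processes sit idle
```

An oversized packet starves the server group: with only 2 packets to distribute, 6 of the 8 allocated processes do nothing.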

Consider a scenario where ECC receives 50 IDocs:

• The Parallel Process server group is ready with 8 allocated processes
• RBDAPP01 is set up for parallel processing with packet size 5

     

The following diagram depicts parallel processing:

[Diagram: Parallel Processes]

40 IDocs will be processed at the same time, and the remaining 10 will wait to be picked up.

5 IDocs will be processed at a time within the same LUW, due to the mass processing option and the packet size.
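The scenario’s numbers follow from simple arithmetic, sketched here in illustrative Python:

```python
import math

total_idocs = 50
processes = 8
packet_size = 5

per_wave = processes * packet_size            # 40 IDocs in flight at once
first_wave = min(total_idocs, per_wave)       # 40 processed immediately
waiting = total_idocs - first_wave            # 10 left waiting
waves = math.ceil(total_idocs / per_wave)     # done in 2 waves
```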

Benefits achieved:

If we had not enabled the parallel processing option, all 50 IDocs would have been fed to the RBD program and processed one after another, depending on dialog process availability at runtime, and we would have no control over the number of IDocs processed in one go.

This definitely slows down IDoc processing time.

With parallel processing, however, the multiple processes dedicated to the server group improve overall performance.

Challenges involved in parallel processing:

     

1. If we need to run RBDAPP01 while an earlier run has not yet finished, the same IDoc might get picked up by multiple processes, causing duplicate orders. This happens because RBDAPP01 picks up IDocs only in status 64 and 66; an IDoc’s status changes to 51/53 only at the very end of processing, so for the whole time it is still in status 64 it can be picked up by another process. This in turn causes duplicate IDoc processing issues.

SAP Note 902806 describes the same problem, so in such a situation we should customize the above solution a bit.

     

Solution to overcome duplicate IDoc processing:

Create a custom version of RBDAPP01 that, as soon as it picks up an IDoc, sets a custom status for it in a custom table, and that checks this custom status before processing. Once an IDoc has been picked up, there is no chance of it being picked up again.

     

But make sure that any update to the custom table during parallel processing is controlled via ENQUEUE/DEQUEUE.
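The claim-before-process idea can be sketched in Python (not ABAP); `claim_idoc`, the dictionary standing in for the custom status table, and `threading.Lock` standing in for the ENQUEUE/DEQUEUE lock object are all hypothetical names:

```python
import threading

custom_status = {}              # stand-in for the custom status table
table_lock = threading.Lock()   # stand-in for the ENQUEUE/DEQUEUE lock object

def claim_idoc(idoc_number):
    """Atomically mark an IDoc as picked up; False if another run has it."""
    with table_lock:                           # ENQUEUE
        if idoc_number in custom_status:
            return False                       # already claimed elsewhere
        custom_status[idoc_number] = "PICKED"
        return True                            # lock released on exit (DEQUEUE)

posted = []

def processing_run(idocs):
    for n in idocs:
        if claim_idoc(n):
            posted.append(n)    # post the IDoc exactly once

# Two overlapping runs competing for the same 10 IDocs:
runs = [threading.Thread(target=processing_run, args=(range(10),))
        for _ in range(2)]
for t in runs:
    t.start()
for t in runs:
    t.join()

assert sorted(posted) == list(range(10))  # each IDoc posted once, no duplicates
```

Without the lock, both runs could pass the `in custom_status` check simultaneously and post the same IDoc twice, which is exactly the status-64 race described above.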


5 Comments


    1. Michelle Crapo
      But I bet the IDOC programmers will get in as much trouble as the developers.  We use too many processes by accident and we bring all other jobs to a halt.  There is a great deal of finesse you have to have to do this!

      I’d be interested to hear other stories: are they all so “easy”? (I know this one wasn’t.) Any issues after you went live with parallel processing?

      Thank you for an interesting blog!

      Michelle

      1. Sejal Purandare Post author
        Thanks Michelle for your reply.

        Yes, this requirement was critical and we had to be extra careful about duplicate IDoc processing. The custom solution I proposed at the end helped us overcome such challenges. So far there have been no issues after go-live: no duplicates and no loss of data.

    2. Martin Steinberg
      Hi Sejal,

      thanks for your very interesting blog.

      I’ve got an issue concerning IDoc input processing, too. I face a huge number of IDocs being processed (approx. 100,000 each day). The decision against parallel processing was based on the requirement that all messages have to be processed in sequence.

      Do you have any ideas how such mass processing can be accelerated without risking a violation of the order?

      By using the mass processing option of RBDAPP01 it’s possible for IDoc 4711 to be processed in process 1 much earlier than its predecessor 4710 in process 2. So there’s a risk of inconsistent data.

      Do you have any ideas? Thanks a lot for your reply 🙂

      Cheers, Martin

      1. Sejal Purandare Post author
        Hi Martin,

        Thanks for your reply.

        If IDoc order is critical, then we cannot use parallel processing, as at runtime any IDoc gets picked up by the next available process.

        To optimize performance in your case, you might have to look at IDoc serialization or at processing logically divided IDoc packets. This might help you. I have not tried this, as in my case IDoc order was not mandatory; each IDoc carried individual information to be posted.

    3. SOUMYAJIT CHAKRABORTY

      Hi Sejal,

      This time we are facing a new issue where 2 IDocs containing the same order number get processed. Creating 2 IDocs with the same reference is not an issue and can happen, but if one document is already posted for an order, the next document should not be posted; that is the logic. However, in some cases this check is not being performed and the code allows both of them to be posted.

      Can you please advise?

      Regards,
      Soumyajit

