
ProcessDirect Adapter

Are you facing high network latency while establishing communication between integration flows in SAP Cloud Platform Integration?

The ProcessDirect adapter is the solution for you. You can use the ProcessDirect adapter to establish direct communication between two integration flows; unlike with the HTTP or SOAP adapters, the communication between the two integration flows is not routed via the load balancer.

Terms to keep in mind while working with ProcessDirect adapter:

Producer integration flow:

When the ProcessDirect adapter is used to send data to another integration flow, we consider that integration flow a producer integration flow. This is the case when the integration flow has a ProcessDirect receiver adapter.

Consumer integration flow:

When the ProcessDirect adapter is used to consume data from another integration flow, we consider that integration flow a consumer integration flow. This is the case when the integration flow has a ProcessDirect sender adapter.

ProcessDirect Adapter Offerings

  • Allow multiple integration developers to work on the same integration scenario
  • Reuse of integration flows across integration projects
  • Decomposition of Integration Flows

For detailed information on the ProcessDirect offerings, see the Help portal documentation.

Yes, we know you might be using the HTTP, SOAP, or JMS adapters to decompose/split an integration flow into smaller integration flows.

Look at the comparison chart below to see what the ProcessDirect adapter offers over and above these options:

Feature                 | JMS Queue adapter  | HTTP/SOAP adapter                           | ProcessDirect adapter
Availability            | Selective licenses | All                                         | All
Latency                 | High               | High (traffic routed through load balancer) | Low
Processing Mode         | Asynchronous       | Synchronous                                 | Synchronous
Header Propagation      | Yes                | No                                          | Yes
Transaction Propagation | No                 | No                                          | No
Session Sharing         | No                 | Yes                                         | Yes
Across Network          | Yes                | Yes                                         | No


The ProcessDirect adapter improves network latency, as message propagation across integration flows does not involve the load balancer. For this reason, we recommend that you consider memory utilization as a parameter in scenarios involving heavy payloads, and alternatively use the HTTP adapter in such scenarios, because the functional behavior is the same.


Basic Configuration Settings for ProcessDirect Adapter

In this scenario, we create a producer and a consumer integration flow, each with a Content Modifier. The message body is sent from the producer to the consumer and is written to an SFTP server configured in the consumer integration flow.

In the producer integration flow, we use the HTTPS adapter on the sender side and the ProcessDirect adapter on the receiver side.

Producer Integration Flow:

  1. Select the HTTPS adapter and go to the Connection tab.
  2. Provide the producer address.
    Note: If your scenario is not CSRF protected, make sure to deselect the CSRF Protected checkbox.
  3. Select the Content Modifier and go to the Message Body tab.

Consumer Integration Flow:

In the consumer integration flow, we use the ProcessDirect adapter on the sender side and the SFTP adapter on the receiver side.

  1. Select the ProcessDirect adapter and go to the Connection tab.
  2. In the Address field, use the same endpoint address that you provided in the producer integration flow on the receiver side.
  3. Select the SFTP adapter and configure the necessary settings.

Additional tip: If you don't want to configure a receiver, you can instead log the output from a script, as shown below.

Add the following to the script file.

The integration developer needs to create the method processData. This method takes a Message object of the package com.sap.gateway.ip.core.customdev.util, which includes helper methods useful for the content developer.

The methods available are:

    public java.lang.Object getBody()
    public void setBody(java.lang.Object exchangeBody)
    public java.util.Map<java.lang.String,java.lang.Object> getHeaders()
    public void setHeaders(java.util.Map<java.lang.String,java.lang.Object> exchangeHeaders)
    public void setHeader(java.lang.String name, java.lang.Object value)
    public java.util.Map<java.lang.String,java.lang.Object> getProperties()
    public void setProperties(java.util.Map<java.lang.String,java.lang.Object> exchangeProperties)
    public void setProperty(java.lang.String name, java.lang.Object value)



import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        messageLog.addAttachmentAsString("Log current Payload:", body, "text/plain");
    }
    return message;
}


ProcessDirect Adapter Sample Use-cases

In this section you will find some typical scenarios that can be implemented using the ProcessDirect adapter.

Transaction Processing

Transactional processing ensures that a message is processed within a single transaction. For example, when the integration flow contains a Data Store Write operation and transaction handling is activated, the Data Store entry is committed only if the whole process executes successfully. In case of failure, the transaction is rolled back and the Data Store entry is not persisted.

Scenario 1: A basic scenario with transaction type set to JDBC for both producer and consumer

In this scenario, the transaction type for both the producer and the consumer is set to JDBC. The consumer's Script component throws an exception that is not caught, and hence the database operations are rolled back in both the producer and the consumer.

Producer Integration Flow:

Consumer Integration Flow:
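The uncaught exception in the consumer can come from a Script step. A minimal sketch of such a script is shown below; the exception type and message are illustrative, and any uncaught exception has the same effect:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    // Throwing an uncaught exception here fails the consumer flow.
    // Because the transaction type is JDBC in both flows, the Data Store
    // operations in the producer and the consumer are rolled back.
    throw new RuntimeException("Simulated failure to trigger rollback");
}
```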


Scenario 2: Producer throws an exception after Request-Reply and transaction type is set to JDBC

In this scenario, the transaction type is set to JDBC for both producer and consumer. A script after the Request-Reply step throws an exception. Two MPLs are generated: the producer's with status Failed and the consumer's with status Completed. In this case, database operations are rolled back in the producer but not in the consumer.

Producer Integration Flow:

Consumer Integration Flow:


Scenario 3: Producer throws an exception after Request-Reply; transaction type is set to JDBC for the producer and Not Required for the consumer

In this scenario, the transaction type is set to JDBC for the producer and Not Required for the consumer. A script after the Request-Reply step throws an exception. Two MPLs are generated: the producer's with status Failed and the consumer's with status Completed. In this case, database operations are rolled back in the producer but not in the consumer.

Producer Integration Flow:

Consumer Integration Flow:


Multiple MPLs

Scenario 1: Only a Splitter in the producer, without any join/aggregator

In this scenario, both the producer and the consumer will have Completed status in the MPL logs, and three split messages are produced during processing.

Producer Integration Flow:

Consumer Integration Flow:

Scenario 2: Iterating Splitter in the producer and Aggregator in the consumer

In this scenario, one MPL log is generated for the producer and one for each split message in the consumer. Due to the default behavior of the Aggregator, one additional MPL is generated in the consumer integration flow.

Producer Integration Flow:

Consumer Integration Flow:

Scenario 3: Router in the producer, routing to the ProcessDirect adapter via two receivers

In this scenario, the associated consumer is invoked depending on a routing condition.

Producer Integration Flow:

Consumer 1 Integration Flow:

Consumer 2 Integration Flow:

Scenario 4: Request-Reply and End Message in Producer and End Event in Consumer

In this scenario, the Request-Reply step in the producer integration flow forwards the message body from Content Modifier 1 to the consumer 1 integration flow. In consumer 1, the received message is appended to the body of its Content Modifier 1 and returned to the Request-Reply step in the producer integration flow. The message is then forwarded to consumer 2, which sends it to the SFTP server.

Producer Integration Flow:


Consumer 1 Integration Flow:

Consumer 2 Integration Flow:



Comments
  • Hi Meenakshi,

    thanks for the interesting blog. The adapter is now available on my tenant.

    Two questions:

    1. URL configuration
      The SAP Help documentation does not give a detailed description of how the URL should look:

      Could you give a concrete example here? (is it only the URL path or the complete URL?)

    2. Can it be displayed in the monitoring from which message a target Integration Flow was triggered/called?


    Regards, Holger

  • Just tested the transfer of header, body and property values between the invoking and invoked Integration Flow. As expected property values are not passed over between the two Integration Flows. The exchange of the body works in both directions. For header values it seems that only the invoked Integration Flow can pass back values to its predecessor, but not in the other direction.

    Would be interested in your test results. I will continue with some more tests, since it would be useful if not only body but also header values can be passed from the invoking to the invoked Integration Flow.

    • I agree with your comments, Holger. All three parts of a message (header, body, and properties) should be passed to get the full benefit of the process call.

        • At least we need an option at the adapter level to handle propagation of selected headers and properties from the child process to the main process, similar to the header handling from the main process. We also need a waiting time or timeout option in the main process channel to avoid call failures during multicast calls from the main process.

          • All headers are exchanged from the calling flow to the called flow. Via the field "Allowed Header(s)" you can specify in the child process which headers are allowed in your called flow.

            If you can control both flows, you should cleanup the headers that you don't need before returning to the calling flow.

            Why do you need a waiting time or a timeout? If the called flow is doing an outbound call you have a timeout parameter there. Why is the multicast call special? If you do multicast calls from the calling flow the called flow can be executed multiple times.

          • Exchange of the header and body is possible, but not the properties, so we need to try whether attachments are exchanged here.

            As mentioned in the other comment, a sequential multicast will fail in case one of the child processes fails to return within some time, for example more than 3 minutes.

            At the least, the timeout value of the main iflow's sender channel should be considered implicitly here.

    • The Header values can be transferred in both directions. For passing a header value from Producer iflow (invoking iflow) to Consumer iflow (invoked), the headers should be configured in <Allowed Header(s)> in the consumer integration flow's runtime configuration.

  • Does this adapter hand off the payload and continue to the next step, or does it wait for the sub-integration flow to complete before moving forward?

    The reason I am asking is that I have large payloads (sometimes 100-200 MB). In this case I don't want to hold up the main integration flow.

    I am looking for a way to send the payload to the sub-integration flow and continue without waiting for it to complete.


    Thanks in advance



    • Hi All,

      I also have a similar situation and would like to know the answer to the above question. If we are not using Request-Reply in the producer iflow and are sending the payload to the consumer, and the consumer processes the data and passes it on further, does the producer wait until the consumer is finished, or will it set the status to Completed?

      Thanks in Advance.


      • It will wait for the process call to complete, but if you use a multicast in the main process, it will fail after a timeout in case any of the child processes takes more than 1 minute.

        • Can you elaborate a bit more on that please?

          Why does the Multicast fail after 1min? I just tried it out myself. I used a script in one of the branches waiting for 70s. And nothing happened. The flow executed successfully.

          • In case of IDoc scenarios: if we use a sequential multicast in the parent flow and the first process completes within a few seconds while the second process takes more than 2 or 3 minutes, then the main flow is set to Failed, so the IDoc will be in an error state in the backend.

  • Good job from SAP; looking forward to hearing more in comparison with other iPaaS tools' features, such as individual UX options to handle operations like the ones below.

    1. Messages in data stores.
    2. Easy selection of external and dynamic parameters from message structures like header, body, properties and so on.
    3. Easy access to parameters in components like mapping, routing, exceptions and many.
    4. Easy un/deployments to multiple tenants, monitoring, transport and so on.



    • Process Direct adapter can be used to make inter-package iflow calls as well.

      Producer iflow can be part of Package A and Consumer iflow can be part of Package B. Once both the iflows are deployed, producer iflow can make a call to the consumer iflow.

    • Hi Vinay,

      ProcessDirect is independent of the EDI Splitter. The consumer flow acts as a standalone flow, and therefore the EDI Splitter should also be supported if your tenant is enabled for it.


  • Hello all,

    I have question on the behavior of this Process Direct Adapter.

    There are 2 iflows. The first iflow calls the second iflow using the ProcessDirect adapter. The scenario runs fine, no issues there. But the problem I'm seeing is with the timing. As I understand it, the ProcessDirect adapter works synchronously. The consumer iflow takes approx. 11 minutes to complete its processing. Since the ProcessDirect adapter is synchronous, I was expecting the producer iflow to also wait the same approx. 11 minutes, but it completes in approx. 5 minutes 30 seconds. So I am not sure whether this is an issue or the intended behavior.

    Let me know if my question makes sense and if anyone has any insights on this behavior.

    Best Regards,


  • ProcessDirect is not working as expected for integration flows between SAP Cloud for Customer and SAP ECC.

    The scenario is the following.

    1. The ProcessDirect post-exit process is being called by setting the configuration to true.
    2. The consumer iflow service executes successfully, performing a mapping with the information received. The following image describes the iflow.
    3. After the process returns to the producer, its body is still the same regardless of the response of the consumer iflow, just as if it never passed through the ProcessDirect request-reply call.
    4. Also, there is an issue with the standard Replicate Business Partner from SAP ERP flow: in the filtering step it is expected to receive the information from the consumer in a specific format, which is in a different order from the one sent before the consumer call. Nevertheless, I adjusted the payload according to the filter step so it can work, but the main issue is the one described above.

    In a marketing integration scenario this didn't happen using the same steps. At this point I don't know if this issue is related to the subaccount: the marketing process was on an SAP Neo environment account, while the one having the issue uses the new SAP Cloud Foundry self-service configuration process. I hope someone can help with this issue.