
Managing time intensive Integration Flows using SAP Cloud Platform Integration Suite

Problem Statement:

Integration Flows (iFlows) can be used to solve problems that are time intensive (the job takes a long time to finish). Although Cloud Platform Integration Suite has measures in place to handle such time-intensive processes, the long-lived connection becomes hard to manage when the endpoint of the iFlow sits behind an API Management layer. REST/OData architecture favours short-lived request-response cycles, whereas the iFlow in this case is long-lived, so any request coming through the proxy endpoint can result in timeouts.

In this blog I am going to talk about two architecture patterns that can be used to solve this problem.

Webhook Pattern:

The Flow:

When deployed, the iFlow subscribes to a Queue within SAP Enterprise Messaging. An API is created in the APIM system and configured to publish to the Queue to which the iFlow has subscribed.

  1. The client calls the exposed API endpoint, passing the webhook URL in the payload.
  2. On receipt of the request, APIM executes the configured policies and creates a job by publishing the payload received from the client to the Queue.
  3. It returns a response to the client stating that a job has been created and that the webhook will be called once the job is finished.
  4. The iFlow, subscribed to the queue, picks up the job.
  5. The iFlow starts processing the time-intensive job.
  6. The iFlow sends requests to several different systems to complete the job.
  7. The iFlow collects all the responses.
  8. Once the job ends, the iFlow extracts the webhook URL provided by the client and posts the response to it (a client-side sketch of steps 1-3 and 8 follows this list).
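
To make the flow concrete, below is a minimal client-side sketch in Python. The APIM endpoint, the callback path, and the payload fields (webhookUrl, data) are assumptions for illustration, not actual product URLs or schemas; Flask and requests are used only to keep the sketch short.

```python
# Client side of the webhook pattern (illustrative sketch).
# The APIM endpoint, callback path, and payload fields are assumptions.
import requests
from flask import Flask, request

APIM_ENDPOINT = "https://apim.example.com/v1/jobs"        # hypothetical managed proxy
WEBHOOK_URL = "https://client.example.com/job-callback"   # must stay reachable for step 8

app = Flask(__name__)

@app.route("/job-callback", methods=["POST"])
def job_callback():
    # Step 8: the iFlow posts the finished job's response here.
    result = request.get_json()
    print("Job finished:", result)
    return "", 204  # acknowledge receipt

def submit_job(payload: dict) -> None:
    # Steps 1-3: submit the job together with the webhook URL and
    # accept the immediate "job created" acknowledgement.
    body = {"webhookUrl": WEBHOOK_URL, "data": payload}
    resp = requests.post(APIM_ENDPOINT, json=body, timeout=10)
    resp.raise_for_status()
    print("Acknowledged:", resp.json())

if __name__ == "__main__":
    submit_job({"orderId": "4711"})
    app.run(port=8080)  # keep the callback endpoint up while the job runs
```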

Key points to consider:

  1. The client has to own the response payload. Any downtime at the client's end can make the webhook URL unavailable, in which case the job response payload might get lost.
  2. The request goes through the API-managed gate, but the response doesn't go through any managed gate. This could raise security concerns for some organisations: how trusted is the webhook? One common mitigation, signing the callback payload, is sketched below.
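
One common way to address the trust question is for the iFlow to sign the callback payload with a shared secret so the client can verify it before accepting the result. The signing scheme and header handling below are assumptions shown only as a sketch of the idea, not a prescribed mechanism.

```python
# Sketch: verifying a signed webhook callback with a shared secret.
# The signing scheme and header handling are assumptions for illustration.
import hashlib
import hmac

SHARED_SECRET = b"replace-with-a-real-secret"

def sign(body: bytes) -> str:
    # The sender (the iFlow) computes this over the raw callback body
    # and ships it, e.g., in an X-Signature header.
    return hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    # The client recomputes the signature and compares in constant time.
    return hmac.compare_digest(sign(body), signature)

raw = b'{"jobId": "42", "status": "done"}'
assert verify(raw, sign(raw))        # genuine callback accepted
assert not verify(raw, "forged")     # tampered callback rejected
```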

 

LookUp Pattern:

The Flow:

When deployed, the iFlow subscribes to a Queue within SAP Enterprise Messaging. There are two queues: Request and Response. An API is created in the APIM system and configured to publish to the Queue to which the iFlow has subscribed.

  1. The client calls the API endpoint.
  2. On receipt of the request, APIM executes the configured policies and creates a job by publishing the payload received from the client to the Queue.
  3. It returns a response to the client stating that a job has been created.
  4. The iFlow, subscribed to the queue, picks up the job.
  5. The iFlow starts processing the time-intensive job.
  6. The iFlow sends requests to several different systems to complete the job.
  7. The iFlow collects all the responses.
  8. The iFlow queues the job-completion status within the Response Queue.
    1. A new job status is found in the Queue.
    2. An iFlow processes the completion status and stores the response within a Data Store. Note: from step 7 onwards the response could also be inserted directly into the Data Store, and the Response Queue might not be required at all; in some cases, however, it is best practice to use a queue for job handling.
  9. The client looks up the response, polling if necessary (a polling sketch follows this list).
  10. The job-completion status is fetched from the Store.
  11. The completion response is returned to the client.
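
Below is a minimal client-side sketch of this round trip, assuming a hypothetical /jobs endpoint that returns a job id and a /jobs/{id} look-up that serves the stored completion status; all paths, fields, and status values are illustrative. Exponential backoff is included to limit polling traffic (see the key points below).

```python
# Client side of the lookup pattern (illustrative sketch).
# Endpoint paths, payload fields, and status values are assumptions.
import time
import requests

APIM_BASE = "https://apim.example.com/v1"  # hypothetical managed proxy

def submit_job(payload: dict) -> str:
    # Steps 1-3: create the job and receive the acknowledgement with a job id.
    resp = requests.post(f"{APIM_BASE}/jobs", json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()["jobId"]

def wait_for_result(job_id: str, max_wait: float = 600.0) -> dict:
    # Steps 9-11: poll the look-up endpoint until the stored completion
    # status appears. Exponential backoff limits needless polling traffic.
    delay, waited = 2.0, 0.0
    while waited < max_wait:
        resp = requests.get(f"{APIM_BASE}/jobs/{job_id}", timeout=10)
        if resp.ok and resp.json().get("status") == "done":
            return resp.json()
        time.sleep(delay)
        waited += delay
        delay = min(delay * 2, 60.0)  # cap the polling interval
    raise TimeoutError(f"job {job_id} did not complete within {max_wait}s")

if __name__ == "__main__":
    job_id = submit_job({"orderId": "4711"})
    print(wait_for_result(job_id))
```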

Key points to consider:

  1. The provider owns the storage of the response. How long will the response be stored? When and how will the response be deleted? The answers depend on the requirement and the landscape (a generic retention sketch follows this list).
  2. The client can poll to learn the status of the job. This could impact the server, as polling generates needless traffic.
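
To make the storage-ownership question concrete, here is a generic retention sketch: each stored response is timestamped, and a periodically scheduled purge removes anything older than the agreed TTL. The in-memory dictionary merely stands in for the real store; in CPI, the Data Store write step offers an expiration period that achieves a similar effect.

```python
# Generic retention sketch: purge stored job responses after a TTL.
# The in-memory dict stands in for whatever store holds the responses.
import time

RESPONSE_TTL_SECONDS = 7 * 24 * 3600  # retention period agreed with consumers

store: dict[str, tuple[float, dict]] = {}  # job_id -> (stored_at, response)

def save_response(job_id: str, response: dict) -> None:
    store[job_id] = (time.time(), response)

def purge_expired() -> int:
    # Run periodically (e.g. from a scheduler); returns the number purged.
    now = time.time()
    expired = [jid for jid, (ts, _) in store.items()
               if now - ts > RESPONSE_TTL_SECONDS]
    for jid in expired:
        del store[jid]
    return len(expired)
```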

Both patterns

  • ensure that the client is guaranteed a response to its inbound request.
  • are asynchronous with respect to the client.
  • are proven to scale in case the job completion time increases or decreases.

 

Edit: I have been getting questions about when to use these patterns:

Use these patterns when APIM is used with CPI and at least one of the following conditions holds:

  • Completion of the iFlow is known in advance to take a long time.
  • The completion time of the iFlow is unknown; it may be long or short depending on the scenario.
  • You want a generic architecture that can scale to handle both short (immediate) and long-running flows in the future.

Please note that for iFlows already known to be short running, this architecture could be overkill.

 

4 Comments
      Fatih Pense

      Hello Deep, thank you for the great architectural write-up!

      I have a question about using Enterprise Messaging as a store in the "Lookup Pattern". Can you search for a specific job id or other identifiers in the message queues? I couldn't find any information in the documentation about this feature. Can you refer me to the documentation, please?

      For example, the Messaging REST API doesn't have such a method:

      https://help.sap.com/doc/3dfdf81b17b744ea921ce7ad464d1bd7/Cloud/en-US/messagingrest-api-spec.html

      Best regards,
      Fatih

      Deep Ganguly (Blog Post Author)

      Hi Fatih,

      There is no API to look up from a Queue. The purpose of the Queue is not look-ups. The look-up is from the perspective of the client. I have updated the diagram with a store which could be used for the look-up.

      Fatih Pense

      Hi Deep,

      Thank you for updating the diagram. It is clear to me now. Enterprise Messaging doesn't have any functionality other than being a queue, and the whole look-up scenario is not possible without using a store.

      Regards,
      Fatih

      Raffael Herrmann

      Hi Fatih,

      What about setting up an additional Datastore iFlow in CPI? The flow could manage two tasks:

      1. Poll the EMS queue for job-done messages and write the job id into a CPI data store.
      2. Offer a public REST endpoint which allows clients to look up their jobs in CPI's data store.