Punith Oswal

SAP Cloud Integration – System Downtime handling by Automatic Deployment and Undeployment of the IFlows


The cornerstone of any project in a structured landscape lies in its integration. Integration plays a crucial role in establishing connections between applications, acting as a bridge that links two ends.

However, it’s important to acknowledge that integration, or the middleware, often encounters periods of downtime from either the source or target systems. The integration team bears the responsibility of effectively managing all integrations during these downtimes, ensuring data integrity and striving for minimal or zero failures within the system.

In this blog, I will delve into a mechanism that automates the deployment and undeployment of integration flows based on the scheduled downtime of specific applications.

This mechanism proves particularly valuable for synchronous integrations, where it becomes impractical to hold data for extended periods due to timeout constraints. However, for asynchronous flows, we have the flexibility to store messages in JMS queues, allowing for retries once the downtime is resolved.

Scenario:

We have a very-high-volume synchronous integration where the Timesheet application sends timesheets to the SAP ECC system. This integration operates in a synchronous manner: the Timesheet application waits for a response from SAP ECC after making a call, and message processing is considered complete only when the Timesheet application receives that response. The entire processing must finish within the threshold of 2 minutes, which can be extended to 4 minutes, but must not exceed that value.

Now, let’s consider a scenario where SAP ECC has a scheduled weekend downtime for 4 hours. If we choose to keep the integration flow (IFlow) deployed during this period, we can expect a significant increase in failures both in SAP CPI (Cloud Platform Integration) and the timesheet application because SAP ECC is unavailable to acknowledge the request and provide the response in return.

To mitigate this issue, the best course of action is to undeploy the IFlow. By doing so, the failures will be limited to the timesheet application side, while SAP CPI will be spared from the high number of failures. This approach minimizes the impact on SAP CPI and allows for smoother operation once the downtime concludes.

Solution:

Scheduled downtimes generally fall on weekends. To avoid disrupting your weekend plans, I have devised a mechanism that automates the deployment and undeployment of integrations based on scheduled downtimes, so you no longer have to dread working on weekends to handle them.

With this setup, integrations will be automatically undeployed before the scheduled downtime begins and redeployed once it concludes. This streamlines the process and eliminates the need for manual intervention during these periods, so you can enjoy your weekends without worrying about integration-related tasks.


IFlow Design:

This design is extremely simple, with only four components in it. Here we make use of the predefined OData APIs provided by SAP; you can check out all the APIs from this link.


IFlow 1: Automatic Undeployment of Integration Flow

The idea is to call the OData API “IntegrationRuntimeArtifacts” and send an email to the internal team when the operation has successfully concluded.

Triggering an email for deployment notifications is an optional feature. I have included email alerts as an additional measure to keep track of deployments: they provide an easy way to monitor the deployment process, since emails can be accessed from mobile devices.



Image 1

  • Set Timer : Here you will set the timer as per your downtime START time. This makes sure the IFlow is triggered at the right time and the impacted integration is undeployed before the start of the downtime.

If the downtime starts at 9:00 AM, I recommend you set the timer to 8:58 AM, so that the undeployment is completed well before the scheduled downtime.



Image 2

  • Content Modifier : We declare a property to store the ID of the integration flow in scope.

The IFlow name and the IFlow ID are two different things; the API accepts only the IFlow ID, so please make sure you pass the IFlow ID in the property “UnDeployiflow”. This property is set as a configurable parameter so that the IFlow in scope can be changed if needed.

Other properties, such as Downtime Start Time and Environment, are optional and are used only for the email alerts.

Image 3


  • Request Reply – OData V2 Adapter :

OData API used : IntegrationRuntimeArtifacts – DELETE operation

Address : https://{{YOUR CPI TENANT}}

Authentication : Basic (tenant’s P-user ID and password)


Image 4



Image 5


We will pass the value of the property “UnDeployiflow” in the DELETE operation, so that the API understands the request and undeploys that particular IFlow.

Once the call is completed, an email (optional) is triggered to the internal operations team stating that the IFlow has been successfully undeployed.

Now we have successfully undeployed the Integration Flow in scope.
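For readers who want to script the same undeploy step outside CPI, the call the OData adapter makes can be sketched directly against the API. The sketch below is a minimal illustration in Python; the tenant host, IFlow ID, and credentials are placeholders, and I am assuming the standard `/api/v1` base path of the Cloud Integration OData API. Note that on real tenants, write operations also require an X-CSRF-Token obtained via a prior GET; that handshake is omitted here for brevity.

```python
import base64
import urllib.request

API_PATH = "/api/v1"  # standard CPI OData API base path (assumption)

def undeploy_url(host: str, iflow_id: str) -> str:
    """Entity URL of a deployed runtime artifact; a DELETE on it undeploys
    the IFlow, mirroring the OData V2 adapter configuration above."""
    return f"https://{host}{API_PATH}/IntegrationRuntimeArtifacts('{iflow_id}')"

def undeploy(host: str, iflow_id: str, user: str, password: str) -> int:
    """Issue the DELETE call with Basic authentication; returns the HTTP
    status (202 Accepted means the undeploy was triggered)."""
    req = urllib.request.Request(undeploy_url(host, iflow_id), method="DELETE")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

For example, `undeploy_url("my-tenant.example.com", "TimesheetToECC")` (hypothetical host and IFlow ID) yields the same entity URL that the adapter builds from the “UnDeployiflow” property.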


The impacted integration should get redeployed once the downtime is concluded. For that, we have another IFlow which does this job; both IFlows look identical, as shown in Image 1, except for the OData API adapter configuration.


IFlow 2: Automatic Deployment of Integration Flow


  • Set Timer : You will set the timer as per your downtime END time. This makes sure the IFlow is triggered at the right time and the impacted integration is redeployed once the downtime is concluded.

If the downtime ends at 1:00 PM, I recommend you set the timer to 1:02 PM, so that the deployment is completed only after the scheduled downtime.


Image 8


  • Content Modifier : The property name is different in this IFlow, named “DeployIflow”.


Image 8


  • Request Reply – OData V2 Adapter :

OData API used : DeployIntegrationDesigntimeArtifact – function import operation

Query Options : Id=’${property.DeployIflow}’&Version=’active’

Version parameter : The version of the impacted integration is passed in this parameter. We use the value ‘active’ to deploy the previously active deployed version.


Image 9


Apart from this, all the details are the same as shown in Image 4.
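The deploy side is a function import rather than an entity DELETE, so the IFlow ID and version travel as query options instead of being part of the entity key. A minimal sketch under the same assumptions as before (placeholder host and IFlow ID, standard `/api/v1` base path, CSRF handshake omitted):

```python
import base64
import urllib.request

def deploy_url(host: str, iflow_id: str, version: str = "active") -> str:
    """Function-import URL; Version='active' redeploys the previously
    active deployed version, matching the query options shown above."""
    return (f"https://{host}/api/v1/DeployIntegrationDesigntimeArtifact"
            f"?Id='{iflow_id}'&Version='{version}'")

def deploy(host: str, iflow_id: str, user: str, password: str) -> int:
    """POST the function import with Basic auth; returns the HTTP status.
    As with the undeploy call, a CSRF-token handshake may be required on
    real tenants and is omitted here to keep the sketch short."""
    req = urllib.request.Request(deploy_url(host, iflow_id), method="POST")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```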


This will deploy the impacted integration, and an email will be triggered to the team stating that the IFlow has been successfully deployed.

I believe that this blog has provided valuable information and insights, ultimately simplifying the lives of integration teams during weekends. The implementation of this automatic process is expected to significantly benefit the team, making their tasks more manageable and efficient.


Thanks and Regards,
Punith Oswal



Comments
      Venkata Subbareddy Baasireddy

      Hello Punith,


      This is a good solution. Do we have any option to undeploy multiple integrations in a single go? I can see the UnDeployiflow and DeployIflow properties in your content modifiers allow a single IFlow name at a time. If it is a single IFlow, we can directly go to the started IFlows and undeploy it individually.




      Punith Oswal
      Blog Post Author

      Hello Venkata,


      As of now, the API call does not support multiple values. But to use this mechanism for multiple IFlows, we can make use of a splitter and send the IFlow ID values one by one for further processing, calling the API one after the other. With this approach we can undeploy and deploy multiple integrations.

      I have not shown that mechanism in this blog to avoid complexity.
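      To make the splitter idea concrete, here is a small sketch: the configurable parameter holds a comma-separated list of IFlow IDs, and each ID is handled one by one, much as a General Splitter would iterate over the entries. The helper names and the comma delimiter are assumptions for illustration, not part of the IFlow shown in the blog.

```python
def split_iflow_ids(configured_value: str) -> list[str]:
    """Split a comma-separated configurable parameter into individual
    IFlow IDs, mimicking what a splitter does with the payload."""
    return [part.strip() for part in configured_value.split(",") if part.strip()]

def undeploy_all(iflow_ids: list[str], undeploy_one) -> dict[str, bool]:
    """Call the single-IFlow (un)deploy API once per ID, collecting a
    success flag per IFlow. `undeploy_one` stands in for the API call."""
    return {iflow_id: undeploy_one(iflow_id) for iflow_id in iflow_ids}
```

      For example, `split_iflow_ids("FlowA, FlowB, FlowC")` yields the three IDs, which `undeploy_all` then processes one after the other.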

      On your point that a single IFlow can be deployed manually: yes, that is correct. But the blog focuses on easing the job of an integration team member during weekends; with this IFlow handling downtimes, we don't have to log into the system just to undeploy and deploy IFlows over the weekend.


      Hope I answered your question!



      Venkata Subbareddy Baasireddy

      Hello Punith,


      Splitting will be a good solution to handle multiple IFlows; otherwise we have to change the configurable value for each and every IFlow and deploy it.




      Punith Oswal
      Blog Post Author

      The idea is to avoid manual intervention, so deploying manually is not really an option. A splitter can help in dealing with multiple IFlows.




      Naresh Dasika

      Another excellent blog post !!

      The implementation of this solution has proven to be efficient in automating tasks during periods of downtime.