Technical Articles

Custom Alert Notification for failed iFlows in SAP CPI

Introduction:

In a project, especially in a structured landscape, the importance of SAP CPI (Cloud Platform Integration) cannot be overstated. A successful SAP CPI implementation serves as the cornerstone that holds the entire project together, acting as a bridge that links the two ends.

However, it is equally important to acknowledge that iFlows within SAP CPI can and do fail. As organisations rely heavily on smooth and seamless data exchange between applications, identifying and addressing any issues that arise during this process becomes critical to the overall success of the integration landscape. By proactively addressing these failures, businesses can enhance their operational effectiveness and pave the way for continuous improvement in their data integration endeavours.

Scenario :

This business scenario focuses on providing an automated solution for monitoring and reporting failed integration flows within the SAP CPI tenant. The primary requirement is to send an Excel report to the recipient three times a day. This report consolidates the data from three different time periods, giving the recipient a comprehensive view of the integration flow failures, which are also persisted in the HANA database.

Solution :

The optimal solution for this scenario is a tailored custom integration flow that sends the report of failed iFlows via email to the intended recipients and, additionally, posts this data to the SAP HANA database.

By designing a custom integration flow, we can ensure that the entire process is optimised to meet the unique requirements of the organisation. This tailored approach guarantees a smooth and reliable transfer of the report, allowing the organisation to receive timely notifications and insights regarding any failed iFlows within the SAP CPI ecosystem.

IFlow Design:

Image 1

  • Set Timer : Here you set the timer as per your required time intervals ( example : 4 times per day, say every 6 hours starting from 01:00, followed by 07:00, 13:00 and 19:00 )

Image 2

  • Content Modifier : We will declare headers to store the current date in three different formats.
  1. ${date:now:yyyy_MM_dd} : Included in the table name while creating the table in the HANA database.
  2. ${date:now:'T'HH:mm} : Used for routing the initial trigger, which creates the table.
  3. ${date:now:yyyy-MM-dd'T'00:00:00.000} : Used in the filter condition while fetching the message processing logs.
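For illustration, the header configuration might look like the sketch below; dbname and date are the names referenced later in this iFlow (in the table name and the OData filter), while time is an assumed name for the routing header:

```text
Name     Type         Value                                    Example result
dbname   Expression   ${date:now:yyyy_MM_dd}                   2023_08_01
time     Expression   ${date:now:'T'HH:mm}                     T01:00
date     Expression   ${date:now:yyyy-MM-dd'T'00:00:00.000}    2023-08-01T00:00:00.000
```

Note that Camel's ${date:now:...} expression takes a SimpleDateFormat pattern, so the literal T must be quoted as 'T'.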

Image 3

  • Router :
  1. Route 1 : When the iFlow is triggered initially ( example : triggered at 01:00 ), the message routes through Route 1.
  2. Route 2 : When the iFlow is triggered at the periodic intervals after the initial trigger ( example : at 07:00, 13:00 and 19:00 ), the message routes through Route 2.
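Assuming the routing header declared in the Content Modifier is named time (an assumption, as is the exact first trigger time), the router conditions could be expressed as:

```text
Route 1 (Expression) : ${header.time} = 'T01:00'
Route 2              : Default route
```

With this setup, only the first run of the day takes Route 1 to create the table, and every later run falls through to the default route.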

Image 4

Route 1 :

  • Content Modifier : Here, the XML SQL query that creates the table for the respective date is included in the Body section of the Content Modifier. (XML SQL Format)
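The exact query is shown in the screenshots below; as a rough sketch, a CREATE TABLE statement for a date-suffixed table could be embedded in the XML SQL format like this (the SQL_DML action and the column types are assumptions to adapt; the column names mirror the upsert statement shown later in this post):

```xml
<root>
  <StatementName>
    <stmt action="SQL_DML">
      <access>
        CREATE TABLE FAILED_IFLOWS_${header.dbname} (
          ID NVARCHAR(10) PRIMARY KEY,
          ITYPE NVARCHAR(50),
          INAME NVARCHAR(100),
          PACKAGEID NVARCHAR(100),
          LOGSTART NVARCHAR(50),
          LOGEND NVARCHAR(50),
          ORIGINCOMPONENTNAME NVARCHAR(100),
          PREVIOUSCOMPONENTNAME NVARCHAR(100),
          LOCALCOMPONENTNAME NVARCHAR(100),
          MESSAGEGUID NVARCHAR(50),
          ALTERNATEWEBLINK NVARCHAR(255),
          LOGLEVEL NVARCHAR(20),
          TRANSACTIONID NVARCHAR(50),
          CORRELATIONID NVARCHAR(50)
        )
      </access>
    </stmt>
  </StatementName>
</root>
```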

Image 5

Image 6

  • Request Reply JDBC Adapter : Connect to the HANA database using the JDBC adapter.

Image 7

Route 2 :

  • Request Reply OData V2 Adapter :

OData API used : Message Processing Logs

Prerequisites :

  1. Create a Process Integration Runtime service instance in the BTP cockpit with the API plan, include the MonitoringDataRead role, and then create the service key.

Image 8

Image 9

  2. Now use the data from the created service key to create the OAuth2 Client Credentials in Overview/Manage Security Material.
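For reference, a service key for the API plan typically exposes the OAuth details needed for the credentials artifact; the structure below is illustrative and all values are placeholders:

```json
{
  "oauth": {
    "clientid": "<client id>",
    "clientsecret": "<client secret>",
    "tokenurl": "<token url>",
    "url": "<api base url>"
  }
}
```

The clientid, clientsecret, and tokenurl go into the OAuth2 Client Credentials artifact, while the url is what you later place in the adapter's Address field.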

Image 10

In the Connection section of the OData adapter, use the url from the created service key in the Address field, and enter the name of the OAuth2 Client Credentials that was created in Overview/Manage Security Material.

Example: https://<url from service key>/api/v1

Image 11

Image 12

Now, in the Processing section of the OData adapter, configure the query as required. The snapshot below is provided for reference.

Operation Details : GET

Resource Path : MessageProcessingLogs

Image 13

Here, the ${header.date} expression is used in the filter to fetch the data for the present date.
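The exact filter is shown in the snapshot above; a typical shape, assuming only the failed messages of the current day are wanted, would be:

```text
$filter : Status eq 'FAILED' and LogEnd ge datetime'${header.date}'
```

The OData V2 datetime literal format matches the yyyy-MM-dd'T'00:00:00.000 pattern declared for the date header earlier.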

  • Message Mapping :

All XML nodes are mapped one-to-one to their respective target names, and an Slno (serial number) node is added for counting the number of records.

Image 14

Image 15

Image 16

  • Parallel Multicast : There are two branches; Branch 1 is used for mailing the report, and Branch 2 is used for updating the HANA database with the failed iFlow data.

Branch 1 :

  • XSLT Mapping : Here, XSLT mapping is used to convert the XML data into Excel-readable data.
  • Receiver Mail Adapter : Configure the mail adapter with the required credentials; in the Processing section of the mail adapter, enter the sender and recipient mail IDs along with a customised subject and mail body.
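The XSLT used in the mapping step above is project-specific; as a minimal sketch, assuming the mapped payload contains a repeating MessageProcessingLog element (the element and field names here are assumptions), records can be rendered as an HTML table that Excel opens:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/">
    <table border="1">
      <tr>
        <th>Slno</th><th>Name</th><th>Status</th><th>LogStart</th><th>LogEnd</th>
      </tr>
      <!-- One row per failed message -->
      <xsl:for-each select="//MessageProcessingLog">
        <tr>
          <td><xsl:value-of select="Slno"/></td>
          <td><xsl:value-of select="IntegrationFlowName"/></td>
          <td><xsl:value-of select="Status"/></td>
          <td><xsl:value-of select="LogStart"/></td>
          <td><xsl:value-of select="LogEnd"/></td>
        </tr>
      </xsl:for-each>
    </table>
  </xsl:template>
</xsl:stylesheet>
```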

Image 17

Also add an attachment with a name ( example : sheet.xls ), Mime-Type : Application/XML and Source : Body.

Image 18

Branch 2 :

  • Iterating Splitter : Here, the Iterating Splitter is used to split the multiple MessageProcessingLog records into separate individual messages.

Image 19

  • Content Modifier : We will declare properties to store the data from the message processing log.
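For illustration, each property can be filled via XPath from the split payload. The property names below match those used in the sample query that follows, while the XPath paths assume the split message's root element is MessageProcessingLog and are therefore assumptions:

```text
Name        Type    Value
Slno        XPath   /MessageProcessingLog/Slno
Type        XPath   /MessageProcessingLog/Type
Name        XPath   /MessageProcessingLog/Name
PackageId   XPath   /MessageProcessingLog/PackageId
LogStart    XPath   /MessageProcessingLog/LogStart
LogEnd      XPath   /MessageProcessingLog/LogEnd
```

The remaining fields (MessageGuid, AlternateWebLink, LogLevel, TransactionId, CorrelationId and the component names) follow the same pattern.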

Image 20

Image 21

The Body section of the Content Modifier includes the XML SQL query that updates the data in the HANA database table. Below is a sample query:

<root>
  <StatementName>
    <dbTableName action="UPDATE_INSERT">
      <table>FAILED_IFLOWS_${header.dbname}</table>
      <access>
        <ID>${property.Slno}</ID>
        <ITYPE>${property.Type}</ITYPE>
        <INAME>${property.Name}</INAME>
        <PACKAGEID>${property.PackageId}</PACKAGEID>
        <LOGSTART>${property.LogStart}</LOGSTART>
        <LOGEND>${property.LogEnd}</LOGEND>
        <ORIGINCOMPONENTNAME>${property.OriginComponentName}</ORIGINCOMPONENTNAME>
        <PREVIOUSCOMPONENTNAME>${property.PreviousComponentName}</PREVIOUSCOMPONENTNAME>
        <LOCALCOMPONENTNAME>${property.LocalComponentName}</LOCALCOMPONENTNAME>
        <MESSAGEGUID>${property.MessageGuid}</MESSAGEGUID>
        <ALTERNATEWEBLINK>${property.AlternateWebLink}</ALTERNATEWEBLINK>
        <LOGLEVEL>${property.LogLevel}</LOGLEVEL>
        <TRANSACTIONID>${property.TransactionId}</TRANSACTIONID>
        <CORRELATIONID>${property.CorrelationId}</CORRELATIONID>
      </access>
      <key1>
        <ID>${property.Slno}</ID>
      </key1>
    </dbTableName>
  </StatementName>
</root>

 

Image 22
  • Request Reply JDBC Adapter : Connect to the HANA database using the JDBC adapter.

Image 23

Results:

  • Mail :

Image 24

Image 25

  • HANA Database :

Image 26

In conclusion, this blog post has aimed to deliver the essential knowledge and insights behind a tailored solution: a custom integration flow with seamless integration into the SAP HANA database. The goal is to empower businesses to proactively monitor, report, and manage integration issues with efficiency and confidence.

 

Thanks and Regards,

Gagan H L
