Previous – Content Filter | Index | Next – Normalizer


This week we'll study the pattern known as Claim Check.

When do I use this pattern?


The Content Filter pattern removes unnecessary elements or items from the payload. But what if that filtered-out information is required in later steps? The Claim Check pattern addresses exactly this. In the Claim Check pattern, the payload is stored in a persistent location before it is filtered, and the unique key needed to retrieve the stored payload is attached to the message, so that any later step requiring the filtered-out information can retrieve it.
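To make the mechanics concrete, here is a minimal, standalone sketch in plain Java. It is not CPI-specific: an in-memory map stands in for the persistent store, and the class and method names are purely illustrative.

import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the Claim Check idea: park the full payload under a
// unique key, let the message travel on with only the key (the "claim
// check"), and redeem the key later to recover the full payload.
public class ClaimCheckSketch {

    // Stand-in for the persistent store (the Data Store in CPI).
    static final Map<String, String> dataStore = new HashMap<>();

    // Store the full payload and return the claim-check key.
    static String checkIn(String orderId, String fullPayload) {
        dataStore.put(orderId, fullPayload);
        return orderId;
    }

    // Redeem the claim check to get the full payload back.
    static String checkOut(String claimCheck) {
        return dataStore.get(claimCheck);
    }

    public static void main(String[] args) {
        String fullOrder = "<Orders><Order><OrderID>10248</OrderID><Items>...</Items></Order></Orders>";

        // 1. Park the full payload before the Content Filter step.
        String claimCheck = checkIn("10248", fullOrder);

        // 2. Downstream steps work with the filtered payload plus the key.
        String filtered = "<Order><OrderID>10248</OrderID></Order>";
        System.out.println("Filtered message carries key " + claimCheck + ": " + filtered);

        // 3. A later step retrieves the full payload using the key.
        System.out.println("Restored payload: " + checkOut(claimCheck));
    }
}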

Claim Check in CPI


In CPI, Claim Check can be implemented using either the Data Store or the Multicast component.

Method 1: Data Store


Integration Flow



Claim Check using Data Store


The integration flow is similar to the one used in the Content Filter pattern, with the addition of the Claim Check steps.

Here, the integration flow starts immediately using a Timer Start Event, gets the order number 10248 from the OMS (Northwind) using the Command Message pattern, writes the Order to the Data Store, applies the Content Filter, maps the payload to IDoc format using the Message Translator pattern, and sends the IDoc to ECC. At this point, the payload contains the response from ECC. However, the reporting system wants the full payload containing the Order information. So, we retrieve the full Order stored earlier using the Get operation and send it to the Reporting system using the HTTP Receiver Adapter.

Here's the configuration of the Write operation:

Property | Value
Data Store Name | Orders
Visibility | Integration Flow
Entry ID | ${xpath./Orders/Order/OrderID}
Retention Threshold for Alerting (in d) | 2
Expiration Period (in d) | 90
Encrypt Stored Message | Checked
Overwrite Existing Message | Unchecked
Include Message Headers | Unchecked

Here's the configuration of the Get operation:

Property | Value
Data Store Name | Orders
Visibility | Integration Flow
Entry ID | ${xpath./Orders/Order/OrderID}
Delete On Completion | Checked
Throw Exception on Missing Entry | Checked

Here, the OrderID, the unique key for the order, also acts as the unique key (Entry ID) in the Data Store. Using the OrderID alone, the entire Order information can be retrieved from the Data Store.
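As a quick illustration of that key, here is a small sketch using the standard Java XML APIs showing what the Entry ID expression resolves to. The sample payload is a simplified, assumed shape of the Orders response, not the actual Northwind document.

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

// Shows what ${xpath./Orders/Order/OrderID} evaluates to for a
// simplified Orders payload (sample XML assumed for illustration).
public class EntryIdXPath {
    public static void main(String[] args) throws Exception {
        String payload =
            "<Orders><Order><OrderID>10248</OrderID><CustomerID>VINET</CustomerID></Order></Orders>";

        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(payload.getBytes(StandardCharsets.UTF_8)));

        // Same XPath as the Entry ID configuration: /Orders/Order/OrderID
        String entryId = XPathFactory.newInstance().newXPath()
            .evaluate("/Orders/Order/OrderID", doc);

        System.out.println("Data Store Entry ID: " + entryId); // prints 10248
    }
}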

Method 2: Sequential Multicast


Integration Flow



Claim Check using Multicast


Here, the integration flow starts immediately using a Timer Start Event and gets the Orders using the Command Message pattern. The Claim Check implementation is done in the Multicast alone. The Sequential Multicast causes the steps on the ERP branch to complete before it switches to the Reporting branch. When it switches to the Reporting branch, the payload as it was before the ERP branch started is restored.

In the figure Claim Check using Multicast, the numbers show the steps and the colours represent the content of the message; the same colour means exactly the same content. As we can see in the figure, step 6 (dark green message) happens after step 5 (light blue message); however, step 6 carries the same message as step 1 (dark green message).

This behaviour of Sequential Multicast is similar to how the Claim Check works when the entire payload is stored and retrieved.
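Here is a minimal sketch of that restore behaviour in plain Java. The branch logic is reduced to placeholder transformations, so it is purely illustrative and not actual CPI steps.

import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of why Sequential Multicast behaves like a claim check:
// each branch starts from the payload as it was when the Multicast
// step was reached, regardless of what earlier branches did to it.
public class SequentialMulticastSketch {

    static void runBranches(String payloadAtMulticast, List<UnaryOperator<String>> branches) {
        for (UnaryOperator<String> branch : branches) {
            // Every branch receives the original payload, not the previous branch's result.
            String result = branch.apply(payloadAtMulticast);
            System.out.println("Branch finished with: " + result);
        }
    }

    public static void main(String[] args) {
        String fullOrder = "<Orders><Order><OrderID>10248</OrderID><Items>...</Items></Order></Orders>";

        runBranches(fullOrder, List.of(
            // ERP branch: content filter + mapping turn the payload into an IDoc-like message.
            payload -> "<ORDERS05><OrderID>10248</OrderID></ORDERS05>",
            // Reporting branch: still sees the untouched full Order payload.
            payload -> "Sent to reporting: " + payload
        ));
    }
}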

Data Store vs Multicast

Criterion: Closeness to pattern explanation
Data Store: The Data Store implementation is much closer to the pattern as explained in the book. Therefore, any variation of the Claim Check pattern can be implemented using the Data Store.
Multicast: Multicast is a twist on the explanation. It behaves like a claim check only in cases where the entire payload would be stored and retrieved later.

Criterion: Compatibility with Content Enricher
Data Store: Using the workaround mentioned in the Enrich your content with ProcessDirect blog, it is possible to combine the Data Store implementation with Content Enricher.
Multicast: Multicast removes the need for a Content Enricher by providing the complete payload in each branch. In cases where only a few nodes need enriching, this can be counter-intuitive.

Criterion: Visibility
Data Store: As you may have noticed, Visibility can be set to 'Integration Flow' or 'Global' in the Data Store. Setting the visibility to Global means any Integration Flow with the correct key can access the contents of the payload.
Multicast: Multicast enables Global visibility through the use of ProcessDirect. However, note that ProcessDirect passes the message from one Integration Flow to another, leading to heavier communication between the two.

So, when should I use what?


From the above table, it does look like the Data Store is winning the battle. Here's my guideline:
Check the Multicast implementation first based on the points above. Remember that the Multicast implementation works well when the message stays in a single flow and the entire payload is required by all branches. In all other cases, use the Data Store implementation.

Please let me know what you think in the comments below. Would you weigh these two approaches on any other criteria?

References/Further Readings



Hope this helps,
Bala

Previous – Content Filter | Index | Next – Normalizer
