
Handling Large Data with Content Enricher and OData v2 adapter

In this blog post you will learn how to better model a Content Enricher scenario in SAP Cloud Platform Integration. Almost every full-scale integration inevitably has one or more Enricher steps. Very often, the adapter connected to the Enricher pulls in all the available data from the backend system, causing crashes and performance degradation. The efficiency of the integration therefore greatly depends on careful modelling of the Enricher step, especially when large data volumes are involved.

In this blog post, we will model a simple integration flow that handles large data using dynamic queries built from an exchange property and the OData v2 $filter query option. In the example here, we will enrich the CompoundEmployee entity with FOLocation by creating a dynamic filter in the SuccessFactors OData v2 adapter.

The Scenario

I have configured a looping process (as seen below) to fetch and enrich CompoundEmployee records in pages. The script step “Parse keys to Properties” contains the logic for creating the dynamic filter.

Content Enricher configuration for this scenario is shown below.

Understanding Content Enricher Functionality

The Content Enricher works by matching the key elements from the Original Message (i.e. CompoundEmployee) to the key elements in the Lookup Message (i.e. FOLocation) and then intelligently aggregating the matching snippets of data from the Lookup Message into the Original Message.

We can always optimize the data fetched in the Lookup Message by passing the appropriate key elements as a $filter parameter to the adapter. In this example, we are comparing the key element location from CompoundEmployee to the key element externalCode, and therefore we will define and configure a dynamic filter on externalCode for optimization.
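For instance, if the Original Message contains the locations 1710-2018 and 1710-2019, the lookup request issued by the adapter would carry a filter along these lines (the service path is illustrative; the in operator is SuccessFactors-specific):

```
GET /odata/v2/FOLocation?$filter=externalCode in '1710-2018','1710-2019'
```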

Configuring the Content Enricher to work with Large Data Volumes

Step 1: Defining the script

The main step. Below is the script code for “Parse keys to Properties” that constructs the dynamic filter. The first part of the script parses the Original Message and selects all the key elements using XmlSlurper. The second part formats the key elements in the OData $filter syntax and sets the result in a property named leadKey.

Note that the property is always configured with a value, even an empty one. Otherwise, there would be a runtime failure in the OData v2 adapter if the property were missing.

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    //Body
    def body = message.getBody(java.lang.String) as String;

    //Parse the key elements from the Original Message
    def response = new XmlSlurper().parseText(body);
    def keys = response.'**'.findAll{ node -> node.name() == 'location' }*.text();

    //Drop duplicates and empty values
    keys = keys.unique().findAll{ it?.trim() };

    //Formulate the filter query. Eg: $filter=externalCode in '1710-2018'
    if(keys.isEmpty()){
        //Always set the property; a missing property fails at runtime in the adapter
        message.setProperty("leadKey", "");
    } else{
        def leadKey = "'" + keys.join("','") + "'";
        message.setProperty("leadKey", "\$filter=externalCode in " + leadKey);
    }
    return message;
}
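If you want to try the filter-construction logic outside CPI (where the Message class and the script context are not available), the same idea can be sketched standalone. Below is an illustrative Java version of the property-building step; the class and method names are my own choices, not part of the CPI script:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;

public class LeadKeyDemo {

    // Mirrors the Groovy logic: drop empties, deduplicate, and build the
    // OData $filter value for the leadKey property.
    static String buildLeadKey(List<String> keys) {
        List<String> unique = keys.stream()
                .filter(k -> k != null && !k.trim().isEmpty())
                .distinct()
                .collect(Collectors.toList());
        if (unique.isEmpty()) {
            // The property must still be set, just with an empty value
            return "";
        }
        return "$filter=externalCode in '" + String.join("','", unique) + "'";
    }

    public static void main(String[] args) {
        // Duplicate keys are collapsed into one filter value
        System.out.println(buildLeadKey(Arrays.asList("1710-2018", "1710-2019", "1710-2018")));
        // → $filter=externalCode in '1710-2018','1710-2019'
        System.out.println(buildLeadKey(Collections.emptyList()).isEmpty());
        // → true
    }
}
```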

 

Step 2: Configure the SuccessFactors OData v2 adapter

The simplest step. Add the property reference ${property.leadKey} to the Query Options field of the adapter. And your scenario is good to go!

Advantages of the approach

  1. Far fewer calls to the backend system.
  2. Less data, and more importantly only relevant data, pulled in each enrich call.
  3. Overall improvement in throughput and execution times.

Points to consider

  • Only elements defined as filterable can be used in a $filter query. A good design approach is to select only filterable fields as Lookup key elements. If the required key element is not filterable, you can instead consider a related filterable element to form the query.
<Property Name="externalCode" Type="Edm.String" Nullable="false" sap:filterable="true" sap:required="true" sap:creatable="false" sap:updatable="false" sap:upsertable="true" sap:visible="true" sap:sortable="true" MaxLength="32" sap:label="Code"/>
  • Adding dynamic queries increases your overall query length, and this can lead to issues. Servers usually have a limit on the permitted query length, typically defined as a power of two, i.e. 4096 (2^12) or 8192 (2^13) with SuccessFactors.
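Because a single in clause cannot grow without bound (the query-length limit above applies, and the server also caps the number of values per clause), one option is to split the key list into fixed-size batches and run the enrich lookup once per batch. A standalone Java sketch of such a batching helper; the batch size is a parameter you would tune to your server's limits:

```java
import java.util.ArrayList;
import java.util.List;

public class KeyBatcher {

    // Split the key list into consecutive batches of at most batchSize
    // elements, so each generated $filter stays within server limits.
    static List<List<String>> batchKeys(List<String> keys, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < keys.size(); i += batchSize) {
            batches.add(keys.subList(i, Math.min(i + batchSize, keys.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> keys = new ArrayList<>();
        for (int i = 0; i < 2500; i++) {
            keys.add("K" + i);
        }
        List<List<String>> batches = batchKeys(keys, 999);
        System.out.println(batches.size());        // → 3
        System.out.println(batches.get(2).size()); // → 502
    }
}
```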

The above steps might look a bit overwhelming but are relatively easy to perform. The script shared here should suit your use case with minimal changes. In general, dynamic parameters can greatly help improve the performance of your integration scenarios, and most adapters support dynamic injection of parameters. Do consider them in all your integration flow designs.

1 Comment
  • Hi Prasanth Rao,

    Very useful blog, thank you. One key point I would like to add here: we need to batch/split into groups of 999 documents, as the “in” filter supports only 999 values at a time.

    Thanks

    Nag