Recently I needed to debug the standard execution flow of the data enrichment framework in MDG to troubleshoot an issue. The initial suspicion was that a custom enrichment built in the system wasn't working. However, I was fairly convinced that wasn't the problem and decided to find out what the real cause was.

The problem statement was that overall MDG performance had suddenly turned slow, and processing seemed to get stuck at the custom data enrichment.

So I started debugging from the point where the enrichment handed control back to the standard flow. I could see that the enrichment did its job well and the entities were getting updated correctly. Immediately afterwards came the standard validations that perform data checks on the entire change request.

The screenshot below shows the standard code that applies all the checks for the request step.

/wp-content/uploads/2015/01/1_633772.jpg

The selection was based on the following:

/wp-content/uploads/2015/01/3_633776.jpg

Based on the configuration, a duplicate check was in place. As I gradually proceeded with the debugging to narrow down the cause, I could see that the performance hit was occurring inside the duplicate check.

/wp-content/uploads/2015/01/4_633777.jpg
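The narrowing-down above can be sketched as timing each configured check in the request-step sequence to see which one eats the runtime. This is a conceptual Python illustration, not actual MDG code; the check names and structure are hypothetical.

```python
import time

def run_checks(change_request, checks):
    """Run the configured change-request checks in order, timing each
    one to locate a slow check. 'checks' is a list of (name, callable)
    pairs -- purely illustrative, not the real MDG check framework."""
    timings = {}
    for name, check in checks:
        start = time.perf_counter()
        check(change_request)                      # execute the check
        timings[name] = time.perf_counter() - start
    return timings
```

In the scenario described here, a timing table like this would have pointed straight at the duplicate check without stepping through every layer manually.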

Next round of debugging: step into the standard duplicate check to understand what is happening within it.

The standard performs the duplicate search based on the configured set of attributes.

/wp-content/uploads/2015/01/5_633778.jpg

So it prepares the search data and then accesses the enterprise search template set up for this data model.

/wp-content/uploads/2015/01/6_633791.jpg
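The data-preparation step can be sketched as assembling a search request from the configured match attributes and the template found for the data model. Attribute names, the template ID, and the request shape below are all illustrative assumptions, not the real ESH structures.

```python
def build_duplicate_query(entity_data, match_attributes, template_id):
    """Build a duplicate-search request from the configured match
    attributes. Only attributes present on the entity are included;
    the dict layout is a hypothetical stand-in for the ESH request."""
    criteria = {attr: entity_data[attr]
                for attr in match_attributes
                if attr in entity_data}
    return {"template": template_id, "criteria": criteria, "fuzzy": True}
```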

Once the template is obtained, it determines the correct connector ID for establishing the link to TREX via the connectors. (As you may know by now, the connectors are set up using the ESH Cockpit, which is itself a pretty cool feature.)

/wp-content/uploads/2015/01/7_633792.jpg
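Conceptually, this step is a lookup from the search template to its active connector, and a missing or inactive connector should surface as an explicit error rather than silently yielding an empty search. A minimal sketch, with hypothetical IDs:

```python
def resolve_connector(template_id, active_connectors):
    """Map an enterprise search template to its active connector ID.
    'active_connectors' stands in for the connector registry that the
    ESH Cockpit maintains -- names here are illustrative only."""
    try:
        return active_connectors[template_id]
    except KeyError:
        raise LookupError(
            f"no active connector for template {template_id}") from None
```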

So after doing all this 'magic', it accesses TREX and tries to return the results. Now, if TREX is not accessible, or the indices have data corruption issues, this call is going to fail and return no results; worse, it has the potential to 'hang' the system.
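The failure mode above is the classic one of a remote call without a guard: if the search host never answers, the whole change request waits on it. A conceptual Python sketch of a timeout-guarded call (the real call goes through the ESH/RFC layer, not raw sockets; host, port, and payload here are hypothetical):

```python
import socket

def search_trex(host, port, payload, timeout=2.0):
    """Send a search payload to an index server, failing fast if the
    host is unreachable. Returning None ('no results') instead of
    blocking indefinitely is what keeps the request step responsive."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as conn:
            conn.sendall(payload)
            return conn.recv(4096)
    except OSError:
        return None  # unreachable or broken backend: no results, no hang
```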

So it was a good learning experience to go through all these layers and finally pinpoint that the issue wasn't with any custom enhancement, but rather with some corrupt indices in TREX. Re-indexing them and recreating the connector resolved the issue.

So a lesson learnt: running the enterprise search test tool (transaction ESH_TEST_SEARCH) from the back end catches this kind of issue much faster than going through all the layers of debugging.

/wp-content/uploads/2015/01/10_633793.jpg

Hope this proves useful for all the techies out there who are interested in understanding the duplicate check in the standard logic flow.
