Deepankar Bhowmick

Enhanced tracing in SAP Cloud Integration (Tracer Plus)

Introduction

Tracing is a very important aspect of any integration tool, and SAP Cloud Integration is no exception. Currently there are a couple of distinct ways of implementing tracing in an IFLOW.

  • Using the persist message integration pattern. There is still no standard UI to view the stored messages, so a custom app needs to be built to consume the data store read API.

  • Viewing the payload at each step using the time-bound trace feature.

But quite honestly, given the flexibility the message processing log (MPL) offers, it would be very useful if we could somehow leverage it to view the payloads (or at least the important parts of them) without crossing the circuit breaker restrictions. Enter the simple, reusable IFLOW: Tracer Plus.

The reason I developed this IFLOW was a simple ask. When IFLOWs are running in production, even for successfully processed messages, the business or functional team often wants a few details about the payload that was transacted: tell me which employees were sent to WorkForce from SuccessFactors, tell me the IDoc number, and many requests along similar lines. The only way to give an accurate answer is to check the respective source or target system; we might need to look into SRT_MONI or the IDoc-related transactions, and that takes time. I wanted something that would let me give them this information quickly.

What is the Tracer Plus IFLOW?

This reusable IFLOW provides crucial information about the execution of another IFLOW in a payload-agnostic manner. It performs a deep scan on any XML and extracts the crucial information from it. However, there are two restrictions when using Tracer Plus:

  • This reusable IFLOW can only be employed in custom IFLOWs, standard IFLOWs that have been enhanced, and standard IFLOWs that offer exit handlers. It cannot be used to extract information from unmodified standard IFLOWs.

  • The payload fed into Tracer Plus must be XML, so a content conversion may be needed before invoking it.

But how does it know which elements within a given payload are crucial or important?

For that, we need to develop a vocabulary of elements for the project we are working on. Let me explain with an example. Let's assume we have 10 custom IFLOWs to be developed, ranging across the HCM, IS-U and FI domains. Each of the identified interfaces will exchange payloads, and we need to identify the elements within each of these payloads that we consider crucial or important. It could be person_id_external, national_id and business_unit for an HCM-related IFLOW; DOCNUM and ARCKEY for IDoc interfaces; or MATNR and BUKRS for some other interface. The only care to be taken is that no personally identifiable information (PII) or sensitive personal information (SPI) should be treated as crucial information.

Once we have defined the vocabulary, we can create a JSON document out of it and upload this information as an entry in the Binary-Parameters partner directory.

{
   "config": [
      {
         "interface": "i001",
         "element": [
            "DOCNUM",
            "ARCKEY"
         ]
      },
      {
         "interface": "i002",
         "element": [
            "person_id_external",
            "national_id",
            "business_unit",
            "company",
            "manager_person_id_external"
         ]
      }
   ]
}
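
To make the lookup concrete, here is a minimal sketch (my own illustration, not the actual XMLDeepScan code) of how such a configuration could be parsed into a map of interface ID to watched element names. It assumes the Jackson library is on the classpath; the class name is arbitrary.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TraceConfigParser {

   // Parses the vocabulary JSON into: interface ID -> list of watched element names.
   public static Map<String, List<String>> parse(String configJson) throws Exception {
      JsonNode root = new ObjectMapper().readTree(configJson);

      Map<String, List<String>> vocabulary = new HashMap<>();
      for (JsonNode entry : root.path("config")) {
         List<String> elements = new ArrayList<>();
         for (JsonNode element : entry.path("element")) {
            elements.add(element.asText());
         }
         vocabulary.put(entry.path("interface").asText(), elements);
      }
      return vocabulary;
   }
}

With a lookup like this, resolving the elements to watch is a single map access using the interface ID passed in the message header.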

We can keep expanding this vocabulary as and when new interfaces are developed, or add new elements to existing interfaces. Tracer Plus needs to receive two things in order to extract the details from a payload:

  • The payload in XML format.
  • The interface ID as a message header. All the elements configured against this interface ID will be searched for in the payload.

Tracer Plus performs a deep scan of the inbound XML and registers the unique values of the crucial elements as custom header properties in the message processing log. Since the full payload is not saved, the circuit breaker does not come into effect.


Tracer Plus IFLOW.

As we can see, the IFLOW itself is very simple; the main task happens within XMLDeepScan.jar.
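
The actual implementation is in the JAR linked under the GitHub section below; the following is only a minimal, self-contained sketch of the deep-scan idea: walk the whole XML document, collect the unique values of every watched element wherever it appears, and hand the result back so it can be written to the MPL. Class and method names are illustrative.

import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;

public class XmlDeepScanSketch {

   // Scans the payload for the configured element names and returns
   // element name -> unique values found anywhere in the document.
   public static Map<String, Set<String>> scan(String xmlPayload, List<String> watchedElements) throws Exception {
      DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
      factory.setNamespaceAware(true);
      Document doc = factory.newDocumentBuilder()
            .parse(new ByteArrayInputStream(xmlPayload.getBytes(StandardCharsets.UTF_8)));

      Map<String, Set<String>> result = new TreeMap<>();
      for (String elementName : watchedElements) {
         Set<String> values = new LinkedHashSet<>();
         // "*" matches the local element name in any namespace.
         NodeList nodes = doc.getElementsByTagNameNS("*", elementName);
         for (int i = 0; i < nodes.getLength(); i++) {
            String text = nodes.item(i).getTextContent().trim();
            if (!text.isEmpty()) {
               values.add(text);
            }
         }
         result.put(elementName, values);
      }
      return result;
   }
}

In the IFLOW script step, each entry of such a map could then be registered with something like messageLog.addCustomHeaderProperty(elementName, String.join(",", values)), which is what makes the values searchable in the message processing log without persisting the payload.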


Upload the JSON config in the Binary-Parameters partner directory

Check out my blog on how to manage the PD configuration on a multitenant SAP Cloud Integration landscape: Multitenant cloud integration partner directory manager.
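
For completeness, here is a rough sketch of how the JSON config could be pushed to the BinaryParameters entity of the Partner Directory OData API from outside the tenant. The host, credentials, Pid and Id values are placeholders, and depending on your tenant setup an X-CSRF-Token fetch or OAuth may be required instead of plain basic authentication; refer to the blog linked above for the fuller multitenant approach.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class UploadTracerPlusConfig {

   public static void main(String[] args) throws Exception {
      // Placeholder tenant host and credentials -- replace with your own values.
      String host = "https://<tenant>.it-cpi.cfapps.eu10.hana.ondemand.com";
      String user = "<api-user>";
      String password = "<api-password>";

      // The vocabulary JSON shown earlier, base64-encoded as required for binary parameters.
      String configJson = "{\"config\":[{\"interface\":\"i001\",\"element\":[\"DOCNUM\",\"ARCKEY\"]}]}";
      String encodedValue = Base64.getEncoder()
            .encodeToString(configJson.getBytes(StandardCharsets.UTF_8));

      // Pid and Id are illustrative; they identify the partner directory entry Tracer Plus reads.
      String body = "{"
            + "\"Pid\":\"TRACER_PLUS\","
            + "\"Id\":\"TRACE_CONFIG\","
            + "\"ContentType\":\"json\","
            + "\"Value\":\"" + encodedValue + "\""
            + "}";

      String auth = Base64.getEncoder()
            .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));

      HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(host + "/api/v1/BinaryParameters"))
            .header("Authorization", "Basic " + auth)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

      HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
      System.out.println(response.statusCode() + " " + response.body());
   }
}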

Demonstration

I will simulate a CompoundEmployee response, use the same configuration created above, and check the results.


Simulation by sending CompoundEmployee and the interface ID

 


Custom header properties in MPL

Here we can easily identify the employees that were part of the transaction without actually saving the payload.

GitHub

Here is the source code of the deployed JAR file, along with the artifact: Source Code + Artifact.

Conclusion

We have demonstrated how to apply custom tracing to an IFLOW using custom fields in the message processing logs. Feel free to engage in the comment section; any bugs or suggestions mentioned there are welcome, and I will patch them up in subsequent releases.

      3 Comments
      Rajesh Pasupula

      Nice approach in dealing with the use case.

      But I am wondering why you cannot leverage the custom header properties here, which allow capturing this information.

      As I see it, there is the overhead of maintaining additional header parameters like the interface ID for each interface, plus the maintenance of the log parameters/properties.

      Anyhow, I like the initiative of doing it differently and wish you good luck.

      Regards

      Rajesh Pasupula

      Deepankar Bhowmick (Blog Post Author)

      Hello Rajesh,

      The Deep Scan program does use custom header properties to store the information. There are two benefits of using this approach:

      1. Tracing is now managed centrally using a separate IFLOW, so all the logic is delegated to it.

      2. You can configure the fields that you want to watch/trace without redeploying the IFLOW. New interfaces and new fields can also be added without any downtime.

      And when you configure, you are not actually touching the IFLOW, just updating the Binary Parameters entry in the partner directory.

      Thanks

      Deepankar

      Shiva Prasad Narahari

      Hi Deepankar Bhowmick,

      Can you explain briefly about the partner directory and how to set up the JSON config in the Binary-Parameters partner directory?

      Got stuck at this point.

       

      Regards

      Shiva