Splunk – Part 2: SAP CPI MPL Logging
In Part 1 of this Splunk blog, we saw how to use the Splunk HTTP Event Collector (HEC) and the standard JSON source type to log SAP API interactions to Splunk Cloud. You can follow the same steps to implement real-time JSON HTTP event logging from SAP CPI during an interface message exchange (from an Exception subprocess, for example). I won't cover this here, since it follows directly from Part 1: just replace the SAP APIM step with an HTTP Request/Reply call in SAP CPI.
In this Part 2, we will see how to bring the standard SAP CPI Message Processing Log (MPL), which in most cases carries enough runtime execution history, into Splunk Cloud and benefit from it.
Power of MPL
The SAP CPI Message Processing Log contains structured information about the processing of a message. Read here about the MPL properties and their descriptions. Certain properties are not set by default and need to be set from the interface, as listed below.
| Property | How to Set | Purpose |
| --- | --- | --- |
| CustomHeaderProperties | From Groovy script | Save values from the payload in addition to an Application ID |
| Id | From the SAP_ApplicationID header | Save values like the IDoc number or customer/BP number exchanged in that interface |
| MessageType | From the SAP_MessageType header | Business object of the interface |
| ReceiverId | From the SAP_Receiver header | Receiver application of the interface |
| SenderId | From the SAP_Sender header | Sender application of the interface |
These are optional, but enriching the MPL with these properties makes it complete.
// add a custom header property; the name and value here are illustrative,
// e.g. an order number extracted from the payload
def messageLog = messageLogFactory.getMessageLog(message);
if (messageLog != null) {
    messageLog.addCustomHeaderProperty("OrderNumber", orderNumber);
}
Solution
- Create a new Source Type in Splunk for SCPI MPL
- Create an Index in Splunk for SAP CPI
- Create an HEC in Splunk for SAP CPI
- Implement a Scheduled IFlow to extract MPL and log to Splunk
1 Splunk – Create Source Type
The source type controls how Splunk formats incoming data and indexes it with appropriate timestamps and event breaks, which makes the data easier to search later. Splunk ships with a large number of predefined source types. JSON is one of them and was used to log the API interaction in the previous blog; it expects a fixed structure, and the event timestamp is taken from the “time” name/value pair.
For the MPL, however, we will create a new source type so we can ingest the MPL's JSON representation into Splunk as-is and use the MPL LogStart value as the event timestamp.
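As a sketch, the custom source type could look like the following `props.conf` fragment. The source type name is an assumption, and `TIME_PREFIX`/`TIME_FORMAT` must be adjusted to the exact LogStart representation your tenant actually emits:

```ini
# props.conf sketch for a custom SAP CPI MPL source type (illustrative names)
[sap:cpi:mpl]
# Each MPL entry is one event; parse its fields as JSON at search time
KV_MODE = json
SHOULD_LINEMERGE = false
# Take the event timestamp from the LogStart field instead of ingest time
TIME_PREFIX = "LogStart"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 40
```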
2 Splunk – Create Index
3 Splunk – Create HTTP Event Collector
4 SAP CPI – IFlow to Extract MPL and Index to Splunk
1. Scheduler to run the IFlow periodically.
2. Read the last-run timestamp from a local variable and set it as the start time for the extraction; set the current timestamp as the end time.
3. Make a looping process call: server-side pagination limits results to 1000 entries per API call, so we loop through the result set.
4. Read the MPL from the OData API. URL: https://<tmn_host>/itspaces/odata/api/v1
5. Convert XML to JSON.
6. Extract the MPL array from the OData response and set it as the message body (Groovy script given below).
7. Set the Splunk HEC token as the Authorization header.
8. Call the Splunk raw event API with a channel identifier and event metadata as query parameters.
9. Overwrite the local variable with the current run timestamp from the property.
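For illustration, the paginated OData extraction described above could build its query URLs like this (a Python sketch; the host, the entity name `MessageProcessingLogs`, and the `LogEnd` filter window are assumptions to adapt to your tenant):

```python
# Sketch: build paginated OData query URLs for one MPL extraction window.
# Host, entity, and property names below are illustrative.
from urllib.parse import quote


def mpl_query_url(host, start_ts, end_ts, skip=0, top=1000):
    base = "https://%s/itspaces/odata/api/v1/MessageProcessingLogs" % host
    # Window the extraction between the last-run and current timestamps
    filt = "LogEnd ge datetime'%s' and LogEnd lt datetime'%s'" % (start_ts, end_ts)
    return "%s?$filter=%s&$top=%d&$skip=%d" % (base, quote(filt), top, skip)


# Page through the results 1000 entries at a time, as the looping
# process call does, until a call returns fewer rows than $top.
urls = [mpl_query_url("tmn.example.com",
                      "2020-01-01T00:00:00", "2020-01-02T00:00:00",
                      skip=page * 1000)
        for page in range(3)]
```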
import com.sap.gateway.ip.core.customdev.util.Message;
import groovy.json.*;

def Message processData(Message message) {
    // Parse the JSON body and keep only the MessageProcessingLog entries
    def body = message.getBody(java.io.Reader);
    def inputJSON = new JsonSlurper().parse(body);
    // Serialize the extracted array back as the new message body
    def builder = new JsonBuilder(inputJSON.MessageProcessingLog);
    message.setBody(builder.toString());
    return message;
}
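The raw-event call from the IFlow steps can be pictured as a single HTTP POST. The sketch below (in Python) only composes the request; the host, token, channel, index, and source-type values are placeholders, not values from this blog:

```python
# Sketch: compose a Splunk HEC raw-event request as the IFlow's HTTP
# receiver channel does. All host/token/channel values are placeholders.
import json
import urllib.request
from urllib.parse import urlencode

events = [{"MessageGuid": "abc-123", "Status": "COMPLETED"}]  # MPL entries

# Event metadata travels as query parameters on the raw endpoint;
# the channel identifier is required by HEC for raw ingestion.
params = urlencode({
    "channel": "0aa34dc2-5f6e-4c8f-b3a5-1f9e2d7c4b11",
    "sourcetype": "sap:cpi:mpl",
    "index": "sap_cpi",
})
url = "https://splunk.example.com:8088/services/collector/raw?" + params

req = urllib.request.Request(
    url,
    data=json.dumps(events).encode("utf-8"),
    headers={"Authorization": "Splunk 11111111-2222-3333-4444-555555555555"},
)
# urllib.request.urlopen(req) would send it; not executed in this sketch.
```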
Result
Each MPL entry is sent periodically as raw data to Splunk and indexed. Use these events to:
- Write reports and produce dashboards of interface runtime status.
- Search the MPL in Splunk for analysis; view results as a table and download them.
- Send alerts for events that meet a certain condition, such as failed messages or no message for a given interface today. Alerts can be delivered as email, a webhook call, a JIRA ticket, etc.
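For example, an alert on failed messages could be driven by a search along these lines (the index and source type names are assumptions based on the setup above; `Status` and `IntegrationFlowName` are standard MPL properties):

```
index=sap_cpi sourcetype="sap:cpi:mpl" Status="FAILED"
| stats count by IntegrationFlowName
```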
Once there is data, the result is purely dependent on the capability of the platform and our creativity in utilizing it 🙂
Hi Santosh,
Thank you for such a wonderful blog; I am following it. I am facing the below issue while posting data to the trial Splunk Cloud platform.
Error Details:
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target, cause: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Please suggest.
Regards,
Poornima.
Hi Santosh,
I tried the same thing, and we are on a Splunk trial account. I am unable to see the logs since it's failing with this error:
java.net.ConnectException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target, cause: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Here are the HTTP adapter connection details:
Please let me know where I may have gone wrong.
Thanks,
Ramya