On 22 November I attended the International Integration Day, held by the VNSG at the SAP headquarters in Den Bosch, the Netherlands. Below is my view on the new and upcoming additions to Cloud Platform Integration presented during this day.
JMS adapter
The JMS adapter has been available since the end of June for Enterprise users. It is already being used actively in the community, as shown in the blogs about asynchronous messaging and dead letter handling. Even though the novelty has worn off, I think it's good to promote queue usage. Compared to file-based integration, there are clear advantages:
- Payload information: out of the box, JMS provides information about the messages on a queue, giving insight into the usage and performance of your integration
- Dead letter handling: sidetrack messages that cannot be processed and prevent node outages
- Persistence: the ability to retain and modify erroneous messages helps you resolve the exceptional cases in your integration and improve its quality
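The dead letter idea from the list above can be sketched outside CPI as well. The snippet below is a language-agnostic illustration in Python (the message shape and retry limit are my own assumptions, not CPI behaviour): failing messages are retried a few times and then sidetracked to a dead-letter queue instead of blocking the main queue.

```python
from collections import deque

MAX_RETRIES = 3  # assumed retry limit, purely for illustration

def process(msg):
    # Hypothetical processing step: fails on malformed payloads.
    if "payload" not in msg:
        raise ValueError("malformed message")
    return msg["payload"].upper()

def drain(queue, dead_letter):
    """Process every message; messages with exhausted retries go to the dead-letter queue."""
    results = []
    while queue:
        msg = queue.popleft()
        try:
            results.append(process(msg))
        except ValueError:
            msg["retries"] = msg.get("retries", 0) + 1
            if msg["retries"] >= MAX_RETRIES:
                dead_letter.append(msg)  # sidetrack instead of blocking the queue
            else:
                queue.append(msg)        # retry later
    return results

queue = deque([{"payload": "order-1"}, {"bad": True}, {"payload": "order-2"}])
dead_letter = deque()
print(drain(queue, dead_letter))  # ['ORDER-1', 'ORDER-2']
print(len(dead_letter))           # 1
```

The point of the pattern is exactly the "prevent node outage" advantage: one poison message ends up in the dead-letter queue for later inspection, while the healthy messages still get through.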
Regression test service
Automated testing is one of the most underdeveloped parts of integration solutions. I think this is a huge problem: integrations are expected to be fault tolerant and highly available in an ecosystem of interfaces and services that are likely to change without notice. SAP is taking the first steps to address this problem by providing a regression test service. You provide the iFlow and input/output messages, and SAP will create a test service that is called before each deployment.
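As I understand the announcement, such a service boils down to a golden-file comparison: replay the stored input messages and diff the results against the stored outputs. A minimal sketch of that idea (the `transform` function and the message pairs are hypothetical stand-ins, not the actual service):

```python
def transform(message: str) -> str:
    # Stand-in for the iFlow's mapping logic.
    return message.strip().lower()

def regression_test(cases):
    """Replay stored input messages; return (input, expected, actual) triples for mismatches."""
    failures = []
    for inp, expected in cases:
        actual = transform(inp)
        if actual != expected:
            failures.append((inp, expected, actual))
    return failures

cases = [("  Hello ", "hello"), ("WORLD", "world")]
print(regression_test(cases))  # [] -> no regressions, deployment may proceed
```

This also makes the maintenance question concrete: the moment a message format legitimately changes, every stored expected output has to be handed over again.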
I think this is a step in the right direction for improving automated testing. However, it's a bit cumbersome that you need to hand over your integration and messages: what happens when the messages change? And how much delay will this add to your project?
Ultimately, the right tool is a test suite in which we can develop and maintain test scenarios ourselves. In addition to regression tests, I would also like to see the following types of tests available:
- Integration testing: evaluate the quality of data in the receiving systems
- Unit tests: develop tests for individual data conversion steps
- Destructive testing: verify that failure cases are handled correctly
Message monitoring
An announcement was made that there will be improvements in message monitoring, with a focus on the input and output messages of an integration. Currently, the only way to view your message payload is by using logging scripts, as I described in my previous blog. The downside of this approach is that you pollute your integration flow with redundant logging components that obscure the primary integration objective.
What I would like is to indicate the points in the flow where a message should be monitored, and have those messages processed by a separate integration process. I would also like to be able to dynamically enable parts of the logging flow at run time. This would reduce the downtime of an integration when issues occur.
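The run-time toggle I have in mind could look like the sketch below. This is a generic illustration, not an existing CPI feature: each named log point checks a flag that can be flipped while the flow keeps running, so enabling payload logging does not require a redeployment.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("iflow")

# One flag per log point; hypothetically this would live in an externalized
# parameter that can be changed without redeploying the flow.
LOG_POINTS = {"before_mapping": False, "after_mapping": False}

def trace(point, payload):
    """Log the payload only when the log point is enabled; report whether it logged."""
    if LOG_POINTS.get(point):
        log.info("%s: %s", point, payload)
        return True
    return False

trace("before_mapping", "<order/>")   # silent: point is disabled
LOG_POINTS["before_mapping"] = True   # flipped at run time, no redeploy
trace("before_mapping", "<order/>")   # now logged
```

The design choice here is that the log points themselves stay permanently in the flow; only their activation is dynamic, which avoids the downtime of redeploying just to add logging.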
SAP also commented that they want to prevent heavy usage of the Message Processing Log. This is understandable, as computational time and storage are not unlimited while high availability must be guaranteed. I think this could also be improved in the log itself by enforcing retention based on a maximum volume of messages. I'm curious which direction SAP will take.
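Retention by volume could be as simple as a bounded buffer that evicts the oldest entries first. The sketch below shows the idea only; it is not how the Message Processing Log is actually implemented:

```python
from collections import deque

class BoundedLog:
    """Keeps at most max_entries log records, evicting the oldest first."""

    def __init__(self, max_entries):
        self.entries = deque(maxlen=max_entries)

    def add(self, record):
        self.entries.append(record)  # a full deque silently drops the oldest entry

    def __len__(self):
        return len(self.entries)

mpl = BoundedLog(max_entries=3)
for i in range(5):
    mpl.add(f"message {i}")
print(list(mpl.entries))  # ['message 2', 'message 3', 'message 4']
```

With a cap like this, storage stays predictable no matter how chatty the integrations become, at the cost of older entries disappearing first.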
All in all, it was an exciting day with lots of innovations in Cloud Integration. A variety of interesting subjects piqued my interest, and I will be exploring them further to satisfy my craving for knowledge and insight. I'm also curious to experience the new Cloud Platform Integration functionality and the impact it will have on future projects!