Your SAP on Azure – Part 10 – B2B Integration
Using Azure Logic Apps as the integration service is a smart way to connect your organization with external trading partners. You don’t have to buy hardware or software licenses, so there are no upfront costs. It doesn’t matter how many messages you send or how much data is transmitted – all applications are highly scalable, and you pay only for what you use. In the last post, I showed that building a workflow is also easy. We have designed two applications that communicate with an SAP system, and today we are going to enhance the solution by establishing a cross-company integration. I will go through all the steps required to enable a secure communication channel that uses the AS2 protocol to send encrypted and signed messages based on certificates. On top of that, we will validate IDocs against a schema and modify them using an XML transformation. Azure Data Lake will be the storage for our message archive, and Key Vault will protect the private keys. As always, plenty of knowledge in one post!
ENTERPRISE INTEGRATION
The Integration Account is the heart of our solution. It allows you to store and manage the artifacts required to communicate with the external world, such as AS2 partnerships, schemas, maps and certificates. We link it with the logic app to enable the communication channel.
I’m going to deploy two Integration Accounts, one for sending and one for receiving messages.
In each Integration Account, I defined two partners with an AS2Identity qualifier.
Next, I configured agreements between the Sender and the Receiver. You don’t have to maintain the detailed settings now – we will use them later to enable encryption.
AS2 INTEGRATION
Based on the examples from the last post, I built two workflows with HTTP triggers. To process the IDocs, I use the Encode and Decode actions from the AS2 connector.
The encoded message from the SAPSend application is sent to the HTTP trigger of the SAPReceive workflow. Use a dynamic expression to convert the content of the message to a binary representation (base64ToBinary). The IDoc is then decoded by the AS2 connector and passed to the SAP Application Server.
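For reference, this is roughly what the decode step looks like in the code view of SAPReceive. The Content property on the trigger body and the action name come from my setup, and the connector path is easiest to copy from your own action’s code view – treat the fragment as a sketch rather than copy-paste material:

"Decode_AS2_message": {
  "type": "ApiConnection",
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['as2']['connectionId']" } },
    "method": "post",
    "path": "/decode",
    "headers": "@triggerOutputs()['headers']",
    "body": "@base64ToBinary(triggerBody()?['Content'])"
  },
  "runAfter": {}
}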
The application flow looks as follows:
The initial configuration is complete (I told you it’s easy!) and we can already test the connectivity by generating an IDoc in the sender system. Let’s analyze the process execution:
- The IDoc was generated in the sender system
- The message was received by the SAPSend logic app, encoded, and passed to the second application
- The SAPReceive workflow processed the message and sent the IDoc to the SAP system
- The IDoc was received by the SAP system
If you compare the times when the message was sent and received, you can see that the entire execution took around one second. Not bad.
MDNs
That wasn’t too difficult, was it? Let’s continue with the MDNs.
Usually, when you send a message through an AS2 channel, your trading partner requests a delivery confirmation called an MDN (Message Disposition Notification). The Logic Apps connector supports this requirement, and the implementation is again quite easy.
The first step is to verify that the partner indeed requested the MDN. I used a Condition action with parallel branches and selected MdnExpected as the parameter. If the confirmation is not required, the application simply answers with HTTP code 200.
You can send the MDN in synchronous or asynchronous mode. The difference is whether the confirmation is sent as a response to the original message or as a new request. I added another Condition action to verify the mode and send the response. The entire MDN logic should look as follows:
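Under the hood, the two conditions boil down to expressions like the ones below. The property path comes from the output of my Decode AS2 message action, and the MdnType name in particular is my assumption – verify both against your own run history:

MDN requested at all:
    equals(body('Decode_AS2_message')?['AS2Message']?['MdnExpected'], 'Expected')

Synchronous or asynchronous mode:
    equals(body('Decode_AS2_message')?['AS2Message']?['MdnType'], 'Synchronous')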
But how do we process MDNs that trading partners send to us? Basically, the delivery receipt is just another AS2 message. We can build a small application that processes such messages and saves them in Data Lake storage.
Provisioning the Azure resource takes a minute, and we can move forward to build the third application. I replace the special characters in the AS2 Message ID and use it as the filename.
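An AS2 Message ID usually looks like <some-id@host>, and characters like the angle brackets are not welcome in file names. A nested replace() expression does the trick – the AS2-Message-Id header name is standard, but check how your trigger exposes it:

replace(replace(replace(triggerOutputs()?['headers']?['AS2-Message-Id'], '<', ''), '>', ''), '@', '_')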
When a message from SAP is triggered and the AS2 transfer is completed, we can verify that we have received the MDN. Open Data Explorer in the Data Lake storage to display the files:
MESSAGE ENCRYPTION AND SIGNATURE
We should ensure that the data transmitted between the systems is secured so that no one else can access or modify it. The first step is to use an HTTPS endpoint, which is the default setting for Logic Apps. In addition, to guarantee data integrity, we can use message encryption and digital signatures. Both methods are supported in Azure and integrate nicely with the AS2 connector. Depending on your requirements, you can import a trusted certificate from an external CA or generate a self-signed one using Azure Key Vault.
To sign the messages, we will use the sender’s private key. In IA_SAPReceive, I import the public part of the certificate to allow signature verification. The message encryption is done using the trading partner’s public certificate (in our case, it’s SAPReceive). The table below summarizes which certificate goes where:
| Certificate usage | Certificate type | Certificate file | Integration Account | Workflow |
|---|---|---|---|---|
| Signature (outbound) | Own private key | SAPSend.pfx | IA_SAPSend | SAPSend |
| Signature verification (inbound) | Trading partner’s public key | SAPSend.cer | IA_SAPReceive | SAPReceive |
| Encryption (outbound) | Trading partner’s public key | SAPReceive.cer | IA_SAPSend | SAPSend |
| Decryption (inbound) | Own private key | SAPReceive.pfx | IA_SAPReceive | SAPReceive |
To manage the certificates and store the private keys, I’m using the Azure Key Vault service.
We’re going to create a new certificate with a private key, but you can also import an already existing one. Integration with an external CA is possible as well. Enter the created resource, select Certificates, and click Generate/Import. Fill in the required details. To allow message encryption, select Data Encipherment under Key Usage flags in the Advanced Policy Configuration.
Repeat the above step to create a certificate for the SAPReceive workflow.
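If you prefer scripting over clicking through the portal, the same settings can be expressed as a Key Vault certificate policy. This is a minimal sketch for use with az keyvault certificate create; the subject name is just my example, and you would repeat it with a second subject for SAPReceive:

{
  "issuerParameters": { "name": "Self" },
  "keyProperties": {
    "exportable": true,
    "keyType": "RSA",
    "keySize": 2048,
    "reuseKey": false
  },
  "secretProperties": { "contentType": "application/x-pkcs12" },
  "x509CertificateProperties": {
    "subject": "CN=SAPSend",
    "keyUsage": [ "digitalSignature", "keyEncipherment", "dataEncipherment" ],
    "validityInMonths": 12
  }
}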
Open each certificate and download the public key in the CER format. You can now enter the Integration Account and import the certificates there (see the table above).
Finally, we need to modify the AS2 agreements to request message signing and encryption. For SAPSend, fill in the Send settings; for SAPReceive, focus on the Receive settings.
Let’s send a sample message and verify the results on the SAPReceive side:
XML SCHEMA CHECK
Before we post an IDoc to the SAP system, we should verify whether the message follows the agreed format. For example, we can check that the message contains all the required fields.
You can easily generate the schema file for IDocs defined in an SAP system. Go to transaction WE60 and select XML Schema from the menu. Save the result to a file.
Import the file to the Integration Account associated with the SAPReceive workflow.
The schema validation requires an additional building block in the logic app. Remember that the AS2 content property is base64-encoded, so we need to decode it before passing the data to the XML Validation action:
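In my workflow, the content passed to XML Validation is an expression along these lines – the action and property names come from my setup, so adjust them to yours:

xml(base64ToString(body('Decode_AS2_message')?['AS2Message']?['Content']))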
To validate the process, I added a new field MESSAGE_TYPE to the DC40 segment and triggered the message. The XML Validation failed, and the error can be found in the output properties:
XML TRANSFORMATIONS
Very often, the agreed format of the message doesn’t fully reflect the organization’s requirements. In extreme cases, you can receive a CSV file that needs to be translated into an IDoc. You can do that within Logic Apps; however, in this post I would like to start with something simpler and show you how to add a new field, TEST_RUN, to each received message. I created the following XSLT map:
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output omit-xml-declaration="no" indent="yes"/>
  <xsl:strip-space elements="*"/>

  <!-- Identity template: copy every node and attribute unchanged -->
  <xsl:template match="node()|@*" name="ident">
    <xsl:copy>
      <xsl:apply-templates select="node()|@*"/>
    </xsl:copy>
  </xsl:template>

  <!-- Insert TEST_RUN when the segment doesn't contain the element at all -->
  <xsl:template match="/FLIGHTBOOKING_CREATEFROMDAT01/IDOC/E1SBO_CRE[not(TEST_RUN)]">
    <xsl:copy>
      <xsl:apply-templates select="@*"/>
      <TEST_RUN>X</TEST_RUN>
      <xsl:apply-templates select="node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- Fill TEST_RUN with 'X' when the element exists but is empty -->
  <xsl:template match="/FLIGHTBOOKING_CREATEFROMDAT01/IDOC/E1SBO_CRE/TEST_RUN[not(text())]">
    <xsl:copy>
      <xsl:apply-templates select="@*"/>
      <xsl:text>X</xsl:text>
      <xsl:apply-templates select="node()"/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
And I imported it to the receiver Integration Account:
As with the XML Validation, the transformation requires an additional action in the message flow:
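In the code view, the Transform XML step is a built-in action of type Xslt that references the map by the name under which it was imported to the Integration Account. The map name and the action names below are my own, so treat them as placeholders:

"Transform_XML": {
  "type": "Xslt",
  "inputs": {
    "content": "@xml(base64ToString(body('Decode_AS2_message')?['AS2Message']?['Content']))",
    "integrationAccount": {
      "map": { "name": "Add_TEST_RUN" }
    }
  },
  "runAfter": { "XML_Validation": [ "Succeeded" ] }
}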
You can trigger a test message to see if it works correctly:
The message was processed successfully, and the new field was inserted. However, you may notice that the IDoc received by the SAP system still doesn’t contain the new element. That’s because the HTTP action still references the AS2 property. We need to modify it to include the output of the XML transformation:
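In other words, the Body of the HTTP action should point at the transform output instead of the AS2 content, for example:

body('Transform_XML')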
Now, the transformed XML is delivered to your system.
MESSAGES ARCHIVE
Our solution is almost complete. The last thing I would add is an archive of all sent and received files. As we did for the MDNs, we can use Data Lake storage to archive the IDocs.
In the SAPSend workflow, I add a new action just before the Encode AS2 Message:
Similarly, I edit the SAPReceive workflow to save the decoded files:
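In both archive actions, I build the file path from a timestamp so that subsequent runs don’t overwrite each other. This is just one simple option, with folder names of my own choosing; the file content is the same decoded payload that goes to the XML Validation action:

concat('/archive/received/', utcNow('yyyyMMddHHmmssfff'), '.xml')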
You can check in the Storage Explorer that all sent and received messages are stored in the configured directories.
Building simple applications that integrate with your SAP solution is easy using Azure Logic Apps. Built-in connectors let us design workflows almost like building a tower out of LEGO bricks. It’s a good alternative to SAP Cloud Integration, and you can use it for on-premises and cloud systems. What I’m missing are pre-configured integration packages – each time, you have to build the application from scratch. Also, for people who are not familiar with XSLT syntax, creating schema mappings may be difficult at the beginning. But looking at the speed of innovation in Azure, we won’t have to wait too long for new features!