
A Serverless Extension Story – From ABAP to Azure

Introduction

The topic of extensions, especially side-by-side extensions, is quite hot in the SAP ecosystem. The options available for such extensions at SAP are growing, e.g. via the extension factory or ABAP Steampunk, and so are your paths to making the intelligent enterprise a reality. However, the cloud ecosystem is diverse, and you are not restricted to SAP’s offering for building extensions. I will not discuss extensions and how to build them from a “philosophical” perspective in this blog. Instead, I want to show you on a technical level what is possible with Microsoft Azure, focusing on its serverless offering, without any touchpoints to SAP except for the ABAP stack.

In case you are interested in the “philosophical” discussion, I want to draw your attention to this blog on LinkedIn: https://www.linkedin.com/pulse/art-extension-sap-universe-christian-lechner/. The blog is based on my talks at the SAP Inside Track Munich (https://www.youtube.com/watch?v=ivwLmDbBkB8) and Hamburg (https://www.youtube.com/watch?v=rICzyZb7Sig), where I share my opinion on that, too.

So, without further ado, let us dive into a serverless extension of SAP via Microsoft Azure.

Scenario

Let us assume that a purchase order is processed in an ABAP-based system. This process should be extended via a side-by-side extension. The extension should execute a dunning check for the business partner and send out an email if the customer has reached a specific dunning level. Technically, the ABAP system should send a message to the cloud whenever a purchase order is processed. A side-by-side extension should subscribe to that message and fetch additional data from an SAP S/4HANA system, i.e. the current dunning level of the customer. Depending on the dunning level, the extension should publish another message. This message is consumed by another extension, which sends out an email to a person in charge of the further manual check.

As mentioned before, I want to focus on the technical implementation. The business story is certainly something that can be discussed but maybe this triggers some ideas when you see what is possible.

Let us get to the technological options that we have in Microsoft Azure and the Microsoft open source ecosystem to make the scenario a reality. As mentioned above, we want to stick to serverless as far as possible (spoiler: it is possible all the way). So, what do we need in detail?

  • We need the capability to talk from ABAP to Microsoft Azure, i.e. to the Azure Service Bus, to publish a message. Microsoft offers the ABAP SDK for Azure as an open source project to achieve this.
  • We need functionality that can subscribe to the Service Bus and process the message. We use Azure Functions for that.
  • As we need some additional information from an SAP S/4HANA system, we use the SAP Cloud SDK for the interaction with the ABAP backend and publish a message to the Service Bus if a certain dunning level is reached. In the demo we interact with an SAP S/4HANA Cloud system (i.e. the sandbox behind the SAP API Business Hub). Consequently, we need to store the corresponding credentials securely. This is where Azure Key Vault comes into play.
  • Last but not least, we want to send an email in case a message was published that the customer has to be checked. We use Azure Logic Apps, a serverless low-code service of Microsoft Azure, to achieve this. To send out the email we use the Gmail connector of Logic Apps.
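Before we start building, it helps to pin down the two message payloads that travel over the Service Bus. A minimal sketch in TypeScript — the field names here are my own choice for this demo, not a fixed contract:

```typescript
// Hypothetical shape of the message the ABAP system publishes
// to the "purchaseorder" topic. Field names are demo assumptions.
interface PurchaseOrderMessage {
  customerId: string; // business partner / customer number from the ABAP system
  companyId: string;  // company code the purchase order belongs to
}

// Hypothetical shape of the enriched message the Azure Function
// publishes to the "dunningwarning" topic.
interface DunningWarningMessage {
  customerId: string;
  companyId: string;
  dunningLevel: number; // as read from SAP S/4HANA
  customerName: string; // enriched from the BusinessPartner service
}

// Small helper for the sender side to keep the JSON payload consistent.
function buildPurchaseOrderMessage(customerId: string, companyId: string): string {
  const msg: PurchaseOrderMessage = { customerId, companyId };
  return JSON.stringify(msg);
}
```

Agreeing on such shapes up front makes the later binding and parsing steps much less error-prone.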

The overall scenario looks like this:

In the following sections, we will walk through the steps necessary to implement the scenario. Having said that, this blog is not a hands-on tutorial; the Microsoft documentation is too good for me to repeat it here. However, I will mention the positive and not-so-positive highlights that I came across while developing the demo scenario, to safeguard you from certain pitfalls.

Disclaimer

Before I start: I am quite aware that the quality of the code is not bullet-proof production ready. There is a lot of room for improvement, so take it with a big grain of salt 😊

Step 1 – Setting up the Azure Service Bus

First, we need to have the messaging service in place. For that, I created one resource group called “AzureandSAPRG” for all resources used in this demo. This way I can delete all relevant resources for the demo with one click if needed.

The first resource that I created was the “central hub” of the demo, namely the Service Bus. I did nothing special there: I created the resource and added two topics (not queues) with the default configuration:

The “purchaseorder” topic is the one used for the messages coming from ABAP, while “dunningwarning” is the one used for the messages from the Azure Function. Beneath each topic, I added a subscription for the message receivers:

In order to restrict the access from the different components, I created two shared access policies on the Service Bus level, one restricted to reading messages and one to writing:
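Under the hood, clients like the ABAP SDK authenticate against such a policy with a SAS token: an HMAC-SHA256 signature over the URL-encoded resource URI plus an expiry timestamp. A minimal sketch of the token generation (namespace, policy name, and key are placeholders for your own setup):

```typescript
import { createHmac } from "node:crypto";

// Builds an Azure Service Bus SAS token for a given shared access policy.
// resourceUri, keyName and key are placeholders for your namespace/policy.
function createSasToken(resourceUri: string, keyName: string, key: string, ttlSeconds = 3600): string {
  const encodedUri = encodeURIComponent(resourceUri);
  const expiry = Math.floor(Date.now() / 1000) + ttlSeconds;
  // The documented string-to-sign is the encoded URI, a newline, and the expiry.
  const stringToSign = `${encodedUri}\n${expiry}`;
  const signature = createHmac("sha256", key).update(stringToSign).digest("base64");
  return `SharedAccessSignature sr=${encodedUri}&sig=${encodeURIComponent(signature)}&se=${expiry}&skn=${keyName}`;
}

// The token goes into the Authorization header of a plain HTTPS POST to
// https://<namespace>.servicebus.windows.net/<topic>/messages
const token = createSasToken("https://myns.servicebus.windows.net/purchaseorder", "SendOnlyPolicy", "dummy-key");
```

This is essentially what the ABAP SDK does for you when it publishes a message via REST.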

Step 2 – Calling from ABAP to Azure

Next, we need to make ABAP talk to Azure. To achieve that, we make use of the ABAP SDK for Azure. You need to install the SDK on the ABAP system that should send out the message to the Service Bus. You achieve this via abapGit, as described in the overview document.

To make it work, you must set up some infrastructure, like defining the HTTP connection to Azure, and put the values for connecting to the Service Bus into the customizing delivered with the SDK. There is no step-by-step guide for connecting to the Service Bus, but you can follow along with the implementation guide for the Event Hub and combine it with the information available in the header of the demo report for the Service Bus.

I also used that demo report to send the message, i.e. I copied it and made some changes towards modern ABAP. From a functional perspective, I put the customer number as well as the company ID into the message transferred to the Azure Service Bus:

The only technical change that I would highly recommend is to exchange the creation of the JSON string. While the report uses the outdated TREX functionality, I switched to the class /UI2/CL_JSON. This makes your life in Azure easier, since this utility class creates consistent JSON data, whereas TREX has some shortcomings.
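Even so, on the Azure side it pays off to read the incoming payload defensively, since the field casing emitted by the ABAP serializer depends on its configuration (e.g. pretty-name settings). A small sketch of such a tolerant lookup — the field names are again my demo assumptions:

```typescript
// Defensive extraction of a field from the deserialized Service Bus message.
// /UI2/CL_JSON can emit different casings (CUSTOMER_ID, customerId, ...)
// depending on its settings, so we probe a few candidate names in order.
function readField(msg: Record<string, unknown>, ...candidates: string[]): string | undefined {
  for (const name of candidates) {
    const value = msg[name];
    if (typeof value === "string" && value.length > 0) return value;
  }
  return undefined;
}

// Example payload as it might arrive from the ABAP side.
const sample: Record<string, unknown> = { CUSTOMER_ID: "1003764", COMPANY_ID: "1010" };
const customerId = readField(sample, "customerId", "CUSTOMER_ID", "customer_id");
```

A few lines like this spare you from chasing mysterious `undefined` values in the function later on.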

Summarizing this step: you basically fill in some configuration, adapt one report, and you are good to go to send a message from ABAP to Azure. The first step of our extension journey is finished.

Remark: Entering data into the configuration of the SDK is far from comfortable. The code under the hood of the SDK also has quite some room for improvement. Unfortunately, there is no public channel on the further development and improvement of the SDK, and the speed of merging pull requests and closing issues is not that fast. Maybe it is time for a fork …

Step 3 – Service Bus Topic, Azure Function and the SAP Cloud SDK

As everything is in place to send a message to Microsoft Azure, we now must create something that subscribes to those messages. For that, we create an Azure Function App with a function in it. The basics of how to do that are extensively described in the Microsoft documentation (for local development as well as for development in the Azure Portal), so I will not go into every detail.

I created the Function App, which serves as a bracket for several functions, in the portal and then created my function in my beloved local development environment, aka Visual Studio Code with the Azure Functions extension. I used the guided procedure to create a function that is triggered by an Azure Service Bus message. In detail, I subscribe to the “purchaseorder” topic in the Service Bus and the subscription “DunningCheckSubscription” underneath that topic. This already creates a baseline for further development, as shown in the screenshot:

The language that I used is TypeScript. Defining the Azure Service Bus subscription as trigger has the convenient side effect that I get the customer number as well as the company number from the message automatically transferred into my function. This manifests itself in the “function.json” file as an inbound binding that can be used within the function code via its name (in this case “mySBMsg”):
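For orientation, the trigger binding in “function.json” looks roughly like this (the connection setting name is a placeholder from my demo setup):

```json
{
  "bindings": [
    {
      "name": "mySBMsg",
      "type": "serviceBusTrigger",
      "direction": "in",
      "topicName": "purchaseorder",
      "subscriptionName": "DunningCheckSubscription",
      "connection": "SERVICEBUS_CONNECTION"
    }
  ]
}
```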

The code in the function is quite basic:

  • I fetch additional information with respect to the dunning history of the customer from an SAP S/4HANA system (in my case from the sandbox system of the API Business Hub)
  • Then I apply basic machine learning i.e. an IF-ELSE statement 😊
    If the dunning level is 1, the function fetches some more data about the business partner and sends out a message with that data to the Service Bus, using the “dunningwarning” topic. Otherwise, no follow-up action is triggered.
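That “machine learning” boils down to a decision that is easy to isolate and test on its own. A sketch of the branching — the threshold of level 1 is simply what I picked for the demo, not a business rule:

```typescript
// Possible follow-up actions of the dunning check.
type FollowUpAction = "publish-dunning-warning" | "none";

// Decide the follow-up based on the dunning level read from SAP S/4HANA.
// Level 1 triggers enrichment plus a message on the "dunningwarning" topic;
// anything else ends the flow silently (demo choice, not a business rule).
function decideFollowUp(dunningLevel: number): FollowUpAction {
  return dunningLevel === 1 ? "publish-dunning-warning" : "none";
}
```

Keeping this decision in a pure function also makes it trivial to swap in something smarter later.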

In order to connect to the SAP S/4HANA system, I use the SAP Cloud SDK and import the relevant modules, namely “CustomerDunning” and “BusinessPartner”, from “@sap/cloud-sdk-vdm-business-partner-service”. This provides a fluent API for retrieving the data.

You can see that the API key and the URL are retrieved via environment variables. I will talk about that later.

As I use TypeScript I must build the project. The configuration for that is stored in the “tsconfig.json” file. You must extend the predefined configuration with the option:

"esModuleInterop": true

This is necessary due to the dependencies coming in with the SAP Cloud SDK.
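Putting it together, a minimal “tsconfig.json” for the function project could then look like this (the other options are taken from the default Azure Functions TypeScript template):

```json
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es6",
    "outDir": "dist",
    "rootDir": ".",
    "sourceMap": true,
    "strict": false,
    "esModuleInterop": true
  }
}
```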

To give you an idea how many lines of code are necessary to achieve all the above said, here is what my code looks like (be aware of my disclaimer):

Wait a moment – how do we get the message back out to the Service Bus? The magic behind that is the bindings that you define in Azure Functions. Roughly speaking, you can wire up many Azure services with the function, using them like in- or output parameters. In my scenario, I want to push a message to the Service Bus, so I defined an output binding in the “function.json” file:
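The output binding entry could look like this (the binding name “outputSbMsg” and the connection setting name are my own choices for the demo):

```json
{
  "name": "outputSbMsg",
  "type": "serviceBus",
  "direction": "out",
  "topicName": "dunningwarning",
  "connection": "SERVICEBUS_SEND_CONNECTION"
}
```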

Code-wise I put data into that parameter and that is it:
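In code, this is nothing more than an assignment to the binding name on the function context. A sketch with a reduced, hypothetical context type and payload:

```typescript
// Minimal shape of the Azure Functions context we rely on here.
interface FunctionContext {
  bindings: Record<string, unknown>;
}

// Writing the payload to the output binding "outputSbMsg" (the name must
// match the entry in function.json) is all that is needed to publish it.
function publishDunningWarning(context: FunctionContext, payload: object): void {
  context.bindings.outputSbMsg = payload;
}

// Demo invocation with a stand-in context object.
const ctx: FunctionContext = { bindings: {} };
publishDunningWarning(ctx, { customerId: "1003764", dunningLevel: 1 });
```

Because the context is just an object, this part is also easy to exercise in a unit test with a fake context.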

Remark: In some Azure documentation you will find that you must call “context.done()” at the end to transfer the data. This is not necessary if you use an async function, as in that case the “context.done()” callback is handled implicitly by the Azure Functions framework.

Quite cool, but we want to test that, right? As we developed locally, does this mean that we must mock up the resources from the bindings? Well, no: we can go hybrid, use the resources in Microsoft Azure (in our case the Service Bus), and run the function locally. How cool is that! One thing you should take care of in this setup: the configuration of a function is taken from the Function App configuration in Azure, but for local execution the file “local.settings.json” is used instead. So, the next question is: do I have to manually copy all the settings from Azure to this file? Again no, the Azure CLI and the Azure Functions Core Tools support you with that. You log on to Azure via the Azure CLI and then copy the settings over into your local project via

func azure functionapp fetch-app-settings <name>

So, everything is set up to send a message from ABAP to Azure and use that message as a trigger for an Azure Function. However, the function is running on your local machine. To finish the development, you must deploy the function to your Azure Function App. The Azure Functions plugin in Visual Studio Code delivers that functionality:

In theory, this should create the relevant artefacts in your Azure Function App. However, here I struggled with some points that I want to draw your attention to. Although Visual Studio Code tells you that the function was successfully deployed, I came across the two following issues (not always, but sometimes … no idea why):

  • The function was deployed, and the status of the function was active. However, when calling the function, it returned an HTTP 503 and told me that the function was down for maintenance. What …?
    The reason is that the upload causes Azure to automatically create a “Down for Maintenance” file in your Function App. This should be removed when the upload is finished. Sometimes this is not the case, so you must remove it manually to get things working again. This is described e.g. here: https://stackoverflow.com/questions/53850105/problem-using-azure-functions-and-azure-cosmos-db-returning-down-for-maintenan
  • The function was deployed but could not be started, due to an error that the extension for the Service Bus binding was not installed. As background: the bindings are managed via extensions that get installed on demand, e.g. when you create a function with a specific binding in the Azure portal. In my case, the cleanup after deployment was a bit too motivated, it seemed, and purged the extension. This can be fixed manually by reinstalling it, as described here: https://github.com/Azure/azure-functions-host/wiki/Updating-your-function-app-extensions.

So not everything is rainbows and unicorns when deploying the function, but the ecosystem around Azure Functions is vibrant, and you usually find a fix.

One further point I want to mention: as we call out to an SAP S/4HANA system, we need to store the connection data, i.e. the URL of the API and the API key. While it is fine to store the URL in the Function App configuration, you probably want to store the API key, or any other kind of credentials, in a more secure place. This is where I used Azure Key Vault as the right service for the problem. The good thing is that no code changes are necessary to access the secrets stored there, as you can simply reference the stored key in your Function App configuration by pointing to Azure Key Vault:
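The app setting then holds only a reference in the Key Vault reference syntax, roughly like this (vault name, secret name, and version are placeholders):

```json
{
  "S4_API_KEY": "@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/s4-api-key/<secret-version>)"
}
```

At runtime, App Service resolves the reference and the function reads the secret from the environment variable as if it were a plain setting.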

For more details on that see: https://azure.microsoft.com/en-us/blog/simplifying-security-for-serverless-and-web-apps-with-azure-functions-and-app-service/

So far, we are at the point where we publish another message to the Service Bus that can be consumed by another subscriber. To show you what else is possible with the serverless offerings in Azure, I decided to use another service, namely Azure Logic Apps, to develop the last piece of my scenario.

Step 4 – Service Bus, Azure Logic App and Gmail

Azure Logic Apps is a serverless offering in Azure. In contrast to Azure Functions, it is a low-code approach, so you work within a graphical designer. Your process flow is made up of single actions that you can configure. Here is my simple flow:

Be assured, there are A LOT of actions available to get things going in a Logic App:

My simple logic goes like this: as a first step, I created a trigger for my Logic App by registering it to the “dunningwarning” topic and the “DunningWarning” subscription:

After that, I transformed the message from the Service Bus and mapped it to local variables:

Finally, I send out an email via Gmail (yes, you are not restricted to the Office 365 Outlook connector):

Easy as pie. Well, there was one point where I struggled with the graphical editor (I am simply not used to it), and this is where the switch to the code view comes in handy. I used that for parsing the JSON:
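In the code view, the Parse JSON step is an ordinary action definition. A sketch of what mine roughly looked like, with the schema trimmed to two demo fields; note the `base64ToString` expression, which is needed because the Service Bus connector delivers the message content base64-encoded:

```json
{
  "Parse_JSON": {
    "type": "ParseJson",
    "inputs": {
      "content": "@base64ToString(triggerBody()?['ContentData'])",
      "schema": {
        "type": "object",
        "properties": {
          "customerId": { "type": "string" },
          "customerName": { "type": "string" }
        }
      }
    },
    "runAfter": {}
  }
}
```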

I really like that, besides the graphical editor, this type of access is also available.

Another point I want to stress: although you cannot debug a Logic App, the tracing and the information it delivers are great. You can also replay a payload in order to check whether your changes solved an error. I can tell you, I used that a lot, as my approach was to try rather than to think, which was kind of the second-best approach.

Okay so now everything is in place and we can lean back and watch how the pieces fall into place. Let’s do that:

Summary

In this blog, I gave you a short overview of how easy it is to create a side-by-side extension in Microsoft Azure. Starting from the ABAP SDK for Azure, we could leverage some nice serverless functionalities of Azure, namely Azure Functions for “classical” development as well as Logic Apps as a low-code approach. We interacted with an SAP system via the SAP Cloud SDK and integrated with third-party solutions like Gmail.

The overall developer experience, with respect to the tooling in ABAP (for sure only with ADT/Eclipse) as well as the tooling of Microsoft, is very pleasant, and the available SDKs make your life easy. There were some pain points in the deployment of Azure Functions, as mentioned above, but nothing that could not be solved.

Now you may say “That demo has no real business scenario behind it”, and I will absolutely agree. However, the ease of integrating SAP core systems like SAP S/4HANA with Microsoft Azure and all its services opens a whole new playground for new scenarios. What if the “IF” statement is replaced by machine learning to enable fraud detection? Now we are talking. Creativity for new processes starts now; let us see where it takes us on the journey to the intelligent enterprise.

Next Steps

The demo scenario was quite basic, and I neglected some software engineering. That is why I want to dig deeper into some areas in order to get closer to a production-ish level, so maybe some more blog posts are to come on the following topics:

  • The very first area is the implementation of a circuit breaker pattern when calling the SAP S/4HANA system. I want to use the new concept of durable entities for that, so basically get my hands dirty with https://dev.to/azure/serverless-circuit-breakers-with-durable-entities-3l2f
  • I did not implement any unit tests, which is a shame. So I want to take a closer look at this stuff: https://twitter.com/nthonyChu/status/1199965214407446528
  • Another topic I want to investigate is storing the setup and the provisioning of the resources as Infrastructure as Code and then getting a bit more DevOps flow into the story … of course with Azure DevOps 😊

3 Comments
  • A great blog post, Christian Lechner. The amalgamation of different systems/services/APIs etc. is awesome. We will be really interested to hear about your experience with unit tests and DevOps.

    PS: I am just thinking, with three major hyperscalers providing more or less similar kinds of services, how will the customer choose?

    • To that last question: I think feature parity between hyperscalers is great, because that really gives customers true freedom of choice, independent from the provided services. As much as we might think multi-cloud is a cool option, it also complicates a system landscape. Being able to consolidate on one hyperscaler will hopefully mean that customers are more willing to experiment. In my experience, they are more bound to their preferred cloud than we would like to think. And it is also not always a purely technical consideration behind that choice.

    • Hi Nabheet,

      thanks for the kind feedback. Concerning your question: some edge cases aside, I agree that there is more or less feature parity between the platforms. How to choose then? Imho the choice should depend on what adds the most value for the customer. In my experience, this choice is not driven by technology in the first place but by a lot of other constraints, like which cloud provider the customer is already working with, which cloud provider the customer trusts, legal constraints, etc.

      Cheers

      Christian