Martin Pankraz

.NET speaks OData too – how to implement Azure App Service with SAP Gateway

This post is part of a “duet” sharing an implementation project along with configuration guidance for Azure AD, Azure App Service, SAP OData, SAP OAuth server, and Azure API Management. Check Martin Raepple’s post (soon to be published) for details on principal propagation and hybrid connectivity in this scenario.

Dear community,

It is a given that SAP ERP is an interesting data source for many users and applications. But you need to be sure that data requests don’t interfere with critical ERP processes, for instance during financial closing periods, and you need to manage all interested parties. Injecting an API management solution in front of your SAP Gateway is a well-established approach to enable governance (who is calling what) and throttling to avoid overload.

There are a couple of posts on the community about BTP applications integrating SAP’s API Management with Azure AD for OData in single-instance deployments. Today we are closing the gap and shedding some light on Azure API Management (APIM) and App Service to consume data from your private S/4HANA system via SAP Gateway. At the end we look at global deployment as a finishing touch.

Internet-facing vs. internal-only vs. mixed access to SAP OData

BTP’s implementation of APIM offers public endpoints only, which is good enough for many cloud-native use cases. Azure APIM, in contrast, can be fine-tuned to support all deployment models.

Let’s have a look at a very “open” and a tightly closed setup in more detail.

The diagram below depicts a setup where the App Service and APIM are internet-facing, while APIM is also integrated with an Azure VNet to gain “line of sight” to the private SAP Gateway instance.

Fig.1 architecture overview

To further harden the entry point into your private Azure VNet, we recommend adding an Application Gateway with a web application firewall. With or without an Application Gateway, you should make sure that SAP OData services can only be reached through APIM. For that purpose, network security groups are in place to lock down communication into the SAP VNet to the HTTPS channel.

Azure AD takes care of proper user access to the App Service, with conditional access and multi-factor authentication where required.

Note that you could make Azure APIM internal-only as well and configure the Azure App Service to be VNet-integrated to regain “line of sight”. We kept APIM internet-facing in this scenario so that it remains reachable from BTP, for instance. Find the required network security group settings for the API Management subnet in the Azure docs.

Note: Once the new BTP service for Azure Private Link becomes generally available, you could even connect privately from BTP to Azure APIM.

Now, let’s look at “Fort Knox” instead.

Fig.2 fort knox architecture

In some cases you want to stay away from any internet exposure and isolate your application, including the APIM layer, so that it is accessible internally only. To achieve that, you configure a private endpoint for the App Service and set APIM to internal-only mode.

Fig.3 Azure APIM VNet settings

Your end users reach the app service only via VPN or ExpressRoute.

From project implementation experience we can tell that you often want to decide on a case-by-case basis which APIs are exposed publicly and which internally only. The “mixed” architecture assumes a desire for lower maintenance and therefore keeps internal and external APIs in one APIM instance. For greater isolation and stronger policy distinction you could consider separate instances.

Fig.4 “mixed” architecture

Import OData metadata into APIM to get rolling

So far so good on the connectivity side of things. To expose your OData endpoints, you typically want to import them into APIM rather than model them twice. As import formats, Azure APIM currently supports OpenAPI, WADL, WSDL, or Azure resources (Logic Apps, Functions, and App Service) that host an API. For our purpose we will convert the SAP OData metadata XML into an OpenAPI specification.

Going forward we refer to the SAP example OData service “epm_ref_apps_prod_man_srv”.

The open standards body OASIS and its technical committee maintain an open-source converter tool. Find installation details in their GitHub repository. Using its CLI we can create the output we need.

odata-openapi -p --basePath '/sap/opu/odata/sap/epm_ref_apps_prod_man_srv' --scheme https --host <your IP>:<your SSL port> .\epm_ref_apps_prod_man_srv.metadata.xml

Mhm, so we need the metadata XML as a file. There are multiple ways to achieve that. Using SAP GUI, for instance, you can leverage the built-in Gateway Client (transaction /IWFND/GW_CLIENT). I prefer Postman for any manual HTTP calls. Either way, you can retrieve the metadata XML.
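For reference, retrieving the metadata over plain HTTP can also be scripted. The Python sketch below is only an illustration: host, port, user, and password are placeholders, and your gateway may require OAuth instead of basic authentication.

```python
import base64
import urllib.request


def metadata_url(host: str, service_path: str) -> str:
    """Build the $metadata URL for an SAP OData service."""
    return f"{host}{service_path}/$metadata"


def fetch_metadata(host: str, service_path: str, user: str, password: str) -> str:
    """Download the service metadata XML, e.g. to feed it into the odata-openapi converter."""
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = urllib.request.Request(
        metadata_url(host, service_path),
        headers={"Accept": "application/xml", "Authorization": f"Basic {credentials}"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")
```

Saving the returned string to epm_ref_apps_prod_man_srv.metadata.xml gives you the input file for the converter command above.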

Fig.5 SAP Gateway Client metadata request

The above command creates an openapi.json file, which can be imported directly into Azure APIM. As a result, the OData-enabled specification is generated within APIM.

Fig.6 APIM OpenAPI import screen

SAP’s metadata XML does not contain an operation for the $metadata resource, so the OpenAPI converter didn’t generate an entry for it. Unfortunately, any OData client needs it to function properly. In addition, we want an HTTP HEAD operation for efficient X-CSRF-Token retrieval. So, to complete the setup, add those two operations manually.

Fig.7 metadata operation

 

Fig.8 HEAD operation on service root to fetch tokens

 

Ready to test? Dear Mr. Postman please enlighten us…

 

Fig.9 Postman request via APIM to SAP OData

 

Ok, so we are good to go to implement our .NET client against this APIM-exposed OData service 😊

Implement .NET client in Azure App Service to consume SAP OData

Using Visual Studio Code, I created a .NET 5 MVC scaffolding project with Azure AD integration. Now we need to decide on an OData client library. There are quite a few available. I chose Simple.OData.Client because it was a little simpler for my prototype than the full-blown Microsoft OData Client.

public async Task<IActionResult> Index()
{
    // Acquire the access token for the APIM-protected SAP OData API.
    string[] scopes = new string[] { _Configuration.GetValue<string>("SAPODataAPI:ScopeForAccessToken") };
    ODataClient client = await getODataClientForUsername(scopes);

    // GET the top 10 entries from the "Products" entity set.
    var products = await client
        .For<ProductViewModel>("Products")
        .Top(10)
        .FindEntriesAsync();

    return View(products);
}

Given the base URL of the API exposed by APIM, the above request fires a GET against the OData entity set “Products” with some OData query options and marshals the result into the ProductViewModel object, which contains the fields from the request. You could do this dynamically with reflection, but for the sake of easier understanding I kept it explicit.
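Under the hood, the fluent query resolves to a plain OData URL against the APIM base URL. A minimal Python sketch of that mapping (the base URL below is a hypothetical APIM endpoint, not the one from the project):

```python
def odata_query_url(base_url: str, entity_set: str, top: int = None) -> str:
    """Build the OData URL that a fluent query like .For("Products").Top(10) resolves to."""
    url = f"{base_url.rstrip('/')}/{entity_set}"
    if top is not None:
        # $top limits the number of entries returned by the entity set.
        url += f"?$top={top}"
    return url


# e.g. odata_query_url("https://my-apim.azure-api.net/epm", "Products", top=10)
# -> "https://my-apim.azure-api.net/epm/Products?$top=10"
```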

Let’s run the app already 😊

Fig.10 DotNET app consuming SAP OData

Voilà, we see the results of the EPM products OData service from our S/4HANA system in Azure, served via APIM. GET requests are easy. What about PUT, POST, PATCH, and DELETE? For those we need CSRF token handling.

Fig.11 DotNET app editing SAP OData

Editing and saving the product price triggers a PATCH request. To be able to supply the CSRF token, we need to send a fetch request beforehand. In Postman scenarios this is often done with a preceding GET request to the $metadata endpoint.

So, you need to decide whether you want your client or APIM to handle this. An important requirement is token caching, to save the additional call to fetch the CSRF token.
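To illustrate the caching idea, here is a minimal client-side sketch in Python. The transport is stubbed out, and the fetch callable and TTL are assumptions for illustration, not part of the project code; only the X-CSRF-Token fetch handshake itself mirrors the SAP mechanism.

```python
import time


class CsrfTokenCache:
    """Cache an X-CSRF-Token (and session cookie) so that modifying requests
    do not need an extra fetch round-trip every time."""

    def __init__(self, fetch_token, ttl_seconds: float = 300.0):
        # fetch_token: callable that performs the HEAD request with
        # "X-CSRF-Token: Fetch" and returns (token, cookie).
        self._fetch_token = fetch_token
        self._ttl = ttl_seconds
        self._token = None
        self._cookie = None
        self._expires_at = 0.0

    def headers(self) -> dict:
        """Return headers for a PUT/POST/PATCH/DELETE call, refreshing the
        cached token only when it has expired."""
        now = time.monotonic()
        if self._token is None or now >= self._expires_at:
            self._token, self._cookie = self._fetch_token()
            self._expires_at = now + self._ttl
        return {"X-CSRF-Token": self._token, "Cookie": self._cookie}
```

The same pattern applies whether the cache lives in the client or in an APIM policy.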

Our project supports client-side or APIM-side handling of tokens. We recommend using APIM to perform the authentication calls and take care of the relevant token caching. This “separation of concerns” is a common practice in software architecture and allows you to scale Principal Propagation to all your clients instead of implementing it in every single client.

That leaves only the Azure AD integration to the clients and keeps SAP Principal Propagation and the like isolated in your APIM instance.

Have a look at the snippet from our APIM policy for SAP Principal Propagation for reference.

 

<choose>
  <!-- A CSRF token is only required for operations other than GET or HEAD. -->
  <when condition="@(context.Request.Method != &quot;GET&quot; &amp;&amp; context.Request.Method != &quot;HEAD&quot;)">
      <!-- Create a subrequest "SAPCSRFToken" as a HEAD request to fetch the token and session cookie. -->
      <send-request mode="new" response-variable-name="SAPCSRFToken" timeout="10" ignore-error="false">
          <set-url>@(context.Request.Url.ToString())</set-url>
          <set-method>HEAD</set-method>
          <set-header name="X-CSRF-Token" exists-action="override">
              <value>Fetch</value>
          </set-header>
          <set-header name="Authorization" exists-action="override">
              <value>@("Bearer " + (string)context.Variables["SAPBearerToken"])</value>
          </set-header>
      </send-request>
      <!-- Extract the token from the "SAPCSRFToken" response and set it as a header on the actual request. -->
      <set-header name="X-CSRF-Token" exists-action="skip">
          <value>@(((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
      </set-header>
      <!-- Extract the session cookie from the "SAPCSRFToken" response and set it as a header on the actual request. -->
      <set-header name="Cookie" exists-action="skip">
          <value>@{
              string rawcookie = ((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("Set-Cookie");
              string[] cookies = rawcookie.Split(';');
              string xsrftoken = cookies.FirstOrDefault(ss => ss.Contains("sap-XSRF"));
              return xsrftoken.Split(',')[1];
          }</value>
      </set-header>
  </when>
</choose>
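The cookie extraction inside the policy expression can be mirrored one-to-one in Python for testing. Note that the sample Set-Cookie value used below is made up for illustration; the exact cookie names and folding depend on your gateway.

```python
def extract_xsrf_cookie(raw_cookie: str) -> str:
    """Mirror of the policy expression: split the folded Set-Cookie header
    on ';', pick the segment containing 'sap-XSRF', and return the part
    after the comma that separates it from the previous cookie's attributes."""
    segments = raw_cookie.split(";")
    # StopIteration here would correspond to the policy failing on a missing cookie.
    xsrf_segment = next(s for s in segments if "sap-XSRF" in s)
    return xsrf_segment.split(",")[1]
```

For example, for a folded header like "SAP_SESSIONID_X=abc; path=/,sap-XSRF_X=def; path=/" the function returns "sap-XSRF_X=def".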

Great, now we can load and update SAP data via OData exposed by the Azure APIM instance. And best of all: while honoring SAP Principal Propagation. Caching the SAP Bearer token is key to efficient OData communication for any client app. Read more on that in Martin Raepple’s upcoming post of the series.

Scaling for global OData reach

Consuming SAP OData from anywhere in a low latency fashion is a common requirement. The architecture discussed in this post can easily be scaled globally with multiple instances of APIM and App Services.

Azure Front Door makes sure that the client reaches the nearest entry point into the Microsoft backbone. VNet peering enables the requests to travel efficiently and securely from APIM over the Microsoft backbone to your primary SAP instance in its region.

Below is an example with three Azure regions spanning North America, Europe, and Australia. Those regions also host BTP and would therefore be interesting for BTP apps too. You can check all available Azure regions here.

Fig.12 architecture overview for global access

You can add locations to your APIM instance from the Azure portal while maintaining the APIs only once. Such a setup also ensures higher availability in case of an outage of App Service or APIM in any region. For your primary region, next to the SAP instance, it is worth considering availability zones for APIM to protect the management UI.

Fig.13 Azure APIM locations view

Result caching is great but impacts Principal Propagation

Having one primary SAP instance in one place but a globally distributed consumer base makes APIM result caching an interesting capability. That way, results that change rarely (like master data) can be served directly from your edge location rather than going to the backend.

Be aware that this would render SAP Principal Propagation inactive. So, you would need to at least check authorization objects against the SAP backend even when serving the result from the APIM cache or a distributed database, or implement another authorization layer in APIM outside of SAP. Otherwise, every user of the app who can access the API would see the result.
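One way to sketch that idea: keep serving payloads from a cache, but gate every read on a per-user authorization check. The Python below is a hedged illustration with stubbed checks, not project code; in practice `is_authorized` would call the SAP backend’s authorization objects or an APIM-side policy.

```python
class AuthorizedCache:
    """Serve cached OData results only to users that pass an authorization check."""

    def __init__(self, is_authorized, fetch_from_backend):
        # is_authorized(user, resource) -> bool: e.g. a check against SAP
        # authorization objects, or a policy decision in APIM.
        # fetch_from_backend(resource) -> payload: the real OData request.
        self._is_authorized = is_authorized
        self._fetch = fetch_from_backend
        self._cache = {}

    def get(self, user: str, resource: str):
        # Authorization runs on every request, even for cache hits.
        if not self._is_authorized(user, resource):
            raise PermissionError(f"{user} may not read {resource}")
        if resource not in self._cache:
            self._cache[resource] = self._fetch(resource)
        return self._cache[resource]
```

The backend is hit once per resource, while the authorization check still runs per user and per call.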

In case you expect not only global access but also ad-hoc peak demand and high request frequency, you might want to go one step further than APIM service throttling and built-in result caching and implement a fully fledged geode pattern. Have a look at the related post here.

Final Words

We can conclude that .NET speaks SAP OData too. Told you so 😉 In addition to that, we saw different access models for the client app and APIM (internet-facing, internal-only, and mixed) that can be fine-tuned to the needs of your users and your SAP backend instance in Azure. OData services can be imported into APIM by converting them to the OpenAPI specification. Furthermore, global access needs can be addressed with native Azure components such as Front Door.

Martin Raepple’s post gives more insights on the Azure AD setup as well as distributed Bearer token caching and a hybrid deployment where Azure APIM lives on-premises. Stay tuned for updates on efficient CSRF token handling with APIM policies.

Find the mentioned .NET project and configuration guide in my GitHub repository here.

As always feel free to ask lots of follow-up questions.

 

Best Regards

Martin

Comments
      Jelena Perfiljeva

      Interesting blog, thanks for sharing!

      It seems weird that you had to do some additional changes in order to get a working OpenAPI format. Hm.

      Would love to read more about the authentication and OAuth part, it can be a confusing subject. Good pointers on the tokens.

      One note: the correct product name (at least today, who knows when SAP renames it 🙂 ) is SAP Gateway. It does handle OData services but it's not called "SAP OData gateway". I find it's always best to use the official names to avoid any confusion.

      Thank you!

      Martin Pankraz
      Blog Post Author

      Agreed on the naming, Jelena, and changed! To me, adding OData to it initially was actually for clarity 😀

      OAuth for this scenario is covered by my colleague Martin Raepple in length in his existing blog series. He will publish the next part with reference to this post with focus on APIM like I mentioned a little further down the line.

      KR

      Martin

      Jelena Perfiljeva

      Thanks for the link! 137 steps, holy cow.

      Martin Pankraz
      Blog Post Author

      My pleasure. Martin always provides a lot of context and almost educational input, which raises the number of steps although the high level steps are a lot less. We found that others are more likely to succeed at first try with such a "confusing" subject as you put it.

      Gov Totawar

      Great Blog Martin, always good to see patterns in your blogs

      Jose Muñoz Herrera

      Brilliant Martin Pankraz