Martin Pankraz

.NET speaks OData too – how to implement Azure App Service with SAP Gateway

❤️‍🔥HOT News: Azure APIM direct OData integration released! Conversion to OpenAPI is no longer required.

This post is part of a “duet” sharing an implementation project along with configuration guidance for Azure AD, Azure App Service, SAP OData, SAP OAuth server, and Azure API Management. Check Martin Raepple’s post for details on principal propagation and hybrid connectivity with Azure Arc in this scenario.

See this YouTube series 🎥 for a guided path to SAP Principal Propagation.

Find the second post in the series, about OpenAPI conversion, SDK generation, and Microsoft Power Automate integration, here.

Find the repos associated with this blog here.

See this repo to learn more about an approach using the SAP Cloud SDK that enables one-command releases.

Dear community,

It is a given that SAP ERP is an interesting data source for many users and applications. But you need to be sure that data requests don’t interfere with critical ERP processes, during periods of financial closing for instance, and that all interested parties are managed. Injecting an API management solution in front of your SAP Gateway is a well-established approach to enable governance (who is calling what) and throttling to avoid overload.

There are a couple of posts on the community about BTP applications integrating SAP’s API Management with Azure AD for OData in single-instance deployments. Today we close that gap and shed some light on using Azure API Management (APIM) and App Service to consume data from your private S/4HANA system via SAP Gateway. At the end we look at global deployment as a finishing touch.

Internet-facing vs. internal-only vs. mixed access to SAP OData

BTP’s implementation of APIM offers you public endpoints only, which is good enough for many cloud-native use cases. Azure APIM can be fine-tuned to support all deployment models.

Let’s have a look at a very “open” and a tightly closed setup in more detail.

The diagram below depicts a setup where the app service and APIM are internet-facing, while APIM is also Azure VNet-integrated to gain “line of sight” to the private SAP Gateway instance.

Fig.1 architecture overview

To further harden the entry point into your private Azure VNet, we recommend adding an application gateway with a web application firewall. Regardless of an application gateway, you should make sure that SAP OData services can only be reached through APIM. For that purpose, network security groups are in place to lock down communication into the SAP VNet on the HTTPS channel.

Azure AD takes care of proper user access to the app service with conditional access and multi-factor authentication where required.

Note that you could make Azure APIM internal-only too and configure the Azure App Service to be VNet-integrated to regain “line of sight”. We kept APIM internet-facing in this scenario so it remains reachable from BTP, for instance. Find the required network security group settings for the API Management subnet in the Azure docs.

Note: Once the new BTP service for Azure Private Link progresses further, you could even connect privately from BTP to Azure APIM.

Now, let’s look at “Fort Knox” instead.

Fig.2 fort knox architecture

In some cases you want to stay away from any internet exposure and isolate your application, including the APIM layer, so it is accessible internally only. To achieve that, you configure a private endpoint for App Service and set APIM to internal-only mode. See this blog for additional guidance next to the docs.

Fig.3 Azure APIM VNet settings

Your end users reach the app service only via VPN or ExpressRoute.

From project implementation experience we can tell that you often want to decide on a case-by-case basis which APIs are exposed publicly and which internally only. The “mixed” architecture assumes a desire for lower maintenance and therefore keeps internal and external APIs in one APIM instance. To achieve this, you deploy APIM in “internal” mode but expose the “external”-facing APIs via a gateway component that has access to the private VNet hosting APIM. Azure Application Gateway with path-based routing rules would be an Azure-native solution for this.

Fig.4 “mixed” architecture

For greater isolation and stronger policy distinction you could consider having separate APIM instances. You can drop the application gateway if you have a dedicated APIM instance for your internet-facing APIs.

Import OData metadata into APIM to get rolling

So far so good on the connectivity side of things. To expose your OData endpoints, you typically want to import them into APIM rather than model them twice. Azure APIM currently supports OpenAPI, WADL, WSDL, or Azure resources (Logic Apps, Functions, and App Service) that host an API as import formats. For our purpose we will convert the SAP OData metadata XML into an OpenAPI specification.

Going forward we refer to the SAP example OData service “epm_ref_apps_prod_man_srv”.

📢UPDATE: Azure APIM direct OData integration released! Conversion to OpenAPI is no longer required.

The public org OASIS and its technical committee maintain an open-source converter tool. Find further details regarding installation on their GitHub repo. Using their CLI we can create the output we need.

odata-openapi -p --basePath '/sap/opu/odata/sap/epm_ref_apps_prod_man_srv' --scheme https --host <your IP>:<your SSL port> .\epm_ref_apps_prod_man_srv.metadata.xml

Mhm, so we need the metadata XML as a file. There are multiple ways to achieve that. Using SAP GUI, for instance, you can leverage the built-in Gateway Client (transaction /IWFND/GW_CLIENT). I prefer Postman for any manual HTTP calls. Either way, you can retrieve the metadata XML.

Fig.5 SAP Gateway Client metadata request

The mentioned command creates an openapi.json file, which can be imported directly into Azure APIM.
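If you prefer scripting over SAP GUI or Postman, the download can be sketched in a few lines (Python standard library only; host, port, and credentials are placeholders you need to replace with your own values):

```python
import base64
import urllib.request
from urllib.parse import quote

def build_metadata_url(host: str, port: int, service: str) -> str:
    """Compose the $metadata URL of an SAP Gateway OData service."""
    return f"https://{host}:{port}/sap/opu/odata/sap/{quote(service)}/$metadata"

def fetch_metadata(url: str, user: str, password: str) -> bytes:
    """Download the metadata XML using basic authentication."""
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = urllib.request.Request(url, headers={"Authorization": f"Basic {credentials}"})
    with urllib.request.urlopen(request, timeout=30) as response:
        return response.read()

if __name__ == "__main__":
    # Placeholder host, port, and credentials: replace with your gateway details.
    url = build_metadata_url("my-sap-host.example.com", 44300, "epm_ref_apps_prod_man_srv")
    with open("epm_ref_apps_prod_man_srv.metadata.xml", "wb") as file:
        file.write(fetch_metadata(url, "DEVELOPER", "secret"))
```

The resulting file is exactly what the `odata-openapi` command above expects as input.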

If you want to be really quick without any local setup, try our web-based converter. It basically wraps the functionality of the OASIS converter and exposes it as a website. From there you can copy & paste or download. See our dedicated blog post for more details.

Fig.6 OData to OpenAPI Converter website

Once you have uploaded the generated OpenAPI spec into APIM, you get the OData-enabled API definition. Nice 🙂

Fig.7 APIM OpenAPI import screen

SAP’s metadata XML does not contain an operation for the $metadata resource, so the OpenAPI converter didn’t generate an entry for it. Unfortunately, we need it for any OData client to function properly.

Fig.8 metadata operation

In addition to that, we want an HTTP HEAD operation for efficient X-CSRF-Token retrieval.

Fig.9 HEAD operation on service root to fetch tokens

To complete the setup, add a GET operation for the service root “/” including an inbound policy that rewrites the backend URI:

    <inbound>
        <base />
        <rewrite-uri template="/" copy-unmatched-params="true" />
    </inbound>

Ready to test? Dear Mr. Postman please enlighten us…


Fig.10 Postman request via APIM to SAP OData


Ok, so we are good to go to implement our .NET client against this APIM exposed OData service 😊

Implement .NET client in Azure App Service to consume SAP OData

Using Visual Studio Code, I created a .NET 5 MVC scaffolding project with Azure AD integration. Now we need to decide on an OData client library. There are quite a few available. I chose Simple.OData.Client because it was a little simpler for my prototype than the fully-blown Microsoft OData Client.

public async Task<IActionResult> Index()
{
      // Acquire the access token for the APIM-exposed SAP OData API.
      string[] scopes = new string[] { _Configuration.GetValue<string>("SAPODataAPI:ScopeForAccessToken") };
      ODataClient client = await getODataClientForUsername(scopes);

      // Query the Products entity set and marshal the result into the view model.
      var products = await client
            .For<ProductViewModel>("Products")
            .Top(10)
            .FindEntriesAsync();

      return View(products);
}

Given the base URL of the API exposed by APIM, the above request fires a GET against the OData entity set “Products” with some OData query options and marshals the result into the ProductViewModel object, which contains the fields from the request. You could do this dynamically with reflection, but for the sake of easier understanding I kept it explicit.

Let’s run the app already 😊

Fig.10 DotNET app consuming SAP OData

Voilà, we see the results of the EPM products OData service from our S/4HANA system in Azure, served via APIM. GET requests are easy. What about PUT, POST, PATCH, and DELETE? For those we need CSRF token handling.

Fig.11 DotNET app editing SAP OData

Editing and saving the product price triggers a PATCH request. To be able to supply the CSRF token, we need to send a token fetch request beforehand. In Postman scenarios this is often done with a preceding GET request to the $metadata endpoint.

So, you need to decide if you want your client or APIM to handle this.
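If the client handles it, the flow boils down to one HEAD (or GET) request carrying `X-CSRF-Token: Fetch`, whose response returns the token and a session cookie that must both be replayed on the modifying call. A minimal sketch of the header handling (Python; header names follow SAP Gateway conventions, transport and error handling are omitted):

```python
def csrf_fetch_headers(bearer_token: str) -> dict:
    """Headers for the preceding HEAD request that asks SAP Gateway for a CSRF token."""
    return {
        "Authorization": f"Bearer {bearer_token}",
        "X-CSRF-Token": "Fetch",
    }

def csrf_write_headers(bearer_token: str, fetch_response_headers: dict) -> dict:
    """Headers for the follow-up PATCH/POST/PUT/DELETE, reusing token and session cookie."""
    headers = {
        "Authorization": f"Bearer {bearer_token}",
        "X-CSRF-Token": fetch_response_headers["x-csrf-token"],
    }
    # SAP Gateway binds the CSRF token to the session cookie, so forward it too.
    if "set-cookie" in fetch_response_headers:
        headers["Cookie"] = fetch_response_headers["set-cookie"].split(";")[0]
    return headers
```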

Our project supports client-side or APIM-side handling of tokens. We recommend using APIM to do the authentication calls and take care of the relevant token caching. This “separation of concerns” is a common practice in software architecture and allows you to scale principal propagation to all your clients instead of implementing it in every single client.

That leaves only the Azure AD integration to them and keeps SAP Principal Propagation etc. isolated in your APIM instance.

Have a look at the snippet from our APIM policy for SAP Principal Propagation for reference.


  <!-- CSRF token only required for every operation other than GET or HEAD -->
  <when condition="@(context.Request.Method != "GET" && context.Request.Method != "HEAD")">
      <!-- Create a subrequest "SAPCSRFToken" as a HEAD request to fetch the token and session cookie. -->
      <send-request mode="new" response-variable-name="SAPCSRFToken" timeout="10" ignore-error="false">
          <set-url>@(context.Request.Url.ToString())</set-url>
          <set-method>HEAD</set-method>
          <set-header name="X-CSRF-Token" exists-action="override">
              <value>Fetch</value>
          </set-header>
          <set-header name="Authorization" exists-action="override">
              <value>@("Bearer " + (string)context.Variables["SAPBearerToken"])</value>
          </set-header>
      </send-request>
      <!-- Extract the token from the "SAPCSRFToken" response and set it as header on the actual request. -->
      <set-header name="X-CSRF-Token" exists-action="skip">
          <value>@(((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
      </set-header>
      <!-- Extract the cookie from the "SAPCSRFToken" response and set it as header on the actual request. -->
      <set-header name="Cookie" exists-action="skip">
          <value>@{
              string rawcookie = ((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("Set-Cookie");
              string[] cookies = rawcookie.Split(';');
              string xsrftoken = cookies.FirstOrDefault(ss => ss.Contains("sap-XSRF"));
              return xsrftoken.Split(',')[1];
          }</value>
      </set-header>
  </when>

Great, now we can load and update SAP data via OData exposed through the Azure APIM instance. And best of all: while honoring SAP Principal Propagation. Caching the SAP bearer token is key to efficient OData communication for any client app. Read more on that in Martin Raepple’s upcoming post in the series.

Avoiding login bursts (“Monday morning blues”)

People have routines and therefore tend to create clusters of logins at similar times. SAP’s OAuth server can become a bottleneck during such periods. We recommend adjusting the default token lifetimes on the SAP OAuth server and implementing a random back-off delay parameter. That parameter ensures that your cached user tokens don’t all expire at the same time, even though your users tend to log in in waves (Monday morning, for instance). Our provided APIM policy supports that approach. See below an example to illustrate the process:

Fig.12 token lifetime handling to avoid login bursts

Of course, on the very first day of your implementation, when no tokens are cached yet, you are still in trouble 😉 We would recommend relying on an APIM throttling policy in such cases. Likely you will need to experiment a bit with the parameters to find your individual optimal fit.
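The back-off idea itself is small enough to show in isolation. A minimal stand-alone sketch of the logic (Python; the safety margin and jitter window are illustrative values, not taken from our policy):

```python
import random

def cache_ttl_with_jitter(token_lifetime_seconds: int,
                          safety_margin_seconds: int = 60,
                          max_jitter_seconds: int = 300) -> int:
    """Cache duration for an SAP bearer token.

    Subtracting a random jitter spreads the re-authentication of users who
    all logged in at the same time (Monday morning, for instance) across a
    window, so the SAP OAuth server never sees a whole wave of cached
    tokens expire at once.
    """
    jitter = random.randint(0, max_jitter_seconds)
    ttl = token_lifetime_seconds - safety_margin_seconds - jitter
    return max(ttl, 0)
```

Each cached token then gets its own slightly different lifetime, which flattens the login wave into a gentle slope of token refreshes.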

Scaling for global OData reach

Consuming SAP OData from anywhere in a low latency fashion is a common requirement. The architecture discussed in this post can easily be scaled globally with multiple instances of APIM and App Services.

Azure Front Door makes sure that the client reaches the nearest entry point into the Microsoft backbone. VNet peering enables requests to travel from APIM efficiently and securely over the Microsoft backbone to your primary SAP instance in its given region.

Below is an example with three Azure regions spanning North America, Europe, and Australia. Those regions also host BTP and therefore would be interesting for BTP apps too. You can check all available Azure regions here.

Fig.13 architecture overview for global access

You can add locations to your APIM instance from the Azure portal while maintaining APIs only once. Such a setup also ensures higher availability in case of an outage of App Service or APIM in any region. For your primary region, next to the SAP instance, it is worth considering availability zones for APIM to protect the management UI.

Fig.14 Azure APIM locations view

Result caching is great but impacts Principal Propagation

Having one primary SAP instance in one place but a globally distributed consumer base makes APIM result caching an interesting capability. That way, results that change rarely (like master data) can be served from your edge location directly rather than going to the backend.

Be aware that this would render SAP Principal Propagation inactive. So you would need to at least always check authorization objects against the SAP backend, even when serving the result from the APIM cache or a distributed database, or implement another authorization layer on APIM outside of SAP. Otherwise, every user of the app who can access the API would see the result.
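One way to sketch that per-user distinction with the standard APIM caching policies is to vary the cache entry by the `Authorization` header, so each user’s token yields a separate cached copy instead of one result shared across all users (the duration below is illustrative):

```xml
<inbound>
    <base />
    <!-- Separate cache entries per user: the Authorization header carries the user's token. -->
    <cache-lookup vary-by-developer="false" vary-by-developer-groups="false">
        <vary-by-header>Authorization</vary-by-header>
    </cache-lookup>
</inbound>
<outbound>
    <base />
    <!-- Illustrative duration: cache rarely changing reads (like master data) for 10 minutes. -->
    <cache-store duration="600" />
</outbound>
```

Per-user caching trades some cache efficiency for safety; for truly public master data you could drop the vary-by-header and add the dedicated authorization layer instead.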

In case you expect not only global access but also ad-hoc peak demand and high request frequency, you might want to go one step further than APIM service throttling and built-in result caching and consider implementing a fully-fledged geode pattern. Have a look at the related post here.

Operationalize the approach with APIOps

Working with the converter and the APIM UI is nice but doesn’t scale to hundreds or thousands of APIs. For that you need to step up the approach to incorporate pipeline tooling and automation. We would recommend having a look at this reference about APIOps and this CI/CD approach with templates to get started. Similar to how we leveraged the Node.js files provided by the OASIS committee for our web converter, you could inject them into your pipeline to convert to the OpenAPI spec on the fly.


Final Words

We can conclude that .NET speaks SAP OData too. Told you so 😉 In addition, we saw different access models for the client app and APIM (internet-facing, internal-only, and mixed) that can be fine-tuned to the needs of your users and your SAP backend instance in Azure.

The OData services can be imported into APIM either by converting them to their OpenAPI specification or natively from the OData specification. Furthermore, global access needs can be addressed with native Azure components such as Front Door.

Martin Raepple’s post gives more insights on the Azure AD setup, as well as distributed bearer token caching and hybrid deployments where Azure APIM lives on-premises. See our second post in the series for more details about APIM-based token handling for bearer and CSRF tokens, SDK generation for OData consumption, and Microsoft Power Automate.

Find the mentioned .NET project and config guide on my GitHub repo here.

As always feel free to ask lots of follow-up questions.


Best Regards


      Jelena Perfiljeva

      Interesting blog, thanks for sharing!

      It seems weird that you had to do some additional changes in order to get a working OpenAPI format. Hm.

      Would love to read more about the authentication and OAuth part, it can be a confusing subject. Good pointers on the tokens.

      One note: the correct product name (at least today, who knows when SAP renames it 🙂 ) is SAP Gateway. It does handle OData services but it's not called "SAP OData gateway". I find it's always best to use the official names to avoid any confusion.

      Thank you!

      Martin Pankraz
      Blog Post Author

Agreed on the naming, Jelena, and changed! To me, adding OData to it initially was actually for clarity 😀

OAuth for this scenario is covered at length by my colleague Martin Raepple in his existing blog series. He will publish the next part, with reference to this post and a focus on APIM, as I mentioned a little further down the line.



      Jelena Perfiljeva

      Thanks for the link! 137 steps, holy cow.

      Martin Pankraz
      Blog Post Author

My pleasure. Martin always provides a lot of context and almost educational input, which raises the number of steps, although there are far fewer high-level steps. We found that others are more likely to succeed at the first try with such a "confusing" subject, as you put it.

      Gov TOTAWAR

      Great Blog Martin, always good to see patterns in your blogs

      Jose Muñoz Herrera

      Brilliant Martin Pankraz

      Pavan Golesar

Something that has pulled my interest in recent days is Azure and its APIM. Thank you for these efforts to put it all together.

      Best Regards,

      Pavan Golesar

      Martin Pankraz
      Blog Post Author

      Hi Pavan Golesar,

you might like this repo using the SAP Cloud SDK too. Let me know what you think.



      Chandrashekhar Mahajan

      Hello Martin Pankraz

      Great blog post! I have few questions related to this topic,

1. Is there any guidance available if we want to provision Azure APIM in front of SAP CPI? There could be scenarios where we may not just want to wrap SAP OData into an OpenAPI spec and register it on APIM; we could have a scenario where an integration developed in CPI and its end result (the CPI URL) need to be provisioned via Azure APIM. Please let me know if you have any guidance on it.
2. I saw tools which convert OData to an OpenAPI spec, but when it comes to an API-design-first approach, usually the API spec gets designed first and then the backend is developed. So, assuming I have an OpenAPI spec available, is there a way I can create an OData EDMX definition based off it? Tagging Ralf Handl as well in case any information is available on it.




      Martin Pankraz
      Blog Post Author

      As of this month you will be able to have a native OData experience in Azure APIM. No need for conversions anymore.

Is this a pass-through scenario with CPI? Or do you apply transformations in the iFlow before hitting the backend API? If it is pass-through, you might consider Azure private networking if the SAP backend also runs on Azure.



      Chandrashekhar Mahajan

It's not a pass-through scenario; we apply transformations on iFlows.



      Martin Pankraz
      Blog Post Author

      Ok, is then pure OData good enough?

      consumer -> APIM (OData API) -> CPI (OData API) -> SAP backend (OData API)

      Why do you need the conversion from OpenAPI to OData?

      Chandrashekhar Mahajan

      Hello Martin,

Our scenario does not use OData-based integration. In general, I was looking for a solution where APIM is a façade layer for the CPI integration scenario.




      Martin Pankraz
      Blog Post Author

So, your question is answered then? APIM can serve as your façade. It is up to you to decide which API flavour to use. APIM supports OpenAPI and OData alike as of this month. No need for conversions anymore.