Martin Pankraz

SAP BTP ABAP Environment integration journey with Microsoft – Part 3 – GraphQL with API Management

👉🏿back to blog series or jump to GitHub repos🧑🏽‍💻

<<part 2

Hello and welcome back to your ABAP Cloud with Microsoft integration journey. Part 2 of this series covered enterprise-ready API authentication using X.509 client certificates. Today we will be looking at modern SAP app integration with alternative query options to OData.

Speaking of modern cloud APIs: everything in ABAP Cloud is inherently about OData. Those queries can be hefty and are therefore batched to improve performance. A well-known approach to tackle over- and under-fetching issues beyond batching is the query language GraphQL: request only what you need, in a flexible way, and mesh it up from multiple places. I found this comparison post valuable for a general overview.

Furthermore, GraphQL reduces integration complexity for the consuming apps because it offers a single API endpoint for all entities. See the list “Most Popular APIs this year” published by Postman, with prominent entries like Microsoft Graph, the Twitter API, GitHub, and Salesforce that advocate for the single-endpoint strategy too.

To get started on that journey, we will expose the SAP RAP-enabled OData APIs via GraphQL using Azure API Management. Arguably, fronting one or multiple OData services with GraphQL as-is sacrifices some of the “fetching” gains again under the hood if no OData query options are passed: the consuming app doesn’t notice, but API Management still requests the unwanted OData fields from the backend.
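To illustrate the point, a GraphQL selection set maps naturally onto an OData `$select` system query option. Here is a minimal Python sketch of that mapping; the APIM base URL and the travel ID are placeholders, not values from the sample service:

```python
def odata_select_url(base_url: str, key: str, fields: list[str]) -> str:
    """Build an OData single-entity URL that fetches only the fields a
    GraphQL selection set actually asked for, via $select."""
    return f"{base_url}/Booking('{key}')?$select={','.join(fields)}"

# A GraphQL query selecting only Description and OverallStatus maps to:
url = odata_select_url(
    "https://my-apim.azure-api.net/travel",  # placeholder APIM base URL
    "00000001",                              # placeholder travel ID
    ["Description", "OverallStatus"],
)
print(url)
```

Without forwarding such a `$select`, the backend serves the full entity even though the GraphQL consumer only sees the requested fields.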

Note: SAP API Management on Azure offers GraphQL support too.

Generate your SAP Bookings GraphQL schema

I was brave and prompted ChatGPT to generate the GraphQL schema of the SAP RAP Travel Booking sample from my OData metadata XML file.

create a graphql schema for the booking entity from below OData metadata: <?xml version="1.0" encoding="utf-8"?>…

But you may also use available generators like the OpenAPI Generator. Create the OpenAPI representation of your OData metadata XML from here.

The result will look something like this:

type BOOKING {
  TravelID: String!
  Description: String
  OverallStatus: String
}

input CreatedBooking { 
    TravelID: ID! 
    Description: String
    OverallStatus: String
}

input UpdatedBooking { 
    TravelID: ID! 
    Description: String
    OverallStatus: String
}

type Query {
  getBooking(TravelID: String!): BOOKING
}

type Mutation {
  createBooking(input: CreatedBooking): BOOKING
  updateBooking(input: UpdatedBooking): Boolean
}

Note: For better readability I already dropped many of the fields from the parsed OData schema.

Upload the GraphQL schema into Azure API Management to create a new API

Choose “Synthetic GraphQL” because we will be fronting an existing OData API. Come up with an API URL suffix to identify your single API endpoint. Click Create.

See this Microsoft docs entry for further details.

Open the Resolvers pane and add the provided resolvers for querying and updating OData entities via the SAP Travel Booking service.

Maintain your target OData service URL in line 9 (and line 12 for the updateBooking resolver, respectively).
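If you don’t have the repo at hand yet, a query resolver for APIM’s synthetic GraphQL is essentially an `http-data-source` policy. The sketch below is illustrative only: the host and OData service path are placeholders, and it assumes the `context.GraphQL.Arguments` expression available in APIM resolver policies; refer to the provided resolvers in the GitHub repo for the exact version.

```xml
<http-data-source>
    <http-request>
        <set-method>GET</set-method>
        <!-- placeholder host and service path; maintain your real OData URL here -->
        <set-url>@{
            var id = context.GraphQL.Arguments["TravelID"];
            return $"https://{{your-btp-host}}/sap/opu/odata4/sap/zui_travel/Booking('{id}')";
        }</set-url>
    </http-request>
</http-data-source>
```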

Handle OData specifics regarding CSRF tokens and ETags

GraphQL delegates the request to the downstream OData API. Therefore, we need to deal with cross-site request forgery (CSRF) tokens and concurrency control (ETags) in between. I suggest tackling that at the Azure API Management policy level of the OData API. Use the provided policy snippet.

<!--
    Altered version of SAP CSRF policy. Added If-Match header to handle ETags. 
-->
<policies>
    <inbound>
        <base />
        <choose>
            <!-- CSRF-token only required for every operation other than GET or HEAD -->
            <when condition="@(context.Request.Method != &quot;GET&quot; &amp;&amp; context.Request.Method != &quot;HEAD&quot;)">
                <!-- Creating a GET subrequest to get the SAP CSRF token, cookie and ETag. -->
                <send-request mode="new" response-variable-name="SAPCSRFToken" timeout="10" ignore-error="false">
                    <set-url>@(context.Request.Url.ToString())</set-url>
                    <set-method>GET</set-method>
                    <set-header name="X-CSRF-Token" exists-action="override">
                        <value>Fetch</value>
                    </set-header>
                    <set-header name="Authorization" exists-action="override">
                        <value>@(context.Request.Headers.GetValueOrDefault("Authorization"))</value>
                    </set-header>
                </send-request>
                <!-- Extract the token and cookie from the "SAPCSRFToken" and set as header in the POST request. -->
                <choose>
                    <when condition="@(((IResponse)context.Variables["SAPCSRFToken"]).StatusCode == 200)">
                        <set-header name="X-CSRF-Token" exists-action="override">
                            <value>@(((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
                        </set-header>
                        <set-header name="Cookie" exists-action="override">
                            <value>@{
                                string rawcookie = ((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("Set-Cookie");
                                string[] cookies = rawcookie.Split(';');
                                /* a new session sends an XSRF cookie */
                                string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
                                /* an existing session sends a SAP_SESSIONID. No other cases anticipated at this point. Please create a GitHub pull request if you encounter uncovered settings. */
                                if(xsrftoken == null){
                                    xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("SAP_SESSIONID"));
                                }
                                
                                return xsrftoken.Split(',')[1];}</value>
                        </set-header>
                        <!-- add ETag for GraphQL orchestrated calls -->
                        <set-header name="If-Match" exists-action="override">
                            <value>@(((IResponse)context.Variables["SAPCSRFToken"]).Headers.GetValueOrDefault("ETag"))</value>
                        </set-header>
                    </when>
                </choose>
            </when>
        </choose>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <find-and-replace from="@(context.Api.ServiceUrl.Host)" to="@(context.Request.OriginalUrl.Host)" />
    </outbound>
    <on-error />
</policies>
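The cookie-extraction expression inside the policy can be hard to read in-line. The same logic, sketched in Python purely for illustration (the sample `Set-Cookie` value below is made up, not captured from a real BTP system):

```python
def extract_session_cookie(raw_set_cookie: str) -> str:
    """Mirror the policy's cookie handling: prefer the sap-XSRF cookie of a
    fresh session, fall back to SAP_SESSIONID for an existing session."""
    parts = raw_set_cookie.split(";")
    token = next((p for p in parts if "sap-XSRF" in p), None)
    if token is None:
        token = next((p for p in parts if "SAP_SESSIONID" in p), None)
    if token is None:
        raise ValueError("no SAP session cookie found in Set-Cookie header")
    # multiple Set-Cookie headers arrive comma-joined; keep the cookie itself
    return token.split(",")[1]

# illustrative raw header of a fresh session
raw = "sap-usercontext=sap-client=100; path=/,sap-XSRF_ABC_100=token123; path=/"
print(extract_session_cookie(raw))
```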

Pay special attention to line 41. Arguably, retrieving the ETag during the update request defeats the concurrency purpose, since the value might have changed from the one visible in the app. However, this was the simplest way to show the overall approach with a working sample and no extra app development.

Consider passing the ETag as a response field in GraphQL for proper concurrency handling.

Run an integration test 😎

Move to your GraphQL API and paste the query and variables from the Postman collection, or design your own. Hit Trace to follow the transformation from the GraphQL resolver to the SAP OData call.

query getBooking ($travelId: String!){
    getBooking(TravelID: $travelId) {
        Description
        OverallStatus
        TravelID
    }
}
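Outside the portal’s test console, the same query can be fired from any HTTP client. A small Python sketch of the request body; the endpoint URL in the comment is a placeholder:

```python
import json

# the getBooking query from above, sent as a standard GraphQL-over-HTTP body
GET_BOOKING = """
query getBooking($travelId: String!) {
    getBooking(TravelID: $travelId) {
        Description
        OverallStatus
        TravelID
    }
}
"""

def graphql_payload(query: str, variables: dict) -> str:
    """Serialize the JSON body that Postman (and any HTTP client) POSTs
    to the single GraphQL endpoint."""
    return json.dumps({"query": query, "variables": variables})

body = graphql_payload(GET_BOOKING, {"travelId": "00000001"})
# POST this body with Content-Type: application/json to
# https://<your-apim>.azure-api.net/<api-url-suffix>  (placeholder URL)
```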

Jump to the Outbound section of the trace to verify the detailed response from the BTP ABAP Environment.

Find the Postman collection with all the requests here (link ready for import). See below the shape of an update request fired from Postman using a GraphQL mutation directly:
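Based on the schema generated above, such an update mutation could look like this (the `TravelID` and status values are illustrative; note that `updateBooking` returns a plain `Boolean`, so no selection set is needed):

```graphql
mutation updateBooking($input: UpdatedBooking) {
  updateBooking(input: $input)
}

# variables:
# { "input": { "TravelID": "00000001", "OverallStatus": "B" } }
```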

Final Words

That’s a wrap 🌯 You saw today how you can up-level your SAP RAP OData APIs hosted on the BTP ABAP Environment with the query language GraphQL. The approach enables more efficient API queries and simplifies the app lifecycle, because all requests are handled through a single API endpoint. Azure API Management took care of the heavy lifting regarding the integration and the required transformations.

Find all the resources to replicate this setup in this GitHub repo. Stay tuned for the remaining parts of the Steampunk series with Microsoft integration scenarios from my overview post.

Cheers

Martin
