API Management is not just “managing your APIs”.
API Management is about allowing access to your data and your services in a developer-friendly way. The benefits: you can create digital services, grow your partner ecosystem and ultimately deliver projects much faster than with any traditional SOA approach.

When thinking about “allowing access to your data and services”, the same discussion often comes up with my customers: their backend infrastructure is quite slow, and will not scale to unknown transaction throughput and bandwidth.

There are multiple ways to address this, for instance through caching, but ultimately the backends need to be protected from unpredictable and potentially heavy traffic (“catastrophic success”).

How can this be achieved in the SAP API Management solution? Through “Traffic Management” policies, which are configured between the API consumer and the API implementation.

The “Spike Arrest” policy limits the number of calls occurring in a specific time interval, for instance “60 requests per minute”. This actually means that 1 request is allowed every second; within that second, any further request will be rejected. Note that you can also specify a weight for incoming requests, so that “heavy” requests count for more than others.
Example: 600 requests per minute means 10 requests per second. If the weight of a request is set to 5, then only 2 requests will be handled per second.
More about this policy can be found online.
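To make the arithmetic above concrete, here is a minimal sketch of what a Spike Arrest policy could look like. This is an illustration only: the rate value and the weight variable name are assumptions, not taken from any real proxy.

```xml
<!-- Illustrative sketch: smooth traffic to 600 requests per minute (10 per second) -->
<SpikeArrest async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
    <!-- "600pm" = 600 per minute; the policy smooths this to 10 per second -->
    <Rate>600pm</Rate>
    <!-- optional: take the request weight from a flow variable (hypothetical name) -->
    <MessageWeight ref="request.header.weight"/>
</SpikeArrest>
```

With a weight of 5 read from that variable, only 2 such requests would be let through per second, matching the example above.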

The “Concurrent Rate Limit” policy limits the number of simultaneous connections to your backends. For instance, you may not want your backends to accept more than 10 simultaneous connections from the API Management layer. You would specify this in the policy, along with the time-to-live of your connection, for instance 5 seconds.
More about this can be found online.

Note that the “Quota” policy is merely an artifact used when creating variations of APIs that are distinguished by the allowed number of calls (Premium/Free, Gold/Silver/Bronze, …). This is why it is not really a technical traffic management policy.
More about this policy can be found online.
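For comparison, a Quota sketch for a hypothetical “Silver” plan could look as follows; the count, interval and time unit are illustrative assumptions, not part of this tutorial’s proxy.

```xml
<!-- Illustrative sketch: allow 10,000 calls per calendar month -->
<Quota async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
    <Allow count="10000"/>
    <Interval>1</Interval>
    <TimeUnit>month</TimeUnit>
    <Distributed>true</Distributed>
</Quota>
```

A “Gold” variant of the same API would simply carry a higher count.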

Goal of this exercise
In the following part of this blog entry, I will explain how to use the ConcurrentRateLimit policy.
You can do almost everything from the SAP API Management user interface; however, you can also download the proxies and work on them in your preferred editor.
In our case, we’ll use the UI first, export the API proxy (without API call), modify its XML configuration in Atom (or any other XML editor) and update the API proxy in HCP.

I will assume that you are now familiar with SAP API Management and that you have at least one API proxy in place. If this is not the case, please refer to Holger’s blog entry, and create your own GWSAMPLE_BASIC proxy (you don’t need to do the “Creating a product” and following steps).

1- Understanding flows

As the documentation of the ConcurrentRateLimit policy states, we need to add the ConcurrentRateLimit policy in 3 flows: Target Endpoint request, Target Endpoint response and Default Fault Rule flow.

As a quick reminder:
– the Target EndPoint flows are the ones that run just before a request hits the backend (or Target), and just after the backend (Target) has sent its response. If you are not yet sure how flows work, check Chris’ blog entry.

– the error flow, or Default Fault Rule, is a specific flow that is entered whenever an error occurs in the API proxy. “When an API proxy encounters an error, the default behavior is to exit from the normal processing pipeline and to enter an error Flow. This error Flow bypasses any remaining processing Steps and Policies.” (copied from the online documentation).
Unfortunately, this flow is not yet available in the User Interface, which is why we’ll work on the proxy’s XML configuration offline, quite a common practice anyway.

3- Add the Concurrent Rate Limit policy

First of all, navigate to your HCP API Management Trial environment.
To add the Concurrent Rate Limit policy, open the GWSAMPLE_BASIC proxy, and click on “Policies”.

Click on the “Edit” link in the bottom-right corner.

Click on the “PreFlow” flow of the TargetEndpoint in the left-hand menu of the Policy Designer.
Click on the “+” sign next to the “ConcurrentRateLimit” policy in the right-hand “Policies” panel.

Name: ConcurrentRateLimit
EndpointType and FlowType were set for you when you clicked on the “TargetEndpoint>PreFlow” link in the previous step.
Stream: Incoming Request

4- Understand the policy

As you may know by now, policies are configured in XML.
The ConcurrentRateLimit policy is no exception.

Let’s have a look at the policy to understand what it does by default:

<!-- This policy allows us to throttle inbound connections to an API Proxy. It has to be attached to Preflow Request and Response flow of TargetEndPoint as well as Default fault rule of target end point -->
<ConcurrentRatelimit async="true" continueOnError="false" enabled="true" xmlns='http://www.sap.com/apimgmt'>
 <!-- count is the number of allowed concurrent connections at any given point in time and ttl refers to the time to live value of a connection -->
 <AllowConnections count="1" ttl="10"/>
 <Distributed>true</Distributed>
 <StrictOnTtl>false</StrictOnTtl>
 <!-- the target end point to which this policy should be applied -->
 <TargetIdentifier name="default"/>
</ConcurrentRatelimit>

“AllowConnections” is set to a count of 1 and a time-to-live of 10 seconds, meaning that only 1 concurrent connection is allowed, and that a connection counter expires after 10 seconds.
The “Distributed” element means that the ConcurrentRateLimit counter is shared across all API proxies that use the same backend as Target. This is very useful if multiple API proxies are using the same target for different purposes.
The “TargetIdentifier” specifies the target for which the policy applies. This is defined in the route rules:
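As a conceptual illustration only (the exact element names in the SAP export format may differ), a route rule pointing at the “default” target looks roughly like this in the Apigee-style configuration that SAP API Management builds on:

```xml
<!-- Conceptual sketch: route all matching traffic to the "default" TargetEndpoint -->
<RouteRule name="default">
    <TargetEndpoint>default</TargetEndpoint>
</RouteRule>
```

The name attribute here matches the value of TargetIdentifier in the policy above.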

Some more advanced options are available and are documented online.

5- Proxy configuration

So far we haven’t done much, apart from adding one policy to our proxy. As you may have experienced yourself, you cannot save your API proxy, because the UI expects the ConcurrentRateLimit policy to be attached to 3 flows. But since we cannot attach the policy to the error flow from the UI, we’ll do this offline.

First of all, let’s detach the policy from the TargetEndPoint flow. To do this, simply use the “-” button.

You can now save your API Proxy. Simply click on “Update” and “Save”.

It’s now time to get our hands dirty within the API Management solution 😉
On the main screen of your proxy, click on the “” button in the upper right corner and select “Export”.

The zipped file you get is the default format for an SAP API proxy.
Unzip its content to your drive and open the “APIProxy” folder.
At the root of the folder, you’ll find the overall configuration of your proxy:

Open it in your preferred editor (Atom is a great editor, for instance) and have a look at it: you will see all your proxy resources, targets, policies, etc.

Now let’s have a look at our “default” TargetEndPoint definition, to which we need to attach the policies.
Open the “APITargetEndPoint” folder and open the “default.xml” file.
It should be pretty empty, i.e. contain only the default XML configuration, since we have no policies attached to the default target endpoint:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<TargetEndPoint>
    <name>default</name>
    <provider_id>SAPDevSystemES4</provider_id>
    <isDefault>true</isDefault>
    <relativePath>/sap/opu/odata/iwbep/GWSAMPLE_BASIC</relativePath>
    <properties/>
    <faultRules/>
    <preFlow>
        <name>PreFlow</name>
        <request>
            <isRequest>true</isRequest>
            <steps/>
        </request>
        <response>
            <isRequest>false</isRequest>
            <steps/>
        </response>
    </preFlow>
    <postFlow>
        <name>PostFlow</name>
        <request>
            <isRequest>true</isRequest>
            <steps/>
        </request>
        <response>
            <isRequest>false</isRequest>
            <steps/>
        </response>
    </postFlow>
    <conditionalFlows/>
</TargetEndPoint>

First of all, let’s add the “DefaultFaultRule” flow to the target endpoint.
To do so, copy and paste the following under the <faultRules/> element:

    <defaultFaultRule>
        <name>defaultfaultRule</name>
        <alwaysEnforce>true</alwaysEnforce>
        <steps>
            <step>
                <policy_name>ConcurrentRateLimit</policy_name>
                <condition> </condition>
                <sequence>1</sequence>
            </step>
        </steps>
    </defaultFaultRule>

Now, let’s add the policy to the request and response flows (of your default target endpoint):
To do so, in the preFlow request and the postFlow response, replace the empty element “<steps/>” with the following code:

            <steps>
                <step>
                    <policy_name>ConcurrentRateLimit</policy_name>
                    <condition></condition>
                    <sequence>1</sequence>
                </step>
            </steps>

Be careful to avoid typos, since there is no consistency check here, unlike in the UI (which is exactly what UIs are made for…).

Your end result should look like this:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<TargetEndPoint>
    <name>default</name>
    <provider_id>SAPDevSystemES4</provider_id>
    <isDefault>true</isDefault>
    <relativePath>/sap/opu/odata/iwbep/GWSAMPLE_BASIC</relativePath>
    <properties/>
    <faultRules/>
    <defaultFaultRule>
        <name>defaultfaultRule</name>
        <alwaysEnforce>true</alwaysEnforce>
        <steps>
            <step>
                <policy_name>ConcurrentRateLimit</policy_name>
                <condition> </condition>
                <sequence>1</sequence>
            </step>
        </steps>
    </defaultFaultRule>
    <preFlow>
        <name>PreFlow</name>
        <request>
            <isRequest>true</isRequest>
            <steps>
                <step>
                    <policy_name>ConcurrentRateLimit</policy_name>
                    <condition></condition>
                    <sequence>1</sequence>
                </step>
            </steps>
        </request>
        <response>
            <isRequest>false</isRequest>
            <steps/>
        </response>
    </preFlow>
    <postFlow>
        <name>PostFlow</name>
        <request>
            <isRequest>true</isRequest>
            <steps/>
        </request>
        <response>
            <isRequest>false</isRequest>
            <steps>
                <step>
                    <policy_name>ConcurrentRateLimit</policy_name>
                    <condition></condition>
                    <sequence>1</sequence>
                </step>
            </steps>
        </response>
    </postFlow>
    <conditionalFlows/>
</TargetEndPoint>

Save your file and make sure to update/recreate the ZIP file exactly as you downloaded it.
Now go back to your API proxy list and “Import” the API proxy, i.e. the ZIP file.

Your existing API proxy will be overwritten, i.e. updated, with your modified one.

6- Check your changes and test

Navigate into your proxy, and make sure that the ConcurrentRateLimit policy has been added to the request and response of the Target EndPoint flows.

As I said before, the DefaultFaultRule flow is not visible in the UI, but we know the configuration is fine.

In order to test the proxy, the easiest way is to use POSTMAN.
I will assume you know the basics about POSTMAN, so here is only a brief description of how to test:
– create an API call to your proxy,
– save it in a specific folder (it should be the only API call in that folder),
– open the “Runner” screen of POSTMAN (button top left) and create a test run on your folder,
– set the Iteration to 100 and the delay to 0,
– do this again, so that you have 2 parallel windows open,
– start the two test runs and observe:


As you can see, the Concurrent Rate Limit policy works fine (503 errors), since we are opening more than 1 connection within 10 seconds.

Conclusion

Traffic management is not just another feature: it is one of the most critical ones. You need to protect your backend systems from a throughput and bandwidth perspective, which you can easily do with SAP API Management.
Adding features such as caching will not only protect your backend services, but also serve responses much faster, making all of your API consumers happy!

Last but not least, many thanks to Andreas Krause, one of our top consultants for SAP API Management, who helped me out on this one!

Please feel free to reach out for any comment or question you may have.
