In this series of documents, I will introduce you to the concepts and ideas that sit behind SAP’s API Management offering. We’ll start by looking at the “the big picture”, and then as we progress through the documents, we’ll dive deeper into the details so you can see how all the pieces fit together.
By the end of this series of documents, you will be able to build a real-life API Proxy that exposes a Web Service from a backend system.
So, let’s start with the 30,000-foot overview…
Where We’ve Come From
It has long been known that lots of powerful business functionality lies buried deep inside every SAP system. However, the challenge has always been how to get at that functionality in a consistent and easy-to-consume manner.
Back in the mid-nineties, the Remote Function Call (RFC) interface arrived on the scene (1994 if my memory serves me correctly). Originally, the RFC interface was accessed from outside the SAP system via a set of C libraries that allowed an external C program to call any ABAP function module that had the “Remote Enabled” flag switched on.
This was certainly a good start, but adoption was limited for two simple reasons: first, in order to make use of this interface you had to be a C programmer, and second, you had to have some internal knowledge of the SAP system in order to understand the data being sent and received.
Then, the libraries for accessing the RFC interface were expanded to include Visual Basic and a newfangled language from Sun Microsystems called “Java”. For a while, it looked like everyone was happy. Visual Basic and its MS Office counterpart “VB for Applications” were widely used, and Java was growing in popularity. But RFC-based access to SAP still imposed a barrier to adoption: RFC was a proprietary protocol and required the external developer to have at least some knowledge of how an ABAP function module works.
But the mid-nineties saw the invention of another technology that has now totally transformed the face of both the retail and business computing world – the World Wide Web (now often referred to by non-technical people as “The Interwebs”). Quite soon, businesses began to see the benefits of giving customers access to their systems through these new bits of (not very compatible) software called browsers.
But as with any emerging technology, there’s always a scramble for market domination and whoever wins this scrap tends to be able to set the technical standards for how that software should behave.
All of these changes presented SAP with a significant challenge because at that time, the ABAP kernel had no way to communicate using any protocol other than DIAG (used by SAPGUI) and CPIC (used for Remote Function Calls). Back then, ABAP systems simply could not speak this newfangled protocol called HTTP.
So to get the ball rolling, SAP implemented a protocol translation layer called the Internet Transaction Server (ITS). The main job of this software was to translate HTTP messages arriving from browsers into either the DIAG protocol used by SAPGUI or the RFC protocol, then to translate the outbound response from the SAP system back into HTML for the browser.
Shortly after the ITS was released, and in parallel to its development, a project started to allow an ABAP kernel to send and receive XML-based messages using the HTTP protocol. This then became the foundation of what is now the Internet Communication Framework (ICF).
Step forward now to the Millennium Year, when a researcher at the University of California, Irvine by the name of Roy Fielding was granted a doctorate for his thesis Architectural Styles and the Design of Network-based Software Architectures (available here https://www.ics.uci.edu/~fielding/pubs/dissertation/top.htm if you’re curious). From this thesis came the now widely popular concept of Representational State Transfer, or REST.
The architectural principles of REST are now taken to be the de facto design philosophy behind all well-designed Web software. Unfortunately, the term “RESTful service” has been bandied around so much nowadays that some people talk of REST as if it were an alternative protocol to HTTP. “Yes, we have two types of service, HTTP and REST”.
Which reminds me of a scene from the Blues Brothers: Classic Movie Line #5 – YouTube
But anyway, I digress…
All through the browser wars of the late nineties and early noughties, people were experimenting with different protocols. During this time, SOAP-based Web Services started to become widely used (in spite of their dubious grasp of the principles of REST).
All this to say that everyone knew that backend business systems contained lots of useful information and functionality – it’s just that nobody could agree on a single, unified protocol through which these backend systems could be accessed.
And even today, people still can’t agree…
Leveling The Playing Field
Amid all of this technical maelstrom of conflicting or ambiguous interface standards, and the petty squabbles over my-interface-is-better-than-your-interface, various companies decided to sidestep the whole issue and provide software with which anyone can create their own standardisation layer. One such company was Apigee.
What SAP has done is to implement the Apigee Edge software as a cloud-based service and then provide all HCP users with access to it via a service called “SAP API Management”.
Using the functionality of API Management, you can develop your own proxies to almost any backend service you like – and there’s no restriction here to access only SAP backend systems. You can access any system that is visible to the public internet and supports HTTP based communication.
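To make that concrete, here is a minimal Python sketch of what a call to such a proxy looks like from a consumer’s point of view. The URL, header name and key value are all made up for illustration – substitute the real values shown in your own API Portal.

```python
import urllib.request

# Hypothetical values -- replace with those from your own API Portal.
PROXY_URL = "https://trial.apim.hana.ondemand.com/v1/products"
API_KEY = "your-application-key"

# API Management typically expects the consuming application's key in a
# header so that it can identify the caller and enforce quotas.
request = urllib.request.Request(PROXY_URL, headers={"APIKey": API_KEY})

# urllib.request.urlopen(request) would now send the call to the proxy,
# which forwards it on to the backend service.
print(request.full_url)
print(request.headers)  # note: urllib normalises the header name's case
```

Because the proxy speaks plain HTTP, any client capable of making an HTTP request – a browser, a mobile app, or a script like the one above – can consume it.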
APIs Are The Fuel That Drives Innovation
Any backend business system (such as an SAP ERP system) is known as a “System of Record”. These systems need to be rock-solid stable simply because large businesses are entirely dependent both on the data they contain and on the functionality they provide. If such a system goes down, it’s quite possible that part (or even all) of the business could stop working!
Consequently, reliability is the number one requirement here. Systems of Record change neither very often nor very quickly – and for good reason.
However, in the brave new world of Agile software development and the accelerating pace of change, the systems with which the end users interact are expected to stay up to date with the latest technological developments. This means new layers of software must be built on top of the Systems of Record that can accommodate the expectation of a more rapid pace of change.
Hence we now see the emergence of Systems of Engagement that sit on top of the Systems of Record. Systems of Engagement change faster than the underlying Systems of Record, but still not fast enough to keep up with the pace of change that occurs with technology innovation. Therefore, we see a third layer of software known as Systems of Innovation.
Systems of Innovation are where the latest technological advances are implemented. The apps in this layer of software are known for their generally short lifespan and high turnover; yet at the same time, these apps still need to access the same old business functionality exposed by the Systems of Record.
Therefore, to provide a standardised interface, SAP API Management sits in between the Systems of Record and the Systems of Innovation.
Enabling Innovation and Bimodal IT
In the world of API Management, the term “Bimodal IT” refers to the practice of managing two separate, yet coherent modes of IT delivery: one focused on stability and the other on agility.
Mode 1 is traditional and sequential, emphasizing safety and accuracy. (Systems of Record)
Mode 2 is exploratory and nonlinear, emphasizing agility and speed. (Systems of Innovation)
Starting Your API Management Service
To gain access to this API Management service, you must at least have a Trial Account on the SAP HANA Cloud Platform. If you don’t have one of these, please go and set one up now…
From your HCP account, select Services from the menu on the left, then scroll down to the section titled “Integration” and click on the “SAP API Management” tile. If this is the first time you’ve accessed SAP API Management, you will first need to enable this service before proceeding.
When the API Management service screen appears, you will see that there are several different links you could select:
- The API Portal. This is used by a proxy developer for creating API Proxies and Products
- The Dev Portal. This is used by the developer of a frontend application when they wish to consume an API Proxy developed by the proxy developer
- Service Configuration. Here there are two further links that allow you to configure the roles and destinations used by API Management
Click on the link to access the API Portal (not the Dev Portal!) and you will see a screen similar to the following. If this is the first time you’ve accessed your API Management tool, then all the counters on these analytic tiles will be zero.
Before we dive into the details of how to build an API Proxy, it is very important that we first understand two things:
- The data objects with which we will be working
- How these data objects are related to each other
Overview of API Management Data Objects
In API Management, you will be working with four different data objects.
The first type of data object is a “System”. A system is simply a logical name for some backend system of record. Each system you define will act as the provider of backend functionality such as Web Services or OData Services.
An API Proxy is the data object that contains all the functionality to be executed when an external user wants to access the backend service. Within the API Proxy you can implement all manner of functionality: checking that the user has not called this API more times than their quota permits, preventing a denial-of-service attack with a tool known as a “Spike Arrest”, translating the incoming request into a SOAP XML payload, performing your own custom-written functionality, and so on.
To be honest, the development of an API Proxy is where 99% of your development effort will go. You can think of the API Proxy as the heart of API Management.
Each API Proxy represents the interface to a single backend service. You cannot create an API Proxy that is a mashup of data received from different backend services. If you want to do that, then you should look at HCP’s OData Provisioning service and not here in API Management – this is the wrong tool for that particular job.
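To give a feel for the kind of policy an API Proxy enforces, here is a simple per-consumer quota check sketched in Python. This is purely an illustration of the idea, not how Apigee actually implements its Quota policy.

```python
import time

class QuotaPolicy:
    """Illustrative quota check: allow at most `limit` calls per consumer
    within a rolling window of `window` seconds."""

    def __init__(self, limit, window=60.0):
        self.limit = limit
        self.window = window
        self.calls = {}  # consumer id -> list of recent call timestamps

    def allow(self, consumer, now=None):
        now = time.monotonic() if now is None else now
        # Keep only the calls that still fall inside the rolling window.
        recent = [t for t in self.calls.get(consumer, []) if now - t < self.window]
        if len(recent) >= self.limit:
            self.calls[consumer] = recent
            return False  # quota exceeded: a real proxy would answer HTTP 429
        recent.append(now)
        self.calls[consumer] = recent
        return True

# A consumer with a quota of 2 calls per minute:
quota = QuotaPolicy(limit=2)
print(quota.allow("app-1", now=0.0))  # allowed
print(quota.allow("app-1", now=1.0))  # allowed
print(quota.allow("app-1", now=2.0))  # rejected: third call inside the window
```

A “Spike Arrest” works on the same principle, but with a much shorter window, smoothing out sudden bursts of traffic rather than enforcing a business-level quota.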
Now this is where the fun starts (as if we aren’t having fun already…)
It is not possible for an external user to consume an API Proxy directly. In order for your API Proxy to be visible to the outside world, you must first add it to a data object known as a “Product”. The purpose of a Product is to group proxies together into units that represent all the services needed to perform some larger unit of business functionality.
You should think of a Product as the unit of API exposure.
The important thing to understand here is that in addition to a Product being able to contain multiple API Proxies, one API Proxy can be added to multiple Products. This results in the formation of a many-to-many relationship.
So remember: one API Proxy can be added to multiple Products, and one Product can contain multiple API Proxies.
Either way, if you do not add an API Proxy to a Product, then no external user will be able to consume that API Proxy – it will remain completely hidden from view.
Finally, we come to the last data object known as an Application.
We must be very careful to understand what is meant here by the use of the term “application”, because in the context of API management, the term “application” does not carry the conventional meaning.
The conventional meaning for the term “application” is a unit of software with which a user interacts in order to perform some business task.
In the context of API Management, an “application” is the unit of API consumption. In other words, if you, as the developer of some external business application, wish to consume the API Proxies exposed through the various Products, you must declare your usage of those Products by creating this data object known as an “application”.
In other words, in API Management, an “application” represents one user’s subscription to a specific set of Products. If a different user wishes to subscribe to the same set of Products, then they must create their own application. This is how API Management is then able to manage the usage of APIs and keep track of things like quotas.
As we’ve already seen with API Proxies and Products, there is a many-to-many relationship between Products and Applications. One Application can contain multiple Products, and one Product can be consumed through multiple Applications.
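Putting the two relationships together, whether an Application may call a given API Proxy comes down to whether it subscribes to at least one Product containing that proxy. Again, this is a purely illustrative sketch with made-up names, not API Management’s internal model:

```python
# Products group proxies; Applications record one consumer's
# subscription to a set of Products.
products = {
    "TravelServices": {"FlightProxy", "HotelProxy"},
    "LoyaltyServices": {"HotelProxy", "PointsProxy"},
}
applications = {
    "alice-mobile-app": {"TravelServices"},
    "bob-dashboard": {"LoyaltyServices"},
}

def may_call(application, proxy):
    """True if any Product the Application subscribes to exposes the proxy."""
    return any(proxy in products[p] for p in applications.get(application, ()))

print(may_call("alice-mobile-app", "FlightProxy"))  # True
print(may_call("alice-mobile-app", "PointsProxy"))  # False: wrong Product
print(may_call("bob-dashboard", "HotelProxy"))      # True: via LoyaltyServices
```

Because every call is attributed to exactly one Application, this is also the level at which API Management can meter usage and enforce quotas per consumer.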
Two Different Perspectives: Development and Consumption
As was described above, there are two different tools used for the development and consumption of API Proxies.
If you are the developer of an API Proxy, then you use the API Portal and are concerned with creating Systems, API Proxies and Products.
However, if you are the developer of some front-end application such as an iPad app, then you will be looking at the API Proxies from the perspective of a consumer, not a developer; therefore, you will use the Dev Portal and be concerned with creating Applications.
In the next document, we will take an overview look at the development of each of these data objects: Systems, API Proxies, Products and Applications.