
Intro

All modern technologies, platforms, and frameworks have a dedicated tool for programmatic interaction, right? Especially the ones being used extensively these days: Angular, Cloud Foundry, npm, Git, and we could go on forever. Heck, even Apigee has a CLI / “SDK”.

So what’s the pattern then? Well, most of them offer a public API (e.g. a Node module) that can also be consumed through the plain old console.
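This dual "API module + CLI" pattern can be sketched in a few lines of Node (file names and contents are illustrative, not taken from any particular project):

```javascript
// lib.js -- the public API: a plain module that any script can require.
function greet(name) {
    return "Hello, " + name + "!";
}
module.exports = { greet };

// cli.js -- the console entry point, declared under "bin" in package.json:
//   #!/usr/bin/env node
//   const { greet } = require("./lib");
//   console.log(greet(process.argv[2] || "world"));
```

The library carries all the logic; the CLI is just a thin wrapper around it, so both consumers always behave the same way.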

The reason why this has become so widespread is fairly obvious: developers are “lazy”. Deep down you know that everything traces back to that. But it is a special flavor of lazy: we want to be as productive and mistake-free as possible, all while investing the least amount of effort and time. Personally, what really tires me out is repetitive work. I hate doing stuff manually in a UI when I can write some code to do it for me.

So first you might automate the compilation of a project, then the testing, then even the deployment. If you are “lazy” enough, you might even have this process run automatically whenever you check in your code. Hint-hint: I am talking about CI/CD here. There are numerous intrinsic benefits and a ton of blogs about it, so I won’t insist on it further.

Even better, if you make it all open source, other people might build some cool integration with another library or tool (e.g. a Grunt task).

SAP

Well, in the SAP world, CI/CD is rather new, unused, or misunderstood in most areas. Of course, there are exceptions, as SAP technologies are completely heterogeneous in virtually any characteristic (underlying technology, architecture, principles or lack thereof, etc.). Open source is slowly rising, but in my opinion the community is not really there yet.

With this in mind, it’s no real surprise that the SAP API Manager (Cloud) doesn’t have a CLI or an API module of its own (a REST API does exist, though). Somewhat strange, right? The API Manager is all about programmatic access to your systems and data, but it does not have a good interface of its own in this regard.

I expect that SAP will come around and make some kind of tool to help us here. But I don’t really like to wait, especially when it might take a while.

My take on it

Well, as I am “lazy” and currently have to work with the API Manager, I decided that I couldn’t stand doing one more manual re-deploy of the proxies I am working on. Keep in mind that, at the moment, the API Manager Cloud UI does not even allow you to perform the full scope of operations supported by the runtime itself (e.g. defining “fault rules”).

So what did I start building? With the help of a lot of other npm modules, I put together a first version of an open-source CLI and API module for the SAP API Manager. You can find the source code on its GitHub repository and some more information on the wiki.

I called it sapim (shortened form of SAP API Management).

As it’s a first version, only a small set of commands is supported right now. If it gains traction, more will surely come. I won’t dwell on the details of the module; if you are interested, check out the repository and / or wiki. For the remainder of the blog, I will walk through a small sample.

A demo

Let’s build a simple API Proxy for validating JSON objects (you can find the end result on one of my GitHub repositories).

Ok, so we start off by creating an empty API Proxy using the API Manager UI, then adding a JavaScript Callout Policy and exporting the proxy as a ZIP (so we can work on it locally).

For better reusability, we replace some hardcoded “constants” in the API Proxy XML files with placeholders (the proxy name, description, and base URL).

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<APIProxy>
    <name>{{service-name}}</name>
    <title>{{service-name}}</title>
    <description>{{service-description}}</description>
    <!-- ... -->
</APIProxy>
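Such {{placeholder}} markers can later be filled in with a simple string substitution. A minimal sketch of the idea (a hypothetical helper, not the actual sapim implementation):

```javascript
// Minimal sketch of {{placeholder}} substitution (hypothetical helper,
// not the actual sapim implementation).
function applyPlaceholders(template, values) {
    // Replace every {{key}} occurrence with its value, if one was provided.
    return template.replace(/\{\{([\w-]+)\}\}/g, function (match, key) {
        return values.hasOwnProperty(key) ? values[key] : match;
    });
}

console.log(applyPlaceholders("<name>{{service-name}}</name>",
    { "service-name": "sapim-sample" }));
// -> <name>sapim-sample</name>
```

sapim does this substitution for us at packaging time, based on a map of placeholder values (shown later in the Grunt configuration).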

Now, in the callout, we want to do the validation using JavaScript code. A quick Google search turns up the “right” npm module for doing the validation for us. Then we have to write some very simple code for calling it:

// The "validate" npm module; it will be bundled into the callout later on.
const schema = require("validate");

// Build the schema and validate the given data against it.
const validate = data => schema({ /* schema definition... */ }).validate(data);

// "context" is provided by the API Manager's JavaScript policy runtime.
const errors = validate(JSON.parse(context.getVariable("request.content")));

context.setVariable("response.headers.content-type", "application/json");
if (errors.length) {
    context.setVariable("response.status.code", 400);
    context.setVariable("response.reason.phrase", "Bad Request");
    context.setVariable("response.content", JSON.stringify(errors));
} else {
    context.setVariable("response.status.code", 200);
    context.setVariable("response.reason.phrase", "OK");
    context.setVariable("response.content", JSON.stringify({"result": "ok"}));
}

To make sure that we didn’t make any stupid mistakes, let’s build some unit tests with Mocha:

// "run" is a small test helper (found in the repository) which executes the
// callout code with a mocked "context" and returns the variables that were set.
var assert = require("assert");

describe("index.js", function () {
    it("should return OK for simple valid input", function () {
        var result = run({name: "Spet", email: "something@yes.com", 
            address: {city: "Cluj", street: "Brassai"}});

        assert.equal(result["response.status.code"], 200);
    });
    // ...
});
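Such a test needs a mocked version of the API Manager's "context" object. A minimal sketch (hypothetical helper; the real one lives in the sample repository):

```javascript
// Hypothetical stand-in for the API Manager's "context" object, backed by a
// plain map of flow variables.
function makeContext(requestContent) {
    var vars = { "request.content": requestContent };
    return {
        getVariable: function (name) { return vars[name]; },
        setVariable: function (name, value) { vars[name] = value; },
        vars: vars // exposed so the tests can inspect what the callout set
    };
}

var ctx = makeContext('{"name": "Spet"}');
ctx.setVariable("response.status.code", 200);
```

With this, the callout code can be executed in plain Node under Mocha, completely outside the API Manager runtime.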

Of course, if we tried to just paste this code into the API Manager at this point, it would not work because:

  • We use some ES6 features in there.
  • We also use the ‘require’ function.

In the API Manager, we actually need to have all required libraries uploaded as file resources. So one would first think: “Ok, I’ll just download the validate package manually and add it in there!”. Nope, that is a bad idea. Firstly, the package most likely depends on other packages. Secondly, that would be an extra manual step when we want to automate things!

To circumvent this, we can use Babel and Browserify to transpile the code to “old JS” and to bundle up the dependencies. To link everything together, we’ll use Grunt (and some plugins) to build the API Proxy files from the source files. Our Grunt build would do the following steps:

  • Run some small Mocha unit tests.
  • Clean the build output folder.
  • Copy the static API Proxy files (XMLs) to the output folder.
  • Browserify the source code (bundle up all the dependencies).
  • Transpile the code to not use ES6 features anymore.
  • Package the proxy in an archive.
  • Deploy the archive to the API Manager.

The last two steps are done using custom Grunt tasks wrapping the new sapim library:

grunt.registerMultiTask("package", "Package the proxy", function() {
    // this.async() returns Grunt's "done" callback; handing it to .then()
    // marks the task as finished once the promise resolves.
    sapim.default()
        .packageProxyToFile(this.data.src, this.data.dest, 
            !!this.data.placeholders, this.data.placeholders)
        .then(this.async());
});

grunt.registerMultiTask("upload", "Upload the proxy", function() {
    sapim.default().uploadProxy(this.data.src).then(this.async());
});

And we can simply configure and use these tasks as any other Grunt tasks:

    "package": {
        dist: {
            src: "dist/APIProxy",
            dest: "dist/proxy.zip",
            placeholders: {
                "service-name": "sapim-sample",
                "service-description": "Sample API Proxy for using the SAPIM tool",
                "service-base-path": "/sample"
            }
        }
    },
    "upload": {
        dist: {
            src: "dist/proxy.zip",
        }
    }

I won’t go deeper into the Grunt details; you can check them out on the GitHub repository if you are interested. The last step is to add the Grunt task(s) to the package.json:

  "scripts": {
    "deploy": "grunt deploy"
  }
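Behind the "deploy" script there is presumably a Grunt alias task chaining all the steps from the list above in order. An illustrative sketch (the task names are assumptions; the real chain lives in the repository's Gruntfile):

```javascript
// Hypothetical alias task; task names are illustrative.
grunt.registerTask("deploy", [
    "mochaTest",   // run the unit tests first; a failure stops everything
    "clean",       // wipe the build output folder
    "copy",        // copy the static API Proxy XML files
    "browserify",  // bundle the callout code and its dependencies
    "babel",       // transpile away the ES6 features
    "package",     // build the proxy archive (custom sapim-based task)
    "upload"       // deploy the archive to the API Manager (custom task)
]);
```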

After running npm run deploy, we can see in the API Manager UI that our API Proxy has been deployed.

And, just to make sure it works, we can also test it a little with Postman.

Final words

To summarise the added value of the approach presented:

  • npm manages and downloads the dependencies automatically for us.
  • Grunt (together with various plugins and the library that I have built) prepares and deploys the proxy to the API Manager.
  • Mocha runs our unit tests. If a test fails, the deployment is stopped.

Without the automated build / deployment, all the above points would have to be done manually:

  • Manually download and include the libraries that you need.
  • Build the whole thing on your own, or even worse, simply always do it directly in the API Manager UI (and then you have no version control and no unit testing).
  • Either build unit tests and run them manually or simply skip unit tests altogether.

In a nutshell, if we have an automated way of deploying to the API Manager, we can use concepts and tools from other areas (npm, Grunt, Babel, etc.) to make our proxies more robust and easier to develop. The sapim library aims to provide this way, at least until SAP offers us an official tool for doing this.


2 Comments


  1. Elijah Martinez

    Hi Serban,

    Firstly let me say great project, and thank you for sharing with the community your efforts to ease your usage of API Management.  It’s great to see the community getting active, and to hear your feedback.

    One point to mention is that SAP API Management does not currently offer a CLI tool, but it does offer APIs to manage the platform in an automated fashion. While they are not all yet documented you can get started here: https://api.sap.com/shell/discover/contentpackage/APIMgmt

    We hope to provide additional advanced documentation across all available APIs, including transport management soon. In the meantime thank you for providing an alternative!

    Regards,
    Elijah

    1. Serban Petrescu Post author

      Hi Elijah,

      Yep, without the existence of the REST / OData APIs it would not be possible to do a good portion of the things that the CLI is doing. The CLI is meant to complement the APIs that you guys expose and consume them easily.

      Thanks,
      Serban

