
Google Cloud Platform Backing Services Alerts via SAP Cloud Platform Alert Notification

Pre-read

This blog post is part of a series of blog posts related to the SAP Cloud Platform Alert Notification service.

You can refer to the parent blog post for more detailed information about the service itself.

Let’s take the following situation – we have a solution which we deploy on SAP Cloud Platform. This solution consumes backing services from different hyper-scalers – such as AWS or GCP. Naturally, we would like to have a single approach for receiving alerts from both our solution on SAP Cloud Platform and the hyper-scaler services that it uses. Furthermore, we want to do that in a hyper-scaler-agnostic manner.

In the next couple of blog posts, we are going to show you precisely how you can receive alerts for the backing services from hyper-scalers via SAP Cloud Platform Alert Notification. That way, you are going to have a common alert management approach for both SAP Cloud Platform and your hyper-scalers regardless of which one you use.

In this particular blog post, we are going to concentrate on Google Cloud Platform (GCP).

For the very same task on AWS, you can refer to this blog post.

Prerequisites

To start, you are going to need:

  • An active GCP account
  • An active subscription to SAP Cloud Platform Alert Notification
  • An application deployed on the SAP Cloud Platform Cloud Foundry environment that uses some of the GCP backing services. You can refer to this blog post for more details on this topic.

Setup

In this particular case, we are going to have the setup below.

  1. The example application consists of two modules and lives on the SAP Cloud Platform Cloud Foundry environment. It already sends custom alerts to Alert Notification. You can read more about custom alerts here.
  2. This application uses Redis provided via Google Cloud.
  3. We are using Stackdriver to monitor the backing service.
  4. We are using Google Cloud Functions to post custom alerts to SAP Cloud Platform Alert Notification.
  5. Our sample application also posts its own application-specific custom alerts to Alert Notification. For example, it posts an error alert whenever an exception occurs in its code.
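As an illustration of the last point, such an application-specific custom alert is just a plain Alert Notification event. The sketch below is illustrative only – `buildErrorEvent` and the tag names are hypothetical helpers, not code from the actual sample application; the event field names follow the producer API payload used later in this post.

```javascript
// Hypothetical helper that shapes an application exception into an
// Alert Notification custom event (field names mirror the producer
// API payload used by the Cloud Function later in this post).
const buildErrorEvent = (error) => ({
    eventType: "app.exception",
    severity: "ERROR",
    category: "ALERT",
    subject: "Unhandled exception in the sample application",
    body: error.message,
    tags: { "app:module": "backend" }
});

const event = buildErrorEvent(new Error("Redis connection refused"));
console.log(JSON.stringify(event, null, 2));
```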

Steps to Execute

Follow the steps below to achieve similar results.

Configure your Alert Notification

We are going to start by creating our configuration for Alert Notification. First, we should create a subscription, and we are also going to create a user for basic authentication. That’s right: the Alert Notification REST APIs support both OAuth and Basic authentication methods.

Configure BASIC Authentication Client

  1. Subscribe to Alert Notification if you haven’t done so.
  2. Navigate to your Alert Notification service instance UI
    1. In SAP Cloud Platform Cockpit, navigate to your Cloud Foundry Space -> Services -> Service Instances -> click on your instance
  3. In the menu, select “Service Keys” and then click on the create service key button.
  4. In the popup, give your key a name, for example, “G Integration Keys”, and in the description field copy and paste the following JSON:
    {
        "type":"BASIC"
    }
  5. The generated key contains the client, secret, and endpoint, which we are going to use for our integration.
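Once created, the service key exposes credentials along these lines. This is an illustrative sketch – the exact field names in your key may differ, and the host below is the example endpoint used later in this post:

    {
        "url": "https://clm-sl-ans-live-ans-service-api.cfapps.eu10.hana.ondemand.com",
        "client_id": "<generated client>",
        "client_secret": "<generated secret>"
    }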

Configure Subscriptions

In this section, we are going to use one of the coolest features of SAP Cloud Platform Alert Notification: the import feature. We have preconfigured the subscriptions and actions for you.

  1. Copy and paste the JSON provided below into a text editor.
    {
      "conditions": [
        {
          "name": "AnyCondition",
          "propertyKey": "eventType",
          "predicate": "ANY",
          "propertyValue": "",
          "labels": [],
          "description": ""
        }
      ],
      "actions": [
        {
          "name": "NotifyMeByEmail",
          "state": "ENABLED",
          "labels": [],
          "destination": "your.email@here",
          "description": "",
          "type": "EMAIL"
        }
      ],
      "subscriptions": [
        {
          "name": "GCPRedisAlert",
          "conditions": [
            "AnyCondition"
          ],
          "actions": [
            "NotifyMeByEmail"
          ],
          "labels": [],
          "state": "ENABLED",
          "description": ""
        }
      ]
    }
  2. Replace “your.email@here” with the email address at which you want to receive alerts. If you would like to use a channel other than email, create a new action and attach it to your subscription.
  3. In the Alert Notification UI, click on “Export or Import”.
  4. Copy and paste the modified JSON into the import field and click on “Import”.
  5. Now the only thing left is to confirm your email.
  6. Go to the “Actions” menu and click on the “NotifyMeByEmail” tile.
  7. Click on “Confirm Action” button.
  8. You are going to receive an email with a confirmation code. Copy the code and paste it into the popup. Then click the “Confirm” button.

Note that you can fine-tune the provided subscription. Currently, it accepts any alert coming to Alert Notification. To achieve better filtering, you can play around with the provided condition in the Conditions tab of the menu. For hyper-scaler alerts, we recommend using tags to achieve better results.
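For example, a narrower condition could match only events produced by this GCP integration. The sketch below assumes the CONTAINS predicate and the GCP.EVENT event type emitted by the function later in this post:

    {
      "name": "OnlyGcpEvents",
      "propertyKey": "eventType",
      "predicate": "CONTAINS",
      "propertyValue": "GCP",
      "labels": [],
      "description": "Matches only events coming from the GCP integration"
    }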

 

Configure your GCP

Below is a one-time configuration which you should execute in your GCP account.

We assume that you already have an instance of Redis or some other service that you use in your application.

We are going to use a Node.js-based function from Google Cloud Functions to integrate with SAP Cloud Platform Alert Notification.

Below you can find the code that we have used for our example.

 

'use strict';

const request = require('request');
const util = require('util');

const REQUEST_TIMEOUT = 10000;

exports.postToAns = (req, res) => {
    console.log("Posting to ANS");
    let options = buildOptions(req.body);

    request(options, function(error, response, body) {
        if(error) {
            console.error(`ANS has returned an error:`);
            console.log(util.inspect(error, {depth: null}));
            res.status(500).send({
                message: "Internal server error occurred"
            });
            return;
        }

        console.log(`ANS has processed the event and returned response:`);
        console.log(util.inspect(response, {depth: null}));
        res.status(response.statusCode).json(body);
    });
};

const buildOptions = (body) => {
    let ansBody = convertToAnsPayload(body);

    console.log(`Converted GCP payload to ANS one:`);
    console.log(util.inspect(ansBody, {depth: null}));

    return {
        url: process.env.ANS_SERVICE_API,
        method: "POST",
        auth: {
            user: process.env.CLIENT_ID,
            password: process.env.CLIENT_SECRET
        },
        json: ansBody,
        timeout: REQUEST_TIMEOUT
    };
};

const convertToAnsPayload = (gcpPayload) => {
    console.log(`Received gcp payload:`);
    console.log(util.inspect(gcpPayload, {depth: null}));

    let incident = gcpPayload.incident || {};
     
    return {
        "eventType": "GCP.EVENT",
        "severity": chooseSeverity(incident.state),
        "category": chooseCategory(incident.state),
        "subject": incident.policy_name,
        "body": incident.summary,
        "resource": {
            "resourceName": incident.resource_name,
            "resourceType": incident.resource_name
        },
        "tags": {
            "gcp:incident_id": incident.incident_id,
            "gcp:resource_id": incident.resource_id,
            "gcp:state": incident.state,
            "gcp:started_at": incident.started_at,
            "gcp:ended_at": incident.ended_at,
            "gcp:policy_name": incident.policy_name,
            "gcp:condition_name": incident.condition_name,
            "gcp:url": incident.url
        }
    };
};

const chooseSeverity = (state) => {
    switch(state) {
        case "open": 
            return "WARNING";
        case "closed":
            return "INFO";
        default: 
            return "WARNING";
    }
};

const chooseCategory = (state) => {
    switch(state) {
        case "closed":
            return "NOTIFICATION";
        default: 
            return "ALERT";
    }
};
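The state-to-severity/category mapping above can be condensed as follows. This is a minimal re-statement for illustration, exercised against sample incidents whose fields mirror those read by convertToAnsPayload:

```javascript
// Condensed form of the mapping implemented by chooseSeverity and
// chooseCategory above: a "closed" incident becomes an informational
// notification, everything else is treated as an open alert.
const chooseSeverity = (state) => (state === "closed" ? "INFO" : "WARNING");
const chooseCategory = (state) => (state === "closed" ? "NOTIFICATION" : "ALERT");

const openIncident = { state: "open", policy_name: "redis-uptime-check" };
const closedIncident = { state: "closed", policy_name: "redis-uptime-check" };

console.log(chooseSeverity(openIncident.state), chooseCategory(openIncident.state));
console.log(chooseSeverity(closedIncident.state), chooseCategory(closedIncident.state));
```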

Another option is to use the shared Google Cloud function from here.

Note that this is just a code snippet; you can decide not to use it, to change it, or to proceed with your own implementation.

 

Importing sample function into Google Cloud Platform

  1. In the Google Cloud Console, search for Cloud Functions.
  2. Once you open Cloud Functions, click on the Create Function button.
    1. As a name, put gcpToANS.
    2. Uncheck the Allow unauthenticated invocations checkbox.
    3. Click on the ZIP Upload radio button.
    4. Under ZIP file, click on Browse and upload the provided function (or your own).
    5. Now, in the Stage bucket section, click on the Browse button.
  3. We should now create the bucket which will contain our function.
    1. Click on the + button.
    2. Now it is time to give our bucket a name, for example, ansfunctionsbucket. Then click on the Create button.
  4. Once we have created the bucket, set the Function to execute field to postToAns – this is the Node.js function exported in the zip file.
  5. Now it is time to set our environment variables.
    1. Click on “Environment variables, networking, timeouts and more”.
    2. In the expanded section, click the Add Variable button and set the following variables:
      1. CLIENT_ID – the client which you obtained in the “Configure BASIC Authentication Client” section of this blog post.
      2. CLIENT_SECRET – the secret which you obtained in the same section.
      3. ANS_SERVICE_API – assemble this by copying the url property from the Service Key section and appending /cf/producer/v1/resource-events to it. For example: https://clm-sl-ans-live-ans-service-api.cfapps.eu10.hana.ondemand.com/cf/producer/v1/resource-events
  6. Click on the Create button.
  7. Wait for the function to deploy.
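The ANS_SERVICE_API value can be assembled programmatically from the service key’s url property. A small sketch (the host below is the example from this post; use the value from your own service key):

```javascript
// Assemble the producer endpoint from the service key's `url` property,
// guarding against a trailing slash so the path is joined correctly.
const serviceKeyUrl = "https://clm-sl-ans-live-ans-service-api.cfapps.eu10.hana.ondemand.com";
const ansServiceApi = serviceKeyUrl.replace(/\/+$/, "") + "/cf/producer/v1/resource-events";
console.log(ansServiceApi);
```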

Configuring Stackdriver

  1. Back in the Google Cloud Console, search for Monitoring.
  2. Open Stackdriver and you should see your dashboard.
  3. In the upper left corner, click on “My First Project” and select Workspace Settings.
  4. What we are going to do next is configure a webhook which invokes the function we just created.
  5. Go to Alerting > Notifications > WEBHOOKS.
  6. Click on the Add Webhook button.
  7. Fill in the URL of your function (it can be obtained from the Trigger section of your Cloud Function details). Once you fill in the details, click on Test Connection.
  8. Once the test is finished, click on Save.
  9. Now it is time to configure our notification policy.
  10. Back in the Stackdriver dashboard, click on Alerting > Create Policy.
  11. Click on Add Condition. For this blog post, I will use a very simple uptime alert, as shown in the picture below.
  12. Once this is done, set a threshold as well.
  13. Once this is done, click on Save.
  14. Back in the previous screen, click on Add Notification Channel and select WebHook with Token.
  15. Select the webhook we defined.
  16. Finally, give your policy a name and click Save.
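For reference, the webhook delivers a JSON payload whose incident object carries the fields consumed by the function earlier in this post. The example below is illustrative, not a verbatim capture – all values are made up:

    {
      "incident": {
        "incident_id": "0.abcdef123456",
        "resource_id": "",
        "resource_name": "my-redis-instance",
        "state": "open",
        "started_at": 1577836800,
        "ended_at": null,
        "policy_name": "Redis uptime check",
        "condition_name": "Uptime health check failed",
        "url": "https://app.google.stackdriver.com/incidents/0.abcdef123456",
        "summary": "Uptime check failed for my-redis-instance."
      }
    }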

 

You are good to go! Once your alert is raised (you can test by stopping your Redis instance), you will receive an email notification that looks like this.

 

What’s Next

You can fine-tune and play around with this configuration.

We are working on similar blog posts providing integration with other hyper-scalers. This integration will also evolve over time, and the format and the way we display notifications will become much more appealing.

We are working on many cool features – like fallback channels for alerts, alerts for CPI integration flows, lifecycle management alerts for applications, templating of alerts towards different systems like JIRA, and many more. We are also enhancing our catalogue of alerts by adding more and more services.

 
