
Streaming log data from a Jenkins pipeline to Elasticsearch and visualizing it in a Kibana dashboard

In this article, I will demonstrate the installation of the open-source Elasticsearch, Kibana, and the Logstash plugin, and show how to stream the log data of a job's build in a Jenkins pipeline.

The Logstash plugin streams the log data from a Jenkins instance to an indexer. In the current scenario, I use Elasticsearch, so the build logs for a job can be streamed to Elasticsearch and eventually visualized in a Kibana dashboard.

Prerequisites:

Install the open-source Helm chart for Elasticsearch from the Elastic Helm charts repository.

Similarly, install the Helm chart for Kibana from the same repository. While installing Kibana, make sure to configure 'elasticsearchHosts' and 'elasticsearchURL'. These two parameters refer to the Elasticsearch instance URL and its hostname, and can be set on the command line, for instance as shown below:

helm install --name kibana elastic/kibana --set elasticsearchHosts="http://elasticsearch-master:9200" --set elasticsearchURL="http://elasticsearch-master:9200"

Lastly, install the Logstash plugin in the Jenkins instance. To do so, log in to the Jenkins instance -> Manage Jenkins -> Manage Plugins, as shown below:

Manage Plugins -> Available -> Logstash plugin

After installing the plugin, you should see it under the Installed tab in Manage Plugins.

Configuration:

After successfully installing the plugin, go to the global configuration of the Jenkins instance and configure the Logstash plugin with the indexer (in the current scenario, the Elasticsearch endpoint) and, if required, the credentials to access it. Please follow the steps below to do the configuration:


In the above image, configure the URI where Elasticsearch is running. In addition, define the indexer name followed by the instance type, for example "logstashtest/jenkins". If it is a protected URL, enter the username and password credentials, then save the configuration.

How the Logstash step works:

After configuring the Elasticsearch endpoint in the Logstash configuration in Jenkins, create a test pipeline with the logstash step to see how the logs produced inside that step are sent to the Elasticsearch indexer. A sample pipeline looks like the following:

pipeline {
    agent none
    stages {
        stage("first") {
            steps {
                timestamps {
                    logstash {
                        echo "hello world 1"
                    }
                }
            }
        }
        stage("second") {
            steps {
                timestamps {
                    logstash {
                        echo "hello world 2"
                    }
                }
            }
        }
    }
}

In the above pipeline, all logs produced inside the logstash step are streamed to the Elasticsearch endpoint.

The logs streamed to Elasticsearch can be viewed by accessing the Elasticsearch endpoint. You can check whether Elasticsearch is up by accessing:

http://localhost:9200

The list of indices can be viewed at the following URL:

http://localhost:9200/_cat/indices

In my case, I have configured the indexer with the name "logstashtest", so I can see all the logs of the build under this index by accessing the following URL:

http://localhost:9200/logstashtest/_search?pretty
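Outside the browser, the same search response can be fetched and unpacked with a short script. The following is a minimal sketch, assuming the index name "logstashtest" and the endpoint "http://localhost:9200" from the setup above; the helper names `fetch_search_response` and `extract_messages` are my own, not part of the plugin:

```python
import json
import urllib.request


def fetch_search_response(url):
    """Fetch a _search response body from the Elasticsearch endpoint above."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def extract_messages(search_response):
    """Collect the streamed log lines from a _search response body."""
    messages = []
    for hit in search_response.get("hits", {}).get("hits", []):
        # The Logstash plugin stores the pipeline log lines under _source.message
        messages.extend(hit.get("_source", {}).get("message", []))
    return messages


# Example (requires the Elasticsearch instance above to be reachable):
# hits = fetch_search_response("http://localhost:9200/logstashtest/_search")
# print(extract_messages(hits))
```

This keeps the network call separate from the parsing, so the extraction logic can be reused on any saved response body as well.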

And the event payload data will look like the following:

{
  "took" : 2,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 1,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "logstashtest",
        "_type" : "jenkins",
        "_id" : "uoOWWWoB3m4VW96z1LI3",
        "_score" : 1.0,
        "_source" : {
          "data" : {
            "id" : "4",
            "projectName" : "test",
            "fullProjectName" : "test",
            "displayName" : "#4",
            "fullDisplayName" : "test #4",
            "url" : "job/test/4/",
            "buildHost" : "Jenkins",
            "buildLabel" : "master",
            "buildNum" : 4,
            "buildDuration" : 478,
            "rootProjectName" : "test",
            "rootFullProjectName" : "test",
            "rootProjectDisplayName" : "#4",
            "rootBuildNum" : 4,
            "buildVariables" : {
              "BUILD_DISPLAY_NAME" : "#4",
              "BUILD_ID" : "4",
              "BUILD_NUMBER" : "4",
              "BUILD_TAG" : "jenkins-test-4",
              "BUILD_URL" : "https://test.sap.com/job/test/4/",
              "CLASSPATH" : "",
              "HUDSON_HOME" : "/var/jenkins_home",
              "HUDSON_SERVER_COOKIE" : "240b1176c4e3f0fd",
              "HUDSON_URL" : "https://test.sap.com/",
              "JENKINS_HOME" : "/var/jenkins_home",
              "JENKINS_SERVER_COOKIE" : "240b1176c4e3f0fd",
              "JENKINS_URL" : "https://test.sap.com/",
              "JOB_BASE_NAME" : "test",
              "JOB_DISPLAY_URL" : "https://test.sap.com/job/test/display/redirect",
              "JOB_NAME" : "test",
              "JOB_URL" : "https://test.sap.com/job/test/",
              "RUN_CHANGES_DISPLAY_URL" : "https://test.sap.com/job/test/4/display/redirect?page=changes",
              "RUN_DISPLAY_URL" : "https://test.sap.com/job/test/4/display/redirect"
            }
          },
          "message" : [
            "hello world 2"
          ],
          "source" : "jenkins",
          "source_host" : "https://test.sap.com/",
          "@buildTimestamp" : "2019-04-26T12:20:17.393+0000",
          "@timestamp" : "2019-04-26T12:20:17.877+0000",
          "@version" : 1
        }
      }
    ]
  }
}

Finally, the required parameters can be extracted from the event payload using the Elasticsearch search API, for example with source filtering.
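As a small illustration of that extraction, the sketch below pulls a few build parameters out of one indexed event. The payload is a trimmed-down copy of the one shown above, and `pick_build_fields` is a hypothetical helper name:

```python
import json

# A trimmed-down copy of the event payload shown above
event = json.loads("""
{
  "_source": {
    "data": {
      "projectName": "test",
      "buildNum": 4,
      "buildDuration": 478
    },
    "message": ["hello world 2"],
    "@timestamp": "2019-04-26T12:20:17.877+0000"
  }
}
""")


def pick_build_fields(hit, fields):
    """Pull selected build parameters out of one indexed event."""
    data = hit["_source"]["data"]
    return {name: data[name] for name in fields if name in data}


summary = pick_build_fields(event, ["projectName", "buildNum", "buildDuration"])
# summary == {"projectName": "test", "buildNum": 4, "buildDuration": 478}
```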

Similarly, the required logs can be visualized in Kibana by accessing the following URL:

http://localhost:5601

and the logs will look like the following:

The required parameters and logs can be extracted using the filters in Kibana.

Ensure that an index pattern is configured in Kibana before analyzing the logs. It is recommended to use the pattern "logstash*"; since we configured our index name as "logstashtest", it matches this pattern. With this in place, we are able to stream the log data from the Jenkins instance to Elasticsearch and finally visualize it in Kibana.
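Instead of clicking through the Kibana UI, the index pattern can also be registered through Kibana's saved-objects API. This is a sketch under the assumption that your Kibana version (recent 6.x or 7.x) exposes that API at `/api/saved_objects/index-pattern`; `build_index_pattern_request` is a hypothetical helper name:

```python
import json
import urllib.request


def build_index_pattern_request(kibana_url, pattern, time_field="@timestamp"):
    """Build the HTTP request that registers an index pattern in Kibana."""
    body = json.dumps({"attributes": {"title": pattern,
                                      "timeFieldName": time_field}}).encode()
    return urllib.request.Request(
        kibana_url.rstrip("/") + "/api/saved_objects/index-pattern",
        data=body,
        # Kibana rejects write requests that lack the kbn-xsrf header
        headers={"kbn-xsrf": "true", "Content-Type": "application/json"},
        method="POST",
    )


# Example (requires the Kibana instance above to be running):
# urllib.request.urlopen(build_index_pattern_request("http://localhost:5601", "logstash*"))
```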

2 Comments
  • Hello Srinikitha Kondreddy,

    Thanks for this interesting post.

    I have a question about the Logstash configuration: how do I add a grok filter of the Logstash plugin in Jenkins?

    Thanks in advance!

  • Hi Chaymaa,

    Thanks for your question. In my case, I used Elasticsearch as the output channel and indexer to hold the Jenkins logs. A grok filter is applicable when you use Logstash itself as the indexer to store the logs. I hope this answer helps you.

    Thanks and regards,

    Nikitha