Swifty
Product and Topic Expert
This post forms a part of our overall series SAP Business Technology Platform - Software Lifecycle with elements of Continuous Integration, Delivery and Transport Management. If you'd like to follow the overall use case, the Overview post provides a good starting point.

This series and the associated blog posts are a collaboration between jacek.klatt2 and swiftc on behalf of the SAP HANA Database & Analytics, Cross Solution Management team. You will find a video walkthrough at the bottom of this blog post.

 

Introduction


 

To support elements of Continuous Delivery across our scenario, we want to set up a tool to help us configure pipelines.

SAP maintains Project "Piper" to simplify setting up Continuous Delivery. Our end-to-end scenario uses the Project Piper Jenkins image. At the time of writing, Jenkins is the only tool available. However, Project Piper is being revamped with the aim of making it accessible with the tool of your choice, so this is worth keeping an eye on as more tools become available over time.

Jenkins allows us to create pipelines for automating our build, testing and release process. In our scenario, we'll be committing changes to GitHub. These commits will then trigger the testing and transport of our artifacts between the Development and Production environments of our simplified landscape.

 

Preparing Our Docker Image


Before we deploy our Jenkins container, we'll need to make a change to the provided container image. In order to automate the migration of artifacts for SAP Data Intelligence, we need to ensure the container image also includes one important tool: the SAP Data Intelligence System Management Command-Line Client (also known by the helpfully shorter name vctl).

At the time of writing, the most recent version is 1.0. Before you prepare your own image, you should check the SAP Software Center for the most up-to-date version (you will find the tool under the name DATA INTELLIGENCE-SYS MGMT CLI).

Before we continue with our next steps, we have two prerequisites to complete. The first is to install Docker on your machine so that we can create our container images. The second is to create an account on DockerHub, an online repository for container images. Those with privacy concerns should investigate DockerHub Organizations, although this is out of scope for this scenario.
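
Before moving on, it's also worth a quick check that the Docker CLI is installed and the Docker daemon is running. These are standard Docker commands rather than anything specific to this scenario:
# Verify the Docker CLI version and confirm the daemon is reachable
docker --version
docker info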

Docker allows us to use something called a Dockerfile to build our final container images. While these Dockerfiles can become quite involved, the instructions required for our image are reasonably straightforward:
# Uses the DockerHub image ppiper/jenkins-master as a base
FROM ppiper/jenkins-master

# Copy SAP Data Intelligence command line tool (vctl) to the image and set
# PATH variable to expose it
RUN mkdir $HOME/vctl && chmod 777 $HOME/vctl
COPY vctl $HOME/vctl/vctl
ENV PATH="$HOME/vctl:${PATH}"

 

The above code should be put into a file named Dockerfile (no extension). When placed in the same folder as the vctl tool we downloaded earlier, it gives us everything we need to build our Jenkins container image.

The first line of code sets the "base image", the starting point for our container. The line refers to the most recent version of the container image located at ppiper/jenkins-master on DockerHub. From this base image, we then have the necessary instructions to integrate the vctl tool.

 

Building Our Docker Image


Before we build the container image and push it to DockerHub, please note that we performed these actions on a Windows machine using either Command Prompt or Windows PowerShell. For other operating systems, including considerations around the need for sudo, please consult the official documentation.

First, we want to navigate to the folder which contains our prepared Dockerfile and the vctl tool.
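
For example, from Command Prompt or PowerShell (the folder path below is just a placeholder for wherever you saved the two files):
# Change into the folder containing the Dockerfile and the vctl tool (illustrative path)
cd C:\jenkins-image
# List the contents - the Dockerfile and vctl should both be present
dir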

Next, we can build the image locally using a single command. Two things are important to note: first, the image name we choose will become the final name of the image on DockerHub. Second, the trailing full stop matters, as it tells Docker to use the current folder as the build context - please don't forget it.

This process may take a while - please use this time as an opportunity to grab a coffee, catch up on emails or enjoy some sunlight.
docker build -t <dockerhub username>/<image name> .

After we build our Docker image, we can view it in the Docker Desktop application.


Our Jenkins Image in Docker Desktop


Before we upload our container image to DockerHub, we want to test it locally first. Run your container image, assigning port 8080 in the optional settings so that you can test for the Welcome Page at localhost:8080.
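
If you prefer the command line to the Docker Desktop UI, the same local test can be done with a single command. This is a sketch, assuming the image name used in the build step above and that Jenkins listens on its default port 8080 inside the container:
# Run the image locally, mapping host port 8080 to the container's Jenkins port
docker run --rm -p 8080:8080 <dockerhub username>/<image name>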


If you can access the Welcome Page, then we should be ready to go.


 

Uploading to DockerHub


Once we've verified our image locally, we want to push it to DockerHub. This means that the exact container image we've just created will be backed up online, and can be pulled down from there to be used in our setup.

First, we will need to log in with our DockerHub credentials. Using the below command and then providing your username and password will store our credentials for Docker to use when pushing images:
docker login

 

Docker will use your stored credentials to push the container image in a single command.
docker push <dockerhub username>/<image name>

 

Once your container image has been successfully pushed to DockerHub, we're almost ready to start building our pipelines.

 

Hosting Your Docker Container


For our scenario, we're hosting our setup on a VM on Azure. However, the question of where to host your own setup is important, and you will want to decide for yourself on a case-by-case basis.

At the time of writing, Windows is not supported for productive use. The minimum recommendations for hosting are as below:

  • OS: Ubuntu 16.04.4 LTS

  • Docker: 18.06.1-ce

  • Memory: 4GB reserved for Docker

  • Available Disk Space: 4GB


On your VM, you will want to run the following command to generate the cx-server script and the server.cfg configuration file, both of which will be used to manage and configure your Jenkins environment.
docker run -it --rm -u $(id -u):$(id -g) -v "${PWD}":/cx-server/mount/ ppiper/cx-server-companion:latest init-cx-server

 

Once this has run, you'll want to edit the server.cfg file to point to our customized Docker Image.

Under the heading Build Server Configuration, we need to uncomment the docker_image line and use it to point towards our custom Docker image, as below.
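
The relevant part of server.cfg would then look roughly like the following sketch - the exact comments depend on the version of the generated file, and the image name is whatever you pushed to DockerHub earlier:
## Build Server Configuration
# Custom Jenkins image containing the vctl tool
docker_image="<dockerhub username>/<image name>"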

 


 

Save the server.cfg file, then it's time to start our server.

 

Initializing Our Server


Once we've pointed the config towards our custom image, it's time to start our server:
chmod +x ./cx-server
./cx-server start

The server will be started using our custom image, and initial admin credentials will be generated. We'll need to retrieve those initial credentials:
./cx-server initial-credentials

We should now be able to access the Jenkins Dashboard through port 80 of the host (this can be changed in server.cfg).
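
If port 80 is already in use on your host, the port can be adjusted in server.cfg before starting the server. As a sketch (check your generated file for the exact parameter name and default):
# Expose Jenkins on host port 8080 instead of the default port 80
http_port="8080"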


We can reach the Jenkins Dashboard


 

As a troubleshooting tip, we can run the following command to get information on which images are running, which ports are in use and the container IDs:
./cx-server status

 

Before we're ready to use our Jenkins instance, we want to log on with the admin credentials we retrieved earlier and change the initial admin password. We can do this through Manage Jenkins -> Manage Users -> admin -> Configure.

We will also need to install one plugin to ensure that our pipelines can run correctly. We can do this by visiting Manage Jenkins -> Manage Plugins -> Available, then navigating to the Workspace Cleanup Plugin. We'll tick the check box to the left, and then click Download now and install after restart.
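
As an alternative to the UI, the plugin could also be installed through the standard Jenkins CLI. This is a sketch, assuming the dashboard is reachable on port 80 and using the admin credentials retrieved earlier; the plugin ID for the Workspace Cleanup Plugin is ws-cleanup:
# Download the Jenkins CLI jar from the running instance
curl -O http://localhost:80/jnlpJars/jenkins-cli.jar
# Install the Workspace Cleanup Plugin and restart Jenkins once complete
java -jar jenkins-cli.jar -s http://localhost:80/ -auth admin:<admin password> install-plugin ws-cleanup -restart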

This plugin can be called from our pipelines to clean up workspaces, which is handy for clearing old files before a run.

Wrap Up


Throughout this blog we've covered a lot of ground. First, we created a Jenkins container image capable of connecting to SAP Data Intelligence using the vctl tool.

We then stored our image on DockerHub. Finally, we used the Project Piper cx-server utility to run our custom container image on a VM.

I hope that you've found this blog helpful, and I welcome your comments and questions below. To get an understanding of our end-to-end scenario, you can view our Overview blog post.

 

Video Walkthrough




Note: While I am an employee of SAP, any views/thoughts are my own, and do not necessarily reflect those of my employer.