In software development, you can use containers to package software into standardized units and isolate it from its runtime environment. This approach ensures that an application always runs the same, regardless of the underlying infrastructure.
Software containers, as used by the open-source development platform Docker, bundle an application together with all the resources it needs at runtime – for example, libraries, tools, and configuration files. Unlike virtual machines, however, Docker containers do not include a complete operating system, which makes them far more lightweight and flexible.
On Docker Hub, the most important public Docker registry, you can find a wide range of prebuilt Docker images, which become containers at runtime.
Why Use Containers in DevOps?
In a DevOps culture, colleagues from two different task areas work closely together: developers, who build applications, and IT operators, who run them. Whenever multiple parties collaborate on software development and deployment, however, slight differences between their environments can lead to conflicts: just because an application runs perfectly fine on a developer's machine does not mean it will also run on an operator's machine – and this can cause a lot of trouble.
This is where containerization comes into play: a container isolates software from its runtime environment, so it can be deployed anywhere – which greatly simplifies the DevOps interplay.
Following the analogy of real-world freight containers, developers are responsible for what is inside a container, while IT operators handle its transport and processing. To do their jobs effectively, IT operators don't need to know in detail what's inside the containers they work with. Instead, they receive an image of the application and can focus on their actual task areas, such as monitoring, infrastructure, and scalability.
Why are Containers Key for CI/CD?
In continuous integration (CI) and continuous delivery (CD), containerization offers major advantages as well: containers significantly simplify integration tests and standardize CI/CD through Docker images.
While CI builds alone can only give feedback on the correctness of a code change in isolation, integration tests combine several software components to check their interaction and communication. The test environments such tests require are usually highly individual and product-specific, which makes them expensive to set up; on top of that, the tests themselves tend to be lengthy. Docker images address both issues: since the containers already carry the application's runtime environment, you can keep your CI/CD server clean and, if needed, run several tests in parallel to save time.
When you use containers in CI/CD, your pipeline becomes purely descriptive: its flow only defines which Docker images make up its steps, while the actual task execution is delegated to the images themselves. As a result, the Docker images are independent of the pipeline engine, and you can easily adapt your images to the tooling you want to use.
To easily integrate Docker images into continuous delivery pipelines for SAP development projects, have a look at our collection of Dockerfiles, which are optimized for working with project "Piper", our open-source CD offering that can be used as a shared library for the Jenkins automation server.