The Rise of Docker
In 2013, Docker emerged into the DevOps world as a fast-rising containerization technology. It was popularized by its ability to speed up continuous deployment through easy packaging and shipping of applications. Docker is an open source tool that packages an application together with its dependencies, such as binaries, libraries and configuration files, in a container; that container can then run on any Linux server without compatibility issues.
Containerization is quite an old concept, but Docker brings several new things to the table that earlier technologies did not.
- Docker is designed to integrate with most of the popular DevOps tools, such as Chef, Puppet, Ansible and Jenkins.
- Docker makes it easy for developers to replicate their production environments as ready-to-run containerized applications, so they can work more efficiently.
- Docker enables flexibility and portability by allowing applications to run on laptops, on in-house servers, and in public or private clouds, which makes managing and deploying applications much easier.
- Docker implements a high-level API to provide lightweight containers that run processes in isolation.
Nowadays, it's mainly used by developers and system administrators to build, ship and run distributed applications as part of DevOps workflows.
Concept of Containerization
Docker was primarily developed for Linux; it uses the resource-isolation features of the Linux kernel. If you are familiar with virtualization on Linux, the concept of containerization is easy to understand.
Containerization = Virtual partitioning feature of Linux + User friendly API
| Virtualization | Containerization |
| --- | --- |
| Applications require a full instance of the operating system. | Applications share the operating system with the host, so boot and shutdown are very fast. |
| A hypervisor manages the virtual partitions; it is relatively heavy and adds compute overhead. | The Docker daemon monitors and controls containers through the Docker API or the command line. |
| Processes executed through the hypervisor incur overhead. | Processes run natively on the host, enabling low CPU and memory overhead. |
Let’s dive into Docker’s architectural components and their relationship to each other. Below is an example of how the Docker architecture works with DevOps tools.
- The Docker client (docker) is the interface that lets the user communicate with the Docker daemon through its REST API (HTTP requests).
- The Docker daemon (dockerd) runs on the host machine and handles requests for services (for example, building and storing images, and creating, running and monitoring containers).
- A Docker registry stores Docker container images, with public or private access permissions.
- A Dockerfile contains the instructions to build a Docker image.
- A Docker image is a read-only template with instructions for creating a Docker container (running an image brings it to life as a container).
- A Docker container is a running instance of an application. Multiple containers can run from the same image, and they can be created, started, stopped, moved or deleted using the Docker API or the command line.
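A quick command-line sketch ties these components together; it requires a running Docker daemon, and the image and container names here are illustrative:

```shell
# Pull an image from a registry (the public Docker Hub by default)
docker pull nginx:alpine

# Run a container from that image in the background, with a chosen name
docker run -d --name web nginx:alpine

# List running containers, then stop and delete the one we started
docker ps
docker stop web
docker rm web
```

The same lifecycle operations are also available through the daemon's REST API, which the `docker` client calls under the hood.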
It is also important to note that Docker relies on the following operating system features:
- Namespaces ensure that a process running in a container cannot see or affect processes running outside the container.
- Control groups (cgroups) provide resource accounting and limiting, constraining how much CPU, memory and I/O a container may use.
- UnionFS (union file system) provides the building blocks of containers: it stacks file-system layers, which keeps images lightweight and fast to build.
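These kernel features surface directly in the CLI; for example, cgroup limits can be set per container with flags such as `--memory` and `--cpus` (the values and names below are illustrative, and a running Docker daemon is assumed):

```shell
# Start a container whose cgroup caps it at 256 MB of RAM and one CPU
docker run -d --name limited --memory 256m --cpus 1 nginx:alpine

# Inspect the memory limit (in bytes) that Docker recorded for the container
docker inspect --format '{{.HostConfig.Memory}}' limited
```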
Docker with DevOps
As shown in the example below, Docker can be integrated with continuous integration tools, like Jenkins, to put orchestration in place, enabling continuous deployment. To run any application with Docker, you always follow two basic steps:
- Create a Docker image, using a Dockerfile.
- Run a Docker container, using that image.
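The two steps above can be sketched as follows, assuming a trivial Python application; `app.py` and the image name `myapp` are illustrative, and a running Docker daemon is required:

```shell
# A minimal Dockerfile: base image, working directory, app code, start command
cat > Dockerfile <<'EOF'
FROM python:3.12-slim
WORKDIR /app
COPY app.py .
CMD ["python", "app.py"]
EOF

# Step 1: create the Docker image from the Dockerfile in the current directory
docker build -t myapp:latest .

# Step 2: run a container from the image (--rm removes it when it exits)
docker run --rm myapp:latest
```

In a Jenkins pipeline, these same two commands typically run as build stages, so every commit produces a freshly built image that is then deployed as a container.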
Today, most organizations are undergoing digital transformation but are struggling to reconcile legacy applications with modern infrastructure. If your organization is seeking to make applications and workloads more portable and distributed in a standardized and effective way, then Docker is a great solution. Docker enables true independence between applications and infrastructure using lightweight containerization techniques.