What is Docker?

October 13, 2017


Docker is a technology for software containers: small, lightweight virtual environments that support low-overhead system virtualization. A Docker container is a bit like a virtual machine, but rather than being an entire standalone operating system, the container shares some elements of its host operating system. Containers can have big benefits in terms of performance versus conventional virtual machines, so they are popular with developers and operations specialists, who use them to help standardize software production and deployment.

Docker containers are designed to function like a distributable, lightweight operating environment. Bundling elements of the operating system and other packages into the container can remove many of the dependency issues associated with sharing applications, and stop the problem of ‘well, it works on my machine’. To keep Docker containers lightweight and quick, they are not usually packaged with an entire operating system, only the minimum libraries and dependencies needed to make an application work. The high-performance, low-footprint nature of Docker containers means that host machines can often run several containers at once without problems.
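As an illustrative sketch, a minimal Dockerfile (the recipe Docker builds a container image from) packages just a slim base image, the application's dependencies and the application itself; the file and package names here are hypothetical:

```dockerfile
# Start from a slim base image rather than a full OS install
FROM python:3-slim

WORKDIR /app

# Install only the libraries the application actually needs
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Add the application code itself
COPY app.py .

# The command the container runs when started
CMD ["python", "app.py"]
```

Building this with `docker build` produces a self-contained image that can be run on any machine with Docker installed, without pulling in anything beyond what the application needs.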

Docker has become hugely popular in recent years, but it builds on a fairly long history of Linux containers, which first came about in the early 2000s. Docker itself arrived in 2013, but took several years to reach maturity. One of the attractions of Docker is that it makes containers easy to develop and share, provides tools and guides to get you started with containerisation, and has worked with other companies to bring standardisation to the container landscape.

How Can Docker Help?

The low-overhead virtualization provided by containers has proven popular among developers and system administrators. The use of containers can reduce dependency problems, as well as helping support the deployment of micro-services.


Docker containers can help developers by bringing standardization to their development environments. This standardization can help in a number of ways:

  • Simplified code pipelines: Once an app is created in a container, the container can easily be passed between development, testing and production. The physical machines may change, but the application environment stays the same.
  • Reduced dependency problems: Containers can help solve the ‘it works on my machine’ problem. Rather than just sharing your application, you share the entire environment it needs to run.
  • Application isolation: Containers make it easy to run multiple versions of an application or a library on the same machine, without them getting in each other’s way.
  • Productivity: If your code is in a container, it’s possible to automate the process of setting up your working environment, and easy to pass this on to a new developer when they start on a project. No more spending ages getting new developers up and running.
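The application-isolation point can be sketched with a hypothetical docker-compose.yml that runs two versions of the same service side by side, each in its own container with its own libraries (the service names, image tags and ports are all illustrative):

```yaml
services:
  api-v1:
    image: example/api:1.0   # hypothetical image tag
    ports:
      - "8081:8080"          # old version on one port
  api-v2:
    image: example/api:2.0   # hypothetical image tag
    ports:
      - "8082:8080"          # new version alongside it
```

Both versions share the host's kernel but nothing else, so their dependencies cannot conflict.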


Lightweight containers can also benefit ops and software delivery.

  • Environment consistency: Containers make it possible to carry the same application environment from development through to production, avoiding many of the dependency issues that a purely physical deployment would bring.
  • Rapid deployment: Containers can be started up and shut down in seconds, allowing application infrastructure to be scaled rapidly. By comparison, virtual machines can take minutes to boot, and physical hardware can take days to provision.
  • Streamlined delivery: The speed with which containers can be deployed can also streamline the delivery of updates, bug fixes and new features.

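As a sketch of the rapid-deployment point, a short Docker CLI session (assuming a local Docker daemon; nginx:alpine is just a convenient public image) shows the whole lifecycle taking seconds:

```
# Start a web server container in the background
$ docker run -d --name web -p 8080:80 nginx:alpine

# Check that it is running
$ docker ps --filter name=web

# Stop and remove it just as quickly
$ docker stop web
$ docker rm web
```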

The benefits to developers and ops also have the potential to carry over to enterprise users, although uptake of Docker at the enterprise level has not been huge. Many enterprise customers are still getting to grips with conventional virtual machines and are unlikely to move away from them right away. It may be that newer products and services, not burdened by legacy IT procurement, will be the early adopters of containerization.

Alternatives To Docker

Docker does not have the container market to itself, so it pays to be aware of some of the alternatives and competitors in the virtualization world.

Some other approaches to containerization include:

  • Unikernels
    • Very specialised, lightweight virtual machines.
    • Can reduce complexity and improve portability and security
    • However, development and deployment tooling is still emerging (though this may be helped by Docker's acquisition of Unikernel Systems)
  • LXD
    • Developed by Canonical (the company behind Ubuntu)
    • Focussed more on running full system environments (VM-style workloads) than individual apps.
    • Built and operated with the same tools as traditional VMs, but can have runtime performance similar to containers.
  • OpenVZ
    • A container platform for running complete operating systems
    • Shares the host OS (like Linux containers), so much faster and more efficient than traditional virtual machines
    • One of the oldest container platforms still in use today (goes back to 2005)
  • Rkt
    • Originally emerged to tackle security concerns with early versions of Docker
    • An open source, lightweight container runtime developed by CoreOS, built on the Linux kernel's container features
    • Designed for providing infrastructure to clustered deployments
    • Builds on the original premise of the container that Docker popularised
  • Conventional virtual machines
    • Full, standalone operating systems running on virtualized hardware
    • Heavier and slower to start than containers, but offer strong isolation between workloads
    • Still the default choice for many enterprises getting started with virtualization