Docker: portable containers for consistent development and deployment
Docker is an open source platform that automates the deployment of applications inside portable, self-sufficient software “containers”. Python developers can think of it as a virtualenv (a tool for building isolated Python environments) that works for any application, not just Python. It is especially useful when you need to run a legacy application on a newer host system, or to test an application against several different versions of its dependencies.
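For example, testing code under an older Python without touching the host system can be as simple as pulling a tagged image. A hedged sketch (the `python:2.7` image tag and the project layout are assumptions):

```shell
# Run a one-off Python 2.7 interpreter in a throwaway container;
# --rm deletes the container when the process exits.
docker run --rm -it python:2.7 python --version

# Mount the current project into the container and run its test
# suite against that interpreter, without installing Python 2.7 locally.
docker run --rm -v "$(pwd)":/app -w /app python:2.7 python -m unittest discover
```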
Docker enables developers and sysadmins to quickly assemble apps from components and smooths the handoff between development, QA, and production environments by adding a layer of abstraction and automation on top of operating system–level virtualization on Linux. “Dockerized” apps can be shipped and run anywhere: on laptops, bare metal, OpenStack clusters, data center VMs, or any cloud.
Why do you need Docker, and what is so special about it? Simplicity and speed, of course. Web development is full of complexity, because the developer must coordinate front-end and back-end languages, a database, an OS, dependencies, and other platform-related concerns. Docker's containers address this directly. Containers are similar to virtual machines, but they aim at process isolation and containment: resource-isolation tools detach an app's view from its operating environment, so developers can put an application into a standardized container and ship it to any environment they need. Essentially, Docker solves dependency problems by packaging the dependencies along with the application into a container. No more conflicting or missing dependencies, no more platform differences. For instance, spinning up a Plone instance using Docker can reduce your headaches immensely.
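The packaging idea above is expressed in a Dockerfile. A minimal sketch, assuming a Python web app with its dependencies listed in a `requirements.txt` (the base image, file names, and start command are assumptions):

```dockerfile
# Start from a fixed, versioned base image so every build sees the same OS.
FROM python:2.7

# Copy the dependency list first, so Docker can cache the install layer
# and skip reinstalling when only application code changes.
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt

# Copy the application code itself.
COPY . /app
WORKDIR /app

# The container starts the app the same way on any host.
CMD ["python", "app.py"]
```

Because the dependencies are installed inside the image at build time, the resulting container runs identically on a laptop, a data center VM, or a cloud host.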
Docker consists of two main components: Docker Hub and Docker Engine. Docker Hub is a cloud service for sharing apps and automating workflows. Docker Engine is a portable runtime and packaging system that provides standardized environments for development and flexible workload deployment, so applications are not tied to any particular infrastructure technology. Moreover, the Engine's lightweight runtime lets sysadmins scale quickly in response to the slightest changes in demand. Together, these features let users build and ship higher-quality applications quickly and reliably, abstract away differences between OS distributions so they can deploy on any infrastructure, and track changes and dependencies to manage applications more efficiently.
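The two components meet in a simple build-push-pull workflow. A hedged sketch (`myuser/myapp` is a hypothetical Docker Hub repository, and the port mapping is an assumption):

```shell
# Build an image with Docker Engine from the Dockerfile in the current directory.
docker build -t myuser/myapp:1.0 .

# Share the image through Docker Hub.
docker push myuser/myapp:1.0

# Any other machine running Docker Engine can now pull and run the same image.
docker pull myuser/myapp:1.0
docker run -d -p 8000:8000 myuser/myapp:1.0
```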
Recently the Docker platform was enhanced to orchestrate multi-container distributed applications, turning apps into interoperable, independently iterable components that can run anywhere. Docker supports their dynamic lifecycle with three platform services:
- Docker Machine extends the portability of distributed apps even further: flexible host provisioning speeds up iteration and compresses the development-to-deployment cycle.
- Docker Swarm was developed specifically to support the continuous lifecycle of applications and to keep them performant and highly available. By automatically allocating resources and scheduling container workloads, Swarm removes inefficient, error-prone manual resource management.
- Docker Compose turns the complex, time-consuming procedure of deployment into a simple one. With a small YAML configuration file and a few keystrokes, you can assemble an app from discrete Docker containers and deploy it quickly, independently of the underlying infrastructure.
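The Compose configuration mentioned above might look like the following hypothetical `docker-compose.yml` for a two-container app (the service names, port, and database image are assumptions, not a prescribed layout):

```yaml
# docker-compose.yml: a web service plus the database it depends on.
services:
  web:
    build: .            # build the image from the Dockerfile in this directory
    ports:
      - "8000:8000"     # map host port 8000 to the container
    depends_on:
      - db              # start the database before the web service
  db:
    image: postgres     # use an off-the-shelf database image from Docker Hub
```

Running `docker-compose up` then builds and starts both containers with one command.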
One of the biggest advantages of Docker is its modular structure and flexibility. The latest release ships with default implementations of the Docker services, but you can always add or replace a service with a third party's implementation of scheduling, clustering, or anything else written to the orchestration APIs. Docker also provides system health monitoring, real-time analytics, comprehensive audit logs, and pluggable storage backends.
Although the concept of a container in virtualization has been around for years, Docker offers a well-thought-out tool set and a unified API that take care of the underlying technologies, ensure the portability and interoperability of the system and its components, and orchestrate multi-container distributed applications. At its core, Docker remains an open platform for developing distributed applications in containers using any language and any toolchain. For more information, visit docker.com.
If you need a reliable, secure end-to-end hosting architecture to help deploy and manage Docker containers, Project Atomic may be the solution you are looking for. It integrates the tools and patterns of “Dockerized” app deployment with trusted OS platforms. The Atomic tools include:
- Atomic Host is an OS designed to manage Docker containers, built from upstream CentOS, Fedora, or Red Hat Enterprise Linux RPMs; it features reliable atomic upgrades and rollbacks.
- SELinux provides strict access control for containers running in production systems, making use of Linux container namespaces.
- systemd and geard coordinate containers across hosts and provide auditing and secure management.
- Cockpit is a new administration tool for inspecting journals and managing storage and services directly from the browser.
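The atomic upgrades and rollbacks mentioned above are image-based rather than package-by-package. A hedged sketch of the workflow on Atomic Host, using the rpm-ostree tool it is built on (exact output and options vary by release):

```shell
# Fetch and stage a new OS image; the running system is untouched until reboot.
sudo rpm-ostree upgrade

# Reboot into the newly staged image atomically.
sudo systemctl reboot

# If the new image misbehaves, boot back into the previous one.
sudo rpm-ostree rollback
```

Because each upgrade is a complete image swap, a failed update never leaves the host in a half-upgraded state.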
More at projectatomic.io.