How is Docker Used in Production Applications?
Docker containers are a key component of modern digital transformation projects. Docker popularized the containerized application model when it was released in 2013, and it remains the most widely used container engine.
There are five main drivers for using Docker in production applications:
- Docker plays a major role in microservices architectures. The microservices development pattern involves breaking up applications into multiple, independent components, each of which does one thing well. It is natural to run and deploy microservices in containers.
- Docker plays well with DevOps methodologies, making it possible to deploy applications in a repeatable manner, together with all their dependencies, and automate software pipelines using infrastructure as code (IaC).
- Docker containers consume fewer resources than traditional virtual machines, allowing a much higher density of containers per host. Whether running on premises or in the cloud, this generates significant cost savings.
- Docker enables fast-paced agile development, because it can spin up a self-contained environment in just a few seconds. Fast startup time is key to automated testing and deployment at large scale.
- Docker is at the center of an entire ecosystem of management solutions, in particular orchestrators like Kubernetes, which help manage container infrastructure at scale and provide enterprise capabilities like networking, storage management and security.
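As a minimal illustration of the microservices pattern described above, a Docker Compose file can define each service as its own container. The service and image names below are hypothetical; this is a sketch, not a production configuration:

```yaml
# docker-compose.yml — hypothetical two-service application
version: "3.8"
services:
  orders:                            # one microservice per container
    image: example/orders:1.0        # hypothetical image name
    ports:
      - "8080:8080"
  payments:
    image: example/payments:1.0      # hypothetical image name
    environment:
      - ORDERS_URL=http://orders:8080  # services reach each other by service name
```

Running `docker compose up` starts both services on a shared network, with each component isolated in its own container.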
Adopting Docker in a Production Environment: Enterprise Considerations
Here are three key considerations when deploying Docker in production in a large enterprise project.
Constantly Changing Technology Ecosystem
Docker deployments can be complex. To properly build, configure, deploy, manage, monitor, and update deployments in production environments, you need the support of a suite of tools. However, most of these tools, whether offered as first-party or third-party integrations, are constantly evolving.
As tools evolve, developers, administrators, and security professionals are required to continuously learn new skills and read the accompanying documentation. This volatile technology ecosystem makes it very difficult for teams to keep up with ever-changing demands and to truly take advantage of new technologies.
Enforcing Policy and Controls
Many organizations are required to comply with regulatory standards, such as PCI DSS for credit card information or HIPAA for healthcare data, which require them to protect sensitive customer, health, and financial information. In addition, all organizations should have security policies in place to protect their infrastructure, applications, and data.
To meet compliance and security requirements, organizations need to implement security controls. However, enforcing security and data protection policies across containerized environments is currently highly complex. While Docker has introduced fixes and continues to develop solutions, containers still offer far less control than virtual machines (VMs) or bare metal infrastructure.
Deploying Containers Across Environments
Containerized applications are highly portable, making development pipelines more streamlined and efficient. However, since infrastructure varies between different data centers and cloud environments, achieving true portability becomes a challenge.
To ensure you can truly deploy your containerized applications across multiple environments, you need to set up standardized infrastructure services. For example, you can consider setting up a network that enables communication between different hosts using software-defined networking or coordinated port mappings.
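For example, a user-defined overlay network provides a consistent service discovery and communication layer across hosts. A minimal sketch using the Docker CLI (this assumes Swarm mode has already been initialized with `docker swarm init`):

```shell
# Create an attachable overlay network spanning all Swarm nodes
docker network create --driver overlay --attachable app-net

# Containers on any participating host can reach each other by name
docker run -d --name api --network app-net nginx:alpine
docker run --rm --network app-net alpine ping -c 1 api
```

Because the network name and topology are defined in software rather than tied to a specific data center, the same deployment commands work across environments.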
Related content: read more in our guide to Docker architecture ›
Best Practices for Running Docker in Production
Here are several best practices you can consider when running Docker in production.
Start Small
It is difficult to secure and monitor a microservices architecture, which typically consists of thousands or tens of thousands of containers. Instead of trying to containerize your entire application at once, consider starting small.
Start by running your monolithic application in Docker and gradually branch out and deploy certain aspects of your application as containers. You can do this until you move entirely to a microservices model.
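As a sketch of that first step, a monolithic application can often be containerized with a single Dockerfile. The base image and artifact path below are assumptions for a generic Java monolith; adjust them for your stack:

```dockerfile
# Dockerfile — containerize the existing monolith as-is
FROM eclipse-temurin:17-jre          # assumed Java runtime base image
WORKDIR /app
COPY build/app.jar app.jar           # hypothetical build artifact
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

Once the monolith runs reliably in a container, individual components can be extracted into their own images and deployed alongside it.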
Use Docker Hosting Services
There are many options to consider when choosing a hosting service for your Docker workloads. You can opt for an on-premises data center and manage everything in-house, you can choose a cloud vendor, or you can implement a hybrid model. Using a hosting service for containerized applications helps organizations reduce complexity and speed time to market.
Cloud vendors offer a wide range of Docker hosting services:
- Infrastructure as a service (IaaS) options, such as running container instances directly on Amazon Elastic Compute Cloud (EC2) or Microsoft Azure.
- Fully managed container as a service (CaaS) solutions designed for hosting containerized applications, such as Amazon Elastic Container Service (ECS) and Azure Container Instances (ACI).
- For those using Kubernetes, there are a range of cloud-based deployment options, including fully managed Kubernetes services.
Use a Private Image Registry and Scan Images
A container is built based on a Docker image. If there are vulnerabilities in the image, the container will inherit the issues and introduce them into your production environment. To ensure images are safe to use, scan them for security vulnerabilities. Even official images may contain vulnerabilities, so it is important to scan all of your images.
In addition to scanning your images, you should keep them in a private, secure container registry, to protect them from compromise or accidental tampering. There are several options for private repositories, including Artifactory, Quay, and Docker Hub.
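A minimal sketch of both practices, assuming the open source Trivy scanner is installed and using Docker's official `registry` image to run a local private registry (image names are hypothetical):

```shell
# Scan an image for known CVEs before promoting it
# (Trivy is one option; registry-integrated scanners work similarly)
trivy image example/app:1.0

# Run a private registry locally, then tag and push the vetted image
docker run -d -p 5000:5000 --name registry registry:2
docker tag example/app:1.0 localhost:5000/app:1.0
docker push localhost:5000/app:1.0
```

In production you would place the registry behind TLS and authentication rather than exposing it unauthenticated as in this local sketch.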
Docker Monitoring and Logging
To securely manage your Docker deployment, you need to gain visibility into the entire ecosystem. You can achieve this by using a monitoring solution that tracks container instances across the environment, and allows you to automate responses to fault conditions.
Prometheus, for example, is an open source monitoring tool you can use to monitor Docker deployments. While Prometheus was not built specifically for Kubernetes, it integrates closely with the open source container orchestration platform and has become a de facto standard for monitoring containerized environments. You can use Prometheus to collect time-series metrics and correlate the data across large clusters to gain insight and visibility.
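For example, the Docker daemon can expose its own metrics by setting `"metrics-addr": "127.0.0.1:9323"` (and, on older releases, `"experimental": true`) in `/etc/docker/daemon.json`. Prometheus can then scrape that endpoint with a job like the following sketch:

```yaml
# prometheus.yml — scrape Docker daemon metrics
# (assumes metrics-addr is enabled in daemon.json as described above)
global:
  scrape_interval: 15s
scrape_configs:
  - job_name: docker
    static_configs:
      - targets: ["127.0.0.1:9323"]
```

This covers daemon-level metrics; per-container metrics typically come from an additional exporter such as cAdvisor.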
Related content: read our guide to Kubernetes in production ›
Docker Security Best Practices
Security is critical to production deployments of containerized applications. Here are several security best practices to consider for your Docker deployments:
- Create Docker runtime security policies—create a policy that defines the appropriate response during runtime. Once a security event occurs, the team and any automated system can respond using the procedures that you have already defined.
- Manage sensitive data with Docker secrets—use secrets to protect sensitive data, such as passwords, TLS certificates, and API keys. Storing this information in a Docker secret lets you safely deliver it to containers at runtime instead of baking it into images or environment variables.
- Limit the use of resources—a large number of deployed containers makes for a large attack surface. By limiting the CPU and memory allocated to each individual container, you reduce the damage a compromised or misbehaving container can inflict on the host and its neighbors.
- Use a seccomp profile to limit system calls—seccomp is a Linux kernel feature that lets you restrict the system calls a container can make. Docker applies a default seccomp profile to every container, and you can supply a custom profile that whitelists only the calls your application actually needs.
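The secrets, resource-limit, and seccomp practices above can be sketched together in a single Compose file. Service, image, and file names are hypothetical, and secrets in this form require Swarm mode:

```yaml
# docker-compose.yml — security-related settings in one place
version: "3.8"
services:
  web:
    image: example/web:1.0             # hypothetical image
    secrets:
      - db_password                     # mounted at /run/secrets/db_password
    deploy:
      resources:
        limits:
          cpus: "0.50"                  # cap CPU per container
          memory: 256M                  # cap memory per container
    security_opt:
      - seccomp=./seccomp-profile.json  # custom syscall whitelist
secrets:
  db_password:
    file: ./db_password.txt             # hypothetical local secret source
```

The `deploy.resources` limits are enforced when the stack is deployed with `docker stack deploy`; the seccomp profile file would contain the whitelist of allowed system calls.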
Learn more in our detailed guide to Docker security best practices ›