
DevOps - Docker
DevOps and Docker play a central role in modern software development and deployment: they streamline processes and improve collaboration. In this chapter, we will look at the core ideas behind Docker, examine its architecture, and see where it fits in the DevOps lifecycle.
First, we will learn how to set up Docker in a DevOps environment. Then we will manage images and containers, look at networking configuration, and finally see how to connect Docker with CI/CD pipelines so we can deliver software more efficiently.
Understanding the Docker Architecture
The Docker architecture is made up of several components that work together to make containerization possible.
The key components of the Docker architecture are listed below −
- Docker Daemon (dockerd) − This is the main service that takes care of Docker containers, images, networks, and volumes. It listens for API requests and manages container tasks.
- Docker Client (docker) − This is the command-line tool that we use to communicate with the Docker daemon. We run commands like docker run, docker build, and docker ps with it.
- Docker Images − These are read-only templates that we use to create containers. We build them from a Dockerfile. The Dockerfile tells us how to make the image.
- Docker Containers − These are running instances of Docker images. They let us run applications in isolated environments. We create them from images using the docker run command.
- Docker Registry − This is a place to store and share Docker images, like Docker Hub. It lets us push and pull images.
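To see how these components fit together, here is a short example: the client sends each command to the daemon, the daemon pulls an image from the Docker Hub registry, and then starts a container from it (the image and container names are just examples) −
# Show the client and the daemon (server) versions separately
docker version
# Pull a read-only image from the default registry (Docker Hub)
docker pull nginx
# Create and start a container from that image
docker run -d --name web1 nginx
# List running containers managed by the daemon
docker ps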
Setting Up Docker in a DevOps Environment
To use Docker effectively in our DevOps work, we can follow the installation and setup steps below.
Step 1. Install Docker
For Ubuntu: Use the following commands to install Docker on Ubuntu −
sudo apt update
sudo apt install apt-transport-https ca-certificates curl software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
sudo apt update
sudo apt install docker-ce
For Windows / Mac − We can download the Docker Desktop app from the official website.
Step 2. Start Docker Service
We need to start the Docker service with these commands −
sudo systemctl start docker
sudo systemctl enable docker
Step 3. Add User to Docker Group
To run Docker commands without sudo, we can add our user to the docker group −
sudo usermod -aG docker $USER
Don't forget to log out and log back in for the change to take effect.
Step 4. Verify Installation
We can check if Docker is working fine with these commands −
docker --version
docker run hello-world
Step 5. Configure Docker Daemon
We can edit the /etc/docker/daemon.json file for some custom settings −
{ "storage-driver": "overlay2", "log-level": "error" }
After that, we need to restart the Docker service −
sudo systemctl restart docker
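As a quick sanity check (assuming the daemon restarted cleanly), we can ask it which storage driver is now in use −
# Should print "overlay2" if the setting above was picked up
docker info --format '{{.Driver}}'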
Now that we have Docker ready, we can start to containerize our applications and add them to our CI/CD pipeline.
Docker Images and Containers: Best Practices
We can get the most out of Docker images and containers in a DevOps environment by following a few best practices.
Docker Images
Minimize Image Size − We should use small base images like alpine. This helps to make our images smaller and faster to download.
Layer Management − We can combine commands in one RUN statement. This will help us reduce the number of layers. For example −
RUN apt-get update && apt-get install -y \ package1 \ package2 \ && rm -rf /var/lib/apt/lists/*
Use .dockerignore − Just like .gitignore, this file lists files to exclude from the build context, which also helps keep the image size down.
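A minimal .dockerignore might look like this (the entries are only examples; adjust them to your project) −
# Exclude version control data, local dependencies, and logs from the build context
.git
node_modules
*.log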
Tagging − We should use semantic versioning for our tags, for example myapp:1.0.0, and be careful about relying on the mutable latest tag.
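For example, we can build an image with an explicit version tag and add a coarser alias that points at the same image (myapp and the version numbers are placeholders) −
# Build with a full semantic version
docker build -t myapp:1.0.0 .
# Add an alias tag for the minor release line
docker tag myapp:1.0.0 myapp:1.0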
Docker Containers
Resource Limits − We need to set limits on CPU and memory. This helps to stop our containers from using too many resources −
docker run --memory="256m" --cpus="1.0" myapp
Environment Variables − We can use environment variables for configuration. This way, we do not hardcode sensitive information in our code.
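For example, we can pass configuration at run time instead of baking it into the image (the variable name and the app.env file are assumptions for illustration) −
# Pass a single variable on the command line
docker run -d -e DB_HOST=db.internal myapp
# Or load several variables from a file that stays out of version control
docker run -d --env-file app.env myapp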
Regular Updates − We should keep our images updated. This helps to fix vulnerabilities. We need to scan our images for security problems regularly.
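One possible workflow, assuming the open-source Trivy scanner is installed, is to refresh the base image, rebuild, and scan the result −
# Pull the latest patch release of the base image
docker pull node:14
# Rebuild our application image on top of it
docker build -t myapp:latest .
# Scan the rebuilt image for known vulnerabilities (Trivy is a third-party tool)
trivy image myapp:latest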
By following these practices, we can improve the performance, security and maintenance of our Docker images and containers.
Docker Networking: Concepts and Configuration
We use Docker networking to let our containers communicate with each other and with outside systems. Knowing the different networking modes is important for managing our containers well. Docker gives us several networking options −
Bridge Network
This is the default network for our containers. It lets containers on the same host communicate with each other.
docker network create my_bridge_network
docker run -d --name container1 --network my_bridge_network nginx
docker run -d --name container2 --network my_bridge_network nginx
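We can confirm that both containers are attached to the same bridge network, and can therefore reach each other by name, with −
# The "Containers" key in the output lists every attached container
docker network inspect my_bridge_network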
Host Network
The Host Network skips the Docker networking stack. It connects the container directly to the host network.
docker run --network host nginx
Overlay Network
The Overlay Network allows containers on different Docker hosts to communicate. It is useful when we use Swarm mode.
docker network create -d overlay my_overlay_network
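A minimal sketch, assuming a single-node Swarm used only for testing: initialize Swarm mode, create an attachable overlay network, and run a service on it −
# Turn this host into a Swarm manager
docker swarm init
# Create an overlay network that standalone containers may also attach to
docker network create -d overlay --attachable my_overlay_network
# Run a replicated service on the overlay network
docker service create --name web-svc --network my_overlay_network nginx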
Macvlan Network
Macvlan gives each container its own MAC address, so it appears as a physical device on the network.
docker network create -d macvlan \
    --subnet=192.168.1.0/24 \
    --gateway=192.168.1.1 \
    -o parent=eth0 my_macvlan_network
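For example, we can attach a container to this network with a fixed address from the subnet (the IP shown is just an example from the range above) −
# The container gets its own MAC address and a routable IP on the LAN
docker run -d --name lan-web --network my_macvlan_network --ip 192.168.1.100 nginx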
When we understand these ideas and setups, we can manage our Docker containers better in a DevOps environment.
Managing Docker Containers with Docker Compose
We use Docker Compose to manage multi-container Docker applications. It lets us define and run all the services of an application from a single YAML file, which makes orchestration much simpler.
Key Features
- Service Definition − We define services, networks, and volumes in a docker-compose.yml file.
- Environment Configuration − We can easily manage environment variables for our containers.
- Scaling − We can scale services up or down with a simple command.
Basic Structure of docker-compose.yml −
version: '3.8'
services:
  web:
    image: nginx:latest
    ports:
      - "8080:80"
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
Common Commands
Start Services −
docker-compose up
Stop Services −
docker-compose down
Scale Services −
docker-compose up --scale web=3
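Two more commands we often find handy with the same docker-compose.yml file −
# Show the state of each service defined in the file
docker-compose ps
# Follow the logs of the web service defined above
docker-compose logs -f web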
Using Docker Compose helps us streamline the development and deployment of applications. It makes managing dependencies and configurations for multi-container setups easier.
CI / CD Integration with Docker
We can improve our development workflow by integrating Docker into our Continuous Integration and Continuous Deployment (CI/CD) pipelines. Docker gives us consistent builds and makes deploying applications easier.
Following are the key components −
Dockerfile − This file tells us how to build Docker images.
FROM node:14
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm", "start"]
CI / CD Tools − We can use popular tools like Jenkins, GitLab CI, CircleCI, and GitHub Actions with Docker.
Pipeline Example
Build − We create a Docker image from the Dockerfile.
docker build -t myapp:latest .
Test − We run tests inside a container.
docker run --rm myapp:latest npm test
Deploy − We push the image to a registry and deploy it to production.
docker push myapp:latest
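To tie these steps together, here is a minimal sketch of a GitHub Actions workflow. The secret names (DOCKERHUB_USERNAME, DOCKERHUB_TOKEN) and the image name are assumptions, and a real pipeline would likely add caching and versioned tags −
# .github/workflows/ci.yml (illustrative)
name: docker-ci
on:
  push:
    branches: [ main ]
jobs:
  build-test-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Build the image from the Dockerfile in the repository root
      - run: docker build -t myapp:latest .
      # Run the test suite inside a throwaway container
      - run: docker run --rm myapp:latest npm test
      # Log in and push to Docker Hub (credentials stored as repository secrets)
      - run: echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" --password-stdin
      - run: |
          docker tag myapp:latest ${{ secrets.DOCKERHUB_USERNAME }}/myapp:latest
          docker push ${{ secrets.DOCKERHUB_USERNAME }}/myapp:latest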
Conclusion
In this chapter, we looked at the basics of Docker within a DevOps framework. We covered its architecture, how to set it up, and best practices for images and containers.
We also covered networking configuration, managing multi-container applications with Docker Compose, and integrating Docker into CI/CD pipelines.