Docker is an open-source container technology that allows developers and sysadmins to package applications so that they can be easily moved from one environment to another. The package includes the code itself, its dependencies, libraries, and more.
Docker’s use cases are limitless. It can be used as a way to deploy web applications, set up development environments, or build libraries of pre-tested components in a reliable and repeatable fashion.
As software engineers, we are assigned tasks that solve specific client-related business problems, and we reach for the tools and technologies that solve each problem as efficiently and quickly as possible. As a software team grows, its members develop different OS (Operating System) preferences: some like macOS, others like Linux, and many like Windows. When a new member joins the team, we give them a Personal Computer (PC) to work on, and another teammate shares the latest project repository so they can set up the project on the new machine. The main focus is on how quickly that new member can start contributing, so the project should take a minimal amount of time to set up. The same problem appears when we try to deploy our solution to a dev, staging, or production server.
You may ask: what does the story above have to do with Docker? Let’s first go through the official definition of Docker, then explain it in simple terms.
What is Docker? What problem is it trying to solve?
Along the way, let’s see how the story above relates to Docker.
Docker is an open platform for developing, shipping, and running applications. Docker enables you to separate your applications from your infrastructure so you can deliver software quickly. With Docker, you can manage your infrastructure in the same ways you manage your applications. By taking advantage of Docker’s methodologies for shipping, testing, and deploying code quickly, you can significantly reduce the delay between writing code and running it in production.
The official definition isn’t that hard to understand, right?
Docker is a tool to automate the deployment of an application as a lightweight container so that the application can work efficiently in different environments.
Let’s relate Docker to our example above.
As software engineers, we hear almost every day that a project works perfectly on one developer’s machine but fails on a teammate’s PC or on the server, and the blame game starts.
This is exactly what Docker is trying to solve:
This works on my machine…
The goal is for every single project to run on every machine without problems and with minimal setup. In practice, project code works fine on the developer’s machine, but as soon as it is deployed or moved to a new teammate’s PC, it doesn’t work as expected.
Docker is like a sealed box: you as a developer put your files inside, move the box to a new place, and it works no matter where you put it.
In Docker terms, this box where you put your code is called a container. Containers are the absolute heart of Docker: every time you run one of these containers, it works exactly the way it worked on your machine. Docker also allows you to publish your application as an image to a public or private registry like Docker Hub, AWS ECR, Harbor, etc., so that others can run your code on their own machines exactly as it ran on your PC.
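To turn your own code into an image, you describe it in a Dockerfile. Here is a minimal sketch for a hypothetical Node.js app (the base image, file names, and start command are assumptions for illustration, not part of the story above):

```dockerfile
# Start from an official Node.js base image
FROM node:18-alpine

# Copy the application into the image and install dependencies
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .

# Document the port the app listens on and define the start command
EXPOSE 3000
CMD ["node", "index.js"]
```

You could then build and publish it with something like docker build -t myname/myapp . followed by docker push myname/myapp (the myname/myapp repository name is hypothetical).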
A simple example: suppose a project needs a version of MySQL different from the one installed on your PC. Normally you would have to stop your running MySQL service and then install the other version, which is tedious. With Docker, it takes a single command to run the required version of MySQL without stopping the one you already have:
docker run --name mymysql -e MYSQL_ROOT_PASSWORD=my-secret-pw -p 3310:3306 -d mysql:latest
Let's break down the command for better understanding...
docker: represents the executable.
run: a subcommand. There are lots of other subcommands, which we can list with docker --help.
--name: sets a custom name for the container. If not provided, Docker assigns a random name.
-e: sets an environment variable. If we have multiple environment variables, we have to add -e for each one.
-d: runs the container in detached mode (in the background).
mysql:latest: the image (a set of instructions) itself, which is used to generate the container. To get an older MySQL, use a specific tag such as mysql:5.7 instead of latest.
-p: publishes a port. We always write it like 3310:3306, which actually means <local_pc_port>:<container_port>. If you specify only the container port (for example -p 3306), Docker publishes it to a random port on your PC.
Note: To remember this, I always think of it like this: ports always go from your local PC port to the Docker container port.
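The same container can also be described declaratively. Here is a minimal, hypothetical docker-compose.yml equivalent of the command above (the service name and password are assumptions carried over from the example):

```yaml
services:
  mymysql:
    image: mysql:latest
    environment:
      MYSQL_ROOT_PASSWORD: my-secret-pw   # same as the -e flag
    ports:
      - "3310:3306"                       # <local_pc_port>:<container_port>
```

With this file in place, docker compose up -d starts the same container as the one-line command.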
If you like, you can read the same article on my [Personal blog]
You can read my other blog-posts [Here]
In conclusion, Docker is often used as a “development tool” by developers who want the required dependencies available in their development environment without including them in the codebase or reinstalling and relearning them on each new machine they use for development.