Docker is an open-source application container engine written in Go. The official documentation is very detailed, though it is only available in English; if you want to read it, you can certainly get through it. As Docker has grown in popularity, the image registry has also become very rich, and images for many everyday environments can be found there and used directly.
1. Why Docker appeared
A few scenarios:
- The ops team deploys the project you developed to a server and tells you there is a problem: it cannot start up. You run it locally and find nothing wrong…
- A project that is about to go live becomes unavailable because some software it depends on was updated to a new version.
- Some projects involve a great deal of environment setup: all kinds of middleware, all kinds of configuration, and deployment across many servers…
These problems ultimately boil down to the environment. To avoid problems caused by environment differences, it is best to deploy the project together with every environment it requires. For example, if the project involves Redis, MySQL, the JDK, Elasticsearch, and so on, bring the entire environment along when deploying the JAR package. So the question is: how can a project carry its environment with it?
Docker is here to solve this problem!
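To make "bringing the environment along" concrete, here is a minimal sketch of a Dockerfile for a JAR package. The base image tag and file paths are assumptions for illustration, not taken from any particular project:

```dockerfile
# Start from an image that already contains a Java runtime,
# so the JDK ships together with the application.
FROM eclipse-temurin:17-jre

# Copy the application JAR produced by the build into the image.
COPY target/app.jar /opt/app.jar

# Command executed when a container starts from this image.
ENTRYPOINT ["java", "-jar", "/opt/app.jar"]
```

Anything else the app needs (configuration files, extra tools) can be added with further COPY or RUN instructions, so the image really does carry the whole environment with it.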
Still a little unclear? Here is another analogy. When we install an Android app on a phone, the process from the app’s birth to its installation is roughly this:
Java development --> APK --> release to the major app stores --> user A downloads and installs the APK, and it just works
From the user’s point of view, I don’t care what environment the app depends on; I just download it and install it. With that in mind, here is the same picture for Docker:
Java development --> JAR package --> package the project together with its environments into an image --> push the image to a Docker registry --> the ops team downloads the image and runs it directly
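Assuming a Dockerfile already exists in the project directory, the flow above maps roughly onto these Docker CLI commands; the image name and registry address are placeholders for illustration:

```shell
# Developer: build the project plus its environment into an image
docker build -t registry.example.com/myapp:1.0 .

# Developer: push the image to a Docker registry
docker push registry.example.com/myapp:1.0

# Ops: on any server with Docker installed, pull and run it
docker pull registry.example.com/myapp:1.0
docker run -d --name myapp registry.example.com/myapp:1.0
```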
2. The core idea of Docker
This is the logo of Docker: a whale loaded with containers. On the whale’s back, the containers are isolated from each other, and that is the core idea of Docker. For example, when multiple applications run on the same server, the software might conflict over which ports to occupy; once isolated, each piece of software can run independently. In addition, this lets Docker squeeze the maximum power out of a server.
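The port-conflict point can be sketched with two containers that both listen on the same port internally, assuming Docker and the official tomcat image are available; the host maps them to different ports, so they no longer clash:

```shell
# Both containers use port 8080 internally; the host exposes
# them on 8081 and 8082, so there is no port conflict.
docker run -d -p 8081:8080 --name web1 tomcat:9
docker run -d -p 8082:8080 --name web2 tomcat:9
```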
3. Differences between Docker and traditional virtualization
Before the rise of Docker container technology, virtualization mostly meant virtual machines, such as VMware. If you want to install Linux on your Windows machine for some practice, you can install Linux in a VM.
As shown in the figure, I am running Linux on Windows, so my Windows is the host (the gray area). The blue part is the Linux system I installed in the VM; from bottom to top it consists of the kernel, libraries, and various applications. In this way a single computer can run many applications.
However, virtual machines are very heavyweight. To virtualize an entire system, both the software and the hardware must be emulated. A virtual machine is effectively a full computer, so starting one takes up a lot of resources and is slow.
Container technology is also a virtualization technology, but containers are very lightweight. A Linux container, for example, no longer emulates a complete operating system: if the application only needs the Linux kernel, nothing else has to be brought along.
The resources a piece of software needs to run are packaged into an isolated unit, and that unit is the container.
As shown in the figure, the application processes in a container run directly on the host machine’s kernel. The container has no kernel of its own, let alone any virtualized hardware. Instead of sharing a common lib directory, each container has its own libraries, containing only what its own app needs to run. Containers are isolated from each other, each with its own file system that does not affect the others, and they can start up in seconds.
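One quick way to see that a container has no kernel of its own (assuming Docker and an alpine image are available): the kernel version reported inside the container is the host’s kernel version:

```shell
# Kernel version of the host
uname -r

# Kernel version inside a container: the same value, because the
# container's processes run directly on the host kernel.
docker run --rm alpine uname -r
```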
4. What Docker can do
- Faster application delivery and deployment. Setting up a deployment environment used to mean installing a pile of software; now, with Docker, you package everything as an image, publish it for testing, and run it with a single command.
- Easier upgrades and scaling. With Docker, deploying applications is as easy as stacking building blocks. For example, after the project’s image is published and running, you may find that additional servers are needed to improve performance; you can simply pull and run the same image on the new servers.
- Simpler system operation and maintenance. Containerization keeps the development and test environments highly consistent.
- More efficient use of computing resources. Docker is kernel-level virtualization, so many container instances can run on a single physical machine. For example, I can run Tomcat, Elasticsearch, Kibana, and so on at the same time on one machine, making full use of its resources.
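The "many services on one machine" example in the last point might look like this; the image tags are assumptions for illustration:

```shell
# Run several isolated services side by side on one machine
docker run -d --name tomcat -p 8080:8080 tomcat:9
docker run -d --name es -e discovery.type=single-node elasticsearch:7.17.10
docker run -d --name kibana -p 5601:5601 kibana:7.17.10

# List the running containers
docker ps
```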