I believe many front-end developers have heard of Docker in one way or another, but quite a few hold a deep misconception about it: that Docker has nothing to do with front-end work.

In fact, this is not the case. Docker can also be used on the front end to help us work more efficiently. (After all, as they say, the front end can do anything.)

What is Docker?

Docker is a containerization platform, a technology that has become popular in recent years for building and releasing applications.

As you can see, Docker isn’t just for back-end people, and it doesn’t pick a language either: whether you write NodeJS, Java, or PHP, it can work for you.

For building and releasing, the first thing many front-end developers reach for is the tried-and-true scheme: after building and compressing the front-end project locally, the HTML files are stored in Redis, and the other resources such as CSS, JS, and images are uploaded directly to the CDN.

Or, more simply, all of the project’s resources are compressed and pushed to the CDN, and Nginx on the gateway or back-end server is then used to proxy the default port to the CDN address of the HTML file.

I will admit that Docker is not strictly needed for a pure front-end project with no service of its own, however elegant the technology is. After all, you don’t use an ox cleaver to kill a chicken.

But as experienced front-end developers, we have all worked with NodeJS projects to some degree. Whether it’s Express, Egg, Nuxt, or Next, we have probably all gone from running Node projects directly on the server to using PM2 to manage multiple Node server processes.

I have run into plenty of problems in this build-and-deploy territory, and Docker feels as if it emerged specifically to solve them.

What’s the difference between Docker and a virtual machine?

We have to admit that Docker is a bit like a virtual machine. Look at the figure below:

From this figure we can see that virtual machines and Docker both embody the idea of isolation, but they isolate different things. A virtual machine carves out dedicated resources to isolate an entire operating system, while Docker isolates only at the application level and shares the host’s kernel.

Here’s a more detailed comparison of the two, which I found online:

A VM starts slowly and ties up memory on system resources the application doesn’t actually need. A Docker container can start in seconds, because it skips operating-system initialization and uses the host system directly.

To sum up, Docker outperforms virtual machines in application-level isolation, elastic scaling, and rapid capacity expansion.

Advantages of Docker technology

The core idea behind Docker is the container, which brings the advantages of continuous integration, version control, portability, isolation, and security.

The idea behind Docker is to package your code and its dependent runtime environment into one bundle, officially called an image. Then, on the server, you pull that bundle from a remote Docker registry and run the image, which creates a container. The container acts as a scene replay on the server, recreating the state your code was in when it was packaged.
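To make the idea concrete, here is a minimal sketch of such an image definition (a Dockerfile) for a Node service; the base image tag, port, and entry file are illustrative assumptions, not a ready-made setup:

```dockerfile
# Start from an official NodeJS base image (the tag is an example)
FROM node:16-alpine

# Everything below happens inside the image, not on the host machine
WORKDIR /app

# Install dependencies first so this layer can be cached between builds
COPY package*.json ./
RUN npm install

# Copy the application source into the image
COPY . .

# The port the Node service listens on (hypothetical)
EXPOSE 3000

# The command the container runs when it starts
CMD ["node", "server.js"]
```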

If you compressed your code locally, then when the server pulls the image and runs a container from it, the container will contain the compressed code along with everything you installed beforehand, such as NodeJS, Redis, Nginx, and PM2.

In this way, the release process that programmers dread most becomes: take your local code plus its runtime environment as one unit and move it straight onto the server to run. The process is easy, intuitive, and smoother than Dove chocolate.

Docker’s popularity is no accident. Its advantages in continuous integration, version control, portability, isolation, and security have brought revolutionary change to build and deployment in the traditional sense.

What are the benefits of using Docker on the front end?

Docker has so many advantages that we can also use it to empower our front-end development.

What are the benefits of using Docker on the front end? I’ve combed through some of my own experiences with Docker:

1. Efficient deployment that makes project migration easy

The traditional deployment steps for a Node project (based on Jenkins):

  1. The Jenkins build machine pulls the project code
  2. npm install, npm run build, and upload the static resources to the CDN
  3. Transfer the build output to the specified directory on the target server
  4. Go to that directory on the target server and start the Node service to complete the deployment

The deployment steps for a Docker-containerized Node project:

  1. Locally pull a base NodeJS image and start a container from it
  2. In this container, npm install, npm run build, upload the static resources to the CDN, and package the container into an image pushed to the remote image repository
  3. On the server, pull the image and run it: the Node service starts up along with its corresponding front-end resources (see the command sketch after this list)
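In practice these steps are usually driven by a Dockerfile (like the sketch earlier) and docker build, rather than done by hand inside a container. A hedged sketch of the workflow; the image name and registry address are placeholders:

```bash
# 1. Build the image locally from the base NodeJS image
#    (the Dockerfile's npm install / npm run build run during this step)
docker build -t registry.example.com/my-node-app:1.0.0 .

# 2. Push the packaged image to the remote image repository
docker push registry.example.com/my-node-app:1.0.0

# 3. On the server: pull the image and start the Node service
docker pull registry.example.com/my-node-app:1.0.0
docker run -d -p 80:3000 --name my-node-app registry.example.com/my-node-app:1.0.0
```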

Migration is where this especially shines: a project that already uses Docker barely needs migrating at all. Since the code and its runtime environment are packaged as one unit, whichever server you move to, you simply pull the image there and run it.

2. A standardized runtime environment for more stable and reliable deployment

The biggest advantage of Docker is that it smooths away the differences between environments (or servers), providing the same runtime environment for your code in development, test, and production.

This was almost impossible before Docker, because so many factors could make your code run inconsistently: the operating system type and version, the Node version, the versions of node_modules packages (there are always a few packages whose versions are not locked), and base dependencies such as Nginx, PM2, and Redis.
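Docker lets you pin every one of these factors in the image definition itself. A small sketch, assuming a package-lock.json is committed; the exact version tag is just an example:

```dockerfile
# Pin the OS and Node version exactly, instead of using whatever the server has
FROM node:16.20.2-alpine

WORKDIR /app

# Install from the lockfile with `npm ci`, so dependency versions cannot drift
COPY package.json package-lock.json ./
RUN npm ci
```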

In my own experience, I have often seen the same piece of code work fine locally but hit inexplicable problems in test or production. Some code even failed outright during the build process, with errors I never saw locally.

Nobody wants to be left scratching their head over this. Time should be spent on beautiful things, not on getting bogged down hunting for differences between environments.

Docker emerged precisely to solve this problem. It standardizes the runtime environment, fixing the code and its environment together into a single image, just like packing all the goods into one shipping container.

It is WYSIWYG: what you see locally is what you get when you go online. Deployment like this is stable and reliable. How satisfying is that!

If Git and other code-management tools unify the code across environment branches, then Docker unifies the runtime environment that each branch’s code runs in.

3. Easier continuous delivery and deployment (CI/CD)

Modern software development emphasizes continuous integration and delivery, which requires our build and deployment processes to be very efficient.

Deploying with Docker plus GitLab CI makes the deployment process more automated, simpler, and more efficient.

This is DevOps thinking through and through. CI/CD creates a real-time feedback loop that continuously delivers small iterative changes, accelerating change and improving quality. A CI environment is usually fully automated: a git push triggers the automatic build and packaging of a new image, the image is pushed to the Docker registry, and finally a Python script pulls the new image and starts the Node service.
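A rough sketch of what such a pipeline might look like in a .gitlab-ci.yml; the stage layout, branch name, and deploy script are illustrative assumptions, not a drop-in config:

```yaml
stages:
  - build
  - deploy

build-image:
  stage: build
  script:
    # A git push triggers this: build a new image tagged with the commit SHA
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    # Push the new image to the Docker image repository
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

deploy-test:
  stage: deploy
  only:
    - test  # runs only for the test branch (assumed branch name)
  script:
    # Hypothetical Python script that pulls the new image on the server
    # and restarts the Node service
    - python deploy.py "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```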

The whole process is one-stop: from the moment you push your code to the corresponding test branch, you never have to think about it again. Everything is efficient and automated, and delivery is several times faster at the very least.

4. More efficient O&M and rollbacks in seconds

Once you understand the concept of image packages, rollback in seconds is not hard to understand: every image you build is a version.

If a problem shows up after release, or code ships that should not have, you just need to find the last stable image package, pull it on the server again, and run the corresponding service.
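A rollback then reduces to a couple of commands; the tag and names below are placeholders:

```bash
# The last stable release is still sitting in the registry as a tagged image
docker pull registry.example.com/my-node-app:1.0.0

# Stop and remove the bad release, then start the stable image again
docker stop my-node-app && docker rm my-node-app
docker run -d -p 80:3000 --name my-node-app registry.example.com/my-node-app:1.0.0
```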

I hardly need to answer the question of why the front end should use Docker anymore. You may even find yourself stroking your receding hairline and saying, “No more overtime!”

Yes, in my personal experience, deploying our front-end Node service projects in Docker containers is genuinely beneficial. Since I started using Docker, I can’t live without it!


This article hasn’t covered Docker operations in detail; if you want to get hands-on practice with Docker, you can follow this guide: yeasy.gitbook.io/docker_prac… 🙂