This is our 80th original article, with no filler. For more originals, follow our official WeChat account. This article was first published on the Zhengcaiyun (ZooTeam) front-end blog: Some solutions for an NPM private registry, from setup to data migration and disaster recovery backup.

Preface

As is customary, before we begin, let's briefly review the open-source NPM private registry frameworks currently on the market.

  • Verdaccio

Verdaccio is a fork of the open-source Sinopia framework. It ships with its own small database and can proxy other registries (e.g. npmjs.org), and its configuration and deployment are simple, essentially one step and done. Consider it if your company has only a few private packages or you simply want the low-effort option.

  • Cnpmjs.org

The famous cnpm: you have surely already felt how "fast" it is, since the Taobao mirror runs on its registry service. It is an enterprise NPM registry and web service built mainly on Koa, MySQL, and a simple storage service. Its most powerful feature is its module synchronization mechanism, which can periodically sync modules from the upstream registry in several modes: all modules, only the modules already present in the database, or only popular modules.

  • Nexus

Backend developers should be familiar with this one. Nexus 2 is mainly used for unified management of Maven/Gradle repositories, while Nexus 3 adds an npm plug-in to support npm. It offers three types of npm repositories: hosted (private repository), proxy (proxy repository), and group (combined repository).

Overall, setting Nexus aside, Cnpmjs.org is considerably more complex than Verdaccio in both deployment process and overall design, but it offers better scalability and customization and supports a wider variety of business scenarios. In the sections that follow, we will work through Cnpmjs.org container deployment, data migration, OSS disaster recovery backup, and related topics, layer by layer.

Cnpmjs.org container deployment

At present, the company deploys applications almost entirely as containers, with an internal iPaaS platform handling application deployment and one-click releases. Cnpmjs.org also ships with a Dockerfile and a docker-compose.yml file, so here is how to deploy it with Docker.

  • First, let's look at the Dockerfile:
FROM node:12
MAINTAINER zian [email protected]

# Working environment
ENV \
    CNPM_DIR="/var/app/cnpmjs.org" \
    CNPM_DATA_DIR="/var/data/cnpm_data" 

# shell format
# Run when docker build
RUN mkdir -p ${CNPM_DIR}

# WORKDIR: the working directory set here remains in effect for every subsequent layer of the image
WORKDIR ${CNPM_DIR}

# COPY: copies a directory or file from the build context to the specified path in the container
COPY package.json ${CNPM_DIR}

RUN npm set registry https://registry.npm.taobao.org

RUN npm install --production

COPY .  ${CNPM_DIR}
COPY docs/dockerize/config.js  ${CNPM_DIR}/config/

# Declare ports (7001 for the registry service, 7002 for the web service)
EXPOSE 7001/tcp 7002/tcp

# Anonymous data volume: If you forget to mount the data volume when starting the container, the data volume will be automatically mounted to the anonymous volume.
VOLUME ["/var/data/cnpm_data"]

RUN chmod +x ${CNPM_DIR}/docker-entrypoint_prod.sh

# Entrypoint 
# the exec format
# Run when docker run
# If a Dockerfile contains multiple CMD instructions, only the last one takes effect
# CMD ["node", "dispatch.js"]
CMD ["npm", "run", "prod"]

We changed the CMD instruction to ["npm", "run", "prod"] because we added a layer of shell scripts for the different environments, and all environment variables currently live there.

Example: docker-entrypoint_env.sh

export DB_USRNAME='root'
export DB_PASSWORD='123456'
export DB_HOST='127.0.0.1'
export BINDING_HOST='0.0.0.0'

DEBUG=cnpm* node dispatch.js
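These variables are read by the global configuration file (docs/dockerize/config.js). A simplified sketch of the relevant part is shown below; the option names follow the stock cnpmjs.org config shape, but treat them as illustrative and check your own config.js:

// docs/dockerize/config.js (simplified sketch; option names are illustrative)
module.exports = {
  // must be 0.0.0.0 so the service is reachable from outside the container
  bindingHost: process.env.BINDING_HOST || '0.0.0.0',
  database: {
    db: 'cnpmjs',
    username: process.env.DB_USRNAME || 'root',
    password: process.env.DB_PASSWORD,
    host: process.env.DB_HOST,
    port: 3306,
    dialect: 'mysql',
  },
};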
  • Next, modify the docker-compose.yml file and delete the mysql-db service: since /docs/dockerize/config.js can be configured to connect to the MySQL database in the company's test environment, there is no need to build a mysql-db image.
version: '3'                          # compose file format version
services:                             # list of containers to run
  web:                                # service name (custom)
    build:                            # build the image from the Dockerfile
      context: .
      dockerfile: Dockerfile
    volumes:
      - cnpm-files-volume:/var/data/cnpm_data
    ports:
      - "7001:7001"
      - "7002:7002"
volumes:
  cnpm-files-volume:                  # named volume that holds the module files

Note: 1. The global configuration file path is /docs/dockerize/config.js. 2. bindingHost must be set to 0.0.0.0.

  • Finally, run docker-compose up -d in the console to start the application in detached (daemon) mode, then open http://127.0.0.1:7002 in a browser and you will see the web page. Run npm config set registry http://127.0.0.1:7001 to point npm at the mirror source address of the newly built private registry. Using nrm is recommended here, since it lets you switch npm sources freely.

The following figure shows the site:

Note: after you change the local code, first run docker-compose build to build a new image, then run docker-compose up -d to replace the running container with it.

Data migration

Because the company previously used Verdaccio for its private registry, switching to the new NPM private registry means migrating every private package published so far: by our count, more than 400 packages and over 7,000 versions in total. Normally a data migration would start from the database, but Verdaccio does not rely on one. So I started by reading the Cnpmjs.org source code to see how it stores a module's metadata in the database when a module is published.

From the routing file (/routes/registry.js) we can easily locate /controllers/registry/package/save.js, which is exactly the file we want.

Core code:

var pkg = this.request.body; // the npm module metadata, i.e. the package.json data processed by libnpmpublish
var username = this.user.name; // The current user name
var name = this.params.name || this.params[0]; // the npm module name
var filename = Object.keys(pkg._attachments || {})[0]; // Compressed file name of the NPM module
var version = Object.keys(pkg.versions || {})[0]; // Latest version of the NPM module

// upload attachment
var attachment = pkg._attachments[filename]; // the tarball attachment sent by the npm client

// Base64-decode to get the module file's binary data; libnpmpublish sends the
// tarball as a base64 string (tarData.toString('base64'))
var tarballBuffer = Buffer.from(attachment.data, 'base64'); 
// Use fs-cnpm to save the module file locally; the default path is
// path.join(process.env.HOME, '.cnpmjs.org', 'nfs')
var uploadResult = yield nfs.uploadBuffer(tarballBuffer, options);

var versionPackage = pkg.versions[version];
var dist = {
  shasum: shasum,
  size: attachment.length
};

// if nfs upload return a key, record it
if (uploadResult.url) {
  dist.tarball = uploadResult.url;
} else if (uploadResult.key) {
  dist.key = uploadResult.key;
  dist.tarball = uploadResult.key;
}
var mod = {
  name: name,
  version: version,
  author: username,
  package: versionPackage
};

mod.package.dist = dist;

// Save the module data to the database
var addResult = yield packageService.saveModule(mod);

In other words, as long as we can get hold of a module's metadata (i.e. the processed package.json JSON), we can upload the module file to the file system or to the OSS service and persist the record to the database. Verdaccio exposes two APIs: one returns the full data of all packages in its private registry, and the other returns the JSON data of a single package. Their paths are /-/verdaccio/packages and /-/verdaccio/sidebar/$PKG$; for scoped modules the request path is /-/verdaccio/sidebar/$SCOPE$/$PKG$.
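For example, a minimal Node.js sketch (our own, not part of Cnpmjs.org) for enumerating the packages to migrate via the first API could look like this; the Verdaccio address and the exact response shape are assumptions, and an authentication header may be required depending on your setup:

const http = require('http');

const VERDACCIO = 'http://127.0.0.1:4873'; // hypothetical Verdaccio address

// list every private package known to Verdaccio
http.get(`${VERDACCIO}/-/verdaccio/packages`, res => {
  let body = '';
  res.on('data', chunk => { body += chunk; });
  res.on('end', () => {
    const packages = JSON.parse(body); // assumed: an array of package objects
    packages.forEach(pkg => console.log(pkg.name)); // e.g. feed these names into the migration routes below
  });
});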

The idea is clear, so let's get started. Create a new save_zcy.js file, lightly reworked from the original /controllers/registry/package/save.js.

Core code:

// Request the remote tarball and return its binary content as a string
const http = require('http'); // or require('https') if the registry tarball URLs use https

const handleFiles = function (url) {
  return new Promise((resolve, reject) => {
    try {
      http.get(url, res => {
        res.setEncoding('binary'); // read the response as binary
        let files = '';
        res.on('data', chunk => { // accumulate chunks in memory
          files += chunk;
        }).on('end', () => { // finished loading
          resolve(files);
        });
      });
    } catch (error) {
      reject(error);
    }
  });
};

// Get the binary data of the remote module file
yield handleFiles(dist.tarball).then(res => {
  // Wrap the binary string in a Buffer
  const tardata = Buffer.from(res, 'binary');
  pkg._attachments = {};
  pkg._attachments[filename] = {
    'content_type': 'application/octet-stream',
    'data': tardata.toString('base64'), // read the buffer back out as the base64-encoded string save.js expects
    'length': tardata.length,
  };
}, error => {
  this.status = 400;
  this.body = {
    error,
    reason: error,
  };
  return;
});

Next we wire the save_zcy.js controller into the app routes of the registry service.

// Add the fetchPackageZcy and savePackageZcy controllers
app.get('/:name/:version', syncByInstall, fetchPackageZcy, savePackageZcy, getOneVersion);
app.get('/:name', syncByInstall, fetchPackageZcy, savePackageZcy, listAllVersions);

The fetchPackageZcy controller requests the Verdaccio API described above (/-/verdaccio/sidebar/$SCOPE$/$PKG$ or /-/verdaccio/sidebar/$PKG$) to pull the JSON data of the corresponding module, and savePackageZcy then persists it.

With these controllers in place, the first time npm install [name] hits the new registry for a private module, its data is pulled from Verdaccio, saved into Cnpmjs.org, and then returned to the client; the migration therefore happens on demand, package by package.
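As an illustration of that flow (a rough sketch of our own, not the actual project code), a koa 1 style fetchPackageZcy middleware could look like the following; the Verdaccio host constant is an assumption, and urllib is already a Cnpmjs.org dependency:

const urllib = require('urllib');

const VERDACCIO_HOST = 'http://127.0.0.1:4873'; // assumption: address of the old Verdaccio registry

// koa 1 generator middleware: pull the module's JSON from Verdaccio and stash it
// on the context so savePackageZcy can persist it before the normal handler runs
module.exports = function* fetchPackageZcy(next) {
  const name = this.params.name || this.params[0];
  // scoped names like @scope/pkg map onto /-/verdaccio/sidebar/$SCOPE$/$PKG$ automatically
  const result = yield urllib.request(`${VERDACCIO_HOST}/-/verdaccio/sidebar/${name}`, {
    dataType: 'json',
    timeout: 10000,
  });
  if (result.status === 200) {
    this.verdaccioPackage = result.data; // hand off to savePackageZcy
  }
  yield next; // continue to savePackageZcy / getOneVersion
};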

OSS disaster recovery backup

First, let's briefly explain why we do OSS disaster recovery backup at all. There are the following reasons:

  • If a disk on the server is damaged, module files may be lost, which is a real risk
  • If the server disk fills up, the system should be able to degrade automatically and upload module files to OSS instead

Based on the above points, we sorted out the following disaster recovery and backup plan:

  • package publish

On publish, the module file is stored locally and also uploaded to OSS as a backup, using the fs-cnpm and oss-cnpm plug-ins (a rough sketch of combining the two follows after this list).

  • package install

Before downloading a module file, check whether it is a private package (i.e. whether the package name has a scope). If it is not private, proxy the request to the upstream registry. If it is private, check whether the module file exists on the local server: if not, download it from OSS into the local NFS directory; if it does, read the file straight from the NFS directory, write it into the downloads directory, and finally return it as a stream via fs.createReadStream. (The full download flow is sketched after the two helper functions below.)
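Here is the publish-side sketch referenced above: a minimal wrapper (our own illustration) that hides both storage backends behind the single nfs-style interface that cnpmjs.org calls. The construction of the fs-cnpm and oss-cnpm clients is omitted, since their options depend on your environment:

// Wrap a local nfs client (fs-cnpm) and an OSS client (oss-cnpm) behind one object
// that can be assigned to `nfs` in config.js: every published tarball buffer is
// written locally and also backed up to OSS.
function createMirroredNfs(localNfs, ossNfs) {
  return {
    uploadBuffer: function* (buf, options) {
      yield ossNfs.uploadBuffer(buf, options);          // backup copy on OSS
      return yield localNfs.uploadBuffer(buf, options); // primary copy on local disk
    },
    // download/remove/url and the other nfs methods would be delegated the same way
  };
}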

ensureFileExists determines whether the module file exists locally. The code is as follows:

const mkdirp = require('mkdirp');
const fs = require('fs');

function ensureFileExists(filepath) {
  return function (callback) {
    fs.access(filepath, fs.constants.F_OK, callback);
  };
}

Note: before downloading a module file from OSS into the NFS directory, you must first create the module file's directory, as follows:

const path = require('path');
const mkdirp = require('mkdirp');

function ensureDirExists(filepath) {
  return function (callback) {
    mkdirp(path.dirname(filepath), callback);
  };
}
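Putting the two helpers together, the download flow for a private package (assuming the scope check has already routed the request here) could be sketched roughly like this. This is our own illustration, not the project's actual controller; the NFS directory, the key layout, and the nfs.download signature should be checked against your own setup:

const fs = require('fs');
const path = require('path');

const NFS_DIR = path.join(process.env.HOME, '.cnpmjs.org', 'nfs'); // local module file directory

function* downloadPrivateTarball(name, filename, nfs) {
  const savePath = path.join(NFS_DIR, filename); // where the tarball should live locally
  try {
    yield ensureFileExists(savePath);            // already on the local disk?
  } catch (err) {
    // not on disk: create the directory, then pull the backup copy down from OSS
    yield ensureDirExists(savePath);
    yield nfs.download(`/${name}/-/${filename}`, savePath); // key layout is an assumption
  }
  return fs.createReadStream(savePath);          // stream the file back to the npm client
}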

Email notification

Cnpmjs.org ships with email notification, but it is only used for error-log reporting. Since most of our private packages are business components, tools, and the like, publishing a formal version of a business component sometimes needs to notify the people who use it. For now we rely on the maintainers field, which we treat as covering both the maintainers and the users of the module.

Example:

"maintainers": [{"name": "yuanzhian"."email": "[email protected]"}]Copy the code

The email configuration is as follows:

mail: {
  enable: true,
  appname: 'cnpmjs.org',
  from: process.env.EMAIL_HOST,
  host: 'smtp.mxhichina.com',
  service: 'qiye.aliyun', // use a well-known built-in email service; see the support list: https://nodemailer.com/smtp/well-known/
  port: 465, // SMTP port
  secureConnection: true, // use SSL
  auth: {
    user: process.env.EMAIL_HOST,
    pass: process.env.EMAIL_PSD,
  },
},

Closing thoughts

Going forward, we will do a lot of custom development on top of Cnpmjs.org, such as integrating the company's internal permission system, refactoring the web pages, and linking business components to their online documentation. If you also need to build an NPM private registry, hopefully this article has been helpful.

Recommended reading

Teaching you to build an enterprise-grade NPM private registry in minutes

Writing maintainable quality code: Component abstraction and granularity

We are recruiting

The Zhengcaiyun front-end team (ZooTeam) is a young and creative team within Zhengcaiyun's product R&D department, based in picturesque Hangzhou. The team currently has more than 40 front-end members with an average age of 27, nearly 30% of whom are full-stack engineers: a proper youth storm squad. Its members include veterans from Alibaba and NetEase as well as fresh graduates from Zhejiang University, University of Science and Technology of China, Hangzhou Dianzi University, and other schools. Beyond supporting day-to-day business, the team explores and practices in areas such as the material system, engineering platform, page-building platform, performance and experience, cloud applications, and data analysis and visualization; it has promoted and landed a series of internal technical products and keeps pushing the boundaries of the front-end technology system.

If you want to stop being pushed around by the work and start driving it yourself; if you want to change being told you need more ideas while never getting the room to act on them; if you want to change having the ability to deliver a result but no one needing you to; if you want to change needing a team behind the things you want to build but having no position to bring people in; if you want to change a pace where five years on the job yields only three years of experience; if you want to change having good instincts that are always blurred by that last thin layer of window paper... If you believe in the power of belief, believe that ordinary people can achieve extraordinary things, and believe you can meet a better version of yourself; if you want to take part in the growth of a front-end team that deeply understands the business, with a sound technology system, technology that creates value, and influence that spills outward as the business takes off, then I think we should talk. Any time, we are waiting for your message: [email protected]