I have taken over several Node.js projects and contributed to a few interesting Node.js open source projects. Recently I collected some common problems and their solutions, in the hope of helping others avoid the same pitfalls. Without further ado, let's begin.

1. Setting the NODE_ENV variable on Windows and macOS

In Node.js we often need to distinguish between the local development environment, the test environment, the production environment, and so on. A common way to set the environment variable is through the scripts field in package.json, as follows:

"scripts": {
   "start": "export NODE_ENV=development && nodemon -w src --exec \"babel-node src\"",
   "build": "babel src --out-dir dist",
   "run-build": "node dist",
   "test": "echo \"Error: no test specified\" && exit 1"
 }

Here NODE_ENV is set via export NODE_ENV=development. However, while developing the project with teammates I found that running yarn start on Windows reported an error: export is not recognized there. It turns out that Windows defines environment variables with set instead, so the script above should be written as follows on Windows:

"scripts": {
   "start": "set NODE_ENV=development && nodemon -w src --exec \"babel-node src\""
 }
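A cross-platform alternative is the cross-env package, which hides the export/set difference behind one command (a suggestion on my part, assuming you are willing to add a small dev dependency):

```json
"scripts": {
   "start": "cross-env NODE_ENV=development nodemon -w src --exec \"babel-node src\""
 }
```

After `yarn add cross-env -D`, the same script works on Windows, macOS, and Linux.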

2. A node-gyp error occurs when running npm install

Sometimes after pulling the latest code for a node project and running npm install, an error is reported by node-gyp, the tool node uses to compile native addon modules. If you encounter this problem, the following solution usually helps:

npm install -g node-gyp

Alternatively, delete package-lock.json or yarn.lock and reinstall with yarn install or npm install.

3. Deleting a cookie in a Node + Koa2 project

Because HTTP is a stateless protocol, cookies are needed to distinguish between users. We can think of cookies as a specification implemented jointly by browsers and servers.

Cookie processing is divided into the following 3 steps (basic but important knowledge):

  1. The server sends a cookie to the client
  2. The browser saves the cookie (if Expires or maxAge is set on the back end it persists; otherwise it exists only as a session cookie)
  3. The browser sends the cookie back to the server with every subsequent request

When developing a node back-end project we often have a user management module, which means we need to manage the user's login state and delete the user's cookie promptly when they log out. Fortunately, Koa2 ships with cookie handling, and we can set a cookie like this:

router.post(api.validVip, async ctx => {
  ctx.cookies.set('vid', 'xuxiaoxi', { maxAge: 24 * 3600 * 1000 });
});

Above we set a cookie valid for 1 day. If the business changes and we need to clear this cookie within its validity period, how should we handle it? Here is a workable solution:

ctx.cookies.set('vid', '', { maxAge: 0 });

This way, the cookie on the client becomes invalid on the next request.
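For the curious: deleting a cookie ultimately just means the server sends a Set-Cookie header whose expiry lies in the past. A minimal stdlib sketch of what roughly goes over the wire (Koa's actual output may include more attributes):

```javascript
// Build a Set-Cookie header value that expires a cookie immediately,
// roughly what ctx.cookies.set('vid', '', { maxAge: 0 }) translates to
function expireCookieHeader(name) {
  return `${name}=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT`;
}

console.log(expireCookieHeader('vid'));
```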

4. How to integrate Socket.IO with Koa/Egg

We all know that complete Socket.IO communication consists of two parts:

  1. A Socket.IO server integrated with (or mounted on) the Node.js HTTP server
  2. The client library socket.io-client loaded in the browser

If we are using Koa or Egg and need to attach Socket.IO to the HTTP server, we can do it like this:

import koa from 'koa';
import http from 'http';

const app = new koa();
const server = http.createServer(app.callback());
const io = require('socket.io')(server);

// Normal business processing on the io instance
io.on('connection', (socket) => {
  console.log('a user connected');
  socket.on('doc load', (msg) => {
    console.log('doc load', msg);
    io.emit('getData', users); // "users" comes from your own business logic
  });
});

server.listen(3000, () => {
  // ...
});

Koa and Socket.IO now share one HTTP server, and we can develop IM applications as usual.

5. Errors caused by third-party Node.js modules depending on a specific node version

The main reason is that the third-party package is not fully compatible across node versions. In this case the solution is to update the package to the latest version (if it is still maintained), or use a node version manager such as n to switch to a suitable node version, as follows:

# Update the package to its latest version
npm i xxx@latest

# Or install the node version manager n
npm i -g n

Using n makes it easy to manage node versions, so give it a try.

6. How does NodeJS create scheduled tasks

Scheduled tasks are a common feature in back-end development; in essence, the system automatically executes tasks in the background according to time rules. Back-end languages such as Java and PHP have rich support for scheduled tasks. Node.js, a relative newcomer, is not quite as mature here, but scheduled-task modules do exist, such as node-schedule.

node-schedule is a flexible cron-style and non-cron-style job scheduler for Node.js. It allows us to schedule jobs (arbitrary functions) for execution on specific dates, with optional recurrence rules. At any given time it uses only a single timer (instead of re-evaluating upcoming jobs every second or minute).

A practical scenario: every year on November 11 or 12, we want a node program to automatically grab an e-commerce site's deals and push them to our mailbox. We can use node-schedule to start a scheduled task that executes this business logic. Many of my node applications follow a similar pattern; feel free to get in touch if you are interested.

So what does cron-style scheduling look like? The project's GitHub page gives a brief introduction: a rule consists of six fields, standing for second (optional), minute, hour, day of month, month, and day of week.

So we can write a timed task like this:

let schedule = require('node-schedule');

let testJob = schedule.scheduleJob('42 * * * *', function() {
  console.log('This code runs at minute 42 of every hour, e.g. 22:42, 23:42');
});
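To make the rule concrete, here is a hypothetical stdlib sketch (not node-schedule's actual implementation) of how an "at minute 42 of every hour" rule resolves to its next run time:

```javascript
// Compute the next UTC time at which an "at minute M of every hour" rule fires
function nextRun(from, minute) {
  const next = new Date(from);
  next.setUTCSeconds(0, 0);   // drop seconds and milliseconds
  next.setUTCMinutes(minute); // jump to minute M of the current hour
  if (next <= from) {
    next.setUTCHours(next.getUTCHours() + 1); // already passed: wait for the next hour
  }
  return next;
}

console.log(nextRun(new Date('2024-01-01T22:50:00Z'), 42).toISOString());
// → 2024-01-01T23:42:00.000Z
```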

7. Using import/export and decorator (@decorator) syntax in a Node.js project

We all know that Node.js is now at 14.0+ and supports most of the latest ES syntax, but some syntax is still not supported out of the box, such as ES module import/export and @decorator. To use these features in a node project we have to rely on tooling; here we use Babel 7 to solve the problem, as follows:

# .babelrc
{
  "presets": [
    ["@babel/preset-env", {
      "targets": { "node": "current" }
    }]
  ],
  "plugins": [
    ["@babel/plugin-proposal-decorators", { "legacy": true }],
    ["@babel/plugin-proposal-class-properties", { "loose": true }]
  ]
}

We just need to create this file in the project root and install the corresponding Babel modules, as follows:

yarn add \
  @babel/cli \
  @babel/core \
  @babel/node \
  @babel/plugin-proposal-class-properties \
  @babel/plugin-proposal-decorators \
  @babel/preset-env

At this point you can use these new syntax features just as you would in a front-end project.

8. Gracefully process JSON files in NodeJS and improve JSON read and write performance

There's a lot to say about Node.js optimization, but here we'll focus on JSON. One aspect is serialization performance: we can use fast-json-stringify to greatly speed up JSON stringification. Its essence is a JSON Schema that constrains the structure of the data, so the serializer can be prepared ahead of time instead of inspecting the object at runtime. It is used as follows:

const fastJson = require('fast-json-stringify')
const stringify = fastJson({
  title: 'H5 Dooring Schema',
  type: 'object',
  properties: {
    firstName: {
      type: 'string'
    },
    lastName: {
      type: 'string'
    },
    age: {
      description: 'Age in years',
      type: 'integer'
    },
    reg: {
      type: 'string'
    }
  }
})

For example, the back end of H5-Dooring has many interfaces that read and write JSON data frequently; using fast-json-stringify there greatly improves performance.

On the other hand, operating on JSON files with the native fs API on the node side is verbose. We'd better encapsulate the JSON reading to keep the code simple, or directly use the third-party library jsonfile to read and write JSON files easily, as follows:

const jsonfile = require('jsonfile')
const fileName = 'h5-dooring.json'
const jsonData = jsonfile.readFileSync(fileName)

9. Solving errors when reading large files in NodeJS

There are two ways to read files in NodeJS:

  1. fs.readFile() reads the whole file into memory at once; if the file is too large, node reports an out-of-memory error
  2. fs.createReadStream() reads the file as a stream, so file size is no longer a concern

Fs.createreadstream () is the first option if you want to read a large file (such as a video). In fact, if you need to parse a file, such as a resume, line by line to extract key language materials, We can use Node’s readline module to read and parse the file line by line, as shown in the following example:

const fs = require("fs");
const path = require("path");
const readline = require("readline");

const readlineTask = readline.createInterface({
  input: fs.createReadStream(path.join(__dirname, './h5-dooring'))
});

readlineTask.on('line', function(chunk) {
  // Handle each line of data
});

readlineTask.on('close', function() {
  // Logic to run when the file has been fully read
});

10. How to enable Gzip to optimize website performance

Enabling gzip in Node.js is also part of performance optimization and makes our pages load faster. We can use Koa's koa-compress middleware to implement it, as follows:

import koa from 'koa';
import compress from 'koa-compress';

const app = new koa();
// Enable gzip
const options = { threshold: 2048 };
app.use(compress(options));

Of course, koa-compress has many more configuration options that you can explore.

11. Resolving inconsistent path separators between Windows and Linux

This problem is also caused by OS differences and is worth considering. On Linux the path separator is /, as in h5-dooring/src/pages, while Windows uses backslashes. Since the project may be deployed on different systems, we need to adapt; one way is to define a global separator based on the platform, as follows:

import os from 'os'
const _$ = (os.platform().toLowerCase() === 'win32') ? '\\' : '/';

Then we use _$ in place of the hard-coded separator. The code above uses node's os module; the os module can resolve many interesting problems caused by system differences and is worth studying.

12. How to implement parent-child process communication in NodeJS

Since nodejs is single-threaded, sometimes our business needs multi-process capability. We can simulate multiple processes with the parent-child process model using child_process. I have used child_process in many of the projects I shared before; the general flow looks like this:

// child.js
function computedTotal(arr, cb) {
  // Time-consuming computing tasks
}

// Communicate with the main process:
// listen for messages from the main process
process.on('message', (msg) => {
  computedTotal(bigDataArr, (flag) => {
    // Send the completion signal back to the main process
    process.send(flag);
  });
});

// main.js
const { fork } = require('child_process');

app.use(async (ctx, next) => {
  if (ctx.url === '/fetch') {
    const data = ctx.request.body;
    // Notify the child process to start the task and pass in the data
    const res = await createPromisefork('./child.js', data);
  }

  // Wrap the fork in a promise
  function createPromisefork(childUrl, data) {
    // Load the child process
    const res = fork(childUrl);
    // Tell the child process to start working
    data && res.send(data);
    return new Promise(resolve => {
      res.on('message', f => {
        resolve(f);
      });
    });
  }

  await next();
});

13. Image editing/compression on the node side

In fact, many images need processing on the node side as well; after all, processing quality is hard to control on the client. Here we can use node-images, a lightweight cross-platform image codec library for node.

  • Lightweight: no external image processing library needs to be installed.
  • Cross-platform: a compiled .node file is available for download on Windows.
  • Easy to use: jQuery-style chained API, simple and reliable

We can use it to crop and compress images; basic usage is as follows:

const images = require("images");

images("input.jpg")                      // Load the image file
  .size(400)                             // Scale the image to 400 pixels wide
  .draw(images("logo.png"), 10, 10)      // Draw the logo at (10, 10)
  .save("output.jpg", {                  // Save to a file with quality 50
    quality: 50
  });

I also use it in the H5-Dooring editor for image manipulation and editing; you can apply it to much more in your own business.

14. Parsing command-line strings on the node side to implement online automated build and deployment

Node can parse command strings and execute shell commands via the child_process module's exec, which is enough to implement an automated workflow.

Build Online Automation packaging workflow from Scratch based on NodeJS (H5-Dooring Special Edition)

Here’s a simple example:

const cmdStr = `cd ${outWorkDir} && yarn build ${fid}`
// Parse the command string to build the project automatically online
exec(cmdStr, function(err, stdout, stderr) {
  if (err) {
    console.log('api error:' + stderr);
    io.emit('htmlWorked', { result: 'error', message: stderr })
  } else {
    // ...
  }
})

15. Handling Node application crashes, load balancing, and process management

The best way to solve this problem is to use PM2 or Forever, which provides powerful Node process management, load balancing, and a degree of application monitoring. It is recommended to use PM2 to manage our Node applications in the online environment.
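A typical starting point with pm2 is an ecosystem file; the names and paths below are hypothetical, so adapt them to your project:

```javascript
// ecosystem.config.js (hypothetical values)
module.exports = {
  apps: [{
    name: 'h5-dooring-api',
    script: './dist/index.js',
    instances: 'max',        // one worker per CPU core (built-in load balancing)
    exec_mode: 'cluster',    // cluster mode enables worker-by-worker reload
    env: { NODE_ENV: 'production' }
  }]
};
```

Then `pm2 start ecosystem.config.js` launches and supervises the app, restarting it automatically if it crashes.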

Found this useful? Like and bookmark it; your support is my biggest encouragement! Search WeChat for "interesting talk front-end" to discover more fun H5 games, plus Webpack, Node, gulp, CSS3, JavaScript, Canvas data visualization and other front-end knowledge and practice.