Common Node.js APIs
Buffer
1. ArrayBuffer
- (Note: ArrayBuffer is not specific to Node.js; it is a standard JavaScript object.)
- An ArrayBuffer instance represents a fixed-length binary buffer.
- An ArrayBuffer cannot be operated on directly. You operate on it through a typed array or a DataView, objects that interpret the buffer's data in a specific format and read and write its contents in that format.
- Creating an ArrayBuffer:
new ArrayBuffer(length)
// Parameter: length Indicates the size of the created ArrayBuffer, in bytes.
// Return value: an ArrayBuffer of a specified size whose contents are initialized to 0.
// Exception: if length is greater than Number.MAX_SAFE_INTEGER (>= 2**53) or is negative, a RangeError is thrown.
- Compare ArrayBuffer with TypedArray
var buffer = new ArrayBuffer(8);
var view = new Int16Array(buffer);
console.log(buffer);
console.log(view);
// Output result:
// ArrayBuffer {
// [Uint8Contents]: <00 00 00 00 00 00 00 00>,
// byteLength: 8
// }
// Int16Array(4) [ 0, 0, 0, 0 ]
// Explanation: when logged, an ArrayBuffer shows its contents as Uint8Contents, an 8-bit unsigned view.
// 1 byte = 8 bits, so each element of the 8-bit view takes 1 byte and the 8-byte buffer is shown as 8 elements.
// Int16Array is a 16-bit signed typed array.
// 16 bits = 2 bytes, so each Int16Array element takes 2 bytes,
// which is why the same 8-byte buffer yields 4 elements.
2. Uint8Array
- A Uint8Array instance represents an array of 8-bit unsigned integers.
- When created, its contents are initialized to 0.
- Once created, you can reference elements in the array using the object's methods or standard array index (bracket) syntax.
// Create by length
var uint8 = new Uint8Array(2);
uint8[0] = 42;
console.log(uint8[0]); // 42
console.log(uint8.length); // 2
console.log(uint8.BYTES_PER_ELEMENT); // 1
// Create from an array literal
var arr = new Uint8Array([21, 31]);
console.log(arr[1]); // 31
// Create from another typed array
var x = new Uint8Array([21, 31]);
var y = new Uint8Array(x);
console.log(y[0]); // 21
// Extension:
// Different typed arrays interpret a single element with a different number of bytes.
// The constant BYTES_PER_ELEMENT gives the number of bytes per element for a given typed array.
// This relates to the conversion shown above.
Int8Array.BYTES_PER_ELEMENT; // 1
Uint8Array.BYTES_PER_ELEMENT; // 1
Uint8ClampedArray.BYTES_PER_ELEMENT; // 1
Int16Array.BYTES_PER_ELEMENT; // 2
Uint16Array.BYTES_PER_ELEMENT; // 2
Int32Array.BYTES_PER_ELEMENT; // 4
Uint32Array.BYTES_PER_ELEMENT; // 4
Float32Array.BYTES_PER_ELEMENT; // 4
Float64Array.BYTES_PER_ELEMENT; // 8
3. Relationship between ArrayBuffer and TypedArray
- ArrayBuffer: by itself it only stores raw binary data (0s and 1s) and does not specify how those bits map onto array elements, so an ArrayBuffer cannot be operated on directly; it can only be operated on through a view.
- Because views expose array-like interfaces, we can manipulate the memory of an ArrayBuffer as if it were an array.
- TypedArray: a strongly typed array that provides a view onto an ArrayBuffer; reading and writing its indices is reflected in the underlying ArrayBuffer.
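- A small sketch (assuming a little-endian platform, as on most desktops) of how a DataView and a typed array read and write the same underlying ArrayBuffer:
const buffer = new ArrayBuffer(4);
const view16 = new Int16Array(buffer);  // 2 elements, 2 bytes each
const dataView = new DataView(buffer);
dataView.setUint8(0, 0xff);             // write the first byte through the DataView
console.log(view16[0]);                 // 255: the byte lands in the low half of element 0
view16[1] = 256;                        // write through the typed array
console.log(dataView.getUint8(2), dataView.getUint8(3)); // 0 1: the same bytes seen through the DataView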
4. Buffer in NodeJS
Creating a Buffer
- The Buffer class implements the Uint8Array API in a way better suited to Node.js.
- A Buffer instance is similar to an integer array, but its size is fixed and its physical memory is allocated outside the V8 heap.
- Buffer.alloc example
// Create a Buffer of length 10 filled with 0.
const buf1 = Buffer.alloc(10);
// <Buffer 00 00 00 00 00 00 00 00 00 00>
// Create a Buffer of length 10 and fill it with 0x1.
const buf2 = Buffer.alloc(10, 1) // the fill value is decimal, but output is shown in hexadecimal, e.g. 10 -> 0a
// <Buffer 01 01 01 01 01 01 01 01 01 01>
- When Buffer.allocUnsafe is called, the allocated memory segment is not initialized; this is by design, to make allocation very fast.
- However, because memory created with Buffer.allocUnsafe is not completely overwritten, old data may leak if the allocated segment contains sensitive data and the Buffer is readable, introducing a security hole into the program.
- Buffer.allocUnsafe example
// Create an uninitialized Buffer of length 10.
// This is faster than calling Buffer.alloc(),
// but the returned Buffer instance may contain old data,
// so it should be overwritten with fill() or write().
const buf3 = Buffer.allocUnsafe(10);
- Buffer.from example
// Create a Buffer containing the bytes [0x1, 0x2, 0x3].
// Note that the numbers are written in decimal but displayed in hexadecimal.
const buf4 = Buffer.from([1, 2, 3]);
// Create a Buffer containing the UTF-8 bytes of "test".
const buf5 = Buffer.from("test");
The character encoding of Buffer
- Buffer instances are typically used to represent sequences of encoded characters, such as UTF-8, UCS-2, Base64, or hexadecimal-encoded data.
- By using an explicit character encoding, you can convert between Buffer instances and ordinary JavaScript strings.
- Converting between Buffer and String:
const buf = Buffer.from('hello world', 'ascii');
console.log(buf)
// Outputs 68656c6c6f20776f726c64 (hexadecimal)
console.log(buf.toString('hex'));
// Outputs aGVsbG8gd29ybGQ= (base64)
console.log(buf.toString('base64'))
- The character encodings currently supported by Node.js include:
- 'ascii' - 7-bit ASCII data only. Very fast, strips the high bit.
- 'utf8' - multi-byte encoded Unicode characters. Many web pages and other document formats use UTF-8.
- 'utf16le' - 2 or 4 bytes, little-endian encoded Unicode characters. Surrogate pairs (U+10000 to U+10FFFF) are supported.
- 'ucs2' - alias of 'utf16le'.
- 'base64' - Base64 encoding. When creating a Buffer from a string, this encoding also correctly accepts the "URL and filename safe alphabet" specified in section 5 of RFC 4648.
- 'latin1' - a way to encode a Buffer as a one-byte encoded string (as defined by IANA in RFC 1345, page 63, the Latin-1 supplement block and C0/C1 control codes).
- 'binary' - alias of 'latin1'.
- 'hex' - encodes each byte as two hexadecimal characters.
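- A quick sketch of how the chosen encoding changes both the stored bytes and the round-trip result (the sample string is only an illustration):
const text = 'héllo';
console.log(Buffer.from(text, 'utf8'));   // <Buffer 68 c3 a9 6c 6c 6f>  (é takes 2 bytes)
console.log(Buffer.from(text, 'latin1')); // <Buffer 68 e9 6c 6c 6f>     (é takes 1 byte)
console.log(Buffer.from(text, 'utf8').toString('latin1')); // 'hÃ©llo' - decoded with the wrong encoding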
5. Buffer memory management
Node memory management
- We refer to all memory occupied by the Node process during the execution of the program as resident memory.
- Resident memory consists of code area, stack, heap, and off-heap memory.
- The code area, stack, and heap are managed by V8, while the out-of-heap memory is not.
(Diagram: resident memory is divided into the code area, stack, and heap, all managed by V8, plus off-heap memory, which is outside V8's management.)
Buffer memory allocation
- As we know, Buffer objects are not allocated in V8's heap memory; the memory is implemented at Node's C++ layer.
- The reason:
  - The storage space needed by large objects is unpredictable; requesting it from the operating system each time would put pressure on the OS.
  - So Node combines C++-level memory allocation with JavaScript-level memory assignment.
- The slab memory allocation mechanism, briefly:
  - It uses an "apply in advance, allocate afterwards" approach.
  - In simple terms, a slab is a fixed-size area of memory with three states:
    - full: fully allocated
    - partial: partially allocated
    - empty: not allocated
  - The mechanism uses an 8 KB boundary (the size of each slab) to decide whether the object being allocated is a large or a small object.
  - Memory is assigned from slabs at the JavaScript level.
- Conclusion:
  - An 8 KB memory pool is initialized on first load.
  - Based on the requested size, allocations are divided into small and large Buffer objects.
  - For small Buffer objects (less than 4 KB): check whether the current slab has enough free space.
    - If so, use the remaining space (the pool offset increases).
    - If not, allocate another 8 KB slab.
  - For large Buffer objects (greater than 4 KB): allocate the requested size directly at the C++ level.
  - For both small and large Buffer objects, memory allocation happens at the C++ level while memory assignment happens at the JavaScript level; the Buffer object itself can be reclaimed by V8's garbage collector, but its off-heap memory is released by C++.
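- A small sketch of the 4 KB threshold described above; in a fresh process the output is typically as commented (Buffer.poolSize is the 8 KB pool size):
console.log(Buffer.poolSize);                 // 8192
const small = Buffer.allocUnsafe(100);        // small: served from the shared 8 KB pool
const large = Buffer.allocUnsafe(5000);       // large (> 4 KB): allocated separately
console.log(small.buffer === large.buffer);   // false - they do not share an ArrayBuffer
const small2 = Buffer.allocUnsafe(100);
console.log(small.buffer === small2.buffer);  // true - both are slices of the same pool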
FastBuffer
- In addition to the Buffer class, there is the FastBuffer class
- We know that Uint8Array can be called like this:
new Uint8Array(length); // from a length
new Uint8Array(typedArray); // from another typed array
new Uint8Array(object); // from an object
new Uint8Array(buffer[, byteOffset[, length]]); // from an ArrayBuffer, with optional byte offset and length
- The FastBuffer class declaration is as follows:
class FastBuffer extends Uint8Array {
  constructor(arg1, arg2, arg3) {
    super(arg1, arg2, arg3);
  }
  // ...
}
- Buffer.from can be called like this:
// Buffer.from is a factory function.
Buffer.from(str[, encoding]); // from a string [with an optional encoding] (more convenient than the typed array form)
Buffer.from(array); // from an array
Buffer.from(buffer); // from another Buffer
Buffer.from(arrayBuffer[, byteOffset[, length]]); // from an ArrayBuffer, with optional byte offset and length
- FastBuffer summary:
- When no encoding is set, utf8 is used by default.
- If the string requires more than 4 KB, memory is allocated directly.
- If the string requires less than 4 KB but exceeds the remaining space of the pre-allocated 8 KB memory pool, a new 8 KB pool is requested.
- When FastBuffer objects are created, the data is stored, the length is checked, and poolOffset and byte alignment are updated.
Common Buffer static methods
Buffer.byteLength(string)
: Gets the string length in bytes
console.log(Buffer.byteLength("hello world"))
Buffer.isBuffer(any)
: checks whether a value is a Buffer
console.log(Buffer.isBuffer("It's not a buffer"))
console.log(Buffer.isBuffer(Buffer.alloc(10)))
Buffer.concat(Buffer[],byteLength?)
: merges Buffers, with an optional total byte length
const buffer1 = Buffer.from("hello");
const buffer2 = Buffer.from("world");
console.log(Buffer.concat([buffer1,buffer2]))
console.log(Buffer.concat([buffer1,buffer2],12))
Common Buffer instance methods
buf.write(string[,offset[,length]][,encoding])
: Writes the string to buffer
const buf1 = Buffer.alloc(20);
console.log("Create an empty buffer",buf1);
buf1.write('hello');
console.log('buf1.write("hello"): write hello');
console.log(buf1);
buf1.write("hello", 5, 3)
console.log('buf1.write("hello", 5, 3): offset by five bytes and write the first three bytes of hello');
console.log(buf1)
console.log('Output as a string:')
console.log(buf1.toString())
buf.fill(value[,offset[,end]][,encoding])
: fills the buffer
- Compared with buf.write, by parameters:
  - buf.write: string + offset + length + encoding
  - buf.fill: arbitrary value + offset + end position + encoding
- Compared with buf.write, by meaning:
  - buf.write: writes only as much content as is given, unless offset and length are specified
  - buf.fill: repeats the value until the buffer is filled, unless offset and end are specified
const buf1 = Buffer.alloc(20);
buf1.fill("hello");
console.log(buf1);
const buf2 = Buffer.alloc(20);
buf2.fill("hello", 4, 6);
console.log(buf2);
buf.length
: the length of the buffer
- Compared with the static method Buffer.byteLength(string):
  - Buffer.byteLength(string): takes a string and returns its byte length.
  - buf.length: returns the length of the buffer instance in bytes.
const buf1 = Buffer.alloc(10);
console.log(buf1.length);
const buf2 = Buffer.from("eric");
console.log(buf2.length);
console.log(Buffer.byteLength("eric"));
buf.toString([encoding[,start[,end]]])
: decodes the buffer into a string
- encoding: character encoding to use. Default: 'utf8'.
- start: byte index at which to begin decoding. Default: 0.
- end: byte index at which to end decoding (not included). Default: buf.length.
const buf = Buffer.from('test');
console.log(buf.toString('utf8', 1, 3)); // es
buf.toJSON()
: Returns the JSON format of buffer
const buf = Buffer.from("test");
console.log(buf.toJSON());
// { type: 'Buffer', data: [ 116, 101, 115, 116 ] }
buf.equals(otherBuffer)
: Compares whether other buffers have exactly the same bytes
const ABC = Buffer.from('ABC');
const hex414243 = Buffer.from('414243', 'hex');
const ABCD = Buffer.from('ABCD');
console.log(
ABC,
hex414243,
ABCD
);
console.log(ABC.equals(hex414243))
console.log(ABC.equals(ABCD))
// The output is as follows:
// <Buffer 41 42 43> <Buffer 41 42 43> <Buffer 41 42 43 44>
// true
// false
buf.slice([start[,end]])
: slices out a section of the buffer
- start: where the new Buffer starts. Default: 0.
- end: where the new Buffer ends (not included).
const buf1 = Buffer.from("abcdefghi")
console.log(buf1.slice(2, 7).toString());
// Output result:
// cdefg
const buf2 = Buffer.from("abcdefghi")
console.log(buf2.toString("utf8", 2, 7));
// Output result:
// cdefg
buf.copy(target[,targetStart[,sourceStart[,sourceEnd]]])
: copies the buffer
- target: the Buffer or Uint8Array to copy into.
- targetStart: number of bytes to skip in the target buffer before writing begins. Default: 0.
- sourceStart: index in the source buffer at which to start copying. Default: 0.
- sourceEnd: index in the source buffer at which to end copying (not included). Default: buf.length.
- source.copy(target) copies values from the source into the target.
- It returns the number of bytes copied.
- It is the target that is modified.
const buf1 = Buffer.from("abcdefghi");
const buf2 = Buffer.from("test");
console.log(buf1.copy(buf2))
// outputs 4
console.log(buf2.toString())
// outputs "abcd"
Stream
concept
- When talking about streams, we first need to understand streaming data: streaming data is simply data in bytes.
- When data is exchanged and transmitted between objects in an application, the data contained in an object is converted to stream data and then transmitted through a stream.
- Once it reaches the destination object, the stream data is converted back into data that object can use.
- So a stream is used to transmit stream data; it is a means of transport.
- Typical stream applications:
  - HTTP requests and responses
  - HTTP sockets
  - Compression and encryption
Why do you need streams?
- With streaming we do not need to load all of the data into memory at once, so memory usage is lower.
- We do not have to wait for all of the data to arrive before processing it, so time is used more efficiently.
- For example, in a file transfer scenario (see the sketch after this list):
  - For smaller files, we can read the whole file into memory and then write it out.
  - But for large binary files, such as audio or video files of several GB, this approach easily exhausts memory.
  - In that case we need piecemeal transfer: read a part, write a part; no matter how big the file is, given enough time it will eventually be processed.
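- A minimal sketch of the two approaches, assuming a local source file 'big.file' and a destination 'copy.file':
const fs = require('fs');
// Whole file in memory - fine for small files only.
fs.readFile('big.file', (err, data) => {
  if (err) throw err;
  fs.writeFile('copy.file', data, (err) => { if (err) throw err; });
});
// Streaming - memory use stays around the highWaterMark no matter how large the file is.
fs.createReadStream('big.file').pipe(fs.createWriteStream('copy.file'));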
Streams in Node
- In Node, a stream is an abstract interface implemented by many Node objects.
- By default a stream carries Buffer/String data, but if objectMode is set it can accept any JavaScript object; such a stream is called an object stream.
- There are four basic stream types in Node.js:
Type | Description | Example |
---|---|---|
Readable | A readable stream | fs.createReadStream() |
Writable | A writable stream | fs.createWriteStream() |
Duplex | A readable and writable stream | net.Socket |
Transform | A Duplex stream whose data can be modified and transformed as it is read and written | zlib.createDeflate() |
- The base class of all streams is require('events').EventEmitter.
- We can load the four stream base classes with const { Readable, Writable, Duplex, Transform } = require('stream').
A readable stream
- A readable stream is an abstraction of the source that provides the data.
- Example of a readable stream:
- Standard input process.stdin
- Child process standard output and error output child_process.stdout,child_process.stderr
- File read stream fs.createReadStream
- Response received by the client
- Request received by the server
- Readable streams have two modes that can be switched at any time:
- Flow mode: Automatically reads data.
- Pause mode: Stops reading data.
- APIs that switch to flowing mode:
  - listening for the 'data' event
  - calling resume() while in paused mode
  - calling pipe() to send the data to a writable stream
- APIs that switch to paused mode:
  - calling pause() while in flowing mode
  - removing pipe destinations with the unpipe() method
- Readable stream listener events :’data’,’error’,’end’,’close’,’readable’.
- Other methods: destroy
- All readable streams implement the interface defined by the stream.Readable class.
- Custom readable stream:
const Readable = require('stream').Readable;
class CustomReadStream extends Readable {
  constructor(source, opt) {
    /** Configuration options are passed to the base class */
    super(opt)
    this.source = source;
    this.index = 0;
  }
  _read(highWaterMark) {
    if (this.index === this.source.length) {
      this.push(null)
    } else {
      this.source
        /** Take out the next chunk */
        .slice(this.index, this.index + highWaterMark)
        .forEach(element => {
          /** Note: push strings */
          this.push(element.toString())
        })
    }
    this.index += highWaterMark
  }
}
const customStream = new CustomReadStream(
  // [1, 2, 3, 4, 5, 6, 7, 8, 9, 0],
  ["A", "B", "C", "D", "E", "F", "G"],
  /** Set the buffer size to 2 */
  { highWaterMark: 2 }
)
customStream.on('data', chunk => {
  /** With console.log each chunk would be printed on its own line */
  process.stdout.write(chunk);
})
fs.createReadStream
Usage example
const fs = require('fs');
const path = require('path')
const rs = fs.createReadStream(
  path.resolve(process.cwd(), 'example.md'),
  {
    flags: 'r',        // what we want to do with the file
    mode: 0o666,       // permission bits
    encoding: 'utf8',  // defaults to buffer if not passed; here chunks arrive as strings
    // start: 3,       // start reading at index 3
    // end: 8,         // stop reading at index 8
    highWaterMark: 3   // buffer size
  })
/** 'open' fires when the file is opened */
rs.on('open', function () {
  process.stdout.write('open the file')
})
/** display chunks as utf8 strings */
rs.setEncoding('utf8');
/** add a pause/resume mechanism to the data listener */
rs.on('data', function (data) {
  process.stdout.write(data)
  rs.pause(); // suspend reading and stop firing 'data' events
  setTimeout(function () {
    rs.resume(); // resume reading and firing 'data' events
  }, 2000)
})
/** an 'error' event is emitted if the read fails */
rs.on('error', function () {
  process.stdout.write('error');
})
/** when the content has been fully read, the 'end' event is emitted */
rs.on('end', function () {
  process.stdout.write('finish');
})
rs.on('close', function () {
  process.stdout.write('close the file')
})
Pseudo code: rs = fs.createReadStream(filePath, option); rs.on("open" | "data" | "error" | "end" | "close"); rs.pause(); rs.resume(); rs.destroy()
Writable streams
- A writable stream is an abstraction of the 'destination' to which data is written.
- Examples of writable streams:
  - standard output and standard error: process.stdout, process.stderr
  - standard input of a child process: child_process.stdin
  - file write stream: fs.createWriteStream
  - the request sent by a client
  - the response returned by a server
- Writable stream events that can be listened for: 'drain', 'error', 'close', 'finish', 'pipe', 'unpipe'
- Custom writable stream:
const Writable = require('stream').Writable;
class CustomWritable extends Writable {
  constructor(arr, opt) {
    super(opt);
    // this.arr points to the same array that was passed in
    this.arr = arr;
  }
  // Implement _write()
  _write(chunk, encoding, callback) {
    this.arr.push(chunk.toString());
    callback();
  }
}
const data = []
const customWritable = new CustomWritable(
  data,
  {
    highWaterMark: 3
  }
)
customWritable.write('1')
customWritable.write('2')
customWritable.write('3')
console.log(data)
fs.createWriteStream
Use case
let fs = require('fs');
let path = require('path')
let ws = fs.createWriteStream(
  path.resolve(process.cwd(), 'example.md'),
  {
    flags: 'w',
    mode: 0o666,
    start: 3,
    highWaterMark: 3 // the default is 16 KB
  })
let flag = ws.write('1');
process.stdout.write(flag.toString()); // 'true' -> process.stdout.write only accepts strings and buffers, not booleans
console.log(flag) // console.log can print non-string values directly
flag = ws.write('2');
process.stdout.write(flag.toString()); // true
flag = ws.write('3');
process.stdout.write(flag.toString()); // false -> highWaterMark is 3, so the buffer is now full
flag = ws.write('4');
process.stdout.write(flag.toString()); // false
fs.createWriteStream
Complex case
const fs = require('fs');
const path = require('path');
const ws = fs.createWriteStream(
  path.resolve(process.cwd(), 'example'),
  {
    flags: 'w',        // what we want to do with the file: 'w' = write
    mode: 0o666,       // permission bits
    start: 0,          // start writing at offset 0
    highWaterMark: 3   // buffer size
  })
let count = 9;
function write() {
  let flag = true;
  while (flag && count > 0) {
    process.stdout.write(`before ${count}\n`)
    flag = ws.write(
      `${count}`,
      'utf8',
      /** The callback is asynchronous */
      ((i) => () => process.stdout.write(`after ${i}\n`))(count))
    count--;
  }
}
write();
ws.on('drain', function () {
  process.stdout.write('drain\n');
  write()
})
ws.on('error', function (error) {
  process.stdout.write(`${error.toString()}\n`)
})
/** Output:
before 9 before 8 before 7 after 9 drain before 6 before 5 before 4 after 8 after 7 after 6 drain before 3 before 2 before 1 after 5 after 4 after 3 drain after 2 after 1 */
// Call end() to close the write stream once no more writes are needed
ws.end();
pipe
- pipe is the simplest and most direct way to connect two streams. The whole data-transfer process is implemented internally, so during development there is no need to track the flow of data yourself.
- Simulating the principle of pipe with fs.createReadStream and fs.createWriteStream:
const fs = require('fs');
const path = require('path');
const appDirectory = process.cwd()
const ws = fs.createWriteStream(path.resolve(appDirectory, 'pipe2.md'));
const rs = fs.createReadStream(path.resolve(appDirectory, 'pipe1.md'));
rs.on('data', function (chunk) {
  /** 1. Write the data read from the read stream into the write stream's buffer (balancing producer and consumer speed) */
  const flag = ws.write(chunk);
  /** 2. If the write buffer is full, stop reading */
  if (!flag) rs.pause();
})
ws.on('drain', function () {
  /** 3. When the write stream's buffer has drained, resume reading */
  rs.resume()
})
rs.on('end', function () {
  /** 4. When the read stream finishes, finish the write stream too */
  ws.end();
})
pipe
Usage
const fs = require("fs");
const path = require("path");
const appdirectory = process.cwd();
const from = fs.createReadStream(path.resolve(appdirectory,'pipe1.md'));
const to = fs.createWriteStream(path.resolve(appdirectory,'pipe2.md'));
from.pipe(to);
// setTimeout(() => {
//   console.log('stop writing to pipe2.md');
//   from.unpipe(to);
//   console.log('manually close the file stream');
//   to.end()
// }, 2000)
- A simple implementation of pipe
fs.ReadStream.prototype.pipe = function (dest) {
  this.on('data', (data) => {
    const flag = dest.write(data);
    if (!flag) this.pause();
  })
  dest.on('drain', () => {
    this.resume()
  })
  this.on('end', () => {
    dest.end();
  })
}
Readable-writable streams (duplex streams)
- A stream that is both readable and writable is called a duplex stream.
- In a duplex stream the readable side and the writable side are completely independent of each other; the two capabilities are simply combined into one object.
- Duplex streams implement both the Readable and the Writable interface.
- Examples of duplex streams:
  - TCP sockets: the client socket in net.createServer(socket => { ... })
    - readability: socket.on("data", function (data) { ... })
    - writability: socket.write('hello world')
- Custom duplex stream:
const Duplex = require("stream").Duplex;
class CustomDuplex extends Duplex {
  constructor(arr, opt) {
    super(opt);
    this.arr = arr;
    this.index = 0;
  }
  /** Implements the _read method */
  _read(size /** buffer size */) {
    if (this.index >= this.arr.length) {
      this.push(null)
    } else {
      this.arr.slice(this.index, this.index + size).forEach((value) => {
        this.push(value.toString());
      })
      this.index += size;
    }
  }
  /** Implements the _write method */
  _write(chunk, encoding, callback) {
    this.arr.push(chunk.toString());
    callback()
  }
}
const data = [];
const customDuplex = new CustomDuplex(data, { highWaterMark: 3 });
/** Write data into the stream */
customDuplex.write("1" /** chunk */);
customDuplex.write("2" /** chunk */);
customDuplex.write("3" /** chunk */);
console.log(data);
/** Read data from the stream */
console.log(customDuplex.read(2 /** size */).toString())
console.log(customDuplex.read(2 /** size */).toString())
Transform streams
- Transform streams are streams in which the input is transformed before being output.
- A transform stream is also a kind of duplex stream; it implements both the Readable and Writable interfaces, but when using one we only need to implement the _transform method.
- Transform stream examples (see the zlib sketch after this list):
  - data compression/decompression: zlib streams such as createGzip/createGunzip and createDeflate/createInflate (in contrast to the one-shot zlib.gzip/unzip APIs)
  - data encryption/decryption: crypto streams such as crypto.createCipher/createDecipher
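- A small sketch of a built-in transform stream: gzip-compressing a file with zlib (the file names are placeholders):
const fs = require('fs');
const zlib = require('zlib');
fs.createReadStream('example.md')
  .pipe(zlib.createGzip())                      // transform stream: plain bytes in, gzip bytes out
  .pipe(fs.createWriteStream('example.md.gz'));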
- Custom transform stream:
const Transform = require("stream").Transform;
class CustomTransform extends Transform {
  constructor(opt) {
    super(opt);
  }
  _transform(chunk, encoding, callback) {
    /** Push the transformed data to the readable side */
    this.push(chunk.toString().toUpperCase());
    /** callback: argument 1 is an Error object; argument 2, if passed, is forwarded to readable.push() */
    callback();
  }
}
let t = new CustomTransform({ highWaterMark: 3 });
t.on('data', function (data) {
  console.log('data', data.toString());
});
// process.stdin.pipe(t) writes standard input into our transform stream t, used here as a writable stream.
// .pipe(process.stdout) reads data from the transform stream t into standard output, t used here as a readable stream.
process.stdin.pipe(t).pipe(process.stdout);
(Extension) Object streams
- By default, a stream processes data of Buffer/String type.
- But if the objectMode property is set, the stream can accept any JavaScript type; such a stream is called an object stream.
const Transform = require("stream").Transform;
const fs = require("fs");
const path = require("path")
const appDirectory = process.cwd()
const rs = fs.createReadStream(path.resolve(appDirectory,"user.json"))
rs.setEncoding("utf8")
const toJSON = new Transform({
  readableObjectMode: true,
  transform: function (chunk, encoding, callback) {
    this.push(JSON.parse(chunk));
    callback();
  }
})
const jsonOut = new Transform({
  writableObjectMode: true,
  transform: function (chunk, encoding, callback) {
    console.log(chunk)
    callback();
  }
})
rs.pipe(toJSON).pipe(jsonOut)
Summary: read the file './user.json', set the ReadStream encoding to 'utf8', create a transform stream 'toJSON' and a transform stream 'jsonOut', then pipe: ReadStream -> toJSON -> jsonOut.
Events
- The events module is one of Node's core modules. Almost all common Node modules, such as http and fs, inherit from EventEmitter.
- Example 1: register one event listener for the wakeup event
const EventEmitter = require('events').EventEmitter;
class Man extends EventEmitter {}
const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up.');
})
man.emit('wakeup');
// Output is as follows:
// The man has woken up.
- Example 2: register multiple event listeners for the wakeup event
- When an event is triggered, event listeners are executed in the order they were registered.
const EventEmitter = require('events').EventEmitter;
class Man extends EventEmitter {}
const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up.');
})
man.on('wakeup', function () {
  console.log('The man has woken up again.');
})
man.emit('wakeup');
// Output is as follows:
// The man has woken up.
// The man has woken up again.
- Example 3: register an event listener that runs only once (once)
const EventEmitter = require('events').EventEmitter;
class Man extends EventEmitter {}
const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up.');
})
man.once('wakeup', function () {
  console.log('The man has woken up again.');
})
man.emit('wakeup');
// Output is as follows:
// The man has woken up.
// The man has woken up again.
man.emit('wakeup');
// Output is as follows:
// The man has woken up.
- Example 4: If an event is fired before registering event listeners, it is ignored.
const EventEmitter = require("events").EventEmitter;
class Man extends EventEmitter {}
const man = new Man();
man.emit('wakeup', 1)
man.on('wakeup', function (index) {
  console.log('The man has woken up -> ' + index)
})
man.emit('wakeup', 2)
// Output is as follows:
// The man has woken up -> 2
- Example 5: show that EventEmitter listeners execute synchronously (in order), not asynchronously
const EventEmitter = require("events").EventEmitter;
class Man extends EventEmitter {}
const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up');
})
man.emit('wakeup')
console.log('The woman has woken up');
// Output is as follows:
// The man has woken up
// The woman has woken up
// Conclusion: listeners execute synchronously, in order
- Example 6: Remove event listeners
const EventEmitter = require("events").EventEmitter;
class Man extends EventEmitter {}
const man = new Man();
function wakeup () {
console.log('The man has woken up')
}
man.on('wakeup',wakeup);
man.emit('wakeup')
// Output is as follows:
// The man has woken up
man.removeListener('wakeup',wakeup);
man.emit('wakeup')
// no output
- Hand-rolled implementation of EventEmitter
/** When an argument is supposed to be a listener, validate it */
function checkListener(listener) {
  if (typeof listener !== "function") {
    throw new TypeError("listener must be a function.")
  }
}
/** The event emitter class */
class EventEmitter {
  constructor() {
    this._events = {}
  }
  addListener(eventName, listener) {
    checkListener(listener)
    if (!this._events[eventName]) {
      this._events[eventName] = []
    }
    this._events[eventName].push(listener);
  }
  on(eventName, listener) {
    this.addListener(eventName, listener);
  }
  emit(eventName, ...args) {
    const listeners = this._events[eventName];
    if (!listeners) return;
    listeners.forEach(fn => fn.apply(this, args));
  }
  removeListener(eventName, listener) {
    checkListener(listener);
    const listeners = this._events[eventName];
    if (!listeners) return false;
    const index = listeners.findIndex(item => item === listener);
    if (index === -1) return false;
    listeners.splice(index, 1)
  }
  off(eventName, listener) {
    this.removeListener(eventName, listener)
  }
  removeAllListeners(eventName) {
    if (this._events[eventName]) {
      delete this._events[eventName];
    }
  }
  once(eventName, listener) {
    checkListener(listener)
    /** Wrap the listener so it removes itself after being called */
    const wrap = (...args) => {
      listener.apply(this, args);
      this.removeListener(eventName, wrap);
    }
    this.addListener(eventName, wrap);
  }
}
Main methods of EventEmitter: addListener / on / once, emit, removeListener / off / removeAllListeners
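- A quick usage check of the hand-rolled EventEmitter above (same script, so EventEmitter here refers to the class just defined):
const emitter = new EventEmitter();
const greet = (name) => console.log(`hello ${name}`);
emitter.on('greet', greet);
emitter.once('greet', (name) => console.log(`hello ${name}, just this once`));
emitter.emit('greet', 'node'); // hello node / hello node, just this once
emitter.emit('greet', 'node'); // hello node
emitter.off('greet', greet);
emitter.emit('greet', 'node'); // no output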
fs
concept
- fs is Node's core file-system module.
fs-related APIs
fs.readFile(path[,option],callback)
: reads the file content
- path: file path
- encoding?: encoding
- callback(err, data): callback function
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.readFile(
  path.resolve(appDirectory, "example.md"),
  "utf8",
  (err, data) => {
    if (err) throw err;
    process.stdout.write(data)
  }
)
fs.writeFile(file,data[,option],callback)
: writes content to a file
- file: file name or file descriptor
- data: the data to write
- option:
  - option.encoding: encoding used when writing characters. Default: utf8
  - option.mode: file mode (permissions). Default: 0o666
- callback(err): callback function
- Note: fs.writeFile replaces the whole file content by default; see the supported file system flags, the default is 'w'.
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
const data = "It is a test."
fs.writeFile(
  path.resolve(appDirectory, "example.md"),
  data,
  (err) => {
    if (err) throw err;
    console.log("Write succeeded")
  })
fs.appendFile(file,data[,option],callback)
: appends content to a file
- file: file name or file descriptor
- data: the data to append
- option:
  - option.encoding: encoding used when writing characters. Default: utf8
  - option.mode: file mode (permissions). Default: 0o666
- callback(err): callback function
- Note: fs.appendFile appends to the end of the file by default; see the supported file system flags, the default is 'a'.
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
const data = "It is a content which append"
fs.appendFile(
  path.resolve(appDirectory, "example.md"),
  data,
  (err) => {
    if (err) throw err;
    console.log("Append succeeded")
  })
fs.stat(path[,option],callback)
: gets file status (including whether it is a directory)
- path: file path
- options:
  - options.bigint: whether numeric values in the returned fs.Stats object are of type bigint. Default: false
- callback(err, stats): callback function
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.stat(
  path.resolve(appDirectory, "example.md"),
  (err, stats) => {
    if (err) throw err;
    console.log(stats)
    /** These two are functions (not enumerable properties), so we call and print them directly */
    console.log(stats.isFile())
    console.log(stats.isDirectory())
  }
)
// Stats {
// dev: 4004941460,
// mode: 33206,
// nlink: 1,
// uid: 0,
// gid: 0,
// rdev: 0,
// blksize: undefined,
// ino: 1688849861160504,
// size: 56,
// blocks: undefined,
//   atimeMs: 1618224089682.408,
//   mtimeMs: 1618224089682.408,
//   ctimeMs: 1618224089682.408,
//   birthtimeMs: 1616552557350.4214,
//   atime: 2021-04-12T10:41:29.682Z,
//   mtime: 2021-04-12T10:41:29.682Z,
//   ctime: 2021-04-12T10:41:29.682Z,
//   birthtime: 2021-03-24T02:22:37.350Z }
// true
// false
fs.rename(oldPath,newPath,callback)
: renames a file
- oldPath: old file path
- newPath: new file path
- callback(err): callback function
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.rename(
  path.resolve(appDirectory, "rename.md"),
  path.resolve(appDirectory, "example.md"),
  (err) => {
    if (err) throw err;
    console.log('File renamed')
  })
fs.unlink(path,callback)
: deletes a file
- path: file path
- callback(err): callback function
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.unlink(
  path.resolve(appDirectory, "example.md"),
  (err) => {
    if (err) throw err;
    console.log('Deleted successfully')
  })
fs.mkdir(path[,options],callback)
: creates a directory
- path: path
- options:
  - options.recursive: whether to create directories recursively. Default: false (note that older versions do not support recursive creation)
  - options.mode: file mode (permissions), not supported on Windows. Default: 0o777
- callback(err): callback function
/** non-recursive creation */
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.mkdir(
  path.resolve(appDirectory, "example"),
  (err) => {
    if (err) throw err;
    console.log("Folder created successfully")
  })
/** Recursive creation */
fs.mkdir(
  path.resolve(appDirectory, "a/b"),
  {
    recursive: true
  },
  err => {
    if (err) throw err;
    console.log("Recursive folder creation succeeded");
  })
fs.readdir(path[,options],callback)
: reads a directory
- path: path
- options:
  - options.encoding: encoding of the returned entries. Default: 'utf8'.
  - options.withFileTypes: return fs.Dirent objects. Default: false.
- callback(err, files <string[]> | <Buffer[]> | <fs.Dirent[]>)
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.readdir(
  path.resolve(appDirectory, "b"),
  {
    encoding: 'buffer',   // with 'buffer', files returns file names as Buffer objects
    withFileTypes: true   // include the file type
  },
  (err, files) => {
    if (err) throw err;
    console.log(files);
  })
// [
//   Dirent { name: ..., [Symbol(type)]: 2 }
// ]
fs.rmdir(path[,options],callback)
: deletes a directory
- path: path
- options:
  - options.maxRetries: number of retries when an EBUSY, EMFILE, ENFILE, ENOTEMPTY, or EPERM error occurs; each retry waits retryDelay before retrying the operation. Ignored if recursive is not true. Default: 0
  - options.retryDelay: interval between retries, ignored if recursive is not true. Default: 100
  - options.recursive: if true, performs a recursive directory removal; in recursive mode no error is reported if path does not exist
- callback(err): callback function
const fs = require("fs");
const path = require("path");
const appDirectory = process.cwd();
fs.rmdir(
  path.resolve(appDirectory, "b"),
  {
    recursive: true // recursive deletion is usually used; whether the folder has content is a black box to the caller
  },
  err => {
    if (err) throw err;
    console.log("Folder deleted successfully")
  })
/** Error codes: https://blog.csdn.net/a8039974/article/details/25830705 */
Node.js file system operation library
- Listen for file changes chokidar
- Installation:
npm install chokidar --save-dev
const chokidar = require("chokidar");
chokidar
  .watch(
    process.cwd(),
    {
      ignored: './node_modules'
    }
  )
  .on(
    'all',
    (event, _path) => {
      console.log('Change detected', event, _path)
    }
  )
path
path.basename(path[,ext])
: returns the last part of the path
const path = require("path");
console.log(
  path.basename(
    path.resolve(process.cwd(), "example.md"),
    ".md"))
// example
path.dirname(path)
: Returns the directory where the file resides
const path = require("path");
console.log(
  path.dirname(
    path.resolve(process.cwd(), "example.md")))
// D:\parent
path.extname(path)
: Returns the file extension
const path = require("path");
console.log(
  path.extname(
    path.resolve(process.cwd(), "example.md")))
// .md
path.join([...paths])
: joins path segments into one path
const path = require("path");
console.log(
  path.join("/nodejs/", "/example.md"))
// \nodejs\example.md
// Redundant slashes are removed when joining
path.normalize(path)
: normalizes a path
const path = require("path");
console.log(
  path.normalize("/nodejs/example2.md/../example.md"))
// \nodejs\example.md
// Resolves '..' and '.' segments in the path
path.resolve([...paths])
: resolves a sequence of paths into an absolute path
- Compared with path.join():
  - path.resolve(): resolves the segments into an absolute path.
  - path.join(): only joins the segments and cleans up redundant slashes.
const path = require("path");
console.log(
  path.resolve("./example.md"))
// D:\parent\example.md
console.log(
  path.resolve(process.cwd(), "/example.md"))
// D:\example.md -> "/example.md" is absolute, so resolution restarts from the drive root
console.log(
  path.resolve(process.cwd(), "example.md"))
// D:\parent\example.md
path.parse(path)
: Returns an object containing the path property
const path = require('path');
const pathObj = path.parse('/nodejs/test/index.js');
console.log(pathObj)
// {
// root: '/',
// dir: '/nodejs/test',
// base: 'index.js',
// ext: '.js',
// name: 'index'
// }
path.format(pathObject)
: Converts a path object to a path
const path = require('path');
const pathObj = path.parse('/nodejs/test/index.js');
console.log(pathObj)
// {
// root: '/',
// dir: '/nodejs/test',
// base: 'index.js',
// ext: '.js',
// name: 'index'
// }
console.log(path.format(pathObj));
// /nodejs/test\index.js
path.sep
: Returns the system-specific path fragment separator
const path = require("path")
console.log(path.sep)
// '\' on Windows, '/' on POSIX
path.win32
: provides access to Windows-specific implementations of the path methods
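- A small sketch: the win32 and posix namespaces apply their platform's rules regardless of where the code runs:
const path = require('path');
console.log(path.win32.basename('C:\\temp\\example.md')); // example.md
console.log(path.posix.basename('/tmp/example.md'));      // example.md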
Util (Common tools)
util.callbackify(original)
: takes a function that returns a promise and turns it into an error-first, callback-style function.
const util = require('util');
async function hello() {
return 'hello world'
}
let helloCb = util.callbackify(hello);
helloCb((err, res) => {
  if (err) throw err;
  console.log(res)
})
util.promisify(original)
: takes an error-first, callback-style function and turns it into a function that returns a promise.
const fs = require("fs");
const util = require("util");
const path = require("path");
const stat = util.promisify(fs.stat);
stat(
  path.resolve(process.cwd(), "example.md")
)
  .then(data => {
    console.log("Obtaining file status succeeded", data);
  })
  .catch(error => {
    console.error("Failed to get file status", error);
  })
// Obtain file status successfully Stats {
// dev: 109512952,
// mode: 33206,
// nlink: 1,
// uid: 0,
// gid: 0,
// rdev: 0,
// blksize: 4096,
// ino: 140737488355889780,
// size: 7,
// blocks: 0,
//   atimeMs: 1618245914082.411,
//   mtimeMs: 1617905296801.911,
//   ctimeMs: 1617905296801.911,
//   birthtimeMs: 1617904969580.9304,
//   atime: 2021-04-12T16:45:14.082Z,
//   mtime: 2021-04-08T18:08:16.802Z,
//   ctime: 2021-04-08T18:08:16.802Z,
//   birthtime: 2021-04-08T18:02:49.581Z
// }
const fs = require("fs");
const util = require("util");
const path = require("path");
const stat = util.promisify(fs.stat);
(async function statFn() {
  try {
    const data = await stat(path.resolve(process.cwd(), "example.md"))
    console.log("View file status succeeded", data)
  }
  catch (error) {
    console.error("Failed to view file status", error)
  }
})()
// Obtain file status successfully Stats {
// dev: 109512952,
// mode: 33206,
// nlink: 1,
// uid: 0,
// gid: 0,
// rdev: 0,
// blksize: 4096,
// ino: 140737488355889780,
// size: 7,
// blocks: 0,
//   atimeMs: 1618245914082.411,
//   mtimeMs: 1617905296801.911,
//   ctimeMs: 1617905296801.911,
//   birthtimeMs: 1617904969580.9304,
//   atime: 2021-04-12T16:45:14.082Z,
//   mtime: 2021-04-08T18:08:16.802Z,
//   ctime: 2021-04-08T18:08:16.802Z,
//   birthtime: 2021-04-08T18:02:49.581Z
// }
util.types.isDate(value)
: Checks whether the data is Date
const util = require("util");
console.log(
  util.types.isDate(new Date())) // true
- In addition, lodash is highly recommended as a consistent, modular, high-performance JavaScript utility library.
npm install lodash --save
Global object
- JavaScript has a special object called the global object.
  - The global object and its properties can be accessed from anywhere in the program.
  - Properties of the global object are referred to as global variables.
- For example:
  - in browsers, window usually acts as the global object
  - in Node.js, global acts as the global object
Global objects and global variables
- The most fundamental role of the global object is to act as the host of global variables.
- According to ECMAScript (originally European Computer Manufacturers Association Script), a variable that meets one of the following conditions is a global variable:
  - it is a property of the global object;
  - it is defined in the outermost scope;
  - it is implicitly defined (assigned without being declared);
- Note:
  - Although a variable defined in the outermost scope becomes a property of the global object,
  - in Node.js you cannot define variables in the outermost scope, because all code belongs to the current module, and the module itself is not the outermost context.
- Usage principle:
  - Avoid introducing global variables, i.e. do not assign to undeclared variables.
  - Abusing global variables pollutes the namespace and increases the risk of code coupling.
__filename
- __filename is the file name of the currently executing script.
- In other words, it is the absolute path of the current module file; it is a variable local to each module.
- Related: path.basename() can extract just the file name from it.
__dirname
- __dirname is the directory of the currently executing script.
- In other words, it is the absolute path of the current module's directory; it is a variable local to each module.
- Compare with path.dirname().
- process.cwd() is the working directory in which the node command was executed (see the sketch below).
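- A quick comparison sketch (output shown for a hypothetical script at /projects/app/src/index.js run as `node src/index.js` from /projects/app):
console.log(__filename);    // /projects/app/src/index.js - absolute path of the current module file
console.log(__dirname);     // /projects/app/src - directory of the current module file
console.log(process.cwd()); // /projects/app - directory the node command was started from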
setTimeout(cb,ms)
clearTimeout
setInterval
clearInterval
console
process
- process is a global variable (that is, a property of the global object).
- It is an object that describes the current Node.js process state and provides a simple interface to the operating system.
- Some events that can be listened for with process.on:
  - exit: emitted when the current process is about to exit.
  - beforeExit: emitted when the event loop is empty and the process has nothing else scheduled.
    - Normally Node exits when nothing else is scheduled, but a beforeExit listener can schedule asynchronous work, in which case Node continues running.
  - uncaughtException: emitted when an exception bubbles all the way back to the event loop.
  - signal events: emitted when the process receives a signal.
    - For the signal list, see the POSIX signal names, e.g. SIGINT and SIGUSR1.
process.on('exit', function (code) {
  // The following timer callback never executes
  setTimeout(function () {
    console.log('This code will not execute')
  }, 0)
  console.log('Exit code:', code)
})
console.log('End of program execution');
- Exit status code:
Status code | Name | Description |
---|---|---|
1 | Uncaught Fatal Exception | There are uncaught exceptions and they are not handled by the domain or uncaughtException handler. |
3 | Internal JavaScript Parse Error | JavaScript source code causes parsing errors when starting the Node process. Very rare and only available during Node development. |
4 | Internal JavaScript Evaluation Failure | JavaScript source starts the Node process, returns a function failure when evaluating. Very rare and only available during Node development. |
5 | Fatal Error | Fatal unrecoverable bug in V8. It is usually printed to stderr with the content: FATAL ERROR |
6 | Non-function Internal Exception Handler | An exception was not caught, and the internal fatal exception handler was somehow set to a non-function and could not be called. |
7 | Internal Exception Handler Run-Time Failure | An exception that is not caught and is thrown by the exception handler itself. For example, if process.on(‘ uncaughtException ‘) or domain.on(‘ error ‘) throws an exception. |
9 | Invalid Argument | It may be that an unknown parameter is given, or the parameter given has no value. |
10 | Internal JavaScript Run-Time Failure | The JavaScript source code throws an error when starting the Node process, which is very rare and only occurs during Node development. |
12 | Invalid Debug Argument | The --debug and/or --debug-brk arguments were set, but an invalid port number was chosen. |
> 128 | Signal Exits | If the Node receives a fatal signal, such as SIGKILL or SIGHUP, then the exit code is 128 plus signal code. As is standard Unix practice, the exit signal code is placed high. |
process.stdout.write
: Standard output terminal
process.stdout.write('hello world!' + "\n");
process.argv
: returns an array of the command-line arguments passed when the process was started.
console.log(process.argv);
node scripts.js
# [
# 'D:\\nodejs\\node.exe',
# 'D:\\my_frontend_files\\systematization\\jscripts.js'
#]
process.execPath
: returns the absolute path of the executable that started the process.
console.log(process.execPath);
node scripts.js
# 'D:\nodejs\node.exe'
process.platform
: returns the platform the process is running on.
console.log(process.platform);
# 'win32'
What this points to in a module
- At the top level of a module, this points to module.exports (i.e. exports).
- Proof:
console.log(this);
module.exports.foo = 5;
console.log(this);
// {}
// { foo: 5 }
// i.e. this is the exports object
Node.js event loop model
background
- Traditional web servers mostly use a multithreading mechanism to handle concurrency, because I/O operations block the thread, and blocking means waiting.
- Node.js instead uses a single-threaded design: each Node.js process has only one main thread executing code.
- So Node.js uses an event loop to delegate blocking I/O operations to threads in a thread pool.
- The main thread itself only keeps scheduling work; it does not perform the actual I/O operations.
Event loop
- The event loop is Node.js's mechanism for handling non-blocking I/O operations.
- As noted, Node.js is single-threaded: every Node.js process has only one main thread executing code.
- The event loop lets Node.js perform I/O by offloading operations to the system kernel whenever possible.
- Because most kernels are multithreaded, they can handle multiple operations in the background.
- When one of these operations completes, the kernel notifies Node.js so the corresponding callback can be added to the poll queue and eventually executed.
How the event loop runs
- When Node.js starts, it initializes the event loop:
- It processes the supplied input script, which may make async API calls, schedule timers, or call process.nextTick().
- Then start processing the event loop:
- Each phase has a FIFO queue to perform the callback.
- While each phase is special, typically, when an event loop enters a particular phase, it will perform any operation specific to that phase and then execute the callback in the phase queue until the queue is exhausted or the maximum number of callbacks has been executed.
- When the queue is exhausted or the callback limit is reached, the event loop moves to the next phase.
- Also, between each event loop run, Node.js checks to see if it is waiting for any asynchronous I/O or timers, if not, it shuts down completely.
(Diagram: the event loop phases in order: timers -> pending callbacks -> idle, prepare -> poll (which receives incoming connections, data, etc.) -> check -> close callbacks, then back to timers.)
Stage preview
Phase | Name | Description |
---|---|---|
timers | timer phase | executes callbacks scheduled by setTimeout and setInterval |
pending callbacks | pending callback phase | executes callbacks for certain system operations, such as TCP errors (usually not a concern) |
idle, prepare | idle, prepare phase | used internally only |
poll | polling phase | retrieves newly completed I/O events and executes I/O-related callbacks (other than timer, setImmediate, and close callbacks); Node may block here when appropriate |
check | check phase | setImmediate() callbacks are executed here |
close callbacks | close callback phase | executes close callbacks, such as socket.on("close", ...) |
Detailed analysis of each stage
Timers phase
- A timer specifies a time threshold after which its callback may run, but that is not the exact time it will run.
- Timer callbacks can be delayed by OS scheduling or by other callbacks that are still running; they simply run as early as possible after the specified time.
- Walking through the code below:
  - When the event loop enters the poll phase, its queue is empty (fs.readFile has not completed yet).
  - After about 95 ms, fs.readFile finishes reading the file and its callback, which takes 10 ms to complete, is added to the poll queue and executed.
  - When that callback completes, there are no more callbacks in the queue.
  - The event loop sees that the timer threshold (100 ms) has been reached and goes back to the timers phase to run the timer callback.
  - So the actual delay of the timer callback is about 105 ms.
const fs = require("fs");
function someAsyncOperation(callback) {
  fs.readFile('/path/to/file', callback);
}
const timeoutScheduled = Date.now();
setTimeout(() => {
  const delay = Date.now() - timeoutScheduled;
  console.log(`${delay}ms have passed since I was scheduled`)
}, 100)
someAsyncOperation(() => {
  const startCallback = Date.now();
  while (Date.now() - startCallback < 10) {}
})
Pending callbacks phase
- This phase performs callbacks for certain system operations, such as TCP errors. There is no need to pay attention.
Poll phase
- The poll phase has two main functions:
  - calculating how long it should block and poll for I/O, and
  - processing events in the poll queue.
- When the event loop enters the poll phase and there are no timers scheduled, one of two things happens:
  - If the poll queue is not empty, the event loop iterates through its queue of callbacks, executing them synchronously until the queue is exhausted or a system-dependent hard limit is reached.
  - If the poll queue is empty, one of two more things happens:
    - If scripts have been scheduled with setImmediate(), the event loop ends the poll phase and continues to the check phase to execute those scheduled scripts.
    - If no scripts have been scheduled with setImmediate(), the event loop waits for callbacks to be added to the queue and then executes them immediately.
- Once the poll queue is empty, the event loop checks which timers have reached their time thresholds. If one or more timers are ready, the event loop wraps back to the timers phase to execute those timer callbacks.
Check phase
- If the poll phase becomes idle and scripts have been queued with setImmediate(), the event loop enters the check phase rather than waiting in the poll phase.
- This phase lets callbacks run immediately after the poll phase has completed.
- setImmediate():
  - It is effectively a special timer that runs in its own dedicated phase of the event loop (the check phase).
  - It uses a libuv API that schedules callbacks to execute after the poll phase has completed.
- Typically, as code executes, the event loop eventually reaches the poll phase, where it waits for incoming connections, requests, and so on.
- But if a callback has been scheduled with setImmediate() and the poll phase becomes idle, the poll phase ends and the check phase begins instead of waiting for poll events.
Close callbacks phase
- If a socket or handle is closed abruptly (e.g. via socket.destroy()), the 'close' event is emitted in this phase.
- Otherwise it is emitted via process.nextTick().
The difference between setImmediate and setTimeout
- setImmediate() and setTimeout() are similar, but they are invoked at different times.
  - setImmediate() is designed to execute a script once the current poll phase completes.
  - setTimeout() schedules a script to run after a minimum threshold in milliseconds has elapsed.
- Call order: the order in which the two execute depends on the context in which they are called (see the sketch after this list).
  - When called from the main module, the order is bound to the performance of the process and is not deterministic (it can be affected by other applications running on the machine).
  - When called from within an I/O callback (i.e. not the main module), setImmediate() always executes first.
- Why is the order nondeterministic when called from the main module?
  - First, the main code calls setTimeout() to set a timer; because the timer's threshold has not yet been reached, nothing is written to the timers queue yet.
  - setImmediate(), however, writes to the check queue as soon as it is executed.
  - When the main code finishes, the event loop starts at its first phase, timers.
  - At that moment the timers queue may or may not already contain the callback:
    - If the timers queue has no callback yet, the check queue runs first and the timers queue callback runs on the next loop iteration;
    - If the timers queue already has the callback, the timers phase runs first and the check phase after it.
  - This dependence on exactly when the timer becomes ready is what causes the uncertainty.
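- A sketch of the ordering difference (the first pair's order may vary from run to run; inside the I/O callback setImmediate always comes first):
const fs = require('fs');
setTimeout(() => console.log('timeout (main)'), 0);
setImmediate(() => console.log('immediate (main)'));
fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout (I/O)'), 0);
  setImmediate(() => console.log('immediate (I/O)')); // always before 'timeout (I/O)'
});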
process.nextTick
- process.nextTick() is technically not part of the event loop. Instead, the nextTick queue is processed after the current operation completes, regardless of the current phase of the event loop.
- Difference between process.nextTick() and setImmediate():
  - process.nextTick(): fires immediately, in the same phase.
  - setImmediate(): fires on the next iteration or "tick" of the event loop.
- Position of nextTick in the event loop:
(Diagram: the same phase cycle as before (timers, pending callbacks, idle/prepare, poll, check, close callbacks), with the nextTickQueue drained between each phase transition.)
Microtasks
- In the Node world, microtasks are callbacks from the following sources:
  - process.nextTick()
  - promise then() callbacks
- Timing: microtask callbacks run immediately after the main thread finishes and after each phase of the event loop.
- Priority of process.nextTick() vs promise.then:
  - If both are queued, the process.nextTick callbacks in the microtask queue are executed first.
  - process.nextTick > promise.then (see the sketch below)
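- A small sketch of that ordering:
Promise.resolve().then(() => console.log('promise.then'));
process.nextTick(() => console.log('process.nextTick'));
console.log('main');
// Output: main, process.nextTick, promise.then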
Self-test (event loop)
async function async1() {
  console.log('async1 start')
  await async2()
  console.log('async1 end')
}
async function async2() {
  console.log('async2')
}
console.log('script start')
setTimeout(function () {
  console.log('setTimeout0')
  setTimeout(function () {
    console.log('setTimeout1');
  }, 0);
  setImmediate(() => console.log('setImmediate'));
}, 0)
process.nextTick(() => console.log('nextTick'));
async1();
new Promise(function (resolve) {
  console.log('promise1')
  resolve();
  console.log('promise2')
}).then(function () {
  console.log('promise3')
})
console.log('script end')