What is a stream

  1. A stream is an abstract interface in Node.js for working with streaming data. The stream module is used to build objects that implement the stream interface. Node.js provides many stream objects; for example, HTTP server requests and process.stdout are both instances of streams.
  2. Benefits of streams: non-blocking data processing improves efficiency; processing data chunk by chunk saves memory; piping makes streams easy to compose and extend.
    // Use the stream module
    const stream = require('stream');

There are four basic stream types in Node.js:

  1. Writable – a stream that data can be written to (for example, fs.createWriteStream()).
  2. Readable – a stream that data can be read from (for example, fs.createReadStream()).
  3. Duplex – a stream that is both readable and writable (for example, net.Socket).
  4. Transform – a Duplex stream that can modify or transform the data as it is written and read.
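
To make the list above concrete, here is a minimal sketch that shows where the four base classes come from and a built-in instance of each type (bigfile.txt is the file used in the later examples; the other file name is only illustrative):

// The four base classes are exported by the built-in stream module.
const { Writable, Readable, Duplex, Transform } = require('stream');
const fs = require('fs');
const net = require('net');
const zlib = require('zlib');

// Built-in instances of each of the four types:
const readable = fs.createReadStream('./bigfile.txt');  // Readable
const writable = fs.createWriteStream('./copy.txt');    // Writable
const socket = new net.Socket();                        // Duplex
const gzip = zlib.createGzip();                         // Transform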

‘drain’ event in Writable Stream

  1. If stream1 is piped to stream2 and stream1 produces data faster than stream2 can write it, stream2.write(chunk) returns false. The ‘drain’ event is emitted once stream2 is ready to accept data again; in other words, stream2 has finished flushing its buffered data, has "dried up", and can take more. Therefore, when writable.write(chunk) returns false, you should stop writing and wait for the writable's ‘drain’ event before writing more data, as the example below shows.
    // studyDrainEvent.js
    const fs = require('fs');

    function writeOneMillionTimes(writer, data) {
        let i = 1000000;
        write();
        function write() {
            let ok = true;
            do {
                i--;
                if (i === 0) {
                    // Write the last time.
                    writer.write(data);
                } else {
                    // Check whether we can continue writing.
                    // Do not pass in the callback because the write is not finished.
                    ok = writer.write(data);
                    if (ok === false) {
                        console.log("Can't write any more.");
                    }
                }
            } while (i > 0 && ok);
            if (i > 0) {
                // Stopped prematurely.
                // Continue writing once the 'drain' event is emitted.
                writer.once('drain', () => {
                    console.log('Drained.');
                    write();
                });
            }
        }
    }

    const writer = fs.createWriteStream('./bigfile.txt');
    writeOneMillionTimes(writer, 'hello world\n');


‘finish’ event in Writable Stream

The ‘finish’ event is emitted after stream.end() has been called and all buffered data has been flushed to the underlying system.
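
A minimal sketch of listening for ‘finish’ (the file name finish-demo.txt is only illustrative):

const fs = require('fs');

const writer = fs.createWriteStream('./finish-demo.txt');

writer.on('finish', () => {
    // Fires only after end() has been called and all data is flushed.
    console.log('All writes are complete.');
});

writer.write('hello ');
writer.write('world\n');
writer.end(); // Signal that nothing more will be written.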

The two modes of a Readable Stream

A Readable stream starts out in paused mode. Adding a 'data' event listener switches it to flowing mode, and removing the 'data' event listener switches it back to paused mode. Calling pause() puts it in paused mode, and calling resume() puts it back in flowing mode.
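
A minimal sketch of switching between the two modes (it reuses bigfile.txt from the ‘drain’ example; the one-second delay is arbitrary):

const fs = require('fs');

const readable = fs.createReadStream('./bigfile.txt');

// Adding a 'data' listener switches the stream into flowing mode.
readable.on('data', (chunk) => {
    console.log(`received ${chunk.length} bytes`);
});

// pause() switches it back to paused mode; 'data' events stop firing.
readable.pause();

// resume() switches it into flowing mode again.
setTimeout(() => readable.resume(), 1000);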

Custom stream

// Create a Writable stream
const { Writable } = require('stream');

const outStream = new Writable({
    write(chunk, encoding, callback) {
        console.log(chunk.toString());
        callback();
    }
});

// Consume user input
// process.stdin.pipe(outStream);
process.stdin.on('data', (chunk) => {
    outStream.write(chunk);
});

2. Create a Readable Stream

const { Readable } = require('stream');

const inStream = new Readable({
    read() {
        // Push one character (A–Z) per call, then end the stream.
        this.push(String.fromCharCode(this.currentCharCode++));
        if (this.currentCharCode > 90) {
            this.push('\n');
            this.push(null);
        }
    }
});
inStream.currentCharCode = 65;

// inStream.pipe(process.stdout);
inStream.on('data', (chunk) => {
    process.stdout.write(chunk);
});

Buffering

Both Writable and Readable streams store data in an internal buffer, which can be inspected with writable.writableBuffer and readable.readableBuffer respectively. For a Readable stream, data is buffered whenever stream.push(chunk) is called; if the consumer does not call stream.read(), the data sits in the internal queue until it is consumed. Once the total size of the internal read buffer reaches the threshold specified by highWaterMark, the stream temporarily stops reading data from the underlying resource (that is, it stops calling the internal readable._read() method used to fill the buffer) until the currently buffered data is consumed. The highWaterMark option therefore describes the buffering capacity of a stream.

Because Duplex and Transform streams are both readable and writable, each maintains two separate internal buffers, one for reading and one for writing, so that the two sides can operate independently while data still flows through the stream. For example, a net.Socket instance is a Duplex stream: its readable side consumes data received from the socket, and its writable side writes data to the socket. Since data may be written to the socket faster or slower than it is received, the two sides must operate (and buffer) independently of each other.
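
A minimal sketch of highWaterMark in action: the tiny 16-byte buffer and the 100 ms delay are arbitrary values chosen so that a single 64-byte write overflows the buffer and write() returns false.

const { Writable } = require('stream');

// A slow writable stream with a deliberately small internal buffer.
const slowWriter = new Writable({
    highWaterMark: 16,
    write(chunk, encoding, callback) {
        // Simulate a slow destination by delaying the callback.
        setTimeout(callback, 100);
    }
});

// 64 bytes exceeds the 16-byte highWaterMark, so write() returns false.
console.log(slowWriter.write(Buffer.alloc(64))); // false

slowWriter.on('drain', () => {
    console.log('Internal buffer drained, safe to write again.');
});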

Transform stream example

const fs = require('fs');
const zlib = require('zlib');
const file = './bigfile.txt';

fs.createReadStream(file)
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream(file + '.gz'));
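
The gzip stream above is a built-in Transform. A hand-written Transform is also straightforward; below is a minimal sketch that upper-cases whatever is typed on stdin (the variable name upperCase is illustrative):

const { Transform } = require('stream');

// A Transform stream that upper-cases each chunk passing through it.
const upperCase = new Transform({
    transform(chunk, encoding, callback) {
        // Pass the modified chunk on to the readable side.
        callback(null, chunk.toString().toUpperCase());
    }
});

process.stdin.pipe(upperCase).pipe(process.stdout);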