1. The concept of streams

  • A stream is an ordered sequence of byte transfers with a starting point and an ending point. Combined with events and non-blocking I/O, the stream module lets data be processed dynamically as it becomes available and released when it is no longer needed.
  • A stream in Node.js is an abstract interface for working with streaming data. The stream module provides the basic API; using it, you can easily build objects that implement the stream interface. For example, HTTP server request and response objects are streams.
  • Streams can be readable, writable, or both readable and writable. All streams are instances of EventEmitter.

2. Why use streams

If a file is read synchronously with fs.readFileSync, the program blocks and the entire file is loaded into memory. With fs.readFile the program does not block, but the entire file is still loaded into memory at once before the consumer reads it. If the file is large, memory usage becomes an issue. This is where streams have the advantage: instead of loading the whole file into memory at once, data is read into a buffer chunk by chunk and handed to the consumer, saving memory.

  1. When not using streams: you will find that the memory footprint is very large when the file is very large.
  2. When using streams: you will find that the memory footprint stays very small, because the data is read into a buffer and consumed one chunk at a time (see the sketch below).
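
As a rough illustration, here is a minimal sketch of both approaches for serving a large file over HTTP; the file name big.file and the ports are hypothetical:

var fs = require('fs');
var http = require('http');

// 1. Without streams: the whole file is buffered in memory before responding
http.createServer(function (req, res) {
    fs.readFile('./big.file', function (err, data) {  // hypothetical large file
        if (err) { res.statusCode = 500; return res.end(); }
        res.end(data);                                // memory footprint is roughly the file size
    });
}).listen(8000);

// 2. With streams: data is read and sent chunk by chunk
http.createServer(function (req, res) {
    fs.createReadStream('./big.file').pipe(res);      // memory footprint is roughly the read cache size
}).listen(8001);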

3. Four stream types

There are four basic types of streams in Node.js:

  • Readable – readable operation (such as fs.createReadStream())
  • Writable – writable operation (such as fs.createWriteStream())
  • Duplex – readable and writable operation (such as net.Socket)
  • Transform – a Duplex stream that can modify and transform data as it is read and written (such as zlib.createDeflate()); a pipeline combining these types is sketched below
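
The three kinds can be combined in a single pipeline. A minimal sketch, assuming a source file ./1.txt exists (the file names are hypothetical):

var fs = require('fs');
var zlib = require('zlib');

fs.createReadStream('./1.txt')                 // Readable
    .pipe(zlib.createGzip())                   // Transform (a kind of Duplex)
    .pipe(fs.createWriteStream('./1.txt.gz')); // Writable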

4. fs.createReadStream

fs.createReadStream returns an object that implements the stream.Readable interface. It reads the file's data as a stream and starts emitting data once a 'data' event listener is attached.

// Simplified from the Node.js fs module source: createReadStream just constructs
// a ReadStream, and ReadStream inherits from stream.Readable
var util = require('util');
var fs = require('fs');

fs.createReadStream = function(path, options) {
  return new ReadStream(path, options);
};
util.inherits(ReadStream, Readable);

4.1 Creating a readable stream

var rs = fs.createReadStream(path, [options]);
// 1. path: path of the file to read
// 2. options:
//    flags: mode for opening the file, default 'r'
//    encoding: default null (chunks are Buffers)
//    start: index of the position to start reading from
//    end: index of the position to stop reading at (inclusive)
//    highWaterMark: default size of the read cache, 64kb
// > if the encoding is utf8, highWaterMark must be larger than 3 bytes
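
A minimal usage sketch with the options above (the file name 1.txt and the byte range are hypothetical):

var fs = require('fs');

var rs = fs.createReadStream('./1.txt', {
    flags: 'r',              // open the file for reading
    encoding: null,          // emit raw Buffer chunks
    start: 0,                // start reading at byte 0
    end: 99,                 // read up to and including byte 99
    highWaterMark: 64 * 1024 // read cache size, 64kb by default
});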

4.2 Setting the Encoding

// Same effect as passing { encoding: 'utf8' } in the options
rs.setEncoding('utf8');

4.3 Listening for Data Events

// Once a 'data' listener is attached, the stream starts reading the file and emitting data.
// It reads one chunk at a time (up to the size of the read cache) until the file is finished.
// By default the stream keeps reading and emitting 'data' events continuously without stopping.
// To pause and resume emitting data, see 4.8.
rs.on('data', function (data) {
    console.log(data);
});

4.4 Listening for the End Event

rs.on('end', function () {
    console.log('Read complete');
});

4.5 Listening for Error Events

rs.on('error', function (err) {
    console.log('error', err);
});

4.6 Listening to open Events

rs.on('open', function () {
    console.log('File open');
});

4.7 Listening to Close Events

rs.on('close', function () {
    console.log('File closed');
});

4.8 Pausing and resuming triggering data

// Use the pause() and resume() methods
rs.on('data', function (data) {
    console.log(data);
    rs.pause();                 // pause reading and firing 'data' events
    setTimeout(function () {
        rs.resume();            // resume reading and firing 'data' events
    }, 2000);
});

From the events above we can see the following (a small sketch that logs this order follows the list):

  • open fires before data: the file is opened first, and then data is emitted as the content is read.
  • end fires before close: once the file has been read completely, end is emitted, and then the file is closed and close is emitted.
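
A small sketch that logs the order of these events (the file name 1.txt is hypothetical):

var fs = require('fs');
var rs = fs.createReadStream('./1.txt');

rs.on('open', function () { console.log('open'); });
rs.on('data', function (data) { console.log('data', data.length); });
rs.on('end', function () { console.log('end'); });
rs.on('close', function () { console.log('close'); });
// Typical output order: open -> data (one or more times) -> end -> close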

4.9 Two modes of readable stream

  1. Readable streams actually work in one of two modes: flowing and paused.

  2. In flowing mode, readable streams automatically read data from the underlying system and provide it to the application as quickly as possible through events on the EventEmitter interface.

  3. In paused mode, the stream.read() method must be explicitly called to read a piece of data from the stream.

  4. All Readable streams start in paused mode and can be switched to flowing mode in one of three ways:

    1. Listening for the 'data' event
    2. Calling the stream.resume() method
    3. Calling the stream.pipe() method to send the data to a Writable
  5. A readable stream can be switched back to paused mode in one of two ways:

    1. If there is no pipe destination, by calling the stream.pause() method.
    2. If there are pipe destinations, by removing all 'data' event listeners and calling the stream.unpipe() method to remove all pipe destinations.

If a Readable switches to flowing mode and there is no consumer to handle the data, that data will be lost. This can happen, for example, if the readable.resume() method is called without a 'data' event listener attached, or if a 'data' event listener is removed from the stream.
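
For completeness, a minimal sketch of paused mode, using the 'readable' event (not covered above) together with stream.read(); the file name 1.txt is hypothetical:

var fs = require('fs');
var rs = fs.createReadStream('./1.txt');

// Attaching a 'readable' listener does not switch the stream to flowing mode;
// data must be pulled explicitly with read()
rs.on('readable', function () {
    var chunk;
    while ((chunk = rs.read()) !== null) {
        console.log('read', chunk.length, 'bytes');
    }
});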

5. fs.createWriteStream

fs.createWriteStream returns an object that implements the stream.Writable interface; stream data written to this object is written to the file.

// Simplified from the Node.js fs module source: createWriteStream just constructs
// a WriteStream, and WriteStream inherits from stream.Writable
fs.createWriteStream = function(path, options) {
  return new WriteStream(path, options);
};

util.inherits(WriteStream, Writable);

5.1 Creating a Writable Stream

// Data written to a writable stream is not written to the file immediately; it is written
// to a cache first. The cache size is highWaterMark (default 16kb). The data is actually
// written to the file once the cache is full.
var ws = fs.createWriteStream(path, [options]);
// 1. path: file path
// 2. options:
//    flags: mode for opening the file, default 'w'
//    encoding: default utf8
//    start: position to start writing at
//    highWaterMark: default size of the write cache, 16kb
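
A minimal usage sketch with the options above (the file name 2.txt is hypothetical):

var fs = require('fs');

var ws = fs.createWriteStream('./2.txt', {
    flags: 'w',              // open the file for writing
    encoding: 'utf8',
    highWaterMark: 16 * 1024 // write cache size, 16kb by default
});

ws.write('hello ');
ws.write('world');
ws.end();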

5.2 write method

// write() returns a flag. When the cache is full it returns false, meaning you should stop
// writing; data written anyway is not lost, it is cached in memory and written out once the
// cache has been flushed.
let flag = ws.write(chunk, [encoding], [callback]);
// 1. chunk: the data to write, a Buffer or string
// 2. encoding: the encoding, optional, only used when chunk is a string
// 3. return value: false when the cache is full, true when it is not
//    (after false you can still write; once the cache is flushed it goes back to true)

5.3 end method

ws.end(chunk, [encoding], [callback]);
// Signals that no more data will be written to the Writable.
// The optional chunk and encoding arguments allow one final piece of data to be written
// before closing the stream. The optional callback is used as the callback for the
// 'finish' event.

5.4 drain event

  • When a stream is not in drain state, the call to write() caches the data block and returns false. Once all currently cached data blocks are drained (accepted by the operating system for output), the ‘drain’ event is emitted
  • It is recommended that once write() returns false, no more data blocks be written until the 'drain' event is emitted
ws.on('drain', function () {
    console.log('drain');
});
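
A minimal sketch that respects this recommendation: keep writing while write() returns true, and wait for the 'drain' event when it returns false (the file name 2.txt and the amount of data are hypothetical):

var fs = require('fs');
var ws = fs.createWriteStream('./2.txt');
var i = 0;

function writeSome() {
    var ok = true;
    while (i < 1000000 && ok) {
        ok = ws.write(i + '\n'); // false means the write cache is full
        i++;
    }
    if (i < 1000000) {
        ws.once('drain', writeSome); // resume once the cache has been flushed
    } else {
        ws.end();
    }
}

writeSome();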

5.5 finish event

  • After the stream.end() method has been called and all buffered data has been flushed to the underlying system, the 'finish' event is emitted
ws.end('the end');
ws.on('finish', function () {
    console.log('Write complete');
});
