1. Getting to know Buffer
1.1. Binary data
All content in a computer, whether text, numbers, images, audio, or video, is eventually represented in binary.
JavaScript can directly handle very intuitive data such as strings, which is what we usually show to the user.
Doesn't JavaScript handle images too? Not directly:
- On the front end, images are always handed over to the browser;
- JavaScript (or HTML) just tells the browser the address of an image;
- The browser is responsible for fetching the image and eventually rendering it;
But it is different for servers:
- The server has more local file types to handle;
- For example, a text file may be encoded not in utf-8 but in GBK, so we must read its binary data and then convert it to the corresponding text according to GBK;
- For example, we may need to read image data (binary) and then process it in some way (cropping, format conversion, rotation, adding filters). Node has a sharp library that reads an image (or a Buffer containing an image) and then processes it;
- For example, to establish a long-lived TCP connection in Node: TCP transmits byte streams, so we need to convert our data into bytes before sending it, and we need to know the size of the transmitted bytes (the client needs to determine how much content to read based on the size);
We will find that front-end development rarely deals with binary directly, but the server side has to manipulate binary data directly to implement many features;
So Node provides us with a global class, Buffer, to make this easier for developers.
1.2. Buffer and binary
As we mentioned earlier, a Buffer stores binary data, but how exactly does it store it?
- We can think of a Buffer as an array that stores binary data;
- Each entry in this array can hold 8 bits of binary data: 00000000;
Why 8 bits?
- In computers, we rarely manipulate a single bit directly, because one bit stores a very limited amount of data;
- So 8 bits are usually grouped together as a unit, called a byte;
- That is, 1 byte = 8 bits, 1 KB = 1024 bytes, 1 MB = 1024 KB;
- For example, in many programming languages an int is 4 bytes and a long is 8 bytes;
- For example, TCP transmits byte streams, and the number of bytes needs to be specified when writing and reading;
- For example, an RGB channel value is at most 255, so each channel can be stored in a single byte;
In other words, a Buffer is an array of bytes, where each entry is one byte in size:
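To see the byte-array nature concretely, here is a small sketch (not from the original article): we build a Buffer directly from an array of byte values, which happen to be the UTF-8 codes for "why":

```javascript
// Each element is one byte (0-255); 0x77 0x68 0x79 are the
// UTF-8 (and ASCII) codes for 'w', 'h', 'y'.
const bytes = Buffer.from([0x77, 0x68, 0x79]);
console.log(bytes);            // <Buffer 77 68 79>
console.log(bytes.toString()); // why
```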
What happens if we want to put a string into a Buffer?
const buffer01 = new Buffer("why");
console.log(buffer01); // <Buffer 77 68 79>
However, the new Buffer() constructor is deprecated, so we are no longer expected to create buffers this way:
So we can create it with another method:
const buffer2 = Buffer.from("why");
console.log(buffer2); // <Buffer 77 68 79>
What if the string contains Chinese characters?
const buffer3 = Buffer.from("王红元"); // "Wang Hongyuan"
console.log(buffer3);
// <Buffer e7 8e 8b e7 ba a2 e5 85 83>
const str = buffer3.toString();
console.log(str); // 王红元
What if the encoding and the decoding are different?
const buffer3 = Buffer.from("王红元", 'utf16le');
console.log(buffer3);
const str = buffer3.toString('utf8');
console.log(str); // garbled output (mojibake)
2. Other uses of Buffer
2.1. Other ways to create a Buffer
Buffers can be created in a number of ways:
Take a look at Buffer.alloc:
- It creates an 8-byte Buffer in which every byte defaults to 00;
const buffer01 = Buffer.alloc(8);
console.log(buffer01); // <Buffer 00 00 00 00 00 00 00 00>
We can also write to it by index:
buffer01[0] = 'w'.charCodeAt(); // 119 (0x77)
buffer01[1] = 100;              // 0x64
buffer01[2] = 0x66;
console.log(buffer01); // <Buffer 77 64 66 00 00 00 00 00>
Individual bytes can be read in the same way:
console.log(buffer01[0]);              // 119
console.log(buffer01[0].toString(16)); // 77
2.2. Buffer and file reading
Text file reading:
const fs = require('fs');
fs.readFile('./test.txt', (err, data) => {
console.log(data); // <Buffer 48 65 6c 6c 6f 20 57 6f 72 6c 64>
console.log(data.toString()); // Hello World
})
Image file reading:
fs.readFile('./zznh.jpg', (err, data) => {
  console.log(data); // <Buffer ff d8 ff e0 ... 40418 more bytes>
});
Image file reading and conversion:
- read an image and resize it;
- here we can use the sharp library to do it;
const sharp = require('sharp');
const fs = require('fs');

sharp('./test.png')
  .resize(1000, 1000)
  .toBuffer()
  .then(data => {
    fs.writeFileSync('./test_copy.png', data);
  });
3. Memory allocation of Buffer
In fact, when we create a Buffer, Node does not request memory from the operating system every time. By default, it pre-allocates a pool of 8 * 1024 bytes, that is, 8 KB:
- node/lib/buffer.js, line 135
Buffer.poolSize = 8 * 1024;
let poolSize, poolOffset, allocPool;

const encodingsMap = ObjectCreate(null);
for (let i = 0; i < encodings.length; ++i)
  encodingsMap[encodings[i]] = i;

function createPool() {
  poolSize = Buffer.poolSize;
  allocPool = createUnsafeBuffer(poolSize).buffer;
  markAsUntransferable(allocPool);
  poolOffset = 0;
}
createPool();
When we call Buffer.from to request a Buffer:
- here we take creating from a string as an example;
- node/lib/buffer.js, line 290
Buffer.from = function from(value, encodingOrOffset, length) {
  if (typeof value === 'string')
    return fromString(value, encodingOrOffset);
  // if it is an object, there are other cases
  // ...
};
Let's look at the fromString call:
- node/lib/buffer.js, line 428
function fromString(string, encoding) {
  let ops;
  if (typeof encoding !== 'string' || encoding.length === 0) {
    if (string.length === 0)
      return new FastBuffer();
    ops = encodingOps.utf8;
    encoding = undefined;
  } else {
    ops = getEncodingOps(encoding);
    if (ops === undefined)
      throw new ERR_UNKNOWN_ENCODING(encoding);
    if (string.length === 0)
      return new FastBuffer();
  }
  return fromStringFast(string, ops);
}
Next let's look at fromStringFast:
- what it does is check whether there is enough space left in the pool to hold the string;
- if not, it calls createPool to create a new pool;
- if there is enough, it uses the pool directly and then advances poolOffset;
- node/lib/buffer.js, line 428
function fromStringFast(string, ops) {
  const length = ops.byteLength(string);
  if (length >= (Buffer.poolSize >>> 1))
    return createFromString(string, ops.encodingVal);
  if (length > (poolSize - poolOffset))
    createPool();
  let b = new FastBuffer(allocPool, poolOffset, length);
  const actual = ops.write(b, string, 0, length);
  if (actual !== length) {
    // byteLength() may overestimate. That's a rare case, though.
    b = new FastBuffer(allocPool, poolOffset, actual);
  }
  poolOffset += actual;
  alignPool();
  return b;
}
4. Stream
4.1. Understanding Stream
What is a stream?
- Our first reaction is probably water, flowing continuously;
- A stream in a program has a similar meaning: when we read data from a file, the binary data (bytes) of the file is read continuously into our program;
- and this sequence of bytes is the stream in our program;
So we can understand a stream like this:
- it is a representation of, and an abstraction over, a continuous sequence of bytes;
- streams should be readable and writable;
When we learned file reading and writing earlier, we could read and write files directly through readFile or writeFile, so why do we still need streams?
- Reading and writing files directly, while simple, gives us no control over the details of the operation;
- for example, where to start reading from, where to stop, and how many bytes to read at a time;
- or pausing reading at some point and resuming it at another, and so on;
- or the file may be too large, like a video file, to read all at once;
In fact, many objects in Node are implemented based on streams:
- the Request and Response objects of the http module;
- the process.stdout object;
All streams are instances of EventEmitter:
We can see this in the Node source code:
The classification of streams:
- Writable: a stream to which data can be written (for example, fs.createWriteStream());
- Readable: a stream from which data can be read (for example, fs.createReadStream());
- Duplex: a stream that is both Readable and Writable (for example, net.Socket);
- Transform: a Duplex stream whose data can be modified or transformed as it is written and read (for example, zlib.createDeflate());
Here we will explain Writable and Readable through fs operations; you can learn the other two on your own.
4.2. Readable
Previously, we read information from a file like this:
fs.readFile('./foo.txt', (err, data) => {
  console.log(data);
});
This method reads all the contents of the file into the program (memory) at once, but it has the problems we mentioned earlier:
- no control over file size, read position, end position, or the size of each read;
In this case, we can use createReadStream. Let's look at a few of its parameters:
- start: the position in the file to start reading from;
- end: the position to stop reading at;
- highWaterMark: the number of bytes read at a time, 64 KB by default;
const read = fs.createReadStream("./foo.txt", {
  start: 3,
  end: 8,
  highWaterMark: 4
});
How do we get the data?
- We can obtain the data by listening for the data event;
read.on("data", (data) => {
  console.log(data);
});
We can also listen for other events:
read.on('open', (fd) => {
  console.log("File opened");
});
read.on('end', () => {
  console.log("File reading finished");
});
read.on('close', () => {
  console.log("File closed");
});
We can even pause reading at a certain point and resume it later:
read.on("data", (data) => {
  console.log(data);
  read.pause();
  setTimeout(() => {
    read.resume();
  }, 2000);
});
4.3. Writable
Previously we wrote a file like this:
fs.writeFile('./foo.txt', "Content", (err) => {});
This is equivalent to writing everything into the file at once, and this approach has similar problems:
- for example, we may want to write content bit by bit, or control exactly where each write goes;
In this case, we can use createWriteStream. Let's look at a few of its parameters:
- flags: the default is w; if we want to append, we can use a or a+;
- start: the position to start writing at;
Let's do a simple write:
const writer = fs.createWriteStream("./foo.txt", {
  flags: "a+",
  start: 8
});
writer.write("Hello?", err => {
  console.log("Write succeeded");
});
If we want to listen for some events:
writer.on("open", () => {
  console.log("File opened");
});
writer.on("finish", () => {
  console.log("File write completed");
});
writer.on("close", () => {
  console.log("File closed");
});
We find that the close event is never fired:
- this is because a write stream is not closed automatically after it is opened;
- we must close it manually to tell Node that the write is finished;
- when it closes, it also emits a finish event;
writer.close();
writer.on("finish", () => {
  console.log("File write completed");
});
writer.on("close", () => {
  console.log("File closed");
});
Another very common method is end:
- the end method is equivalent to doing two things: writing the incoming data and then calling the close method;
writer.end("Hello World");
4.4. Pipe method
Normally, we can read from an input stream and manually write the data to an output stream:
const fs = require('fs');
const reader = fs.createReadStream('./foo.txt');
const writer = fs.createWriteStream('./bar.txt');
reader.on("data", (data) => {
  console.log(data);
  writer.write(data, (err) => {
    console.log(err);
  });
});
We can also do this with pipe:
reader.pipe(writer);
writer.on('close', () => {
  console.log("Output stream closed");
});
Note: all of this content is published on the public account CoderWhy, where tutorials on Flutter, TypeScript, React, Node, uni-app, mpvue, data structures and algorithms are updated, along with some of my own learning experiences.