1. process
process is a global variable, so it can be used directly without a require call.
It serves two purposes. The first is obtaining information about the process, such as the environment it is running in. The second is acting on the current process, for example listening for built-in events during execution or creating child processes to do more work.
1. Obtain memory-related information
// Check memory consumption
console.log(process.memoryUsage());
/**
 * rss: resident set size - total memory held by the process
 * heapTotal: total size of the V8 heap
 * heapUsed: portion of the heap currently in use
 * external: memory used by C/C++ objects bound to JS (core modules)
 * arrayBuffers: memory allocated for ArrayBuffers and Buffers
 */
2. Obtain CPU information
console.log(process.cpuUsage());
/**
 * user: CPU time spent in user code (microseconds)
 * system: CPU time spent in the kernel (microseconds)
 */
3. At runtime you can inspect the working directory, Node version, CPU architecture, user environment, and system platform through process.
process.cwd(); // Current working directory
process.version; // Node version
process.versions; // Versions of Node and its dependencies
process.arch; // CPU architecture
process.env.NODE_ENV; // Must be set before it can be read
process.env.PATH; // Environment variables
process.env.USERPROFILE; // User home directory; varies by platform (process.env.HOME on POSIX)
process.platform; // Platform, e.g. win32, darwin
4. At runtime you can also obtain the startup arguments, PID, and running time.
process.argv; // By default two values: the Node executable path and the path of the executed script
process.argv0; // Only the first value; there is no API beyond argv0
process.pid; // PID of the current process
process.ppid; // PID of the parent process
process.uptime(); // Seconds the script has been running
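As a quick sketch of consuming these values, the snippet below pulls the real arguments out of process.argv; the `--key=value` convention here is hypothetical, chosen just for illustration.

```javascript
// process.argv[0] is the Node executable and process.argv[1] is the
// script path, so the real arguments start at index 2.
const args = process.argv.slice(2);

// Hypothetical example: turn "--key=value" pairs into an object.
const options = {};
for (const arg of args) {
  if (arg.startsWith('--') && arg.includes('=')) {
    const [key, value] = arg.slice(2).split('=');
    options[key] = value;
  }
}
console.log(options);
```

Running `node script.js --name=yindong` would print `{ name: 'yindong' }`.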
Next, the events process provides. The point here is not to catalogue everything that happens inside process, but to get familiar with event-driven programming and the publish-subscribe pattern in Node.
process implements the EventEmitter interface, so you can use on to listen for events. There are many built-in events; exit, for example, fires when the program exits. Note that listeners bound to exit can only run synchronous code; asynchronous code will not complete.
process.on('exit', (code) => { // On exit
  console.log(code); // 0
})
process.on('beforeExit', (code) => { // Before exit
  console.log(code); // 0
})
Exiting manually: this does not trigger beforeExit, and code after exit() never runs, because the process has already exited.
process.exit();
Standard output, input, error
process.stdout; // A stream that can be read from or written to
process.stdout.write('123'); // 123
const fs = require('fs');
fs.createReadStream('text.txt').pipe(process.stdout); // Read a file and print it
process.stdin; // Get the console input
process.stdin.pipe(process.stdout); // Output after input
// Set the character encoding
process.stdin.setEncoding('utf-8');
// Listen for the readable event, i.e. there is content to read
process.stdin.on('readable', () => {
  // Get the input
  let chunk = process.stdin.read();
  if (chunk !== null) {
    process.stdout.write(chunk);
  }
})
2. path
path is a built-in Node module that can be imported directly with require. Its main job is handling file paths and directories; you just call the appropriate method. Think of path as a toolbox whose tools are its methods.
const path = require('path');
1. basename()
Gets the base name in the path
path.basename(__filename); // test.js
// If the second argument matches the extension it is omitted; otherwise the full name is returned
path.basename(__filename, '.js'); // test
path.basename('/a/b/c'); // c
path.basename('/a/b/c/'); // c
2. dirname()
Gets the name of the directory in the path
path.dirname(__filename); // d:\Desktop\test
path.dirname('/a/b/c'); // /a/b
path.dirname('/a/b/c/'); // /a/b
3. extname()
Gets the extension name in the path
path.extname(__filename); // .js
path.extname('/a/b'); //
path.extname('/a/b/index.html.js.css'); // .css
path.extname('/a/b/index.html.js.'); // .
4. isAbsolute()
Checks whether the path is an absolute path
path.isAbsolute('a'); // false
path.isAbsolute('/a'); // true
5. join()
Joins multiple path fragments into a full usable path
path.join('a/b', 'c', 'index.html'); // a/b/c/index.html
path.join('/a/b', 'c', 'index.html'); // /a/b/c/index.html
path.join('a/b', 'c', '../', 'index.html'); // a/b/index.html
6. resolve()
Returns an absolute path
path.resolve(); // With no arguments, returns the current working directory
7. parse()
Parsing path
const obj = path.parse('/a/b/c/index.html');
/**
* root: /
* dir: /a/b/c
* base: index.html
* ext: .html
* name: index
*/
8. format()
Serializes a path; the opposite of parse: concatenates an object into a full path string.
path.format({
  root: '/',
  dir: '/a/b/c',
  base: 'index.html',
  ext: '.html',
  name: 'index'
});
// /a/b/c/index.html
9. normalize()
Normalizes a path, turning an irregular path into a usable one. Escape characters in the string are also handled.
path.normalize('a/b/c/d'); // a/b/c/d
path.normalize('a//b/c../d'); // a/b/c../d
path.normalize('a\\/b/c\\/d'); // unchanged on POSIX; backslash is a separator only on Windows
3. Buffer
Buffer is generally called a buffer; you can think of it as the way to manipulate binary data from JS, since IO operations work on binary data. A Buffer in Node is a chunk of memory that lives outside the V8 heap: its size does not count against V8's memory, and Node requests the space at the C++ level, yet V8's GC still handles reclaiming it.
Buffer is a global variable in Node and can be used directly without require. It is typically paired with a stream, acting as a data buffer.
Buffer.alloc creates a Buffer of the specified size in bytes, zero-filled by default.
Buffer.allocUnsafe creates a Buffer of the specified size from whatever memory fragments are available; it is unsafe because the space is not zeroed and may contain leftover garbage or dirty data.
Buffer.from receives data and creates a Buffer from it.
Prior to v6 it was possible to create Buffer objects by instantiating them with new, but this granted too much unchecked power, so direct instantiation was restricted in favor of the factory methods.
// Create a Buffer
const b1 = Buffer.alloc(10);
const b2 = Buffer.allocUnsafe(10);
Buffer.from accepts three types of data: a string, an array, or a Buffer. The second parameter specifies the encoding.
const b3 = Buffer.from('1');
const b4 = Buffer.from([1, 2, 3]);
Some common instance methods of Buffer.
fill: fills the Buffer with the data, written repeatedly up to the last byte
write: writes as much of the data as fits into the Buffer
toString: extracts data from the Buffer as a string
slice: returns a section of the Buffer
indexOf: finds data within the Buffer
copy: copies data from the Buffer into another Buffer
Static methods of Buffer.
concat: concatenates multiple Buffers into a new Buffer
isBuffer: checks whether a value is a Buffer
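A short sketch exercising the methods listed above; the values in the comments follow from the calls.

```javascript
const buf = Buffer.alloc(6);

buf.fill('ab');                 // repeats the pattern to the end
console.log(buf.toString());    // ababab

buf.write('hello');             // writes as much as fits (5 bytes here)
console.log(buf.toString());    // hellob

console.log(buf.slice(0, 5).toString()); // hello
console.log(buf.indexOf('llo'));         // 2

const copy = Buffer.alloc(5);
buf.copy(copy, 0, 0, 5);        // copy bytes 0-4 into the new Buffer
console.log(copy.toString());   // hello

const joined = Buffer.concat([Buffer.from('a'), Buffer.from('b')]);
console.log(joined.toString());       // ab
console.log(Buffer.isBuffer(joined)); // true
```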
Implementing a split method on Buffer:
Buffer.prototype.split = function (sep) {
  let len = Buffer.from(sep).length;
  let ret = [];
  let start = 0;
  let offset = 0;
  while ((offset = this.indexOf(sep, start)) !== -1) {
    ret.push(this.slice(start, offset));
    start = offset + len;
  }
  ret.push(this.slice(start));
  return ret;
}
4. fs
Buffers and streams are everywhere in Node, and are used to manipulate binary data.
fs is a built-in core module; all file-related operations go through it, such as creating and deleting files and directories, querying information, and reading and writing.
To manipulate the binary data in the file system you use the APIs the fs module provides, and Buffer and Stream are inseparable from that process.
Before introducing the fs module, we need some file-system basics: permission bits, flags, file descriptors, and so on.
File permissions are divided into r, w, and x: r is read, w is write, x is execute. In octal, r is 4, w is 2, x is 1, and a missing permission is 0.
The operating system divides users into three classes: the file's owner, usually the current user; the group the file belongs to, which you can think of as the owner's family; and finally other users, the visitors.
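A mode such as 0o764 is just three octal digits, one per user class; a minimal sketch of unpacking them:

```javascript
// 0o764: owner rwx = 7 (4+2+1), group rw- = 6 (4+2), others r-- = 4
const mode = 0o764;

const owner  = (mode >> 6) & 0o7; // 7
const group  = (mode >> 3) & 0o7; // 6
const others = mode & 0o7;        // 4

console.log(owner, group, others); // 7 6 4
```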
A flag in Node describes the mode a file is opened in, such as whether it can be read or written:
r: read
w: write
s: synchronous
+: adds the opposite operation (e.g. r+ means read and write)
x: exclusive - the operation fails if the path already exists
a: append
fd is the identifier the operating system assigns to an open file, used to identify and track that particular file. Operating systems differ here, and Node smooths the differences over for us.
The file descriptor increments each time Node opens a file, starting at 3, because 0, 1 and 2 are already taken by standard input, output, and error. We get this fd when opening a file with fs.open, shown later.
Every fs file-manipulation API has both a synchronous and an asynchronous form; only the asynchronous APIs are demonstrated here, since the synchronous ones are essentially the same.
1. Read and write files
readFile: reads data from a specified file
const fs = require('fs');
const path = require('path');
fs.readFile(path.resolve('aaa.txt'), 'utf-8', (err, data) => {
console.log(err);
console.log(data);
})
writeFile: writes data to a specified file
fs.writeFile('bbb.txt', 'hello', {
  mode: 438, // Permission bits, i.e. 0o666
  flag: 'w+',
  encoding: 'utf-8'
}, (err) => {
  console.log(err);
})
appendFile: appends data to the specified file
fs.appendFile('bbb.txt', 'hello', {}, (err) => {
console.log(err);
})
copyFile: copies data from one file to another
fs.copyFile('aaa.txt', 'bbb.txt', (err) => {
console.log(err);
})
watchFile: watches a specified file
fs.watchFile('bbb.txt', {
  interval: 20 // Poll every 20ms
}, (curr, prev) => {
  console.log(curr); // Current stats
  console.log(prev); // Previous stats
  if (curr.mtime !== prev.mtime) {
    // The file has been modified
  }
})
fs.unwatchFile('bbb.txt'); // Cancel monitoring
2. Open and close the file
We have already read and written files with fs, so why does Node provide separate open and close APIs?
Because readFile and writeFile load or write the entire file content in one go, which is clearly unreasonable for large files. We need a way to read and write piece by piece, which means opening, reading, writing, and closing must each be an independent step; hence open and close.
const fs = require('fs');
const path = require('path');
// open
fs.open(path.resolve('aaa.txt'), 'r', (err, fd) => {
  console.log(err);
  console.log(fd);
  fs.close(fd, (err) => {});
})
3. Directory operations
access: checks whether you have permission to operate on a file or directory
fs.access('aaa.txt', (err) => {
  console.log(err); // Set when permission is lacking
})
stat: obtains information about a directory or file
fs.stat('aaa.txt', (err, stat) => {
console.log(stat); // size isFile(), isDirectory()
})
mkdir: creates a directory
fs.mkdir('a/b/c', {
  recursive: true, // Create recursively
}, (err) => {
console.log(err);
})
rmdir: deletes a directory
fs.rmdir('a', {
  recursive: true, // Delete recursively
}, (err) => {
console.log(err);
})
readdir: reads the contents of a directory, without recursing into subdirectories
fs.readdir('a', (err, files) => {
console.log(files);
})
unlink: deletes a specified file
fs.unlink('a', (err) => {
console.log(err);
})
6. commonjs
CommonJS was created to solve the problem of JavaScript modularity, and its authors hoped browsers would adopt it too. But because CommonJS loads modules synchronously, blocking the single browser thread, it is not suitable for the browser platform. CommonJS is a language-level specification; modularity is only one part of it.
7. Events
Node implements unified event management through the EventEmitter class. You rarely instantiate it directly in application code; it is mostly used by built-in modules such as fs and http.
Node has an event-driven, asynchronous architecture. The built-in events module provides the EventEmitter class, whose instances can register events, publish events, and remove events.
on: adds a listener called when the event is emitted
emit: emits an event, invoking each listener in registration order
once: registers a listener that runs only once
off: removes a specific listener
const EventEmitter = require('events');
const ev = new EventEmitter();
ev.on('event', () => {
})
ev.emit('event');
7. stream
Streams are not a Node invention. In Linux you can run `ls | grep *.js`, handing the output of ls to grep; that pipe is a stream operation.
Using streams improves efficiency in both space and time. NodeJS was born to improve IO performance, and the most commonly used file systems and networks are streaming applications.
A stream in Node is an abstract interface for processing streaming data, and the stream module provides the objects for manipulating it. The more you use streams, the deeper your understanding will become.
Processing a stream in segments lets you operate on multiple chunks of data without holding large amounts of memory at once, and combined with pipe, composing streams becomes simple.
The stream module built into Node implements the stream-manipulation objects. It provides four concrete abstractions, and all streams inherit from EventEmitter.
Readable: a readable stream, used to obtain data.
Writable: a writable stream, used to write data.
Duplex: a duplex stream, both readable and writable.
Transform: a transform stream that can read, write, and transform data.
const fs = require('fs');
const rs = fs.createReadStream('./a.txt');
const ws = fs.createWriteStream('./b.txt');
rs.pipe(ws);
1. Readable stream
Produces data for consumption.
const rs = fs.createReadStream('./a.txt');
const { Readable } = require('stream');
const source = ['a', 'b', 'c'];

class MyReadable extends Readable {
  constructor(source) {
    super();
    this.source = source;
  }
  _read() {
    const data = this.source.shift() || null;
    this.push(data);
  }
}

const myReadable = new MyReadable(source);
myReadable.on('data', (chunk) => {
  console.log(chunk.toString());
})
2. Writable stream
A stream that consumes data, such as an HTTP response.
const ws = fs.createWriteStream('./b.txt');
const { Writable } = require('stream');

class MyWritable extends Writable {
  _write(chunk, encoding, done) {
    process.stdout.write(chunk.toString());
    process.nextTick(done);
  }
}

const myWritable = new MyWritable();
myWritable.write('yindong', 'utf-8', () => {
  console.log('end');
})
3. Duplex
const { Duplex } = require('stream');

class MyDuplex extends Duplex {
  constructor(source) {
    super();
    this.source = source;
  }
  _read() {
    const data = this.source.shift() || null;
    this.push(data);
  }
  _write(chunk, encoding, next) {
    process.stdout.write(chunk);
    process.nextTick(next);
  }
}

const source = ['a', 'b', 'c'];
const myDuplex = new MyDuplex(source);
myDuplex.on('data', (chunk) => {
  console.log(chunk.toString());
})
myDuplex.write('yindong', () => {
  console.log('end');
})
4. Transform
const { Transform } = require('stream');

class MyTransform extends Transform {
  _transform(chunk, encoding, cb) {
    this.push(chunk.toString().toUpperCase());
    cb(null);
  }
}

const t = new MyTransform();
t.write('a');
t.on('data', (chunk) => {
  console.log(chunk.toString()); // A
})
8. Linked list
A linked list is a data storage structure.
Newer versions of Node use a linked list to store the data queued in the buffer while a file writable stream's write method is working.
Compared with arrays, linked lists have clear advantages in some situations: in many languages an array's length has an upper limit, inserting or deleting forces the other elements to shift, and in JS arrays are implemented as objects, so they can be less efficient.
Of course, this is all relative; in practice arrays are very capable.
A linked list is a collection of nodes. Each node holds a reference that points to the next node, and chaining those references together forms the list. There are several kinds of linked list: singly linked, doubly linked, and circular. The doubly linked list is the most commonly used.
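A minimal doubly linked list sketch in JS, with O(1) push at the tail and shift at the head; no element ever has to move:

```javascript
// Each node keeps references to both neighbours.
class Node {
  constructor(value) {
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

class DoublyLinkedList {
  constructor() {
    this.head = null;
    this.tail = null;
    this.length = 0;
  }

  // Append at the tail in O(1)
  push(value) {
    const node = new Node(value);
    if (this.tail) {
      this.tail.next = node;
      node.prev = this.tail;
      this.tail = node;
    } else {
      this.head = this.tail = node;
    }
    this.length++;
  }

  // Remove from the head in O(1)
  shift() {
    if (!this.head) return null;
    const node = this.head;
    this.head = node.next;
    if (this.head) this.head.prev = null;
    else this.tail = null;
    this.length--;
    return node.value;
  }
}

const list = new DoublyLinkedList();
list.push('a');
list.push('b');
console.log(list.shift()); // a
console.log(list.length);  // 1
```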
9. Assertion
assert checks whether a condition holds; if it does not, it stops the program and throws an error.
const assert = require('assert');
// assert(condition, message)
function sum(a, b) {
  assert(arguments.length === 2, 'Must have two arguments');
  assert(typeof a === 'number', 'The first argument must be a number');
}
10. C++ Addons
Add-ons written in C/C++ and used from Node.
11. Cluster
Multithreading
A program runs as a process, and a process can contain multiple threads. Processes are strictly isolated, each with its own address space, while all threads within one process share a single space and code.
- Multiple processes
Higher overhead, slower, safer because of isolation, inter-process communication is awkward, but the code is easy to write.
- Multithreading
Lower overhead, faster, less safe, communication between threads is easy, but the code is more complex.
Related modules: child_process, cluster, process.
12. Command Line Options
Gets command line arguments
13. Crypto
Hashing and signing, for implementing cryptographic algorithms
const crypto = require('crypto');
let obj = crypto.createHash('md5');
obj.update('123');
console.log(obj.digest('hex')); // 32-character MD5 hex string
14. OS
System related
const os = require('os');
// Get the number of cpus
os.cpus();
15. Events
The event queue
const Event = require('events').EventEmitter;
let ev = new Event();
// Bind events
ev.on('msg', function(a, b, c) {
console.log(a, b, c); // 12, 3, 8
})
// Send events
ev.emit('msg', 12, 3, 8);
Implementing Events yourself:
class EventEmitter {
  constructor() {
    this._events = {};
  }
  on(eventName, callBack) {
    if (this._events[eventName]) {
      this._events[eventName].push(callBack);
    } else {
      this._events[eventName] = [callBack];
    }
  }
  emit(eventName, ...args) {
    (this._events[eventName] || []).forEach(fn => {
      fn(...args);
    });
  }
  off(eventName, callBack) {
    this._events[eventName] = (this._events[eventName] || []).filter(fn => fn !== callBack);
  }
}
16. url
Module for handling request URLs
const url = require('url');
url.parse('url', true); // Parses the url and its parameters, covering querystring functionality
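url.parse is the legacy API; the WHATWG URL class (a global since Node 10) is the modern alternative:

```javascript
// Parse a URL into structured parts with the WHATWG URL class.
const u = new URL('https://example.com/a/b?id=1&tag=node');

console.log(u.hostname);               // example.com
console.log(u.pathname);               // /a/b
console.log(u.searchParams.get('id')); // 1
```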
17. Net
net: stable TCP transport
dgram: UDP datagrams
dns: domain name resolution
18. DNS
Resolves a domain name to an IP address
const dns = require('dns');
dns.resolve('baidu.com', (err, res) => {})
19. http
An HTTP service built on net
const http = require('http');
const server = http.createServer((request, response) => {
  // request holds the request data
  // response holds the response data
  // Set the response content type
  response.writeHead(200, {'Content-type': 'text/html'});
  response.write('server start');
  response.end(); // End the response
});
server.listen(3000); // The port number that the service listens to