Overview
Reading and writing files is a common requirement in Node.js. This article walks through a simple application of reading, synchronous writing, and asynchronous writing.
Read all files under a folder
```js
const fs = require("fs");

/**
 * Read all files under a folder recursively
 * input    folder path; used by the recursion, equal to inputUrl on the first call
 * list     array that collects the file paths
 * inputUrl root folder path
 */
function readAllFilesSync(input, list, inputUrl) {
  // Synchronously get all entries under the folder
  const files = fs.readdirSync(input);
  // Traverse the entries
  for (let i = 0; i < files.length; i++) {
    const file = files[i];
    // Get the entry's stats
    const stats = fs.statSync(input + file);
    // Check whether the entry is a file
    if (stats.isFile()) {
      // If it is a file, store its path relative to the root folder
      list.push(input.replace(inputUrl, "") + file);
    } else {
      // If it is a folder, recurse into it
      readAllFilesSync(input + file + "/", list, inputUrl);
    }
  }
}
```
fs.readdir() is the asynchronous counterpart of fs.readdirSync() if you prefer callback-based traversal.
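A minimal usage sketch of the reader, assuming a hypothetical ./input/ folder; note that the folder paths need a trailing slash, because the function builds paths by plain string concatenation:

```js
const list = [];
// Collect every file path under ./input/, relative to ./input/
readAllFilesSync("./input/", list, "./input/");
console.log(list); // e.g. [ 'a.txt', 'sub/b.txt' ]
```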
Write files
If you try to write too many files at once, you can hit a "too many open files" error. Most of the solutions I found boil down to raising the system's limit on simultaneously open files, so I switched to synchronous reading and writing instead. As for efficiency, it is an awkward question: with a small number of files you cannot see any difference, and with a large number the asynchronous approach runs into the error above anyway. Throughput can of course be improved with multiple processes, which I cover in another blog post.
Asynchronous
```js
/**
 * Write files asynchronously
 * input  folder the files were read from
 * output folder to write into
 * list   array of file paths relative to input
 */
function writeFiles(input, output, list) {
  for (const file of list) {
    // Get the file's directory relative to input
    let fileDir = file.split("/");
    fileDir.splice(fileDir.length - 1, 1);
    fileDir = fileDir.join("/");
    const url = output + fileDir;
    // Create the target folder (and any missing parents)
    fs.mkdir(url, { recursive: true }, () => {
      // Read the file
      fs.readFile(input + file, (err, res) => {
        // res is a Buffer; convert it here if you need to modify the content
        // Write the file
        fs.writeFile(output + file, res, (err) => {
          if (err) console.log(err);
        });
      });
    });
  }
}
```
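A usage sketch that combines the two functions to copy a folder asynchronously (the ./input/ and ./output/ paths are hypothetical):

```js
const list = [];
readAllFilesSync("./input/", list, "./input/");
// Copy every file from ./input/ into ./output/, keeping the directory structure
writeFiles("./input/", "./output/", list);
```

With a very large list this is exactly where the "too many open files" error can show up, since every file in the loop is opened for reading and writing at roughly the same time.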
Synchronous
```js
/**
 * Write files synchronously
 * input  folder the files were read from
 * output folder to write into
 * list   array of file paths relative to input
 */
function writeFilesSync(input, output, list) {
  // Start timing
  console.time();
  for (let i = 0; i < list.length; i++) {
    // Get the file's directory relative to input
    const file = list[i];
    let fileDir = file.split("/");
    fileDir.splice(fileDir.length - 1, 1);
    fileDir = fileDir.join("/");
    const url = output + fileDir;
    // Create the target folder (and any missing parents)
    fs.mkdirSync(url, { recursive: true });
    // Read the file
    const res = fs.readFileSync(input + file);
    // Write the file
    fs.writeFileSync(output + file, res);
    // Stop timing after the last file
    if (i === list.length - 1) console.timeEnd();
  }
}
```
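The synchronous version is called the same way; console.time()/console.timeEnd() print how long the whole copy took (paths again hypothetical):

```js
const list = [];
readAllFilesSync("./input/", list, "./input/");
// Files are copied one at a time, so only one file is open at any moment
writeFilesSync("./input/", "./output/", list);
```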
Conclusion
Multi-file reading and writing is not difficult in itself, but there are still plenty of pitfalls. If you need to improve read/write performance, refer to my other article. You are also welcome to visit my personal blog.