preface

My GitHub blog: please give it a little star!

Recently, I had to write a script that automatically reads the files in a folder and writes their contents into another, newly generated file. This scenario is quite common, yet there are few good step-by-step (hand-holding) tutorials for it on the web, which is not very friendly to front-end developers who have not yet learned Node.js.

So this article will teach you how to write such a script. I will try to explain everything clearly to make sure you can follow along!

Example scenario

Suppose there is a project with the following directory:

|-- app1
    |-- config.json
    |-- index.js
|-- app2
    |-- config.json
    |-- index.js
|-- app3
    |-- config.json
    |-- index.js
|-- app4
    |-- config.json
    |-- index.js
|-- config.all.js
|-- package.json


There is a config.json file in each app folder; this is the configuration file we need to read. All we need to do is write a Node script that listens for file changes in the current directory, reads each app's configuration file in real time, and writes them into the config.all.js file.

Now assume that the config file config.json will look something like this:

{
  "name": "vortesnail",
  "github": "github.com/vortesnail",
  "age": "24",
  "address": "earth",
  "hobby": ["sing", "dance", "rap", "code"]
}


The config.json content differs from one app folder to another, which better matches a real project scenario.

Writing the script

Install chokidar

Because the native fs.watch has many problems and limitations, we use the third-party module chokidar for file watching.

npm install chokidar


Creating the script file

Now create our script file auto-config.js in the root directory (name it whatever you like, of course). Start by requiring our third-party module chokidar and the Node core modules fs, path, and process.

const chokidar = require('chokidar')
const fs = require('fs')
const path = require('path')
const process = require('process')
const PROJECT_PATH = process.cwd()


PROJECT_PATH is the path of the current working directory.
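
As a quick illustration (my own aside, not part of the script itself), process.cwd() is the directory you launch the script from, which is different from __dirname, the directory the script file lives in:

// illustrative only: two different notions of the "current" path
const process = require('process')

console.log(process.cwd())   // directory from which `node auto-config.js` was launched
console.log(__dirname)       // directory that contains this script file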

Using chokidar.watch

The basic usage is chokidar.watch('.', {}).on(...), where '.' represents the current path. I ran into problems when passing PROJECT_PATH as the watch target instead; if you know why, please leave a comment!

const chokidar = require('chokidar')
const fs = require('fs')
const path = require('path')
const process = require('process')
const PROJECT_PATH = process.cwd()

chokidar.watch('.', {
  persistent: true,
  ignored: /(^|[\/\\])\..|auto-config.js|config.all.js|node_modules/,
  depth: 1
}).on('all', (event, pathname) => {
  console.log(event, pathname)
  // do something later...
})

  • persistent: as with the native fs.watch, whether watching should keep the process running (persistent watching). The default value is true.
  • ignored: the files or folders to be ignored (see the quick check below).
  • depth: how many levels of subdirectories to watch; 1 means the current directory plus one level of subdirectories.
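
As a quick sanity check (my own illustration, not from the original setup), here is how that ignored pattern behaves against a few sample paths:

const ignored = /(^|[\/\\])\..|auto-config.js|config.all.js|node_modules/

console.log(ignored.test('.git'))               // true: dotfiles and dot-folders are skipped
console.log(ignored.test('config.all.js'))      // true: the generated file itself is skipped
console.log(ignored.test('app1/config.json'))   // false: the app config files are still watched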

Using fs.readdirSync

fs.readdirSync(PROJECT_PATH) reads the current directory and returns an array containing the name of each file or folder in it. Update our code:

chokidar.watch('.', {
  persistent: true,
  ignored: /(^|[\/\\])\..|auto-config.js|config.all.js|node_modules/,
  depth: 0
}).on('all', (event, pathname) => {
  console.log(event, pathname)
- // do something later...
+ const rootFilenames = fs.readdirSync(PROJECT_PATH)
+ console.log(rootFilenames)
})


You can now run node auto-config.js in the current directory and look at the console output; you will find that the array of file names in the current directory gets printed over and over:

[
  'app1',
  'app2',
  'app3',
  'app4',
  'auto-config.js',
  'config.all.js',
  'node_modules',
  'package-lock.json',
  'package.json'
]


It prints repeatedly because, on the first run, chokidar emits an add event for every file in the current directory.
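
If you do not care about those initial events, chokidar also accepts an ignoreInitial option. A minimal sketch, shown only as an aside; the rest of this article keeps the default behaviour:

const chokidar = require('chokidar')

chokidar.watch('.', {
  persistent: true,
  ignoreInitial: true, // do not emit add/addDir events for files that already exist when watching starts
  ignored: /(^|[\/\\])\..|auto-config.js|config.all.js|node_modules/,
  depth: 0
}).on('all', (event, pathname) => {
  console.log(event, pathname)
})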

Loop through each folder and get a list of subdirectory files

With the files in the current directory in hand, we need to filter out the folders and then call the same fs.readdirSync(path) on each folder (for example, our app1 and app2 folders) to get the list of files in the directory where the configuration file lives. The path can be built with path.join or simple string concatenation.

chokidar.watch('.', {
    // ...
}).on('all', (event, pathname) => {
  console.log(event, pathname)
  const rootFilenames = fs.readdirSync(PROJECT_PATH)
- console.log(rootFilenames)
+ rootFilenames.forEach(function(file) {
+   const newPath = path.join(PROJECT_PATH, `/${file}/`)
+   const subFilenames = fs.readdirSync(newPath)
+   console.log(subFilenames)
+ })
})


But now an error is thrown, because if the path passed to fs.readdirSync is a file rather than a folder, the call fails and the program terminates. So we need to check what kind of entry it is first.

Reading file status with fs.stat

By using fs.stat(path, callback) instead of fs.statSync, we can handle errors in the callback instead of letting them crash the script.

  • The callback has two arguments: (err, stats), where stats is an fs.Stats object.
  • stats.isDirectory() checks whether the entry is a folder.

The updated code is as follows:

chokidar.watch('.', {
    // ...
}).on('all', (event, pathname) => {
  console.log(event, pathname)
  const rootFilenames = fs.readdirSync(PROJECT_PATH)
  rootFilenames.forEach(function(file) {
    const newPath = path.join(PROJECT_PATH, `/${file}/`)
    fs.stat(newPath, function(err, stats) {
      if (err) {
        console.log(file + ' is not a directory...')
      } else {
        const isDir = stats.isDirectory() // is it a folder?
        if (isDir) {
          const subFilenames = fs.readdirSync(newPath)
          console.log(subFilenames)
        }
      }
    })
  })
})


Now that we have the list of files in each subdirectory, we can check whether it contains the file we need and, if so, read it.
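
As a side note (my own variation, not what the original script does), on Node 10.10+ you could skip the extra fs.stat call by asking fs.readdirSync for Dirent objects directly:

const fs = require('fs')
const path = require('path')
const PROJECT_PATH = process.cwd()

// withFileTypes makes readdirSync return fs.Dirent objects instead of plain names
const entries = fs.readdirSync(PROJECT_PATH, { withFileTypes: true })
entries
  .filter(entry => entry.isDirectory())
  .forEach(entry => {
    console.log(fs.readdirSync(path.join(PROJECT_PATH, entry.name)))
  })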

Using fs.readFileSync and fs.writeFileSync

We need a variable to store what we read; here we use:

let content = ''


Here I simply read each config.json file, append a comma and a newline after its contents, and write everything into the newly generated config.all.js file.

Add the following code:

chokidar.watch('.', {
  persistent: true,
  ignored: /(^|[\/\\])\..|auto-config.js|config.all.js|node_modules/,
  depth: 0
}).on('all', (event, pathname) => {
  console.log(event, pathname)
+ let content = ''
  const rootFilenames = fs.readdirSync(PROJECT_PATH)
  rootFilenames.forEach(function(file) {
    const newPath = path.join(PROJECT_PATH, `/${file}/`)
    fs.stat(newPath, function(err, stats) {
      if (err) {
        console.log(file + ' is not a directory...')
      } else {
        const isDir = stats.isDirectory() // is it a folder?
        if (isDir) {
          const subFilenames = fs.readdirSync(newPath)
-         console.log(subFilenames)
+         subFilenames.forEach(function(file) {
+           if (file === 'config.json') {
+             const data = fs.readFileSync(path.join(newPath, file), 'utf-8') // read the file contents
+             content += data + ',' + '\n'
+           }
+           fs.writeFileSync(path.join(PROJECT_PATH, 'config.all.js'), `module.exports={data: [${content}]}`)
+         })
        }
      }
    })
  })
+ console.log('The configuration table config.all.js has been automatically generated...')
})


Run node auto-config.js, then open the config.all.js file in the root directory and you will see that every config.json under the app folders has been written into it. Any change you make to the contents of any file in the current directory or its subdirectories will regenerate the configuration table.
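
With the example config.json shown earlier, the generated config.all.js should look roughly like this (one object per app folder, each followed by the comma and newline the script appends; the exact whitespace depends on how the source files are formatted):

module.exports={data: [{
  "name": "vortesnail",
  "github": "github.com/vortesnail",
  "age": "24",
  "address": "earth",
  "hobby": ["sing", "dance", "rap", "code"]
},
// ...the config.json contents of app2, app3 and app4 follow in the same way...
]}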

Dealing with a small defect

Because the first watch pass generates a lot of events, the final success message gets printed many times, which looks ugly, so we add a debounce so that it is only output once. You can also add some hints where appropriate. Here is the complete code:

const chokidar = require('chokidar')
const fs = require('fs')
const path = require('path')
const process = require('process')
const PROJECT_PATH = process.cwd()

chokidar.watch('.', {
  persistent: true,
  ignored: /(^|[\/\\])\..|auto-config.js|config.all.js|node_modules/,
  depth: 0
}).on('all', (event, pathname) => {
  console.log(event, pathname)
  let content = ''
  const rootFilenames = fs.readdirSync(PROJECT_PATH)
  rootFilenames.forEach(function(file) {
    const newPath = path.join(PROJECT_PATH, `/${file}/`)
    fs.stat(newPath, function(err, stats) {
      if (err) {
        console.log(file + ' is not a directory...')
      } else {
        const isDir = stats.isDirectory() // is it a folder?
        if (isDir) {
          const subFilenames = fs.readdirSync(newPath)
          subFilenames.forEach(function(file) {
            if (file === 'config.json') {
              const data = fs.readFileSync(path.join(newPath, file), 'utf-8') // read the contents of the file
              content += data + ',' + '\n'
            }
            fs.writeFileSync(path.join(PROJECT_PATH, 'config.all.js'), `module.exports={data: [${content}]}`)
          })
        }
      }
    })
  })

  success()
})

function debounce(func, wait) {
  var timeout;
  return function () {
    var context = this;
    var args = arguments;
    clearTimeout(timeout)
    timeout = setTimeout(function() {
      func.apply(context, args)
    }, wait);
  }
}

const success = debounce(() => {
  console.log('The config table config.all.js has been automatically generated...')
}, 500)


Now try node auto-config.js and see what happens

Webpack packaging configuration

Sometimes we need to bundle the script into a single file, drop it into an nginx environment, run it in that root directory, and have it automatically watch for file changes and generate the configuration table, completely hands-off!

Setting up the packaging is easy, don't panic!

Install necessary plug-ins

We only need a few dependencies for webpack packaging; the node-loader package is the most important.

npm install -D webpack webpack-cli node-loader


Webpack configuration

Create webpack.auto.js in the root directory:

const path = require('path');

module.exports = {
  target: "node",
  entry: {
    script: path.resolve(__dirname, "auto-config.js"),
  },
  output: {
    publicPath: '',
    filename: '[name].js',
    path: path.resolve(__dirname, "build"),
  },
  module: {
    rules: [
      {
        test: /\.node$/,
        use: 'node-loader'
      }
    ],
  },
  node: {
    fs: 'empty',
    child_process: 'empty',
    tls: 'empty',
    net: 'empty'
  },
};


The important things are target: "node" and getting the entry file right, because fsevents ships a .node file that needs to be processed, and node-loader is what recognizes and loads it.

Modify package.json

"scripts": {

  "test": "echo \"Error: no test specified\" && exit 1",

+ "build:auto": "webpack -p --progress --config webpack.auto.js"

},


packaging

Now run npm run build:auto in the console, and a little script that can watch, read, and write is done! Feel free to use it: drop the bundled output into any directory and run:

node auto-config.js
// If you do not have permission, prepend sudo
sudo node auto-config.js


Perfect!

conclusion

I think there is still plenty of room for improvement in this approach, but my level is limited. If any experienced reader has a better idea, I really hope you will share it in the comments so that more "knowledge-hungry" students like me can grow. Thank you! 🙏

Recommended reading: This time, understand HTTPS thoroughly