Background

In the past six months, I took over some non-Web development work and have been dealing with data, databases, and scripts. The original project used BAT scripts on a Windows server to run .NET command-line programs for a variety of tasks. I didn't have much shell experience before, and the BAT scripts were difficult for me; they are far from bash in every respect, so I started migrating.

A simple bash tutorial is recommended

bash-handbook

As for the complex logic in the EXE parts, I rebuilt it with Node.js, which I am familiar with. I had no command-line development experience before this, so it was very much a case of crossing the river by feeling for the stones.

Resources

Node.js command-line program development tutorial

Node.js Command Line Tool (CLI)

Development

Linking global commands

As most documentation will mention, once the command name and code path are written under the bin property in package.json, npm link makes the command available globally
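For instance, a minimal package.json might look like this (the `mytool` name and `bin/cli.js` path are placeholders for illustration):

```json
{
  "name": "mytool",
  "version": "1.0.0",
  "bin": {
    "mytool": "./bin/cli.js"
  }
}
```

After running `npm link` in the project directory, `mytool` resolves on the PATH; note the target script needs a `#!/usr/bin/env node` shebang on its first line.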

Mind the current working directory

Remember that the code may be executed from any directory, so it is not advisable to use relative paths to read files in your own code directory, such as let s = fs.readFileSync('./a.json')

Instead, use __dirname to get the absolute path of the code directory and then read the file. require doesn't need to worry about this; it resolves relative paths against the module itself

Get and parse arguments

I looked at several mainstream command-line modules on npm and found that most of them are tightly coupled to specific features: you configure the argument parsing in code, which enables many advanced features such as default values, argument validation, and auto-generated help text.

However, I want to dynamically require different code for each subcommand, and I also want to customize the help text myself. So I only need a relatively simple parsing library and handle the arguments on my own; for now I use yargs
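A minimal sketch of that dispatch pattern, assuming a hypothetical `./commands/<name>.js` module layout and a made-up tool name; in practice the flag tokenizing in `parseArgv` would be delegated to a small library such as yargs:

```javascript
// Split argv into a subcommand plus its remaining arguments, then
// require only that subcommand's module on demand.
function parseArgv(argv) {
  const [subcommand, ...rest] = argv.slice(2); // skip node + script path
  return { subcommand, rest };
}

function dispatch(argv) {
  const { subcommand, rest } = parseArgv(argv);
  if (!subcommand) {
    process.stderr.write('usage: mytool <subcommand> [options]\n');
    process.exitCode = 1;
    return;
  }
  // Only the requested subcommand's code is loaded into memory.
  require(`./commands/${subcommand}`)(rest);
}

// Entry point would be:  dispatch(process.argv);
```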

Output the Markdown document to the console

const fs = require('fs');
const path = require('path');
const marked = require('marked');
const TerminalRenderer = require('marked-terminal');

marked.setOptions({ renderer: new TerminalRenderer({ showSectionPrefix: false }) });
process.stdout.write(marked(fs.readFileSync(path.join(__dirname, './readme.md')).toString()));

The marked-terminal module converts the Markdown text to ANSI escape sequences with color information, which then renders nicely on the console

Read data from the input stream

To implement a standard UNIX-like console application, support for pipes is essential. Thanks to Node's encapsulation, we have easy ways to read input-stream data

  • Stream the data from process.stdin; the Node Promise wrapper and concurrency-control module I wrote earlier come in handy here
  • Read everything at once with fs.readFileSync(0), where 0 is the file descriptor of the standard input stream. This suits data that is small or needs to be parsed as a whole (such as JSON); otherwise we prefer streaming

Direct debugging information to stderr

To keep standard output clean while still seeing debug logs on the console, we can send all debug output to stderr and print only the final results to stdout

In the common logging module log4js, the following configuration ensures that log output does not touch the standard output stream

log4js.configure({
  appenders: { err: { type: 'stderr' } },
  categories: { default: { appenders: ['err'], level: 'all' } }
});

log4js documentation

Wait for the output stream to finish writing before closing the process

If there are open output streams or log files, be careful not to terminate the process with process.exit; wait for the streams to finish writing to their files first

await new Promise(resolve => { ws.on('finish', resolve); ws.end(); });
// for log4js:
log4js.shutdown(cb);

The best approach is not to call process.exit at all. Just remember that ws.end() finishes the write; the process will exit on its own once all active streams (including the standard output stream) are complete and the file handles are released

Use more memory

Node.js uses the V8 engine, whose default heap size is about 1.7 GB. Adding NODE_OPTIONS=--max-old-space-size=40960 to the environment variables increases the available heap. Of course, optimizing code, streamlining data structures, streaming, and avoiding closures still make sense
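For example (using 4096 MB as an illustrative value; the flag takes megabytes and can also be passed directly to node):

```shell
# Raise V8's old-space heap limit for every node process in this shell.
export NODE_OPTIONS=--max-old-space-size=4096

# Verify the new limit via v8.getHeapStatistics() (value printed in MB).
node -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"
```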

Use baked/precompiled template functions

When using templates or rules to batch-process data, consider precompiling them into functions ("baking" the template) to improve efficiency. Front-end compilation frameworks have done a lot of work in this area, if you are interested
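A toy illustration of the idea, assuming a made-up `{name}` placeholder syntax: the template is parsed once into a function, and each data row only pays for a join:

```javascript
// "Bake" a template like "Hello, {name}!" into a reusable function,
// instead of re-parsing the template string for every row of data.
function bake(template) {
  // Split on {placeholder}: odd indices are placeholder names,
  // even indices are literal text.
  const parts = template.split(/\{(\w+)\}/g);
  return data => parts.map((p, i) => (i % 2 ? data[p] : p)).join('');
}

const greet = bake('Hello, {name}!');
// Parsing happened once above; each greet(row) call is cheap.
```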

Perform SQL queries on CSV

Some complex business requirements involve statistics and summaries over data files. Originally I handled this with a pile of lodash functions, but it was heavy, convoluted, and hard to read, so I introduced alasql, which supports running SQL queries over files in a streaming fashion and even offers a Promise-based API. I am still at a preliminary stage of trying it out, but it fits these requirements well
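To illustrate the kind of aggregation being replaced, here is a hand-rolled plain-Node sketch (the `groupSum` helper and field names are made up); with alasql the same thing would be expressed as roughly `SELECT category, SUM(amount) FROM ? GROUP BY category`, syntax per the alasql docs:

```javascript
// Group rows by one field and sum another; this is the lodash-style
// work that a SQL GROUP BY over the CSV replaces.
function groupSum(rows, keyField, valueField) {
  const totals = {};
  for (const row of rows) {
    const k = row[keyField];
    // CSV values arrive as strings, so coerce to a number before adding.
    totals[k] = (totals[k] || 0) + Number(row[valueField]);
  }
  return totals;
}
```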

Conclusion

Since nearly all front-end rendering/JS precompilation frameworks now ship their CLI tools on Node, there are plenty of convenient npm modules for us to use, and developing command-line tools with Node is very comfortable. If a requirement is complex to implement in shell, try handling it with Node code instead

For a deeper understanding of, and guidance on, console programs, I recommend reading The Art of UNIX Programming