Node.js applications do not need a compilation step and can simply be copied to the deployment machine and run, which is indeed more convenient than compiled languages such as C++ or Java. However, running a Node.js application requires a runtime, which means Node.js has to be installed on the deployment machine first. That is not a big hurdle, but it is still one extra step, and it becomes noticeably more troublesome when the deployment machine is offline. Suppose you write some small desktop-level tools in Node.js: you then have to install Node.js on every client. Worse still, deploying multiple Node.js applications that depend on different versions of Node.js on the same machine is harder again.
Ideally, a Node.js application would be packaged as a single executable that can simply be copied to the deployment machine. Besides easier deployment, this also provides some intellectual property protection, since the source code no longer needs to be shipped.
Tools that package Node.js applications into executables include pkg, nexe, node-packer, and Enclose. Among them, pkg is the best in terms of packaging speed, ease of use, and completeness of features. This article summarizes my experience of packaging Node.js applications with pkg over the past six months.
The packaging principle of pkg, in short, is to bundle the JS code and related resource files into the executable and then hijack some functions in fs so that they read code and resource files from inside the executable. For example, the original require('./a.js') is redirected into a virtual directory, effectively becoming require('/snapshot/a.js').
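To make this concrete, here is a minimal sketch of what a script can observe when it runs inside a pkg executable (the file name probe.js and the exact snapshot path are just illustrative; pkg exposes process.pkg inside packaged binaries):

// probe.js - a small script to inspect pkg's virtual filesystem (hypothetical example)
const fs = require('fs');
const path = require('path');

// pkg defines process.pkg when the code runs inside a packaged executable
console.log(process.pkg ? 'running inside a pkg executable' : 'running under plain node');

// Inside the executable, __dirname points into the snapshot, e.g. /snapshot/project
console.log('__dirname =', __dirname);

// Reads relative to __dirname are served from the snapshot by the hijacked fs functions
const self = fs.readFileSync(path.join(__dirname, 'probe.js'), 'utf8');
console.log('read', self.length, 'bytes of source from the snapshot');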
Installation
pkg can be installed globally or locally. Local installation is recommended:
npm install pkg --save-dev
Usage
Run pkg -h to see the basic usage:
pkg [options] <input>
The <input> can be specified in three ways:

1. A script file, e.g. pkg index.js.
2. A package.json, e.g. pkg package.json; the bin field of package.json is used as the entry file.
3. A directory, e.g. pkg . (package the current directory); pkg looks for a package.json in the specified directory and then uses its bin field as the entry file.
Commonly used options:

1. -t specifies the target platforms and Node versions to build for. For example, -t node6-win-x64,node6-linux-x64,node6-macos-x64 produces executables for three platforms at once.
2. -o specifies the name of the output executable; if -t specifies multiple targets, use --out-path to specify an output directory instead.
3. -c specifies a JSON configuration file declaring additional scripts and resource files that need to be packaged; usually package.json is used for this configuration.
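For example, a one-off invocation using the targets above might look like this (index.js is just an illustrative entry file):

pkg index.js -t node6-win-x64,node6-linux-x64,node6-macos-x64 --out-path dist/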
The best practice with pkg is to specify the packaging parameters in the pkg field of package.json and run the packaging step via an npm script, for example:
{
...
"bin": "./bin/www",
"scripts": {
"pkg": "pkg . --out-path=dist/"
},
"pkg": {
"scripts": [...]
"assets": [...],
"targets": [...]
},
...
}
scripts and assets are used to declare script and resource files that would otherwise not be packaged into the executable; glob wildcards can be used in the file paths. This raises a question: why are some scripts and resource files not packaged automatically?
Answering it requires understanding pkg's mechanism for deciding which files to package. According to the pkg documentation, pkg only automatically packages files whose paths are written relative to __dirname or __filename, such as path.join(__dirname, '../path/to/asset'). As for require(), since require itself resolves relative to __dirname, required files can also be packaged automatically. Now suppose a file contains the following code:
require('./build/' + cmd + '.js')
path.join(__dirname, 'views/' + viewName)
These paths are not constants, so pkg cannot work out which files to package and they end up missing from the executable; scripts and assets are how you tell pkg that those files need to be included. So how do we know which files were not packaged? Do we have to go through the source code line by line? Actually it is very simple: just run the packaged executable, and the error messages will tell you which files are missing; pkg also prints warnings during packaging when it cannot automatically resolve some files.
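For instance, assuming the dynamic paths above refer to files under build/ and views/, a pkg configuration along these lines (the glob patterns are illustrative) would tell pkg to include them:

"pkg": {
  "scripts": [ "build/**/*.js" ],
  "assets": [ "views/**/*" ]
}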
Caveats
If there is one area where pkg could be improved, it is that it cannot automatically package binary *.node files. If your project references a native module such as sqlite3, you need to copy the *.node files into the same directory as the executable yourself; I usually do it with a single cp over node_modules/**/*.node. However, if you package cross-platform, for example building the Linux version on Windows, the corresponding native modules also have to be swapped for their Linux builds, which usually means downloading or compiling them manually.
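If you prefer not to depend on shell globbing, the copy can also be done with a small Node script. This is only a sketch, assuming the executables are emitted to dist/ as in the npm script above:

// copy-native.js - hypothetical helper that copies *.node files next to the packaged executable
const fs = require('fs');
const path = require('path');

// Recursively collect all *.node files under a directory
function findNodeFiles(dir, found = []) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) findNodeFiles(full, found);
    else if (entry.isFile() && full.endsWith('.node')) found.push(full);
  }
  return found;
}

const outDir = path.join(__dirname, 'dist'); // assumes pkg --out-path=dist/
fs.mkdirSync(outDir, { recursive: true });

for (const file of findNodeFiles(path.join(__dirname, 'node_modules'))) {
  const target = path.join(outDir, path.basename(file));
  fs.copyFileSync(file, target); // overwrites any existing copy with the same name
  console.log('copied', file, '->', target);
}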
So why can't pkg pack native modules into the executable? My guess is that require loads .js files and .node files through different mechanisms. It also makes sense, by design, not to package binaries automatically, because binaries are platform-specific: if I build a Linux executable on Windows, I would have to obtain the Linux version of the .node files, which can be difficult. Moreover, some native modules do not provide precompiled binaries and have to be compiled during installation, and pkg cannot simulate a compilation environment for other platforms. nexe can automatically package native modules, but only into an executable for the current platform and Node version. This means that if a Node.js application references native modules, it cannot be packaged cross-platform with nexe, so I don't think nexe's design is better in this regard.