Preface
The original article was published on my blog — you're welcome to follow it there.
Happy New Year of the Pig 2019 to everyone! This article is long and requires some patience.
Some time ago, I used electron-vue to develop PicGo, a free and open-source cross-platform image-hosting upload application (currently supporting the three major desktop operating systems). During development I ran into many pitfalls, coming not only from the application's business logic but also from Electron itself, and I learned a lot in the process. Since I also started learning Electron from zero, these experiences should offer some inspiration and guidance to people learning Electron development for the first time. So I'm writing a series on practical Electron development experience, told from a perspective as close as possible to real project development. I hope it helps.
It is expected to be launched from several series of articles or aspects:
- Electron-vue primer
- Simple development of the Main and Renderer processes
- JSON database lowDB is introduced based on Lodash
- Some cross-platform compatibility measures
- Release and update through CI
- Develop plug-in system – CLI part
- Develop plug-in system – GUI part
- More to come as I think of it…
Instructions
PicGo is developed with electron-vue, so if you know Vue you will pick this up faster. If your stack is React or Angular, the Renderer-side parts may not teach you much, but the Electron-side parts should still apply.
If you haven't read the previous articles, you can start from the beginning of the series.
As far as I'm concerned, this article was genuinely hard to write. It took me half a year to work out how to build a plug-in system, and it's not easy to get it all right in one or two articles. So there may be some rough edges in the text, which I'll polish later.
Plug-in systems — containers
Most of us have written plug-ins for frameworks or tools such as Vue, React, or webpack. We can call the framework that provides the plug-in system a "container": through the API exposed by the container, plug-ins can be mounted onto it or hook into its life cycle to implement more customized functionality.
Webpack, for example, is essentially a process (pipeline) system. Through Tapable it exposes a number of lifecycle hooks that plug-ins can tap into — for example, Babel-related plug-ins transpile ES6 code to ES5, and the Sass/Less/Stylus family of plug-ins compiles preprocessed CSS into plain CSS that the browser can understand, and so on.
We’re going to implement a plug-in system, which is essentially implementing such a container. The container and its plug-ins need to have the following basic characteristics:
- The container can perform basic functions without third-party plug-ins
- Plug-ins are independent
- Plug-ins can be configured and managed
The first point should be easy to understand. What good is a plug-in system that cannot work at all when no third-party plug-ins are installed? That said, unlike third-party plug-ins, many plug-in systems ship with built-in plug-ins — vue-cli and Webpack, for example, have a series of built-in plug-ins — and in that case some of the system's own functionality is implemented by those built-in plug-ins.
Second, plug-in independence means that the plug-in itself does not actively affect the operation of other plug-ins. Of course, one plug-in can depend on the results of another.
Third, plug-ins that cannot be configured and managed will run into problems from the plug-in installation stage. So the container needs to have a well-designed entry to register the plug-in.
In the next parts, I will use PicGo-Core and PicGo to explain in detail how to design and implement the CLI plug-in system and the GUI plug-in system.
CLI plug-in system
Overview
In fact, a CLI plug-in system can be thought of as a GUI-free plug-in system — one that runs on the command line, without a visual interface. Why do we need to bring up the CLI plug-in system when we are developing an Electron plug-in system? Here's a quick review of Electron's structure:
You can see that most of the functionality is provided by the Main process, except for the interface rendering of the Renderer. For PicGo, its bottom layer should be an upload process system, as follows:
- Input: accepts input from outside; by default this can be an image path or a complete base64 image
- Transformer: Convert input into objects that can be uploaded by the uploader (including image size, base64, image name, etc.)
- Uploader: Uploads output from a converter to a specified location. The default Uploader will be SM.ms
- Output: outputs the upload results, usually the imgUrl
So in theory the underlying layer should be implemented on the Node.js side. The Electron Renderer process just implements the GUI and calls the API provided by the flow system implemented in Node.js — similar to the separation of front end and back end when we develop web pages, except that this "back end" is the plug-in system implemented in Node.js. With this in mind, I started to implement PicGo-Core.
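To make the separation concrete, here is a minimal usage sketch of the Node.js side on its own. The method and event names follow the examples used later in this article, so treat it as an illustration rather than a full API reference:
// Hypothetical usage sketch: any Node.js script (no Electron involved) drives the flow
const PicGo = require('picgo')

const picgo = new PicGo() // use the default configuration file

picgo.on('finished', output => {
  console.log(output) // Output stage: the upload results (the imgUrl etc.)
})

// Input stage: pass in image paths; the Transformer and Uploader run in between
picgo.upload(['/path/to/image.png'])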
The life cycle
For example, Vue has beforeCreate, Created, Mounted, etc., and Webpack has beforeRun, Run, afterCompile, etc. This is also the soul of a plug-in system, which gives the plug-in more freedom by accessing the life cycle of the system.
So we can implement a Lifecycle class first. The code can be found in Lifecycle.
The lifecycle flow is shown in the flowchart above.
class Lifecycle {
// Entry for the entire lifecycle
async start (input: any[]): Promise<void> {
try {
await this.beforeTransform(input)
await this.doTransform(input)
await this.beforeUpload(input)
await this.doUpload(input)
await this.afterUpload(input)
} catch (e) {
console.log(e)
}
}
// Get raw input, before conversion
private async beforeTransform (input) {
// ...
}
// Convert the input to an uploadable format
private async doTransform (input) {
// ...
}
// Uploader before upload
private async beforeUpload (input) {
// ...
}
// Uploader performs the upload
private async doUpload (input) {
// ...
}
// After the Uploader has finished uploading
private async afterUpload (input) {
// ...
}
}
In practical use, we can use:
const lifeCycle = new Lifecycle()
lifeCycle.start([...])
to run the entire life cycle of the upload process. So far there is nothing about plug-ins yet — this fulfills our first condition: the container can perform its basic functions without third-party plug-ins.
Broadcast events
A lot of the time we need to deliver events in some way. Just like the publish-subscribe model, events are published by the container and subscribed to by plug-ins. Lifecycle can simply inherit from Node.js's EventEmitter:
class Lifecycle extends EventEmitter {
constructor () {
super()
}
// ...
}
Lifecycle thus gets the emit and on methods of EventEmitter. On the container side, we simply emit the events.
In Picgo-Core, for example, the entire upload process broadcasts events, notifies the plug-in of its current stage, and sends the current input or output as it broadcasts.
private async beforeTransform (input) {
// ...
this.emit('beforeTransform', input) // Broadcast events
}
Plug-ins are free to listen for the events they care about. For example, a plug-in that wants to know the result of the upload (pseudocode):
plugin.on('finished', (output) => {
  console.log(output) // get the output
})
There are some events I found useful when developing PicGo-Core that I'd also like to share. Not every plug-in system needs events like these, but they can sometimes be useful for you and your project.
Progress events
When we upload or download files, there's one thing we always notice: the progress bar. So PicGo-Core exposes an event called uploadProgress that tells the user the current upload progress. In PicGo-Core, upload progress is counted from beforeTransform onwards and, for easy calculation, is divided into five fixed values.
private async beforeTransform (input) {
  this.emit('uploadProgress', 0) // Before conversion, progress 0
}
private async doTransform (input) {
  this.emit('uploadProgress', 30) // Start conversion, progress 30
}
private async beforeUpload (input) {
  this.emit('uploadProgress', 60) // Start uploading, progress 60
}
private async afterUpload (input) {
  this.emit('uploadProgress', 100) // Upload complete, progress 100
}
Returns -1 if the upload fails:
async start (input: any[]): Promise<void> {
try {
await this.beforeTransform(input)
await this.doTransform(input)
await this.beforeUpload(input)
await this.doUpload(input)
await this.afterUpload(input)
} catch (e) {
console.log(e)
this.emit('uploadProgress', -1)
}
}
By listening for this event, PicGo can make the following upload progress bar:
System notifications
If there is a problem, or some information needs to be conveyed to the user through a system-level notification, a notification event can be published. Whoever listens for this event can then call the system notification API to display it. A plug-in can also publish this event for PicGo to listen to. As shown in the figure above, the notification in the upper-right corner is displayed after a successful upload.
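A rough sketch of the idea — the 'notification' event name here matches the one used in the pluginLoader example later in this article, and the Electron call in the comment is just one possible consumer:
// Container or plug-in side: publish a notification event
ctx.emit('notification', {
  title: 'Upload succeeded',
  body: imgUrl // hypothetical variable holding the uploaded image URL
})

// GUI side: listen for the event and turn it into a system-level notification
picgo.on('notification', info => {
  // In Electron's Main process this could be:
  // new Notification({ title: info.title, body: info.body }).show()
  console.log(info.title, info.body)
})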
Accessing the life cycle
In the previous section we looked at event broadcasting in the life cycle, and you can see that event broadcasting doesn't care about delivering results back: PicGo-Core just publishes the event, and it doesn't care whether any plug-in listens, or what happens when one does (it's a bit like UDP). But a lot of the time we actually need to plug into the life cycle to do something.
Take the upload process as an example. If I want to compress images before uploading, listening for the beforeUpload event cannot achieve this: even if you compress the image inside the beforeUpload handler, the life cycle continues right after the emit, so the upload will long since have moved on.
Therefore, we need the container's life cycle to let plug-ins access it, execute the plug-ins' actions within the current stage, and only then hand the results to the next stage. You can see there is an action of "waiting" for the plug-ins to finish. PicGo-Core uses the simplest and most intuitive approach — async functions with await — to implement this "waiting".
We don't have to worry yet about how plug-ins are registered — that comes later. Let's start by implementing how plug-ins access the life cycle.
The following uses the life cycle beforeUpload as an example:
private async beforeUpload (input) {
this.ctx.emit('uploadProgress', 60)
this.ctx.emit('beforeUpload', input)
// ...
await this.handlePlugins(beforeUploadPlugins.getList(), input) // Execute and "wait" for plug-in execution to finish
}
You can see that we await the completion of the life-cycle method handlePlugins (its implementation is described below). The list of plug-ins to run is obtained through beforeUploadPlugins.getList() (also shown below); these are the plug-ins registered specifically for the beforeUpload stage. The input is then passed to handlePlugins for the plug-ins to consume.
Now let’s implement handlePlugins:
private async handlePlugins (plugins: Plugin[], input: any[]) {
await Promise.all(plugins.map(async (plugin: Plugin) => {
await plugin.handle(input)
}))
}
We "wait" for all plug-ins to finish executing with Promise.all and await. Note that each PicGo plug-in needs to implement a handle method for PicGo-Core to call. As you can see, this fulfils the second characteristic we talked about: plug-ins are independent.
You can also see here that we have built an environment with async and await to “wait” for the plug-in execution to finish. This solves the problem of not being able to access the life cycle of a plug-in system through broadcast events alone.
No, wait — there's still a question. Where does beforeUploadPlugins.getList() come from? The code above is just sample code. In fact, PicGo-Core reserves five plug-in collections for the different stages of the upload life cycle:
- beforeTransformPlugins
- transformer
- beforeUploadPlugins
- uploader
- afterUploadPlugins
Each is invoked at its corresponding stage of the upload. Although the timing of these five kinds of plug-ins differs, their implementation is the same: they share the same registration mechanism, the same methods for getting the plug-in list, getting plug-in information, and so on. So let's go ahead and implement a plug-in class for the life cycle.
The life-cycle plug-in class
This is a key part of the plug-in system. This class defines how plug-ins are registered with our plug-in system and how the plug-in system obtains them. See LifecyclePlugins.ts for the code of this section.
Here is the implementation:
class LifecyclePlugins {
// list is a list of plug-ins. Represented as an object.
list: {
[propName: string]: Plugin
}
constructor () {
this.list = {} // Initialize the plug-in list to {}
}
// Plugin registration entry
register (id: string, plugin: Plugin): void {
// If the plug-in does not provide an ID, it will not be registered
if (!id) throw new TypeError('id is required!')
// If the plugin does not have a handle method, it will not be registered
if (typeof plugin.handle !== 'function') throw new TypeError('plugin.handle must be a function!')
// If the plug-in id is duplicated, it will not be registered
if (this.list[id]) throw new TypeError(`${this.name} duplicate id: ${id}!`)
this.list[id] = plugin
}
// Get the plug-in by the plug-in ID
get (id: string): Plugin {
return this.list[id]
}
// Get the list of plug-ins
getList (): Plugin[] {
return Object.keys(this.list).map((item: string) => this.list[item])
}
// Get the list of plug-in ids
getIdList (): string[] {
return Object.keys(this.list)
}
}
export default LifecyclePlugins
The most important method for plug-ins is the register method, which is the entry point for plug-in registration. Once registered through register, the plug-in is written into the list inside LifecyclePlugins as id: plugin. Note that PicGo-Core requires each plug-in to implement a handle method, which can then be called later in the life cycle.
Here’s how to register a plug-in in pseudocode:
beforeTransformPlugins.register('test', {
handle (ctx) {
console.log(ctx)
}
})
Here we have registered a plugin with id test, which is a plugin for the beforeTransform phase that prints incoming information.
The life cycle then calls lifecyclePlugins.getList() to obtain the list of plug-ins registered for that stage.
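Putting these pieces together: the five collections listed earlier can simply be five LifecyclePlugins instances grouped on the core class. A minimal sketch — the helper property name is assumed here to match the ctx.helper.xxx calls used later in this article:
// Sketch: inside the core class constructor, one LifecyclePlugins instance per stage,
// grouped under `helper` so plug-ins can reach them via ctx.helper.beforeTransformPlugins etc.
this.helper = {
  transformer: new LifecyclePlugins(),
  beforeTransformPlugins: new LifecyclePlugins(),
  uploader: new LifecyclePlugins(),
  beforeUploadPlugins: new LifecyclePlugins(),
  afterUploadPlugins: new LifecyclePlugins()
}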
Pulling out a core class
If you are simply implementing a plug-in system that runs in a Node.js project, the above two parts are basically enough:
- The Lifecycle class is responsible for the entire life cycle
- The LifecyclePlugins class is responsible for registering and obtaining plug-ins
However, a good CLI plug-in system also requires at least the following (at least in my opinion):
- It can be invoked from the command line
- Ability to read configuration files for additional configuration
- Installing plug-ins from the command line
- Configuring plug-ins from the command line
- Friendly log message prompt
You can refer to the vue-cli 3 tool here.
So we need at least the following:
- A class for command-line operations
- A class for configuration-file operations
- A class for plug-in installation, uninstallation, update and other operations
- A class for plug-in loading
- A class for log output
None of these are particularly strongly coupled to the lifecycle classes themselves, so you don’t need to implement them all in the lifecycle classes.
Instead, we pull out a Core as the Core and include the above classes in this Core class, which is responsible for registering command-line commands, loading plug-ins, optimizing log information, calling life cycle, and so on.
Finally, we expose the core class for users or developers to use. This is the implementation of picgo.ts, the core of PicGo-Core.
The implementation of PicGo itself is not complicated and basically just calls the methods of the above class instances.
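To give a feel for that composition, here is a rough skeleton of the core class — a sketch only, with imports omitted; the member names follow the pieces discussed in this article, and the real picgo.ts has more:
// Sketch of the core class composing the pieces covered below (imports omitted)
class PicGo extends EventEmitter {
  configPath: string
  baseDir: string            // the folder the configuration file lives in (plug-ins are installed here)
  config: any
  log: Logger                // log output
  cmd: Commander             // command-line operations
  pluginLoader: PluginLoader // plug-in loading
  helper: {                  // the five per-stage plug-in collections
    [name: string]: LifecyclePlugins
  }
  private lifecycle: Lifecycle

  constructor (configPath: string = '') {
    super()
    // instantiate the pieces above, passing `this` as ctx to each of them ...
  }

  async upload (input: any[]): Promise<void> {
    await this.lifecycle.start(input) // delegate to the life cycle
  }
}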
But notice something here that I haven't mentioned before. In PicGo-Core, apart from the core PicGo class, the sub-classes basically all take a constructor parameter called ctx. What is this parameter? It is the this of the PicGo class itself. By passing this around, the sub-classes of PicGo-Core can also use the methods exposed by the PicGo core class.
For example, the Logger class implements nice command-line logging:
It’s also easy to call Logger methods in other subclasses:
ctx.log.success('Hello world!')
where ctx is the this of PicGo itself, as mentioned above.
Next we cover the concrete implementation of each of these classes.
The log output class
I'm going to start with this class because it's the simplest and least invasive. You can do without it, but it's icing on the cake.
PicGo uses the beautification library chalk to output colorful command-line text:
It’s also easy to use:
const log = chalk.green('Success')
console.log(log) // Success with green font
We intend to implement four output types: success, warn, info, and error.
Create the following class:
import chalk from 'chalk'
import PicGo from '../core/PicGo'

class Logger {
  level: {
    [propName: string]: string
  }
  ctx: PicGo
  constructor (ctx: PicGo) { // Pass PicGo's this to the constructor so that the Logger can use the methods exposed by the PicGo core class
    this.level = {
      success: 'green',
      info: 'blue',
      warn: 'yellow',
      error: 'red'
    }
    this.ctx = ctx
  }
  // The actual output function
  protected handleLog (type: string, msg: string | Error): string | Error | undefined {
    if (!this.ctx.config.silent) { // Silent mode does not output logs
      let log = chalk[this.level[type]](`[PicGo ${type.toUpperCase()}]: `)
      log += msg
      console.log(log)
      return msg
    } else {
      return
    }
  }
  // There are four different types
  success (msg: string | Error): string | Error | undefined {
    return this.handleLog('success', msg)
  }
  info (msg: string | Error): string | Error | undefined {
    return this.handleLog('info', msg)
  }
  error (msg: string | Error): string | Error | undefined {
    return this.handleLog('error', msg)
  }
  warn (msg: string | Error): string | Error | undefined {
    return this.handleLog('warn', msg)
  }
}

export default Logger
Then mount the Logger class to the PicGo core class:
import Logger from '../lib/Logger'
class PicGo {
log: Logger
constructor () {
// ...
this.log = new Logger(this) // Pass this to Logger; it becomes ctx inside Logger
}
// ...
}
This allows other classes mounted to PicGo’s core classes to call methods in log using ctx.log.
Configuration file operations
Most of the time, the systems and plug-ins we write need some configuration before they can be used — vue.config.js for vue-cli 3, _config.yml for Hexo, and so on. PicGo is no exception: it can be used directly with its defaults, but if you want to do anything else you need to configure it. So configuration files are an important part of a plug-in system.
I used lowdb as the read/write library for the JSON configuration file in PicGo (the Electron app) and the experience was good. To stay forward-compatible with PicGo's configuration, I still use this library in PicGo-Core. I covered some specific usage of lowdb in a previous article; if you are interested, take a look — portal.
Because lowdb persists configuration (much like a database such as MySQL persists data), it requires a concrete JSON file on disk as its carrier; we cannot initialize the configuration simply by creating an in-memory configuration object. So everything unfolds from this configuration file:
PicGo-Core uses a default configuration file, homedir()/.picgo/config.json, which is used when PicGo is instantiated without a configuration file path. If the caller provides a specific configuration file, the supplied one is used instead.
Here’s how to initialize PicGo:
import fs from 'fs-extra'

class PicGo extends EventEmitter {
  configPath: string
  private lifecycle: Lifecycle
  // ...
  constructor (configPath: string = '') {
    super()
    this.configPath = configPath // Pass in configPath
    this.init()
  }
  init (): void {
    if (this.configPath === '') { // If no configuration file path is provided, use the default one
      this.configPath = homedir() + '/.picgo/config.json'
    }
    if (path.extname(this.configPath) !== '.json') { // Only JSON configuration files are supported
      this.configPath = ''
      return this.log.error('The configuration file only supports JSON format.')
    }
    const exist = fs.pathExistsSync(this.configPath)
    if (!exist) { // If the configuration file does not exist, create it
      fs.ensureFileSync(`${this.configPath}`)
    }
    // ...
  }
  // ...
}
So instantiating PicGo looks like this:
const PicGo = require('picgo')
const picgo = new PicGo() // If no configuration file is provided, use the default configuration file
// or
const picgo = new PicGo('./xxx.json') // The supplied configuration file is used
With the configuration file in place, we only need to implement three basic actions:
- Initial Configuration
- Reading configuration
- Write configuration (write configuration includes creation, update, deletion, etc.)
Initial Configuration
Generally speaking, our systems have some default configuration, and PicGo is no exception. We can choose to write the default configuration into the code, or to write it into the configuration file. Because PicGo's configuration file has persistence requirements, it makes sense to write some key default configuration into the configuration file.
The initial configuration will use some lowDB knowledge, which will not be expanded here:
import lowdb from 'lowdb'
import FileSync from 'lowdb/adapters/FileSync'
const initConfig = (configPath: string): lowdb.LowdbSync<any> => {
  const adapter = new FileSync(configPath, { // Adapter for lowdb to read the configuration file
    deserialize: (data: string): Function => {
      return (new Function(`return ${data}`))()
    }
  })
  const db = lowdb(adapter) // The exposed db object
  if (!db.has('picBed').value()) { // If picBed is not configured
    db.set('picBed', { // Generate a configuration where the default uploader is sm.ms
      current: 'smms'
    }).write()
  }
  if (!db.has('picgoPlugins').value()) { // Likewise for the plug-in list
    db.set('picgoPlugins', {}).write()
  }
  return db // Expose db for external use
}
The configPath passed in during PicGo's initialization can then be used to initialize and retrieve the configuration.
init () {
// ...
let db = initConfig(this.configPath)
this.config = db.read().value() // Store the contents of the configuration file in this.config
}
Reading configuration
Once the configuration is initialized, it is easy to get the configuration:
import { get } from 'lodash'
getConfig (name: string = ''): any {
  if (name) { // If the name of the configuration item is provided
    return get(this.config, name) // Return the value of that configuration item
  } else {
    return this.config // Otherwise return the full configuration
  }
}
The reason the lodash get method is used here is as follows.
For example, the configuration content looks like this:
{
"a": {
"b": true}}Copy the code
Normally we would get a.b like this:
let b = this.config.a.b
If a does not exist, then the statement above will throw an error. The lodash get method avoids this problem and makes the lookup easy:
let b = get(this.config, 'a.b')
If a does not exist, b will not cause an error — it will simply be undefined.
Writing configuration
With the groundwork above, writing is easy. Using the interface provided by lowdb, writing configuration looks like this:
const saveConfig = (configPath: string, config: any): void => {
const db = initConfig(configPath)
Object.keys(config).forEach((name: string) => {
db.read().set(name, config[name]).write()
})
}
We can use:
saveConfig(this.configPath, { a: { b: true } })
Or:
saveConfig(this.configPath, { 'a.b': true })
Either way, the following configuration is generated:
{
"a": {
"b": true}}Copy the code
As you can see, the latter is a bit cleaner. This is thanks to the set method lodash provides inside lowdb.
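For reference, this is just standard lodash set behaviour — a tiny sketch:
const { set } = require('lodash')

const config = {}
set(config, 'a.b', true) // creates the intermediate objects as needed
console.log(config)      // { a: { b: true } }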
We have now completed the configuration-file operations. PicGo-Core does not wrap all of these operations into a single class; they were implemented as small utility functions because they are small and scattered. Of course, that's not the point — the point is that once the configuration-file operations are implemented, your system and its plug-ins can benefit from them: the system can expose the configuration-file APIs to plug-ins for their use. Next we will improve the plug-in system step by step.
The plug-in handler class
I wasn't sure what to call this class, but I named it pluginHandler in the code, so let's call it the pluginHandler. This class has three main purposes:
- Install plug-ins through npm — install
- Uninstall plug-ins through npm — uninstall
- Update plug-ins through npm — update
Distributing plug-ins through npm is the solution most Node.js plug-in systems choose. After all, when you don't have your own plug-in store (as VSCode does), npm is a natural "plug-in store". Publishing on npm has many other benefits too: for Node.js users, plug-ins can be installed, updated, and uninstalled with essentially zero cost. This is what the pluginHandler class takes care of.
The implementation of pluginHandler comes from Feflow, thank you.
When we install an NPM module, it is very simple:
npm install xxxx --save
However, that installs it into the current project directory. PicGo introduces a configuration file, so we can install plug-ins directly in the directory where the configuration file lives — and if you ever want to uninstall PicGo, you can simply delete that directory. But it's too much to ask users to open the PicGo configuration-file directory every time they want to install a plug-in; it isn't elegant either.
By contrast, if we install picgo globally, then from any corner of the file system you only need picgo install xxx to install a PicGo plug-in, without locating the folder of the picgo configuration file — a much better user experience. You can compare this with the plug-in installation steps of vue-cli 3.
To achieve this effect, we need to call the NPM command in code. So how does Node.js implement command line invocation through code?
Here we can use cross-spawn to call command-line commands from code in a cross-platform way.
The spawn method is also available natively in Node.js (in child_process), but cross-spawn solves some cross-platform issues. Usage is the same.
const spawn = require('cross-spawn')
spawn('npm', ['install', '@vue/cli', '-g'])
As you can see, its arguments are passed in as an array.
Apart from the main commands install, update, and uninstall, the arguments for these plug-in operations are the same. So we pulled out an execCommand method to implement their common logic:
execCommand (cmd: string, modules: string[], where: string, proxy: string = ''): Promise<Result> {
  return new Promise((resolve: any, reject: any): void => {
    // spawn's command-line arguments are passed in as an array
    // Here we concatenate the command with the array of plug-ins to install
    // cmd refers to the command to execute, such as install / uninstall / update
    let args = [cmd].concat(modules).concat('--color=always').concat('--save')
    const npm = spawn('npm', args, { cwd: where }) // Execute npm and specify the working directory through cwd -- the folder where the configuration file lives
    let output = ''
    npm.stdout.on('data', (data: string) => {
      output += data // Collect the normal output
    }).pipe(process.stdout)
    npm.stderr.on('data', (data: string) => {
      output += data // Collect the error output
    }).pipe(process.stderr)
    npm.on('close', (code: number) => {
      if (!code) {
        resolve({ code: 0, data: output }) // If no error occurred, resolve with the normal log
      } else {
        reject({ code: code, data: output }) // If an error occurred, reject with the error log
      }
    })
  })
}
Most of the key parts are already commented in the code. There are a few caveats, though. Pay attention to this line:
const npm = spawn('npm', args, { cwd: where }) // Execute npm and specify the working directory through cwd -- the folder where the configuration file lives
{ cwd: where } — where is the value passed in from outside and indicates the directory in which the npm command will be executed. This is one of the key points of the plug-in handler class: you can install a PicGo plug-in from anywhere on the system without having to open the configuration-file directory.
Let's implement the install method; the other two follow by analogy.
async install (plugins: string[], proxy: string): Promise<void> {
  plugins = plugins.map((item: string) => 'picgo-plugin-' + item)
  const result = await this.execCommand('install', plugins, this.ctx.baseDir, proxy)
  if (!result.code) {
    this.ctx.log.success('Plug-in installed successfully')
    this.ctx.emit('installSuccess', {
      title: 'Plug-in installed successfully',
      body: plugins
    })
  } else {
    const err = `Plug-in installation failed, the failure code is ${result.code}, the error log is ${result.data}`
    this.ctx.log.error(err)
    this.ctx.emit('failed', {
      title: 'Plug-in installation failed',
      body: err
    })
  }
}
The key line is const result = await this.execCommand('install', plugins, this.ctx.baseDir, proxy) — the install runs in this.ctx.baseDir, the folder where the configuration file lives. Now that the plug-in is installed, how do we load it?
Plug-in loading class
As mentioned above, we install plug-ins in the directory where the configuration file lives. It's worth noting that, because of how npm works, if there is a package.json file in that directory, installing, updating, or uninstalling plug-ins will also modify package.json. So we can read package.json to find out what PicGo plug-ins exist in the current directory. This is also an important part of Hexo's plug-in loading mechanism.
Thanks to Hexo for the pluginLoader implementation ideas.
PicGo has a constraint on naming plug-ins (and this is how many plug-in systems choose to do it) that they must start with picgo-plugin-. This makes them easy for plug-in loading classes to recognize.
There's a small pitfall here. If there is no package.json in the directory where our configuration file lives, the plug-in install command will print an error message. We don't want the user to see this error, so when the plug-in loader class is initialized, we check whether the file exists and create one if it doesn't:
class PluginLoader {
ctx: PicGo
list: string[]
constructor (ctx: PicGo) {
this.ctx = ctx
this.list = [] // List of plug-ins
this.init()
}
init (): void {
const packagePath = path.join(this.ctx.baseDir, 'package.json')
if (!fs.existsSync(packagePath)) { // If it does not exist
const pkg = {
name: 'picgo-plugins',
description: 'picgo-plugins',
repository: 'https://github.com/Molunerfinn/PicGo-Core',
license: 'MIT'
}
fs.writeFileSync(packagePath, JSON.stringify(pkg), 'utf8') // Create the file
}
}
// ...
}
Now let’s implement the most critical load method. We need the following steps:
- First, find all the legitimate plug-ins through package.json
- Load each plug-in through require
- Determine whether a plug-in is disabled by maintaining the picgoPlugins configuration
- Register each non-disabled plug-in by executing the register method it exposes
import PicGo from '../core/PicGo'
import fs from 'fs-extra'
import path from 'path'
import resolve from 'resolve'
load (): void | boolean {
const packagePath = path.join(this.ctx.baseDir, 'package.json')
const pluginDir = path.join(this.ctx.baseDir, 'node_modules/')
// Thanks to hexo -> https://github.com/hexojs/hexo/blob/master/lib/hexo/load_plugins.js
if (!fs.existsSync(pluginDir)) { // If the plug-in folder does not exist, return false
return false
}
const json = fs.readJSONSync(packagePath) // Read package.json
const deps = Object.keys(json.dependencies || {})
const devDeps = Object.keys(json.devDependencies || {})
// 1. Get the list of plug-ins
const modules = deps.concat(devDeps).filter((name: string) => {
if (!/^picgo-plugin-|^@[^/]+\/picgo-plugin-/.test(name)) return false
const path = this.resolvePlugin(this.ctx, name) // Get the plug-in path
return fs.existsSync(path)
})
for (let i in modules) {
this.list.push(modules[i]) // Push the plugin into the plugin list
if (this.ctx.config.picgoPlugins[modules[i]] || this.ctx.config.picgoPlugins[modules[i]] === undefined) { // 3. Check whether the plug-in is disabled. If the plug-in is undefined, it is a newly installed plug-in
try {
this.getPlugin(modules[i]).register() // 4. Call the plug-in's 'register' method to register
const plugin = `picgoPlugins[${modules[i]}]`
this.ctx.saveConfig( // Set the plug-in to enabled --> change the value of a newly installed plug-in from undefined to true
  {
    [plugin]: true
  }
)
} catch (e) {
this.ctx.log.error(e)
this.ctx.emit('notification', {
title: `Plugin ${modules[i]} Load Error`,
body: e
})
}
}
}
}
resolvePlugin (ctx: PicGo, name: string): string { // Get the plug-in path
try {
return resolve.sync(name, { basedir: ctx.baseDir })
} catch (err) {
return path.join(ctx.baseDir, 'node_modules', name)
}
}
getPlugin (name: string): any { // Get the plug-in by its name
const pluginDir = path.join(this.ctx.baseDir, 'node_modules/')
return require(pluginDir + name)(this.ctx) // 2. Get the plug-in via require and pass in ctx
}
The load method is the most critical part of the entire plug-in system load. It may not be easy to understand just by looking at the steps and code. Let’s illustrate this with a specific plug-in example.
Suppose I write a plug-in for picgo-plugin-xxx. My code is as follows:
// The plug-in system passes in PicGo's ctx so that the plug-in can call the API PicGo exposes
// So we need a ctx parameter to receive it
module.exports = ctx => {
// The plug-in system calls this method to register the plug-in
  const register = () => {
ctx.helper.beforeTransformPlugins.register('xxx', {
      handle (ctx) { // The plug-in's handle method also receives ctx for convenient API access
console.log(ctx.output)
}
})
}
return {
register
}
}
Now we have an overview of how plug-ins run:
- First, the life cycle starts running
- When a certain stage is reached, say beforeTransform, the plug-ins for that stage -- the beforeTransformPlugins -- are fetched
- These plug-ins were registered through the ctx.helper.beforeTransformPlugins.register method and can be obtained through ctx.helper.beforeTransformPlugins.getList()
- Once the plug-ins are in hand, the handle method of each beforeTransformPlugin is called, with ctx passed in for the plug-in to use
Pay attention to step 3 above: when is ctx.helper.beforeTransformPlugins.register actually invoked? The pluginLoader calls the register method of each plug-in during the plug-in loading phase described in the previous section. Inside each plug-in's register method we wrote:
ctx.helper.beforeTransformPlugins.register('xxx', {
  handle (ctx) { // The plug-in's handle method also receives ctx for convenient API access
console.log(ctx.output)
}
})
At that moment, ctx.helper.beforeTransformPlugins.register is invoked.
As a result, all plug-ins for every life-cycle stage are pre-registered before the life cycle begins. So when the life cycle starts, all it needs to do is obtain the registered plug-ins through getList() and execute them.
This also explains a problem I once had when running Hexo to generate my blog. I had installed a few Hexo plug-ins, but for some reason they didn't take effect. It turned out they had not been written into the dependencies field of package.json, because they were not installed with --save. The first step Hexo takes when loading plug-ins is to get the list of valid plug-ins from package.json; if a plug-in is not in package.json, it will not take effect even if it sits in node_modules.
With plug-ins covered, let's talk about how to invoke and configure them from the command line.
Command-line classes
PicGo's command-line classes rely on two libraries: commander.js and inquirer.js. These two libraries are also popular choices for Node.js command-line applications. The former is responsible for parsing the command line and executing the corresponding commands; the latter is responsible for providing an interactive command-line interface.
For example, you can type:
picgo use uploader
Here use is the command and uploader is its argument; commander.js is responsible for parsing them and invoking the corresponding handler, and then inquirer.js steps in to interact with the user.
If you've used command-line tools such as vue-cli 3 or create-react-app, you'll be familiar with this kind of interaction.
First, we write a command-line operation class that exposes an API for other parts to register commands. You can refer to Commander here.
import PicGo from '../core/PicGo'
import program from 'commander'
import inquirer from 'inquirer'
import { Plugin } from '../utils/interfaces'
const pkg = require('../../package.json')
class Commander {
list: {
[propName: string]: Plugin
}
program: typeof program
inquirer: typeof inquirer
private ctx: PicGo
constructor (ctx: PicGo) {
this.list = {}
this.program = program
this.inquirer = inquirer
this.ctx = ctx
}
// ...
}
export default Commander
Then we instantiate it in the PicGo-Core core class:
import Commander from '../lib/Commander'

class PicGo extends EventEmitter {
  // ...
  cmd: Commander
  constructor (configPath: string = '') {
    super()
    this.cmd = new Commander(this)
    // ...
  }
  // ...
}
This allows other parts to invoke commander.js through ctx.cmd.program and inquirer.js through ctx.cmd.inquirer.
There are many tutorials on the web for using these two libraries. For a quick example, let’s start with PicGo’s most basic function: uploading images from the command line.
Command registration
In keeping with the plug-in structure from before, we put the command registration inside a handle method as well.
import PicGo from '../../core/PicGo'
import path from 'path'
import fs from 'fs-extra'

export default {
  handle: (ctx: PicGo): void => {
    const cmd = ctx.cmd
    cmd.program // Here is a commander.js instance
      .command('upload') // Register the command: upload
      .description('upload, go go go') // Command description
      .arguments('[input...]') // Command arguments
      .alias('u') // Alias u for the command
      .action(async (input: string[]) => { // The function executed by the command
        const inputList = input // Get the input
          .map((item: string) => path.resolve(item))
          .filter((item: string) => {
            const exist = fs.existsSync(item) // Check whether the input path exists
            if (!exist) {
              ctx.log.warn(`${item} is not existed.`) // Warn if it does not exist
            }
            return exist
          })
        await ctx.upload(inputList) // Upload the images (call the lifecycle's start function)
      })
  }
}
Then we register the command somewhere like this:
import PicGo from '../../core/PicGo'
import upload from './upload'
// ...
export default (ctx: PicGo): void => {
ctx.cmd.register('upload', upload) // The registration logic here is the same as lifecyclePlugins.
// ...
}
At this point you might think you're done. In fact, we're missing the last step: an entry point that accepts the commands we type. Now that we've written the command and its registration, how do we actually use it on the command line?
Command-line usage
This involves the package.json fields bin and main. The main field points to the file you get when you do const xxx = require('xxx'); the bin field points to the file that is executed when you type the command directly on the command line after a global installation.
For example, the bin field for picgo-core looks like this:
// ...
"bin": {
"picgo": "./bin/picgo"
},
Once picgo is installed globally, you can use the picgo command — just as after installing @vue/cli you can use the vue command.
So let’s see what./bin/picgo does. The source code is here.
#!/usr/bin/env node
const path = require('path')
const minimist = require('minimist')
let argv = minimist(process.argv.slice(2)) // Parse the command line
let configPath = argv.c || argv.config || '' // Check whether configPath is provided
if (configPath !== true && configPath !== '') {
configPath = path.resolve(configPath)
} else {
configPath = ''
}
const PicGo = require('../dist/index')
const picgo = new PicGo(configPath) // instantiate picgo
picgo.registerCommands() // Register the command
try {
picgo.cmd.program.parse(process.argv) // Call the commander.js parsing command
} catch (e) {
picgo.log.error(e)
if (process.argv.includes('--debug')) {
Promise.reject(e)
}
}
The key is the line picgo.cmd.program.parse(process.argv), which calls commander.js to parse process.argv — the commands and arguments on the command line.
During development, ./bin/picgo upload can be used to invoke the command; in production, after the user has installed the package globally, picgo upload does the same.
Processing configuration items
As mentioned earlier, configuration items are an important part of a plug-in system, and different plug-in systems handle them differently. For example, Hexo provides _config.yml for user configuration and vue-cli 3 provides vue.config.js. PicGo also provides config.json for users to configure, but on top of that I wanted to provide a more convenient way for users to configure things directly from the command line, without opening the configuration file.
For example, we can use the command line to select the uploader (image hosting service) currently in use:
$ picgo use
? Use an uploader (Use arrow keys)
smms
❯ tcyun
weibo
github
qiniu
imgur
aliyun
(Move up and down to reveal more choices)
This kind of command-line interaction requires the aforementioned inquirer.js to help us achieve the effect.
inquirer.js is also simple to use: pass in prompts (an array of questions), and it returns the answers to those questions in the form of an object, usually named answer.
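A minimal sketch of that usage, with a hypothetical prompt rather than one of PicGo's:
const inquirer = require('inquirer')

const prompts = [ // the array of questions
  {
    type: 'input',
    name: 'clientId',
    message: 'Enter your clientId',
    default: ''
  }
]

inquirer.prompt(prompts).then(answer => {
  console.log(answer.clientId) // whatever the user typed
})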
PicGo simplifies this process: a plug-in only needs to provide a config method that returns a legitimate prompts array. PicGo then automatically calls inquirer.js to run it and writes the results to the configuration file.
For example, the config code of PicGo's built-in imgur uploader looks like this:
const config = (ctx: PicGo): PluginConfig[] => {
  let userConfig = ctx.getConfig('picBed.imgur')
  if (!userConfig) {
    userConfig = {}
  }
  const config = [
    {
      name: 'clientId',
      type: 'input',
      default: userConfig.clientId || '',
      required: true
    },
    {
      name: 'proxy',
      type: 'input',
      default: userConfig.proxy || '',
      required: false
    }
  ]
  return config // This config is a legal prompts array
}

export default {
  // ...
  config
}
Then we implement the code that calls it from the command line (source: portal):
The following code has been simplified
import PicGo from '../../core/PicGo'
import { PluginConfig } from '../../utils/interfaces'

// Process the uploader's config array and write the answers to the configuration file
const handleConfig = async (ctx: PicGo, prompts: PluginConfig[], name: string): Promise<void> => {
  const answer = await ctx.cmd.inquirer.prompt(prompts)
  let configName = `picBed.${name}`
  ctx.saveConfig({
    [configName]: answer
  })
}

export default {
  handle: (ctx: PicGo): void => {
    const cmd: typeof ctx.cmd = ctx.cmd
    cmd.program
      .command('set') // Register a set command
      .alias('config') // Alias: config
      .description('configure config of picgo')
      .action(async () => {
        try {
          let prompts = [ // The prompts array
            {
              type: 'list',
              name: 'uploader',
              choices: ctx.helper.uploader.getIdList(), // Get the uploader id list
              message: `Choose a(n) uploader`,
              default: ctx.config.picBed.uploader || ctx.config.picBed.current
            }
          ]
          let answer = await ctx.cmd.inquirer.prompt(prompts) // Wait for inquirer to handle the user input
          const item = ctx.helper.uploader.get(answer.uploader) // Get the uploader the user selected
          if (item.config) { // If the uploader provides a config method
            await handleConfig(ctx, item.config(ctx), answer.uploader) // Handle the prompts array exposed by the config method
          }
          ctx.log.success('Configure config successfully!')
        } catch (e) {
          ctx.log.error(e)
          if (process.argv.includes('--debug')) {
            Promise.reject(e)
          }
        }
      })
  }
}
The above handles the config method of the uploader; other kinds of plug-ins work the same way, so I won't go into further detail. This lets us configure things quickly from the command line, and the user experience improves a lot.
Publishing the plug-in system
With all that said, we have been writing the plug-in system locally. How do we publish it so that others can install and use it? There are many articles about publishing modules to npm — see this article, for example. What I want to talk about here is how to publish a library that can be used both on the command line and via API calls in a Node.js project, such as const picgo = require('picgo').
CLI and API calls coexisting
This was actually touched on above. When publishing an npm library, we usually specify the entry file of the library in the main field of package.json, so users can use it in Node.js projects via, for example, const picgo = require('picgo').
If we want to register a command after the library is installed, we can specify the corresponding entry file for the command in the bin field. Such as:
// ...
"bin": {
"picgo": "./bin/picgo"
},
Copy the code
This registers a command called picgo in the system after a global installation.
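Putting the two fields side by side, a hypothetical package.json for such a library might look like this (a sketch, not PicGo-Core's actual file):
{
  "name": "picgo",
  "main": "dist/index.js",
  "bin": {
    "picgo": "./bin/picgo"
  }
}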
Of course, the entry files for bin and main are usually different. The bin entry file must parse the command line, so we usually use a command-line parsing library such as minimist or commander.js to parse the command-line arguments.
Summary
At this point, the key parts of a CLI plug-in system are basically in place. In Electron, we can then use this plug-in system in the Main process and build the application's plug-in system on top of the API it exposes. The next article will detail how to integrate the CLI plug-in system into Electron, implement the GUI plug-in system, and add some extra mechanisms to make the plug-in system on the GUI side more flexible and powerful.
This article collects a lot of the problems and pitfalls I encountered while developing PicGo. Behind a few simple sentences in the article there may be countless rounds of reference-reading and debugging on my part. I hope it has given you some insight into electron-vue development. If PicGo or PicGo-Core is helpful to you, you're welcome to support the projects; and if you like this series, please follow my blog and the rest of the articles.
Note: unless otherwise specified, the pictures in this article are my own work; please contact me by private message before reusing them.
References
Thanks for these high quality articles:
- Developing a Command Line Interface (CLI) with Node.js
- The practice of writing CLI in Node.js
- Node.js module mechanism
- Front-end plug-in system design and implementation
- Hexo plugin mechanism analysis
- How to implement a simple plug-in extension
- Publish and maintain TypeScript modules using NPM
- Examples of typescript NPM packages
- Publish the npm package through Travis CI
- Dynamic load module in plugin from local project node_modules folder
- Follow the old driver around the Node command line
- And all the great articles I didn’t get to record, thank you!