I recently wrapped up an Electron project, and with its launch I'd like to summarize a few lessons here. This post mainly records the pitfalls I hit; the overall Electron build process is already covered well by plenty of articles, so there is no need to repeat it.

It mainly covers:

  1. Pain points regarding communication between the main and renderer processes
  2. Choosing a scheme for persisting data
  3. Access permission to the packaged file directory

Pain points regarding communication between the main and renderer processes

First, take a look at the official documentation:

Electron provides several modes of communication between the main process and renderer processes. For example, the ipcRenderer and ipcMain modules can be used to send messages, and the remote module can be used to communicate in an RPC style.

Two styles of communication are mentioned here. The former passes data through event registration and publishing, while the latter hides the communication details and shares data as if you were calling an ordinary object.

For example, to fetch list data with the event-based approach, the code looks like this:

```javascript
// In the main process
const { ipcMain } = require('electron')

ipcMain.on('getList', (event, arg) => {
  service.getList(arg)
  event.sender.send('getList-done', 'list data')
})
```
```javascript
// In the renderer process
const { ipcRenderer } = require('electron')

ipcRenderer.on('getList-done', (event, arg) => {
  console.log(arg) // prints 'list data'
})

ipcRenderer.send('getList', 'list args')

// Remove the listener once the component is destroyed
ipcRenderer.removeAllListeners('getList-done')
// or use ipcRenderer.removeListener to remove a specific listener
```

With the remote module, the code looks something like this:

```javascript
// In the main process, hang the service on the app object
const { app } = require('electron')
app.service = new Service()
```

```javascript
// In the renderer process, call it directly through the remote module
const { remote } = require('electron')
const res = remote.app.service.getList()
```

As the comparison shows, the event-based approach produces a lot of boilerplate, while the remote module is very convenient. In practice, however, the seemingly elegant remote module exposed real problems: a getList query that took 10 ms in the main process took nearly 100 ms when fetched through the remote module.

This puzzled me for a while. Stepping through with the debugger, I found that every access to the remote object goes through the metaToValue method, including plain property accesses. On top of that, the renderer process stutters: according to the documentation, remote works by sending synchronous inter-process messages, so every call blocks until the reply arrives. In other words, if a query is slow, the interface freezes or even stops responding entirely.
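The effect of that synchronous blocking can be sketched without Electron at all. In the snippet below, the busy-wait is only a stand-in for a blocking remote round trip (it is not how remote is implemented); the point is that while synchronous work runs, nothing else on the event loop, timers or rendering included, gets a chance to execute:

```javascript
// A busy-wait stands in for a slow synchronous IPC call:
// the thread is occupied, so queued timers cannot fire.
function syncQuery(ms) {
  const end = Date.now() + ms
  while (Date.now() < end) {} // block the thread for `ms` milliseconds
  return 'list data'
}

let timerFired = false
setTimeout(() => { timerFired = true }, 10)

const res = syncQuery(50)
// The 10 ms timer never ran during the 50 ms blocking call
console.log(timerFired) // → false
console.log(res) // → 'list data'
```

In a renderer process, that stalled timer is your UI: a 100 ms synchronous query is 100 ms of frozen interface.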

My understanding here is limited; for a detailed analysis of the remote module, refer to the article by Huangshan.

Because of this inherent flaw in the remote module, inter-process communication had to fall back on event registration. But a real project talks to the main process very frequently (for example, to access the local database), and registering a pair of events for every method is far too tedious. Here is the wrapper I settled on:

```javascript
// In the main process: one generic channel dispatches to any service method
ipcMain.on("queryData", async function (event, args) {
  const { serviceName, serviceMethod, serviceArg, key } = args;
  const res = await app[serviceName][serviceMethod](...serviceArg);
  // Echo the key back so the renderer can match the reply to its request
  event.sender.send("queryFinish", {
    key,
    data: res
  });
});
```
```typescript
type ipcParam = {
  serviceName: string;
  serviceMethod: string;
  serviceArg: any | null;
};

export function ipcHelper(args: ipcParam) {
  // Generate a unique key for this request
  const key = getLongId();
  let listener: any = null;

  const bindEvent = (r: any) => {
    return async (event: any, arg: any) => {
      // The key uniquely ties a reply to the request that sent it
      if (arg.key === key) {
        r(arg.data);
        ipcRenderer.removeListener("queryFinish", listener);
      }
    };
  };

  return new Promise((r, j) => {
    ipcRenderer.send("queryData", {
      ...args,
      key
    });
    listener = bindEvent(r);
    ipcRenderer.on("queryFinish", listener);
  });
}

// Usage: wrap a concrete service method...
export function getList(...params: any): Promise<any> {
  return ipcHelper({
    serviceName: "testService",
    serviceMethod: "getList",
    serviceArg: params
  });
}

// ...and call it like an ordinary async function
const res = await getList("parameters");
```

By wrapping the events in a Promise, the renderer can request local data as easily as making an Ajax call in an ordinary page.

Choosing a scheme for persisting data

Since a major focus of the project is working with local data, persistent local storage became an issue. Our team weighed two options:

  1. indexedDB, the document database built into the browser
  2. sqlite, a mature embedded relational database

For the first phase of the project we decided to use indexedDB, for the following reasons:

  1. It is built into the browser, so there are no installation or packaging concerns
  2. Everything runs in the renderer process; no communication with the main process is needed
  3. As a document database it requires no extra SQL knowledge, so the barrier to entry is low

However, after a period of development and testing, the following shortcomings surfaced:

  1. Performance degrades noticeably once the data volume exceeds 100,000 records
  2. Multi-table (join) queries are limited and often require loading all the data into memory
  3. Debugging is difficult: the browser's built-in inspector can only page through the data, and without supporting tools the data cannot be queried quickly
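The second shortcoming is worth illustrating. Without SQL joins, the renderer ends up loading both object stores and stitching them together by hand, which a relational database would do in one `SELECT ... JOIN` inside the engine (the users/orders data below is invented for the example):

```javascript
// What limited cross-store queries force on you: an in-memory hash join.
const users = [
  { id: 10, name: 'Ann' },
  { id: 11, name: 'Bob' },
]
const orders = [
  { id: 1, userId: 10, total: 25 },
  { id: 2, userId: 11, total: 40 },
]

// Index one side by key, then map over the other
const byId = new Map(users.map(u => [u.id, u]))
const joined = orders.map(o => ({
  order: o.id,
  user: byId.get(o.userId).name,
  total: o.total,
}))
console.log(joined)
// → [ { order: 1, user: 'Ann', total: 25 }, { order: 2, user: 'Bob', total: 40 } ]
```

This is workable at small scale, but with 100,000+ records per store the "load everything first" step is exactly where the performance collapses.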

For the second phase of the project we switched to sqlite3, which has the following advantages:

  1. Excellent performance, supporting queries and inserts on the order of a million records
  2. A rich ecosystem with mature third-party tools (we used Navicat) that make it quick to locate problems

On top of indexedDB and sqlite3 we used Dexie and Knex respectively to help with data manipulation. If I find time I'd like to cover that part in a separate article: Chinese-language material on it is scarce, and I hit plenty of pitfalls there during the project.

Access permission to the packaged file directory

Since sqlite3 is used, database files have to be created and accessed. Everything worked fine in development mode, but after packaging I had trouble locating the right path.

```javascript
// Create files using __dirname as the root path
path.join(__dirname)
```

__dirname is the absolute path of the currently executing js file, which after packaging lives inside the .asar archive, e.g. resources/app.asar/dist/electron. As the asar documentation explains:

The contents of an archive cannot be changed, so methods in the Node APIs that modify the file do not work when using the ASAR archive.

There are two ways to solve this problem

  1. Point the file output directory to a level at or above the .asar archive, for example `path.join(__dirname, '../../')`
  2. Configure extraResources under build in package.json to copy the files you need to access into an outer directory:

```json
"extraResources": {
  "from": "template",
  "to": "temp"
}
```

Conclusion

That's about it. Comments and discussion are welcome.