Author: Zhou Quan

The social Rubik's Cube platform is JD.com's SNS campaign-building platform. It ships with many built-in templates, and each template has a template JSON that generates a form; once operations staff and merchants fill in this form, a campaign page is generated. The template JSON is standard structured data containing fields such as name, type, control type, validator, and default value. In the past this JSON was written by hand, which was inefficient and error prone. Because structured data lends itself to editing in a GUI, we built an editor on Electron, modeled on the GitHub Desktop client architecture, that generates the JSON from forms. This article records the lessons learned while developing the Electron editor and the points worth borrowing from the GitHub Desktop client code.

I. About Electron

Electron is an open source library developed by GitHub for building cross-platform desktop applications using HTML, CSS, and JavaScript. Electron does this by combining Chromium and Node.js into a single runtime, and apps can be packaged for Mac, Windows, and Linux.

That's the official introduction to Electron. On top of it, we can develop desktop applications with the front-end stack we already know. In Electron, the process that runs the script defined in package.json's main field is called the main process (hereinafter "main"). The main process displays the user interface by creating web pages, and each of these pages runs in its own renderer process. An Electron application always has exactly one main process. Main creates the application and its browser windows; it is a full Node.js process and has no access to the DOM, BOM, or other browser interfaces. A renderer process runs inside a browser window created by main and has access to the DOM, BOM, and Node.js APIs. The two kinds of processes communicate through the IPC interfaces Electron provides. A minimal main-process sketch follows.
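
To make this concrete, here is a minimal sketch of a main-process entry in the spirit of the setup described here; the window size, URL, and file layout are illustrative assumptions rather than the project's actual code.

// main/index.ts (illustrative)
import { app, BrowserWindow } from 'electron'

let mainWindow: BrowserWindow | null = null

function createWindow() {
  // The main process creates the browser window; the page it loads runs in a renderer process.
  mainWindow = new BrowserWindow({ width: 1000, height: 800 })

  // During development this points at webpack-dev-server; in production, at the bundled file.
  mainWindow.loadURL('http://localhost:8000')

  mainWindow.on('closed', () => {
    mainWindow = null
  })
}

app.on('ready', createWindow)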

II. Setting up the development environment

As we have seen, an Electron app is split into two kinds of processes, main and renderer, so the development environment cannot be set up like an ordinary front-end application with a single webpack configuration. We also want to:

  1. Start the development environment with one command
  2. Package with one command
  3. Release with one command

To achieve this we need two webpack configuration files.

One for the development environment – webpack.dev.ts.

// webpack.dev.ts
// `merge`, `path`, `base` (the shared main/renderer configs) and `config`
// (the shared development options) are imported or defined earlier in the file.
const mainConfig = merge({}, base.mainConfig, config, {
  watch: true
})

const rendererConfig = merge({}, base.rendererConfig, config, {
  module: {
    rules: [
      {
        test: /\.css$/,
        use: ['style-loader', 'css-loader'],
      },
      {
        test: /\.styl$/,
        use: ['style-loader', 'css-loader', 'stylus-loader'],
      }
    ]
  },
  devServer: {
    contentBase: path.join(__dirname, base.outputDir),
    port: 8000,
    hot: true,
    inline: true,
    historyApiFallback: true,
    writeToDisk: true
  }
})

module.exports = [rendererConfig, mainConfig]

The other is for production – webpack.prod.ts.

// webpack.prod.ts
// `merge`, `base`, MiniCssExtractPlugin and BundleAnalyzerPlugin are imported earlier in the file.
const config: webpack.Configuration = {
  mode: 'production',
  devtool: 'source-map',
}

const mainConfig = merge({}, base.mainConfig, config)

const rendererConfig = merge({}, base.rendererConfig, config, {
  module: {
    rules: [
      {
        test: /\.css$/,
        use: [MiniCssExtractPlugin.loader, 'css-loader'],
      },
      {
        test: /\.styl$/,
        use: [MiniCssExtractPlugin.loader, 'css-loader', 'stylus-loader'],
      }
    ]
  },
  plugins: [
    new MiniCssExtractPlugin({ filename: 'renderer.css' }),
    new BundleAnalyzerPlugin({
      analyzerMode: 'static',
      openAnalyzer: false,
      reportFilename: 'renderer.report.html',
    })
  ]
})

module.exports = [mainConfig, rendererConfig]

Following Desktop, the webpack configuration files are written in TypeScript. With the webpack.Configuration interface, the editor can autocomplete the configuration options. For specifics, see the webpack documentation (webpack.js.org/configurati…).

Each configuration file exports an array of configuration objects for main and Renderer.
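
For reference, here is a hedged sketch of what the shared webpack.base.ts might contain; the file name, entry paths, and outputDir value are assumptions inferred from how base.mainConfig, base.rendererConfig, and base.outputDir are used above, while the electron-main and electron-renderer targets are the essential parts.

// webpack.base.ts (illustrative)
import * as path from 'path'
import * as webpack from 'webpack'

export const outputDir = 'dest'

export const mainConfig: webpack.Configuration = {
  // 'electron-main' gives the bundle access to Node.js and Electron main-process built-ins.
  target: 'electron-main',
  entry: { main: path.resolve(__dirname, '../src/main/index.ts') },
  output: {
    path: path.resolve(__dirname, '..', outputDir),
    filename: '[name].js',
  },
  // loaders, resolve.extensions, etc. omitted for brevity
}

export const rendererConfig: webpack.Configuration = {
  // 'electron-renderer' allows Node.js APIs inside the browser-window bundle.
  target: 'electron-renderer',
  entry: { renderer: path.resolve(__dirname, '../src/renderer/index.tsx') },
  output: {
    path: path.resolve(__dirname, '..', outputDir),
    filename: '[name].js',
  },
}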

webpack-dev-server provides hot reloading for the renderer, while the main bundle is rebuilt with webpack's watch mode.

{
  "compile:dev": "webpack-dev-server --config scripts/webpack.dev.ts"
}

Nodemon watches the compiled output of main; whenever it detects a change, it runs electron again and restarts the application, which indirectly gives the main process live reload.

Nodemon is a utility that will monitor for any changes in your source and automatically restart your server.

{
  "app": "electron .",
  "app:watch": "nodemon --watch 'dest/main.js' --exec npm run app"
}

This gives us one-command startup of the development environment, with code changes watched and the application restarted automatically.

Tip: the open source community has an even better option, electron-webpack, which provides HMR for both the renderer and main processes.

For production, webpack compiles main and renderer in sequence, and electron-builder then packages the output, which gives us one-command packaging. A minimal electron-builder configuration sketch is shown below.
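
The following is a minimal sketch of an electron-builder configuration in package.json; the appId, product name, and file globs are placeholders rather than the project's real values.

{
  "build": {
    "appId": "com.example.cube-editor",
    "productName": "CubeEditor",
    "files": ["dest/**/*", "package.json"],
    "directories": {
      "output": "dist"
    },
    "mac": {
      "target": "dmg"
    },
    "win": {
      "target": "nsis"
    }
  }
}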

Due to the lack of supporting toolchain, one-click publishing is not possible, so releases have to be uploaded and published manually (more on this later).

Here are the full scripts.

{
  "scripts": {
    "start": "run-p -c compile:dev typecheck:watch app:watch",
    "dist": "npm run compile:prod && electron-builder build --win --mac",
    "compile:dev": "webpack-dev-server --config scripts/webpack.dev.ts",
    "compile:prod": "npm run clean && webpack --config scripts/webpack.prod.ts",
    "app": "electron .",
    "app:watch": "nodemon --watch 'dest/main.js' --exec npm run app",
    "clean": "rimraf dest dist",
    "typecheck": "tsc --noEmit",
    "typecheck:watch": "tsc --noEmit --watch",
    "lint": "eslint src --ext .ts,.js --fix",
    "release:patch": "standard-version --release-as patch && git push --follow-tags origin master && npm run dist",
    "release:minor": "standard-version --release-as minor && git push --follow-tags origin master && npm run dist",
    "release:major": "standard-version --release-as major && git push --follow-tags origin master && npm run dist",
    "repush": "git push --follow-tags origin master && npm run dist"
  }
}

III. Directory structure

1. Project directory structure

src
├── lib
│   ├── cube
│   ├── databases
│   ├── environment
│   ├── files
│   ├── local-storage
│   ├── log
│   ├── shell
│   ├── stores
│   ├── update
│   ├── validator
│   └── watcher
├── main
│   ├── app-window.ts
│   ├── event-bus.ts
│   ├── index.ts
│   ├── keyboard
│   └── menu
├── models
│   ├── popup.ts
│   └── project.ts
└── renderer
    ├── App.tsx
    ├── assets
    ├── index.html
    ├── index.tsx
    ├── pages
    └── types

The directory structure mimics Desktop. The main directory contains main-process code: the application entry, window creation, menus, keyboard shortcuts, and so on. The renderer directory holds the code for the entire UI rendering layer. The lib directory contains business logic that is unrelated to the UI and not strongly tied to main. models holds the domain models.

2. CSS conventions

The React code does not use CSS Modules. Instead, modularity comes from a namespacing convention such as BEM, which makes it easier to override styles.

For file organization, each React component gets its own style file. When we want to change a component's style during a refactor, we only need to find and edit the corresponding style file, which makes refactoring more efficient.

stylesheets
├── common.styl
├── activities.styl
└── components
    ├── editor.styl
    ├── empty-guide.styl
    ├── find-in-page.styl
    ├── sidebar.styl
    └── source.styl
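
As an illustration of the BEM convention described above, here is a hedged sketch of a component whose class names would map onto a file like components/editor.styl; the component and class names are made up for this example.

import * as React from 'react'

// "editor" is the block, "editor__toolbar" / "editor__content" are elements,
// and "editor__toolbar--disabled" is a modifier.
export const Editor: React.FC<{ disabled?: boolean }> = ({ disabled }) => (
  <div className="editor">
    <div className={`editor__toolbar${disabled ? ' editor__toolbar--disabled' : ''}`}>
      {/* toolbar buttons */}
    </div>
    <div className="editor__content">{/* form editing area */}</div>
  </div>
)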

IV. IPC communication

InterProcess Communication (IPC) refers to the dissemination or exchange of information between different processes.

Electron’s main and renderer processes communicate through ipcMain and ipcRenderer provided by Electron.

1. The main process

To send a message to a renderer window from main, use window.webContents.send. To listen for messages from the renderer on the main side, use ipcMain.on.

// In the main process.
const { ipcMain } = require('electron')
ipcMain.on('asynchronous-message', (event, arg) => {
  console.log(arg) // prints "ping"
  event.reply('asynchronous-reply', 'pong')
})

ipcMain.on('synchronous-message', (event, arg) => {
  console.log(arg) // prints "ping"
  event.returnValue = 'pong'
})

2. The renderer process

On the main side, a synchronous message is answered by setting event.returnValue, and the renderer reads the return value directly from the sendSync call. Asynchronous messages are answered with event.reply, and the renderer listens on the reply channel for the result.

// In the renderer process.
const { ipcRenderer } = require('electron')
console.log(ipcRenderer.sendSync('synchronous-message', 'ping')) // prints "pong"

ipcRenderer.on('asynchronous-reply', (event, arg) => {
  console.log(arg) // prints "pong"
})
ipcRenderer.send('asynchronous-message', 'ping')

As shown above, the renderer sends asynchronous messages to the main process with ipcRenderer.send and synchronous messages with ipcRenderer.sendSync.

V. Data persistence and state management

1. Complex data persistence

There are many options for data persistence: JSON-file-based stores such as electron-store, or, for more complex applications, lowdb, NeDB, SQLite, and so on.

Initially I used electron-store, along with a stubborn belief that disk reads and writes should only happen in the main process and the renderer process should only render the interface. So the original design was that whenever the renderer rendered or updated data, the request had to travel over IPC to the main process, which performed the actual disk read or write. Besides the normal read/write path we also had to handle disk I/O failures, which made the data flow convoluted, and we had to maintain ID generation ourselves. After borrowing ideas from the Desktop code, the persistence layer was refactored around Dexie, a wrapper over the browser's standard IndexedDB database. Its README explains that it solves three main problems with IndexedDB:

  1. Ambiguous error handling
  2. Poor queries
  3. Code complexity

import Dexie from 'dexie';

export interface IDatabaseProject {
  id?: number;
  name: string;
  filePath: string;
}

export class ProjectsDatabase extends Dexie {
  public projects: Dexie.Table<IDatabaseProject, number>;

  constructor() {
    super('ProjectsDatabase');

    this.version(1).stores({
      projects: '++id,&name,&filePath'
    });

    this.projects = this.table('projects');
  }
}

We inherit from Dexie to implement our own database class, declaring the database version, the table schemas, and so on in the constructor. For details, see the official Dexie documentation.
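
A hedged usage sketch built on the ProjectsDatabase class above; the import path and field values are placeholders.

import { ProjectsDatabase } from './databases/projects-database' // assumed path

const db = new ProjectsDatabase()

async function example() {
  // '++id' in the schema auto-increments the primary key.
  const id = await db.projects.add({ name: 'demo', filePath: '/tmp/demo.json' })

  // '&name' and '&filePath' are unique indexes, so lookups are straightforward.
  const byName = await db.projects.where('name').equals('demo').first()

  const all = await db.projects.toArray()
  console.log(id, byName, all.length)
}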

2. Simple data persistence

Some UI-state flags (for example, whether a popup has been shown) are stored in localStorage. While reading the Desktop source, I found that it wraps the get/set of number and boolean values in small helpers, which are very convenient to use. Here is the boolean handling.

export function getBoolean(key: string): boolean | undefined
export function getBoolean(key: string, defaultValue: boolean): boolean
export function getBoolean(
  key: string,
  defaultValue?: boolean
): boolean | undefined {
  const value = localStorage.getItem(key)
  if (value === null) {
    return defaultValue
  }
  if (value === '1' || value === 'true') {
    return true
  }
  if (value === '0' || value === 'false') {
    return false
  }
  return defaultValue
}

export function setBoolean(key: string, value: boolean) {
  localStorage.setItem(key, value ? '1' : '0')
}

The original helpers can be found in the GitHub Desktop source.
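
The boolean helpers are shown above, and numeric values are handled in the same spirit; here is a hedged sketch of what the number helpers might look like (the actual Desktop implementation may differ).

export function getNumber(key: string, defaultValue?: number): number | undefined {
  const value = localStorage.getItem(key)
  if (value === null) {
    return defaultValue
  }
  const parsed = parseInt(value, 10)
  // Fall back to the default when the stored value is not a valid number.
  return isNaN(parsed) ? defaultValue : parsed
}

export function setNumber(key: string, value: number) {
  localStorage.setItem(key, value.toString())
}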

VI. Feature implementation

1. Synchronize the disk and editor versions in real time

In general, what we edit in the editor is a copy of the disk file that has been read into memory. If the file on disk changes (for example, Git switches branches and rewrites files, or the file is deleted or renamed), the in-memory version and the disk version fall out of sync: the disk version gets ahead of the in-memory one, which can lead to conflicts. The solution is simple: use fs.watch/fs.watchFile to watch the currently edited file and, whenever it changes, re-read the disk version to bring the in-memory version back in sync. But the fs.watch API is not ready for engineering use out of the box; it has many compatibility issues and some bugs. For instance:

Node.js fs.watch:

  • Doesn’t report filenames on MacOS.
  • Doesn’t report events at all when using editors like Sublime on MacOS.
  • Often reports events twice.
  • Emits most changes as rename.
  • Does not provide an easy way to recursively watch file trees.

Node.js fs.watchFile:

  • Almost as bad at event handling.
  • Also does not provide any recursive watching.
  • Results in high CPU utilization.

The points above come from chokidar, a Node module that provides out-of-the-box file watching. We simply listen for events such as add, unlink, and change and read the latest content back into the editor to keep the disk and editor versions in sync. A minimal sketch follows.
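
A minimal sketch of the watcher, assuming chokidar and a hypothetical updateEditorContent callback that pushes the latest file content into the editor state.

import * as chokidar from 'chokidar'
import { promises as fs } from 'fs'

// Hypothetical callback supplied by the editor; `null` means the file disappeared.
declare function updateEditorContent(filePath: string, content: string | null): void

export function watchProjectFile(filePath: string) {
  const watcher = chokidar.watch(filePath, { ignoreInitial: true })

  watcher.on('change', async changedPath => {
    // Disk is ahead of memory: re-read the file and refresh the in-memory copy.
    updateEditorContent(changedPath, await fs.readFile(changedPath, 'utf8'))
  })

  watcher.on('unlink', removedPath => {
    // The file was deleted or renamed away; let the UI decide what to show.
    updateEditorContent(removedPath, null)
  })

  // Call watcher.close() when the file is closed in the editor.
  return watcher
}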

2. Context-Menu

Desktop implements its contextMenu (right-click menu) on top of raw IPC, which takes a few detours.

The first thing to know is that the Menu class is only available in the main process.

Desktop binds an onContextMenu handler to the JSX.Element that needs a context menu. The handler builds an array of menu-item descriptions, binds a callback to each item, sends the array to main over IPC, and stores the array in a global object on the renderer side. The main process then constructs the real MenuItem instances; each item's click handler records the item's index and sends it back to the renderer via event.sender.send. The renderer receives the index, looks up the corresponding item in the previously saved global object, and executes its bound callback.

onContextMenu => showContextualMenu (ipcRenderer.send) => ipcMain => menu.popup() => MenuItem.onClick(index) => event.sender.send(index) => item.action()

In my application I use the remote module instead, which hides the complex IPC round trips described above: the renderer process builds and shows the menu and binds and fires the click handlers itself.

import { remote } from 'electron';
const { MenuItem, dialog, getCurrentWindow, Menu } = remote;

const onContextMenu = (project: Project) => {
  const menu = new Menu();

  const menus = [
    new MenuItem({
      label: 'Open in Terminal',
      visible: __DARWIN__,
      click() {
        const accessor = new FileAccessor(project.filePath);
        accessor.openInTerminal();
      },
    }),
    new MenuItem({
      label: 'Open in VSCode',
      click() {
        const accessor = new FileAccessor(project.filePath);
        accessor.openInVscode();
      },
    }),
  ];

  // Passing menu.append directly to forEach would lose its `this` binding.
  menus.forEach(item => menu.append(item));
  menu.popup({ window: getCurrentWindow() });
};

VII. Logging

Good logging is very important in both development and production. It helps development by recording the data changes behind UI state transitions and which branches the code takes.

Following Desktop, the logging is built on the winston library.

Both the main and renderer processes expose a global log object with the same interface: debug, info, warn, and error. The renderer implementation is a thin wrapper over the corresponding methods on window.console; whatever is printed to the browser console is also forwarded over IPC to the main process, which manages the logs.

The main process receives log entries from the renderer as well as its own. Two transports are configured: winston.transports.Console prints log messages to the terminal, and winston.transports.DailyRotateFile writes them to disk, with retention capped at 14 days.

A log installer module is imported when the main and renderer processes start. Because the log methods are exposed globally, they only need to be installed once at process startup; in a TypeScript project you also need a global type declaration for the log object. A minimal sketch of the main-process logger setup is shown below.
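
A minimal sketch of the main-process logger setup, assuming winston plus winston-daily-rotate-file; the file names, log directory, and the shape of the global log object are assumptions, not Desktop's exact implementation.

import * as path from 'path';
import * as winston from 'winston';
import DailyRotateFile from 'winston-daily-rotate-file';
import { app } from 'electron';

const logger = winston.createLogger({
  level: 'debug',
  format: winston.format.simple(),
  transports: [
    // Print log lines to the terminal console.
    new winston.transports.Console(),
    // One file per day, keeping at most 14 days of logs on disk.
    new DailyRotateFile({
      dirname: path.join(app.getPath('userData'), 'logs'),
      filename: 'editor-%DATE%.log',
      maxFiles: '14d',
    }),
  ],
});

// Expose the four methods globally (a matching global type declaration is needed in TS).
// Entries forwarded from the renderer over IPC are written through this same logger.
(global as any).log = {
  debug: (...args: unknown[]) => logger.debug(args.map(String).join(' ')),
  info: (...args: unknown[]) => logger.info(args.map(String).join(' ')),
  warn: (...args: unknown[]) => logger.warn(args.map(String).join(' ')),
  error: (...args: unknown[]) => logger.error(args.map(String).join(' ')),
};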

VIII. Packaging, publishing, and updating

The open source world already has a very mature packaging and distribution tool: electron-builder. It integrates multi-platform packaging, code signing, automatic updates, and publishing to GitHub and other platforms.

Since our tool is only used on the intranet, it cannot be published to GitHub, and without an Apple developer account it cannot be signed. So we can only package locally with electron-builder, upload the artifacts by hand, and have users download the installer and install it over the old version manually. Automatic updates like VS Code's cannot be implemented.

Since automatic updates are not possible, how do we tell users to download the latest installation package after a new version ships? On the client side, we can check for a new version on every application launch, and also provide a menu entry so users can trigger the check manually. After fetching the latest version from the server, we use semver to compare it with the local version; if the local version is older, we show a notification prompting the user to download the update.

How do you implement this function under limited conditions?

Three pieces are needed for this feature: a readable file on the server that identifies the latest version; cloud storage hosting each version's installation package; and update logic in the application code.

A readable file identifying the latest version: package.json is updated on every release, so we simply upload it to a CDN that requires no authentication and request that file during the update check.

Cloud storage hosting each version's installation package: a cloud drive works; its share link is copied manually into the notes of the corresponding GitLab tag.

Update logic in application code:

import got from 'got';
import semver from 'semver';
import { app, remote, BrowserWindow } from 'electron';

const realApp = app || remote.app;
const currentVersion = realApp.getVersion();

export async function checkForUpdates(window: BrowserWindow, silent: boolean = false) {
  const url = `http://yourcdn/package.json?t=${Date.now()}`;
  try {
    const response = await got(url);
    const pkg = JSON.parse(response.body);
    log.debug('Check for updates, cloud version:', pkg.version);
    log.debug('Current version:', currentVersion);
    if (semver.lt(currentVersion, pkg.version)) {
      window.webContents.send('update-available', pkg.version);
    } else {
      window.webContents.send('update-not-available', silent);
    }
  } catch (error) {
    window.webContents.send('update-error', silent);
  }
}

This method is called when the application's main process starts and when the user clicks the check-for-updates menu item, and it tells the UI process to show a notification. We want the startup check to stay silent when it fails or when there is no update, so as not to disturb the user, which is why a silent flag is carried through the IPC pipeline. When an update is detected, the user is notified; clicking the notification jumps to the GitLab tags page of the latest version and guides the user to download and install it manually. A sketch of the renderer-side handling follows.
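
A sketch of the renderer-side handling, using the channel names from checkForUpdates above; showUpdateNotification and the GitLab URL are placeholders.

import { ipcRenderer, shell } from 'electron'

// Hypothetical helper that shows an in-app notification with an optional click action.
declare function showUpdateNotification(message: string, onClick?: () => void): void

ipcRenderer.on('update-available', (_event, version: string) => {
  showUpdateNotification(`Version ${version} is available, click to download`, () => {
    // Jump to the GitLab tags page where the share link to the installer lives.
    shell.openExternal('http://your-gitlab/your-repo/-/tags')
  })
})

ipcRenderer.on('update-not-available', (_event, silent: boolean) => {
  if (!silent) {
    showUpdateNotification('You are already on the latest version')
  }
})

ipcRenderer.on('update-error', (_event, silent: boolean) => {
  if (!silent) {
    showUpdateNotification('Update check failed, please try again later')
  }
})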

IX. Miscellaneous

1. devtools

The renderer side of an Electron application can be debugged with Chrome DevTools. DevTools extensions for React, MobX, and similar libraries can be installed with electron-devtools-installer: after the application window is created, call it to install the MobX and React extensions.

const { default: installExtension, MOBX_DEVTOOLS, REACT_DEVELOPER_TOOLS } = require('electron-devtools-installer');
const extensions = [REACT_DEVELOPER_TOOLS, MOBX_DEVTOOLS];
for (const extension of extensions) {
  try {
    installExtension(extension);
  } catch (e) {
    // log.error(e);
  }
}

2. Remembering the window size

A common requirement for desktop applications is that, after being closed and reopened, the window comes back at its last size and position. The implementation is fairly simple: listen for the window's resize event, record the window geometry in the current user's application data folder (app.getPath('appData')), and read those settings back the next time the window is created on launch. The open source community already has a library that encapsulates this: electron-window-state.

const windowStateKeeper = require('electron-window-state');
let win;

app.on('ready', function () {
  let mainWindowState = windowStateKeeper({
    defaultWidth: 1000,
    defaultHeight: 800
  });

  win = new BrowserWindow({
    'x': mainWindowState.x,
    'y': mainWindowState.y,
    'width': mainWindowState.width,
    'height': mainWindowState.height
  });

  mainWindowState.manage(win);
});

Just provide the default window size, and the electron-window-state will take care of the rest.


If you found this article valuable, please like it and follow our official account WecTeam, where we publish quality articles every week.