preface

GithubBlog:github.com/Nealyang/Pe…

The background is as follows:

  • PmCli source scaffolding: https://mp.weixin.qq.com/s/JRF4GjYqXw1f6jGqcYofnQ

That covered the scaffolding itself, as well as the encapsulation of functions such as adding pages and modules.

After all, one more layer of specification means one more layer of constraint. The essence of the architecture is to let developers focus on business development and not worry about anything else: for example, the module configuration initialized by the scaffolding above, asynchronous loading, and even the business hooks defined and reserved in the initialization schema.

For the reasons above, I want to provide a set of visual operations (create a project, choose dependencies, add a page, select the required materials, configure material properties, and so on). In short, for source-code development the user only needs to write the corresponding business module components, without caring how the architecture organizes and distributes modules and state; apart from coding the business modules, everything else is visual.

Since 100% of my colleagues use vscode as their bread and butter, a vscode extension is naturally my first choice. The plan is to provide a series of plug-ins for creating projects, adding pages, configuring modules, configuring pages, adding modules, and so on. I will publish a summary as each stage progresses. Ahem, yes, this will kick off a source-code workbench.

Up to now, about 90% of the project scaffolding has been built, and this is the first-stage summary.

Results showcase

The extensions folder holds the vscode plug-ins, the packages folder holds the common packages, scripts contains the release, build, and development scripts, and the rest is project configuration.

Of course, this is not primarily a demonstration of product functionality, ga ga ~

package.json scripts


  "scripts": {
    "publish": "lerna list && publish:package"."publish-beta": "lerna list && npm run publish-beta:package"."start":"tnpm run start:packages && tnpm run start:extensions"."start:packages": "tnpm run setup:packages && tnpm run packages:watch"."start:extensions":"tnpm run extensions:link"."commit": "git-cz"."env": "node ./scripts/env.js"."packages:link": "lerna link"."packages:install": "rm -rf node_modules && rm -rf ./packages/*/node_modules && rm -rf ./packages/*/package-lock.json && SASS_BINARY_SITE=https://npm.taobao.org/mirrors/node-sass/ yarn install --registry=http://registry.npm.taobao.org"."packages:clean": "rm -rf ./packages/*/lib"."packages:watch": "ts-node ./scripts/watch.ts"."packages:build": "npm run packages:clean && ts-node ./scripts/build.ts"."setup:packages": "npm run packages:install && lerna clean --yes && npm run packages:build && npm run packages:link "."publish-beta:package": "ts-node ./scripts/publish-beta-package.ts"."publish:package": "ts-node ./scripts/publish-package.ts"."extensions:install": " rm -rf ./extensions/*/node_modules && rm -rf ./extensions/*/package-lock.json && rm -rf ./extensions/*/web/node_modules && rm -rf ./extensions/*/web/package-lock.json && ts-node ./scripts/extension-deps-install.ts"."extensions:link": "ts-node ./scripts/extension-link-package.ts"
  }
Copy the code

The scripts are not all added yet; publishing, for example, goes through npm run publish:package.

Selection of the architecture

For now, the plan is to package all of pmCli's functionality as plug-ins, and then replace the architecture configuration with visual operations during coding. So there will not be just one plug-in, but a set of extensions based on the action set of the source-code architecture. The plug-ins share a lot of similar functionality, for instance reading underlying files from gitlab, vscode WebView communication, basic AST encapsulation, and so on, so they necessarily depend on many packages. To manage development efficiently and in one place, the project has to be a lerna-based monorepo.

I will not go into the pitfalls of lerna here, mainly because I have only read most of the practical articles and official docs out there and lack much practice of my own (more research would not relieve much pain, so I did not want to spend the energy on it). The final monorepo is implemented on top of yarn workspaces. After comparing lerna link soft-linking with lerna's release flow, I wrote some scripts for packaging and publishing (beta and online) by referring to AppWorks.

Project workflow and coding constraints go through the usual configuration: husky, lint-staged, git-cz, ESLint, Prettier, and so on.

The code is written in TypeScript, so many configurations shared by extensions and packages can be extracted and placed in the project root directory (as shown in the project directory screenshot above).

practice

Initialization via lerna init and lerna create xxx will not be covered here; the result is a directory structure with a packages folder and a package.json.

The project architecture

The package structure

The above structural instructions are all in the picture

The script package

A scripts folder is placed in the root directory of the project, which contains scripts for release, development, and dependent installations.

getPackageInfos.ts

Used to get the publish-related information of each package under packages. shouldPublish compares the local version with the published version to decide whether the package needs to be published.

/*
 * @Author: 一凨
 * @Date: 2021-06-07 18:47:32
 * @Last Modified by: 一凨
 * @Last Modified time: 2021-06-07 19:12:28
 */
import { existsSync, readdirSync, readFileSync } from 'fs';
import { join } from 'path';
import { getLatestVersion } from 'ice-npm-utils';

const TARGET_DIRECTORY = join(__dirname, '../../packages');

// Define the structure of the information to be retrieved
export interface IPackageInfo {
  name: string;
  directory: string;
  localVersion: string;
  mainFile: string; // package.json main file
  shouldPublish: boolean;
}

// Check whether the package was built successfully
function checkBuildSuccess(directory: string, mainFile: string): boolean {
  return existsSync(join(directory, mainFile));
}

// Check whether the latest published version is the same as the local version
function checkVersionExists(pkg: string, version: string): Promise<boolean> {
  return getLatestVersion(pkg)
    .then((latestVersion) => version === latestVersion)
    .catch(() => false);
}

export async function getPackageInfos(): Promise<IPackageInfo[]> {
  const packageInfos: IPackageInfo[] = [];
  if (!existsSync(TARGET_DIRECTORY)) {
    console.log(`[ERROR] Directory ${TARGET_DIRECTORY} not exist!`);
  } else {
    // Get all package directories and go through their package.json
    const packageFolders: string[] = readdirSync(TARGET_DIRECTORY).filter((filename) => filename[0] !== '.');
    console.log('[PUBLISH] Start check with following packages:');
    await Promise.all(
      packageFolders.map(async (packageFolder) => {
        const directory = join(TARGET_DIRECTORY, packageFolder);
        const packageInfoPath = join(directory, 'package.json');

        // Process package info.
        if (existsSync(packageInfoPath)) {
          const packageInfo = JSON.parse(readFileSync(packageInfoPath, 'utf8'));
          const packageName = packageInfo.name || packageFolder;

          console.log(` - ${packageName}`);
          // Retrieve information from package.json
          try {
            packageInfos.push({
              name: packageName,
              directory,
              localVersion: packageInfo.version,
              mainFile: packageInfo.main,
              // If this localVersion has not been published yet, publish it
              shouldPublish: checkBuildSuccess(directory, packageInfo.main) && !(await checkVersionExists(packageName, packageInfo.version)),
            });
          } catch (e) {
            console.log(`[ERROR] get ${packageName} information failed: `, e);
          }
        } else {
          console.log(`[ERROR] ${packageFolder}'s package.json not found.`);
        }
      }),
    );
  }
  return packageInfos;
}

The explanation of the code is in the comments. At its core, it reads the package.json information of each package under packages and returns it in the format required for publishing.

publish-beta-package

/*
 * @Author: 一凨
 * @Date: 2021-06-07 18:45:51
 * @Last Modified by: 一凨
 * @Last Modified time: 2021-06-07 19:29:26
 */
import * as path from 'path';
import * as fs from 'fs-extra';
import { spawnSync } from 'child_process';
import { IPackageInfo, getPackageInfos } from './fn/getPackageInfos';

const BETA_REG = /([^-]+)-beta\.(\d+)/; // e.g. '1.0.0-beta.1'

interface IBetaPackageInfo extends IPackageInfo {
  betaVersion: string;
}

function setBetaVersionInfo(packageInfo: IPackageInfo): IBetaPackageInfo {
  const { name, localVersion } = packageInfo;

  let version = localVersion;

  if (!BETA_REG.test(localVersion)) {
    // If localVersion is not a beta version, bump it!
    let betaVersion = 1;
    // Get the dist-tags of the package
    const childProcess = spawnSync('npm', ['show', name, 'dist-tags', '--json'], {
      encoding: 'utf-8',
    });
    const distTags = JSON.parse(childProcess.stdout || '{}') || {};
    const matched = (distTags.beta || '').match(BETA_REG);

    // 1.0.0-beta.1 -> ['1.0.0-beta.1', '1.0.0', '1'] -> 1.0.0-beta.2
    if (matched && matched[1] === localVersion && matched[2]) {
      // Bump the beta version, +1
      betaVersion = Number(matched[2]) + 1;
    }
    version += `-beta.${betaVersion}`;
  }

  return Object.assign({}, packageInfo, { betaVersion: version });
}

// Write the corrected betaVersion back to the corresponding package.json
function updatePackageJson(betaPackageInfos: IBetaPackageInfo[]): void {
  betaPackageInfos.forEach((betaPackageInfo: IBetaPackageInfo) => {
    const { directory, betaVersion } = betaPackageInfo;

    const packageFile = path.join(directory, 'package.json');
    const packageData = fs.readJsonSync(packageFile);

    packageData.version = betaVersion;

    for (let i = 0; i < betaPackageInfos.length; i++) {
      const dependenceName = betaPackageInfos[i].name;
      const dependenceVersion = betaPackageInfos[i].betaVersion;

      if (packageData.dependencies && packageData.dependencies[dependenceName]) {
        packageData.dependencies[dependenceName] = dependenceVersion;
      } else if (packageData.devDependencies && packageData.devDependencies[dependenceName]) {
        packageData.devDependencies[dependenceName] = dependenceVersion;
      }
    }

    fs.writeFileSync(packageFile, JSON.stringify(packageData, null, 2));
  });
}

// publish --tag=beta
function publish(pkg: string, betaVersion: string, directory: string): void {
  console.log('[PUBLISH BETA]', `${pkg}@${betaVersion}`);
  spawnSync('npm', ['publish', '--tag=beta'], {
    stdio: 'inherit',
    cwd: directory,
  });
}

// Entry
console.log('[PUBLISH BETA] Start:');
getPackageInfos().then((packageInfos: IPackageInfo[]) => {
  const shouldPublishPackages = packageInfos
    .filter((packageInfo) => packageInfo.shouldPublish)
    .map((packageInfo) => setBetaVersionInfo(packageInfo));

  updatePackageJson(shouldPublishPackages);

  // Publish
  let publishedCount = 0;
  const publishedPackages: string[] = [];

  shouldPublishPackages.forEach((packageInfo) => {
    const { name, directory, betaVersion } = packageInfo;
    publishedCount++;
    // Print the information about this release
    console.log(` - ${name}@${betaVersion} - `);
    publish(name, betaVersion, directory);
    publishedPackages.push(`${name}:${betaVersion}`);
  });

  console.log(`[PUBLISH PACKAGE BETA] Complete (count=${publishedCount}):`);
  console.log(`${publishedPackages.join('\n')}`);
});

The basic functionality is explained in the comments (I will not repeat this note later). To sum up what the script does:

  • Get all the local packageInfo
  • Compare the online (published) information to correct the required version information for this release
  • Add (write) the corrected version information to package.json in the local corresponding package
  • The script is invoked to perform the publication

publish-package is very simple to write: it just calls npm publish, plus some basic pre-publish verification such as shouldPublish. Needless to say!

Note that when publishing you need to be logged in (check with npm whoami) and, if you also use the @xxx/ scope, have the corresponding organization permissions.
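For reference, here is a minimal sketch of what publish-package.ts might look like under these assumptions (it reuses getPackageInfos and simply runs npm publish for each package whose shouldPublish is true); the actual script may differ in detail:

// Hypothetical sketch of scripts/publish-package.ts, not the actual script
import { spawnSync } from 'child_process';
import { IPackageInfo, getPackageInfos } from './fn/getPackageInfos';

console.log('[PUBLISH] Start:');
getPackageInfos().then((packageInfos: IPackageInfo[]) => {
  // Only publish packages that built successfully and whose local version is not published yet
  const shouldPublishPackages = packageInfos.filter((info) => info.shouldPublish);

  shouldPublishPackages.forEach(({ name, localVersion, directory }: IPackageInfo) => {
    // Requires being logged in (npm whoami) and having permission on the scope
    console.log('[PUBLISH]', `${name}@${localVersion}`);
    spawnSync('npm', ['publish'], { stdio: 'inherit', cwd: directory });
  });

  console.log(`[PUBLISH] Complete (count=${shouldPublishPackages.length})`);
});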

watch

nsfw is used to watch local files. Whenever there is a change, we compile, and that is it!

/*
 * @Author: 一凨
 * @Date: 2021-06-07 20:16:09
 * @Last Modified by: 一凨
 * @Last Modified time: 2021-06-10 17:19:05
 */
import * as glob from 'glob';
import * as path from 'path';
import * as fs from 'fs-extra';
import { run } from './fn/shell';

// eslint-disable-next-line @typescript-eslint/no-var-requires
const nsfw = require('nsfw');

async function watchFiles(cwd, ext) {
  const files = glob.sync(ext, { cwd, nodir: true });

  const fileSet = new Set();
  /* eslint no-restricted-syntax:0 */
  for (const file of files) {
    /* eslint no-await-in-loop:0 */
    await copyOneFile(file, cwd);
    fileSet.add(path.join(cwd, file));
  }

  const watcher = await nsfw(cwd, (events) => {
    events.forEach((e) => {
      if (
        e.action === nsfw.actions.CREATED ||
        e.action === nsfw.actions.MODIFIED ||
        e.action === nsfw.actions.RENAMED
      ) {
        const filePath = e.newFile ? path.join(e.directory, e.newFile!) : path.join(e.directory, e.file!);
        if (fileSet.has(filePath)) {
          console.log('non-ts change detected:', filePath);
          copyOneFile(path.relative(cwd, filePath), cwd);
        }
      }
    });
  });
  watcher.start();
}

// All the code above is to work around the fact that tsc does not copy non-.ts/.tsx files
watchFiles(path.join(__dirname, '../packages'), '*/src/**/!(*.ts|*.tsx)').catch((e) => {
  console.trace(e);
  process.exit(128);
});

async function tscWatcher() {
  await run('npx tsc --build ./tsconfig.json -w');
}

tscWatcher();

async function copyOneFile(file, cwd) {
  const from = path.join(cwd, file);
  const to = path.join(cwd, file.replace(/src\//, '/lib/'));
  await fs.copy(from, to);
}

extensions-deps-install

Because our yarn workspace only covers the packages directory, the dependencies of the plug-ins and web pages under extensions cannot be installed directly through yarn, so we provide a script to install them. It basically walks into each extension directory and runs npm i.

import * as path from 'path';
import * as fse from 'fs-extra';
import * as spawn from 'cross-spawn';

export default function () {
  const extensionsPath = path.join(__dirname, '..', '..', 'extensions');
  const extensionFiles = fse.readdirSync(extensionsPath);
  const installCommonds = ['install'];
  if (!process.env.CI) {
    // Concatenate registry parameters
    installCommonds.push('--no-package-lock');
    installCommonds.push('--registry');
    installCommonds.push(process.env.REGISTRY ? process.env.REGISTRY : 'http://registry.npm.taobao.org');
  }

  for (let i = 0; i < extensionFiles.length; i++) {
    // Iterate through the extensions and install; if there is a web directory, install the web page's dependencies as well
    const cwd = path.join(extensionsPath, extensionFiles[i]);
    // eslint-disable-next-line quotes
    console.log("Installing extension's dependencies", cwd);

    spawn.sync('tnpm', installCommonds, {
      stdio: 'inherit',
      cwd,
    });
    const webviewPath = path.join(cwd, 'web');
    if (fse.existsSync(webviewPath)) {
      // eslint-disable-next-line quotes
      console.log("Installing extension webview's dependencies", webviewPath);
      spawn.sync('tnpm', installCommonds, {
        stdio: 'inherit',
        cwd: webviewPath,
      });
    }
  }
}

Note that the scripts are written in TypeScript, so the npm scripts execute them with ts-node.

extension-link-package

Delete the locally installed related packages so that resolution looks upward (to the application level) and finds the corresponding soft-linked packages instead.

import * as path from 'path';
import * as fse from 'fs-extra';
import { run } from './fn/shell';

(async function () {
  const extensionsPath = path.join(__dirname, '../extensions');
  const extensionFiles = await fse.readdir(extensionsPath);
  // Get the list of plug-ins under extensions and run the removal one by one
  return await Promise.all(
    extensionFiles.map(async (extensionFile) => {
      const cwd = path.join(extensionsPath, extensionFile);
      if (fse.existsSync(cwd)) {
        // link packages to extension
        if (!process.env.CI) {
          await removePmworks(cwd);
        }
        const webviewPath = path.join(cwd, 'web');
        if (fse.existsSync(webviewPath)) {
          // link packages to extension webview
          if (!process.env.CI) {
            await removePmworks(webviewPath);
          }
        }
      }
    }),
  );
})().catch((e) => {
  console.trace(e);
  process.exit(128);
});

// Remove dependencies under @pmworks
async function removePmworks(cwd: string) {
  const cwdStat = await fse.stat(cwd);
  if (cwdStat.isDirectory()) {
    await run(`rm -rf ${path.join(cwd, 'node_modules', '@pmworks')}`);
  }
}

A small summary

The core scripts are as above; they are all fairly simple and direct. Publishing of the extensions is not covered here, but it can be borrowed from AppWorks. I will update that part when we release the plug-ins.

Once the project skeleton is set up, it is basically ready to start. I will take creating a project as an example (focusing on the infrastructure part; the details of plug-in functionality and implementation are left to the second stage).

Vscode Extensions (vscode-webview extensions)

We initialize the plug-in using yo code in the extensions folder. For the specific basics, refer to the official documentation: code.visualstudio.com/api

After that, we have a basic architecture of the project, a set of packages to manage, and we are ready to move on to our development phase.

After all, the purpose of our plug-in is to visualize a series of operations, so vscode's buttons and commands alone are definitely not enough for us; we need an operation interface: a WebView. This is the overall interaction between the plug-in and the WebView:

  • common-xxx (utils) is responsible for encapsulating common functionality shared across the whole project
  • extension-utils are method libraries extracted for a single plug-in; for example, project-utils is the method library used by createProject to initialize a project, similar to a controller
  • extension-service carries the methods extracted for communication between vscode and the webView; as the name implies, the service

There are two views: vscode-extension and extension-webView

For example! Here is an example of initializing a project scaffolding

The basic concepts of vscode extensions and WebView can be found here: code.visualstudio.com/api/extensi…

WebView

In fact, there is not much to prepare for the WebView: just the front-end big three, HTML, JavaScript and CSS.

Here I use the ice scaffolding to initialize the project: npm init ice

Then modify the outputDir configuration in build.json and specify mpa mode

{
  "mpa": true,
  "vendor": false,
  "publicPath": "./",
  "outputDir": "../build",
  "plugins": [
    [
      "build-plugin-fusion",
      {
        "themePackage": "@alifd/theme-design-pro"
      }
    ],
    [
      "build-plugin-moment-locales",
      {
        "locales": ["zh-cn"]
      }
    ],
    "@ali/build-plugin-ice-def"
  ]
}

After the code is written and built, we get our three big pieces.

For more information on ICE, please visit the official documentation

Extensions


import * as vscode from 'vscode';
import { getHtmlFroWebview, connectService } from '@pmworks/vscode-webview';
import { DEV_WORKS_ICON } from '@pmworks/constants';
import services from './services';

export function activate(context: vscode.ExtensionContext) {
  const { extensionPath } = context;

  let projectCreatorPanel: vscode.WebviewPanel | undefined;

  const activeProjectCreator = () => {
    const columnToShowIn = vscode.window.activeTextEditor ? vscode.window.activeTextEditor.viewColumn : undefined;
    if (projectCreatorPanel) {
      projectCreatorPanel.reveal(columnToShowIn);
    } else {
      projectCreatorPanel = vscode.window.createWebviewPanel('BeeDev', 'Initializing the source code architecture', columnToShowIn || vscode.ViewColumn.One, {
        enableScripts: true,
        retainContextWhenHidden: true,
      });
    }
    projectCreatorPanel.webview.html = getHtmlFroWebview(extensionPath, 'projectcreator', false);
    projectCreatorPanel.iconPath = vscode.Uri.parse(DEV_WORKS_ICON);
    projectCreatorPanel.onDidDispose(
      () => {
        projectCreatorPanel = undefined;
      },
      null,
      context.subscriptions,
    );
    connectService(projectCreatorPanel, context, { services });
  };

  const disposable = vscode.commands.registerCommand('devworks-project-creator.createProject.start', activeProjectCreator);

  context.subscriptions.push(disposable);
}

export function deactivate() {}

Here, too, it is all routine: register the command and its callback, and initialize the WebView. Next, getHtmlFroWebview:


/**
 * Add the security protocol to a local resource
 * @param url Local resource path
 * @returns Secure path with the vscode-resource protocol
 */
function originResourceProcess(url: string) {
  return vscode.Uri.file(url).with({ scheme: 'vscode-resource' });
}

export const getHtmlFroWebview = (
  extensionPath: string,
  entryName: string,
  needVendor?: boolean,
  cdnBasePath?: string,
  extraHtml?: string,
  resourceProcess?: (url: string) => vscode.Uri,
): string => {
  resourceProcess = resourceProcess || originResourceProcess;
  const localBasePath = path.join(extensionPath, 'build');
  const rootPath = cdnBasePath || localBasePath;
  const scriptPath = path.join(rootPath, `js/${entryName}.js`);
  const scriptUri = cdnBasePath ? scriptPath : resourceProcess(scriptPath);
  const stylePath = path.join(rootPath, `css/${entryName}.css`);
  const styleUri = cdnBasePath ? stylePath : resourceProcess(stylePath);
  // vendor for MPA
  const vendorStylePath = path.join(rootPath, 'css/vendor.css');
  const vendorStyleUri = cdnBasePath ? vendorStylePath : resourceProcess(vendorStylePath);
  const vendorScriptPath = path.join(rootPath, 'js/vendor.js');
  const vendorScriptUri = cdnBasePath ? vendorScriptPath : resourceProcess(vendorScriptPath);

  // Use a nonce to whitelist which scripts can be run
  const nonce = getNonce();
  return (
    `<!DOCTYPE html>
    <html>
      <head>
        <meta charset="utf-8">
        <meta name="viewport" content="width=device-width,initial-scale=1,shrink-to-fit=no">
        <meta name="theme-color" content="#000000">
        <title>Iceworks</title>
        <link rel="stylesheet" type="text/css" href="${styleUri}">
        ${extraHtml || ''}
    ` +
    (needVendor ? `<link rel="stylesheet" type="text/css" href="${vendorStyleUri}" />` : '') +
    `
      </head>
      <body>
        <div id="ice-container"></div>
    ` +
    (needVendor ? `<script nonce="${nonce}" src="${vendorScriptUri}"></script>` : '') +
    `
        <script nonce="${nonce}" src="${scriptUri}"></script>
      </body>
    </html>`
  );
};

function getNonce(): string {
  let text = '';
  const possible = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
  for (let i = 0; i < 32; i++) {
    text += possible.charAt(Math.floor(Math.random() * possible.length));
  }
  return text;
}

This method is located in packages/vscode-webview/vscode.ts. It returns a piece of HTML, mapping local resources to the vscode-resource protocol, and supports vendor, extraHtml, and more.
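For instance, a hypothetical call that includes the vendor bundle and injects an extra tag into the head would look like this (the meta tag is only an illustration):

// Hypothetical usage of getHtmlFroWebview with vendor and extraHtml
projectCreatorPanel.webview.html = getHtmlFroWebview(
  extensionPath,
  'projectcreator', // entry name, resolved to build/js/projectcreator.js and build/css/projectcreator.css
  true,             // needVendor: also link css/vendor.css and js/vendor.js
  undefined,        // cdnBasePath: not set, so the local build directory is used
  '<meta name="renderer" content="webkit">', // extraHtml injected into the <head>
);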

So far, we’ve been able to invoke our WebView in vscode.

communication

Then it’s time to solve the problem of vscode communicating with WebView. The communication here is very similar to pubSub:

  • The plugin sends a message to the WebView

    panel.webview.postMessage({ text: 'Hi, this is a message from vscode.' });

  • The WebView receives the message

    window.addEventListener('message', (event) => {
      const message = event.data;
      console.log(`WebView receives the following message: ${message}`);
    });

  • The WebView sends a message to the plug-in

    vscode.postMessage({ text: 'Hello, this is a message from the webView.' });

  • The plug-in side receives the message

    panel.webview.onDidReceiveMessage((msg) => {
      console.log(`The message received by the plug-in: ${msg}`);
    }, undefined, context.subscriptions);

This communication mechanism is too fragmented. In a real project, the WebView is more like our view layer, so in theory it should just call the controller interface through a service to perform the underlying operation and report the result back:

For example, when creating a project you need to let the user select the target directory. The click handler on the HTML page would look like this:

  const getAppPath = async () => {
    const projectPath = await callService('project', 'getFolderPath', 'ok');
    setAppPath(projectPath);
  };

The first parameter of callService is the service class, the second parameter is the name of the method to call on that class, and the remaining parameters are the arguments passed to that method.

As such, we encapsulate a callService method:

// packages/vscode-webview/webview.ts

// @ts-ignore
export const vscode = typeof acquireVsCodeApi === 'function' ? acquireVsCodeApi() : null;

export const callService = function (service: string, method: string, ...args) {
  // return promise
  return new Promise((resolve, reject) => {
    // Generate the corresponding eventId
    const eventId = setTimeout(() => {});
    console.log(`call vscode extension service: ${service} ${method} ${eventId} ${args}`);

    // Receive the message from vscode, usually sent after it has processed the webView's request
    const handler = (event) => {
      const msg = event.data;
      console.log('webview receive vscode message:', msg);
      if (msg.eventId === eventId) {
        // The corresponding eventId came back, which means this round of communication is over and the listener can be removed
        window.removeEventListener('message', handler);
        msg.errorMessage ? reject(new Error(msg.errorMessage)) : resolve(msg.result);
      }
    };
    // The webview accepts messages from vscode
    window.addEventListener('message', handler);

    // The WebView sends the request to vscode
    vscode.postMessage({
      service,
      method,
      eventId,
      args,
    });
  });
};

The webview layer thus encapsulates sending the event request, receiving the event response, and removing the listener. On the extension side, we then register the corresponding service.methodName for the webView.

Here we have encapsulated a method called connectService.

connectService(projectCreatorPanel, context, { services });

The projectCreatorPanel above is the "instance" of the created WebviewPanel, while services can be understood as an object containing multiple classes:

const services = {
  project: {
    getFolderPath(...args) {
      // xxx
    },
    // xxx
  },
  // xxx: {}
};

The connectService method is as follows:

export function connectService(webviewPanel: vscode.WebviewPanel, context: vscode.ExtensionContext, options: IConnectServiceOptions) {
  const { subscriptions } = context;
  const { webview } = webviewPanel;
  const { services } = options;
  webview.onDidReceiveMessage(async (message: IMessage) => {
    const { service, method, eventId, args } = message;
    const api = services && services[service] && services[service][method];
    console.log('onDidReceiveMessage', message);
    if (api) {
      try {
        const fillApiArgLength = api.length - args.length;
        const newArgs = fillApiArgLength > 0 ? args.concat(Array(fillApiArgLength).fill(undefined)) : args;
        const result = await api(...newArgs, context, webviewPanel);
        console.log('invoke service result', result);
        webview.postMessage({ eventId, result });
      } catch (err) {
        console.error('invoke service error', err);
        webview.postMessage({ eventId, errorMessage: err.message });
      }
    } else {
      vscode.window.showErrorMessage(`invalid command ${message}`);
    }
  }, undefined, subscriptions);
}

The code above is also relatively simple: it registers a listener, and whenever a message from the WebView arrives, it looks up the corresponding method of the named service under services, executes it, and passes in the arguments sent by the WebView.

This is where the services used by the extension come in.

The @pmworks/project-service package encapsulates only some basic method calls. The core processing logic, such as downloading the corresponding git repo and parsing local files, is carried out in the corresponding extension-utils; the service just calls it.
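To make that layering concrete, here is a hedged sketch (the package name @pmworks/project-utils and the downloadRepo helper are illustrative assumptions, not the actual exports):

// Hypothetical sketch: the service only exposes thin methods to the webview,
// while the heavy lifting (downloading the repo, parsing files) lives in the utils package
import * as vscode from 'vscode';
import { downloadRepo } from '@pmworks/project-utils'; // assumed helper name

const services = {
  project: {
    // Let the user pick the directory in which the project should be created
    async getFolderPath(): Promise<string | undefined> {
      const selected = await vscode.window.showOpenDialog({
        canSelectFiles: false,
        canSelectFolders: true,
        canSelectMany: false,
      });
      return selected && selected[0] && selected[0].fsPath;
    },
    // Create the project: the service just forwards to the controller-like util
    async createProject(projectPath: string, templateRepo: string) {
      return downloadRepo(templateRepo, projectPath);
    },
  },
};

export default services;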

A small problem

The basic process encapsulation is now complete; what is left is writing the concrete logic. However, in real development the web page needs the parameters passed in by vscode, while during local web page development the vscode plug-in cannot read the uncompiled code. How do we solve this?

Inside the webView, another layer is wrapped around callService to support local web page development.
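A minimal sketch of such a wrapper, assuming the webview-side vscode handle and callService are both exported from @pmworks/vscode-webview and that a local mock module exists (the mock file and its shape are illustrative):

// Hypothetical wrapper around callService for local web page development
import { vscode, callService as callVscodeService } from '@pmworks/vscode-webview';
// Assumed local mock data, e.g. { 'project.getFolderPath': '/Users/xxx/demo' }
import mockData from './mock';

export const callService = async function (service: string, method: string, ...args: any[]) {
  if (!vscode) {
    // Running in a plain browser (acquireVsCodeApi is unavailable): return mock results
    console.log(`[mock] call service: ${service}.${method}`, args);
    return mockData[`${service}.${method}`];
  }
  // Running inside the vscode WebView: go through the real message channel
  return callVscodeService(service, method, ...args);
};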

Follow-up prospects

Up to now, I have basically covered the development summary of these two weeks outside of business work. Next, I need to go through the vscode plug-in API thoroughly and get hands-on. Of course, before that, another very urgent task is to upgrade the source-code architecture sorted out last year to the Rax capabilities now available in the group.

Returning to the development of the plug-in architecture (the BeeDev source-code workbench), the following steps remain:

  • Initialize the source code architecture
  • Create a page: drag and drop the relevant H5 source-code materials (this requires a complete material backend) to generate the initial page
  • Create modules, visually configure module loading categories, etc

If there is enough energy, a Node backend is actually needed as well, so that server-side and local capabilities can be connected.

OK, enough daydreaming; I will summarize again at the next milestone.

As for the project source…

reference

  • AppWorks: appworks.site/
  • vscode extension API: code.visualstudio.com/api
  • monorepo & lerna: github.com/lerna/lerna

other

Follow the WeChat official account [Full-Stack Front-End Selection], which pushes selected articles every day ~