Notes on developing a multi-language conversion tool
The title is a bit clickbaity; this article is just a development summary.
Preface
Recently our company had a multi-language requirement. Since replacing the text manually, one string at a time, is not a good approach, I searched around and found a write-up from the ByteDance front end on AST-based internationalization refactoring, which happens to solve the requirement nicely. So after reading it, I was ready to get started.
Background
jscodeshift
jscodeshift is a toolkit for refactoring JS and TS files. It wraps recast (an AST-to-AST transformation tool) and provides a number of semantic APIs for manipulating the AST and outputting the desired result.
Advantages of jscodeshift:
- Supports parsing both JavaScript and TypeScript
- The API is simple; code changes can be made with just a few API calls
- There is a website, AstExplorer (astexplorer.net), that visualizes code -> AST: simply paste in our JS or TS code to see it as an AST syntax tree. With this tree and the API, it is easy to locate, and then modify, the code snippets we care about.
jscodeshift has been described as "playing with the AST the way jQuery plays with the DOM": it is typically used to convert code between an old API and a new one.
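To get a feel for the API, here is a minimal transform sketch (the identifier rename is a made-up example, not part of the tool below): wrap the source with j(), query nodes jQuery-style, replace them, and print the result back out.

// transform.js: a minimal jscodeshift transform (illustrative only)
module.exports = function (file, api) {
  const j = api.jscodeshift;
  // Wrap the source into a collection we can query, much like jQuery wraps DOM nodes
  const root = j(file.source);
  root
    // Find every identifier named oldName (a hypothetical name for this example)
    .find(j.Identifier, { name: 'oldName' })
    // Replace each match with a freshly built identifier node
    .replaceWith(() => j.identifier('newName'));
  // Print the modified AST back to source code
  return root.toSource();
};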
Getting started
1. Find all the scenarios
Here are the most common situations in which Chinese text appears in code (to be added):
function A(c: string) {
  // After experimenting: comments do not show up in the AST
  let templateStr = `Chinese appears in a template string ${x} middle ${y} end`
  const zh = 'Chinese'
  const objStr = {
    c: 'test'
  }
  return <Exception
    type="404"
    linkElement={`as attribute ${x} hi`}
    desc={a}
    backText="Appears as an attribute"
  />
}
Next, let's look at the AST node types (ast.type) these forms correspond to (using AstExplorer, together with console output).
Template strings correspond to TemplateLiteral; plain string literals correspond to StringLiteral.
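For reference, a template string like `as attribute ${x} hi` parses into roughly the following node (simplified; real nodes carry extra fields such as tail and position info), which is exactly what we walk over in step 2:

// Simplified Babel AST shape for the template string: `as attribute ${x} hi`
const templateLiteralShape = {
  type: 'TemplateLiteral',
  // quasis: the plain string pieces around each interpolation
  quasis: [
    { type: 'TemplateElement', value: { raw: 'as attribute ', cooked: 'as attribute ' } },
    { type: 'TemplateElement', value: { raw: ' hi', cooked: ' hi' } },
  ],
  // expressions: the interpolated expressions, here a single variable x
  expressions: [{ type: 'Identifier', name: 'x' }],
};
// A plain string such as 'Chinese' is simply { type: 'StringLiteral', value: 'Chinese' }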
Find the corresponding node collections:
module.exports = function (file, api) {
  const j = api.jscodeshift
  const root = j(file.source)
  // Find the collection of template strings
  root.find(j.TemplateLiteral)
  // Find the collection of plain string literals; a second argument can be passed to filter the returned collection
  root.find(j.Literal, (p) => /[\u4e00-\u9fa5]/.test(p.value))
  // Return the file
  return root.toSource()
}
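A transform like this can be tried with the jscodeshift CLI before wiring up any tooling, for example npx jscodeshift -t transform.js src --parser=tsx --dry --print, which previews the rewritten output without touching the files (the transform and directory names here are placeholders).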
2. Handle different types of scenarios
After finding the corresponding collection, we traverse it with forEach, find the corresponding text, use replaceWith to perform the replacement, return the replacement result, and generate a block comment containing the original Chinese text:
let i = 0
module.exports = function (file, api) {
  const j = api.jscodeshift
  const root = j(file.source)
  // The find method returns a collection of nodes
  // Find the collection of template strings
  root.find(j.TemplateLiteral).forEach((path) => {
    let value = '';
    let hash = i++;
    // quasis is the array of sliced template-string pieces, i.e. the plain string parts
    // expressions is the array of Identifier nodes in the template string, i.e. the variable names
    // Rebuild the original template string, eg: value = 'as an attribute {x} hi'
    path.node.quasis.forEach((item, index) => {
      if (index >= path.node.expressions.length) {
        value += item.value.raw;
      } else {
        value = `${value + item.value.raw}{${path.node.expressions[index].name}}`;
      }
    });
    // Take out the array of variable names
    const obj = path.node.expressions.map((item) => item.name);
    j(path).replaceWith((p) => {
      // Build the corresponding intl call, eg: intl.get('moment', {day, hours})
      p.node.raw = `intl.get(${hash}, {${obj}})`;
      return p.node.raw;
    });
    // Generate a block comment, eg: /* 1: as an attribute {x} hi */
    const comment = j.commentBlock(`${hash}: ${value}`, true, false);
    const comments = (path.node.comments = path.node.comments || []);
    comments.push(comment);
  });
  // Find the collection of plain string literals; a second argument can be passed to filter the returned collection
  root.find(j.Literal, (p) => /[\u4e00-\u9fa5]/.test(p.value)).forEach((path) => {
    // Take the original value
    const value = path.node.raw || path.node.value;
    let hash = i++;
    j(path).replaceWith((p) => {
      // Handle different node types and set the replacement value
      if (p.node.type === 'JSXText') {
        p.node.raw = `{intl.get(${hash})}`;
      } else if (p.parentPath.node.type === 'JSXAttribute') {
        p.node.raw = `{intl.get(${hash})}`;
      } else {
        p.node.raw = `intl.get(${hash})`;
      }
      return p.node.raw;
    });
    // Generate a block comment, eg: /* 1: Chinese */
    const comments = (path.node.comments = path.node.comments || []);
    const comment = j.commentBlock(`${hash}: ${value}`, true, false);
    comments.push(comment);
  });
  // Return the file
  return root.toSource()
}
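As a rough illustration of what step 2 is aiming for (the exact comment placement and formatting depend on how recast prints the result), code like the sample from step 1 would come out looking something like this:

// Before
let templateStr = `as attribute ${x} hi`
const zh = 'Chinese'

// After (roughly): each Chinese text becomes an intl.get call keyed by the counter,
// with a block comment recording the original text for later translation
let templateStr = /* 0: as attribute {x} hi */ intl.get(0, {x})
const zh = /* 1: Chinese */ intl.get(1)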
3. If there are fields to process, import the corresponding i18n method
// ...
const Literal = root.find(j.Literal, (p) => /[\u4e00-\u9fa5]/.test(p.value));
const TemplateLiteral = root.find(j.TemplateLiteral);
if (Literal.length || TemplateLiteral.length) {
  // eg: import { intl } from 'intl';
  j(root.find(j.Declaration).at(0).get()).insertBefore(
    j.importDeclaration([j.importSpecifier(j.identifier("intl"))], j.literal("intl"))
  );
}
// ...
You’re done
4. Handle multiple files
The code above handles a single file. When refactoring a real project you usually need to process a whole directory, or many files, with Node. We use globby to collect the file paths, and fs.readFileSync / fs.writeFileSync to read and write the files:
const { resolve } = require('path');
const { sync } = require('globby');
const { readFileSync, writeFileSync } = require('fs');
const { transformer } = require('./transform');
function start() {
  // Get the src directory under the current working directory
  const root = resolve(process.cwd(), 'src');
  // Get all ts, tsx, js, jsx files in the src directory
  // The sync method returns an array of file names
  const files = sync([`${root}/**/!(*.d).{ts,tsx,js,jsx}`], {
    dot: true,
    ignore: [`${root}/.umi/**`],
  }).map((x) => resolve(x));
  const filesLen = files.length;
  for (let i = 0; i < filesLen; i += 1) {
    const file = files[i];
    console.log('file: ', file);
    const index = file.lastIndexOf('.');
    const parser = file.substr(index + 1);
    // Read the contents of the file
    const content = readFileSync(file, 'utf-8');
    // Call the conversion method above
    const resContent = transformer(content, parser);
    // Write the result back to the file
    writeFileSync(file, resContent, 'utf8');
  }
}

module.exports = start
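One note: the runner imports transformer from './transform', while the transform earlier in the article is written as a jscodeshift module taking (file, api). The article does not show the glue between the two, but a minimal sketch of such a wrapper could look like this (the parser mapping and names are assumptions):

// transform.js (sketch): expose the jscodeshift transform as transformer(content, parser)
const jscodeshiftCore = require('jscodeshift');

// The transform from steps 1-3, written like any jscodeshift module
const transform = function (file, api) {
  const j = api.jscodeshift;
  const root = j(file.source);
  // ... the TemplateLiteral / Literal handling shown above ...
  return root.toSource();
};

function transformer(content, parser) {
  // Map the file extension to a jscodeshift parser; ts needs the TypeScript parser,
  // and the tsx parser also copes with tsx/js/jsx (an assumption that works for typical projects)
  const j = jscodeshiftCore.withParser(parser === 'ts' ? 'ts' : 'tsx');
  // Call the transform with the same shapes the jscodeshift runner would pass in
  return transform({ source: content, path: '' }, { jscodeshift: j, j, stats: () => {} });
}

module.exports = { transformer };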
With that, extracting the Chinese text and the corresponding multi-language refactoring are done.
Extension
We can use the commander package to turn this into a multi-language conversion CLI scaffold. Something like this:
#!/usr/bin/env node
// The shebang lets the system find node dynamically, so the script runs even though node is installed at different paths for different users
const { program } = require("commander")
const pkg = require('../../package.json')
const start = require("..")
program
  .version(pkg.version)
  .option('-s, --start', 'start translate')
program.on('--help', function () {
  console.log(' ');
  console.log('use translate !');
});
program.parse(process.argv)
// Kick off the translation when the -s/--start flag is passed
if (program.opts().start) start()
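To expose this as a command, point the package's bin field at the entry file, e.g. "bin": { "translate": "./bin/index.js" } (the command name and path are just examples), then npm link the package or install it. Running translate -s in a project root starts the conversion over its src directory, and translate --help prints the extra usage text registered above.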
Finally
This is my own scaffold for batch code refactoring, and it is still quite rough; feel free to treat it as a demo. Please point out anything that could be improved, or give it a star, hhh. GitHub address: translate-mod
Thank you!!