【 Write in front 】
In our last article, Build Your Own Scaffolding (Part 1), we introduced some common scaffolding configurations. Having completed the dev-server and the loading of multiple file formats, in this section we will continue to refine the configuration and complete all the features of the basic version. The full code can be found on GitHub. First, let's review our goals.
| Basic version | Standard version |
| --- | --- |
| dev-server<br>Handling HTML/JS/Vue/CSS/Less/Sass/img/jpg and other files<br>Routing configuration<br>Mock data<br>Interface proxy<br>Bundle analysis<br>Unit testing<br>Compile & build | dev-server<br>Handling HTML/JS/Vue/CSS/Less/Sass/img/jpg and other files<br>Mock data<br>Interface proxy<br>Bundle analysis<br>Unit testing<br>Compile & build<br>Basic layout<br>Navigation configuration<br>TypeScript support<br>Markdown support<br>Style linting |
In the previous section, webpack.config.base.js was nearly complete. We now need some differentiated configuration for the development and production environments. As originally designed, configuration specific to development goes into webpack.config.dev.js, and configuration specific to production goes into webpack.config.prod.js. Looking at our design goals, it is easy to see that mock data, interface proxying, and bundle analysis are development-time requirements, not production ones, so let's tackle the development scenario first.
Mock data

Mocking is simple in principle: intercept the request, load the mock data preconfigured by the user, and return it as the response. Common approaches in the community fall into two categories. One integrates the intercepting hooks at the client level; for example, axios has a corresponding axios-mock-adapter plugin for mocking data. The other intercepts requests at the server middleware level. The difference is that the first approach never actually issues an Ajax request, so nothing is visible in the Network console. The second approach does issue an Ajax request, which is then intercepted at the local server; the request never actually leaves the machine, but a record does appear in the Network panel.

I chose the second option, for the obvious reason that we want to see the request log in the Network panel of user agents such as Chrome/Firefox for debugging and troubleshooting. A look at webpack-dev-server's GitHub repository shows that it is implemented on top of Express, whose middleware mechanism lets us register our own logic in the call chain. Since I didn't find a particularly good community plugin, I decided to implement one myself. First, its final usage should be simple, like this:
```js
/**
 * @file mock/api/user.js
 * @description mock api for user model
 * @author nimingdexiaohai([email protected])
 */
module.exports = {
    queryUser: {
        url: /\/user\/\d$/,
        method: 'get',
        status: 200,
        response: {
            success: true,
            message: 'get user info success',
            data: {
                id: 1,
                name: 'John'
            }
        }
    },
    listUsers: {
        url: /\/user$/,
        method: 'get',
        status: 200,
        response: {
            success: true,
            message: 'list user info success',
            data: {
                users: [
                    { id: 1, name: 'John' },
                    { id: 2, name: 'Sharon' }
                ]
            }
        }
    }
};
```
Two mock configurations are given in the sample, one to get information for a specific user and the other to get a list of all users. The field semantics are clear:
```js
// Path of the route to be intercepted; string and regex matching are supported
url: String | RegExp
// The HTTP method to intercept
method: 'get' | 'post' | 'put' | 'delete' | 'options'
// The status code of the response
status: HTTP_STATUS_CODE
// The response body; supports a plain object, or a function that generates data dynamically
response: Object | Function
```
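When `response` is a function, it receives the request object and can compute the body per call. A minimal sketch of such a rule (the `listLogs` name and its fields are made up for illustration; they are not part of the scaffolding):

```js
// A rule whose response is computed per request rather than taken from a
// static object. The function receives the Express request object, so it
// can read query parameters, headers, and so on.
const listLogs = {
    url: /\/logs$/,
    method: 'get',
    status: 200,
    response: (req) => ({
        success: true,
        message: 'list logs success',
        data: {
            // echo back a query parameter and stamp the generation time
            count: Number((req.query && req.query.count) || 10),
            generatedAt: new Date().toISOString()
        }
    })
};
```

This is what makes dynamic data (timestamps, random IDs, paging) possible without changing the rule format.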
Next, we implement the route interception and the loading of response data. The module should expose a singleton instance with methods for setting state, registering configuration, loading the hook, and so on. Let's look at the final implementation first and then walk through it piece by piece:
```js
/**
 * @file mock/lib.js
 * @description export mock
 * @author nimingdexiaohai([email protected])
 */
const _ = require('lodash');

const Mock = {
    on: true,
    rules: [],
};

Mock.Use = () => {
    Mock.on = true;
};

Mock.Restore = () => {
    Mock.on = false;
};

Mock.Reset = () => {
    Mock.on = false;
    Mock.rules = [];
};

Mock.Register = (c) => {
    c['on'] = c.hasOwnProperty('on') ? c['on'] : true;
    Mock.rules.push(c);
};

Mock.findRule = (path, method) => {
    const result = {willMock: false, rule: null};
    if (Mock.on) {
        for (let r of Mock.rules) {
            if (r.on
                && r.method.toUpperCase() === method
                && ((_.isString(r.url) && r.url === path) || (_.isRegExp(r.url) && r.url.test(path)))
            ) {
                result.willMock = true;
                result.rule = r;
                break;
            }
        }
    }
    return result;
};

Mock.LoadMock = (req, res, next, app, server, compiler) => {
    if (req && req.path && req.method) {
        const {willMock, rule} = Mock.findRule(req.path, req.method);
        if (willMock) {
            console.log(`[${req.method.toUpperCase()}] ${req.path} mocked by mocker...`);
            if (_.isFunction(rule.response)) {
                res.status(rule.status).send(rule.response(req, res));
            }
            else {
                res.status(rule.status).send(rule.response);
            }
            return;
        }
    }
    next();
};

module.exports = Mock;
```
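Before wiring anything into dev-server, the rule-matching behaviour can be sanity-checked in plain Node. The sketch below re-implements the lookup with native `typeof`/`instanceof` checks standing in for lodash's `isString`/`isRegExp`, so it is self-contained (the function and sample rules are illustrative, not the scaffolding's actual exports):

```js
// Minimal stand-in for the rule lookup: the first enabled rule whose
// method and url (string or RegExp) both match wins.
function findRule(rules, path, method) {
    for (const r of rules) {
        const urlMatches = typeof r.url === 'string'
            ? r.url === path
            : r.url instanceof RegExp && r.url.test(path);
        if (r.on !== false && r.method.toUpperCase() === method && urlMatches) {
            return { willMock: true, rule: r };
        }
    }
    return { willMock: false, rule: null };
}

const rules = [
    { url: /\/user\/\d$/, method: 'get', status: 200, response: { id: 1 } },
    { url: '/ping', method: 'get', status: 200, response: 'pong' },
];

console.log(findRule(rules, '/user/1', 'GET').willMock);    // true: regex match
console.log(findRule(rules, '/user/abc', 'GET').willMock);  // false: \d$ requires a digit
console.log(findRule(rules, '/ping', 'GET').rule.response); // 'pong': exact string match
```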
We define a Mock object whose properties Mock.on and Mock.rules control the mock's enabled state and hold the route-interception configuration, respectively. Mock.Use, Mock.Restore, and Mock.Reset modify the enabled state, while Mock.Register and Mock.LoadMock are the interfaces that register route-interception rules and run the mock's main logic. Exported methods use UpperCamelCase; internal methods use lowerCamelCase.

Mock.LoadMock receives the request and response objects and inspects the request's method and path. Once a user-configured rule matches, the response data from the configuration is sent back to the user agent and the function returns immediately, short-circuiting the rest of the call chain; otherwise next() is executed to hand control to the next middleware. response can be configured as a function, so data can be generated dynamically or loaded remotely.

At this point we have the main logic for mocking data. So how exactly is this middleware registered? Consulting the documentation, we find that devServer.before exposes a hook that lets our own logic run before all other middleware, so we simply put the interception logic there.
```js
// webpack.config.dev.js
const MockUp = require('../mock/index');

devServer: {
    before: function(app, server, compiler) {
        MockUp.registerAll();
        // app is the underlying Express instance; register the mock middleware on it
        app.use(function(req, res, next) {
            MockUp.loadMock(req, res, next);
        });
    }
}
```
The MockUp.registerAll method called in devServer.before does not appear in our definition of Mock. To better decouple application logic from library logic, and to make the plugin easier to consume, we wrapped the Mock exports in an entry module. Its implementation is as follows:
```js
/**
 * @file mock/index.js
 * @description mock api entry
 * @author nimingdexiaohai([email protected])
 */
const Mock = require('./lib');
const User = require('./api/user');

const apiModels = [User];

module.exports = {
    registerAll: () => {
        apiModels.forEach(model => {
            Object.keys(model).forEach(key => {
                Mock.Register(model[key]);
            });
        });
    },
    loadMock: (req, res, next, app, server, compiler) => {
        Mock.LoadMock(req, res, next, app, server, compiler);
    }
};
```
At this point, the overall structure of the mock becomes clearer. It is good practice to define the rule configuration for each entity model as its own module under the mock folder. Of course, you can also split things further under a model, say into api/user/common.js and api/user/vip.js; everything is adjustable to your business needs.
Interface proxying

Interface proxying is a common development requirement, well served by several excellent community open-source plugins; the best known is http-proxy-middleware, which can be used directly. Because webpack-dev-server is Express-based, such a proxy could be registered in devServer's middleware chain the same way as the mock. Better still, devServer.proxy already integrates a proxy interface by default, so we can use it directly without introducing any third-party plugin; the relevant documentation can be found here. It works like this:
```js
/**
 * @file proxy.js
 *
 * @author nimingdexiaohai([email protected])
 * @see https://webpack.js.org/configuration/dev-server/#devserverproxy
 * @see https://github.com/chimurai/http-proxy-middleware
 */
module.exports = {
    '/api': {
        target: 'http://localhost:3000',
        changeOrigin: true,
        pathRewrite: { '^/api': '/path' }
    }
};
```
Bundle analysis

In many scenarios, the output bundle of a production release is too large, resulting in slow initial loading. Improving this generally involves several techniques: enabling gzip compression, using a CDN, and splitting the output bundles. Here we focus only on the third. Code splitting is a fairly broad topic with a variety of splitting principles and techniques, which will be explained in more detail in the production-environment configuration. Bundle size in production is strictly controlled, but the optimization path should be designed in advance, so we need a tool to analyze the size and structure of the packaged output. webpack-bundle-analyzer is a well-known open-source implementation that visually shows the size and dependencies of each output bundle.
```js
// webpack.config.dev.js
const webpack = require('webpack');
const { merge } = require('webpack-merge');
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');
const baseWebpackConfig = require('./webpack.config.base.js');

module.exports = merge(baseWebpackConfig, {
    plugins: [
        new BundleAnalyzerPlugin({
            // 'disabled' will turn the analyzer off
            analyzerMode: 'server',
        })
    ]
});
```
The plugin launches together with the dev script, starts a separate server, and displays each build bundle and its dependencies. Hover over a block to see its size information.
Other configuration

Besides the configuration above, the development environment also needs hot module replacement, source maps, and other features. The complete development configuration is as follows:
```js
/**
 * @file webpack.config.dev.js
 * @author nimingdexiaohai([email protected])
 */
const webpack = require('webpack');
const MockUp = require('../mock/index');
const proxy = require('../src/common/proxy');
const { merge } = require('webpack-merge');
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');
const baseWebpackConfig = require('./webpack.config.base.js');

module.exports = merge(baseWebpackConfig, {
    mode: 'development',
    devtool: 'eval-cheap-source-map',
    stats: 'minimal',
    plugins: [
        new BundleAnalyzerPlugin({
            // 'disabled' will turn the analyzer off
            analyzerMode: 'server',
        }),
        new webpack.HotModuleReplacementPlugin(),
    ],
    devServer: {
        hot: true,
        open: true,
        before: function(app, server, compiler) {
            MockUp.registerAll();
            // register the mock middleware on the Express app
            app.use(function(req, res, next) {
                MockUp.loadMock(req, res, next);
            });
        },
        // proxy: proxy
    }
});
```
webpack.HotModuleReplacementPlugin enables hot module replacement; the snippet above uses the current webpack ^5 style. devtool configures how source maps are generated: depending on its value, build speed and the granularity of the generated map information differ. The accepted values for this field have also changed in webpack ^5; see the official devtool documentation. The purpose of source maps is explained in detail in another article, Underscore.js Source Code Analysis (1), and will not be covered here.
The production configuration is relatively simple: it mainly covers emitting the output bundles plus a code-splitting configuration. However, so that the splitting principles can already be exercised during development, we extracted the code-splitting configuration into webpack.config.base.js.
```js
/**
 * @file webpack.config.base.js
 * @author nimingdexiaohai([email protected])
 */
const path = require('path');
const {CleanWebpackPlugin} = require('clean-webpack-plugin');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const config = require('./config');
const utils = require('./utils'); // assumed path: provides genFilePathWithName

module.exports = {
    ...
    plugins: [
        new CleanWebpackPlugin(),
        new MiniCssExtractPlugin({
            filename: utils.genFilePathWithName('[name].css')
        })
    ],
    optimization: {
        splitChunks: {
            cacheGroups: {
                vendor: {
                    test: /[\\/]node_modules[\\/]/,
                    name: 'vendor',
                    chunks: 'initial',
                    priority: -10
                },
                default: {
                    minChunks: 2,
                    priority: -20,
                    reuseExistingChunk: true
                }
            }
        }
    }
};
```
In optimization.splitChunks.cacheGroups, the priority field defines precedence: if a resource matches the test of several groups, the rule with the higher priority wins. Here all third-party dependencies under node_modules are packaged into vendor.js; chunks: 'initial' means only third-party modules shared with the entry file's dependencies are extracted. default is the fallback group, applied to modules that do not match the vendor group, and minChunks: 2 means a dependency is extracted into a separate bundle once it is referenced by two or more resource files. Run yarn dev to start the scaffolding and visit http://localhost:8080/webpack-dev-server to see the split bundle structure:
Of course, some of the files shown are generated by devServer to support runtime debugging, but that does not prevent us from evaluating the splitting effect of our configuration. For other code-splitting configuration options, see the code splitting guide and the SplitChunksPlugin documentation. What remains in the production configuration, then, is essentially a mode field identifying its environment:
```js
/**
 * @file webpack.config.prod.js
 * @author nimingdexiaohai([email protected])
 */
const config = require('./config');
const webpack = require('webpack');
const { merge } = require('webpack-merge');
const baseWebpackConfig = require('./webpack.config.base.js');

module.exports = merge(baseWebpackConfig, {
    mode: 'production',
    stats: 'minimal'
});
```
Unit testing

With the basic configuration of the development and production environments complete, we also need to integrate a testing framework to support unit tests for some of the core logic.
```json
"scripts": {
    "dev": "cross-env NODE_ENV=development webpack serve --config ./build/webpack.config.dev.js",
    "build": "cross-env NODE_ENV=production webpack --config ./build/webpack.config.prod.js --progress --color",
    "test": "cross-env NODE_ENV=test karma start test/karma.conf.js"
},
```
We chose Karma + Mocha + karma-spec-reporter for the testing setup: Karma runs the test cases, Mocha is a testing framework that supports both Node and browser environments, and karma-spec-reporter generates the test-result report. During integration, however, it turned out that karma-webpack does not yet support webpack ^5. So the test integration is not fully complete; since I didn't want to compromise and downgrade to webpack 4, I decided to wait for webpack ^5 support before going further. Progress can be tracked at github.com/ryanclark/k…
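Once karma-webpack gains webpack ^5 support, the configuration referenced by the test script would look roughly like this. A sketch under that assumption; the spec-file glob and browser choice are illustrative, not settled decisions:

```js
// test/karma.conf.js — a sketch; assumes karma, mocha, karma-mocha,
// karma-webpack (with webpack 5 support) and karma-spec-reporter are installed.
const webpackConfig = require('../build/webpack.config.base.js');

module.exports = function (config) {
    config.set({
        frameworks: ['mocha', 'webpack'],
        files: ['./specs/**/*.spec.js'],       // illustrative glob
        preprocessors: {
            './specs/**/*.spec.js': ['webpack'], // bundle specs before running
        },
        webpack: webpackConfig,
        reporters: ['spec'],
        browsers: ['ChromeHeadless'],
        singleRun: true,
    });
};
```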
【 summary 】
At this point, the basic scaffolding is almost complete. Welcome to communicate ~