This article will continue to be updated and is fairly long, so you may want to bookmark it. It summarizes the Lagou Education big front-end training camp.
Engineering Overview
Front-end engineering is the practice of using tools to improve efficiency and reduce costs while following a set of standards and specifications. It emerged mainly because, as front-end technology has developed, feature requirements have kept growing and business logic has become increasingly complex.
Problems solved by engineering
Technology exists to solve problems. Let's list the problems of front-end development. The first is environment compatibility:
- You want to use new ES6+ features, but run into compatibility issues
- You want to use Less/Sass/PostCSS to improve the programmability of CSS, but the runtime does not support them directly
- You want to use a modular approach to improve project maintainability, but the runtime environment does not directly support it
There are also repetitive labor costs and other problems during development:
- You need to manually compress code and resource files before deployment goes live
- The deployment process requires manual uploading of code to the server
- Manual, repeated operations are error-prone
- In multi-person collaborative development, code styles differ and code quality cannot be guaranteed
- Some feature development has to wait for back-end interfaces to be completed first
How engineering manifests itself
All means aimed at improving efficiency, reducing cost and ensuring quality belong to engineering. All repetitive tasks should be automated.
- Creating projects: for each project, once the technology stack is chosen, the project structure and specific types of files have to be created by hand
- Coding: format code, verify code style; Compile/build/package
- Preview/test: Web Server/Mock; Live Reloading/HMR; Source Map
- Commit: Git Hooks; lint-staged; continuous integration
- Deployment: CI/CD; Automatic release
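For the commit stage in particular, here is a minimal sketch of wiring a Git pre-commit hook to lint-staged in package.json (assuming husky v4 and ESLint are already installed; later husky versions use a different configuration format):
{
  "husky": {
    "hooks": {
      "pre-commit": "lint-staged"
    }
  },
  "lint-staged": {
    "*.js": ["eslint --fix", "git add"]
  }
}
With this in place, every git commit first runs ESLint over the staged JS files.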
Engineering is not a tool
Some tools at this stage are so powerful, webpack for example, that they can automate everything from project creation through coding, format checking, and compilation. This leads many newcomers to think webpack equals engineering. That is not the case: tools are not the core of engineering. The core of engineering is the planning or architecture of the project; tools are merely the means by which we implement that planning during development. Taking a simple project as an example, engineering should begin with an overall plan of the project's workflow and architecture: the organizational structure of the files; the source-code development paradigm, i.e. what syntax, specifications, and standards we write our code with; how the front end and back end are separated (Ajax, or a middle layer?); and so on. Only once that planning is settled do we consider which tools, with which configuration options, will implement it. There are also mature, off-the-shelf engineering integrations, such as create-react-app, Vue CLI, and Angular CLI, which package this kind of planning for you.
Engineering with Node.js
When it comes to engineering, Node.js is inescapable. Some people say Ajax brought new vitality to the front end; Node.js not only gave front-end JavaScript a new stage, but, more importantly, set off an industrial revolution in the whole front-end industry. It is fair to say there would be no modern front end without Node.js: almost all front-end tools are developed with Node.js, and front-end engineering is strongly driven by it.
Scaffolding tool
Scaffolding tools are the originator of front-end engineering. Their essence is to create the basic structure of a project and provide project specifications and conventions. In past project development, once the technology stack was confirmed, the project structure was almost always the same:
- Same organizational structure
- Same development paradigm
- Same module dependencies
- Same tool configuration
- Same basic code
If different projects choose the same technology stack, this means a lot of repetitive manual work. Scaffolding tools solve these problems.
Commonly used scaffolding tools
- React project: create-react-app
- Vue project: Vue CLI
- Angular project: Angular-CLI
However, these scaffolding tools are tied to a specific stack, which limits them. The other type is the general-purpose scaffolding tool, such as Yeoman, which generates the corresponding project structure from a template and is flexible and easily extensible.
Yeoman basic use
- Install yo globally
npm install yo --global
# or: yarn global add yo
- Install the corresponding generator
npm install generator-node --global
# or: yarn global add generator-node
- Run the generator through yo
mkdir my-module
cd my-module
yo node
Yeoman Sub Generator
Sometimes we don't need to create a complete project; instead, we need to add certain types of files to an existing project, such as a README, or an ESLint or Babel configuration file. These files contain boilerplate that is easy to get wrong when written by hand. We can improve development efficiency by letting generators produce them automatically, and Yeoman's Sub Generator feature does exactly that.
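For example, assuming the generator-node package installed earlier, its cli sub generator can be run inside an existing project (which sub generators exist depends on the generator itself):
cd my-module
yo node:cli
This adds command-line entry files to the existing project instead of scaffolding a whole new one.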
A custom Generator
Different generators generate different projects, which means we can create our own Generator to produce a custom project structure. Even though many generators already exist, creating our own is still worthwhile: in real development some base code and business code recurs, and similar projects repeat the same setup work, so we can bake those common parts into our own scaffolding Generator.
Creating a Generator Module
Generators have a specific structure: the Generator code lives under the app directory. If you need to provide more than one Sub Generator, create a new directory for it at the same level as the app directory. Note that a Yeoman Generator module must be named in the generator-[name] format; if you do not follow this naming convention, Yeoman will not be able to find your Generator later.
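A sketch of that layout (generator-sample and component are placeholder names; generators/app is Yeoman's conventional location for the default generator):
generator-sample/
├── generators/
│   ├── app/
│   │   └── index.js      # the default generator
│   └── component/
│       └── index.js      # an optional Sub Generator
└── package.json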
- A simple example
1. Create a generator-sample folder and create generators/app/index.js inside it. 2. Install yeoman-generator and add the code below to index.js. 3. Run yo sample in a new project directory; a temp.txt file is generated there.
const Generator = require('yeoman-generator');

module.exports = class extends Generator {
  writing () {
    this.fs.write(
      this.destinationPath('temp.txt'),
      Math.random().toString()
    )
  }
}
- The generator can also generate project files from templates (template files live in the generator's templates directory).
const Generator = require('yeoman-generator');

module.exports = class extends Generator {
  writing () {
    // Template file path
    const entry = this.templatePath('foo.txt');
    // Output destination path
    const output = this.destinationPath('foo.txt');
    // Template context
    const context = { title: 'hello world', success: false };
    this.fs.copyTpl(entry, output, context);
  }
}
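The template itself would sit at app/templates/foo.txt. Yeoman templates use EJS syntax, so a minimal sketch of its contents might be:
Title: <%= title %>
<% if (success) { %>build succeeded<% } %>
When copyTpl runs, the placeholders are replaced with the values from the context object.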
- Receiving user input
Yeoman provides a prompting() method, inside which this.prompt() can be called to issue command-line questions to the user.
const Generator = require('yeoman-generator');

module.exports = class extends Generator {
  prompting () {
    return this.prompt([
      {
        type: 'input',               // user interaction type
        name: 'name',                // the resulting key (available in EJS templates)
        message: 'Your project name', // question text
        default: this.appname        // appname is the name of the project's generation directory
      }
    ])
    .then(answers => {
      this.answers = answers
    })
  }
  writing () {
    // Template file path
    const entry = this.templatePath('bar.html');
    // Output destination path
    const output = this.destinationPath('bar.html');
    // Template context: the user's answers
    const context = this.answers;
    this.fs.copyTpl(entry, output, context);
  }
}
Introduction to Plop
Plop is a small but beautiful scaffolding tool. When developing a project, we often need to create different page files, and each page basically needs the same set of files (JS/CSS, etc.), each with some base code. Writing every page by hand is laborious and does not guarantee consistency; Plop solves exactly that problem.
- Method of use
1. Install plop as a development dependency. 2. Create a plop-templates folder and add the required template files. 3. Create a plopfile.js file in the project root and define the scaffolding task:
// Plop entry file; it needs to export a function
// The function receives a plop object used to create generator tasks
module.exports = plop => {
  plop.setGenerator('component', {
    description: 'create a component',
    prompts: [
      {
        type: 'input',
        name: 'name',
        message: 'component name',
        default: 'MyComponent'
      }
    ],
    actions: [
      {
        type: 'add',                                  // add a file
        path: 'src/components/{{name}}/{{name}}.js',  // target file
        templateFile: 'plop-templates/component.hbs'  // template file to use
      },
      {
        type: 'add',
        path: 'src/components/{{name}}/{{name}}.scss',
        templateFile: 'plop-templates/component.css.hbs'
      },
      {
        type: 'add',
        path: 'src/components/{{name}}/{{name}}.test.js',
        templateFile: 'plop-templates/component.test.hbs'
      }
    ]
  })
}
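With the task defined, it can be run through the plop CLI (assuming plop was installed as a dev dependency):
yarn plop component
Plop then asks the component-name question and generates the three files from the templates.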
How scaffolding works
- Ask the user questions from the command line
- Generate files based on the results of user answers
#!/usr/bin/env node

// Scaffolding working process:
// 1. Ask the user questions on the command line
// 2. Generate files based on the user's answers
const fs = require('fs')
const path = require('path')
const inquirer = require('inquirer')
const ejs = require('ejs')

inquirer.prompt([
  {
    type: 'input',
    name: 'name',
    message: 'Project name'
  }
]).then(answers => {
  // Template directory
  const tmplDir = path.join(__dirname, 'templates')
  // Target directory: the directory the command is run from
  const destDir = process.cwd()
  // Render every file under the template directory into the target directory
  fs.readdir(tmplDir, (err, files) => {
    if (err) throw err
    files.forEach(file => {
      // Render the file through the EJS template engine
      ejs.renderFile(path.join(tmplDir, file), answers, (err, result) => {
        if (err) throw err
        console.log(result)
        fs.writeFileSync(path.join(destDir, file), result)
      })
    })
  })
})
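For this entry file to run as a command, the package has to expose it through the bin field; a sketch of the scaffolding tool's own package.json (names and versions are placeholders):
{
  "name": "sample-scaffolding",
  "version": "0.1.0",
  "bin": "cli.js",
  "dependencies": {
    "inquirer": "^7.0.0",
    "ejs": "^3.0.0"
  }
}
After running npm link (or yarn link) in the tool's directory, running sample-scaffolding inside an empty project directory asks the question and renders the templates.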
Automated build
All repetitive tasks should be automated, including the build that converts source code into production code.
- Convert SCSS stylesheets to CSS using Sass
1. Install the sass dependency.
2. Run ./node_modules/.bin/sass scss/main.scss css/style.css (source file path first, then target file path).
3. Press enter; the corresponding CSS file is generated automatically.
The command-line approach above is not easy to use, and collaborators on the project will not know how to run your commands. NPM scripts solve this problem.
- Configure the conversion command with npm scripts
1. In the package.json file, add a scripts property that defines the conversion command:
"scripts": {
  "build": "sass scss/main.scss css/style.css --watch"
}
2. On the command line, run yarn build or npm run build to perform the conversion.
Common automated build tools
NPM scripts can only handle simple build tasks; for complex build scenarios they become a struggle, and more specialized build tools are needed. Grunt, Gulp and FIS are the common build tools on the market. (Where did webpack go? Webpack is a module bundler, which is outside the scope of this discussion.)
- Grunt
- One of the earliest front-end build systems. Its plugin ecosystem is so complete that, in the official wording, it can automate almost anything you want to do. However, because its workflow is based on temporary files, it performs many disk reads and writes, which makes builds slow.
- Gulp
- Gulp works in memory, which is faster than Grunt's disk reads and writes. It also supports running multiple build tasks concurrently, which is more efficient, and it is easier to understand than Grunt. Its plugin ecosystem is also very complete.
- FIS
- A build system launched by Baidu's front-end team. Compared with the micro-kernel design of the previous two, FIS is more like a bundled suite: it integrates typical project requirements internally, such as resource loading, modular development, deployment, and even performance optimization. Precisely because of this all-in-one character, FIS quickly became popular in China.
In general, FIS is more suitable for beginners, while the first two may suit you better if your requirements are flexible. Novices want rules; veterans want freedom.
Grunt
Basic use of Grunt
Grunt must be installed as a project dependency before use. After installation, create a gruntfile.js file in the project root as the grunt entry file.
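A sketch of the install and a first run (assuming yarn; npm install grunt --save-dev works the same way):
yarn add grunt --dev
# then, once the gruntfile below exists:
yarn grunt foo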
// Grunt entry file
// It is used to define the tasks grunt should run automatically
// It needs to export a function
// The function receives a grunt object whose API is used to create tasks
// Run a task with yarn grunt [taskName] or ./node_modules/.bin/grunt [taskName]
module.exports = grunt => {
  // The first argument is the task name
  grunt.registerTask('foo', () => {
    console.log('hello grunt')
  })
  // The second (optional) argument is the task description; run `yarn grunt --help` to see it
  grunt.registerTask('bar', 'Task description', () => {
    console.log('other task')
  })
  // A task named 'default' runs when no task name is given; it is usually used to bind other tasks
  // grunt.registerTask('default', 'task description', () => {
  //   console.log('default task')
  // })
  grunt.registerTask('default', ['foo', 'bar']);
  // Asynchronous tasks need the this.async() completion callback; `this` is required, so arrow functions cannot be used
  grunt.registerTask('async-task', 'Asynchronous task', function () {
    const done = this.async();
    setTimeout(() => {
      console.log('async task')
      done();
    }, 1000)
  });
}
Marking a Grunt task as failed
We can make a task fail by returning false inside it. In a task list, a failed task stops the subsequent tasks from running; we can force all tasks to run anyway by invoking grunt with the --force flag.
module.exports = grunt => {
  grunt.registerTask('failed', () => {
    console.log('fail task');
    return false;
  })
  grunt.registerTask('foo', () => {
    console.log('foo task');
  })
  grunt.registerTask('bar', () => {
    console.log('bar task');
  })
  grunt.registerTask('default', ['foo', 'failed', 'bar'])
}
Returning false does not mark an asynchronous task as failed; instead, call done(false):
module.exports = grunt => {
  grunt.registerTask('async-task', function () {
    const done = this.async();
    setTimeout(() => {
      console.log('async task')
      done(false)
    }, 1000)
  })
}
Grunt configuration methods
Grunt provides the initConfig method to initialize grunt's configuration. Inside a task, grunt.config(key) retrieves the configured value.
module.exports = grunt => {
  grunt.initConfig({
    foo: '123'
  })
  grunt.registerTask('foo', () => {
    console.log(grunt.config('foo'))
  })
}
module.exports = grunt => {
  grunt.initConfig({
    foo: {
      bar: '123'
    }
  })
  grunt.registerTask('foo', () => {
    console.log(grunt.config('foo.bar'))
  })
}
Grunt multi-target tasks
Grunt provides the registerMultiTask method to define multi-target tasks:
module.exports = grunt => {
grunt.initConfig({
build: {
options: {
foo: 'foo'
},
css: {
options: {
foo: 'bar'
}
},
js: '2'
}
})
grunt.registerMultiTask('build', function() {
console.log(this.options())
console.log(`target:${this.target} data: ${this.data}`)
})
}
Notes on multi-target tasks
- Each multi-target task needs a config of the same name in initConfig, and that config must be an object
- Every key in that object except options is treated as a target
- A target can define its own options, which override the task-level common options
- Inside registerMultiTask, this.options() returns the options, this.target is the current target name, and this.data is the target's data
Using Grunt plugins
Take grunt-contrib-clean as an example: install the plugin, configure the clean target in gruntfile.js, and then run yarn grunt clean.
module.exports = grunt => {
  grunt.initConfig({
    clean: 'temp/*.js'
  })
  grunt.loadNpmTasks('grunt-contrib-clean')
}
We load the plugin through grunt.loadNpmTasks and define the plugin's target in initConfig.
Grunt plugin and summary
- grunt-sass (compiles SCSS files to CSS)
- grunt-babel (compiles new JS features down to ES5)
- load-grunt-tasks (automatically loads the required plugins)
- grunt-contrib-watch (watches files and runs build tasks automatically when they change)
const sass = require('sass')
const loadGruntTasks = require('load-grunt-tasks');
module.exports = grunt => {
grunt.initConfig({
sass: {
options: {
implementation: sass,
sourceMap: true
},
main: {
files: {
'dist/css/main.css': 'src/scss/main.scss'
}
}
},
babel: {
options: {
presets: ['@babel/preset-env'],
sourceMap: true
},
main: {
files: {
'dist/js/app.js': 'src/js/app.js'
}
}
},
watch: {
js: {
files: ['src/js/*.js'],
tasks: 'babel'
},
css: {
        files: ['src/scss/*.scss'],
        tasks: 'sass'
      }
    }
  })
  // grunt.loadNpmTasks('grunt-sass')
  loadGruntTasks(grunt); // automatically loads all grunt plugins listed in package.json
  // Combine watch with the build tasks: watch alone does not build, it only rebuilds when files change
  grunt.registerTask('default', ['sass', 'babel', 'watch'])
}
Gulp
Basic use of Gulp
- Install gulp dependencies first
- Create gulpfile.js in the project root directory to write the build task
- Run the build task from the command line through gulp
// Gulp entry file
exports.foo = () => {
  console.log('foo')
}
Run yarn gulp foo on the command line. The foo task runs, but gulp reports an error roughly saying the task did not complete: we forgot to signal asynchronous completion. The latest gulp has removed the synchronous task mode, so every task is asynchronous by default, and we must signal completion manually. See the code below.
// Gulp entry file
exports.foo = done => {
console.log('foo')
done() // Indicate that the task is complete
}
The default task is named default. As with grunt, you do not need to specify its name; just run yarn gulp:
exports.default = done => {
console.log('default')
done()
}
Note that gulp before version 4.0 defines tasks in a different way. You need to manually import the gulp module and define tasks in gulp.task mode.
const gulp = require('gulp')
gulp.task('bar', done => {
console.log('bar')
done()
})
Gulp composite tasks
Gulp provides two functions, series (sequential execution) and parallel (concurrent execution), to combine the tasks we define.
const { series, parallel } = require("gulp");
const task1 = done => {
  setTimeout(() => {
    console.log('task1 work')
    done();
  }, 1000)
}

const task2 = done => {
  setTimeout(() => {
    console.log('task2 work')
    done();
  }, 1000)
}

const task3 = done => {
  setTimeout(() => {
    console.log('task3 work')
    done();
  }, 1000)
}
exports.foo = series(task1, task2, task3)
exports.bar = parallel(task1, task2, task3)
As you can see, tasks combined with series run one after another, while tasks combined with parallel run at the same time.
Gulp asynchronous task
Gulp supports the usual JavaScript asynchronous solutions: callback functions, promises, and async/await. In addition, gulp can treat a returned stream as an asynchronous operation.
- Callback functions: error-first; if one task fails, the subsequent tasks will not run
const { series } = require('gulp')
const callback_error = done => {
  console.log('callback work')
  done(new Error('task fail~'))
}

const bar = done => {
  console.log('bar work')
  done()
}
const hello = done => {
console.log('hello work')
done()
}
exports.foo = series(bar, callback_error, hello)
- Promise: return Promise.resolve() to signal the end of the task. resolve does not need a value; any value passed is ignored by gulp.
exports.promise = () => {
console.log('promise work')
return Promise.resolve()
}
Promise.reject marks the task as failed. Unlike resolve's value, the value passed to reject is not ignored: it is treated as the failure reason.
exports.promise_error = () => {
  console.log('promise fail')
  return Promise.reject(new Error('Failed'))
}
- Async/await: requires Node 8 or newer
const time = seconds => {
return new Promise(resolve => {
setTimeout(resolve, seconds)
})
}
exports.async = async () => {
  await time(1000)
  console.log('async work')
}
After running yarn gulp async, the output appears after 1 second.
- Gulp's own stream mode: returning a Node.js stream also signals completion, because gulp ends the task when the stream's end event fires. A corrected sketch (the file names are placeholders):
const fs = require('fs')

exports.stream = () => {
  const read = fs.createReadStream('package.json') // any existing file works
  const write = fs.createWriteStream('temp.txt')
  read.pipe(write)
  return read
}
After running, the task ends once the stream finishes writing (its end event fires).
Core principles of the gulp build process
The way gulp builds is very simple; it boils down to three steps:
- Read the file
- Transform (e.g. compress) the content
- Write the file
Before build tools, we would manually copy our source code into an online compression tool and then copy the result back into the target file. Doing this by hand is time-consuming and error-prone.
- A simple example: compressing a CSS file
const fs = require('fs');
const { Transform } = require('stream');
exports.default = () => {
  // File read stream
  const read = fs.createReadStream('normalize.css'); // the CSS file to compress; any path works
  // File transform stream
  const transform = new Transform({
    transform: (chunk, encoding, callback) => {
      // The core transformation happens here
      // chunk => the content read from the stream (a Buffer)
      const input = chunk.toString(); // convert the Buffer to a string for processing
      // Compress the CSS: strip whitespace and comments
      const output = input.replace(/\s+/g, '').replace(/\/\*.+?\*\//g, '');
      callback(null, output);
    }
  })
  // File write stream: where the compressed CSS goes
  const write = fs.createWriteStream('normalize.min.css');
  // Pipe the read stream through the transform stream into the write stream
  read
    .pipe(transform) // transform
    .pipe(write);    // write
  return read;
}
Gulp file manipulation API
In the example above we used the Node.js file APIs directly. The gulp module itself provides more powerful and easier-to-use APIs for reading and writing files, while the transform streams are mostly provided by independent plugins. A typical gulp build step therefore looks like this:
- First, create a file read stream with src
- Then, transform the stream with plugins
- Finally, create a file write stream with dest and write the result
const { src, dest } = require('gulp');
const cleanCSS = require('gulp-clean-css')
exports.default = () => {
  return src('src/*.css')   // read files: a single file or a wildcard pattern
    .pipe(cleanCSS())       // transform: compress the CSS
    .pipe(dest('dist'))     // write files into the target directory
}
Compared with the Node.js stream APIs, gulp provides a more powerful API with simpler code.
A gulp case study
Style compilation
const { src, dest } = require('gulp');
const sass = require('gulp-sass');
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// sass-- Convert SCSS files to CSS files. Setting outputStyle to expanded expands the content structure of CSS files
// dest-- writes to the file stream
const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

module.exports = {
  style
}
Script compilation
const { src, dest } = require('gulp');
const babel = require('gulp-babel');
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// babel -- compile new ES6+ features down to ES5; the presets option decides what gets converted, and without it the code passes through unchanged
// dest-- writes to the file stream
const script = () => {
  return src('src/assets/scripts/*.js', { base: 'src' })
    .pipe(babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('dist'))
}

module.exports = {
  script
}
Page template compilation
const { src, dest } = require('gulp');
const swig = require('gulp-swig');
const data = {
menus: [
{
name: 'Home',
icon: 'aperture',
link: 'index.html'
},
{
name: 'Features',
link: 'features.html'
},
{
name: 'About',
link: 'about.html'
},
{
name: 'Contact',
link: '#',
children: [
{
name: 'Twitter',
link: 'https://twitter.com/w_zce'
},
{
name: 'About',
link: 'https://weibo.com/zceme'
},
{
name: 'divider'
},
{
name: 'About',
link: 'https://github.com/zce'
}
]
}
],
pkg: require('./package.json'),
date: new Date()
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// Swig -- Conversion stream, here I use the page template swig, so I use the swig plugin, data is the data passed into the template
// dest-- writes to the file stream
const page = () => {
return src('src/*.html', { base: 'src' })
.pipe(swig({ data }))
.pipe(dest('dist'));
}
module.exports = {
page
}
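For reference, a sketch of how a swig template might consume this data (the markup is illustrative, not from the original project):
<!-- src/index.html (sketch) -->
<title>{{ pkg.name }}</title>
<ul>
  {% for item in menus %}
  <li><a href="{{ item.link }}">{{ item.name }}</a></li>
  {% endfor %}
</ul>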
Combining tasks with gulp's parallel
As mentioned earlier, both series and parallel combine tasks. In our case the three tasks are independent of each other, so we combine them with parallel.
const { src, dest, parallel } = require('gulp');
const sass = require('gulp-sass');
const babel = require('gulp-babel');
const swig = require('gulp-swig');
const data = {
menus: [
{
name: 'Home',
icon: 'aperture',
link: 'index.html'
},
{
name: 'Features',
link: 'features.html'
},
{
name: 'About',
link: 'about.html'
},
{
name: 'Contact',
link: '#',
children: [
{
name: 'Twitter',
link: 'https://twitter.com/w_zce'
},
{
name: 'About',
link: 'https://weibo.com/zceme'
},
{
name: 'divider'
},
{
name: 'About',
link: 'https://github.com/zce'
}
]
}
],
pkg: require('./package.json'),
date: new Date()
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// sass-- Convert SCSS files to CSS files. Setting outputStyle to expanded expands the content structure of CSS files
// dest-- writes to the file stream
const style = () => {
return src('src/assets/styles/*.scss', { base: 'src' })
.pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// babel -- compile new ES6+ features down to ES5; the presets option decides what gets converted, and without it the code passes through unchanged
// dest-- writes to the file stream
const script = () => {
return src('src/assets/scripts/*.js', { base: 'src' })
.pipe(babel({presets: ['@babel/preset-env']}))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// Swig -- Conversion stream, here I use the page template swig, so I use the swig plugin, data is the data passed into the template
// dest-- writes to the file stream
const page = () => {
return src('src/*.html', { base: 'src' })
.pipe(swig({ data }))
.pipe(dest('dist'));
}
// Combine tasks
const compile = parallel(style, script, page)
module.exports = {
style,
script,
page,
compile
}
Image and font file conversion
The image files in our project carry some extra data (such as metadata) that is not needed in production, so we can use the gulp-imagemin plugin to strip it and compress our images losslessly.
const { src, dest } = require('gulp');
const imagemin = require('gulp-imagemin');
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const image = () => {
return src('src/assets/images/**', { base: 'src' })
.pipe(imagemin())
.pipe(dest('dist'));
}
module.exports = {
image
}
For fonts there is normally nothing to do except copy them to the target directory, but some font directories also contain SVG files, so we can run imagemin over font files as well.
const { src, dest } = require('gulp');
const imagemin = require('gulp-imagemin');
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const font = () => {
return src('src/assets/fonts/**', { base: 'src' })
.pipe(imagemin())
.pipe(dest('dist'));
}
module.exports = {
font
}
Gulp Other files and file clearing
A project also contains files such as the favicon that are not part of the source code. Such files are usually kept in the public folder and just need to be copied to the target directory:
const { src, dest, parallel } = require('gulp');
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// dest-- writes to the file stream
const extra = () => {
return src('public/**', { base: 'public' })
.pipe(dest('dist'));
}
module.exports = {
extra
}
Before each build, we need to delete the existing dist directory. We can use the del module to define a clean task that deletes it:
const { src, dest, parallel, series } = require('gulp');
const sass = require('gulp-sass');
const babel = require('gulp-babel');
const swig = require('gulp-swig');
const imagemin =require('gulp-imagemin');
const del = require('del')
const data = {
menus: [
{
name: 'Home',
icon: 'aperture',
link: 'index.html'
},
{
name: 'Features',
link: 'features.html'
},
{
name: 'About',
link: 'about.html'
},
{
name: 'Contact',
link: '#',
children: [
{
name: 'Twitter',
link: 'https://twitter.com/w_zce'
},
{
name: 'About',
link: 'https://weibo.com/zceme'
},
{
name: 'divider'
},
{
name: 'About',
link: 'https://github.com/zce'
}
]
}
],
pkg: require('./package.json'),
date: new Date()
}
const clean = () => {
  return del(['dist'])
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// sass-- Convert SCSS files to CSS files. Setting outputStyle to expanded expands the content structure of CSS files
// dest-- writes to the file stream
const style = () => {
return src('src/assets/styles/*.scss', { base: 'src' })
.pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// babel -- compile new ES6+ features down to ES5; the presets option decides what gets converted, and without it the code passes through unchanged
// dest-- writes to the file stream
const script = () => {
return src('src/assets/scripts/*.js', { base: 'src' })
.pipe(babel({presets: ['@babel/preset-env']}))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// Swig -- Conversion stream, here I use the page template swig, so I use the swig plugin, data is the data passed into the template
// dest-- writes to the file stream
const page = () => {
return src('src/*.html', { base: 'src' })
.pipe(swig({ data }))
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const image = () => {
return src('src/assets/images/**', { base: 'src' })
.pipe(imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const font = () => {
return src('src/assets/fonts/**', { base: 'src' })
.pipe(imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// dest-- writes to the file stream
const extra = () => {
return src('public/**', { base: 'public' })
.pipe(dest('dist'));
}
// Combine tasks
const compile = parallel(style, script, page, image, font)
// Series is used here to combine clean with other tasks, because you need to wait for the file to be deleted before building, not at the same time
const build = series(clean, parallel(compile, extra));
module.exports = {
style,
script,
page,
image,
font,
extra,
compile,
build
}
Autoloading plug-in
Gulp has a gulp-load-plugins plugin. Once it is imported, it loads the other gulp plugins for us automatically, so we no longer import each one manually. Note that plugin names are camel-cased on the returned object: gulp-clean-css, for example, becomes plugins.cleanCss.
const { src, dest, parallel, series } = require('gulp');
const loadPlugins = require('gulp-load-plugins');
const del = require('del');
const plugins = loadPlugins();
const data = {
menus: [
{
name: 'Home',
icon: 'aperture',
link: 'index.html'
},
{
name: 'Features',
link: 'features.html'
},
{
name: 'About',
link: 'about.html'
},
{
name: 'Contact',
link: '#',
children: [
{
name: 'Twitter',
link: 'https://twitter.com/w_zce'
},
{
name: 'About',
link: 'https://weibo.com/zceme'
},
{
name: 'divider'
},
{
name: 'About',
link: 'https://github.com/zce'
}
]
}
],
pkg: require('./package.json'),
date: new Date()
}
const clean = () => {
  return del(['dist'])
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// sass-- Convert SCSS files to CSS files. Setting outputStyle to expanded expands the content structure of CSS files
// dest-- writes to the file stream
const style = () => {
return src('src/assets/styles/*.scss', { base: 'src' })
.pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// babel -- compile new ES6+ features down to ES5; the presets option decides what gets converted, and without it the code passes through unchanged
// dest-- writes to the file stream
const script = () => {
return src('src/assets/scripts/*.js', { base: 'src' })
.pipe(plugins.babel({presets: ['@babel/preset-env']}))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// Swig -- Conversion stream, here I use the page template swig, so I use the swig plugin, data is the data passed into the template
// dest-- writes to the file stream
const page = () => {
return src('src/*.html', { base: 'src' })
.pipe(plugins.swig({ data }))
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const image = () => {
return src('src/assets/images/**', { base: 'src' })
.pipe(plugins.imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const font = () => {
return src('src/assets/fonts/**', { base: 'src' })
.pipe(plugins.imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// dest-- writes to the file stream
const extra = () => {
return src('public/**', { base: 'public' })
.pipe(dest('dist'));
}
// Combine tasks
const compile = parallel(style, script, page, image, font)
// Series is used here to combine clean with other tasks, because you need to wait for the file to be deleted before building, not at the same time
const build = series(clean, parallel(compile, extra));
module.exports = {
style,
script,
page,
image,
font,
extra,
compile,
build
}
Gulp hot update development server
The browser-sync module lets us start a server locally and debug pages conveniently.
const { src, dest, parallel, series } = require('gulp');
const loadPlugins = require('gulp-load-plugins');
const browserSync = require('browser-sync');
const del = require('del');
const plugins = loadPlugins();
const bs = browserSync.create(); // Create a service using the browser-sync create method, and initialize the service in the serve task by init
const data = {
menus: [
{
name: 'Home',
icon: 'aperture',
link: 'index.html'
},
{
name: 'Features',
link: 'features.html'
},
{
name: 'About',
link: 'about.html'
},
{
name: 'Contact',
link: '#',
children: [
{
name: 'Twitter',
link: 'https://twitter.com/w_zce'
},
{
name: 'About',
link: 'https://weibo.com/zceme'
},
{
name: 'divider'
},
{
name: 'About',
link: 'https://github.com/zce'
}
]
}
],
pkg: require('./package.json'),
date: new Date()
}
const clean = () => {
  return del(['dist'])
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// sass-- Convert SCSS files to CSS files. Setting outputStyle to expanded expands the content structure of CSS files
// dest-- writes to the file stream
const style = () => {
return src('src/assets/styles/*.scss', { base: 'src' })
.pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// babel -- compile new ES6+ features down to ES5; the presets option decides what gets converted, and without it the code passes through unchanged
// dest-- writes to the file stream
const script = () => {
return src('src/assets/scripts/*.js', { base: 'src' })
.pipe(plugins.babel({presets: ['@babel/preset-env']}))
    .pipe(dest('dist'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// Swig -- Conversion stream, here I use the page template swig, so I use the swig plugin, data is the data passed into the template
// dest-- writes to the file stream
const page = () => {
return src('src/*.html', { base: 'src' })
.pipe(plugins.swig({ data }))
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const image = () => {
return src('src/assets/images/**', { base: 'src' })
.pipe(plugins.imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const font = () => {
return src('src/assets/fonts/**', { base: 'src' })
.pipe(plugins.imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// dest-- writes to the file stream
const extra = () => {
return src('public/**', { base: 'public' })
.pipe(dest('dist'));
}
// The dev server
const serve = () => {
  bs.init({
    notify: false, // suppress the browser-sync "connected" notification on startup
    port: '2080', // server port
    open: false, // do not open the browser automatically when the server starts
    files: 'dist/**', // when these files change, the page refreshes automatically (the hot-update effect)
    server: {
      baseDir: 'dist', // the directory the server serves
      // routes take precedence over baseDir
      routes: {
        // dist contains no node_modules, so map /node_modules requests in the HTML to the project root's node_modules
        '/node_modules': 'node_modules'
      }
    }
  })
}

// Combine tasks
const compile = parallel(style, script, page, image, font)
// Series is used here to combine clean with other tasks, because you need to wait for the file to be deleted before building, not at the same time
const build = series(clean, parallel(compile, extra));
module.exports = {
style,
script,
page,
image,
font,
extra,
serve,
compile,
build
}
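With the serve task exported, the development server can be started like this (assuming gulp is installed locally):
yarn gulp serve
Then open http://localhost:2080; thanks to the files option, any change under dist triggers a browser refresh.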
Gulp file watching and build optimization
Above, changes inside dist updated the page. Next, we make changes under src update the page automatically. Gulp provides a watch API that monitors file changes; we use watch to run the corresponding build task whenever a file changes, so editing files under src refreshes the page automatically. The listing below also optimizes the build: style, script and page output now goes to a temp directory, useref post-processes it into dist, and the image, font and extra tasks run only for release builds.
const { src, dest, parallel, series, watch } = require('gulp');
const loadPlugins = require('gulp-load-plugins');
const browserSync = require('browser-sync');
const del = require('del');
const plugins = loadPlugins();
const bs = browserSync.create(); // Create a service using the browser-sync create method, and initialize the service in the serve task by init
const data = {
menus: [
{
name: 'Home',
icon: 'aperture',
link: 'index.html'
},
{
name: 'Features',
link: 'features.html'
},
{
name: 'About',
link: 'about.html'
},
{
name: 'Contact',
link: '#',
children: [
{
name: 'Twitter',
link: 'https://twitter.com/w_zce'
},
{
name: 'About',
link: 'https://weibo.com/zceme'
},
{
name: 'divider'
},
{
name: 'About',
link: 'https://github.com/zce'
}
]
}
],
pkg: require('./package.json'),
date: new Date()
}
const clean = () => {
  return del(['dist'])
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// sass-- Convert SCSS files to CSS files. Setting outputStyle to expanded expands the content structure of CSS files
// dest-- writes to the file stream
const style = () => {
return src('src/assets/styles/*.scss', { base: 'src' })
.pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest('temp'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// babel -- compile new ES6+ features down to ES5; the presets option decides what gets converted, and without it the code passes through unchanged
// dest-- writes to the file stream
const script = () => {
return src('src/assets/scripts/*.js', { base: 'src' })
.pipe(plugins.babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('temp'))
}

// src -- read stream; base preserves the directory structure after base when writing to the destination
// Swig -- Conversion stream, here I use the page template swig, so I use the swig plugin, data is the data passed into the template
// dest-- writes to the file stream
const page = () => {
return src('src/*.html', { base: 'src' })
.pipe(plugins.swig({ data }))
.pipe(dest('temp'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const image = () => {
return src('src/assets/images/**', { base: 'src' })
.pipe(plugins.imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// imagemin-- Convert the stream to remove the binary data in the file
// dest-- writes to the file stream
const font = () => {
return src('src/assets/fonts/**', { base: 'src' })
.pipe(plugins.imagemin())
.pipe(dest('dist'));
}
// SRC -- reads the file stream. The base option preserves the directory structure after base when writing the file to the destination
// dest-- writes to the file stream
const extra = () => {
return src('public/**', { base: 'public' })
.pipe(dest('dist'));
}
// Process build references and bundle third-party files
const useref = () => {
  return src('temp/*.html', { base: 'temp' })
    .pipe(plugins.useref({ searchPath: ['temp', '.'] })) // the stream now carries html, js and css files
    // Compress each file type with the matching plugin
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({
      collapseWhitespace: true,
      minifyCSS: true,
      minifyJS: true
    })))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(dest('dist')); // write to dist rather than temp, to avoid reading and writing the same directory at once
}
// The dev server
const serve = () => {
  watch('src/assets/styles/*.scss', style)
  watch('src/assets/scripts/*.js', script)
  watch('src/*.html', page)
  // For images/fonts/public, just reload the browser on change instead of rebuilding
  watch([
    'src/assets/images/**',
    'src/assets/fonts/**',
    'public/**'
  ], bs.reload)
  bs.init({
    notify: false, // suppress the browser-sync "connected" notification on startup
    port: '2080', // server port
    // open: false, // do not open the browser automatically when the server starts
    // files: 'dist/**', // alternatively, let browser-sync watch the output files itself
    server: {
      baseDir: ['temp', 'src', 'public'], // if a requested file is not found in temp, the next directories are searched in order
      // routes take precedence over baseDir
      routes: {
        // temp contains no node_modules, so map /node_modules requests to the project root's node_modules
        '/node_modules': 'node_modules'
      }
    }
  })
}

// Combine tasks
const compile = parallel(style, script, page)
// series combines clean with the rest because files must be deleted before building, not concurrently
const build = series(clean, parallel(series(compile, useref), image, font, extra)); // the task run before release
const develop = series(compile, serve);

// Only these three tasks need to be exposed in real development. It is also worth
// mirroring them in the scripts field of package.json so they can be run with yarn directly.
module.exports = {
  clean,
  develop,
  build
}
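One detail worth noting: gulp-useref decides what to concatenate from build comments in the HTML. A sketch of what such comments look like (the paths are placeholders for whatever the pages actually reference):
<!-- build:css assets/styles/vendor.css -->
<link rel="stylesheet" href="/node_modules/bootstrap/dist/css/bootstrap.css">
<!-- endbuild -->
<!-- build:js assets/scripts/vendor.js -->
<script src="/node_modules/jquery/dist/jquery.js"></script>
<!-- endbuild -->
As the final comment in the gulpfile suggests, the exposed tasks can be mirrored in package.json scripts (a sketch, assuming gulp is installed locally):
"scripts": {
  "clean": "gulp clean",
  "serve": "gulp develop",
  "build": "gulp build"
}
With this, other developers only need yarn clean, yarn serve, or yarn build.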