Gulp

Features (compared with Grunt): efficient and easy to use

I. Basic use

  1. Install gulp dependencies
yarn add gulp -D
  2. Create the gulp entry file gulpfile.js in the project root directory

  3. Run tasks with yarn gulp

# Default task
yarn gulp
# Named task
yarn gulp <task name>
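A minimal gulpfile.js sketch for these steps; the hello task name is just an example:

// gulpfile.js
// Every exported function becomes a runnable task
const defaultTask = done => {
  console.log('runs with `yarn gulp`')
  done()
}

const hello = done => {
  console.log('runs with `yarn gulp hello`')
  done()
}

exports.default = defaultTask
exports.hello = hello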

II. Creating composite tasks with Gulp

const { series, parallel } = require('gulp')

// Series creates a serial task, while parallel creates a parallel task.
// Both series and parallel are functions.

const task1 = done => {
  setTimeout(() => {
    console.log('task1 executes')
    done()
  }, 1000);
}
const task2 = done => {
  setTimeout(() => {
    console.log('task2 executes')
    done()
  }, 1000);
}
const task3 = done => {
  setTimeout(() => {
    console.log('task3 executes')
    done()
  }, 1000);
}

exports.seriesTask = series(task1, task2, task3) // Execute in sequence
exports.parallelTask = parallel(task1, task2, task3) // Execute simultaneously

III. Gulp asynchronous tasks

Three ways to complete an asynchronous task

const { series, parallel } = require('gulp')
const fs = require('fs')
// Tasks in Gulp are asynchronous tasks
// So how do we notify gulp that the current asynchronous task has finished?

// 1. Notify gulp that the task is over using the done callback parameter

exports.callback = done => {
  console.log('Callback task executing')
  done()
}

exports.callback_err = done => {
  console.log('callback_err task executing')
  done(new Error('task failed'))
}


// 1-1. Gulp tasks are error-first: if a task reports an error, the tasks after it in a series will not be executed.
exports.task = series(done => {
  console.log('Callback task executing')
  // Report an error to gulp
  done(new Error('hhh'))
}, done => {
  console.log('callback_err') // Will not be executed
  done()
})

// 2. Gulp also supports Promises: returning a fulfilled or rejected Promise from the task function also lets gulp know the current task is over.

// fulfilled
exports.promise = () => {
  console.log('Promise task executing')
  return Promise.resolve() // When the promise resolves, gulp knows the current task is complete
}

// A rejected promise reports an error, and subsequent tasks will not be executed.
exports.promise_err = () => {
  console.log('Promise_err task executing')
  return Promise.reject() // When the promise rejects, gulp reports the failure
}

// async/await (Promise syntactic sugar) can also tell gulp whether the task is finished

// Successful task
exports.async = async () => {
  await console.log('Async task executing')
}

// Failed task
exports.async_err = async () => {
  await console.log('Async_err task executing')
  throw new Error('hhhh')
}

// 3. Returning a stream: a file stream created in Node.js can be returned from the task function.
// After receiving the returned stream, gulp listens for its 'end' event to know the current task is complete.

exports.stream = () => {
  const readStream = fs.createReadStream('./package.json')

  return readStream
}

IV. The core working principle of a Gulp build

Whether it’s Gulp or any other automated build tool, the core workings of a build are based on the following steps

const { series, parallel } = require('gulp')
const fs = require('fs')
const { Transform } = require('stream')

// The core Gulp build process
// File compression, CSS, and Babel compilation are all dependent on the following process
// read file => process file => write file

// Example: simulate file compression process
// Compress the package.json file and output it to dist directory.
exports.compression = () => {
  // 1. Read the file stream
  const readStream = fs.createReadStream('./package.json')

  // Create a write stream
  const writeStream = fs.createWriteStream('./dist/package.json')


  // 2. Use the Stream module Transform to process the file Stream
  const transform = new Transform({
    transform(chunk, encoding, callback) {
      // Chunk is the content of the stream read in Buffer format
      console.log(chunk, 'Buffer')
      // Convert the byte stream to string
      let input = chunk.toString()
      console.log(input)
      // Transform the content into the desired output (strip all whitespace)
      const output = input.replace(/\s+/g, '')

      // Pass the result through the callback function
      // The first argument is the error object and the second is the output
      callback(null, output)
    }
  })

  // 3. Write the content
  readStream.pipe(transform).pipe(writeStream)
  return readStream
}

Result: the whitespace-stripped content of package.json is written to the dist directory.

V. Gulp file operation APIs

1. The difference between readFile/writeFile and createReadStream/createWriteStream in Node.js's fs module

readFile/writeFile read the entire file into memory at once and write it out again from memory, which is no problem for small files. But for binary content such as video and audio, sizes of several GB are normal; reading a file that large straight into memory is a bad idea, because the operating system allocates limited memory to each process, and an oversized read will blow it up. createReadStream/createWriteStream read and write as a stream instead: read a chunk, write a chunk, and given enough time the whole file gets through.
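A minimal sketch of the two approaches (the file names are placeholders):

const fs = require('fs')

// readFile/writeFile: the whole file is held in memory at once, which is fine for small files
fs.readFile('./small.txt', (err, data) => {
  if (err) throw err
  fs.writeFile('./small-copy.txt', data, err => {
    if (err) throw err
  })
})

// createReadStream/createWriteStream: the file flows through chunk by chunk,
// so memory usage stays bounded even for very large files
fs.createReadStream('./huge-video.mp4')
  .pipe(fs.createWriteStream('./huge-video-copy.mp4'))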

2. API for file operations

Gulp also provides its own API for manipulating file streams, which is more powerful than nodeJS’s API for manipulating file streams.

const { src, dest } = require('gulp')
const cleanCss = require('gulp-clean-css')
const rename = require('gulp-rename')
// Gulp's file operation APIs
// The src method creates a read stream
// The dest method creates a write stream
// File processing can be done directly with third-party plugins, or you can build your own transform stream with new Transform()

// Example: simulate a file compression process
// Compress files such as page.css and index.js and output them to the dist directory.
// gulp-clean-css compresses the file; gulp-rename renames it

exports.default = () => {
  // All CSS files can be read at once using a wildcard
  return src('./css/*.css')
    .pipe(cleanCss())
    .pipe(rename({ extname: '.min.css' }))
    .pipe(dest('dist'))
}

VI. A complete automated build workflow for a web application with Gulp

**Essentially all Gulp plugins are functions that return a file transform stream.** The GitHub repository for the following examples: github.com/zce/zce-gul…
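A minimal sketch of that idea: a hand-written "plugin" that strips whitespace (not a published package):

const { Transform } = require('stream')

// A gulp plugin is essentially a function that returns an object-mode Transform stream,
// receiving and pushing Vinyl file objects.
const stripWhitespace = () => new Transform({
  objectMode: true,
  transform(file, encoding, callback) {
    if (file.isBuffer()) {
      const contents = file.contents.toString()
      file.contents = Buffer.from(contents.replace(/\s+/g, ''))
    }
    callback(null, file) // Pass the file on to the next stream in the pipeline
  }
})

It would be used like any other plugin: src('src/*.css').pipe(stripWhitespace()).pipe(dest('dist')).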

1. Style build

// Gulp entry file
// Basically all plugins in Gulp are functions that return a file conversion stream
const { src, dest } = require('gulp')
const sass = require('gulp-sass')
// 1. The style build task
// Install the sass transform stream plugin: gulp-sass

const style = () => {
  // The base option keeps the path segments after it: reading src/assets/styles/*.scss with
  // base: 'src' writes the output to dist/assets/styles/...
  return src('src/assets/styles/*.scss', { base: 'src' })
    // gulp-sass skips files whose names start with _ (sass partials).
    // outputStyle: 'expanded' expands the style braces onto their own lines
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

module.exports = {
  style,
}

2. Compile the script file

const babel = require('gulp-babel')
// Only the core code is shown
// 2. Compile the scripts
// Install the script transform stream plugin:
// gulp-babel is only a wrapper around the transform stream; it calls the core Babel modules to transpile the files,
// so @babel/core and @babel/preset-env also need to be installed.
// The difference between @babel/core and @babel/preset-env:
// @babel/core is the transpilation platform: it provides the environment for the conversion, while the actual
// syntax transforms rely on other Babel plugins/presets such as @babel/preset-env.
// @babel/preset-env covers the latest ECMAScript syntax features and converts all the new syntax in the code.

const script = () => {
  // 1. Create the file read/write streams
  return src('src/assets/scripts/*.js', { base: 'src' })
    .pipe(babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('dist'))
}

3. Compile the page file

gulp-swig compiles the HTML templates with the configured data, filling in placeholders such as {{ pkg.name | upper }} shown in the figure below.

const swig = require('gulp-swig')
// 3. Compile the pages
// Install gulp-swig to compile the template files
const page = () => {
  // 1. Create the read/write streams
  // ** matches any subdirectory; e.g. src/**/*.html reads every HTML file in any subdirectory of src
  return src('src/*.html', { base: 'src' })
    .pipe(swig(require('./swig-data.js')))
    .pipe(dest('dist'))
}

// 4. Combine the style, page and script build tasks.
// With parallel, the tasks do not depend on each other and run concurrently
const compile = parallel(script, page, style)

The compiled result after running

4. Image and font file conversion

// 5. Image and font icon build tasks
// Install the gulp-imagemin transform stream
const image = () => {
  // Create the read/write streams
  // src/assets/images/** matches every file under src/assets/images/
  return src('src/assets/images/**', { base: 'src' })
    .pipe(imagemin())
    .pipe(dest('dist'))
}

const font = () => {
  // Create the read/write streams
  // src/assets/fonts/** matches every file under src/assets/fonts/
  return src('src/assets/fonts/**', { base: 'src' })
    .pipe(imagemin())
    .pipe(dest('dist'))
}

5. Other files and cleaning the output directory

const del = require('del')
// 6. Copy public files into the dist directory && delete the dist directory before each build
const extra = () => {
  // Create the read/write streams
  return src('public/**', { base: 'public' })
    .pipe(dest('dist'))
}

// The delete task (del returns a Promise, so gulp knows when it finishes)
const deleteDist = () => del('dist')

// The combined build task
const build = series(deleteDist, parallel(extra, compile))

module.exports = {
  build,
}

6. Gulp automatically loads plug-ins

As the build becomes more complex and the plugin requires become repetitive, you can use gulp-load-plugins to automatically load all plugins whose names start with gulp-.

How the plug-in works:

  • The method scans package.json, automatically requires every dependency whose name starts with gulp-, and returns an object containing those plugins.
  • Taking gulp-sass as an example, the entry in the returned object is { sass: sass }.
  • Note that multi-word plugin names are camelCased: a plugin named like gulp-load-plugins would appear as { loadPlugins: loadPlugins }.

Code:

// Automatically load plugins
// 1. The package exports a function
const autoLoadPlugins = require('gulp-load-plugins')
// 2. Calling the function returns an object containing all the gulp- prefixed plugins
const plugins = autoLoadPlugins()

Then replace the previous plugin references, as sketched below.
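A sketch of the style task after the switch, assuming the same task as before:

const { src, dest } = require('gulp')
const loadPlugins = require('gulp-load-plugins')
const plugins = loadPlugins()

// gulp-sass is now reached as plugins.sass, gulp-clean-css as plugins.cleanCss, and so on
const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })
    .pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}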

7. Hot update development server

With the browser-sync package we get a development server with hot updates

For details about browser-sync configurations, see the official website

1. Perform basic configurations

// Hot-update development server
// 1. Require browser-sync and call browserSync.create() to create a server
const browserSync = require('browser-sync')
const bs = browserSync.create()
// 2. Create the serve task

const serve = () => {
  // Initialize the web server configuration
  bs.init({
    server: {
      // The root directory of the website
      baseDir: 'dist',
      // The entry file / home page (defaults to index.html)
      index: 'index.html'
    }
  })
}
# Start the server
yarn gulp serve

The website will automatically open

2. Map requested resources to node_modules in the current directory

The requested bootstrap.css cannot be found, so no styles are applied. Add the following configuration:

const serve = () => {
  // Initialize the web server configuration
  bs.init({
    server: {
      // The root directory of the website
      baseDir: 'dist',
      // routes: requests such as /node_modules/... are resolved from the node_modules folder
      // relative to the current working directory
      routes: {
        '/node_modules': 'node_modules'
      }
    }
  })
}

3. Hot update

Hot updates are implemented through the files configuration

const serve = () => {
  // Initialize the web server configuration
  bs.init({
    server: {
      // The root directory of the website
      baseDir: 'dist',
      index: 'index.html',
      // routes: requests such as /node_modules/... are resolved from the node_modules folder
      // relative to the current working directory
      routes: {
        '/node_modules': 'node_modules'
      }
    },
    // Files Browsersync should watch while it runs
    // Type: Array | String
    files: 'dist/**'
  })
}

At this point you may notice a problem: the browserSync server watches the dist directory, but dist contains the packaged files, while what we actually edit are the source files under src. The real requirement is that when resources under src change, the build should run again, and when the compiled output changes, the server should update.

4. Watch the src files

Gulp provides a watch() API for this. Its parameters are:

  • The first parameter: the path(s) to watch, as a string or an array
  • Options: see the official docs
  • The task to run when the watched files change
const { watch } = require('gulp')
const serve = () = > {
  // Watch for resource changes under src and run the corresponding build tasks
  watch('src/assets/styles/*.scss', style) 
  watch('src/assets/scripts/*.js', script) 
  watch('src/*.html', page) 
  watch('src/assets/images/**', image) 
  watch('src/assets/fonts/**', font) 
  watch('public/**', extra) 

  // Initialize the web server configuration
  bs.init({
    server: {
      // The root directory of the website
      baseDir: 'dist',
      index: 'index.html',
      // routes: requests such as /node_modules/... are resolved from the node_modules folder
      // relative to the current working directory
      routes: {
        '/node_modules': 'node_modules'
      }
    },
    // Files Browsersync should watch while it runs
    // Type: Array | String
    files: 'dist/**'
  })
}

8. Optimization of build tasks

There is a problem with the changes above: if we modify an image under src, the page does not update, because the image has not been repackaged into dist. The solution: let the web server also look for resources under src, watch the src files for changes, and call browserSync's reload method. reload notifies all connected browsers that files have changed, which either refreshes the browser or injects the changed files, updating the page in real time. In other words, distinguish the development environment from the production environment to optimize packaging speed.

PS: the following is just to understand the various ways hot update can be implemented. Instead of using browserSync's files option to watch everything under the dist directory, you can use gulp's watch method to watch the src files for changes, run the corresponding packaging task, and finally call browserSync's reload method to notify all browsers that the relevant files have changed, which either refreshes the browser or injects the files to update the changes in real time.

It is written as follows:
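A sketch of that approach, reusing the tasks described earlier; only the style task is shown, and the baseDir fallback list is one possible choice:

const { src, dest, watch } = require('gulp')
const sass = require('gulp-sass')
const browserSync = require('browser-sync')

const bs = browserSync.create()

// Push the freshly compiled files to the browser at the end of the pipeline
const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
    .pipe(bs.reload({ stream: true }))
}

const serve = () => {
  // Rebuild when source files change
  watch('src/assets/styles/*.scss', style)
  // Static assets are not rebuilt during development; just reload the browsers when they change
  watch(['src/assets/images/**', 'src/assets/fonts/**', 'public/**'], bs.reload)

  bs.init({
    server: {
      // Serve dist first, and fall back to src and public for assets that are not built in development
      baseDir: ['dist', 'src', 'public'],
      routes: { '/node_modules': 'node_modules' }
    }
  })
}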

useref: consolidating the dependencies referenced in HTML

1. Basic use

gulp-useref merges the multiple CSS and JS files referenced by an HTML page, reducing the number of dependency files and therefore the number of browser requests. It uses build comments in the HTML to find the blocks that need to be merged and compressed, then concatenates all the files inside each block. It only concatenates CSS or JS; it does not compress them.

For example: jquery.js, popper.js and bootstrap.js below are merged into a single vendor.js, the combined file is output to assets/scripts/vendor.js, and a single script tag pointing to the bundled file is generated in the HTML.

// 8. useRef
const useRef = () => {
  return src('dist/*.html', { base: 'dist' })
    // searchPath: where to look for the resources wrapped between the build and endbuild comments,
    // i.e. where the files to be merged can be found
    .pipe(plugins.useref({ searchPath: ['dist', '/'] }))
    .pipe(dest('dist'))
}

2. Compress the CSS, JS and HTML files after packaging

1. Basic implementation

  • HTML => gulp-htmlmin
  • JS => gulp-uglify
  • CSS => gulp-clean-css
  • Apply a different stream per file type => gulp-if

const useRef = () => {
  return src('dist/*.html', { base: 'dist' })
    // searchPath: where to look for the resources wrapped between the build and endbuild comments,
    // i.e. where the files to be merged can be found
    .pipe(plugins.useref({ searchPath: ['dist', '/'] }))
    // 1. File compression
    // useref itself cannot compress HTML, CSS or JS, but they need to be compressed before the project goes live.
    // Pipe in transform streams to process the HTML, CSS and JS files.
    // Because the three file types require different processing, use gulp-if to apply a different
    // transform stream per file type; its first argument is a regex matched against the file path.
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    // htmlmin: collapseWhitespace: true collapses whitespace and newlines;
    // minifyCSS and minifyJS compress the CSS and JS embedded in the HTML file
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({ collapseWhitespace: true, minifyCSS: true, minifyJS: true })))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(dest('dist'))
}

2. Problems encountered

After the transformation, the main.css content is empty.

The reason: streamed I/O works as read a chunk => transform => write a chunk, over and over. Since the read and the write both target the same dist directory, they conflict. The fix is to write the intermediate output to a temporary temp directory instead.

Optimize the build process

The compressed files are only needed for production and are output under dist. The compiled sass files and the Babel-transpiled JS files are written to a temp directory instead. For the other tasks that need to be adjusted for temp, see the Gitee repository.
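A sketch of the adjusted tasks under that split; only style and useRef are shown, and script/page are assumed to write to temp in the same way, with deleteDist also removing temp:

// Intermediate output goes to temp; only useRef writes the final, compressed files to dist
const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })
    .pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest('temp'))
}

const useRef = () => {
  return src('temp/*.html', { base: 'temp' })
    .pipe(plugins.useref({ searchPath: ['temp', '/'] }))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({ collapseWhitespace: true, minifyCSS: true, minifyJS: true })))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(dest('dist'))
}

// The production build: clean, compile into temp, then merge and compress into dist;
// images, fonts and public files still go straight to dist
const build = series(deleteDist, parallel(series(compile, useRef), image, font, extra))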

Encapsulating the automated build workflow

1. Why encapsulate an automated build workflow?

Because projects of the same type share roughly the same build setup, encapsulating the automated build workflow is better than copying code when you want to reuse it. For example, if the workflow is used in 10 projects and a gulp plugin update forces a configuration change, you would otherwise have to update the build tasks in all 10 projects. And for me, still at the learning stage, an encapsulated workflow makes writing demos very convenient.

2. How to encapsulate an automated build workflow?

1. Basic implementation idea

  • Abstract the gulp workflow we implemented and package it as an npm module
  • Install that package in the project that needs it, and have the project's gulpfile import the extracted workflow
  • Simply put: workflow = gulp + gulpfile

PS: the screenshot shows a workflow that has not yet been published to npm; yarn link is used for local testing.
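With the workflow linked (or installed), the consuming project's gulpfile shrinks to a single re-export; the package name my-gulp-workflow below is hypothetical:

// gulpfile.js of the consuming project
// Re-export every task from the wrapped workflow package
module.exports = require('my-gulp-workflow')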

2. Concrete implementation

For configuration that was hard-coded, we can have the consuming project pass it in as options. For example, if we do not abstract ./swig-data.js away, the encapsulated workflow cannot find it when used from another project and throws an error; the solution is to pass the data in as configuration. The @babel/preset-env module is another case: when presets are passed as strings, Babel resolves them from the node_modules of the project that is currently running, but the package is installed in the wrapped workflow project, not in the gulp-demo project. If we pass require('@babel/preset-env') instead, resolution happens from the workflow's own node_modules.
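A sketch of both fixes inside the wrapped workflow, assuming it receives a config object from the consuming project (the config name is illustrative):

const page = () => {
  return src('src/*.html', { base: 'src' })
    // The template data now comes from the consuming project's configuration instead of ./swig-data.js
    .pipe(plugins.swig({ data: config.data }))
    .pipe(dest('temp'))
}

const script = () => {
  return src('src/assets/scripts/*.js', { base: 'src' })
    // require() resolves @babel/preset-env from the workflow's own node_modules
    .pipe(plugins.babel({ presets: [require('@babel/preset-env')] }))
    .pipe(dest('temp'))
}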

The paths used by the workflow are abstracted into configuration as well. Every project's paths are different, so the path layout must become part of the configuration, with sensible defaults. Replace the hard-coded paths in all gulp tasks with values from the configuration file; only the style task is shown below.
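A sketch of that path abstraction; the config file name pages.config.js and the default values are illustrative, and only the style task is shown:

const path = require('path')
const cwd = process.cwd()

// Default configuration, overridable by a config file in the consuming project's root
let config = {
  build: {
    src: 'src',
    dist: 'dist',
    temp: 'temp',
    public: 'public',
    paths: {
      styles: 'assets/styles/*.scss'
    }
  }
}

try {
  // Merge in the consuming project's configuration if it provides one
  const loadConfig = require(path.join(cwd, 'pages.config.js'))
  config = Object.assign({}, config, loadConfig)
} catch (e) {}

const style = () => {
  return src(config.build.paths.styles, { base: config.build.src, cwd: config.build.src })
    .pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest(config.build.temp))
}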

3. Wrapping a CLI

  1. Creating a gulpfile.js in every project just to import the encapsulated tasks as the gulp entry file is still troublesome. What can we do?

The gulp command provides command-line arguments:

  • --gulpfile: the path of the gulpfile entry file
  • --cwd: the working directory (where the tasks run)

  2. Passing these parameters every time you run the gulp command is still troublesome. Is there a way to avoid typing them? Yes: the encapsulated build workflow can provide its own command-line interface (CLI) command that fills in these parameters itself, so there is no need to run the gulp command directly; the wrapper fully replaces it.

yarn link: running the yarn link command in the npm package's folder creates a symbolic link in the global folder ({prefix}/lib/node_modules/) pointing to the package folder where you ran the command.

Run it again

In the bin file, invoke gulp-cli. For how to do this, refer to how the gulp package itself is written.

Wrapping the gulp CLI command

#!/usr/bin/env node
// What does the shebang line above choose to run the script with?
// 1. #!/usr/bin/node tells the operating system to call the node interpreter under /usr/bin when the
//    script is executed, i.e. a hard-coded node path.
// 2. #!/usr/bin/env node covers the case where node is not installed at the default /usr/bin path:
//    when the system sees this line it first looks up node in the environment (PATH) settings,
//    then calls the interpreter found there to run the script.

// require.resolve() builds an absolute path from the argument passed in.
// It checks whether the path exists and throws an exception if it does not.

// process.argv holds the command-line arguments as an array.

// 1. Here we need to specify gulp-cli's working directory (--cwd) and the gulpfile path (--gulpfile).

process.argv.push('--cwd')
process.argv.push(process.cwd())
process.argv.push('--gulpfile')
process.argv.push(require.resolve('../lib/index.js'))

require('gulp-cli')()

3. Publish to the NPM repository

One thing to note before publishing: configure the files field in package.json to declare which files should be published; see the configuration below. Then publish with yarn publish.