Hello, brave friends, and hello everyone. I am your trash-talking champion Xiao Wu: healthy of body, sound of mind.

I have a head full of hair-loss-inducing techniques that can turn you into a seasoned veteran.

Making things clear at a glance is my main goal, being a lazy foodie is my trademark, I am humble with a streak of stubbornness, and "fools have all the luck" is my greatest comfort.

Welcome to Xiao Wu's essay series: a file upload guide.

Preface

This article is written for newcomers, taking you from single-file upload and gradually expanding to chunked upload and resumable upload. The knowledge points involved are listed one by one, so you can digest them with confidence.

React + Koa2

Code repositories: fe-upload (front end), be-upload (back end)

Basics refresher

This section covers the knowledge points needed in this article. If you want to go deeper, please consult the relevant documentation yourself.

In an HTML form, the only control for uploading a file is <input type="file">. The form must use enctype="multipart/form-data" and method="post".

Since fetch cannot monitor file upload progress, this article uses axios.

FormData

FormData is used to "serialize a form", or to create data in the same format a form would submit: the data is encoded as if the form's enctype were multipart/form-data and it were sent via the form's submit() method.

FormData stores data as key/value pairs, which can be added with append()

const formData = new FormData()
formData.append('f1', chunk1)
formData.append('f1', chunk2)
formData.append('f1', chunk3)

formData.getAll('f1') // [chunk1, chunk2, chunk3]

FileReader

The FileReader object allows Web applications to asynchronously read files stored on the user’s computer, using File or Blob objects to specify files or data to read

  • FileReader.onload: fired when the read operation completes

  • FileReader.readAsArrayBuffer(): reads the contents of a Blob and stores the result as an ArrayBuffer

  • FileReader.abort(): aborts the read operation
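
A minimal usage sketch combining the three members above (the blob variable stands in for any File or Blob already in scope):

const reader = new FileReader()

// onload fires once the read completes; the result here is an ArrayBuffer
reader.onload = e => {
  console.log(e.target.result.byteLength)
}

// Read a File/Blob as an ArrayBuffer
reader.readAsArrayBuffer(blob)

// reader.abort() // Would abort an in-progress read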

Node.js basics

“Path”

  • __dirname: the directory that contains the current file

  • process.cwd(): the directory from which the node command was run

  • path.join(): joins all given path segments into a single path

  • path.resolve(): resolves a sequence of paths into an absolute path (similar to running cd for each segment)
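
A quick sketch of the differences, assuming the file lives at /project/src/util.js and node is started from /project:

const path = require('path')

console.log(__dirname)                 // /project/src   -> folder containing this file
console.log(process.cwd())             // /project       -> folder where `node` was run
console.log(path.join('a', 'b', '..')) // a              -> simply joins and normalizes segments
console.log(path.resolve('src', 'a'))  // /project/src/a -> resolves to an absolute path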

File operations

Rename files: fs.rename(), fs.renameSync()

fs.renameSync(oldPath, newPath)

Read directories: fs.readdir(), fs.readdirSync()

Reads a directory and returns an array of the names of its files and subdirectories

fs.readdirSync(path)

File/directory information: fs.stat(), fs.statSync()

Takes a file or directory path and returns an object containing detailed information about it

  • stats.isDirectory(): returns whether it is a directory

  • stats.isFile(): returns whether it is a file

let stats = fs.statSync(path)
if (stats.isDirectory()) { ... }
if (stats.isFile()) { ... }

Check whether a file/directory exists: fs.exists(), fs.existsSync() (fs.exists() is deprecated; the sync version is used here)

fs.existsSync(folder)

Create folders: fs.mkdir(), fs.mkdirSync()

fs.mkdirSync(folder)

Delete files: fs.unlink(), fs.unlinkSync()

fs.unlinkSync(fname)

Delete directories: fs.rmdir(), fs.rmdirSync()

A directory can only be deleted when it is empty. If it is not empty, you must traverse it and delete its files one by one before deleting the directory itself
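
A minimal sketch of that traversal using the sync APIs above (removeDir is just an illustrative name):

const fs = require('fs')
const path = require('path')

const removeDir = dir => {
  fs.readdirSync(dir).forEach(name => {
    const fullPath = path.join(dir, name)
    if (fs.statSync(fullPath).isDirectory()) {
      removeDir(fullPath)     // Recurse into subdirectories first
    } else {
      fs.unlinkSync(fullPath) // Delete files one by one
    }
  })
  fs.rmdirSync(dir)           // The directory is empty now, so it can be removed
}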

Stream

Instead of reading the whole file into memory and then returning it, a stream reads and returns data piece by piece as it goes.

  • fs.createReadStream(): creates a readable stream for reading data

  • fs.createWriteStream(): creates a writable stream for writing data

  • .pipe(): pipes a readable stream into a writable stream

const fs = require('fs')
const readerStream = fs.createReadStream('input.txt')
const writerStream = fs.createWriteStream('output.txt')
readerStream.pipe(writerStream)

Basic upload

Page structure

<input
  ref={fileRef} // Used to trigger the input's click event: fileRef.current.click() starts the upload
  value={fileValue} // Must be cleared before each upload, otherwise the same file cannot be uploaded twice
  style={{ display: 'none' }} // Hide the native input and style a custom trigger instead
  type="file"
  name="file"
  accept={accept} // Accepted file formats
  onChange={upload} // Upload handler
  multiple={multiple} // Enable multiple file uploads
/>

Upload logic – Web
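
The complete component is in the fe-upload repository linked above; below is only a minimal sketch of the idea (the action URL, the f1 field name and the config object are assumptions consistent with the rest of this article):

import axios from 'axios'

const upload = async e => {
  const file = e.target.files[0]
  if (!file) return

  // Wrap the file in FormData so it is sent as multipart/form-data
  const formData = new FormData()
  formData.append('f1', file)
  formData.append('filename', file.name)

  // action is the upload endpoint; config can carry progress/cancel options (see below)
  const { data } = await axios.post(action, formData, config)
  console.log(data)
}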

Upload logic – Node

If you want to explore the underlying principles, see: [zihanzy.com] Understanding native Node.js file uploads, and [Chen Brew wine] Starting from koa-body source analysis to understand the Node.js file upload process

We use the koa-body library to save files; by default they are stored in a temporary system directory, and the location can be configured

File information is obtained from ctx.request.files.f1, where f1 is the name attribute given to the file input

app.use(koaBody({
  formidable: {
    uploadDir: path.resolve(__dirname, 'public/uploads'), // Directory where uploaded files are stored
    // keepExtensions: Boolean  Keep the file extension
    // maxFileSize: number      Maximum allowed upload size
    // onFileBegin: (name, file) => void  Hook fired before a file is written
  },
  multipart: true, // Enable multipart/form-data (file upload) parsing
  // encoding: String  Request body encoding
}))

koa-static makes the static resource files accessible over HTTP

app.use(koaStatic(path.join(__dirname, 'public'))) // Serve the public directory as static resources

“Router”

“Controller”
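
The actual router and controller live in the be-upload repository; the sketch below only shows the rough shape (the /upload route, the response format, and file.path from koa-body's formidable v1 behaviour are assumptions):

const path = require('path')
const Router = require('koa-router')
const router = new Router()

// Controller: by the time this runs, koa-body has already parsed and saved the file
const uploadFile = async ctx => {
  const file = ctx.request.files.f1 // f1 is the name given to the file input
  ctx.body = {
    code: 0,
    url: `/uploads/${path.basename(file.path)}`, // Served by koa-static from public/uploads
  }
}

// Router: map the upload endpoint to the controller
router.post('/upload', uploadFile)

module.exports = router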

Drag and drop to upload

Get the file information when the drop happens, then call the same upload() method

Tip: when an image is dragged onto the page, the browser's default behavior is to open it in a new window, so prevent the default behavior and stop the event from bubbling

const stopEvent = e => {
  e.preventDefault()
  e.stopPropagation()
}
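
A sketch of the drop handler feeding files into the existing upload logic (handleDrop and the shape passed to upload() are assumptions):

const handleDrop = e => {
  stopEvent(e) // Stop the browser from opening the file in a new window
  const files = e.dataTransfer.files // FileList of the dropped files
  if (files.length) {
    upload({ target: { files } }) // Reuse the same upload() handler as the hidden <input>
  }
}

// dragover must also be suppressed, otherwise drop will not fire:
// <div onDragOver={stopEvent} onDrop={handleDrop}>...</div>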

File Upload Progress

In the axios config, the onUploadProgress callback provides loaded, total, and lengthComputable: loaded is the number of bytes sent so far, total is the total file size, and lengthComputable indicates whether the current progress has a computable length (if not, total is 0)
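
A sketch of wiring this into the axios config (setProgress is just an illustrative callback; action and formData come from the upload logic above):

const config = {
  onUploadProgress: e => {
    // loaded: bytes sent so far; total: total bytes; lengthComputable: whether total is known
    if (e.lengthComputable) {
      setProgress(Math.round((e.loaded / e.total) * 100))
    }
  },
}

// Pass it to the upload request:
// axios.post(action, formData, config)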

Cancel the upload

axios.CancelToken is used to cancel the in-flight Ajax request; after cancelling, a new token is assigned before the same endpoint is requested again.

Combined with chunked upload and breakpoints, this is what powers the pause and resume operations

let source = axios.CancelToken.source()

const upload = async () => {
  let config = {
    cancelToken: source.token,
  }
}

const cancelUpload = () => {
  source.cancel()
  source = axios.CancelToken.source() // Reassign a fresh token so later requests are not already cancelled
}

Echoing the uploaded image

Set the Content-Type header and assign the file read from disk to ctx.body

The Content-Type value can be obtained with the lookup() method of mime-types

const mime = require('mime-types')

let filePath = path.join(__dirname, `public/uploads/${readFileName}`)
let file = null
try {
  file = fs.readFileSync(filePath)
} catch(err) {
  console.log(err)
}

let mimeType = mime.lookup(filePath)
ctx.set('content-type', mimeType)
ctx.body = file

Chunked upload

Slice the file into chunks and upload them one at a time, recording each chunk's order. After all chunks are uploaded, they are merged back into the file in sequence.

Web side

“How to split files”

Slice the file using the Blob.prototype.slice method

const chunkSize = 2 * 1024 * 1024 // Size of each slice
let chunks = [] // Shard array

if (files.size > chunkSize) {
  let start = 0
  let end = 0

  while (true) {
    end += chunkSize
    const blob = files.slice(start, end)
    start += chunkSize

    if (!blob.size) break
    chunks.push(blob)
  }
} else {
  chunks.push(files)
}

“How to convert a file to an ArrayBuffer”

Via FileReader

const fileParse = (files) => {
  return new Promise((resolve, reject) => {
    let fileRead = new FileReader()
    fileRead.readAsArrayBuffer(files)
    fileRead.onload = e => {
      resolve(e.target.result)
    }
  })
}

“How to identify chunks of the same file”

An MD5 hash is generated from the file content; chunks with the same hash belong to the same file

import SparkMD5 from 'spark-md5'
const buffer = await fileParse(files)
const spark = new SparkMD5.ArrayBuffer()
spark.append(buffer)
let hash = spark.end()

formData.append('token', hash)

“How to ensure chunks are merged in order”

Append each chunk's index to the FormData

formData.append('index', index)

“When are chunks merged”

After all chunks are uploaded, the front end sends a request with type=merge; the back end receives it and performs the merge

if (sendChunkCount === chunkCount) { // After all uploads are completed
  const mergeFormData = new FormData()
  mergeFormData.append('type', 'merge')
  mergeFormData.append('token', hash)
  mergeFormData.append('chunkCount', chunkCount)
  mergeFormData.append('filename', files.name)
  const data = await axios.post(action, mergeFormData, config)
}

Progress bar


Overall progress = the sum of bytes uploaded across all chunks / the total number of bytes
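
A sketch of tracking that across chunk requests, where each chunk gets its own onUploadProgress (loadedMap and setProgress are illustrative names; files.size is the total file size):

const loadedMap = {} // Bytes uploaded so far, keyed by chunk index

const progressConfig = index => ({
  onUploadProgress: e => {
    loadedMap[index] = e.loaded
    const uploaded = Object.values(loadedMap).reduce((sum, n) => sum + n, 0)
    setProgress(Math.round((uploaded / files.size) * 100))
  },
})

// axios.post(action, formData, { ...config, ...progressConfig(index) })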

Node side

Create a folder named after the incoming hash and write each chunk into it using an index-hash filename. When the merge request arrives, merge the chunks in that folder in index order

“Router”

“Controller”
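
The real router and controller are in the be-upload repository; the sketch below only illustrates the merge step under the conventions described above (folder named by hash, chunks named index-hash; paths and response shape are assumptions):

const fs = require('fs')
const path = require('path')

const mergeChunks = async ctx => {
  const { token: hash, chunkCount, filename } = ctx.request.body
  const chunkDir = path.join(__dirname, `public/uploads/${hash}`)
  const target = path.join(__dirname, `public/uploads/${filename}`)

  // Append each chunk to the target file in index order, then remove the chunk
  for (let i = 0; i < Number(chunkCount); i++) {
    const chunkPath = path.join(chunkDir, `${i}-${hash}`)
    fs.appendFileSync(target, fs.readFileSync(chunkPath))
    fs.unlinkSync(chunkPath)
  }
  fs.rmdirSync(chunkDir) // The chunk folder is empty now

  ctx.body = { code: 0, url: `/uploads/${filename}` }
}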

Instant upload

If the file already exists on the server, it is not transferred again and its address is returned directly. This is known as instant upload.

The MD5 hash is unique to the file content, so check whether the uploaded file already exists in the uploads folder: if it does, return its address; otherwise, upload the file.
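
A sketch of that check on the Node side, assuming fs and path are required as in the earlier snippets (the query parameters and the hash-based file naming are assumptions consistent with the sections above):

const checkFileExist = async ctx => {
  const { hash, ext } = ctx.request.query // Content hash plus original extension
  const filePath = path.join(__dirname, `public/uploads/${hash}.${ext}`)

  if (fs.existsSync(filePath)) {
    // Instant upload: the file is already on the server, just return its address
    ctx.body = { code: 0, exists: true, url: `/uploads/${hash}.${ext}` }
  } else {
    ctx.body = { code: 0, exists: false }
  }
}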

Resumable upload

If a chunked upload is interrupted before all chunks are transferred, uploading the same file again resumes from the previous progress

The back end returns information about the chunks that have already been uploaded, and the front end uses the indexes to decide which chunks still need to be sent

“Router”

“Controller”
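
A sketch of both halves of that idea (the route response shape and the index-hash chunk naming follow the conventions above; all names are assumptions):

// Node side: return the indexes of the chunks already stored for this hash
const uploadedChunks = async ctx => {
  const chunkDir = path.join(__dirname, `public/uploads/${ctx.request.query.hash}`)
  const indexes = fs.existsSync(chunkDir)
    ? fs.readdirSync(chunkDir).map(name => Number(name.split('-')[0]))
    : []
  ctx.body = { code: 0, indexes }
}

// Web side: only upload the chunks whose index is not in the returned list
const pending = chunks
  .map((chunk, index) => ({ chunk, index }))
  .filter(({ index }) => !indexes.includes(index))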

References

[ikoala] To learn Node.js, you first need to understand streams

[zz_jesse] A file upload guide for beginners: from small images to large-file breakpoint resume

120 lines of code to implement a complete, interactive drag-and-drop upload component