preface

A while ago I read an article on large file uploads and realized there are a lot of details worth considering in uploads.

ByteDance interviewer: please implement large file upload with resumable (breakpoint) transfer – Juejin.

And I happen to have been working on file downloads lately.

My writing may be rough, please forgive me; this is my first time writing an article. 😁

I am currently working in Hangzhou; friends in the area are welcome to reach out.

Overall approach

  1. Get the size of the file to download and divide it by the chunk size to determine how many network requests to send
  2. The back end reads the Range field from the request headers and returns the corresponding byte range
  3. The front end merges the downloaded chunks and creates an object URL with URL.createObjectURL to trigger the download
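The chunk math in step 1 can be sketched as a small helper. This is an illustrative function (`makeRanges` is not from the article); it uses the same boundary logic as the article's `downfile`, where the last chunk's `end` is the full file size rather than `size - 1`:

```javascript
// Compute the byte ranges for each chunk of a file.
// Mirrors the article's logic: the last chunk ends at fileSize itself.
const makeRanges = (fileSize, chunkSize) => {
  const chunks = Math.ceil(fileSize / chunkSize);
  return [...new Array(chunks).keys()].map((i) => ({
    start: i * chunkSize,
    end: i + 1 === chunks ? fileSize : (i + 1) * chunkSize - 1,
  }));
};

console.log(makeRanges(250, 100));
// three chunks: 0-99, 100-199, 200-250
```

Each `{ start, end }` pair then becomes one `Range: bytes=start-end` request header.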

The back end: Express

I use two endpoints: one to get the file size, and one to download the file.

```javascript
// Get the size of the file to download
router.get('/size/:name', (req, res) => {
  // Resolve the path of the file to download
  let filePath = path.resolve(__dirname, distPath, req.params.name);
  // console.log(filePath)
  // Get the size of the file
  let size = fs.statSync(filePath).size || null;
  console.log('download file size: ' + size);
  res.send({
    msg: 'ok',
    data: size.toString()
  });
});
```

Download falls into three cases: if there is no Range header, send the whole file directly without chunking; if the start or end position of the requested chunk is invalid, reject the request; otherwise, return the requested chunk to the front end.

```javascript
router.get("/down/:name", (req, res) => {
  let filename = req.params.name;
  let filePath = path.resolve(__dirname, distPath, req.params.name);
  let size = fs.statSync(filePath).size;
  let range = req.headers["range"];
  let file = path.resolve(__dirname, distPath, filename);

  // No Range header: send the whole file directly
  if (!range) {
    // res.set({'Accept-Ranges':'bytes'})
    res.set({
      "Content-Type": "application/octet-stream",
      "Content-Disposition": `attachment; filename=${filename}`,
    });
    fs.createReadStream(file).pipe(res);
    return;
  }

  let bytesV = range.split("=");
  bytesV.shift();
  let [start, end] = bytesV.join('').split("-");
  start = Number(start);
  end = Number(end);

  // Chunk start or end position is invalid: refuse the download
  if (start > size || end > size) {
    res.set({ "Content-Range": `bytes */${size}` });
    res.status(416).send(null);
    return;
  }

  // Start the chunked download
  res.status(206);
  res.set({
    "Accept-Ranges": "bytes",
    "Content-Range": `bytes ${start}-${end ? end : size}/${size}`,
  });
  console.log(start + '---' + end);
  fs.createReadStream(file, { start, end }).pipe(res);
});
```
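The header parsing in the route can be isolated into a small sketch (`parseRange` is an illustrative name, not from the article) that mirrors the same split logic. Note that when the client omits the end position (`bytes=100-`), `Number('')` yields `0`, which is why the route falls back to `size` when building the Content-Range header:

```javascript
// Minimal sketch of parsing a "Range: bytes=start-end" header,
// using the same split steps as the route above.
const parseRange = (range) => {
  let bytesV = range.split("=");      // ['bytes', '0-99']
  bytesV.shift();                     // drop the 'bytes' unit
  let [start, end] = bytesV.join('').split("-");
  return { start: Number(start), end: Number(end) };
};

console.log(parseRange("bytes=0-99"));   // { start: 0, end: 99 }
console.log(parseRange("bytes=100-"));   // { start: 100, end: 0 }
```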

The front end: React

```jsx
// view
<div>
  <Input placeholder="" onChange={(e) => setName(e.target.value)} />
  <Button type="primary" onClick={downfile}>
    download
  </Button>
</div>
```

downfile

```javascript
// SIZE is the chunk size for downloading
const SIZE = 200 * 1024 * 1024;

const downfile = async () => {
  let contentLength = await filesize(name);
  let chunks = Math.ceil(contentLength / SIZE);
  let chunksl = [...new Array(chunks).keys()];

  // Streaming to disk doesn't work here, but if the back end turned the file
  // into zips so that each chunk is a zip, it should be doable this way:
  // download multiple zips and extract the right number of them.
  // If you're interested, look at streamSaver.
  // for (let i of chunksl) {
  //   let start = i * SIZE;
  //   let end = i + 1 === chunks ? contentLength : (i + 1) * SIZE - 1;
  //   let res = await getBinaryContent(start, end, i);
  //   let fileStream = streamSaver.createWriteStream(name, { flags: 'a', start });
  //   // With axios this needs responseType "blob"; a Blob has a stream() method
  //   if (res.data.stream().pipeTo) {
  //     await res.data.stream().pipeTo(fileStream);
  //   }
  // }

  // Download three chunks at a time
  let results = await asyncPool(3, chunksl, (i) => {
    let start = i * SIZE;
    let end = i + 1 === chunks ? contentLength : (i + 1) * SIZE - 1;
    return getBinaryContent(start, end, i);
  });
  results.sort((a, b) => a.index - b.index);
  let arr = results.map((r) => r.data.data);
  const buffers = new Blob(arr);
  saveAs(name, buffers);
};
```
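`asyncPool` is called above but not shown in the article. A minimal sketch of one possible implementation with the same `(poolLimit, items, iteratorFn)` signature, assuming each call to `iteratorFn` returns a promise:

```javascript
// Run iteratorFn over items with at most poolLimit promises in flight.
const asyncPool = async (poolLimit, items, iteratorFn) => {
  const results = [];
  const executing = new Set();
  for (const item of items) {
    const p = Promise.resolve().then(() => iteratorFn(item));
    results.push(p);
    executing.add(p);
    // Remove the promise from the in-flight set once it settles
    const clean = () => executing.delete(p);
    p.then(clean, clean);
    // At the limit: wait for any in-flight promise to settle
    if (executing.size >= poolLimit) {
      await Promise.race(executing);
    }
  }
  // Results come back in input order, matching the index sort above
  return Promise.all(results);
};
```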

filesize

```javascript
// Get the size of the file to download
const filesize = async (name) => {
  let res = await http.get(`/size/${name}`);
  return res.data;
};
```

getBinaryContent

```javascript
// Request one chunk with a Range header
const getBinaryContent = async (start, end, i) => {
  let result = await http.get(`down/${name}`, {
    headers: { Range: `bytes=${start}-${end}` },
    responseType: "blob",
  });
  return { index: i, data: result };
};
```

saveAs

```javascript
// Save the file
const saveAs = (name, buffers, mime = "application/octet-stream") => {
  const blob = new Blob([buffers], { type: mime });
  const blobUrl = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.download = name;
  a.href = blobUrl;
  a.click();
  URL.revokeObjectURL(blobUrl);
};
```
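`saveAs` works because Blob parts concatenate in array order when wrapped in a new Blob, which is why the chunk results are sorted by index before merging. A tiny illustration (runs in browsers and in Node 18+, where `Blob` is a global):

```javascript
// Blob parts (including nested Blobs) concatenate in array order.
const merged = new Blob([new Blob(["hel"]), "lo"]);
console.log(merged.size); // 5
```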

References

How to implement parallel downloading of large files in JavaScript? – Juejin (juejin.cn)

Front-end large file download solution – azureCho, CSDN blog