Preface

When it comes to playing video on a web page, most front-end developers first think of:

<video width="600" controls>
  <source src="demo.mp4" type="video/mp4">
  <source src="demo.ogg" type="video/ogg">
  <source src="demo.webm" type="video/webm">
  Your browser does not support the video tag.
</video>

Indeed, a simple video tag makes it easy to play a video.

However, when the video file is very large, the video tag alone does not perform well:

  1. Poor playback experience (stalling is especially noticeable when the video first loads)
  2. Wasted bandwidth. If a user watches only the first few seconds of a video, tens of megabytes of data may already have been downloaded, wasting both the user's traffic and the server's expensive bandwidth

Ideally, we want the experience to look like this:

  1. Download as you play (progressive download), without downloading the whole video at once (streaming media)
  2. Seamless switching between video bit rates (DASH)
  3. Hide the real video address to prevent hotlinking and downloading (Object URL)

For these requirements, an ordinary video tag is no longer sufficient.

The 206 status code

<video width="600" controls>
  <source src="demo.mp4" type="video/mp4">
</video>

When we play demo.mp4, the browser already does some optimization: instead of waiting for the whole file to download, it requests part of the data first.

We add a Range header to the request:

Range: bytes=3145728-4194303

This asks for bytes 3145728 through 4194303 of the file.

The backend responds with these headers:

Content-Length: 1048576
Content-Range: bytes 3145728-4194303/25641810

Content-Range indicates that bytes 3145728 through 4194303 of the file are returned, and that the total file size is 25641810 bytes.

Content-Length indicates that the response body is 1048576 bytes (4194303 - 3145728 + 1). A successful partial response like this comes back with status code 206 Partial Content.

This status code is the basis for resumable downloads and for the segmented video downloads later in this article.
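
As a quick illustration (my own sketch, not from the original article), this is how such a partial request might be issued from the browser; the file name and byte range are just placeholders:

// A minimal sketch of a Range request; 'demo.mp4' and the byte offsets are placeholders
async function fetchRange(url, start, end) {
  const res = await fetch(url, {
    headers: { Range: `bytes=${start}-${end}` },
  });

  console.log(res.status);                       // 206 if the server supports Range requests
  console.log(res.headers.get('Content-Range')); // e.g. bytes 3145728-4194303/25641810

  // The body contains only the requested slice of the file
  return res.arrayBuffer();
}

fetchRange('demo.mp4', 3145728, 4194303)
  .then((buf) => console.log(buf.byteLength));   // 1048576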

Object URL

Let's take a look at how the major video websites play video.

Bilibili:

Tencent Video:

IQIYI:

As you can see, the video elements on the sites above all point to addresses that begin with blob:, for example blob:https://www.bilibili.com/0159a831-92c9-43d1-8979-fe42b40b0735. These addresses have several features:

  1. A fixed format: blob: + the current site's origin + a string of characters
  2. They cannot be opened directly from the browser address bar
  3. Even for the same video, a different address is generated every time the page is opened

These addresses are generated by URL.createObjectURL:

const obj = {name: 'deepred'};
const blob = new Blob([JSON.stringify(obj)], {type : 'application/json'});
const objectURL = URL.createObjectURL(blob);

console.log(objectURL); // blob:https://anata.me/06624c66-be01-4ec5-a351-84d716eca7c0

URL.createObjectURL takes a File, Blob, or MediaSource object as its parameter. The returned object URL is a reference to that object.
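
A related detail, not covered in the original: every object URL keeps a reference to its object alive until the document is unloaded, so once it is no longer needed it can be released with URL.revokeObjectURL. A minimal sketch:

const blob = new Blob(['hello'], { type: 'text/plain' });
const objectURL = URL.createObjectURL(blob);

// ...use objectURL as an img src, video src, download link, etc...

// Release the reference so the underlying object can be garbage collected
URL.revokeObjectURL(objectURL);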

Blob

A Blob is a file-like object made up of immutable raw data. It can be read as text or binary data, or converted to a ReadableStream for further processing.

The File objects we commonly use inherit from Blob and extend its capabilities:

<input id="upload" type="file" />
const upload = document.querySelector("#upload");
const file = upload.files[0];

file instanceof File; // true
file instanceof Blob; // true
File.prototype instanceof Blob; // true

We can also create a Blob object ourselves:

const obj = {hello: 'world'};
const blob = new Blob([JSON.stringify(obj, null, 2)], {type: 'application/json'});

blob.size; // property
blob.text().then(res => console.log(res)); // method
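
Blob also has a slice method, which fits the segmentation theme of this article; a small sketch (my own addition) of cutting a chunk out of a larger Blob:

const data = new Blob([new ArrayBuffer(10 * 1024 * 1024)]); // a 10 MB blob

// slice(start, end) returns a new Blob covering that byte range
const firstChunk = data.slice(0, 1024 * 1024); // the first 1 MB

firstChunk.size; // 1048576
firstChunk.arrayBuffer().then(buf => console.log(buf.byteLength)); // read it as binary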

Object URL application

<input id="upload" type="file" />
<img id="preview" alt="Preview" />
const upload = document.getElementById('upload');
const preview = document.getElementById("preview");

upload.addEventListener('change', () => {
  const file = upload.files[0];
  const src = URL.createObjectURL(file);
  preview.src = src;
});

The object URL returned by createObjectURL can be loaded directly by the img element, giving us a local preview of the selected file.

Similarly, if we load an object URL into a video element, can we play the video?

index.html

<video controls width="800"></video>

demo.js

function fetchVideo(url) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.responseType = 'blob'; // Ask for the response as a Blob
    xhr.onload = function () {
      resolve(xhr.response);
    };
    xhr.onerror = function () {
      reject(xhr);
    };
    xhr.send();
  });
}

async function init() {
  const res = await fetchVideo('./demo.mp4');
  const url = URL.createObjectURL(res);
  document.querySelector('video').src = url;
}

init();

The file directories are as follows:

├── demo.mp4
├── index.html
└── demo.js

Start a simple static server with http-server:

npm i http-server -g

http-server -p 4444 -c-1

Visit http://127.0.0.1:4444/ and the video element plays the video normally. However, we requested the entire video via Ajax before playing it, so compared with pointing the video tag directly at the original file, there is no real advantage.

Media Source Extensions

Combining this with the 206 status code introduced above: can we request part of the video via Ajax, buffer it into the video tag first, and keep downloading further segments before playback reaches the end, achieving segmented playback?

The answer is yes, but the raw segment data cannot be fed to the video element directly; it has to go through the MediaSource API.

Only Fragmented MP4 (FMP4) files can be loaded via MediaSource, so we need a transcoding tool to convert an ordinary MP4 into FMP4. For a simple demonstration, instead of transcoding on the fly, we convert the whole MP4 to FMP4 ahead of time with the MP4Box tool:

# Split into one segment every 4s
mp4box -dash 4000 demo.mp4

Running this command generates a demo_dashinit.mp4 video file and a demo_dash.mpd configuration file. demo_dashinit.mp4 is the transcoded file that MediaSource can load.

The file directories are as follows:

├── demo.mp4
├── demo_dashinit.mp4
├── demo_dash.mpd
├── index.html
└── demo.js

index.html

<video width="600" controls></video>

demo.js

class Demo {
  constructor() {
    this.video = document.querySelector('video');
    this.baseUrl = '/demo_dashinit.mp4';
    this.mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

    this.mediaSource = null;
    this.sourceBuffer = null;

    this.init();
  }

  init = () => {
    if ('MediaSource' in window && MediaSource.isTypeSupported(this.mimeCodec)) {
      const mediaSource = new MediaSource();
      this.video.src = URL.createObjectURL(mediaSource); // Returns an object URL
      this.mediaSource = mediaSource;
      mediaSource.addEventListener('sourceopen', this.sourceOpen); // Listen for the sourceopen event
    } else {
      console.error('MediaSource not supported');
    }
  }

  sourceOpen = async () => {
    const sourceBuffer = this.mediaSource.addSourceBuffer(this.mimeCodec); // Returns a SourceBuffer
    this.sourceBuffer = sourceBuffer;
    const start = 0;
    const end = 1024 * 1024 * 5 - 1; // Load 5 MB at the start. If your video is large, 5 MB may not be enough to start playback; adjust as needed
    const range = `${start}-${end}`;
    const initData = await this.fetchVideo(range);
    this.sourceBuffer.appendBuffer(initData);

    this.sourceBuffer.addEventListener('updateend', this.updateFunct, false);
  }

  updateFunct = () => {

  }

  fetchVideo = (range) => {
    const url = this.baseUrl;
    return new Promise((resolve, reject) => {
      const xhr = new XMLHttpRequest();
      xhr.open('GET', url);
      xhr.setRequestHeader('Range', 'bytes=' + range); // Add the Range header
      xhr.responseType = 'arraybuffer';

      xhr.onload = function (e) {
        if (xhr.status >= 200 && xhr.status < 300) {
          return resolve(xhr.response);
        }
        return reject(xhr);
      };

      xhr.onerror = function () {
        reject(xhr);
      };
      xhr.send();
    });
  }
}

const demo = new Demo();

Implementation principle:

  1. Pull the data with the Range request header
  2. Feed the data to the sourceBuffer; MediaSource decodes it
  3. Play it through the video element

This time we requested only the first 5 MB of the video, and as you can see, the video plays for a few seconds before the picture freezes.

All we need to do is monitor the video's playback position and download the next 5 MB of data when the buffer runs low:

const isTimeEnough = () => {
  // Is the currently buffered data enough to keep playing?
  for (let i = 0; i < this.video.buffered.length; i++) {
    const bufferend = this.video.buffered.end(i);
    if (this.video.currentTime < bufferend && bufferend - this.video.currentTime >= 3) {
      // Keep at least 3s of buffer ahead of the playhead
      return true;
    }
  }
  return false;
}
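
One possible way to wire this up (my own sketch, not the article's final code; it assumes isTimeEnough, loadedBytes, and the methods below all live on the Demo class shown earlier) is to track how many bytes have already been requested and, on every timeupdate, append the next 5 MB whenever the buffer is about to run dry:

  // Call once after the first appendBuffer; assumes this.loadedBytes was set to 1024 * 1024 * 5
  bindEvents = () => {
    this.video.addEventListener('timeupdate', this.checkBuffer);
  }

  checkBuffer = async () => {
    // Skip if there is still enough buffered data, or a previous append is still in progress
    if (this.isTimeEnough() || this.sourceBuffer.updating) return;

    const start = this.loadedBytes;
    const end = start + 1024 * 1024 * 5 - 1; // request the next 5 MB
    const data = await this.fetchVideo(`${start}-${end}`);

    this.loadedBytes = end + 1;
    this.sourceBuffer.appendBuffer(data); // MediaSource decodes it and playback can continue
  }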

Of course, we still have many problems to consider, such as:

  1. How to update the Range request scope for each new segment
  2. How to ensure that the first request fetches enough data for the video to start playing
  3. Compatibility issues
  4. More details…

See the complete code for the segmented download process

Streaming media protocols

Video services generally fall into two categories:

  1. Video on demand (VOD)
  2. Live streaming

Different services choose different streaming media protocols. Mainstream protocols include RTMP, HTTP-FLV, HLS, DASH, WebRTC, and so on. See "Understanding of streaming media protocols" for details.

Our earlier example was a VOD service using the DASH protocol. Remember the demo_dash.mpd file generated by MP4Box? The Media Presentation Description (MPD) file stores all kinds of information about the FMP4 file: video size, resolution, per-segment bit rate, and so on.

Bilibili uses the DASH protocol for its VOD service.

The HLS protocol's m3u8 index file plays a similar role to the DASH MPD description file:

Protocol    Index file    Transport format
DASH        mpd           m4s
HLS         m3u8          ts

Open source libraries

We hand-wrote the loading process with the native MediaSource API, but there are mature open source libraries available, such as http-streaming, hls.js, and flv.js. Combined with decoding and transcoding libraries such as mp4box.js and ffmpeg.js, it is also quite convenient to transcode files in real time on the browser side.
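
As an illustration (my own sketch, not from the original) of how little glue code such a library needs, this is a typical hls.js setup; the stream URL is a placeholder:

import Hls from 'hls.js';

const video = document.querySelector('video');
const src = 'https://example.com/playlist.m3u8'; // placeholder HLS stream

if (Hls.isSupported()) {
  // hls.js downloads the m3u8 index and ts segments and feeds them to MediaSource internally
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari plays HLS natively
  video.src = src;
}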

Conclusion

This article briefly introduced the principle of progressive video playback implemented with Media Source Extensions, touching on basic knowledge of VOD and live streaming. Audio and video technology covers a lot of ground, and given the limits of my own knowledge, this article can only serve as an initial introduction.

References

  • Why is the video link address of video website blob?
  • Start with 3 unnecessary requests for an activity video on Tmall
  • Why do we use DASH
  • Build streaming players using MediaSource
  • The whole Web video thing
  • Building a simple MPEG-DASH streaming player
  • Summary of front-end live video technology and application of Video.js in H5 page
  • Understanding of streaming media protocols
  • Let HTML5 video support the playback of piecewise progressive downloads