I don’t know exactly when it happened, but the video addresses on major websites are no longer plain mp4 URLs that we can download whenever we want.

Instead, we see an unfamiliar address of the form blob:<origin>/<uuid>, as in:

blob:https://www.bilibili.com/2d2f8de5-0e42-4044-aeb0-6db9ff195550

When we try to open the address in a new tab, all we get is an error page.

URL.createObjectURL()

If you’ve ever needed to preview a local image, you’re no stranger to URL.createObjectURL(). It is used as follows:

const objectURL = URL.createObjectURL(object);

URL.createObjectURL() takes an object argument, which can be a File object, a Blob object, or a MediaSource object. It returns a URL of the form blob:<origin>/<uuid> that points to the object we passed in, and the lifetime of that URL is bound to the document of the window in which it was created.

Image preview function:

<input id="selectInput" type="file" ></input>
<img id="previewImg" />
Copy the code
$("#selectInput").on("change".function(){$("#previewImg").attr("src",URL.createObjectURL($(this) [0].files[0]));
});
Copy the code

Be careful: each time createObjectURL() is called, a new URL is created, and the object it points to is kept in memory. As we mentioned earlier, the lifetime of the URL is bound to the document of the window that created it. If we do nothing, the browser will only release the object when the document is unloaded, which can easily cause memory leaks.

URL.revokeObjectURL(url) removes the reference to the object from the browser’s internal map. Once there are no other references to the object, the browser can free the memory.
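For example, in the image preview above we could release the URL created for the previous selection whenever a new file is chosen (a minimal sketch building on the jQuery example; the lastObjectURL variable is just for illustration):

var lastObjectURL = null;

$("#selectInput").on("change", function () {
  // Release the object URL created for the previous selection before creating a new one
  if (lastObjectURL) {
    URL.revokeObjectURL(lastObjectURL);
  }
  lastObjectURL = URL.createObjectURL(this.files[0]);
  $("#previewImg").attr("src", lastObjectURL);
});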

Looking at this, we can probably guess where these blob URLs come from. Could it be that the page downloads the mp4 file and regenerates it as a Blob? (Just kidding.)

Of course not. When we introduced the argument accepted by URL.createObjectURL(), we said it could be a File object, a Blob object, or a MediaSource object.

The blob URLs of these videos actually point to MediaSource objects.

Before introducing the Media Source Extensions API, let’s take a look at some concepts.

Encoding

As we know, raw media data is very large. To make it easier to transmit and store, we usually encode (compress) it, which is where encoding formats come in. Common video encoding formats are H.264, MPEG-4, MPEG-2, VC-1, and so on; common audio codecs are AAC, MP3, AC-3, and so on.

Media container format

In addition to the compressed, encoded video data, a video file can also contain audio, subtitles, and other data. Packing all of these into one file container is the muxing (packaging) step. Commonly used container formats are MP4, MOV, TS, FLV, MKV, and so on.
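Incidentally, both concepts show up in the MIME type strings that MSE uses later: the part before the semicolon names the container format, and the codecs parameter names the audio/video encodings packed inside it. For example (the exact codec string depends on how the file was encoded):

// Container: MP4; video codec: H.264 baseline (avc1.42E01E); audio codec: AAC (mp4a.40.2)
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';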

How the video player works

The video player will go through the following processes:

  1. Demuxing (decapsulation)

First, we need to separate the compressed audio and video data from the file container. For example, demuxing an FLV file gives us H.264-encoded video data and AAC-encoded audio data.

  2. Decoding

Then we need to decode the compressed video and audio data back into uncompressed data. For example, decoding H.264-encoded video data gives us uncompressed color data such as YUV or RGB, and decoding AAC-encoded audio data gives us uncompressed audio data such as PCM.

  3. Rendering and audio/video synchronization

Finally, the decoded audio and video data are sent to the sound card and the graphics card respectively for playback, and the two also need to be kept in sync.

Media Source Extensions API for streaming media playback

The browser’s built-in <video> tag already knows how to demux and decode: we just give the tag an MP4 or WebM address and it can play the video.

However, the container formats the tag supports are very limited (only MP4 is covered by the W3C standard), and it can only play a whole media file in one go; it cannot split or merge several buffered segments.

This is where MSE (Media Source Extensions) comes in.

MSE allows us to replace the single media file that src points to with a reference to a MediaSource object (a container holding information such as the readiness state of the media to be played) and multiple SourceBuffer objects (representing the different chunks of media that make up the whole stream).

In simple terms, developers can construct a media stream dynamically through a MediaSource object and then feed it to the <video> tag via URL.createObjectURL().

That leaves a lot of room for imagination. For example:

  • We can convert a container format the player does not support into a supported one in real time and feed it to the player (this is how flv.js works; a rough sketch follows after this list).
  • We can also have the server provide media streams at several bitrates and pick the appropriate stream to feed to the player according to the user’s current network conditions, achieving adaptive bitrate (the adaptive principle behind hls.js and dash.js).

And so on and so on.
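As a rough sketch of the first idea, this is roughly what a transmuxing player does; the transmux() function is hypothetical, standing in for what a library like flv.js actually implements when it converts FLV data to fragmented MP4:

var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
  var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  fetch('video.flv') // placeholder URL
    .then(function (response) { return response.arrayBuffer(); })
    .then(function (flvData) {
      // transmux() is a made-up placeholder for FLV -> fragmented MP4 conversion
      sourceBuffer.appendBuffer(transmux(flvData));
    });
});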

MediaSource

interface MediaSource : EventTarget {
    constructor();
    readonly attribute SourceBufferList    sourceBuffers;
    readonly attribute SourceBufferList    activeSourceBuffers;
    readonly attribute ReadyState          readyState;
             attribute unrestricted double duration;
             attribute EventHandler        onsourceopen;
             attribute EventHandler        onsourceended;
             attribute EventHandler        onsourceclose;
    static readonly attribute boolean      canConstructInDedicatedWorker;
    SourceBuffer   addSourceBuffer (DOMString type);
    undefined      removeSourceBuffer (SourceBuffer sourceBuffer);
    undefined      endOfStream (optional EndOfStreamError error);
    undefined      setLiveSeekableRange (double start, double end);
    undefined      clearLiveSeekableRange ();
    static boolean isTypeSupported (DOMString type);
};

MediaSource properties

readyState

Indicates the current state of MediaSource. Possible values are:

  • closed: the MediaSource is not currently attached to a media element.
  • open: it is attached to a media element and ready to receive SourceBuffer objects appended to sourceBuffers.
  • ended: it is still attached to a media element, but endOfStream() has been called to end the current stream.

sourceBuffers

Read-only property that returns the list of SourceBuffer objects contained in the current MediaSource.

activeSourceBuffers

Read-only property that returns a subset of the current MediaSource’s sourceBuffers: the SourceBuffer objects that supply the currently selected video track, the enabled audio tracks, and the shown/hidden text tracks.

duration

Gets and sets the duration of the current stream being pushed.

When read, NaN is returned if readyState is closed.

When set, a TypeError is thrown if the new value is negative or NaN; an InvalidStateError is thrown if readyState is not open; and an InvalidStateError is also thrown if updating is true on any SourceBuffer.
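A minimal sketch of setting duration while respecting the rules above (the 60-second value and the variable names are just placeholders):

function trySetDuration(mediaSource, sourceBuffer, seconds) {
  // Only set duration while the MediaSource is open and no append/remove is in flight
  if (mediaSource.readyState === 'open' && !sourceBuffer.updating) {
    mediaSource.duration = seconds; // e.g. 60
  }
}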

MediaSource methods

addSourceBuffer(mimeCodec)

Creates a new SourceBuffer based on the given mimeCodec and adds it to the MediaSource’s sourceBuffers list.

removeSourceBuffer(sourceBuffer)

Removes the specified SourceBuffer from the sourceBuffers list.

endOfStream(endOfStreamError?)

Signals the end of the stream to the MediaSource. It accepts an optional endOfStreamError parameter indicating the error to report when the end of the stream is reached:

enum EndOfStreamError {
  "network", // Terminates playback and signals a network error.
  "decode"   // Terminates playback and signals a decoding error.
};

MediaSource events

sourceopen

Triggered when readyState changes from "closed" to "open" or from "ended" to "open".

sourceended

Triggered when readyState changes from "open" to "ended".

sourceclose

Triggered when readyState changes from "open" to "closed" or from "ended" to "closed".

MediaSource static methods

isTypeSupported(mimeCodec)

Returns a Boolean indicating whether the current browser is likely to be able to play the given mimeCodec type. true means it is probably supported, but is not a guarantee.
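For example, you could probe a few candidate types before deciding what to feed the player (the codec strings below are just common examples):

[
  'video/mp4; codecs="avc1.42E01E, mp4a.40.2"',
  'video/webm; codecs="vp9, opus"',
  'video/mp2t; codecs="avc1.42E01E"'
].forEach(function (type) {
  console.log(type, MediaSource.isTypeSupported(type));
});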

SourceBuffer

interface SourceBuffer : EventTarget {
                    attribute AppendMode          mode;
    readonly        attribute boolean             updating;
    readonly        attribute TimeRanges          buffered;
                    attribute double              timestampOffset;
    readonly        attribute AudioTrackList      audioTracks;
    readonly        attribute VideoTrackList      videoTracks;
    readonly        attribute TextTrackList       textTracks;
                    attribute double              appendWindowStart;
                    attribute unrestricted double appendWindowEnd;
                    attribute EventHandler        onupdatestart;
                    attribute EventHandler        onupdate;
                    attribute EventHandler        onupdateend;
                    attribute EventHandler        onerror;
                    attribute EventHandler        onabort;
    undefined appendBuffer (BufferSource data);
    undefined abort ();
    undefined changeType (DOMString type);
    undefined remove (double start, unrestricted double end);
};

SourceBuffer properties

mode

mode can be set to "segments" or "sequence". The SourceBuffer processes appended data differently depending on the mode.

  • segments: the timestamps inside the media segments determine playback order. Segments can be appended in any order; the order in which they play depends only on the timestamps.

  • sequence: the order in which segments are appended determines playback order, regardless of the timestamps inside the segments.

When addSourceBuffer() is called, the initial value of this property is set to "segments" if the media segments contain timestamps, and to "sequence" otherwise. It can later be updated via changeType() or by setting this property directly.
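A minimal sketch of the difference: with mode set to "sequence", segments play back in the order they are appended, even if their internal timestamps say otherwise (segmentA and segmentB are hypothetical fMP4 segments):

var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
sourceBuffer.mode = 'sequence';

// Append segmentB first, then segmentA once the first append has finished;
// in 'sequence' mode they play in this append order regardless of their timestamps.
sourceBuffer.appendBuffer(segmentB);
sourceBuffer.addEventListener('updateend', function onEnd() {
  sourceBuffer.removeEventListener('updateend', onEnd);
  sourceBuffer.appendBuffer(segmentA);
});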

updating

Read-only property that indicates whether a previously started appendBuffer() or remove() operation is still being processed.

buffered

Read-only property, used to indicate which TimeRanges are buffered by SourceBuffer.

timestampOffset

Controls the timestamp offset applied to subsequently appended media segments. The default is 0.

audioTracks

Read-only property that returns the AudioTrackList object of the currently contained AudioTrack.

videoTracks

Read-only property that returns the VideoTrackList object of the currently contained VideoTrack.

textTracks

Read-only property that returns the TextTrackList object of the currently contained TextTrack.

appendWindowStart

Sets or gets the start timestamp of the append window.

appendWindowEnd

Sets or gets the end timestamp of the append window.

The append window, delimited by appendWindowStart and appendWindowEnd, represents a time range: coded frames whose timestamps fall inside it are appended to the SourceBuffer, and frames outside it are filtered out.
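A minimal sketch (the 10- and 20-second bounds and the segment variable are placeholders):

// Only keep coded frames whose timestamps fall between 10s and 20s
sourceBuffer.appendWindowStart = 10;
sourceBuffer.appendWindowEnd = 20;
sourceBuffer.appendBuffer(segment); // frames outside the window are filtered out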

SourceBuffer methods

appendBuffer(source)

Adds a media data fragment (ArrayBuffer or ArrayBufferView) to SourceBuffer.

abort()

Aborts operations on the current media fragment data and resets the parser. When called, the updating property is reset to false.

changeType(type)

Changes the MIME type that the SourceBuffer is associated with.

remove(start, end)

Removes media data from a specified range.
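For example, you might periodically trim data that has already been played to keep memory usage down (a sketch; the 30-second margin is arbitrary):

// Remove buffered data from the start up to 30 seconds before the current playback position
function trimBuffer(sourceBuffer, video) {
  if (!sourceBuffer.updating && video.currentTime > 30) {
    sourceBuffer.remove(0, video.currentTime - 30);
  }
}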

SourceBuffer events

updatestart

Triggered when updating changes from false to true.

update

Triggered when appendBuffer or remove has completed successfully and updating changes from true to false.

updateend

Triggered after the update event, when the appendBuffer or remove operation has finished.

error

Triggered when an error occurs during appendBuffer and updating changes from true to false.

abort

Triggered when appendBuffer or remove is interrupted by abort() and updating changes from true to false.
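Because appendBuffer() throws if it is called while updating is still true, a common pattern is to queue segments and append the next one on updateend (a minimal sketch; where the segments come from is up to you):

var queue = [];

function enqueue(sourceBuffer, segment) {
  queue.push(segment);
  flush(sourceBuffer);
}

function flush(sourceBuffer) {
  // Only append when no other append/remove is in progress
  if (!sourceBuffer.updating && queue.length > 0) {
    sourceBuffer.appendBuffer(queue.shift());
  }
}

sourceBuffer.addEventListener('updateend', function () {
  flush(sourceBuffer);
});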

Usage example

var video = document.querySelector('#mse-video');
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

if (window.MediaSource && MediaSource.isTypeSupported(mimeCodec)) {
  // Check whether the current environment supports the MediaSource API and this mimeCodec
  var mediaSource = new MediaSource();
  // Create a blob URL from the mediaSource object and assign it to video.src
  video.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
  console.log('The Media Source Extensions API is not supported.');
}

function sourceOpen(e) {
  // revokeObjectURL actively releases the reference
  URL.revokeObjectURL(video.src);

  var mediaSource = e.target;
  // addSourceBuffer creates a new SourceBuffer for the given mimeCodec and adds it to the MediaSource's sourceBuffers list
  var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
  var videoUrl = 'video.mp4';
  fetch(videoUrl)
    .then(function (response) {
      return response.arrayBuffer();
    })
    .then(function (arrayBuffer) {
      sourceBuffer.addEventListener('updateend', function (e) {
        if (!sourceBuffer.updating && mediaSource.readyState === 'open') {
          // After the data has been appended, call endOfStream to end the current stream
          mediaSource.endOfStream();
        }
      });
      // Add media data to the sourceBuffer
      sourceBuffer.appendBuffer(arrayBuffer);
    });
}

When MediaSource.readyState is ended, calling appendBuffer() or remove() again, or setting mode or timestampOffset, will change readyState back to open and fire the sourceopen event again. If you do not want to react to the sourceopen events caused by these later SourceBuffer operations, you should listen for only the first sourceopen event and then remove the listener.
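The simplest way to do that is to register the listener with the once option (or remove it manually inside the handler):

mediaSource.addEventListener('sourceopen', sourceOpen, { once: true });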

MSE compatibility

Apart from IE and iOS Safari (which does not support MSE, but does support HLS natively), compatibility is good.
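So in practice you usually feature-detect MSE and fall back to something the platform can play natively, e.g. HLS on iOS Safari (a sketch; playlist.m3u8 and video.mp4 are placeholder URLs):

if (window.MediaSource && MediaSource.isTypeSupported(mimeCodec)) {
  // Use MSE as shown above
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Play an HLS playlist natively (Safari)
  video.src = 'playlist.m3u8';
} else {
  // Last resort: a plain progressive MP4
  video.src = 'video.mp4';
}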