Since HTML5 introduced the video tag, playing a video on a web page has become trivial: just a video tag with its src attribute set to the video's address. Because src pointed to the real network address of the video, and in the early days websites generally did not set up Referer-based hotlink protection for resource files, once we got hold of the video address we could download or use it at will (every time I went home for the holidays, some relative would ask me to help download things from video websites).
Most cloud storage providers now support Referer-based hotlink protection. The principle: when a resource is requested, the request header carries the address of the page that initiated the request; if that header is missing (indicating the resource address was accessed directly) or is not on the whitelist, the request is treated as hotlinking.
But at some point, when we opened the debug tools and looked at the video src on the major video sites, they had all become like this.
Take a video from Bilibili (station B), for example, and look at the video address in the red box. What is this blob:?
In fact, the Blob URL is not a new technology; it has been in use for a while both at home and abroad, but the articles about it online are not very detailed, so today I'd like to share what I've learned.
Blob and ArrayBuffer
In the early days, databases used the BLOB (Binary Large Object) type to store binary data objects directly, without caring about the format of the stored data. In the Web world, a Blob object represents read-only raw binary data that behaves like a file, and can therefore be manipulated as a file object.
The ArrayBuffer object represents a generic, fixed-length buffer of raw binary data. A contiguous piece of memory can be obtained with new ArrayBuffer(length); it cannot be read or written directly, but can be handed to a TypedArray view or a DataView object to interpret the raw buffer as needed. A view simply gives you a read-write interface for manipulating the data inside the ArrayBuffer. A TypedArray fixes one array type, guaranteeing that all members have the same data type, whereas a DataView lets members be read as different data types.
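To make the relationship concrete, here is a small sketch (variable names are my own): one ArrayBuffer, with a TypedArray view and a DataView reading the same memory.

```javascript
// One 4-byte buffer, interpreted through two different views.
const buffer = new ArrayBuffer(4);

// A TypedArray view: every member has the same type (here, Uint8).
const u8 = new Uint8Array(buffer);
u8[0] = 255;
u8[1] = 1;

// A DataView over the same memory: read mixed types at chosen offsets,
// with explicit byte order.
const dv = new DataView(buffer);
console.log(dv.getUint8(0));         // 255
console.log(dv.getUint16(0, false)); // 65281 (big-endian: 0xFF01)
console.log(dv.getUint16(0, true));  // 511   (little-endian: 0x01FF)
```

Both views write through to the same bytes, which is exactly why a view is needed: the ArrayBuffer itself has no read/write interface.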
TypedArray views come in the following array types:
- Int8Array: 8-bit signed integer, 1 byte per element.
- Uint8Array: 8-bit unsigned integer, 1 byte per element.
- Uint8ClampedArray: 8-bit unsigned integer, 1 byte per element, with different overflow handling.
- Int16Array: 16-bit signed integer, 2 bytes per element.
- Uint16Array: 16-bit unsigned integer, 2 bytes per element.
- Int32Array: 32-bit signed integer, 4 bytes per element.
- Uint32Array: 32-bit unsigned integer, 4 bytes per element.
- Float32Array: 32-bit floating-point number, 4 bytes per element.
- Float64Array: 64-bit floating-point number, 8 bytes per element.
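The "different overflow handling" of Uint8ClampedArray is easy to see in a quick sketch: a plain Uint8Array wraps out-of-range values modulo 256, while the clamped variant pins them to the 0..255 range.

```javascript
const plain = new Uint8Array(1);
const clamped = new Uint8ClampedArray(1);

plain[0] = 300;   // wraps around modulo 256 -> 44
clamped[0] = 300; // clamps into the 0..255 range -> 255
const afterHigh = [plain[0], clamped[0]]; // [44, 255]

plain[0] = -1;    // wraps -> 255
clamped[0] = -1;  // clamps -> 0
const afterLow = [plain[0], clamped[0]]; // [255, 0]

console.log(afterHigh, afterLow);
```

Clamping is what canvas image data uses, which is the main place Uint8ClampedArray shows up.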
Blob differs from ArrayBuffer in that, in addition to the raw bytes, it carries a MIME type as metadata; and a Blob and an ArrayBuffer can be converted into each other.
The File object actually inherits from the Blob object and provides basic metadata such as name, lastModifiedDate, size, type, etc.
Create a Blob object and convert it to ArrayBuffer:
```javascript
// Create a Blob holding the binary data of an HTML fragment
const text = "<div>hello world</div>";
const blob = new Blob([text], { type: "text/html" }); // Blob {size: 22, type: "text/html"}

// Read it as text
const textReader = new FileReader();
textReader.readAsText(blob);
textReader.onload = function () {
  console.log(textReader.result); // <div>hello world</div>
};

// Read it as an ArrayBuffer
const bufReader = new FileReader();
bufReader.readAsArrayBuffer(blob);
bufReader.onload = function () {
  console.log(new Uint8Array(bufReader.result)); // Uint8Array(22) [60, 100, 105, 118, 62, 104, 101, 108, 108, 111, 32, 119, 111, 114, 108, 100, 60, 47, 100, 105, 118, 62]
};
```
Create an ArrayBuffer of the same data and convert it to a Blob:
```javascript
// Create a Uint8Array and populate it with the same data
const u8Buf = new Uint8Array([60, 100, 105, 118, 62, 104, 101, 108, 108, 111, 32, 119, 111, 114, 108, 100, 60, 47, 100, 105, 118, 62]);
const u8Blob = new Blob([u8Buf], { type: "text/html" }); // Blob {size: 22, type: "text/html"}

const textReader = new FileReader();
textReader.readAsText(u8Blob);
textReader.onload = function () {
  console.log(textReader.result); // <div>hello world</div>
};
```
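As an aside, modern browsers (and Node.js 18+, where Blob is available globally) also expose promise-based helpers on Blob itself, so the FileReader boilerplate above can often be skipped. A brief sketch:

```javascript
const html = "<div>hello world</div>";
const blob = new Blob([html], { type: "text/html" });

console.log(blob.size); // 22
console.log(blob.type); // "text/html"

// blob.text() and blob.arrayBuffer() return promises, replacing the
// FileReader callback style shown above.
blob.text().then((text) => console.log(text)); // <div>hello world</div>
blob.arrayBuffer().then((buf) => console.log(new Uint8Array(buf).length)); // 22
```

FileReader still has its place when you need progress events, but for simple conversions the promise API is terser.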
For more information about Blob and ArrayBuffer, see:
- MDN: Blob
- MDN: ArrayBuffer
- Ruan Yifeng's JavaScript Standards Reference Tutorial: binary arrays
URL.createObjectURL
The src attribute of the video, audio, or img tag, whether a relative path, an absolute path, or a network address, always points to a file resource. Now that we know a Blob is binary data that can be treated as a file, can we generate an address that points to a Blob and use it in a tag's src attribute? The answer is yes, via URL.createObjectURL().
```javascript
const objectURL = URL.createObjectURL(object); // blob:http://localhost:1234/abcedfgh-1234-1234-1234-abcdefghijkl
```
The object argument is the File, Blob, or MediaSource object used to create the URL. The resulting link begins with blob:, indicating that it points to binary data.
Here localhost:1234 is the host name and port number of the current page (location.host), and this Blob URL can be accessed directly. Note that every call to URL.createObjectURL returns a different Blob URL, even for the same binary data. The URL lives as long as the page does; once the page is refreshed or unloaded, the Blob URL becomes invalid.
URL.revokeObjectURL(objectURL) releases a URL object previously created by URL.createObjectURL(). When you are done with a URL object, call this method to let the browser know it no longer needs to keep a reference to the data in memory, allowing the platform to garbage-collect it when appropriate.
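Putting createObjectURL and revokeObjectURL together, the typical lifecycle looks like this (a minimal sketch; in a real page the URL would be handed to an img or video tag before being revoked):

```javascript
const blob = new Blob(["hello"], { type: "text/plain" });

// Create a URL that points at the blob's data; in a page it looks like
// blob:http://localhost:1234/<uuid>.
const url = URL.createObjectURL(blob);
console.log(url.startsWith("blob:")); // true

// ...use the URL here (e.g. assign it to img.src or video.src)...

// Once nothing references the URL any more, release it so the browser can
// garbage-collect the underlying data.
URL.revokeObjectURL(url);
```

Forgetting to revoke is a common source of memory leaks in long-lived pages that create many Blob URLs.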
If the HTML file is opened via the file protocol (that is, the URL starts with file://), the http://localhost:1234 part of the address becomes null, and the Blob URL cannot be accessed directly.
Practice 1: Preview an image upload with a Blob URL
Sometimes we want to preview an image before uploading it through an input. With what we have learned above, this is very simple.
```html
<input id="upload" type="file" />
<img id="preview" src="" alt="Preview" />
```

```javascript
const upload = document.querySelector("#upload");
const preview = document.querySelector("#preview");
upload.onchange = function () {
  const file = upload.files[0]; // File object
  const src = URL.createObjectURL(file);
  preview.src = src;
};
```
The same method works for previewing video uploads.
Practice 2: Load web videos with Blob URLs
Now suppose we have a web video address: how do we turn it into a Blob URL? The idea is to obtain a Blob object holding the video's raw data, but unlike the input upload case, where we get a File object directly, all we have is a network address.
We know we can use XHR (jQuery's ajax, or axios) or fetch to request a server address and get the corresponding data back. But what is returned if we request an image or video address with XHR or fetch? That depends on the responseType we set:
```javascript
function ajax(url, cb) {
  const xhr = new XMLHttpRequest();
  xhr.open("get", url);
  // responseType options: "" | "text" - string, "blob" - Blob object,
  // "arraybuffer" - ArrayBuffer object
  xhr.responseType = "blob";
  xhr.onload = function () {
    cb(xhr.response);
  };
  xhr.send();
}
```
Note that XMLHttpRequest and Fetch API requests are subject to cross-origin restrictions, which can be resolved with cross-origin resource sharing (CORS).
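For comparison, the same helper could be written with fetch, where the responseType switch becomes a method call on the Response object (this is my own sketch; the data: URL below just stands in for a real video address):

```javascript
// fetch-based equivalent of the XHR helper: resolve to a Blob.
function fetchAsBlob(url) {
  return fetch(url).then((res) => {
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return res.blob(); // res.arrayBuffer() would give an ArrayBuffer instead
  });
}

// Usage: a data: URL stands in for a real video address here.
fetchAsBlob("data:text/plain,hello").then((blob) => {
  console.log(blob.size); // 5
});
```

Either style ends the same way: a Blob in hand, ready for URL.createObjectURL.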
So, seeing that responseType can be set to blob or arraybuffer, we should have an idea: request a Blob directly, or request an ArrayBuffer and convert it into a Blob; then use createObjectURL to generate an address and assign it to the video's src attribute. Here we request a Blob object directly.
```javascript
ajax("video.mp4", function (res) {
  const src = URL.createObjectURL(res);
  video.src = src;
});
```
Check with the debugging tools: the video tag's src attribute has indeed become a Blob URL, apparently matching the form used by the big video sites. But consider a problem: with this approach, playback cannot start until all the video data has been requested. That is fine for a small video, but for large video resources it obviously will not do for a video website.
The solution to this problem is streaming media. The most intuitive benefit it brings is playing while loading (people of my generation probably first felt the advantage of streaming with those early online players). If you want to use streaming on the web, there are several streaming protocols to choose from.
HLS and MPEG DASH
HTTP Live Streaming (HLS) is an HTTP-based media streaming protocol implemented by Apple. HLS transmits video in the TS format, with M3U8 as the index file (it records the TS file names, durations, and so on; it can be played by a player or opened in an editor such as VS Code). It is supported by most mobile browsers, which means you can point a video tag directly at an M3U8 file to play a video or a live stream; on PC, however, apart from Apple's Safari, you need a library to support it.
For video websites using this scheme, such as Youku, you can inspect the XHR requests in the Network panel while a video is playing: you will find an M3U8 file, plus requests for several TS files at intervals.
But besides HLS there were also Adobe's HDS and Microsoft's MSS; the industry needed something more standard, and so MPEG-DASH was born.
DASH (Dynamic Adaptive Streaming over HTTP), similar to Apple's HLS, is a video streaming technology that transmits at a dynamic bit rate over the Internet. DASH slices the video content into short file segments described by a Media Presentation Description (MPD). Each slice is available at multiple bit rates; the DASH client selects a bit rate to play based on network conditions, switching seamlessly between different bit rates.
YouTube and Bilibili both use this scheme. DASH's index files are typically MPD files (analogous to HLS's M3U8), with Fragmented MP4 (fMP4) as the recommended transmission format, using .m4s or .mp4 as the extension. Accordingly, if you inspect the network requests while a Bilibili video is playing, you will find several m4s file requests every so often.
Both HLS and DASH have libraries and even full-featured players for us to use, but we want to learn a little about the implementation. Setting aside index-file parsing and the actual media transfer addresses, what remains is: how do we join multiple pieces of video data so that the video tag plays them seamlessly?
For those interested, a related Bilibili engineering article is recommended: "Why do we use DASH".
MediaSource
The video tag's src points to one video address; when that video finishes, we could switch src to the next video's address and play it, but this obviously does not meet our requirement of seamless playback. Given the Blob URL we learned about earlier, you might come up with this idea: make a Blob URL point to a video binary, then keep appending and concatenating the binary of the next segment. That way the video content can be updated continuously without interrupting playback. Doesn't that sound a bit like streaming?
To do this we use MediaSource. MediaSource is a container for media data that can be bound to an HTMLMediaElement: URL.createObjectURL creates a Blob URL for the container, which we assign to the video tag's src. During playback we can keep adding data to the container via SourceBuffer's appendBuffer method to update the video's content.
The implementation code is as follows:

```javascript
const video = document.querySelector('video');
// The video is split into an initialization segment init.mp4 plus five
// media segments video1.mp4 ~ video5.mp4.
const assetURL = "http://www.demo.com";
// Video format and codec information, mainly used to check whether the browser
// supports this format; if it does not match the video, an error may be thrown.
const mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource); // Bind video to MediaSource; this generates a Blob URL
  mediaSource.addEventListener('sourceopen', sourceOpen); // Fired when the container opens
} else {
  // The browser does not support this video format
  console.error('Unsupported MIME type or codec: ', mimeCodec);
}

function sourceOpen() {
  const mediaSource = this;
  const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
  let i = 1; // counts loaded segments (the original snippet omitted this initialization)

  function getNextVideo(url) {
    // ajax is the helper implemented above, with responseType set to "arraybuffer"
    ajax(url, function (buf) {
      // Appending the requested data to the container does not affect current playback
      sourceBuffer.appendBuffer(buf);
    });
  }

  // Fired every time an appendBuffer data update finishes
  sourceBuffer.addEventListener("updateend", function () {
    if (i === 1) {
      // Start playing as soon as the initialization segment is loaded
      video.play();
    }
    if (i < 6) {
      // After one segment is loaded, request the next
      getNextVideo(`${assetURL}/video${i}.mp4`);
    }
    if (i === 6) {
      // Close the container after all segments are loaded
      mediaSource.endOfStream();
      URL.revokeObjectURL(video.src); // The Blob URL has served its purpose and can be released if no longer needed
    }
    i++;
  });

  // Load the initialization segment
  getNextVideo(`${assetURL}/init.mp4`);
}
```
This code is adapted from the MDN MediaSource example: the original loads only one video, and I changed it to load several. Many parts of the code could be optimized and simplified.
At this point we have implemented a basic streaming playback function. If you add M3U8 or MPD file parsing and design a UI, you can build a streaming media player.
One last pitfall: many people who run MDN's MediaSource sample code find that the official video works fine, but their own MP4 throws an error. This is because the example requires Fragmented MP4, whose extension is usually .m4s or simply .mp4, but which is a special kind of MP4 file.
Fragmented MP4
The MP4 files we usually use have a nested box structure, and a client must load the file from the beginning before the whole thing can play; playback cannot start from the middle. Fragmented MP4 (fMP4 for short) consists of a series of fragments; if the server supports byte-range requests, these fragments can be requested and played independently by the client, without the whole file having to be loaded.
There are online tools that parse an MP4 file's box structure, which let us tell whether a file is a Fragmented MP4.
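As a rough sketch of what such a check does (not a full parser; the special box sizes 0 and 1 are not handled): an MP4 file is a sequence of boxes, each starting with a 4-byte big-endian size followed by a 4-byte ASCII type, and a Fragmented MP4 contains "moof" (movie fragment) boxes at the top level, while an ordinary MP4 keeps everything under one "moov".

```javascript
// Walk the top-level MP4 box structure and look for a given box type.
function hasTopLevelBox(buf, boxType) {
  const view = new DataView(buf);
  let offset = 0;
  while (offset + 8 <= view.byteLength) {
    const size = view.getUint32(offset); // 4-byte big-endian box size
    const type = String.fromCharCode(    // 4-byte ASCII box type
      view.getUint8(offset + 4),
      view.getUint8(offset + 5),
      view.getUint8(offset + 6),
      view.getUint8(offset + 7)
    );
    if (type === boxType) return true;
    if (size < 8) break; // sizes 0 and 1 have special meanings, skipped in this sketch
    offset += size;
  }
  return false;
}

// A tiny hand-made buffer: an 8-byte "ftyp" box followed by an 8-byte "moof" box.
const bytes = new Uint8Array([
  0, 0, 0, 8, 0x66, 0x74, 0x79, 0x70, // ftyp
  0, 0, 0, 8, 0x6d, 0x6f, 0x6f, 0x66, // moof
]);
console.log(hasTopLevelBox(bytes.buffer, "moof")); // true -> looks fragmented
```

Real files would of course be read from disk or the network, but the box-scanning logic is the same.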
We can use FFmpeg or Bento4's mp4fragment to convert an ordinary MP4 into a Fragmented MP4. Both are command-line tools: download the archive for your system, extract it, and point your environment variables at the bin directory to use the commands.
Bento4's mp4fragment needs no extra parameters:
```shell
mp4fragment video.mp4 video-fragmented.mp4
```
FFmpeg needs a few parameters:
```shell
ffmpeg -i video.mp4 -movflags empty_moov+default_base_moof+frag_keyframe video-fragmented.mp4
```
The conversion succeeds even without default_base_moof, but MediaSource will report an error if this flag is missing.
The video can be split into segments using Bento4's mp4split command:
```shell
mp4split video.mp4 --media-segment video-%llu.mp4 --pattern-parameters N
```
Reference: blog.csdn.net/weixin_4518… (thanks to the original author)
Conclusion
- Blob URL invalidation mechanism: every call to URL.createObjectURL returns a different Blob URL, even for the same binary data; the URL lives as long as the page does and becomes invalid once the page is refreshed or unloaded.
- Blob URLs on their own suit small videos: with this approach, all the video data must be requested before playback can start, which large video resources cannot afford.
- For large videos, the web uses streaming media technology to play while loading, with several streaming protocols to choose from.
- HLS (HTTP Live Streaming) is Apple's HTTP-based media streaming protocol: TS segments indexed by an M3U8 file, supported natively by most mobile browsers and by Safari on PC, elsewhere via a library. On sites that use it, such as Youku, you can see an M3U8 file and periodic TS file requests in the Network panel while a video plays.