1. An overview of the big front end
"Big front end" has been a buzzword for the past few years; a front-end developer who stays outside it is considered behind the curve. If your skills stop at basics like ES6 and Vue, you are only at the passing line.
Standing out requires at least one of the other big-front-end skills: Node.js services such as Express.js or Koa.js, 3D data graphics such as three.js, 2D graphics and visualization such as D3, Raphael, or ECharts, or the video side of the industry with HLS and FLV.
If you only know ES6, Vue, React, and Webpack, that is merely "the front end"; the big front end means mastering at least one of the skills above. This article is about live video, one corner of the big front end.
It mainly covers the H5 (browser) side; video and audio capture will be discussed later. First I will walk you through quickly building a live-broadcast system, and then explain the important concepts.
2. Install tools
This section first covers macOS; the Windows installation method follows below. I have uploaded the required tools to GitHub so you can download them yourself (see the Git address).
In the repository, the server directory is the stream-pushing server, and the tools directory contains the tools listed below.
1. Install FFmpeg on macOS
If you configure FFmpeg without yasm, the build fails with the error "yasm/nasm not found or too old. Use --disable-yasm for a crippled build", and such a crippled build has no ffplay in its bin directory. So install yasm first.
Install yasm:
cd /h5live/tools/yasm-1.3.0
./configure
make
sudo make install
Install SDL:
cd /h5live/tools/SDL2-2.0.8
./configure
make -j 16
sudo make install
Install FFmpeg:
# the trimmed ffmpeg source can be found in h5live/tools
cd /h5live/tools/ffmpeg-4.3
./configure --prefix=/usr/local/ffmpeg --enable-debug=3 --enable-shared
make -j 4
sudo make install
Create a soft link for ffmpeg (the equivalent of an environment variable) so that the ffmpeg command can be used from any directory; /usr/local/ffmpeg-4.3/ffmpeg is the installation path.
ln -s /usr/local/ffmpeg-4.3/ffmpeg /usr/local/bin/ffmpeg
2. Windows system
FFmpeg converts between media formats so files can play on different devices. The program is command-line only, so installing it may look like a hassle, but with this guide you should have FFmpeg working in just a few minutes!
Download ffmpeg:
When you visit the download page, you will see a number of different download options. You can download the latest 32-bit or 64-bit static version of the program depending on your operating system.
Installation:
Click the Start menu, then Computer, and open the drive Windows is installed on (usually C:). In the root of the C: drive (alongside the Windows and Program Files folders), right-click, choose New Folder, and name the new folder ffmpeg. Unzip the downloaded FFmpeg package into this folder.
Add the ffmpeg executable directory, C:\ffmpeg\bin, to the PATH, commonly known as configuring the environment variables.
Open a command prompt window and enter ffmpeg -version. If it prints FFmpeg version information, the installation succeeded and you can run ffmpeg from any folder at the prompt.
If you receive an error that libstdc++-6 is missing, you may need to install the Microsoft Visual C++ Redistributable Package, available free from the Microsoft website.
3. Start the server
This is simple: enter the server directory of the h5live repository obtained from GitHub and run the server program.
cd /h5live/server
open server
You can see it serves three protocols: port 1935 for RTMP, 7001 for HTTP-FLV, and 7002 for HLS.
4. Push the stream
Find an MP4 video file (say, 1.mp4) and run the following command from the folder containing 1.mp4.
ffmpeg -re -i 1.mp4 -c copy -f flv rtmp://127.0.0.1:1935/live/movie
Once ffmpeg starts printing encoding statistics in the console, 1.mp4 is being pushed.
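A quick breakdown of the flags used above (all standard FFmpeg options):
# -re      read the input at its native frame rate, simulating a live source
# -i 1.mp4 the input file
# -c copy  copy the existing audio/video streams without re-encoding
# -f flv   remux into the FLV container that RTMP expects
ffmpeg -re -i 1.mp4 -c copy -f flv rtmp://127.0.0.1:1935/live/movie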
5. Video verification
As mentioned above, the Server tool provides video streaming over three protocols: RTMP, HTTP-FLV, and HLS.
An RTMP stream can be verified with the VLC player.
Open File > Open Network and paste in rtmp://127.0.0.1:1935/live/movie.
You can then watch the live stream.
Download the VLC player:
- macOS
- Windows
HLS streams can be viewed directly with Safari: put http://127.0.0.1:7002/live/movie.m3u8 into Safari's address bar to see the result.
At this point the pushing side is done. In real H5 live-broadcast development all of this work belongs on the server; it is demonstrated here so you understand the full live-streaming flow, and so that when the server side is not yet ready you can quickly stand up a push system this way and develop the front end first.
# RTMP (play with the VLC player)
rtmp://127.0.0.1:1935/live/movie
# HTTP-FLV
http://127.0.0.1:7001/live/movie.flv
# HLS (open in the Safari browser)
http://127.0.0.1:7002/live/movie.m3u8
6. The H5 player
This is the part the front end really needs to care about: how to build a live player with JS. The safest approach is to pick from existing best practices so you can meet business needs quickly. I have also posted the code on GitHub.
Video.js is a popular video framework abroad. Its specialty is highly customizable UI, well suited to polished products, and beyond custom UI it offers many plugins: bullet-screen comments (danmaku), keyboard shortcuts, HLS support, and more. It is a fairly complete JS framework that fits both VOD and live streaming; the drawback is its larger file size.
hls.js is a compact framework for the HLS protocol that also handles VOD. The downside is that you have to write your own UI styles. In fact, Video.js supports HLS through a plugin built on hls.js.
flv.js is Bilibili's open-source FLV player and is a great fit for HTTP-FLV live streaming.
As for RTMP, it is rarely used for H5 live playback, so it is not covered here.
7. Developing with Video.js
Find Video.js on GitHub and download the two files below (video.min.js and video-js.min.css) locally.
The Video.js plugin directory lists many plugins where you can find what you need: https://videojs.com/plugins/
videojs-contrib-hls is the plugin that adds HLS live-streaming support. Find it on a CDN and save the JS locally. Usage is very simple: you only need to include the JS plugin.
Import video.min.js, videojs-contrib-hls.js, and video-js.min.css into the page and give the source tag an address with the .m3u8 suffix for HLS. That's it.
Note that the page must be served from a web server, not opened as a local file.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="Width = device - width, initial - scale = 1.0">
<title>Document</title>
<link rel="stylesheet" href="video-js.min.css">
</head>
<body>
<video id=example-video width=600 height=300 class="video-js vjs-default-skin" controls>
<source
src="http://127.0.0.1:7002/live/movie.m3u8"
type="application/x-mpegURL">
</video>
<script src="video.min.js"></script>
<script src="videojs-contrib-hls.js"></script>
<script>
var player = videojs('example-video');
player.play();
</script>
</body>
</html>
As you can see, Video.js takes care of the UI for you; it is a ready-made solution you can drop in quickly.
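If the defaults are not enough, videojs() also accepts an options object. Here is a small sketch; the option names come from the Video.js documentation, and liveui requires a fairly recent Video.js:
var player = videojs('example-video', {
  controls: true,
  autoplay: 'muted', // browsers generally allow only muted autoplay
  liveui: true       // Video.js's improved live-stream UI
});
player.ready(function () {
  this.play();
});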
8. Developing a live page with hls.js
Download the hls.js code from GitHub: https://github.com/video-dev/hls.js.
It is also easy to use: since hls.js provides no UI styles, you only need to include the JS.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="Width = device - width, initial - scale = 1.0">
<title>Document</title>
</head>
<body>
<video id="video" controls width="400" height="300"></video>
<script src="hls.js"></script>
<script>
var video = document.getElementById('video');
var videoSrc = 'http://127.0.0.1:7002/live/movie.m3u8';
if (Hls.isSupported()) {
var hls = new Hls();
hls.loadSource(videoSrc);
hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
video.src = videoSrc;
}
</script>
</body>
</html>
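hls.js also exposes an event API. A common pattern, continuing from the example above and using documented hls.js events, is to start playback once the manifest is parsed and to log fatal errors:
hls.on(Hls.Events.MANIFEST_PARSED, function () {
  video.play(); // may still be blocked unless the video is muted
});
hls.on(Hls.Events.ERROR, function (event, data) {
  if (data.fatal) {
    console.error('fatal hls.js error:', data.type, data.details);
  }
});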
9. HTTP-FLV
flv.js is Bilibili's open-source FLV player, arguably a point of pride for Chinese developers. It is also very simple to use: again, just include flv.js.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="Width = device - width, initial - scale = 1.0">
<title>Document</title>
</head>
<body>
<script src="flv.js"></script>
<video id="videoElement" controls width="400" height="300"></video>
<script>
if (flvjs.isSupported()) {
var videoElement = document.getElementById('videoElement');
var flvPlayer = flvjs.createPlayer({
type: 'flv',
url: 'http://127.0.0.1:7001/live/movie.flv'
});
flvPlayer.attachMediaElement(videoElement);
flvPlayer.load();
flvPlayer.play();
}
</script>
</body>
</html>
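flv.js likewise has an event API and needs explicit tear-down when the user leaves the page. A sketch based on the documented flv.js API, continuing from the example above:
flvPlayer.on(flvjs.Events.ERROR, function (errorType, errorDetail) {
  console.error('flv.js error:', errorType, errorDetail);
});

// tear-down order when leaving the page
function destroyPlayer() {
  flvPlayer.pause();
  flvPlayer.unload();
  flvPlayer.detachMediaElement();
  flvPlayer.destroy();
  flvPlayer = null;
}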
10. WeChat mini-program live broadcast
This section covers mini-program live streaming. I assume you already have mini-program development experience, so creating a mini program, installing the developer tools, debugging, and so on will not be repeated here.
WeChat mini-program live streaming uses the live-player media component, which supports only FLV and RTMP formats; keep this in mind when the front and back ends settle on a technical plan. The component is also open only to certain industry categories. See https://developers.weixin.qq.com/miniprogram/dev/component/live-player.html
To use it, a mini program in one of the qualifying categories must first pass category review, then enable the component's permission under "Development" > "Interface Settings" in the mini-program admin console.
With this enabled, you can use the mini program's live-player component for live development. It is also very simple to use.
<live-player src="https://domain/pull_stream" mode="RTC" autoplay bindstatechange="statechange" binderror="error" style="width: 300px; height: 225px;" />
Page({
statechange(e) {
console.log('live-player code:', e.detail.code)
},
error(e) {
console.error('live-player error:', e.detail.errMsg)
}
})
Note that the developer tools cannot debug live playback; you must use remote debugging and check the result on a real phone. live-player defaults to a width of 300px and a height of 225px; you can override the size with WXSS.
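For example, to override the default size you might give the component a class and size it in WXSS (the class name here is my own choice):
<live-player class="player" src="https://domain/pull_stream" mode="RTC" autoplay />

/* page.wxss */
.player {
  width: 100%;
  height: 422rpx;
}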
That wraps up H5 and mini-program live development; the content above is enough to complete typical live-broadcast development tasks and the live business needs that may come up.
Next I will introduce the live-streaming pipeline and protocols and build a live service on Nginx. The following content is more theoretical and server-side; it has less to do with the front end, so feel free to skip it.
11. Live streaming process
A live broadcast begins with capturing media such as video and audio. There are generally three capture sources: PC, Android, and iOS, all relying on cameras and microphones; serious broadcasters buy professional cameras and mics. The captured data is a stream, i.e. binary data, uploaded to the server over a socket or HTTP.
The first step after capture is encoding, because the raw captured stream cannot be played directly by clients; it must be encoded to some standard. Video generally uses H.264 and audio uses AAC, the two most common encodings in the live-streaming industry.
Encoding may be accompanied by overlaying subtitles; this is optional rather than required. What this really means is some processing of the video, possibly adding watermarks and other secondary treatment.
Audio and video processing is followed by pushing the stream, that is, sending the audio and video to the server, as shown above with FFmpeg pushing 1.mp4.
In production, the server publishes the pushed stream to a CDN. Static and multimedia resources are generally published to a CDN to guarantee user experience and pull speed.
Clients then access the CDN address directly; the player can again be PC, Android, or iOS.
The above is the live-broadcast pipeline in simplified form.
12. Video formats
The most common video format is MP4, which is well supported across Chrome, Firefox, Safari, Internet Explorer, and other browsers. WebM is a streaming video format common on YouTube, but only Chrome and Firefox support it.
Strictly speaking HLS is not a video format but a video protocol; its segments are in TS format, and the whole thing is commonly called HLS for convenience. The format was developed by Apple, so Safari supports it natively; in the demo above, the HLS address pasted into Safari's address bar played directly.
FLV is an early Flash video format that Bilibili adopted early on; even now Bilibili supports both H5 and Flash players.
13. Live streaming protocols
The three most common live-streaming protocols are HLS, RTMP, and HTTP-FLV. HLS carries video as .ts segments, while RTMP and HTTP-FLV both carry video in FLV format.
HLS is the simplest and most commonly used. It is Apple's live-streaming protocol and works in a straightforward way: the H5 client obtains an m3u8 index file through the video tag, and that m3u8 address goes directly into the video element's src.
Since the m3u8 is an index file, it resolves into many .ts segments, each one a short clip of the live stream.
The browser's video element re-requests the m3u8 at intervals to fetch new live clips, which is how near-real-time playback is achieved; the repeated m3u8 requests are the browser's own behavior.
That is the standard shape of the protocol, but an m3u8 does not have to contain TS files; it may instead nest another layer of m3u8 files, i.e. the entries in the first m3u8 are themselves m3u8 files.
If the current m3u8 contains other m3u8 files, it is called a master playlist.
If the current m3u8 file contains TS files, it is called a media playlist.
This is uncommon but does exist and deserves extra attention: if a stream file you receive while practicing live streaming will not play, the player may not support this nesting.
m3u8 playlists come in three kinds: dynamic, static, and full lists. Static lists are essentially unseen in the live-streaming industry; they mostly just exist in the standard.
Dynamic lists are used during live broadcasts, while full lists are mostly for on-demand playback, i.e. recordings. The m3u8 response itself is a plain text file.
The first line marks the file as an m3u8 playlist. The version declaration (#EXT-X-VERSION, default 3) is important because it states which HLS version the player must support; if the version is unsupported, some of the later directives may not parse. After it come the target segment duration, then the media sequence number of the stream blocks, which increases by 1 on each request, and then the TS files themselves; the 9.901 preceding a TS entry is that segment's duration, which determines when the browser's video element refreshes the m3u8.
A static list returns the same file as a dynamic list except that a fifth line adds #EXT-X-PLAYLIST-TYPE:EVENT; there is little other difference.
A full list adds two things over a dynamic list: #EXT-X-PLAYLIST-TYPE with the value VOD, and #EXT-X-ENDLIST at the bottom, which marks the end. Once the browser recognizes this field it stops re-requesting the playlist.
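To make this concrete, here is a sketch of a dynamic (live) playlist with hypothetical segment names; the comments note what the static and full variants add:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:430
#EXTINF:9.901,
movie-430.ts
#EXTINF:9.834,
movie-431.ts
# a static list would add:  #EXT-X-PLAYLIST-TYPE:EVENT
# a full (VOD) list adds:   #EXT-X-PLAYLIST-TYPE:VOD and ends with #EXT-X-ENDLIST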
Inside a TS file, a PAT (Program Association Table) packet tells the demuxer where to find the PMT (Program Map Table), and the PMT tells it which TS packets carry video and which carry audio. The TS packets of one stream are reassembled into PES (Packetized Elementary Stream) packets.
So to parse the video, the browser first needs to recover video frames and audio frames: it finds the PAT, then the PMT, then demultiplexes the TS packets by type and reassembles the video and audio packets into frames.
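If you want to see this structure yourself, ffprobe (installed alongside ffmpeg) can dump it from a downloaded segment; the file name here is hypothetical:
# shows the program table derived from PAT/PMT and the H.264/AAC streams inside
ffprobe -hide_banner -show_programs movie-430.ts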
14. RTMP protocol
RTMP stands for Real-Time Messaging Protocol. It is based on TCP and is actually a protocol family including RTMP and RTMPT/RTMPS/RTMPE. RTMP is a network protocol designed for real-time data communication, mainly used for audio, video, and data exchange between the Flash and AIR platforms and streaming or interactive servers that support RTMP.
Traditional desktop software still interacts with servers over RTMP. Take the capture side mentioned earlier, which is mostly PC-based: if the client captures through installed software, it publishes over RTMP; if the capture end uses H5, the protocol is generally WebRTC. These are two different technical routes. During RTMP transmission the video is also in FLV format, which is worth a moment of caution.
15. HTTP-FLV protocol
RTMP is more complicated to use than HLS because it is TCP-based. HLS is very simple to use but has poor real-time behavior: there is a delay, and the more slices, the greater the delay. HTTP-FLV combines the benefit of HLS, plain HTTP requests, with the low latency of RTMP.
HTTP-FLV and RTMP both use long-lived connections and both transmit FLV; the difference lies in the connection between the CDN and the player: RTMP speaks raw TCP while HTTP-FLV speaks HTTP.
HTTP-FLV has several advantages over RTMP: it avoids firewall interference to some extent, is compatible with HTTP 302 redirects for flexible scheduling, can use HTTPS as an encrypted channel, and has good support on mobile (Android and iOS).
16. The video tag
Here is some supplementary knowledge about the video tag. Most front-end developers have never understood it, its attributes, or its events in any real depth; most stop at knowing it is a tag that can play, pause, adjust volume, and has a src attribute.
That is a risky gap: video is a very powerful multimedia tag introduced with H5, and it is fair to call it the future of media on the web.
Tag attributes:
<video
src="test.map"
width="400"
height="225"
controls
controlslist="nodownload nofullscreen"
poster="Preview"
autoplay
muted
loop
preload
></video>
Controls: show the built-in bottom control bar.
Controlslist: customize the bottom control bar.
Poster: preview image shown before playback.
Autoplay: autoplay. Note that mobile devices do not allow un-muted videos to autoplay; the video must be muted for autoplay to take effect.
Muted: start muted.
Loop: loop playback.
Preload: preloading, whose behavior differs from browser to browser, especially on mobile; test it if you depend on it.
JS control part:
Volume: volume, in the range 0 to 1.
CurrentTime: get or set the current playback time, in seconds. UHD and HD renditions have different addresses, so when switching files you need to re-seek to the saved position.
Src: the video address.
video.volume = 0.5;
video.currentTime = 60;
video.src;
The source tag provides fallback when a video address fails; in that case JS needs video.currentSrc to get the address actually in use.
<video>
<source src="./test.map" type="video/mp4"></source>
<source src="./test2.map" type="video/mp4"></source>
</video>
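A minimal sketch of reading the chosen address:
var video = document.querySelector('video');
video.addEventListener('loadedmetadata', function () {
  // with <source> children, video.src is empty;
  // currentSrc holds the address the browser actually picked
  console.log('playing from:', video.currentSrc);
});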
Video events:
Loadstart: The video starts to load.
Durationchange: indicates that the video duration can be obtained.
Loadedmetadata: A LoadedMetadata event occurs when the specified audio/video metadata has been loaded
Loadeddata: The LoadedData event occurs when the data for the current frame is loaded but there is not enough data to play the next frame of the specified audio/video
Progress: A Progress event occurs when the browser is downloading the specified audio/video
Canplay: The CanPlay event occurs when the browser is able to start playing the specified audio/video
Canplaythrough: The CanplayThrough event occurs when the browser expects to be able to continue playing the specified audio/video without stopping to buffer
Play: The Play event is triggered when the pause state changes to play
Seeking: triggered when the user seeks via the progress bar.
Seeked: triggered once the seek completes and the data for the new position has arrived.
Waiting: triggered during playback, often right after seeking, when there is not enough data to continue.
Playing: triggered when playback actually starts or resumes.
Timeupdate: Indicates the event when the playback time is updated.
Ended: Indicates that the playback ends
Error: indicates an error event.
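A short sketch wiring up a few of these events:
var video = document.querySelector('video');
video.addEventListener('durationchange', function () {
  console.log('duration is now known:', video.duration);
});
video.addEventListener('waiting', function () {
  console.log('buffering: not enough data to continue');
});
video.addEventListener('timeupdate', function () {
  console.log('current time:', video.currentTime);
});
video.addEventListener('ended', function () {
  console.log('playback finished');
});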
17. Manually set up a live broadcast server
Earlier we used a pre-packaged server tool to create the live server; here we build a similar server by hand with Nginx, to walk through its workings once more.
First you need to install the Nginx and FFmpeg tools.
After the tools are installed, configure the nginx.conf file: the RTMP module needs to be set up. At the bottom of the configuration file, open a new rtmp block that listens on port 1935 and sets the chunk size to 4000 (you may choose your own value), then configure a live-streaming application named rtmplive and an HLS live application.
rtmp {
    server {
        # listening port
        listen 1935;
        chunk_size 4000;
        # RTMP live application
        application rtmplive {
            # enable live streaming
            live on;
            # maximum connections
            max_connections 1024;
        }
        # HLS live application
        application hls {
            live on;
            hls on;
            # HLS slice directory
            hls_path /usr/local/var/www/hls;
            # HLS fragment size
            hls_fragment 5s;
        }
    }
}
After that, configure the access location in the http module by adding a location to the server block.
server {
    listen 8080;
    ...
    location /hls {
        # declare the response header types
        types {
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
        }
        # root points at the slice directory configured in the rtmp block
        root /usr/local/var/www;
        add_header Cache-Control no-cache;
    }
}
Now that nginx is configured, you can restart nginx to test.
nginx -s reload
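If the reload complains, validate the configuration syntax first with nginx's built-in check:
nginx -t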
Now take the earlier 1.mp4 and push it with FFmpeg. Here the video is encoded with libx264 and the audio with AAC, muxed as FLV, and pushed to rtmp://localhost:1935/rtmplive/rtmp; 1935 is the RTMP port configured above and rtmplive is the application name.
ffmpeg -re -i 1.mp4 -vcodec libx264 -acodec aac -f flv rtmp://localhost:1935/rtmplive/rtmp
-i: the input file
-vcodec: the video codec
-acodec: the audio codec
At this point the video is being chunked and streamed, a process similar to a camera capturing video in real time.
This is an RTMP live stream; you can play it in VLC by entering rtmp://localhost:1935/rtmplive/rtmp under File > Open Network.
For HLS the ffmpeg command is basically the same; only the path changes. The nginx configuration above defined two applications, rtmplive and hls. To push an HLS stream, use rtmp://localhost:1935/hls/stream; the name "stream" is up to you, but it determines the access address later.
ffmpeg -re -i 1.mp4 -vcodec libx264 -acodec aac -f flv rtmp://localhost:1935/hls/stream
HLS, in turn, is consumed over HTTP. We added the /hls access path to nginx's http block earlier, so the HTTP address below can be opened with Safari, or played with the players we built above.
http://localhost:8080/hls/stream.m3u8
Compiling nginx with HTTP-FLV support is more involved and is not covered here. This concludes the introduction to H5 and mini-program live streaming.