Introduction

First of all, this article is a concept walkthrough and a practice exercise, written in the hope that it helps anyone who wants to learn and build a simple web camera live-streaming feature. Due to limited space and depth of practice, the demo only supports playback on desktop browsers and ingest from common real-time streaming cameras; other, more complex capabilities (such as real-time processing of the video, high concurrency, and network distribution) have not yet been implemented and need further exploration.

Body

The sections below follow the outline of the table of contents, explaining each part in turn.

1. Introduction to live streaming technology

Live streaming technology covers a wide range of areas. Widely known applications today include personal live streaming on video websites, mobile live streaming, and security camera monitoring.

The live streaming process can be understood in terms of a capture end, a streaming media server, and a playback end; you also need to understand what pushing a stream and pulling a stream mean. The key concepts are listed below, followed by a simplified sketch of the pipeline that stands in for the original concept map.

  • Capture end: as the name implies, this is the source of the video. Video is generally captured from a real camera: for example, a mobile device's or PC's camera, or a dedicated camera device;
  • Streaming media server: a very important part of the overall live streaming architecture. It receives the video stream pushed up from the capture end and then delivers that stream to the playback end;
  • Playback end: the various apps and web players that pull the video stream from the streaming media server, decode it, and finally play it;
  • Pushing a stream: the process of encapsulating the data collected at the capture end and transmitting it to the server;
  • Pulling a stream: the process of fetching live content from the server via a specified address.
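
A simplified text sketch of that pipeline:

    Capture end (camera / FFmpeg)
        -- push stream (e.g. RTMP) -->
    Streaming media server (receives, remuxes, distributes)
        -- pull stream (e.g. HTTP-FLV / HLS) -->
    Playback end (browser / app player)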

Since pushing and pulling streams both involve transmitting video, the next topic is the common streaming media transport protocols: RTMP, RTSP, HLS, and HTTP-FLV.

  • RTMP (usable for both pushing and pulling): in RTMP, video must be H.264-encoded and audio must be AAC- or MP3-encoded, and the data is mostly packaged in FLV format. Because the RTMP protocol essentially streams FLV-format data, a Flash player must be used to play it.

  • RTSP (Real-Time Streaming Protocol): RTSP offers very good real-time performance and is well suited to video chat, video surveillance, and similar scenarios.

  • HLS (HTTP Live Streaming): defined by Apple Inc., a real-time streaming protocol based on HTTP. A transfer consists of two parts: an M3U8 description file and TS media files (see the sample manifest after this list). Video in the TS media files must be H.264-encoded and audio must be AAC- or MP3-encoded. Data is transmitted over HTTP. The video.js library currently supports playing this format.

  • HTTP-FLV (used for pulling streams): this protocol is HTTP + FLV; audio and video data are encapsulated in FLV format and then transmitted to the client over HTTP. It makes playing live video streams in the browser much easier. The flv.js library currently supports playing this format.
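
For illustration, a minimal M3U8 description file looks roughly like this (an invented sketch: the segment names and durations are made up, and a live stream would omit #EXT-X-ENDLIST and keep appending new segments):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment0.ts
    #EXTINF:10.0,
    segment1.ts
    #EXT-X-ENDLIST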

With the above basic concepts in mind, we have a rough idea of what is needed to build a page with live streaming functionality. The rest of this article implements each part of that architecture.

2. Technologies used in the front-end build

  • Build streaming media services

    When it comes to the streaming media server, as a front-end developer I was honestly at a loss at first: I had no idea how to implement such a thing or what language to write it in. The natural first step was to search for existing implementations and see whether it could be done with pure front-end technology. For pure JS, Node.js comes to mind first, so searching for “Node.js + video streaming implementation” turned up a fairly reliable result: NodeMediaServer, an open source streaming media server based on Node. Although the latest version has been rewritten in Go, it was historically developed in Node, so I decided to read the documentation and try to set up such a server. For details, see NodeMediaServer’s official website.

    NodeMediaServer supports pushing and pulling streams over the RTMP, RTSP, and HLS protocols, and supports pulling streams over HTTP-FLV and WS-FLV. In other words, the browser can receive an FLV video stream over HTTP or WebSocket and play it.
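
    This article uses the prebuilt binary below, but for reference the historical Node version can also be run via the node-media-server npm package, roughly like this (a minimal sketch following that package’s documented configuration; the ports shown are its defaults):

        const NodeMediaServer = require('node-media-server');

        const config = {
            rtmp: {
                port: 1935,         // RTMP push/pull port
                chunk_size: 60000,
                gop_cache: true,
                ping: 30,
                ping_timeout: 60
            },
            http: {
                port: 8000,         // HTTP-FLV pulling and the admin backend
                allow_origin: '*'
            }
        };

        const nms = new NodeMediaServer(config);
        nms.run();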

    Set up the streaming media server:

    • Download the corresponding installation package (here, for a Linux environment):

      Download:

      wget https://cdn.nodemedia.cn/nms/3.2.12/nms-linux-amd64-20200222.tar.gz

      Extract:

      tar -zxvf nms-linux-amd64-20200222.tar.gz

    Go to the extracted directory; the service can then be started or managed with the following commands:

    • Run ./nms in the console to start the service directly

    • Run sudo ./service.sh install in the program directory to install it as a service that starts automatically

    • Run sudo ./service.sh uninstall in the program directory to stop and uninstall the service

    • After the service starts successfully, you can access the streaming media service's admin backend on port 8000 (the default), which looks like this:

    The dashboard on the home page displays the server's CPU usage and network bandwidth.

  • After the service is started, the next thing to do is push a stream

    How do you push a stream? This involves a very powerful tool: FFmpeg, an open source program that can record and convert digital audio and video and turn it into a stream. It can capture video, package it into a stream, and push it to a streaming media server. For example, after installing it on a Mac, you can use it to drive the camera, package the camera data into a stream, and push it to the streaming media server; this process is exactly what pushing a stream means. FFmpeg can also push local video files to the streaming media server.
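
    For instance, FFmpeg can loop a local video file and push it to the server, roughly like this (a sketch: the file name and destination address are placeholders, and -c copy assumes the file is already H.264/AAC; otherwise re-encode with -vcodec libx264):

        # -re reads at native frame rate; -stream_loop -1 loops the input forever
        ffmpeg -re -stream_loop -1 -i demo.mp4 -c copy -f flv rtmp://ip:1935/live/stream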

    Using FFmpeg on a Mac to push the local camera in real time to NodeMediaServer:

    ffmpeg -f avfoundation -video_size 1280x720 -framerate 30 -i 0:0 -vcodec libx264 -preset veryfast -f flv http://ip:8000/live/stream.flv

    This uses the FFmpeg tool; the parameters above will not all be explained one by one, just the most important ones:

    • -video_size: the resolution of the output video picture
    • -f flv: the output format, followed by the destination address http://ip:8000/live/stream.flv. The stream.flv part of the address can be changed to suit your needs; just keep the .flv suffix for the FLV format.

    Another common scenario is pulling the video stream directly from a camera device. NodeMediaServer also supports this mode: you only need to enter the camera's configuration in the admin backend for it to relay the stream. The configuration includes the camera's IP address, user name, and password. The configuration page looks like this:

    Default configuration:

    If you are using a custom camera that supports RTSP output, you can configure the camera information in the same way and specify an output stream address, so that the browser can play the video directly from that output stream address:
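
    Alternatively, you can relay the camera yourself with FFmpeg, pulling its RTSP stream and re-pushing it to the streaming media server, roughly like this (a sketch: the RTSP URL, credentials, and output address are placeholders to adapt):

        # Pull RTSP over TCP from the camera and re-push it unchanged as FLV
        ffmpeg -rtsp_transport tcp -i rtsp://user:pass@camera-ip:554/stream -c copy -f flv rtmp://ip:1935/live/camera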

  • Making the front-end page play the video stream

    As for the front-end page, the primary goal was to find a player that supports the HTTP-FLV and WS-FLV protocol formats. I first looked at Bilibili's live streaming and noticed that their live page uses the video tag; it turned out they use their own open source flv.js library. This is a player that supports HTTP-FLV and WS-FLV video streams in the browser, exactly what is needed to play the live video stream.

    Now that the video stream is available, you can use flv.js to build the page demo and see the actual effect.

3. The effect in practice

  • First, get the push streams in place:

    The Mac's camera stream is pushed directly with FFmpeg, and the network camera is bound by its address information in the backend; NodeMediaServer handles both the push and pull sides.

  • Then the front-end page plays the video stream; below is the core code of the player:

    live-demo.tsx

    import * as React from 'react';

    import { Button, Input, Row, Col } from 'react-x-component';
    import flv from 'flv.js';

    const { useState, useEffect } = React;

    interface LiveDemoProps {
        defaultUrl?: string;
        onUrlChange?: Function;
    }

    export default function LiveDemo({ defaultUrl = 'http://ip:8000/live/stream.flv', onUrlChange }: LiveDemoProps) {

        let player = null;
        let playerDom = null;

        const [liveUrl, setLiveUrl] = useState(defaultUrl);

        // Create the flv.js player once, after the component mounts
        useEffect(() => {
            if (flv.isSupported()) {
                player = flv.createPlayer({
                    type: 'flv',
                    isLive: true,
                    hasAudio: false,
                    hasVideo: true,
                    url: liveUrl,
                    cors: true
                }, {
                    enableWorker: false,
                    lazyLoadMaxDuration: 3 * 60,
                    seekType: 'range'
                });
                player.attachMediaElement(playerDom);
                player.load();
            } else {
                console.log('Your browser does not support flv.js');
            }
        }, []);

        // Tear down the old player and create a new one for the current URL
        function updatePlayer() {
            if (player) {
                player.unload();
                player.detachMediaElement();
                player.destroy();
                player = null;
            }

            player = flv.createPlayer({
                type: 'flv',
                isLive: true,
                hasAudio: false,
                hasVideo: true,
                url: liveUrl,
                cors: true
            }, {
                enableWorker: false,
                lazyLoadMaxDuration: 3 * 60,
                seekType: 'range'
            });
            player.attachMediaElement(playerDom);
            player.load();
        }

        return (
            <div className='live-demo'>
                <div className="modify-url">
                    <Row>
                        <Col md={6}>
                            <Input
                                value={liveUrl}
                                onChange={(value) => { setLiveUrl(value); }}
                            />
                        </Col>
                        <Col md={6}>
                            <Button
                                type={'primary'}
                                onClick={() => { updatePlayer(); onUrlChange && onUrlChange(liveUrl); }}
                            >
                                Update
                            </Button>
                        </Col>
                    </Row>
                </div>
                <video
                    style={{ width: '100%', height: '100%' }}
                    controls
                    className='video-demo'
                    ref={(e) => playerDom = e}
                ></video>
            </div>
        );
    }
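
    A minimal sketch of mounting this component (assuming a classic pre-React-18 entry point; the root element id is an assumption):

        import * as React from 'react';
        import * as ReactDOM from 'react-dom';
        import LiveDemo from './live-demo';

        // Point the player at the HTTP-FLV address exposed by NodeMediaServer
        ReactDOM.render(
            <LiveDemo defaultUrl='http://ip:8000/live/stream.flv' />,
            document.getElementById('root')
        );
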
  • Playing the camera video streams: on the left is the stream pulled directly from the camera device, and on the right is the real-time camera feed pushed from the Mac:

OK, that is the complete set of front-end technical services needed to build a live streaming web page!

4. Content for follow-up practice and exploration

The example above is admittedly simple, but with the help of third-party technology it sets up a streaming media server, a front-end page that supports playing the video stream, and data capture and pushing through cameras, walking through the entire process and closing the loop. Plenty of content still needs further work:

  • Real-time processing of video information: how to add more information to the stream
  • Handling high-concurrency scenarios: this streaming media server setup is too simple, and a distribution mechanism would certainly be needed
  • Browser playback performance needs to be stress tested

Conclusion

Starting from concepts and an introduction, this article walked through the overall architecture of a typical live video streaming pipeline and, from a front-end perspective, quickly set up a complete live streaming function. Of course there are still many gaps and places that need to go deeper and be explored further; if more in-depth technical experience accumulates later, I will write it up in future articles to share!