Welcome to my GitHub

Github.com/zq2599/blog…

Content: a categorized summary of all my original articles with supporting source code, covering Java, Docker, Kubernetes, DevOps, and more;


Overview of this article

  • How can you stream your MP4 files remotely so that more people can watch them? As shown below:

  • A quick explanation of what the image above shows:
  1. Deploy the open source streaming media server SRS
  2. Develop a Java application named PushMp4, which reads the MP4 file on the local disk frame by frame and pushes each frame to SRS
  3. Everyone who wants to watch the video connects to SRS with streaming playback software (such as VLC) on their own computer and plays the stream pushed up by PushMp4
  • Today we will complete the hands-on exercise shown in the picture above. The whole process breaks down into the following steps:
  1. Environment information
  2. Prepare the MP4 file
  3. Deploy SRS with Docker
  4. Develop and run the Java application
  5. Play with VLC

Environment information

  • For your reference, the environment used in this exercise is as follows:
  1. Operating system: macOS Monterey
  2. JDK: 1.8.0_211
  3. JavaCV: 1.5.6
  4. SRS: 3

Prepare MP4 files

  • Prepare an ordinary MP4 video file. I downloaded mine online from the following address:

www.learningcontainer.com/wp-content/…
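Before feeding the file to the frame grabber later, it is worth checking that the download actually landed where you expect. Here is a minimal sketch of such a check; the class name `Mp4FileCheck` and the helper `sizeOf` are my own, not part of the article's PushMp4 code, and the file name is only a placeholder for wherever you saved the sample:

```java
import java.io.File;

public class Mp4FileCheck {
    // Hypothetical helper, not part of the original PushMp4 class:
    // returns the file size in bytes, or -1 if the path is not a regular file
    static long sizeOf(String path) {
        File f = new File(path);
        return f.isFile() ? f.length() : -1;
    }

    public static void main(String[] args) {
        // Adjust the path to wherever you saved the downloaded sample
        long size = sizeOf("sample-mp4-file.mp4");
        System.out.println(size < 0 ? "File not found, fix the path first" : "File size: " + size + " bytes");
    }
}
```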

Deploy SRS with Docker

  • SRS is a well-known open source media server; any stream pushed to SRS can be played online with a media player. For simplicity, I deploy it in a Docker environment with a single command:
docker run -p 1935:1935 -p 1985:1985 -p 8080:8080 ossrs/srs:3
  • At this point the SRS service is running and ready to receive pushed streams
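If you want to confirm from Java that the container is actually listening before you start pushing, a plain TCP connect against the RTMP port is enough. This is a hypothetical helper of my own (not from the article); 1935 is the RTMP port mapped in the `docker run` command above:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class SrsPortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // 1935 is the RTMP port exposed by the SRS container
        System.out.println("SRS RTMP port reachable: " + isReachable("127.0.0.1", 1935, 2000));
    }
}
```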

Develop the JavaCV application

  • Next comes the most important part: coding. Create a new Maven project named simple-grab-push with the pom.xml shown below. (The parent project javacv-tutorials doesn't really matter here; you can delete the parent node):

      
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>javacv-tutorials</artifactId>
        <groupId>com.bolingcavalry</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.bolingcavalry</groupId>
    <version>1.0-SNAPSHOT</version>
    <artifactId>simple-grab-push</artifactId>
    <packaging>jar</packaging>

    <properties>
        <!-- current javacpp version -->
        <javacpp.version>1.5.6</javacpp.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-to-slf4j</artifactId>
            <version>2.13.3</version>
        </dependency>

        <!-- JavaCV dependency; this one is enough -->
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>javacv-platform</artifactId>
            <version>${javacpp.version}</version>
        </dependency>
    </dependencies>
</project>
  • As you can see from the file above, JavaCV needs only one dependency, javacv-platform, which is quite concise

  • Now start coding. Before writing any code, let's sketch the whole process so the code will be much clearer:

  • As you can see from the figure above, the process is simple, and all the code fits in a single Java class:
package com.bolingcavalry.grabpush;

import lombok.extern.slf4j.Slf4j;
import org.bytedeco.ffmpeg.avcodec.AVCodecParameters;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.avformat.AVStream;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;

/**
 * @author willzhao
 * @version 1.0
 * @description Read the specified MP4 file and push it to the SRS server
 * @date 2021/11/19
 */
@Slf4j
public class PushMp4 {
    /**
     * Full path of the local MP4 file (a video of 2 minutes and 5 seconds)
     */
    private static final String MP4_FILE_PATH = "/Users/zhaoqin/temp/202111/20/sample-mp4-file.mp4";

    /**
     * SRS push address
     */
    private static final String SRS_PUSH_ADDRESS = "rtmp://192.168.50.43:11935/live/livestream";

    /**
     * Read the specified MP4 file and push it to the SRS server
     * @param sourceFilePath absolute path of the video file
     * @param pushAddress push address of the SRS server
     * @throws Exception
     */
    private static void grabAndPush(String sourceFilePath, String pushAddress) throws Exception {
        // ffmpeg log level
        avutil.av_log_set_level(avutil.AV_LOG_ERROR);
        FFmpegLogCallback.set();

        // Instantiate the frame grabber object, passing in the file path
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(sourceFilePath);

        long startTime = System.currentTimeMillis();

        log.info("Start initializing frame grabber");

        // Initialize frame grabbers, such as data structures (timestamps, encoder context, frame objects, etc.),
        // If the input parameter is true, the avformat_find_stream_info method is also called to get the information about the stream, which is put into the AVFormatContext member variable OC
        grabber.start(true);

        log.info("Frame grabber initialization completed in [{}] ms", System.currentTimeMillis()-startTime);

        // In grabber.start, the initialized decoder information is stored in the grabber member variable oc
        AVFormatContext avFormatContext = grabber.getFormatContext();

        // The file contains several media streams (usually video streams + audio streams)
        int streamNum = avFormatContext.nb_streams();

        // No need to continue without media streaming
        if (streamNum<1) {
            log.error("No media stream exists in file");
            return;
        }

        // Get the frame rate of the video
        int frameRate = (int)grabber.getVideoFrameRate();

        log.info("Video frame rate [{}], video duration [{}] seconds, Number of media streams [{}]",
                frameRate,
                avFormatContext.duration()/1000000,
                avFormatContext.nb_streams());

        // Iterate over each stream to check its type
        for (int i=0; i< streamNum; i++) {
            AVStream avStream = avFormatContext.streams(i);
            AVCodecParameters avCodecParameters = avStream.codecpar();
            log.info("Stream index [{}], encoder type [{}], encoder ID[{}]", i, avCodecParameters.codec_type(), avCodecParameters.codec_id());
        }

        // Video width
        int frameWidth = grabber.getImageWidth();
        // Video height
        int frameHeight = grabber.getImageHeight();
        // Number of audio channels
        int audioChannels = grabber.getAudioChannels();

        log.info("Video width [{}], video height [{}], audio channel number [{}]",
                frameWidth,
                frameHeight,
                audioChannels);

        // instantiate FFmpegFrameRecorder to pass in the SRS push address
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(SRS_PUSH_ADDRESS,
                frameWidth,
                frameHeight,
                audioChannels);

        // Set the encoding format
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);

        // Set the encapsulation format
        recorder.setFormat("flv");

        // The number of frames per second
        recorder.setFrameRate(frameRate);

        // The number of frames between two key frames
        recorder.setGopSize(frameRate);

        // Set the number of audio channels equal to the number of channels of the video source
        recorder.setAudioChannels(grabber.getAudioChannels());

        startTime = System.currentTimeMillis();
        log.info("Start initializing frame recorder");

        // Initialize the frame recorder, such as the data structure (audio stream, video stream pointer, encoder),
        // Call the av_guess_format method to determine the encapsulation mode of video output,
        // Memory allocation for media context objects,
        // Set parameters of the encoder
        recorder.start();

        log.info("Frame recording initialization completed in [{}] ms", System.currentTimeMillis()-startTime);

        Frame frame;

        startTime = System.currentTimeMillis();

        log.info("Start pushing");

        long videoTS = 0;

        int videoFrameNum = 0;
        int audioFrameNum = 0;
        int dataFrameNum = 0;

        // Assuming 15 frames per second, the interval between two frames is (1000/15) ms
        int interVal = 1000/frameRate;
        // The sleep time after sending a frame cannot be exactly equal to (1000/frameRate), otherwise it will stall.
        // Make it smaller. Take 1/8 here
        interVal/=8;

        // Keep fetching frames from the video source
        while (null != (frame = grabber.grab())) {
            videoTS = 1000 * (System.currentTimeMillis() - startTime);

            // Timestamp
            recorder.setTimestamp(videoTS);

            // If there is an image, count one more video frame
            if (null != frame.image) { videoFrameNum++; }

            // If there is sound, count one more audio frame
            if (null != frame.samples) { audioFrameNum++; }

            // If there is data, count one more data frame
            if (null != frame.data) { dataFrameNum++; }

            // Push every fetched frame to SRS
            recorder.record(frame);

            // Pause briefly before pushing the next frame
            Thread.sleep(interVal);
        }

        log.info("Push completed, video frame [{}], audio frame [{}], data frame [{}], time [{}] seconds",
                videoFrameNum,
                audioFrameNum,
                dataFrameNum,
                (System.currentTimeMillis()-startTime)/1000);

        // Close the frame recorder
        recorder.close();
        // Close the frame grabber
        grabber.close();
    }

    public static void main(String[] args) throws Exception {
        grabAndPush(MP4_FILE_PATH, SRS_PUSH_ADDRESS);
    }
}
  • Every line of the code above is commented in detail, so I won't go through it again here; just note the following four key points:
  1. MP4_FILE_PATH is where the MP4 file is stored locally; change it to the location of the MP4 file on your own computer
  2. SRS_PUSH_ADDRESS is the push address of the SRS service; change it to your own SRS deployment address
  3. grabber.start(true) initializes the frame grabber and obtains information about the MP4 file
  4. Each call to recorder.record(frame) pushes one frame to the SRS server
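One subtle point worth spelling out is the sleep interval in the push loop: sleeping the full 1000/frameRate milliseconds after each pushed frame would stall the stream, because grabbing and recording take time of their own, so the code sleeps only an eighth of the nominal gap. A minimal sketch of that integer arithmetic (using frame rate 15 and divisor 8, the values in the code above):

```java
public class PacingSketch {
    public static void main(String[] args) {
        int frameRate = 15;                   // frame rate reported by the grabber
        int interVal = 1000 / frameRate;      // nominal gap between frames: 66 ms at 15 fps
        interVal /= 8;                        // sleep much less than the nominal gap to avoid stalling
        System.out.println("sleep per frame: " + interVal + " ms"); // prints "sleep per frame: 8 ms"
    }
}
```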
  • Run the main method of PushMp4 to start pushing the MP4 file; the console output is as follows:
23:21:48.107 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Start initializing frame grabber
23:21:48.267 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Frame grabber initialization completed in [163] ms
23:21:48.277 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Video frame rate [15], video duration [125] seconds, Number of media streams [2]
23:21:48.277 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Stream index [0], encoder type [0], encoder ID[27]
23:21:48.277 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Stream index [1], encoder type [1], encoder ID[86018]
23:21:48.279 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Video width [320], video height [240], audio channel number [6]
23:21:48.294 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Start initializing frame recorder
23:21:48.727 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Frame recording initialization completed in [433] ms
23:21:48.727 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Start pushing
  • Now let's see whether we can pull the stream

Play with VLC

  • Install VLC and open it

  • Click Open Network in the red box below, then enter the push address written in the code earlier (mine is rtmp://192.168.50.43:11935/live/livestream):

  • As the image below shows, playback succeeds and the sound is normal:

Additional knowledge

  • Through the exercise above, we have become familiar with the basic operations of pushing and playing streams, and learned how to obtain common information and set parameters. Beyond the code itself, several hidden knowledge points are also worth noting
  1. avutil.av_log_set_level(avutil.AV_LOG_ERROR) sets the ffmpeg log level. If you change the parameter to avutil.AV_LOG_INFO, you will see much richer logs on the console, showing details of the MP4 file, such as its two media streams (audio and video):

  2. The encoder types of the two AVStreams printed earlier are 0 and 1, and their encoder IDs are 27 and 86018 respectively. What do these four numbers mean?

  3. First, the encoder type: open avutil.class with IDEA's decompiler, as shown below; encoder type 0 means VIDEO, and encoder type 1 means AUDIO:

  4. In avcodec.java, encoder ID 27 corresponds to H264:

  5. The hexadecimal value of encoder ID 86018 is 0x15002; the corresponding encoder is shown in the red box below:
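The decimal-to-hex conversion for these codec IDs is easy to verify yourself; a tiny sketch (the constant names mentioned in the comments are the ones in JavaCV's generated avcodec class, where FFmpeg lists IDs in hexadecimal):

```java
public class CodecIdLookup {
    public static void main(String[] args) {
        // 27 = 0x1b, the H264 video codec ID printed for stream 0
        System.out.println("video codec id 27 = 0x" + Integer.toHexString(27));
        // 86018 = 0x15002, the value to search for in avcodec.java for stream 1
        System.out.println("audio codec id 86018 = 0x" + Integer.toHexString(86018));
    }
}
```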

  • With that, the JavaCV streaming exercise (pushing an MP4 file) is complete. I hope this article helps you become familiar with the common push and pull operations of JavaCV;
