This article was first published on the WeChat public account "Interesting things in the world". Please credit the source when reprinting, or you will be held responsible for copyright infringement. WeChat account: A1018998632, QQ group: 859640274
- 1. Writing a Douyin App from scratch — the start
- 4. Writing a Douyin App from scratch — logging, event tracking, and a preliminary back-end architecture
- 5. Writing a Douyin App from scratch — App architecture update and network layer customization
- 6. Writing a Douyin App from scratch — starting with audio and video
- 8. Writing a Douyin App from scratch — building a cross-platform video editing SDK project
Project address
Long time no see. I have been working so much overtime recently that this second audio and video article is a week late; please forgive me. The article takes about 20 minutes to read.
This article is divided into the following chapters, which can be read on demand:
- 1. Working with the FFmpeg source — compiling, modifying, and referencing the FFmpeg source code in CLion
- 2. Using the FFmpeg API — FFmpeg data structures and official demo analysis
- 3. A minimalist video player — writing a minimalist Android video player based on FFmpeg
I. Working with the FFmpeg source
Notes before starting:
- 1. Some knowledge of Git is required; see the Git documentation (a Chinese version is available).
- 2. My FFmpeg: my fork of the FFmpeg project. Its source already compiles successfully, and the build shell script sits in the repository root.
- 3. ffmpeg-learning: the sample code for this article.
- 4. The code blocks below are delimited by markers of the form "----- code block X, from Jianshu/Juejin: whensunset -----"; the marker text is not part of the code.
- 5. The address of the FFmpeg project to clone.
- 6. The following steps are based on macOS; they should also run smoothly on Linux. I have no time to sort out Windows (you are on your own there).
- 7. Pre-install some software: CLion (search for it), make (brew on macOS, apt on Linux).
1. Start
Given a project, there are generally two ways to use it: compile it and use the packaged artifacts, or reference its code and integrate it into your own project. In this chapter we talk about how to digest the FFmpeg source: writing our own code into the FFmpeg project and then compiling the result into an Android project. The sample project is ffmpeg-learning; I strongly recommend reading the article alongside the project code.
- 1. First, fork the official FFmpeg project into our own GitHub account for future modification.
- 2. Clone your FFmpeg fork onto your computer.
- 3. My modifications and builds are based on FFmpeg 3.3.8 (this version is easier to compile), so we create a new branch local_build_base_on_3.3.8 and run git reset --hard 18c9d5d3e80dc0b47e0a260b51f5230bdd499e8b to move it to the commit tagged n3.3.8.
- 4. Now we can start compiling the code. There are plenty of compilation write-ups on the web, so I will keep this brief:
- 1. Replace lines 3305-3308 of project/configure with the code in code block 1 (this renames the built shared libraries from the libavcodec.so.57 pattern to libavcodec-57.so, since the Android linker cannot load libraries whose version suffix comes after .so).
- 2. Save the code in code block 2 as project/build_android.sh and run ./build_android.sh.
----- code block 1, from Jianshu/Juejin: whensunset -----
# SLIBNAME_WITH_MAJOR='$(SLIBNAME).$(LIBMAJOR)'
# LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
# SLIB_INSTALL_NAME='$(SLIBNAME_WITH_VERSION)'
# SLIB_INSTALL_LINKS='$(SLIBNAME_WITH_MAJOR) $(SLIBNAME)'
SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
SLIB_INSTALL_LINKS='$(SLIBNAME)'
----- code block 2, from Jianshu/Juejin: whensunset -----
#!/bin/bash
#Switch to the FFmpeg directory
cd /Users/whensunset/AndroidStudioProjects/KSVideoProject/ffmpeg
#The PATH of the NDK can be set based on the installation location
export NDK=/Users/whensunset/AndroidStudioProjects/KSVideoProject/android-ndk-r14b
export SYSROOT=$NDK/platforms/android-16/arch-arm/
export TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/darwin-x86_64
export CPU=arm
#Configure the compiled artifact placement path
export PREFIX=$(pwd)/android/$CPU
export ADDI_CFLAGS="-marm"
#Define a function that calls the configure script to build FFmpeg with the parameters passed in. Run ./configure --help to learn what the parameters mean
function build_one
{
./configure \
--prefix=$PREFIX \
--target-os=linux \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--arch=arm \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
--cc=$TOOLCHAIN/bin/arm-linux-androideabi-gcc \
--nm=$TOOLCHAIN/bin/arm-linux-androideabi-nm \
--enable-shared \
--enable-runtime-cpudetect \
--enable-gpl \
--enable-small \
--enable-cross-compile \
--disable-debug \
--disable-static \
--disable-doc \
--disable-asm \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--enable-postproc \
--enable-avdevice \
--disable-symver \
--disable-stripping \
$ADDITIONAL_CONFIGURE_FLAG
sed -i '' 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h
sed -i '' 's/HAVE_LRINTF 0/HAVE_LRINTF 1/g' config.h
sed -i '' 's/HAVE_ROUND 0/HAVE_ROUND 1/g' config.h
sed -i '' 's/HAVE_ROUNDF 0/HAVE_ROUNDF 1/g' config.h
sed -i '' 's/HAVE_TRUNC 0/HAVE_TRUNC 1/g' config.h
sed -i '' 's/HAVE_TRUNCF 0/HAVE_TRUNCF 1/g' config.h
sed -i '' 's/HAVE_CBRT 0/HAVE_CBRT 1/g' config.h
sed -i '' 's/HAVE_RINT 0/HAVE_RINT 1/g' config.h
make clean
make -j8
make install
}
#Run the FFmpeg compilation method you created earlier
build_one
- 5. Barring accidents, we will now see two folders under project/android/arm: include and lib.
- 1. include: for readers who know C/C++, these are the C/C++ header (interface) files, comparable to Java interfaces; they expose the internal APIs to the outside world.
- 2. lib: the .so files that can be used on Android.
- 3. We can call the APIs exposed by the .so files based on the function declarations provided in the include headers, as sketched right after this list.
- 6. That is the whole of our FFmpeg compilation process.
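To make that concrete, here is a minimal sketch (mine, not from the original article) of calling one of the exposed FFmpeg functions from C++ on Android. The include path and log tag are assumptions; how the headers and .so files are wired into your build depends on your project.
----- sketch: calling the compiled FFmpeg through include and lib -----
extern "C" {
#include "libavcodec/avcodec.h"   // declaration comes from project/android/arm/include
}
#include <android/log.h>

// avcodec_version() is implemented inside libavcodec-57.so under project/android/arm/lib;
// the header only declares it. The returned value packs major/minor/micro version numbers.
void print_ffmpeg_version() {
    unsigned v = avcodec_version();
    __android_log_print(ANDROID_LOG_INFO, "FFMPEG", "libavcodec %u.%u.%u",
                        v >> 16, (v >> 8) & 0xff, v & 0xff);
}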
2. Modify FFmpeg source code
In this section we talk about how to modify the FFmpeg source and then have it compiled and packaged automatically by our Android project.
Editing FFmpeg in CLion
- 1. First of all, we already have the FFmpeg source from the previous section. Now open CLion, click "Import Project from Sources", select the project folder, and import the source with CLion's default settings.
- 2. At this point we will see that CLion has automatically generated a CMakeLists.txt file, which imports all the compilable files in the source tree.
- 3. To keep the Git project clean, add some file filters to .gitignore, as in code block 3.
----- code block 3, from Jianshu/Juejin: whensunset -----
*.ptx
*.ptx.c
/config.asm
/config.h
/.idea
/cmake-build-debug
/android
*.log
- 4. After the import completes, you will notice that many files are marked red and some included header files cannot be found. This is normal: we have a dedicated script to compile the code and use CLion only as an editor, so the red marks do not affect anything we do next. If they really bother you, try CLion's auto-import shortcut to correct each one as you see it.
- 5. Now we can happily edit the FFmpeg source. In project/libavcodec/allcodecs.c, we add the beginner's classic to the avcodec_register_all function: av_log(NULL, AV_LOG_DEBUG, "hello world"); (a sketch of this change appears right after code block 5).
- 6. Now that we can modify the source and have a script to compile it, an easy way to get the .so files into the Android project would be to compile manually and copy them over. But we are programmers, and we want this process wired up more conveniently:
- 1. First, the earlier article "Writing a Douyin App from scratch — starting with audio and video" introduced how to bring .so files into an Android project and call them from the JNI layer.
- 2. Then we only need to compile the FFmpeg source when required and replace the old .so files with the newly built ones, as in code block 4.
- 3. With the compile-and-copy script in place, we need Gradle to run it while compiling the project. As in code block 5, we put the code inside the build.gradle file of the app module.
- 4. Now just click Run and you will see the Gradle Console print FFmpeg's build log. From here we can happily modify and use the FFmpeg source.
----- code block 4, from Jianshu/Juejin: whensunset -----
#!/usr/bin/env bash

#Uncomment the exit below to skip compiling FFmpeg during the Android build; keep it commented to compile FFmpeg as part of the build
# exit

#Execute the compile script in the FFmpeg source project
sh /Users/whensunset/AndroidStudioProjects/KSVideoProject/ffmpeg/build_android.sh

#The so directory of the current project; change it to your own
so_path="/Users/whensunset/AndroidStudioProjects/KSVideoProject/FFmpeglearning/app/src/main/jni/ffmpeg/armeabi/"

#The default names of the so files after compilation
libavcodec_name="libavcodec-57.so"
libavdeivce_name="libavdevice-57.so"
libavfilter_name="libavfilter-6.so"
libavformat_name="libavformat-57.so"
libavutil_name="libavutil-55.so"
libpostproc_name="libpostproc-54.so"
libswresample_name="libswresample-2.so"
libseacale_name="libswscale-4.so"

#Delete the old so files in the current project
rm ${so_path}${libavcodec_name}
rm ${so_path}${libavdeivce_name}
rm ${so_path}${libavfilter_name}
rm ${so_path}${libavformat_name}
rm ${so_path}${libavutil_name}
rm ${so_path}${libpostproc_name}
rm ${so_path}${libswresample_name}
rm ${so_path}${libseacale_name}

#The path of the so files produced by the FFmpeg build; change it to your own
build_so_path="/Users/whensunset/AndroidStudioProjects/KSVideoProject/ffmpeg/android/arm/lib/"

#Copy the newly compiled so files into the so directory of the current project
cd /Users/whensunset/AndroidStudioProjects/KSVideoProject/FFmpeglearning/app
cp ${build_so_path}${libavcodec_name} ${so_path}${libavcodec_name}
cp ${build_so_path}${libavdeivce_name} ${so_path}${libavdeivce_name}
cp ${build_so_path}${libavfilter_name} ${so_path}${libavfilter_name}
cp ${build_so_path}${libavformat_name} ${so_path}${libavformat_name}
cp ${build_so_path}${libavutil_name} ${so_path}${libavutil_name}
cp ${build_so_path}${libpostproc_name} ${so_path}${libpostproc_name}
cp ${build_so_path}${libswresample_name} ${so_path}${libswresample_name}
cp ${build_so_path}${libseacale_name} ${so_path}${libseacale_name}
----- code block 5, from Jianshu/Juejin: whensunset -----
// Create a build_ffmpeg task that runs the shell script
task build_ffmpeg {
doLast {
exec {
            commandLine 'sh', '/Users/whensunset/AndroidStudioProjects/KSVideoProject/FFmpeglearning/app/build_ffmpeg.sh'
        }
    }
}
// Execute the build_ffmpeg task as a pre-compilation task.
tasks.whenTaskAdded { task ->
task.dependsOn 'build_ffmpeg'
}
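For reference, here is what the step-5 "hello world" change looks like. This is my reconstruction against the 3.3.8 tree, not a verbatim diff from the article; only the av_log line is added, and the rest of the function is elided.
----- sketch: the hello-world modification in libavcodec/allcodecs.c -----
void avcodec_register_all(void)
{
    // our added line; it will surface in Logcat once the log callback
    // from chapter two is registered
    av_log(NULL, AV_LOG_DEBUG, "hello world");

    /* ... FFmpeg's original codec registration code, unchanged ... */
}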
II. FFmpeg decoding
In the last article we briefly analyzed one of FFmpeg's official demos. A few weeks on, the project now has five official demos ported and working. In this chapter I analyze the decoding demo, laying the groundwork for the simple Android video player in the final chapter.
ffmpeg-learning: the example project for this chapter.
Writing a Douyin App from scratch — starting with audio and video: the previous article.
1. Start
- 1. First of all, the project is fairly simple. The entry point is MainActivity, which has a number of buttons.
- 2. Clicking a button starts a thread that runs the corresponding code, which eventually drops into the C++ code that uses the FFmpeg API to process video files.
- 3. The Java class FFmpegPlayer is used to call the C++ code.
- 4. player.cpp is the entry point of the native code.
- 5. You probably have not forgotten the log line we added to FFmpeg in the last chapter. Some people might ask: where can that log actually be seen? In C/C++ there is the concept of a standard output stream. FFmpeg's logs go to standard output, which prints to a console we cannot see on Android, so we redirect this log stream into the Android logging system; then we can see it in Logcat in Android Studio.
- 1. player.cpp contains the code in code block 6. Here we define two macros wrapping the Android log printing functions provided by the NDK, and set the log tag to "FFMPEG"; later we only need to filter on this tag in the AS console to see FFmpeg's internal log output.
- 2. We then define a function that we expect to be called whenever FFmpeg prints a log line, so that the log FFmpeg produces is passed to this function and forwarded to the Android log.
- 3. Look at code block 7, also in player.cpp. FFmpeg provides av_log_set_callback, which hands the function we just defined to FFmpeg as a function pointer. Whenever FFmpeg logs something, it calls the function defined in step 2, redirecting FFmpeg's log output stream into our Android logging system.
- 4. Of course, we still need to declare the native method in FFmpegPlayer and make the initial call in MainActivity.
----- code block 6, from Jianshu/Juejin: whensunset -----
#ifndef LOG_TAG
#define LOG_TAG "FFMPEG"
#endif
#define XLOGD(...) __android_log_print(ANDROID_LOG_INFO,LOG_TAG,__VA_ARGS__)
#define XLOGE(...) __android_log_print(ANDROID_LOG_ERROR,LOG_TAG,__VA_ARGS__)
static void log_callback_null(void *ptr, int level, const char *fmt, va_list vl)
{
static int print_prefix = 1;
static char prev[1024];
char line[1024];
av_log_format_line(ptr, level, fmt, vl, line, sizeof(line), &print_prefix);
strcpy(prev, line);
if (level <= AV_LOG_WARNING)
{
XLOGE("%s", line);
}
else
{
XLOGD("%s", line); }}Copy the code
----- code block 7, from Jianshu/Juejin: whensunset -----
extern "C"
JNIEXPORT void JNICALL
Java_com_example_whensunset_ffmpeg_1learning_FFmpegPlayer_initFfmpegLog(JNIEnv *env, jobject instance) {
av_log_set_callback(log_callback_null);
}
2. Decoding
- 1. The code below is the decoding code; in the sample project, clicking the pure-decoding button triggers this function.
- 2. Before running it, copy the c.mpeg4 file from the sample project into the /storage/emulated/0/av_test/ directory on the phone.
- 3. As a prerequisite, we need to understand that an mp4 file goes through the following steps on its way to the screen (a sketch of how these map onto FFmpeg calls follows this list):
- 1. Demuxing: parse the structure of the mp4 file, then read out the data streams inside it.
- 2. Decoding: the data streams from step 1 are compressed by an encoding algorithm, generally H.264, MPEG-4, or similar. This step decodes each frame of the stream into an image-like form.
- 3. Display: draw the images decoded in step 2 onto the screen.
- 4. The code below skips the demuxing step and directly decodes the c.mpeg4 file we pass in into c.yuv, the raw image data.
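Before diving into the code, the sketch below (my own summary, not code from the article) shows roughly how the three steps map onto the FFmpeg 3.x API. Code block 8 implements only step 2, while code block 9 in the next chapter implements the whole chain; error handling and decoder setup are elided here.
----- sketch: how the three steps map onto FFmpeg calls -----
extern "C" {
#include "libavformat/avformat.h"
#include "libavcodec/avcodec.h"
}

static void pipeline_sketch(const char *path) {
    AVFormatContext *fmt_ctx = NULL;
    // Step 1, demuxing: open the container and read compressed packets out of it
    avformat_open_input(&fmt_ctx, path, NULL, NULL);
    avformat_find_stream_info(fmt_ctx, NULL);

    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.data = NULL;
    pkt.size = 0;
    while (av_read_frame(fmt_ctx, &pkt) >= 0) {
        // Step 2, decoding: feed packets in, drain raw frames out
        //   avcodec_send_packet(dec_ctx, &pkt);
        //   while (avcodec_receive_frame(dec_ctx, frame) >= 0) { ... }
        // Step 3, display: convert each frame (sws_scale) and draw it to the screen
        av_packet_unref(&pkt);
    }
    avformat_close_input(&fmt_ctx);
}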
----- code block 8, from Jianshu/Juejin: whensunset -----
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
extern "C" {
#include "libavcodec/avcodec.h"
}
#define INBUF_SIZE 4096
static void pgm_save(unsigned char *buf, int wrap, int xsize, int ysize,
const char *filename) {
FILE *f;
int i;
f = fopen(filename, "w");
fprintf(f, "P5\n%d %d\n%d\n", xsize, ysize, 255);
for (i = 0; i < ysize; i++)
fwrite(buf + i * wrap, 1, xsize, f);
fclose(f);
}
static int decode(AVCodecContext *dec_ctx, AVFrame *frame, AVPacket *pkt,
const char *filename) {
char buf[1024];
int ret;
// Pass a compressed image into the decoder
ret = avcodec_send_packet(dec_ctx, pkt);
if (ret < 0) {
return ret;
}
while (ret >= 0) {
// avcodec_send_packet and avcodec_receive_frame usually come in pairs; ret drops below 0 when no more frames can be fetched
ret = avcodec_receive_frame(dec_ctx, frame);
if (ret < 0) {
return ret;
}
av_log(NULL, AV_LOG_DEBUG, "saving frame %3d\n", dec_ctx->frame_number);
fflush(stdout);
/* the picture is allocated by the decoder. no need to free it */
snprintf(buf, sizeof(buf), "%s-%d", filename, dec_ctx->frame_number);
// data[0] holds the raw image data. One frame laid out in memory looks like:
//   ........**
//   ........**
//   ........**
//   ........**
// The dots are the visible image and the * are padding bytes. In general,
// width is the number of dots in a row, height is the number of rows, and
// linesize[0] is the number of bytes in a row including the padding.
// What we finally save to the file is only the dots, without the padding:
//   ........
//   ........
//   ........
//   ........
pgm_save(frame->data[0], frame->linesize[0],
frame->width, frame->height, filename);
}
return 0;
}
char *decode_video(char **argv) {
const char *filename, *outfilename;
const AVCodec *codec;
AVCodecParserContext *parser;
AVCodecContext *c = NULL;
FILE *f;
AVFrame *frame;
uint8_t inbuf[INBUF_SIZE + AV_INPUT_BUFFER_PADDING_SIZE];
uint8_t *data;
size_t data_size;
int ret;
AVPacket *pkt;
// The names of the input and output files: the input file is c.mpeg4 and the output file is c.yuv
filename = argv[0];
outfilename = argv[1];
// Register all codecs
avcodec_register_all();
// Initializes AVPacket, the data structure used for a compressed image frame
pkt = av_packet_alloc();
if (!pkt)
    exit(1);
// Set inbuf from INBUF_SIZE to INBUF_SIZE + AV_INPUT_BUFFER_PADDING_SIZE to 0(this ensures that corrupted MPEG streams are not overread)
/* set end of buffer to 0 (this ensures that no overreading happens for damaged MPEG streams) */
memset(inbuf + INBUF_SIZE, 0, AV_INPUT_BUFFER_PADDING_SIZE);
// Find a codec by name, here we use the input file's codec MPEG4
codec = avcodec_find_decoder_by_name("mpeg4");
if (!codec) {
    ret = -1111;
    goto end;
}
// Based on the id of the codec, find a parser that can be used to parse out a compressed frame of data in the MPEG4 file stream
parser = av_parser_init(codec->id);
if (!parser) {
    ret = -1112;
    goto end;
}
// Initialize the codec context data structure based on the codec
c = avcodec_alloc_context3(codec);
if (!c) {
    ret = -1113;
    goto end;
}
// Open the codec
if ((ret = avcodec_open2(c, codec, NULL)) < 0) {
goto end;
}
// Open the file
f = fopen(filename, "rb");
if (!f) {
    ret = -1114;
    goto end;
}
// Initialize the AVFrame data structure, which stores one decoded frame of the image
frame = av_frame_alloc();
if (!frame) {
    ret = -1115;
    goto end;
}
// Loop until the input file is read to the end
while (!feof(f)) {
    // Read 4096 bytes from the input file
data_size = fread(inbuf, 1, INBUF_SIZE, f);
    if (!data_size)
        break;
// The 4096 bytes may contain multiple frames of compressed image, so the compressed image data is parsed out one frame at a time, then decoded into one frame of decoded image data, and then recycled until the 4096 bytes have been read.
data = inbuf;
while (data_size > 0) {
// A frame of compressed image data is parsed to AV_Packet from 4096 bytes with data as the starting point. The return value is the byte size of the compressed frame
if ((ret = av_parser_parse2(parser, c, &pkt->data, &pkt->size, data, data_size,
                               AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0)) < 0) {
goto end;
}
// Move data to the new starting point
data += ret;
// Records the size of the remaining available bytes in the 4096 bytes
data_size -= ret;
// If size is greater than 0, data was successfully read
if (pkt->size) {
// Parse a PKT packet into a framedecode(c, frame, pkt, outfilename); }}}/* flush the decoder */
decode(c, frame, NULL, outfilename);
fclose(f);
end:
av_parser_close(parser);
avcodec_free_context(&c);
av_frame_free(&frame);
av_packet_free(&pkt);
if (ret < 0) {
    // static so the returned error string stays valid after the function returns
    static char buf2[500] = {0};
    if (ret == -1111) {
        return (char *) "codec not found";
    } else if (ret == -1112) {
        return (char *) "parser not found";
    } else if (ret == -1113) {
        return (char *) "could not allocate video codec context";
    } else if (ret == -1114) {
        return (char *) "could not open input file";
    } else if (ret == -1115) {
        return (char *) "could not allocate video frame";
    }
    av_strerror(ret, buf2, sizeof(buf2));
    return buf2;
} else {
return (char *) "Decoded successfully"; }}Copy the code
III. A minimalist video player
This final chapter introduces a minimalist video player built on FFmpeg decoding.
- 1. First of all, the player is very simple, so simple that there is almost nothing to it: it just draws the images decoded from the file onto the Surface.
- 2. Usage of the example program: copy the video to be played onto the phone as /storage/emulated/0/av_test/b.mp4.
- 3. For more detail, read the code below, which is commented. I am a little tired of writing, so this article stops here :-)
----- code block 9, from Jianshu/Juejin: whensunset -----
extern "C"
{
#include <android/native_window.h>
#include <android/native_window_jni.h>
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#include "libavutil/imgutils.h"
};
#include <sys/time.h>
#include <unistd.h>
#include <pthread.h>
static AVFormatContext *pFormatCtx;
static AVCodecContext *pCodecCtx;
static int video_stream_index = -1;
static AVCodec *pCodec;
static int64_t last_pts = AV_NOPTS_VALUE;
static long getCurrentTime()
{
struct timeval tv;
gettimeofday(&tv,NULL);
return tv.tv_sec * 1000 + tv.tv_usec / 1000;
}
struct timeval now;
struct timespec outtime;
pthread_cond_t cond;
pthread_mutex_t mutex;
static void sleep(int nHm) {
gettimeofday(&now, NULL);
now.tv_usec += 1000 * nHm;
if (now.tv_usec > 1000000) {
now.tv_sec += now.tv_usec / 1000000;
now.tv_usec %= 1000000;
}
outtime.tv_sec = now.tv_sec;
outtime.tv_nsec = now.tv_usec * 1000;
pthread_cond_timedwait(&cond, &mutex, &outtime);
}
static int open_input_file(const char *filename) {
int ret;
// Open the file, confirm the encapsulation format of the file, and write the file's information to AVFormatContext
if ((ret = avformat_open_input(&pFormatCtx, filename, NULL, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open input file\n");
return ret;
}
// Parse information from AVFormatContext for various streams in a file, such as audio streams, video streams, subtitle streams, etc
if ((ret = avformat_find_stream_info(pFormatCtx, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot find stream information\n");
return ret;
}
// Find the most suitable data stream and the codec for that data stream according to the parameters passed in
ret = av_find_best_stream(pFormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, &pCodec, 0);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot find a video stream in the input file\n");
return ret;
}
// Record the index of the video stream we just found
video_stream_index = ret;
// Construct the codec context based on the codec of the video stream found earlier
pCodecCtx = avcodec_alloc_context3(pCodec);
if (!pCodecCtx)
    return AVERROR(ENOMEM);
// Fill the codec context's parameters from the video stream's information
avcodec_parameters_to_context(pCodecCtx, pFormatCtx->streams[video_stream_index]->codecpar);
// Open the codec
if ((ret = avcodec_open2(pCodecCtx, pCodec, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open video decoder\n");
return ret;
}
return 0;
}
int play(JNIEnv *env, jobject surface) {
int ret;
char filepath[] = "/storage/emulated/0/av_test/b.mp4";
// Initialize libavFormat and register all wrappers, decapsulers and protocols.
av_register_all();
if (open_input_file(filepath) < 0) {
av_log(NULL, AV_LOG_ERROR, "can not open file");
return 0;
}
// Initialize two data structures that store decoded video frames, pFrame represents decoded video frames, and pFrameRGBA represents converted pFrame to RGBA format video frames
AVFrame *pFrame = av_frame_alloc();
AVFrame *pFrameRGBA = av_frame_alloc();
// Calculate the byte size of video frames in RGBA format. The length and width of video frames are determined when unpacking
int numBytes = av_image_get_buffer_size(AV_PIX_FMT_RGBA, pCodecCtx->width, pCodecCtx->height, 1);
// Initialize a block of memory that is the size of RGBA video frames
uint8_t *buffer = (uint8_t *) av_malloc(numBytes * sizeof(uint8_t));
// Fill the buffer into pFrameRGBA's data pointers
av_image_fill_arrays(pFrameRGBA->data, pFrameRGBA->linesize, buffer, AV_PIX_FMT_RGBA,
pCodecCtx->width, pCodecCtx->height, 1);
// Since the decoded frame format is not RGBA, format conversion is required before rendering
struct SwsContext *sws_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt,
                                            pCodecCtx->width, pCodecCtx->height, AV_PIX_FMT_RGBA,
                                            SWS_BILINEAR, NULL, NULL, NULL);
// Get the native window
ANativeWindow *nativeWindow = ANativeWindow_fromSurface(env, surface);
// Get the video width and height
int videoWidth = pCodecCtx->width;
int videoHeight = pCodecCtx->height;
// Set the buffer size of native window to automatically stretch
ANativeWindow_setBuffersGeometry(nativeWindow, videoWidth, videoHeight,
WINDOW_FORMAT_RGBA_8888);
ANativeWindow_Buffer windowBuffer;
av_dump_format(pFormatCtx, 0, filepath, 0);
// Initializes the data structure for compressed video frames
AVPacket *packet = (AVPacket *) av_malloc(sizeof(AVPacket));
while (1) {
long start_time = getCurrentTime();
// Read a compressed frame from the video stream
if ((ret = av_read_frame(pFormatCtx, packet)) < 0) {
av_log(NULL, AV_LOG_DEBUG, "can not read frame");
break;
}
// If the compressed frame was read from the video stream, it can be decoded
if (packet->stream_index == video_stream_index) {
// Send the compressed packet to the decoder
ret = avcodec_send_packet(pCodecCtx, packet);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while sending a packet to the decoder\n");
break;
}
while (ret >= 0) {
// Receive a decoded frame from the decoder
ret = avcodec_receive_frame(pCodecCtx, pFrame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
break;
} else if (ret < 0) {
av_log(NULL, AV_LOG_ERROR,
"Error while receiving a frame from the decoder\n");
}
ANativeWindow_lock(nativeWindow, &windowBuffer, 0);
// Convert YUV data to RGBA data
sws_scale(sws_ctx, (uint8_t const *const *) pFrame->data,
pFrame->linesize, 0, pCodecCtx->height,
pFrameRGBA->data, pFrameRGBA->linesize);
// Handle the stride
uint8_t *dst = (uint8_t *) windowBuffer.bits;
int dstStride = windowBuffer.stride * 4;
uint8_t *src = pFrameRGBA->data[0];
int srcStride = pFrameRGBA->linesize[0];
// Since the stride for window is different from the stride for frame, it is necessary to copy the data of image frame line by line into the buffer stream of Surface.
int h;
for (h = 0; h < videoHeight; h++) {
memcpy(dst + h * dstStride, src + h * srcStride, srcStride);
}
// If decoding this frame was fast, sleep so that each frame takes about 40 ms (~25 fps)
int sleep_time = 40 - (getCurrentTime() - start_time);
if (sleep_time > 0) {
sleep(sleep_time);
}
ANativeWindow_unlockAndPost(nativeWindow);
}
}
av_packet_unref(packet);
}
if (sws_ctx) sws_freeContext(sws_ctx);
av_frame_free(&pFrameRGBA);
if (pFrame) av_frame_free(&pFrame);
if (pCodecCtx) avcodec_close(pCodecCtx);
if (pFormatCtx) avformat_close_input(&pFormatCtx);
if (buffer) av_free(buffer);
return 0;
}
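Likewise, play() needs a JNI entry point that receives the Surface from the Java side. Again a hypothetical sketch: the method name is my assumption, and the class path follows code block 7.
----- sketch: a hypothetical JNI wrapper for play -----
#include <jni.h>

extern "C"
JNIEXPORT void JNICALL
Java_com_example_whensunset_ffmpeg_1learning_FFmpegPlayer_play(
        JNIEnv *env, jobject /* this */, jobject surface) {
    // surface comes from a SurfaceView (or TextureView) on the Java side
    play(env, surface);
}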
IV. Closing words
Another article comes to an end. The company has required a lot of overtime lately, and many plans have not gone ahead as scheduled; I hope things improve after this month. No need for rewards; I only hope you will comment, like, and follow me, which I will take as support and encouragement. See you in the next article!
No anxiety peddling, no clickbait. I just share interesting things about the world. Topics include but are not limited to: science fiction, science, technology, the Internet, programmers, and computer programming. Below is my WeChat public account, "Interesting things in the world", with plenty of good material waiting for you.