Forwarded from White Wolf Stack.
Filters
What is a filter? Baidu Baike explains that "filters are mainly used to achieve various special effects on images...".
We first encountered filters in the article "How FFmpeg Transcodes"; let's review the transcoding flowchart from that article.
In this diagram, the dotted lines before and after the filter step indicate that it is optional. In FFmpeg terminology, a filter is the operation that processes the raw data decoded by the decoder (audio and video frames) before it is encoded again.
FFmpeg has nearly 400 filters built in. You can list all of them with the ffmpeg -filters command, view a specific filter's options with ffmpeg -h filter=xxx, or consult the official documentation for each filter.
In fact, filters are used in most audio and video processing, so you can see how important they are.
Multiple filters can be combined into a filter chain or filter graph. Within a graph, a filter does not only process the input source: the output of filter A can also be fed into filter B as its input for further processing.
For filter processing, FFmpeg provides two modes: simple filters and complex filters.
Simple filters
A simple filter is one with exactly one input and one output, where the input and output streams are of the same type.
For example, take operation (2) from the stream-selection section of the previous article: scale the original video r3.mp4 down by a factor of two.
ffmpeg -i r3.mp4 -vf scale=272:480 -y filter.mp4
-vf is short for -filter:v. Similarly, we can use -filter:a or -af for audio streams.
-filter[:stream_specifier] filtergraph (output, per-stream)
Here stream_specifier is the stream type: generally a for audio and v for video; filtergraph specifies the filter(s) to apply.
The scale filter
The scale filter adjusts the size of a video, scaling it up or down. Scaling proportionally keeps the picture undistorted; scaling without preserving the aspect ratio deforms the output, which is generally undesirable.
We know the resolution of the original video r1ori.mp4 is 544×960, so scaling down by a factor of two gives the 272×480 specified directly in the command above. The scale filter has many parameters; here are a few common ones.
- in_w and in_h (or iw and ih): the width and height of the input video
- out_w and out_h (or ow and oh): the width and height of the output video
Of course, the input and output need not be video; they can also be images.
So to scale down by a factor of two, we can also write:
ffmpeg -i r3.mp4 -vf scale=iw/2:ih/2 -y filter.mp4
Question 1: what if we want the output width to be 300 while keeping the original aspect ratio?
We could solve 544/960 = 300/x, giving x = 300 × 960/544 ≈ 529. That is tedious, and the result may not even be an integer, so instead we can simply set the height to -1 and scale will compute it automatically.
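The proportional arithmetic above can be checked with a few lines of Python (an illustrative helper, not part of FFmpeg):

```python
def scaled_height(in_w: int, in_h: int, target_w: int) -> float:
    """Height that preserves the input aspect ratio at the target width."""
    # in_w / in_h == target_w / x  =>  x = target_w * in_h / in_w
    return target_w * in_h / in_w

# The original video is 544x960; a target width of 300 gives ~529.4,
# not an integer -- exactly why scale=300:-1 is convenient.
print(scaled_height(544, 960, 300))  # 529.4117647058824
```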
ffmpeg -i r1ori.mp4 -vf scale=300:-1 -y filter.mp4
The command fails with:
[libx264 @ 0x7ff509053a00] height not divisible by 2 (300x529)
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
[aac @ 0x7ff50904e200] Qavg: 28010.410
[aac @ 0x7ff50904e200] 2 frames left in the queue on closing
height not divisible by 2 (300x529): some codecs require the video width and height to be multiples of n (here n = 2). So when writing scripts that handle video or images, don't use -1; use -2 instead.
ffmpeg -i r1ori.mp4 -vf scale=300:-2 -y filter.mp4
The resolution of the output video is 300×530.
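FFmpeg's exact rounding rule for -2 isn't spelled out here, so the helper below simply mirrors the outputs observed in this article (a rough sketch, hypothetical names):

```python
def even_dimension(in_w: int, in_h: int, target_w: int) -> int:
    """Proportional height rounded to the nearest multiple of 2,
    mimicking the behaviour of scale=W:-2 as observed above."""
    h = target_w * in_h / in_w
    return int(round(h / 2)) * 2

# 544x960 at width 300: 529.41 -> 530, matching the 300x530 output.
print(even_dimension(544, 960, 300))  # 530
# Swapping the roles computes an even width for a fixed height of 500.
print(even_dimension(960, 544, 500))  # 284
```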
Question 2: just to spite you, the boss raises a new requirement: "I want every output video to have a resolution of 300×500, without distortion." What do we do?
A 3:5 aspect ratio is rare; the common ratios today are 16:9 and 4:3. In other words, we must process the original video to satisfy the boss's unreasonable demand.
For the original video r1ori.mp4, if we fix the width at 300, the proportional height is 530; forcing the height down to 500 would distort the picture. So we can only fix the height at 500 and let the width shrink proportionally.
ffmpeg -i r1ori.mp4 -vf scale=-2:500 -y filter.mp4
The resolution of the output video is 284×500.
As shown in the image above, the blue box indicates the actual width and height of the video, and the red box indicates the target width and height.
After reviewing the documentation, we find that the pad filter solves our problem.
pad=width[:height[:x[:y[:color]]]]
Mp4-vf "scale=-2:500,pad=300:500 (300-IW)/2:0" -y filter2.mp4 2, FFMPEG-I r1ori. Mp4-vf "scale=-2:500,pad=300:500 (300-IW)/2:0" -y filter2.mp4 Mp4-vf scale=-2:500,pad=300:500:-1:0: black-y filter.mp4 3. Ffmpeg-i r1ori 4, ffmPEG-I r1ori. Mp4-vf "scale=-2:500,pad=300: IH :(OW-IW)/2:0:green" -y filter.mp4
Four ways of writing it are given above; let's briefly walk through method 4.
scale=-2:500 fixes the height at 500, producing a 284×500 video.
In pad=300:ih:(ow-iw)/2:0:green, 300:ih (that is, 300:500) is the width and height of the red box; (ow-iw)/2 is half the difference between the red box and the blue box, i.e. the amount to fill on each side; the last parameter is the fill color, black by default, but set to green here for easier debugging.
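The pad offsets are plain centering arithmetic; a quick sketch (illustrative helper, not an FFmpeg API):

```python
def pad_offsets(canvas_w: int, canvas_h: int, video_w: int, video_h: int):
    """Top-left offset that centers a video on a pad canvas,
    i.e. (out_w - in_w)/2 and (out_h - in_h)/2 in pad's terms."""
    return (canvas_w - video_w) // 2, (canvas_h - video_h) // 2

# A 284x500 video centered on a 300x500 canvas: 8 pixels of padding
# on each side, no vertical padding.
print(pad_offsets(300, 500, 284, 500))  # (8, 0)
```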
Now the current video is output at 300×500 without distortion. But note the boss said "all output videos": the input resolution might be 200×300, 544×960, 500×400, 200×800, and so on, and every one of them must come out at 300×500. Clearly the command above is not fully general. What now?
Since we know the width and height of the input video and the desired output size, we can cover every case with a set of processing rules:
- Width and height both within range: no stretching, no scaling, just pad
- Width and height both out of range: scale down proportionally, based on the height
- Width out of range: scale down proportionally, based on the width
- Height out of range: scale down proportionally, based on the height
In actual development we work in code; running commands by hand is just practice. Based on these rules, we get the following code:
<?php
declare(strict_types=1);

class CalculatorService
{
    /**
     * Convert a user video's resolution.
     * Rules:
     *  - width and height both within range: no stretching, no scaling
     *  - width and height both too large: scale down proportionally, based on the height
     *  - width out of range: scale down proportionally, based on the width
     *  - height out of range: scale down proportionally, based on the height
     *
     * @param int $inputWidth  input video width
     * @param int $inputHeight input video height
     * @param int $outWidth    output video width
     * @param int $outHeight   output video height
     */
    public function getSize(int $inputWidth, int $inputHeight, int $outWidth, int $outHeight): string
    {
        $scale = "";
        if ($inputWidth <= $outWidth && $inputHeight <= $outHeight) {
            $scale = "scale={$inputWidth}:{$inputHeight},pad={$outWidth}:{$outHeight}:-1:-1:green";
        } elseif (($inputWidth > $outWidth && $inputHeight > $outHeight) || ($inputHeight > $outHeight)) {
            $scale = "scale=-2:{$outHeight},pad={$outWidth}:{$outHeight}:-1:0:green";
        } elseif ($inputWidth > $outWidth) {
            $scale = "scale={$outWidth}:-2,pad={$outWidth}:{$outHeight}:0:-1:green";
        }
        return $scale;
    }
}

$calculatorService = new CalculatorService();
var_dump($calculatorService->getSize(200, 300, 300, 500));
var_dump($calculatorService->getSize(544, 960, 300, 500));
var_dump($calculatorService->getSize(500, 400, 300, 500));
var_dump($calculatorService->getSize(200, 600, 300, 500));

// Result:
// string(37) "scale=200:300,pad=300:500:-1:-1:green"
// string(35) "scale=-2:500,pad=300:500:-1:0:green"
// string(35) "scale=300:-2,pad=300:500:0:-1:green"
// string(35) "scale=-2:500,pad=300:500:-1:0:green"
To understand each case, refer to the following figure for a one-to-one mapping.
Complex filters
Compared with a simple filter, a complex filter is a filter graph that can handle any number of inputs and outputs. It can do almost anything.
Complex filters are specified with the -filter_complex option, which also has an alias, -lavfi.
The previous article showed that combining stream selection with filters is one of the most important and most commonly used techniques. Again scaling the input video r3.mp4 down by a factor of two, let's take manual stream selection as the example.
ffmpeg -i r3.mp4 -filter_complex "[0]scale=272:480[out]" -map 0:a -map "[out]" -y filter.mp4
A simple analysis is as follows:
- In [0]scale=272:480[out], [0] can also be written as [0:v], meaning the video stream of the first input;
- [0] combined with the scale filter means the first input video is fed into the scale filter as its input;
- The brackets around [out] are required. out is a custom label; writing it after the scale filter names the filter's output [out]. It is not the final output; it can only serve as an intermediate result.
- -map "[out]" selects the [out] stream as the output.
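Labels like [0] and [out] are just substrings of the -filter_complex argument; assembling the command programmatically makes that explicit (a sketch, not an FFmpeg API):

```python
# Build the same command as above by composing the labelled filter graph.
graph = "[0]scale=272:480[out]"
cmd = [
    "ffmpeg", "-i", "r3.mp4",
    "-filter_complex", graph,
    "-map", "0:a",      # keep the original audio stream
    "-map", "[out]",    # select the filter graph's output as the video stream
    "-y", "filter.mp4",
]
print(" ".join(cmd))
```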
As mentioned, the output of one filter can serve as the input of another, which saves writing and running multiple commands. Our rule: never use two commands where one will do.
Repeated encoding with lossy codecs degrades the original video quality.
For example, suppose I want to keep only the middle strip of the original video r1ori.mp4, while still outputting at the original 544×960 resolution. How?
ffmpeg -i r1ori.mp4 -filter_complex "nullsrc=s=544x960[background]; \
crop=iw:(ih/2 - 110):0:250[middle]; \
[background][middle]overlay=shortest=1:x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[out]" \
-map "[out]" \
-map 0:a \
-movflags +faststart \
-y fc.mp4
This command is a little longer and uses three common filters: nullsrc, crop, and overlay.
The nullsrc filter creates an empty video, in effect an empty canvas (a green screen, since its default color is green). The s option sets the canvas size, 320×240 by default. Here we create a 544×960 canvas and name it [background].
nullsrc has many other uses; for example, combining nullsrc with the coreimage filter's CIQRCodeGenerator to generate a QR code for the home page of "White Wolf Stack":
ffmpeg -f lavfi -i nullsrc=s=200x200,coreimage=filter=CIQRCodeGenerator@inputMessage=\
http\\\\\://manks.top/@inputCorrectionLevel=H -frames:v 1 manks.png
The crop filter crops the video: we can cut any area of any size out of it. Its parameters are width:height:x:y, so crop=iw:(ih/2 - 110):0:250[middle] keeps the full width and ih/2 - 110 rows starting 250 pixels from the top, and names the result [middle].
The overlay filter stacks one video on top of another. The shortest option, when set to 1, forces the output to end when the shortest input ends (the default is 0). Because the nullsrc canvas has no natural end, we want the middle video's duration to be the final duration, so we set shortest=1. main_w and main_h are the width and height of the main (bottom) video; overlay_w and overlay_h are the width and height of the overlaid video. If video A is overlaid on video B, main_w and main_h refer to B, and overlay_w and overlay_h refer to A. Here we overlay middle on background and center it.
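The overlay position expressions reduce to the same centering arithmetic; evaluated in Python with this article's numbers (illustrative only):

```python
def overlay_center(main_w: int, main_h: int, overlay_w: int, overlay_h: int):
    """(main_w-overlay_w)/2, (main_h-overlay_h)/2: the overlay filter's
    centering expressions evaluated for given dimensions."""
    return (main_w - overlay_w) // 2, (main_h - overlay_h) // 2

# The crop keeps the full 544-pixel width and ih/2 - 110 = 370 rows,
# so the strip lands at x=0, y=295 on the 544x960 canvas.
print(overlay_center(544, 960, 544, 960 // 2 - 110))  # (0, 295)
```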
The last option, -movflags, relates to MP4 metadata. Setting it to +faststart moves the moov atom in front of mdat, so playback over the network starts a little faster.
Assignment: we introduced two cases in the audio and video synthesis article. Try to solve each of them with a single command.
That wraps up the basic introduction to filters; feel free to leave any questions below.