Several good open source projects were mentioned in the live broadcast. Here are some of the highlights:
There are still not many books on the market for learning audio and video development, and even if you read them and absorb the theory, you eventually have to come back to the code.
After all, this industry puts a premium on practicality and hands-on ability, and audio and video work in particular rewards experimentation and exploration.
Each of the projects recommended below has its own focus, covering the Android audio and video recording APIs, OpenGL rendering, and integrated application examples respectively.
- GPUImage
GitHub address:
Github.com/cats-oss/an…
GPUImage is undoubtedly a must-read among audio and video projects; its focus is on rendering.
Its importance is reflected in the fact that some companies explicitly list familiarity with GPUImage in their job requirements.
By reading the GPUImage source code you can learn OpenGL rendering and how a rendering chain is built. The project also contains a large number of effect shaders; reading and experimenting with them will give you a basic ability to write shaders of your own.
Common filter effects, for example, already have code examples in GPUImage, and I also covered them in my livestream; if you are interested, watch the video and work through the filter code.
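To give a feel for what these filter shaders look like, here is a minimal grayscale fragment shader sketched in the GPUImage style; the varying and uniform names (textureCoordinate, inputImageTexture) follow GPUImage's conventions, but the filter classes in the project itself are the authoritative reference.

```kotlin
// A grayscale filter shader in the GPUImage style (illustrative sketch).
// The varying/uniform names follow GPUImage's conventions; check the
// project's own filter classes for the exact shaders it ships.
const val GRAYSCALE_FRAGMENT_SHADER = """
    precision mediump float;
    varying vec2 textureCoordinate;       // passed in by the vertex shader
    uniform sampler2D inputImageTexture;  // the frame being filtered

    void main() {
        vec4 color = texture2D(inputImageTexture, textureCoordinate);
        // Standard luminance weights: the eye is most sensitive to green.
        float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
        gl_FragColor = vec4(vec3(gray), color.a);
    }
"""
```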
If you want a source code analysis of GPUImage, you can also refer to an article I wrote earlier:
OpenGL GPUImage source code analysis
- AudioVideoRecordingSample
GitHub address:
Github.com/saki4510t/A…
This project focuses on the Android audio and video APIs, especially those for recording and encoding.
It encodes the video and audio captured by the Camera into an MP4 file.
It uses MediaCodec for encoding and MediaMuxer to mux the audio and video streams together.
A complete example like this is a great help for mastering the Android audio and video APIs: it runs correctly out of the box, and you can modify its source code to run your own experiments and verify your understanding of the APIs.
Once you've got the hang of it, you can go a step further and try FFmpeg for encoding and muxing, implementing the same functionality as the Android audio and video APIs.
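As a rough idea of what the MediaCodec plus MediaMuxer combination looks like, here is a minimal sketch of configuring an H.264 encoder and an MP4 muxer; the resolution, bitrate, and helper name are illustrative assumptions, and the sample project is the place to see the full drain loop and audio track wiring.

```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat
import android.media.MediaMuxer

// Minimal sketch: configure a hardware H.264 encoder and an MP4 muxer.
// Resolution, bitrate and the function name are illustrative assumptions.
fun createEncoderAndMuxer(outputPath: String): Pair<MediaCodec, MediaMuxer> {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720).apply {
        // Encoder input comes from a Surface (e.g. the Camera or an OpenGL surface).
        setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
        setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000)   // 4 Mbps
        setInteger(MediaFormat.KEY_FRAME_RATE, 30)
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)   // one key frame per second
    }
    val encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)

    // The muxer is created now, but addTrack()/start() must wait until the
    // encoder reports its real output format (INFO_OUTPUT_FORMAT_CHANGED).
    val muxer = MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
    return encoder to muxer
}
```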
- Grafika
GitHub address:
Github.com/google/graf…
This is an unofficial project from Google that focuses on combining OpenGL with the Android audio and video APIs.
It contains a number of complete small examples, such as how to display OpenGL content with TextureView, how to record OpenGL content in three different ways, how to do hardware encoding, and so on.
By reading through these examples, you will pick up more techniques and be able to use OpenGL and the Android audio and video APIs in a more flexible way.
Some examples can even be applied directly to real project requirements, such as how to encapsulate EGL and how to separate the render thread from the main thread (a minimal sketch follows below).
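As a rough sketch of the render-thread idea (not Grafika's actual code), the skeleton below keeps all GL work on a dedicated HandlerThread; eglSetup() and drawFrame() are hypothetical placeholders for the EGL and drawing code that Grafika demonstrates in full.

```kotlin
import android.os.Handler
import android.os.HandlerThread

// Skeleton of a dedicated render thread. eglSetup() and drawFrame() are
// hypothetical placeholders for the EGL/GL code that Grafika shows in full.
class RenderThread : HandlerThread("RenderThread") {

    // getLooper() blocks until the thread's Looper exists, so creating the
    // Handler lazily is safe to do from the main thread after start().
    private val handler by lazy { Handler(looper) }

    fun requestSetup() {
        // An EGL context must be created and made current on the thread that uses it.
        handler.post { eglSetup() }
    }

    fun requestFrame() {
        // Called from the main thread; the GL work runs on this thread's Looper.
        handler.post { drawFrame() }
    }

    private fun eglSetup() {
        // Create EGLDisplay, EGLContext and EGLSurface here.
    }

    private fun drawFrame() {
        // Issue GL draw calls and swap buffers here.
    }
}
```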
In addition, all three of the above projects involve camera-related operations, such as displaying the Camera preview on a SurfaceView or TextureView and taking pictures with the Camera, as in the sketch below.
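For reference, this is roughly what the simplest Camera-to-TextureView preview looks like with the legacy Camera API that these samples are built on; permissions, error handling, and orientation are omitted, and the class name here is only for illustration.

```kotlin
import android.graphics.SurfaceTexture
import android.hardware.Camera
import android.view.TextureView

// Minimal preview sketch using the legacy Camera API that these samples are
// built on. Permissions, error handling and orientation are omitted; the
// class name is only for illustration.
class PreviewListener : TextureView.SurfaceTextureListener {

    private var camera: Camera? = null

    override fun onSurfaceTextureAvailable(surface: SurfaceTexture, width: Int, height: Int) {
        camera = Camera.open().apply {
            setPreviewTexture(surface)  // frames flow straight into the TextureView
            startPreview()
        }
    }

    override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean {
        camera?.stopPreview()
        camera?.release()
        camera = null
        return true
    }

    override fun onSurfaceTextureSizeChanged(surface: SurfaceTexture, width: Int, height: Int) = Unit
    override fun onSurfaceTextureUpdated(surface: SurfaceTexture) = Unit
}

// Usage: textureView.surfaceTextureListener = PreviewListener()
```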
Finally
The three projects above are recommended for good reason: I have read through their source code at least twice myself.
The first time you read it, your reaction is "oh, so that's how it works"; when you actually start coding you have to go through it all again; and when you come back to it later, after your skills have improved, you see it with fresh eyes and take away something new.
That's all. I hope those of you working in audio and video development will also read the source code of these projects, pick up more techniques, and make progress together.