Preface
Last time we covered FFmpeg decoding, pixel format conversion, and audio resampling; this time we focus on Qt's cross-platform audio and video rendering APIs. Previous articles in this series:
Cross-platform Player Development (1) Qt for macOS & FFmpeg environment build
Cross-platform Player Development (2) Qt for Linux & FFmpeg environment build
Cross-platform Player Development (3) Qt for Windows & FFmpeg environment build
Cross-platform Player Development (4) What FFmpeg knowledge is needed to develop a player
PCM rendering
In fact, whether you render PCM with Android's AudioTrack or with OpenSL ES, the principle is the same: configure the basic PCM parameters such as sample rate, channel count, and sample size, then write(pcmBuffer) data whenever the sound card calls back for more. We will code step by step following this idea.
Step 1: Set basic PCM information
To configure the audio information we use the QAudioFormat class. According to the official documentation, multimedia programming requires the Multimedia module, which can be configured in CMakeLists.txt:
set(QT_VERSION 5)
set(REQUIRED_LIBS Core Gui Widgets Multimedia)
set(REQUIRED_LIBS_QUALIFIED Qt5::Core Qt5::Gui Qt5::Widgets Qt5::Multimedia)
find_package(Qt${QT_VERSION} COMPONENTS ${REQUIRED_LIBS} REQUIRED)
add_executable(qt-audio-debug ${QT_AUDIO_SRC})
target_link_libraries(qt-audio-debug ${REQUIRED_LIBS_QUALIFIED})
Next, call QAudioFormat to configure the audio information
QAudioFormat format;
// Set the sampling rate
format.setSampleRate(this->sampleRate);
// Set the channel
format.setChannelCount(this->channelCount);
// Set the number of sampling bits
format.setSampleSize(this->sampleSize);
format.setCodec("audio/pcm");
format.setByteOrder(QAudioFormat::LittleEndian);
format.setSampleType(QAudioFormat::SignedInt);
const QAudioDeviceInfo audioDeviceInfo = QAudioDeviceInfo::defaultOutputDevice();
QAudioDeviceInfo info(audioDeviceInfo);
// Check whether this format is supported
bool audioDeviceOk = info.isFormatSupported(format);
if (!audioDeviceOk) {
    qWarning() << "Default format not supported - trying to use nearest";
    format = info.nearestFormat(format);
}
Step 2: Send the audio data to the audio output device interface
// Pass the device info and format configured above to the audio output object
auto *audioOutput = new QAudioOutput(audioDeviceInfo, format);
// Start playing in pull mode, handing over a QIODevice* that supplies PCM
audioOutput->start(device); // 'device' is our QIODevice subclass from step 3
When starting playback we pass in a QIODevice; this is the callback mechanism mentioned earlier: the sound card pulls PCM data from it. If we skipped the QIODevice and simply kept writing in an infinite loop, it would not work well, because there is a buffer underneath; the best approach is to write only when that buffer runs low, which is exactly what this pull mode gives us.
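To make the two modes concrete, here is a minimal sketch contrasting them (PCMPlay is the QIODevice subclass defined in step 3; pcmBuffer and pcmChunkSize are placeholders, not names from the original):

// Pull mode (used here): Qt pulls PCM from our QIODevice subclass
// whenever its internal buffer drains.
auto *pcmDevice = new PCMPlay();
pcmDevice->open(QIODevice::ReadOnly); // the device must be open before start()
audioOutput->start(pcmDevice);

// Push mode (alternative): start() returns a device we write into,
// pacing ourselves with bytesFree() instead of looping blindly.
QIODevice *sink = audioOutput->start();
if (audioOutput->bytesFree() >= pcmChunkSize)
    sink->write(pcmBuffer, pcmChunkSize);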
Step 3: Provide PCM data to the sound card
First, we’re going to inherit from QIODevice and then override the readData function
// pcmplay.h
class PCMPlay : public QIODevice {
    Q_OBJECT
public:
    PCMPlay();
    // ...
    qint64 readData(char *data, qint64 maxlen) override;
    // ...
};

// pcmplay.cpp
qint64 PCMPlay::readData(char *data, qint64 maxlen) {
    if (m_pos >= m_buffer.size())
        return 0;
    qint64 total = 0;
    if (!m_buffer.isEmpty()) {
        while (maxlen - total > 0) {
            // Copy as much as possible; the modulo wraps m_pos around,
            // so the buffer loops when we reach its end
            const qint64 chunk = qMin((m_buffer.size() - m_pos), maxlen - total);
            memcpy(data + total, m_buffer.constData() + m_pos, chunk);
            m_pos = (m_pos + chunk) % m_buffer.size();
            total += chunk;
        }
    }
    // Report how many bytes were actually written (0 if the buffer was empty)
    return total;
}
In this step we copy the PCM data into the data pointer that readData() provides. Once the PCM buffer read here is delivered to the sound card, we hear sound.
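Note that QIODevice also declares writeData() as pure virtual, so PCMPlay has to override it too; a minimal sketch that uses it to feed decoded PCM into m_buffer (assuming m_buffer is a QByteArray member, matching the readData() above):

// Feed decoded PCM into the internal buffer; readData() above hands it
// to the sound card when more data is requested.
qint64 PCMPlay::writeData(const char *data, qint64 len) {
    m_buffer.append(data, len);
    return len;
}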
Then if you want to pause or do anything else, you can call the following functions provided by QAudioOutput:
void stop();
void reset();
void suspend();
void resume();
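For example, a pause button can toggle between suspend() and resume(); a small usage sketch, assuming the audioOutput object created in step 2:

// Toggle pause/resume depending on the current playback state
if (audioOutput->state() == QAudio::ActiveState)
    audioOutput->suspend();   // pause, keeping buffered data
else if (audioOutput->state() == QAudio::SuspendedState)
    audioOutput->resume();    // continue from where we paused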
The code needed for the audio player is fairly small; for readability I have not posted all of it here. Access the complete code.
YUV rendering
As far as I know, YUV data cannot be rendered directly on any device; it has to be converted to RGB before the graphics card can display it. In the previous article we used FFmpeg's sws_getCachedContext / sws_scale functions for this conversion. Because converting with FFmpeg is too resource-intensive, here we use an OpenGL shader to do the conversion instead. The conversion formula is as follows:
const char *fString = GET_STR(
        varying vec2 textureOut;
        uniform sampler2D tex_y;
        uniform sampler2D tex_u;
        uniform sampler2D tex_v;
        void main(void) {
            vec3 yuv;
            vec3 rgb;
            yuv.x = texture2D(tex_y, textureOut).r;
            yuv.y = texture2D(tex_u, textureOut).r - 0.5;
            yuv.z = texture2D(tex_v, textureOut).r - 0.5;
            rgb = mat3(1.0, 1.0, 1.0,
                       0.0, -0.39465, 2.03211,
                       1.13983, -0.58060, 0.0) * yuv;
            gl_FragColor = vec4(rgb, 1.0);
        }
);
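The matrix encodes the standard BT.601 conversion, i.e. R = Y + 1.13983V, G = Y - 0.39465U - 0.58060V, B = Y + 2.03211U. The vString vertex shader compiled in initializeGL() below is not shown in this excerpt; a minimal sketch that matches the vertexIn / textureIn attribute names used later (an assumption, not the article's exact source):

// Sketch of a pass-through vertex shader: forwards the texture
// coordinate to the fragment shader above.
const char *vString = GET_STR(
        attribute vec4 vertexIn;
        attribute vec2 textureIn;
        varying vec2 textureOut;
        void main(void) {
            gl_Position = vertexIn;
            textureOut = textureIn;
        }
);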
To use OpenGL in Qt, add the following to CMakeLists.txt:
set(QT_VERSION 5)
set(REQUIRED_LIBS Core Gui Widgets Multimedia OpenGL)
set(REQUIRED_LIBS_QUALIFIED Qt5::Core Qt5::Gui Qt5::Widgets Qt5::Multimedia Qt5::OpenGL)
find_package(Qt${QT_VERSION} COMPONENTS ${REQUIRED_LIBS} REQUIRED)
add_executable(qt-audio-debug ${QT_AUDIO_SRC})
target_link_libraries(qt-audio-debug ${REQUIRED_LIBS_QUALIFIED})
To use QOpenGLWidget in Qt, we inherit from it:
class QYUVWidget : public QOpenGLWidget, protected QOpenGLFunctions {
    Q_OBJECT
public:
    QYUVWidget(QWidget *);
    ~QYUVWidget();
    // Initialize the frame buffer size
    void InitDrawBufSize(uint64_t size);
    // Draw one frame
    void DrawVideoFrame(unsigned char *data, int frameWidth, int frameHeight);
protected:
    // Refresh the display
    void paintGL() override;
    // Initialize GL
    void initializeGL() override;
    // The window size has changed
    void resizeGL(int w, int h) override;
    // ...
};
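The implementation below references the attribute indices A_VER / T_VER and the coordinate arrays VER / TEX without showing their definitions; a sketch of what they might look like (the names come from the original, the values are assumptions) for a full-screen quad drawn as a triangle strip:

// Attribute indices bound in initializeGL() (values assumed)
#define A_VER 3   // vertex position attribute
#define T_VER 4   // texture coordinate attribute

// Full-screen quad, drawn with GL_TRIANGLE_STRIP (4 vertices)
static const GLfloat VER[] = {
        -1.0f, -1.0f,
         1.0f, -1.0f,
        -1.0f,  1.0f,
         1.0f,  1.0f,
};

// Texture coordinates, flipped vertically (a common choice so the
// frame is not rendered upside down)
static const GLfloat TEX[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
};

// Texture ids filled in by glGenTextures() in initializeGL()
static GLuint texs[3] = {0};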
Define the implementation functions in the .cpp file:
// Initialize the buffer that holds one YUV frame
void QYUVWidget::InitDrawBufSize(uint64_t size) {
    impl->mFrameSize = size;
    impl->mBufYuv = new unsigned char[size];
}

// Called when there is new data; update() schedules a repaint, which executes paintGL()
void QYUVWidget::DrawVideoFrame(unsigned char *data, int frameWidth, int frameHeight) {
    impl->mVideoW = frameWidth;
    impl->mVideoH = frameHeight;
    memcpy(impl->mBufYuv, data, impl->mFrameSize);
    update();
}

// Initialize the OpenGL functions
void QYUVWidget::initializeGL() {
    // 1. Initialize the Qt OpenGL functions
    initializeOpenGLFunctions();
    // 2. Load and compile the vertex and fragment shaders
    impl->mVShader = new QOpenGLShader(QOpenGLShader::Vertex, this);
    // Compile the vertex shader program
    if (!impl->mVShader->compileSourceCode(vString)) {
        throw QYUVException();
    }
    impl->mFShader = new QOpenGLShader(QOpenGLShader::Fragment, this);
    // Compile the fragment shader program
    if (!impl->mFShader->compileSourceCode(fString)) {
        throw QYUVException();
    }
    // 3. Create a program to run the shaders
    impl->mShaderProgram = new QOpenGLShaderProgram(this);
    // Add the vertex and fragment shaders to the program container
    impl->mShaderProgram->addShader(impl->mFShader);
    impl->mShaderProgram->addShader(impl->mVShader);
    // 4. Bind the vertex coordinate attribute
    impl->mShaderProgram->bindAttributeLocation("vertexIn", A_VER);
    // Bind the texture coordinate attribute
    impl->mShaderProgram->bindAttributeLocation("textureIn", T_VER);
    // Link and bind the shader program
    qDebug() << "program.link() = " << impl->mShaderProgram->link();
    qDebug() << "program.bind() = " << impl->mShaderProgram->bind();
    // 5. Get the uniform locations of the y, u, v textures in the shader
    impl->textureUniformY = impl->mShaderProgram->uniformLocation("tex_y");
    impl->textureUniformU = impl->mShaderProgram->uniformLocation("tex_u");
    impl->textureUniformV = impl->mShaderProgram->uniformLocation("tex_v");
    // 6. Upload vertex and texture coordinates
    // Vertices
    glVertexAttribPointer(A_VER, 2, GL_FLOAT, 0, 0, VER);
    glEnableVertexAttribArray(A_VER);
    // Texture coordinates
    glVertexAttribPointer(T_VER, 2, GL_FLOAT, 0, 0, TEX);
    glEnableVertexAttribArray(T_VER);
    // 7. Create the y, u, v texture ids
    glGenTextures(3, texs);
    impl->id_y = texs[0];
    impl->id_u = texs[1];
    impl->id_v = texs[2];
}
// Bind y, u, v to their texture ids and render
void QYUVWidget::paintGL() {
    // 1. Activate and bind the y texture
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, impl->id_y);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, impl->mVideoW, impl->mVideoH, 0, GL_LUMINANCE,
                 GL_UNSIGNED_BYTE, impl->mBufYuv);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // 2. Activate texture unit GL_TEXTURE1 and bind the u texture
    //    (half width and height, since the source is YUV420P)
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, impl->id_u);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, impl->mVideoW / 2, impl->mVideoH / 2, 0, GL_LUMINANCE,
                 GL_UNSIGNED_BYTE, (char *) impl->mBufYuv + impl->mVideoW * impl->mVideoH);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // 3. Activate texture unit GL_TEXTURE2 and bind the v texture
    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, impl->id_v);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, impl->mVideoW / 2, impl->mVideoH / 2, 0, GL_LUMINANCE,
                 GL_UNSIGNED_BYTE, (char *) impl->mBufYuv + impl->mVideoW * impl->mVideoH * 5 / 4);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // 4. Point each sampler uniform at its texture unit index
    //    (only the indices 0, 1, 2, etc. are valid here)
    glUniform1i(impl->textureUniformY, 0);
    glUniform1i(impl->textureUniformU, 1);
    glUniform1i(impl->textureUniformV, 2);
    // Render
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
// Called when the viewport size changes
void QYUVWidget::resizeGL(int w, int h) {
    qDebug() << "resizeGL " << w << ":" << h;
    glViewport(0, 0, w, h);
    update();
}
Because OpenGL is cross-platform, the calling interface is essentially the same everywhere: learn it on one platform and you only need minor tweaks on the others. If you are interested in OpenGL, you can refer to the OpenGL ES 3.0 tutorial series that this author has summarized.
Compile and run the program; seeing the following screen means success.
Access complete code
Conclusion
Using Qt's cross-platform APIs, we implemented YUV and PCM rendering. Generally speaking, OpenGL is not easy to get started with, but once you seriously work through a few examples you will find it is all much the same, because the usage steps are similar. That concludes this article; the next one will focus on how to design a generic player architecture. Farewell.