Preface

In Android audio and video development, online knowledge is fragmented and self-study is difficult, so the audio/video expert Jhuster put together the "Android audio and video from entry to advanced – task list". This article covers one task from that list; the content to learn is: the MediaCodec API, implementing hardware encoding and hardware decoding of AAC audio.


Audio and video task list

Audio and video task list: click here to view.


Contents

(1) What is encoding and decoding?

In the audio and video field, encoding is often described as compression and decoding as decompression. The purpose of encoding is to reduce the data volume, saving storage space and the bandwidth needed to transfer or store files. Encoded data cannot be used directly; it must first be decoded back to its original form. For example, if a picture is stored inside a ZIP archive, an image viewer cannot open it directly: the archive must first be decompressed to restore the picture's original data before it can be viewed. Audio and video encoding and decoding work the same way.


(2) A brief introduction to MediaCodec

2.1 A brief introduction to MediaCodec

The MediaCodec class is Android's codec API for encoding and decoding audio and video. MediaCodec processes data asynchronously using a producer-consumer model built on circular input and output buffer queues. On the input side, the Client is the "producer" and MediaCodec is the "consumer"; on the output side, MediaCodec is the "producer" and the Client becomes the "consumer".


2.2 MediaCodec workflow

(1) The Client requests an empty buffer from the input buffer queue (dequeueInputBuffer).
(2) The Client copies the data to be encoded/decoded into the empty buffer and submits it (queueInputBuffer).
(3) MediaCodec takes a frame from the input buffer queue and encodes or decodes it.
(4) After processing, MediaCodec marks the original input buffer as empty and puts it back into the input buffer queue.
(5) The Client requests a filled buffer from the output buffer queue (dequeueOutputBuffer).
(6) The Client consumes (renders/plays) the output buffer.
(7) After it has been consumed, the Client puts the buffer back into the output buffer queue (releaseOutputBuffer).

2.3 MediaCodec's life cycle

A MediaCodec goes through three life-cycle states: Stopped, Executing, and Released. The Stopped state consists of three sub-states: Uninitialized, Configured, and Error. The Executing state consists of three sub-states: Flushed, Running, and End-of-Stream.

(1) The codec is in the Uninitialized state when it is created. Calling configure(…) moves it to the Configured state, and calling start() then moves it to the Executing state, where the buffers described above can be used to process data.

(2) The Executing state has three sub-states: Flushed, Running, and End-of-Stream. Immediately after start() the codec is in the Flushed sub-state, where it holds all of its buffers. As soon as the first input buffer is dequeued, the codec moves to the Running sub-state, where it spends most of its life. When an input buffer with the end-of-stream flag is queued, the codec enters the End-of-Stream sub-state. In this state the codec no longer accepts input buffers, but it still produces output buffers. Calling flush() returns the codec to the Flushed sub-state.

(3) Calling stop() returns the codec to the Uninitialized state, after which it can be configured again. Once you are done with a codec, you must release it by calling release().

(4) In rare cases the codec may encounter an error and move to the Error state. This is signalled by invalid return values from queueing operations, or sometimes by exceptions. Calling reset() makes the codec usable again; it can be called from any state and moves the codec back to the Uninitialized state. Otherwise, calling release() moves the codec to the terminal Released state.
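The call sequence below is a minimal sketch of how these states are traversed in code (the AAC MIME type and format values are only examples; buffer processing and error handling are omitted):

// Minimal life-cycle sketch (inside a method that may throw IOException)
MediaCodec codec = MediaCodec.createDecoderByType("audio/mp4a-latm"); // Uninitialized
MediaFormat format = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 2);
codec.configure(format, null, null, 0); // Configured
codec.start();                          // Executing, Flushed sub-state
// ... dequeue/queue input buffers: Running sub-state
// ... queue a buffer with BUFFER_FLAG_END_OF_STREAM: End-of-Stream sub-state
codec.flush();                          // back to the Flushed sub-state
codec.stop();                           // back to Uninitialized; can be configured again
codec.release();                        // Released (terminal state)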


2.4 Data types supported by MediaCodec

MediaCodec works with three kinds of data: compressed data, raw audio data, and raw video data. All three can be processed using ByteBuffers, but for raw video data you should provide a Surface on which to display it; this also improves codec performance. A Surface uses native video buffers that are not mapped or copied into ByteBuffers, which makes the codec more efficient. When using a Surface you normally cannot access the raw video data, but you can use ImageReader to access the decoded raw video frames. In ByteBuffer mode, you can access raw video frames through the Image class via getInput/OutputImage(int).
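As a minimal sketch (assuming `videoFormat` comes from MediaExtractor#getTrackFormat and `surface` comes from a SurfaceView/TextureView; both names are placeholders), the same decoder can be configured for either output mode:

// Surface mode: decoded frames are rendered directly via releaseOutputBuffer(index, true)
MediaCodec surfaceDecoder = MediaCodec.createDecoderByType(videoFormat.getString(MediaFormat.KEY_MIME));
surfaceDecoder.configure(videoFormat, surface, null, 0);

// ByteBuffer mode: pass null for the Surface; raw frames are read back with getOutputBuffer()/getOutputImage()
MediaCodec bufferDecoder = MediaCodec.createDecoderByType(videoFormat.getString(MediaFormat.KEY_MIME));
bufferDecoder.configure(videoFormat, null, null, 0);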


(3) Introduction to the MediaCodec API

The MediaCodec life cycle described above already involves some of the main MediaCodec methods. The main MediaCodec APIs are introduced below:

  • createDecoderByType / createEncoderByType: creates a codec for a particular MIME type (such as "video/avc").
  • createByCodecName: creates a codec by component name, for when you know the exact component name (e.g. OMX.google.mp3.decoder). MediaCodecList can be used to obtain component names.

configure: configures the decoder or encoder. start: called after the component has been configured successfully. Buffer-handling interfaces:

  • dequeueInputBuffer: dequeues a buffer from the input queue so it can be filled with data to encode/decode.
  • queueInputBuffer: submits a filled input buffer to the input queue.
  • dequeueOutputBuffer: dequeues a buffer of processed data from the output queue.
  • releaseOutputBuffer: returns the ByteBuffer to the codec once its data has been processed.
  • getInputBuffers: gets the set of input buffers to be filled with data, returned as a ByteBuffer array.
  • getOutputBuffers: gets the set of output buffers holding codec output, returned as a ByteBuffer array.

flush: clears the input and output ports. stop: terminates the decode/encode session. release: releases the resources used by the codec instance.


3.1 Creating a MediaCodec

An instance of MediaCodec processes one particular type of data (such as MP3 audio or H.264 video), either encoding or decoding it. There are two ways to create a MediaCodec. (1) You can use MediaCodecList to create a MediaCodec for a particular media format:

  • The format of track can be obtained from MediaExtractor#getTrackFormat.
  • Inject any features you want using MediaFormat#setFeatureEnabled.
  • MediaCodecList#findDecoderForFormat is then called to get the name of the codec that can handle that particular media format.
  • Finally, create the codec with createByCodecName(string).

(2) You can also use createDecoderByType/createEncoderByType(java.lang.String) to create the preferred codec for a particular MIME type. However, this approach cannot inject features and may create a codec that cannot handle the specific media format.
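A minimal sketch of approach (1), falling back to approach (2) when no exact match is found (`format` is assumed to come from MediaExtractor#getTrackFormat):

// Find a codec that supports this exact format, then create it by component name (API 21+)
MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
// Note: on API 21 the format must not contain a frame rate when calling findDecoderForFormat
String decoderName = codecList.findDecoderForFormat(format);
MediaCodec codec;
if (decoderName != null) {
    codec = MediaCodec.createByCodecName(decoderName);
} else {
    // Fall back to the preferred codec for the MIME type; it may not handle every format
    codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
}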


3.2 configure

Configure the codec

public void configure (
            MediaFormat format,
            Surface surface, MediaCrypto crypto, int flags);
  • MediaFormat format: the format of the input data (decoder) or the desired format of the output data (encoder). Passing null is equivalent to passing an empty MediaFormat.
  • Surface surface: specifies a Surface on which to render the decoder output. Pass null if the codec does not generate raw video output (for example, it is not a video decoder) and/or you want the decoder to output to ByteBuffers.
  • MediaCrypto crypto: specifies a MediaCrypto object used to securely decrypt media data. Pass null for non-secure codecs.
  • int flags: set to the constant CONFIGURE_FLAG_ENCODE when the component is used as an encoder.

MediaFormat: Encapsulates information that describes the format of media data (including audio or video), as well as optional feature metadata.

  • The format of the media data is specified as key/value pairs. Keys are strings; values can be int, long, float, String, or ByteBuffer.
  • Feature metadata is specified as string/boolean pairs.
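A minimal sketch of both kinds of entries (the concrete values are only examples):

// Format key/value pairs, e.g. for an AAC audio encoder
MediaFormat audioFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 2);
audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 96000);
audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);

// Feature metadata as string/boolean pairs (API 21+), e.g. on a video decoder format
MediaFormat videoFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
videoFormat.setFeatureEnabled(MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback, true);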


3.3 dequeueInputBuffer

public final int dequeueInputBuffer(long timeoutUs)

Returns the index of an input buffer to be filled with valid data, or -1 if no buffer is currently available. long timeoutUs: how long to wait for an input buffer to become available.

  • If timeoutUs == 0, return immediately.
  • If timeoutUs < 0, wait indefinitely for available input buffers.
  • If timeoutUs > 0, wait for “timeoutUs” microseconds.


3.4 queueInputBuffer

After the input buffer at the specified index has been filled, it is submitted to the component using queueInputBuffer.

Codec-specific data:

  • Many codecs require the actual compressed data stream to be preceded by "codec-specific data", i.e. setup data used to initialize the codec, for example:
  • SPS/PPS in AVC (H.264) video.
  • Code tables in Vorbis audio.

public native final void queueInputBuffer(
            int index,
            int offset, int size, long presentationTimeUs, int flags)
  • int index: the index of an input buffer previously returned by dequeueInputBuffer(long).
  • int offset: the byte offset within the buffer at which the data starts.
  • int size: the number of bytes of valid input data.
  • long presentationTimeUs: the presentation timestamp (PTS) of this buffer, in microseconds.
  • int flags: a bitmask of the BUFFER_FLAG_CODEC_CONFIG and BUFFER_FLAG_END_OF_STREAM flags. Although not prohibited, most codecs do not use the BUFFER_FLAG_KEY_FRAME flag for input buffers.

BUFFER_FLAG_END_OF_STREAM: indicates that this is the last chunk of input data. BUFFER_FLAG_CODEC_CONFIG: with this flag, codec-specific buffers can be submitted directly after start() or flush(). However, if you configure the codec with a media format that contains these keys, MediaCodec submits them automatically after start. Using the BUFFER_FLAG_CODEC_CONFIG flag directly is therefore not generally recommended and is only advised for advanced users.
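For illustration, this is roughly what supplying codec-specific data through the MediaFormat looks like for an H.264 decoder (a minimal sketch; `decoder`, `surface`, `spsBytes` and `ppsBytes` are placeholders, and when the format comes from MediaExtractor#getTrackFormat the csd entries are usually already present):

MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
// "csd-0"/"csd-1" carry the codec-specific data; for H.264 these are the SPS and PPS NAL units
format.setByteBuffer("csd-0", ByteBuffer.wrap(spsBytes));
format.setByteBuffer("csd-1", ByteBuffer.wrap(ppsBytes));
decoder.configure(format, surface, null, 0);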


3.5 dequeueOutputBuffer

Get the output buffer from MediaCodec.

public final int dequeueOutputBuffer(
            @NonNull BufferInfo info, long timeoutUs)

Return value: the index of an output buffer that was successfully decoded, or one of the INFO_* constants (INFO_TRY_AGAIN_LATER, INFO_OUTPUT_FORMAT_CHANGED, or INFO_OUTPUT_BUFFERS_CHANGED). INFO_TRY_AGAIN_LATER with a non-negative timeoutUs indicates that the call timed out. INFO_OUTPUT_FORMAT_CHANGED indicates that the output format has changed and subsequent data will follow the new format. BufferInfo info: output buffer metadata. long timeoutUs: same meaning as the timeoutUs parameter of dequeueInputBuffer.
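A minimal sketch of handling the possible return values (assuming `codec` and `bufferInfo` already exist and the app targets API 21+):

int outputId = codec.dequeueOutputBuffer(bufferInfo, 10000); // wait up to 10 ms
if (outputId >= 0) {
    ByteBuffer outputBuffer = codec.getOutputBuffer(outputId);
    // ... consume bufferInfo.size bytes starting at bufferInfo.offset ...
    codec.releaseOutputBuffer(outputId, false);
} else if (outputId == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = codec.getOutputFormat(); // subsequent buffers follow this format
} else if (outputId == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // timed out: no output available yet
}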

BufferInfo

public final static class BufferInfo {
        public void set(
                int newOffset, int newSize, long newTimeUs, int newFlags);
        public int offset;
        public int size;
        public long presentationTimeUs;
        public int flags;
};

  • offset: the start offset of the data in the buffer. Note that the meaning of the offset is not consistent across devices: on some devices it is relative to the top-left pixel of the crop rectangle, while on most devices it is relative to the top-left pixel of the whole frame.
  • size: the amount of data in the buffer, in bytes. If it is 0, the buffer contains no data and can be discarded; the only purpose of a zero-sized buffer is to carry the end-of-stream marker.
  • presentationTimeUs: the presentation timestamp (PTS) of the buffer, in microseconds, derived from the PTS passed in with the corresponding input buffer. For zero-sized buffers this value should be ignored.
  • flags: the flags associated with the buffer, which can be:

  • BUFFER_FLAG_KEY_FRAME: Buffer contains key frame data.
  • BUFFER_FLAG_CODEC_CONFIG: Buffer contains codec initialization/codec-specific data, not media data.
  • BUFFER_FLAG_END_OF_STREAM: Marks the end of a stream, after which no buffer is available unless it is followed by flush.
  • BUFFER_FLAG_PARTIAL_FRAME: The buffer contains only a portion of the frame, and the decoder should batch the data until a buffer without this flag appears before decoding the frame.
public static final int BUFFER_FLAG_KEY_FRAME = 1;
public static final int BUFFER_FLAG_CODEC_CONFIG = 2;
public static final int BUFFER_FLAG_END_OF_STREAM = 4;
public static final int BUFFER_FLAG_PARTIAL_FRAME = 8;


3.6 releaseOutputBuffer

Use this method to return the output buffer to the codec or render it on the output Surface.

public void releaseOutputBuffer (int index, boolean render)

boolean render: if a valid Surface was specified when the codec was configured, passing true renders this output buffer on the Surface. Once the buffer is no longer in use, the Surface returns the buffer to the codec.
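A minimal sketch of the common call patterns (variable names are placeholders):

// ByteBuffer mode, or a frame that should not be shown: just return the buffer to the codec
codec.releaseOutputBuffer(outputBufferId, false);

// Surface mode: render the buffer on the configured Surface, then return it to the codec
codec.releaseOutputBuffer(outputBufferId, true);

// Since API 21 a frame can also be scheduled for a specific time
// (nanoseconds on the System.nanoTime() time base)
codec.releaseOutputBuffer(outputBufferId, renderTimestampNs);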


(4) Using the synchronous and asynchronous APIs

4.1 Using the synchronous API

- Create and configure the MediaCodec object.
- Loop until complete:
  - If an input buffer is ready: read a chunk of input and fill it into the input buffer.
  - If an output buffer is ready: take the data out of the output buffer and process it.
- After processing, release the MediaCodec object.

Code example of the synchronous API from the official documentation:

MediaCodec codec = MediaCodec.createByCodecName(name);
codec.configure(format, ...);
MediaFormat outputFormat = codec.getOutputFormat(); // option B
codec.start();
for (;;) {
  int inputBufferId = codec.dequeueInputBuffer(timeoutUs);
  if (inputBufferId >= 0) {
    ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
    // fill inputBuffer with valid data
    ...
    codec.queueInputBuffer(inputBufferId, ...);
  }
  int outputBufferId = codec.dequeueOutputBuffer(...);
  if (outputBufferId >= 0) {
    ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferId);
    MediaFormat bufferFormat = codec.getOutputFormat(outputBufferId); // option A
    // bufferFormat is identical to outputFormat
    // outputBuffer is ready to be processed or rendered.
    ...
    codec.releaseOutputBuffer(outputBufferId, ...);
  } else if (outputBufferId == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // Subsequent data will conform to new format.
    // Can ignore if using getOutputFormat(outputBufferId)
    outputFormat = codec.getOutputFormat(); // option B
  }
}
codec.stop();
codec.release();


4.2 Using the asynchronous API

Android 5.0 (API 21) introduced an "asynchronous mode" for MediaCodec.

- Create and configure the MediaCodec object.
- Set a MediaCodec.Callback on the MediaCodec object.
- In the onInputBufferAvailable callback: read a chunk of input and fill it into the input buffer.
- In the onOutputBufferAvailable callback: take the data out of the output buffer and process it.
- After processing, release the MediaCodec object.

Code example of the asynchronous API from the official documentation:

MediaCodec codec = MediaCodec.createByCodecName(name);
MediaFormat mOutputFormat; // member variable
codec.setCallback(new MediaCodec.Callback() {
  @Override
  void onInputBufferAvailable(MediaCodec mc, int inputBufferId) {
    ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
    // fill inputBuffer with valid data
    ...
    codec.queueInputBuffer(inputBufferId, ...);
  }

  @Override
  void onOutputBufferAvailable(MediaCodec mc, int outputBufferId, ...) {
    ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferId);
    MediaFormat bufferFormat = codec.getOutputFormat(outputBufferId); // option A
    // bufferFormat is equivalent to mOutputFormat
    // outputBuffer is ready to be processed or rendered.
    ...
    codec.releaseOutputBuffer(outputBufferId, ...);
  }

  @Override
  void onOutputFormatChanged(MediaCodec mc, MediaFormat format) {
    // Subsequent data will conform to new format.
    // Can ignore if using getOutputFormat(outputBufferId)
    mOutputFormat = format; // option B
  }

  @Override
  void onError(...) {
    ...
  }
});
codec.configure(format, ...);
mOutputFormat = codec.getOutputFormat(); // option B
codec.start();
// wait for processing to complete
codec.stop();
codec.release();


(5) Complete code

The AAC file is first decoded into PCM, and then the PCM is encoded back into an AAC audio file.

(1) Layout:

activity_main.xml

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <Button
        android:id="@+id/audio_change"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:text="Audio conversion"/>

</LinearLayout>

(2) Code:

MainActivity

package com.lzacking.mediacodecaac;

import androidx.annotation.RequiresApi;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;

import android.app.Activity;
import android.content.pm.PackageManager;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.View;
import android.widget.Button;

public class MainActivity extends AppCompatActivity implements View.OnClickListener {

    private static final String TAG = "MainActivity";
    private Button btnAudioChange;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // Obtain permission
        verifyStoragePermissions(this);

        btnAudioChange = (Button)findViewById(R.id.audio_change);
        btnAudioChange.setOnClickListener(this);
    }

    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.audio_change:
                Log.e(TAG,"Click the button.");
                // First decode AAC into PCM, then encode PCM into AAC audio file
                // aAC file (initial file)
                final String aacPath = Environment.getExternalStorageDirectory().getPath() + "/The Dawn_clip.aac";
                / / PCM file
                final String pcmPath = Environment.getExternalStorageDirectory().getPath() + "/The Dawn_clip.pcm";
                // aAC file (result file)
                final String aacResultPath = Environment.getExternalStorageDirectory().getPath() + "/The Dawn_clip1.aac";

                AudioCodec.getPCMFromAudio(aacPath, pcmPath, new AudioCodec.AudioDecodeListener() {
                    @Override
                    public void decodeOver(a) {
                        Log.e(TAG,"Audio decoding completed." + pcmPath);

                        // After decoding successfully, start encoding
                        AudioCodec.PcmToAudio(pcmPath, aacResultPath, new AudioCodec.AudioDecodeListener() {
                            @Override
                            public void decodeOver(a) {
                                Log.e(TAG,"Audio coding completed.");
                            }

                            @Override
                            public void decodeFail(a) {
                                Log.e(TAG,"Audio encoding failed."); }}); }@Override
                    public void decodeFail(a) {
                        Log.e(TAG,"Audio decoding failed."); }});break;

            default:
                break; }}private static final int REQUEST_EXTERNAL_STORAGE = 1;
    private static String[] PERMISSIONS_STORAGE = {
            "android.permission.READ_EXTERNAL_STORAGE"."android.permission.WRITE_EXTERNAL_STORAGE" };

    public static void verifyStoragePermissions(Activity activity) {
        try {
            // Check whether you have write permission
            int permission = ActivityCompat.checkSelfPermission(activity,
                    "android.permission.WRITE_EXTERNAL_STORAGE");
            if(permission ! = PackageManager.PERMISSION_GRANTED) {// There is no write permission. If you apply for the write permission, a dialog box will be displayedActivityCompat.requestPermissions(activity, PERMISSIONS_STORAGE, REQUEST_EXTERNAL_STORAGE); }}catch(Exception e) { e.printStackTrace(); }}}Copy the code

AudioCodec:

package com.lzacking.mediacodecaac;

import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.os.Build;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;

import androidx.annotation.RequiresApi;

import java.io.IOException;

/**
 * Audio-related operations.
 */
public class AudioCodec {

    private static final String TAG = "AudioCodec";
    private static Handler handler = new Handler(Looper.getMainLooper());

    /**
     * Decodes an audio file into raw PCM data.
     *
     * @param audioPath     path of the source audio file
     * @param audioSavePath path where the PCM file is saved
     * @param listener      decode callback
     */
    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
    public static void getPCMFromAudio(String audioPath, String audioSavePath, final AudioDecodeListener listener) {
        MediaExtractor extractor = new MediaExtractor();// Separates the audio/video tracks of a media file
        int audioTrack = -1;// An audio file normally has only one audio track
        boolean hasAudio = false;// Whether the file contains an audio track

        try {
            extractor.setDataSource(audioPath);
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("audio/")) {
                    audioTrack = i;
                    hasAudio = true;
                    break;
                }
            }

            if (hasAudio) {
                extractor.selectTrack(audioTrack);
                // Decode the audio on a worker thread
                new Thread(new AudioDecodeRunnable(extractor, audioTrack, audioSavePath, new DecodeOverListener() {
                    @Override
                    public void decodeIsOver() {
                        handler.post(new Runnable() {
                            @Override
                            public void run() {
                                if (listener != null) listener.decodeOver();
                            }
                        });
                    }

                    @Override
                    public void decodeFail() {
                        handler.post(new Runnable() {
                            @Override
                            public void run() {
                                if (listener != null) listener.decodeFail();
                            }
                        });
                    }
                })).start();
            } else {
                // The file does not contain an audio track
                Log.e(TAG, "Audio file has no audio track");
                if (listener != null) listener.decodeFail();
            }
        } catch (IOException e) {
            e.printStackTrace();
            Log.e(TAG, "Decoding failed");
            if (listener != null) listener.decodeFail();
        }
    }

    /**
     * Encodes a PCM file into an audio file.
     *
     * @param pcmPath   path of the PCM file
     * @param audioPath path of the output audio file
     * @param listener  encode callback
     */
    public static void PcmToAudio(String pcmPath, String audioPath, final AudioDecodeListener listener) {
        new Thread(new AudioEncodeRunnable(pcmPath, audioPath, new AudioDecodeListener() {
            @Override
            public void decodeOver() {
                if (listener != null) {
                    handler.post(new Runnable() {
                        @Override
                        public void run() {
                            listener.decodeOver();
                        }
                    });
                }
            }

            @Override
            public void decodeFail() {
                if (listener != null) {
                    handler.post(new Runnable() {
                        @Override
                        public void run() {
                            listener.decodeFail();
                        }
                    });
                }
            }
        })).start();
    }

    /**
     * Writes the 7-byte ADTS header in front of an AAC packet.
     *
     * @param packet    buffer whose first 7 bytes will hold the header
     * @param packetLen total packet length including the 7-byte header
     */
    public static void addADTStoPacket(byte[] packet, int packetLen) {
        int profile = 2; // AAC LC
        int freqIdx = 4; // 44.1kHz
        int chanCfg = 2; // CPE (2 channels)

        packet[0] = (byte) 0xFF;
        packet[1] = (byte) 0xF9;
        packet[2] = (byte) (((profile - 1) << 6) + (freqIdx << 2) + (chanCfg >> 2));
        packet[3] = (byte) (((chanCfg & 3) << 6) + (packetLen >> 11));
        packet[4] = (byte) ((packetLen & 0x7FF) >> 3);
        packet[5] = (byte) (((packetLen & 7) << 5) + 0x1F);
        packet[6] = (byte) 0xFC;
    }

    public interface DecodeOverListener {
        void decodeIsOver();

        void decodeFail();
    }

    /**
     * Audio decode listener: notified when decoding/encoding succeeds or fails.
     */
    public interface AudioDecodeListener {
        void decodeOver();

        void decodeFail();
    }
}

AudioDecodeRunnable:

package com.lzacking.mediacodecaac;

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.os.Build;
import android.util.Log;

import androidx.annotation.RequiresApi;

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * Audio decoding task: decodes the audio track to PCM.
 */
public class AudioDecodeRunnable implements Runnable {

    private static final String TAG = "AudioDecodeRunnable";
    final static int TIMEOUT_USEC = 0;
    private MediaExtractor extractor;
    private int audioTrack;
    private AudioCodec.DecodeOverListener mListener;
    private String mPcmFilePath;

    public AudioDecodeRunnable(MediaExtractor extractor, int trackIndex, String savePath, AudioCodec.DecodeOverListener listener) {
        this.extractor = extractor;
        audioTrack = trackIndex;
        mListener = listener;
        mPcmFilePath = savePath;
    }

    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
    @Override
    public void run() {
        try {
            // Get the format of the selected audio track
            MediaFormat format = extractor.getTrackFormat(audioTrack);
            // Create the audio decoder and configure it with the track format
            MediaCodec audioCodec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
            audioCodec.configure(format, null, null, 0);

            // Start MediaCodec; it now waits for input data
            audioCodec.start();

            ByteBuffer[] inputBuffers = audioCodec.getInputBuffers();// Input buffers to be filled with encoded data
            ByteBuffer[] outputBuffers = audioCodec.getOutputBuffers();// Output buffers holding decoded data
            MediaCodec.BufferInfo decodeBufferInfo = new MediaCodec.BufferInfo();// Metadata describing the decoded output data
            MediaCodec.BufferInfo inputInfo = new MediaCodec.BufferInfo();// Metadata describing the input data
            boolean codeOver = false;
            boolean inputDone = false;// End-of-input flag

            FileOutputStream fos = new FileOutputStream(mPcmFilePath);

            while (!codeOver) {
                if (!inputDone) {
                    for (int i = 0; i < inputBuffers.length; i++) {
                        // Dequeue a buffer from the input queue.
                        // Returns the index of an input buffer to fill with valid data, or -1 if none is currently available
                        int inputIndex = audioCodec.dequeueInputBuffer(TIMEOUT_USEC);
                        if (inputIndex >= 0) {
                            // Read a sample from the extractor and feed it to the decoder
                            ByteBuffer inputBuffer = inputBuffers[inputIndex];
                            // Reset position to 0; the buffer contents are not erased
                            inputBuffer.clear();
                            int sampleSize = extractor.readSampleData(inputBuffer, 0);// MediaExtractor reads data into the inputBuffer
                            if (sampleSize < 0) {
                                // All data has been read; signal end of stream
                                audioCodec.queueInputBuffer(inputIndex, 0, 0, 0L, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            } else {
                                inputInfo.offset = 0;
                                inputInfo.size = sampleSize;
                                inputInfo.flags = MediaCodec.BUFFER_FLAG_SYNC_FRAME;
                                inputInfo.presentationTimeUs = extractor.getSampleTime();

                                Log.e(TAG, "Write data to decoder, current timestamp: " + inputInfo.presentationTimeUs);
                                // Tell MediaCodec to decode the data just queued
                                audioCodec.queueInputBuffer(inputIndex, inputInfo.offset, sampleSize, inputInfo.presentationTimeUs, 0);
                                // Advance to the next sample
                                extractor.advance();
                            }
                        }
                    }
                }

                // dequeueInputBuffer / dequeueOutputBuffer return values:
                // INFO_TRY_AGAIN_LATER = -1: wait timed out
                // INFO_OUTPUT_FORMAT_CHANGED = -2: the media format changed
                // INFO_OUTPUT_BUFFERS_CHANGED = -3: the output buffers changed (deprecated)
                // >= 0: index of an output buffer containing data

                boolean decodeOutputDone = false;// End-of-output flag for this pass
                byte[] chunkPCM;
                while (!decodeOutputDone) {
                    int outputIndex = audioCodec.dequeueOutputBuffer(decodeBufferInfo, TIMEOUT_USEC);
                    if (outputIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                        // No output available yet
                        decodeOutputDone = true;
                    } else if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                        outputBuffers = audioCodec.getOutputBuffers();
                    } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                        MediaFormat newFormat = audioCodec.getOutputFormat();
                    } else if (outputIndex < 0) {
                        // Ignore other negative values
                    } else {
                        ByteBuffer outputBuffer;
                        if (Build.VERSION.SDK_INT >= 21) {
                            outputBuffer = audioCodec.getOutputBuffer(outputIndex);
                        } else {
                            outputBuffer = outputBuffers[outputIndex];
                        }

                        chunkPCM = new byte[decodeBufferInfo.size];
                        outputBuffer.get(chunkPCM);
                        outputBuffer.clear();

                        fos.write(chunkPCM);// Write the PCM data to the file
                        fos.flush();
                        Log.e(TAG, "Release output buffer: " + outputIndex);
                        audioCodec.releaseOutputBuffer(outputIndex, false);

                        if ((decodeBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                            // Decoding is complete
                            extractor.release();
                            audioCodec.stop();
                            audioCodec.release();
                            codeOver = true;
                            decodeOutputDone = true;
                        }
                    }
                }
            }

            fos.close();
            if (mListener != null) {
                mListener.decodeIsOver();
            }
        } catch (IOException e) {
            e.printStackTrace();
            if (mListener != null) {
                mListener.decodeFail();
            }
        }
    }
}

AudioEncodeRunnable:

package com.lzacking.mediacodecaac;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Build;
import android.util.Log;

import androidx.annotation.RequiresApi;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;

/**
 * Audio encoding task: encodes PCM data to AAC.
 */
public class AudioEncodeRunnable implements Runnable {

    private static final String TAG = "AudioEncodeRunnable";
    private String pcmPath;
    private String audioPath;
    private AudioCodec.AudioDecodeListener mListener;

    public AudioEncodeRunnable(String pcmPath, String audioPath, final AudioCodec.AudioDecodeListener listener) {
        this.pcmPath = pcmPath;
        this.audioPath = audioPath;
        mListener = listener;
    }

    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
    @Override
    public void run() {
        try {
            if (!new File(pcmPath).exists()) {
                // The PCM file does not exist
                if (mListener != null) {
                    mListener.decodeFail();
                }
                return;
            }

            FileInputStream fis = new FileInputStream(pcmPath);
            byte[] buffer = new byte[8 * 1024];
            byte[] allAudioBytes;

            int inputIndex;
            ByteBuffer inputBuffer;
            int outputIndex;
            ByteBuffer outputBuffer;

            byte[] chunkAudio;
            int outBitSize;
            int outPacketSize;

            // Set up the encoding format: MIME type, sample rate, channel count
            MediaFormat encodeFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 2);
            encodeFormat.setInteger(MediaFormat.KEY_BIT_RATE, 96000);
            encodeFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
            encodeFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 500 * 1024);

            // Create and configure the encoder
            MediaCodec mediaEncode = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
            mediaEncode.configure(encodeFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mediaEncode.start();

            ByteBuffer[] encodeInputBuffers = mediaEncode.getInputBuffers();
            ByteBuffer[] encodeOutputBuffers = mediaEncode.getOutputBuffers();
            MediaCodec.BufferInfo encodeBufferInfo = new MediaCodec.BufferInfo();

            // Open the output file stream
            FileOutputStream fos = new FileOutputStream(new File(audioPath));
            BufferedOutputStream bos = new BufferedOutputStream(fos, 500 * 1024);
            boolean isReadEnd = false;

            while (!isReadEnd) {
                for (int i = 0; i < encodeInputBuffers.length - 1; i++) {// Subtracting 1 here is important, don't forget it
                    if (fis.read(buffer) != -1) {
                        allAudioBytes = Arrays.copyOf(buffer, buffer.length);
                    } else {
                        Log.e(TAG, "File read completed");
                        isReadEnd = true;
                        break;
                    }

                    Log.e(TAG, "Read file and write to encoder: " + allAudioBytes.length);
                    // Dequeue an input buffer from the encoder
                    inputIndex = mediaEncode.dequeueInputBuffer(-1);

                    inputBuffer = encodeInputBuffers[inputIndex];
                    inputBuffer.clear();
                    inputBuffer.limit(allAudioBytes.length);
                    inputBuffer.put(allAudioBytes);// Fill the input buffer with PCM data
                    // After the input buffer at the given index is filled, submit it to the codec with queueInputBuffer
                    mediaEncode.queueInputBuffer(inputIndex, 0, allAudioBytes.length, 0, 0);// Start encoding
                }

                // Dequeue encoded data from the output buffer queue
                outputIndex = mediaEncode.dequeueOutputBuffer(encodeBufferInfo, 10000);
                while (outputIndex >= 0) {
                    // Take the encoded data out of the encoder
                    outBitSize = encodeBufferInfo.size;
                    outPacketSize = outBitSize + 7;// 7 is the size of the ADTS header
                    outputBuffer = encodeOutputBuffers[outputIndex];// Get the output buffer
                    outputBuffer.position(encodeBufferInfo.offset);
                    outputBuffer.limit(encodeBufferInfo.offset + outBitSize);

                    chunkAudio = new byte[outPacketSize];
                    AudioCodec.addADTStoPacket(chunkAudio, outPacketSize);// Add the ADTS header
                    outputBuffer.get(chunkAudio, 7, outBitSize);// Copy the encoded AAC data into the byte[] at offset 7
                    outputBuffer.position(encodeBufferInfo.offset);
                    Log.e(TAG, "Encoding succeeded, writing to file: " + chunkAudio.length);
                    bos.write(chunkAudio, 0, chunkAudio.length);// Save the data to the file on sdcard
                    bos.flush();

                    mediaEncode.releaseOutputBuffer(outputIndex, false);
                    outputIndex = mediaEncode.dequeueOutputBuffer(encodeBufferInfo, 10000);
                }
            }

            mediaEncode.stop();
            mediaEncode.release();
            fos.close();

            if (mListener != null) {
                mListener.decodeOver();
            }
        } catch (IOException e) {
            e.printStackTrace();
            if (mListener != null) {
                mListener.decodeFail();
            }
        }
    }
}

(3) Permission

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

(4) Results


Source code: Android audio and video development basics (5): learn the MediaCodec API and implement AAC audio hardware encoding and decoding.