1. Ijkplayer layered architecture
- Ijkplayer Java Layer API: github.com/JeffMony/ij…
- Ijkplayer Native layer interface: github.com/JeffMony/ij…
- Ijkplayer native layer callback: github.com/JeffMony/ij…
- Ijkplayer SDL: github.com/JeffMony/ij…
- The Java layer player wrapper class: github.com/JeffMony/ij…
All of our subsequent calls start from this class.
2. The IjkMediaPlayer class
IjkMediaPlayer extends AbstractMediaPlayer, and AbstractMediaPlayer implements the IMediaPlayer interface. IMediaPlayer is the top-level player interface, and its feature set is very complete.
The OnMediaCodecSelectListener callback interface lets the application choose the codec, which makes it easy to gather performance statistics:
public interface OnMediaCodecSelectListener {
String onMediaCodecSelect(IMediaPlayer mp, String mimeType, int profile, int level);
}
The OnNativeInvokeListener interface is the player's network callback interface:
public interface OnNativeInvokeListener {
int CTRL_WILL_TCP_OPEN = 0x20001; // NO ARGS
int CTRL_DID_TCP_OPEN = 0x20002; // ARG_ERROR, ARG_FAMILIY, ARG_IP, ARG_PORT, ARG_FD
int CTRL_WILL_HTTP_OPEN = 0x20003; // ARG_URL, ARG_SEGMENT_INDEX, ARG_RETRY_COUNTER
int CTRL_WILL_LIVE_OPEN = 0x20005; // ARG_URL, ARG_RETRY_COUNTER
int CTRL_WILL_CONCAT_RESOLVE_SEGMENT = 0x20007; // ARG_URL, ARG_SEGMENT_INDEX, ARG_RETRY_COUNTER
int EVENT_WILL_HTTP_OPEN = 0x1; // ARG_URL
int EVENT_DID_HTTP_OPEN = 0x2; // ARG_URL, ARG_ERROR, ARG_HTTP_CODE
int EVENT_WILL_HTTP_SEEK = 0x3; // ARG_URL, ARG_OFFSET
int EVENT_DID_HTTP_SEEK = 0x4; // ARG_URL, ARG_OFFSET, ARG_ERROR, ARG_HTTP_CODE, ARG_FILE_SIZE
String ARG_URL = "url";
String ARG_SEGMENT_INDEX = "segment_index";
String ARG_RETRY_COUNTER = "retry_counter";
String ARG_ERROR = "error";
String ARG_FAMILIY = "family";
String ARG_IP = "ip";
String ARG_PORT = "port";
String ARG_FD = "fd";
String ARG_OFFSET = "offset";
String ARG_HTTP_CODE = "http_code";
String ARG_FILE_SIZE = "file_size";
/*
* @return true if invoke is handled
* @throws Exception on any error
*/
boolean onNativeInvoke(int what, Bundle args);
}
3. Player process
To use ijkplayer, call the following code in sequence:
mPlayer = new IjkMediaPlayer();
mPlayer.setSurface(surface);
mPlayer.setDataSource(content, uri, headers);
mPlayer.prepareAsync();
mPlayer.start();
We will analyze ijkplayer's whole call flow and underlying principles step by step, starting from these calls.
4. Initialize the player
A single line of code initializes ijkplayer:
mPlayer = new IjkMediaPlayer();
Github.com/JeffMony/ij…
- ijkplayer/ijkmedia/ijkplayer/android/ijkplayer_jni.c
- ijkplayer/ijkmedia/ijkplayer/android/ijkplayer_android.c
- ijkplayer/ijkmedia/ijkplayer/android/pipeline/ffpipeline_android.c
- ijkplayer/ijkmedia/ijkplayer/ijkplayer.c
- ijkplayer/ijkmedia/ijkplayer/ijkplayer_internal.h
- ijkplayer/ijkmedia/ijkplayer/ff_ffplay.c
- ijkplayer/ijkmedia/ijkplayer/ff_ffplay_def.h
- ijkplayer/ijkmedia/ijkplayer/ijkmeta.c
- ijkplayer/ijkmedia/ijksdl/android/ijksdl_vout_android_surface.c
- ijkplayer/ijkmedia/ijksdl/android/ijksdl_vout_android_nativewindow.c
- ijkplayer/ijkmedia/ijksdl/ijksdl_vout_internal.h
Ijkplayer initialization touches quite a few files; we will mainly explain the code from the perspective of the overall process, moving from the code to the principles behind it.
4.1 Initialize loading SO
IjkMediaPlayer.java performs the one-time loading of the native libraries, mainly the following three .so files:
public static void loadLibrariesOnce(IjkLibLoader libLoader) {
    synchronized (IjkMediaPlayer.class) {
        if (!mIsLibLoaded) {
            if (libLoader == null)
                libLoader = sLocalLibLoader;
            libLoader.loadLibrary("ijkffmpeg");
            libLoader.loadLibrary("ijksdl");
            libLoader.loadLibrary("ijkplayer");
            mIsLibLoaded = true;
        }
    }
}
Calls from the Java layer into the native layer go through the mapping table defined in ijkplayer/ijkmedia/ijkplayer/android/ijkplayer_jni.c:
static JNINativeMethod g_methods[] = {
    { "_setDataSource", "(Ljava/lang/String;[Ljava/lang/String;[Ljava/lang/String;)V", (void *) IjkMediaPlayer_setDataSourceAndHeaders },
    { "_setDataSourceFd", "(I)V", (void *) IjkMediaPlayer_setDataSourceFd },
    { "_setDataSource", "(Ltv/danmaku/ijk/media/player/misc/IMediaDataSource;)V", (void *) IjkMediaPlayer_setDataSourceCallback },
    { "_setAndroidIOCallback", "(Ltv/danmaku/ijk/media/player/misc/IAndroidIO;)V", (void *) IjkMediaPlayer_setAndroidIOCallback },
    { "_setVideoSurface", "(Landroid/view/Surface;)V", (void *) IjkMediaPlayer_setVideoSurface },
    { "_prepareAsync", "()V", (void *) IjkMediaPlayer_prepareAsync },
    { "_start", "()V", (void *) IjkMediaPlayer_start },
    { "_stop", "()V", (void *) IjkMediaPlayer_stop },
    { "seekTo", "(J)V", (void *) IjkMediaPlayer_seekTo },
    { "_pause", "()V", (void *) IjkMediaPlayer_pause },
    { "isPlaying", "()Z", (void *) IjkMediaPlayer_isPlaying },
    { "getCurrentPosition", "()J", (void *) IjkMediaPlayer_getCurrentPosition },
    { "getDuration", "()J", (void *) IjkMediaPlayer_getDuration },
    { "_release", "()V", (void *) IjkMediaPlayer_release },
    { "_reset", "()V", (void *) IjkMediaPlayer_reset },
    { "setVolume", "(FF)V", (void *) IjkMediaPlayer_setVolume },
    { "getAudioSessionId", "()I", (void *) IjkMediaPlayer_getAudioSessionId },
    { "native_init", "()V", (void *) IjkMediaPlayer_native_init },
    { "native_setup", "(Ljava/lang/Object;)V", (void *) IjkMediaPlayer_native_setup },
    { "native_finalize", "()V", (void *) IjkMediaPlayer_native_finalize },
    { "_setOption", "(ILjava/lang/String;Ljava/lang/String;)V", (void *) IjkMediaPlayer_setOption },
    { "_setOption", "(ILjava/lang/String;J)V", (void *) IjkMediaPlayer_setOptionLong },
    { "_getColorFormatName", "(I)Ljava/lang/String;", (void *) IjkMediaPlayer_getColorFormatName },
    { "_getVideoCodecInfo", "()Ljava/lang/String;", (void *) IjkMediaPlayer_getVideoCodecInfo },
    { "_getAudioCodecInfo", "()Ljava/lang/String;", (void *) IjkMediaPlayer_getAudioCodecInfo },
    { "_getMediaMeta", "()Landroid/os/Bundle;", (void *) IjkMediaPlayer_getMediaMeta },
    { "_setLoopCount", "(I)V", (void *) IjkMediaPlayer_setLoopCount },
    { "_getLoopCount", "()I", (void *) IjkMediaPlayer_getLoopCount },
    { "_getPropertyFloat", "(IF)F", (void *) ijkMediaPlayer_getPropertyFloat },
    { "_setPropertyFloat", "(IF)V", (void *) ijkMediaPlayer_setPropertyFloat },
    { "_getPropertyLong", "(IJ)J", (void *) ijkMediaPlayer_getPropertyLong },
    { "_setPropertyLong", "(IJ)V", (void *) ijkMediaPlayer_setPropertyLong },
    { "_setStreamSelected", "(IZ)V", (void *) ijkMediaPlayer_setStreamSelected },
    { "native_profileBegin", "(Ljava/lang/String;)V", (void *) IjkMediaPlayer_native_profileBegin },
    { "native_profileEnd", "()V", (void *) IjkMediaPlayer_native_profileEnd },
    { "native_setLogLevel", "(I)V", (void *) IjkMediaPlayer_native_setLogLevel },
    { "_setFrameAtTime", "(Ljava/lang/String;JJII)V", (void *) IjkMediaPlayer_setFrameAtTime },
};
4.2 Initializing the Player message mechanism
EventHandler is a subclass of Handler that handles the player callbacks. The player has many callbacks, such as onPrepared, onInfo, onVideoSizeChanged, etc.; after coming up from the native layer, they are dispatched by the Handler and then delivered to the developer.
The native layer calls back through the postEventFromNative function; at the end of it, mp.mEventHandler.sendMessage(m) is invoked, so everything is processed uniformly in EventHandler:
@CalledByNative
private static void postEventFromNative(Object weakThiz, int what,
int arg1, int arg2, Object obj) {
if (weakThiz == null)
return;
@SuppressWarnings("rawtypes")
IjkMediaPlayer mp = (IjkMediaPlayer) ((WeakReference) weakThiz).get();
if (mp == null) {
return;
}
if (what == MEDIA_INFO && arg1 == MEDIA_INFO_STARTED_AS_NEXT) {
// this acquires the wakelock if needed, and sets the client side
// state
mp.start();
}
if (mp.mEventHandler != null) {
Message m = mp.mEventHandler.obtainMessage(what, arg1, arg2, obj);
mp.mEventHandler.sendMessage(m);
}
}
Next, the IjkMediaPlayer_native_setup function in ijkplayer_jni.c is called:
static void
IjkMediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
MPTRACE("%s\n", __func__);
IjkMediaPlayer *mp = ijkmp_android_create(message_loop);
JNI_CHECK_GOTO(mp, env, "java/lang/OutOfMemoryError", "mpjni: native_setup: ijkmp_create() failed", LABEL_RETURN);
jni_set_media_player(env, thiz, mp);
ijkmp_set_weak_thiz(mp, (*env)->NewGlobalRef(env, weak_this));
ijkmp_set_inject_opaque(mp, ijkmp_get_weak_thiz(mp));
ijkmp_set_ijkio_inject_opaque(mp, ijkmp_get_weak_thiz(mp));
ijkmp_android_set_mediacodec_select_callback(mp, mediacodec_select_callback, ijkmp_get_weak_thiz(mp));
LABEL_RETURN:
ijkmp_dec_ref_p(&mp);
}
It executes ijkmp_android_create(message_loop), passing in a pointer to the message_loop function; message_loop in turn calls message_loop_n:
static void message_loop_n(JNIEnv *env, IjkMediaPlayer *mp)
{
jobject weak_thiz = (jobject) ijkmp_get_weak_thiz(mp);
JNI_CHECK_GOTO(weak_thiz, env, NULL, "mpjni: message_loop_n: null weak_thiz", LABEL_RETURN);
while (1) {
AVMessage msg;
int retval = ijkmp_get_msg(mp, &msg, 1);
if (retval < 0)
break;
// block-get should never return 0
assert(retval > 0);
switch (msg.what) {
case FFP_MSG_FLUSH:
MPTRACE("FFP_MSG_FLUSH:\n");
post_event(env, weak_thiz, MEDIA_NOP, 0, 0);
break;
case FFP_MSG_ERROR:
MPTRACE("FFP_MSG_ERROR: %d\n", msg.arg1);
post_event(env, weak_thiz, MEDIA_ERROR, MEDIA_ERROR_IJK_PLAYER, msg.arg1);
break;
case FFP_MSG_PREPARED:
MPTRACE("FFP_MSG_PREPARED:\n");
post_event(env, weak_thiz, MEDIA_PREPARED, 0, 0);
break;
case FFP_MSG_COMPLETED:
MPTRACE("FFP_MSG_COMPLETED:\n");
post_event(env, weak_thiz, MEDIA_PLAYBACK_COMPLETE, 0, 0);
break;
case FFP_MSG_VIDEO_SIZE_CHANGED:
MPTRACE("FFP_MSG_VIDEO_SIZE_CHANGED: %d, %d\n", msg.arg1, msg.arg2);
post_event(env, weak_thiz, MEDIA_SET_VIDEO_SIZE, msg.arg1, msg.arg2);
break;
case FFP_MSG_SAR_CHANGED:
MPTRACE("FFP_MSG_SAR_CHANGED: %d, %d\n", msg.arg1, msg.arg2);
post_event(env, weak_thiz, MEDIA_SET_VIDEO_SAR, msg.arg1, msg.arg2);
break;
case FFP_MSG_VIDEO_RENDERING_START:
MPTRACE("FFP_MSG_VIDEO_RENDERING_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_RENDERING_START, 0);
break;
case FFP_MSG_AUDIO_RENDERING_START:
MPTRACE("FFP_MSG_AUDIO_RENDERING_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_RENDERING_START, 0);
break;
case FFP_MSG_VIDEO_ROTATION_CHANGED:
MPTRACE("FFP_MSG_VIDEO_ROTATION_CHANGED: %d\n", msg.arg1);
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_ROTATION_CHANGED, msg.arg1);
break;
case FFP_MSG_AUDIO_DECODED_START:
MPTRACE("FFP_MSG_AUDIO_DECODED_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_DECODED_START, 0);
break;
case FFP_MSG_VIDEO_DECODED_START:
MPTRACE("FFP_MSG_VIDEO_DECODED_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_DECODED_START, 0);
break;
case FFP_MSG_OPEN_INPUT:
MPTRACE("FFP_MSG_OPEN_INPUT:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_OPEN_INPUT, 0);
break;
case FFP_MSG_FIND_STREAM_INFO:
MPTRACE("FFP_MSG_FIND_STREAM_INFO:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_FIND_STREAM_INFO, 0);
break;
case FFP_MSG_COMPONENT_OPEN:
MPTRACE("FFP_MSG_COMPONENT_OPEN:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_COMPONENT_OPEN, 0);
break;
case FFP_MSG_BUFFERING_START:
MPTRACE("FFP_MSG_BUFFERING_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_BUFFERING_START, msg.arg1);
break;
case FFP_MSG_BUFFERING_END:
MPTRACE("FFP_MSG_BUFFERING_END:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_BUFFERING_END, msg.arg1);
break;
case FFP_MSG_BUFFERING_UPDATE:
// MPTRACE("FFP_MSG_BUFFERING_UPDATE: %d, %d", msg.arg1, msg.arg2);
post_event(env, weak_thiz, MEDIA_BUFFERING_UPDATE, msg.arg1, msg.arg2);
break;
case FFP_MSG_BUFFERING_BYTES_UPDATE:
break;
case FFP_MSG_BUFFERING_TIME_UPDATE:
break;
case FFP_MSG_SEEK_COMPLETE:
MPTRACE("FFP_MSG_SEEK_COMPLETE:\n");
post_event(env, weak_thiz, MEDIA_SEEK_COMPLETE, 0, 0);
break;
case FFP_MSG_ACCURATE_SEEK_COMPLETE:
MPTRACE("FFP_MSG_ACCURATE_SEEK_COMPLETE:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_MEDIA_ACCURATE_SEEK_COMPLETE, msg.arg1);
break;
case FFP_MSG_PLAYBACK_STATE_CHANGED:
break;
case FFP_MSG_TIMED_TEXT:
if (msg.obj) {
jstring text = (*env)->NewStringUTF(env, (char *)msg.obj);
post_event2(env, weak_thiz, MEDIA_TIMED_TEXT, 0, 0, text);
J4A_DeleteLocalRef__p(env, &text);
}
else {
post_event2(env, weak_thiz, MEDIA_TIMED_TEXT, 0, 0, NULL);
}
break;
case FFP_MSG_GET_IMG_STATE:
if (msg.obj) {
jstring file_name = (*env)->NewStringUTF(env, (char *)msg.obj);
post_event2(env, weak_thiz, MEDIA_GET_IMG_STATE, msg.arg1, msg.arg2, file_name);
J4A_DeleteLocalRef__p(env, &file_name);
}
else {
post_event2(env, weak_thiz, MEDIA_GET_IMG_STATE, msg.arg1, msg.arg2, NULL);
}
break;
case FFP_MSG_VIDEO_SEEK_RENDERING_START:
MPTRACE("FFP_MSG_VIDEO_SEEK_RENDERING_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_SEEK_RENDERING_START, msg.arg1);
break;
case FFP_MSG_AUDIO_SEEK_RENDERING_START:
MPTRACE("FFP_MSG_AUDIO_SEEK_RENDERING_START:\n");
post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_SEEK_RENDERING_START, msg.arg1);
break;
default:
ALOGE("unknown FFP_MSG_xxx(%d)\n", msg.what);
break;
}
msg_free_res(&msg);
}
LABEL_RETURN:
;
}
message_loop_n contains an infinite loop: ijkmp_get_msg fetches the player's current state information, and post_event calls postEventFromNative in the Java layer. This strings the whole message flow together:
int retval = ijkmp_get_msg(mp, &msg, 1);
The while loop calls the ijkmp_get_msg function, whose job is to fetch real-time status information of the player; it lives in ijkplayer/ijkmedia/ijkplayer/ijkplayer.c:
int ijkmp_get_msg(IjkMediaPlayer *mp, AVMessage *msg, int block)
{
    assert(mp);
    while (1) {
        int continue_wait_next_msg = 0;
        int retval = msg_queue_get(&mp->ffplayer->msg_queue, msg, block);
        if (retval <= 0)
            return retval;
        switch (msg->what) {
        case FFP_MSG_PREPARED:
            MPTRACE("ijkmp_get_msg: FFP_MSG_PREPARED\n");
            pthread_mutex_lock(&mp->mutex);
            if (mp->mp_state == MP_STATE_ASYNC_PREPARING) {
                ijkmp_change_state_l(mp, MP_STATE_PREPARED);
            } else {
                // FIXME: 1: onError() ?
                av_log(mp->ffplayer, AV_LOG_DEBUG, "FFP_MSG_PREPARED: expecting mp_state==MP_STATE_ASYNC_PREPARING\n");
            }
            if (!mp->ffplayer->start_on_prepared) {
                ijkmp_change_state_l(mp, MP_STATE_PAUSED);
            }
            pthread_mutex_unlock(&mp->mutex);
            break;
        case FFP_MSG_COMPLETED:
            MPTRACE("ijkmp_get_msg: FFP_MSG_COMPLETED\n");
            pthread_mutex_lock(&mp->mutex);
            mp->restart = 1;
            mp->restart_from_beginning = 1;
            ijkmp_change_state_l(mp, MP_STATE_COMPLETED);
            pthread_mutex_unlock(&mp->mutex);
            break;
        case FFP_MSG_SEEK_COMPLETE:
            MPTRACE("ijkmp_get_msg: FFP_MSG_SEEK_COMPLETE\n");
            pthread_mutex_lock(&mp->mutex);
            mp->seek_req = 0;
            mp->seek_msec = 0;
            pthread_mutex_unlock(&mp->mutex);
            break;
        case FFP_REQ_START:
            MPTRACE("ijkmp_get_msg: FFP_REQ_START\n");
            continue_wait_next_msg = 1;
            pthread_mutex_lock(&mp->mutex);
            if (0 == ikjmp_chkst_start_l(mp->mp_state)) {
                // FIXME: 8 check seekable
                if (mp->restart) {
                    if (mp->restart_from_beginning) {
                        av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_START: restart from beginning\n");
                        retval = ffp_start_from_l(mp->ffplayer, 0);
                        if (retval == 0)
                            ijkmp_change_state_l(mp, MP_STATE_STARTED);
                    } else {
                        av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_START: restart from seek pos\n");
                        retval = ffp_start_l(mp->ffplayer);
                        if (retval == 0)
                            ijkmp_change_state_l(mp, MP_STATE_STARTED);
                    }
                    mp->restart = 0;
                    mp->restart_from_beginning = 0;
                } else {
                    av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_START: start on fly\n");
                    retval = ffp_start_l(mp->ffplayer);
                    if (retval == 0)
                        ijkmp_change_state_l(mp, MP_STATE_STARTED);
                }
            }
            pthread_mutex_unlock(&mp->mutex);
            break;
        case FFP_REQ_PAUSE:
            MPTRACE("ijkmp_get_msg: FFP_REQ_PAUSE\n");
            continue_wait_next_msg = 1;
            pthread_mutex_lock(&mp->mutex);
            if (0 == ikjmp_chkst_pause_l(mp->mp_state)) {
                int pause_ret = ffp_pause_l(mp->ffplayer);
                if (pause_ret == 0)
                    ijkmp_change_state_l(mp, MP_STATE_PAUSED);
            }
            pthread_mutex_unlock(&mp->mutex);
            break;
        case FFP_REQ_SEEK:
            MPTRACE("ijkmp_get_msg: FFP_REQ_SEEK\n");
            continue_wait_next_msg = 1;
            pthread_mutex_lock(&mp->mutex);
            if (0 == ikjmp_chkst_seek_l(mp->mp_state)) {
                mp->restart_from_beginning = 0;
                if (0 == ffp_seek_to_l(mp->ffplayer, msg->arg1)) {
                    av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_SEEK: seek to %d\n", (int)msg->arg1);
                }
            }
            pthread_mutex_unlock(&mp->mutex);
            break;
        }
        if (continue_wait_next_msg) {
            msg_free_res(msg);
            continue;
        }
        return retval;
    }
    return -1;
}
This function also has a while (1) loop that continuously fetches the player status information from mp->ffplayer->msg_queue and stores it into the msg object:
int retval = msg_queue_get(&mp->ffplayer->msg_queue, msg, block);
With this in mind, ijkplayer's state callback mechanism will be easy to follow in the coming sections.
5. SetDataSource and setSurface
- ijkplayer/ijkmedia/ijkplayer/ijkplayer.c
- ijkplayer/ijkmedia/ijkplayer/ijkplayer_internal.h
- ijkplayer/ijkmedia/ijkplayer/android/ijkplayer_android.c
- ijkplayer/ijkmedia/ijksdl/android/ijksdl_vout_android_surface.c
- ijkplayer/ijkmedia/ijksdl/android/ijksdl_vout_android_nativewindow.c
- ijkplayer/ijkmedia/ijkplayer/android/pipeline/ffpipeline_android.c
After initializing the IjkMediaPlayer instance, we call the setSurface and setDataSource methods to set the URL to play and the surface to render on, preparing for playback.
After setDataSource sets the URL successfully, the player state switches to MP_STATE_INITIALIZED; we'll look at ijkplayer's state transitions in a later section.
setDataSource eventually reaches the ijkmp_set_data_source_l function in ijkplayer/ijkmedia/ijkplayer/ijkplayer.c:
static int ijkmp_set_data_source_l(IjkMediaPlayer *mp, const char *url)
{
    assert(mp);
    assert(url);

    // MPST_RET_IF_EQ(mp->mp_state, MP_STATE_IDLE);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_INITIALIZED);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_ASYNC_PREPARING);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_PREPARED);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_STARTED);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_PAUSED);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_COMPLETED);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_STOPPED);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_ERROR);
    MPST_RET_IF_EQ(mp->mp_state, MP_STATE_END);

    freep((void**)&mp->data_source);
    mp->data_source = strdup(url);
    if (!mp->data_source)
        return EIJK_OUT_OF_MEMORY;

    ijkmp_change_state_l(mp, MP_STATE_INITIALIZED);
    return 0;
}
The video URL is stored in the data_source field of the IjkMediaPlayer struct defined in ijkplayer/ijkmedia/ijkplayer/ijkplayer_internal.h; subsequent requests for the video will use it:
struct IjkMediaPlayer {
volatile int ref_count;
pthread_mutex_t mutex;
FFPlayer *ffplayer;
int (*msg_loop)(void*);
SDL_Thread *msg_thread;
SDL_Thread _msg_thread;
int mp_state;
char *data_source;
void *weak_thiz;
int restart;
int restart_from_beginning;
int seek_req;
long seek_msec;
};
The upper-layer setSurface call reaches ijkmp_android_set_surface_l in ijkplayer/ijkmedia/ijkplayer/android/ijkplayer_android.c:
void ijkmp_android_set_surface_l(JNIEnv *env, IjkMediaPlayer *mp, jobject android_surface)
{
    if (!mp || !mp->ffplayer || !mp->ffplayer->vout)
        return;

    SDL_VoutAndroid_SetAndroidSurface(env, mp->ffplayer->vout, android_surface);
    ffpipeline_set_surface(env, mp->ffplayer->pipeline, android_surface);
}
android_surface is passed into the SDL_VoutAndroid_SetAndroidSurface function, where ANativeWindow_fromSurface converts it into an ANativeWindow. ANativeWindow is the canvas window that Android's OpenGL stack draws on; the local display uses OpenGL.
ffpipeline_set_surface in ijkplayer/ijkmedia/ijkplayer/android/pipeline/ffpipeline_android.c takes the mp->ffplayer->pipeline created during initialization and passes android_surface down to the lower layers:
int ffpipeline_set_surface(JNIEnv *env, IJKFF_Pipeline* pipeline, jobject surface)
{
    ALOGD("%s()\n", __func__);
    if (!check_ffpipeline(pipeline, __func__))
        return -1;

    IJKFF_Pipeline_Opaque *opaque = pipeline->opaque;
    if (!opaque->surface_mutex)
        return -1;

    ffpipeline_lock_surface(pipeline);
    {
        jobject prev_surface = opaque->jsurface;

        if ((surface == prev_surface) ||
            (surface && prev_surface && (*env)->IsSameObject(env, surface, prev_surface))) {
            // same object, no need to reconfigure
        } else {
            SDL_VoutAndroid_setAMediaCodec(opaque->weak_vout, NULL);
            if (surface) {
                opaque->jsurface = (*env)->NewGlobalRef(env, surface);
            } else {
                opaque->jsurface = NULL;
            }
            opaque->is_surface_need_reconfigure = true;
            if (prev_surface != NULL) {
                SDL_JNI_DeleteGlobalRefP(env, &prev_surface);
            }
        }
    }
    ffpipeline_unlock_surface(pipeline);

    return 0;
}
ffpipeline_android.c is the pipeline management module: it manages audio and video decoding, and the underlying decoding modules are managed through this file.
if ((surface == prev_surface) ||
(surface && prev_surface && (*env)->IsSameObject(env, surface, prev_surface))) {
// same object, no need to reconfigure
}
If the incoming surface is the same as the previous one, nothing needs to be done. Otherwise the decoder is reset via SDL_VoutAndroid_setAMediaCodec and the opaque->jsurface field is updated; decoded data will later be rendered onto this object.
6. PrepareAsync requests data
- ijkplayer/ijkmedia/ijkplayer/android/ijkplayer_jni.c
- ijkplayer/ijkmedia/ijkplayer/ijkplayer.c
- ijkplayer/ijkmedia/ijkplayer/ff_ffplay.c
As you can see from the sequence diagram above, the actual parsing is done in the stream_open function in ff_ffplay.c. This function does four things:
- 1. Initialize the queues for the video, subtitle, and audio streams, preparing to receive and parse the data of the three streams;
- 2. Prepare a child thread to parse video frames;
- 3. Parse the media URL on a child thread;
- 4. Initialize the decoders.
Step 3, parsing the media URL on a child thread, is the core work of the prepareAsync process. The read_thread function is very large, nearly 600 lines, so its code is not attached here.
First, a brief introduction to the AVFormatContext struct in ijkplayer/extra/ffmpeg/libavformat/avformat.h. This struct carries the raw data after parsing: a video file may contain a video stream, a subtitle stream, and one or more audio streams; the relevant parsing functions are called, and once parsing completes the results are stored in this struct.
/**
* A list of all streams in the file. New streams are created with
* avformat_new_stream().
*
* - demuxing: streams are created by libavformat in avformat_open_input().
* If AVFMTCTX_NOHEADER is set in ctx_flags, then new streams may also
* appear in av_read_frame().
* - muxing: streams are created by the user before avformat_write_header().
*
* Freed by libavformat in avformat_free_context().
*/
AVStream **streams;
All the track streams in the video will be placed in this AVStream pointer array;
First, ic = avformat_alloc_context() is called to allocate the context space; then:
err = avformat_open_input(&ic, is->filename, is->iformat, &ffp->format_opts);
The avformat_open_input function is responsible for parsing the video URL, whether it is a network URL or a local file. This function is also quite complex; after parsing, the result is stored at the ic address, i.e. in the AVFormatContext struct.
Now the streams array in ic contains all the track information; av_find_best_stream is called to pick out the video, audio, and subtitle streams:
st_index[AVMEDIA_TYPE_VIDEO] =
av_find_best_stream(ic, AVMEDIA_TYPE_VIDEO,
st_index[AVMEDIA_TYPE_VIDEO], -1, NULL, 0);
st_index[AVMEDIA_TYPE_AUDIO] =
av_find_best_stream(ic, AVMEDIA_TYPE_AUDIO,
st_index[AVMEDIA_TYPE_AUDIO],
st_index[AVMEDIA_TYPE_VIDEO],
NULL, 0);
st_index[AVMEDIA_TYPE_SUBTITLE] =
av_find_best_stream(ic, AVMEDIA_TYPE_SUBTITLE,
st_index[AVMEDIA_TYPE_SUBTITLE],
(st_index[AVMEDIA_TYPE_AUDIO] >= 0 ?
st_index[AVMEDIA_TYPE_AUDIO] :
st_index[AVMEDIA_TYPE_VIDEO]),
NULL, 0);
Then each track stream is opened and its contents are read; the actual function is stream_component_open. For now it is enough to understand the general flow; the detailed parsing will be explained separately:
/* open the streams */
if (st_index[AVMEDIA_TYPE_AUDIO] >= 0) {
stream_component_open(ffp, st_index[AVMEDIA_TYPE_AUDIO]);
} else {
ffp->av_sync_type = AV_SYNC_VIDEO_MASTER;
is->av_sync_type = ffp->av_sync_type;
}
ret = -1;
if (st_index[AVMEDIA_TYPE_VIDEO] >= 0) {
ret = stream_component_open(ffp, st_index[AVMEDIA_TYPE_VIDEO]);
}
if (is->show_mode == SHOW_MODE_NONE)
is->show_mode = ret >= 0 ? SHOW_MODE_VIDEO : SHOW_MODE_RDFT;
if (st_index[AVMEDIA_TYPE_SUBTITLE] >= 0) {
stream_component_open(ffp, st_index[AVMEDIA_TYPE_SUBTITLE]);
}
ffp_notify_msg1(ffp, FFP_MSG_COMPONENT_OPEN);
All of this is done before onPrepared is called; the video width and height come back through the onVideoSizeChanged(...) callback:
if (is->video_st && is->video_st->codecpar) {
AVCodecParameters *codecpar = is->video_st->codecpar;
ffp_notify_msg3(ffp, FFP_MSG_VIDEO_SIZE_CHANGED, codecpar->width, codecpar->height);
ffp_notify_msg3(ffp, FFP_MSG_SAR_CHANGED, codecpar->sample_aspect_ratio.num, codecpar->sample_aspect_ratio.den);
}
ffp->prepared = true;
ffp_notify_msg1(ffp, FFP_MSG_PREPARED);
7. Play the video
The start call is relatively simple; it goes directly to the ijkmp_start_l function in ijkplayer/ijkmedia/ijkplayer/ijkplayer.c:
static int ijkmp_start_l(IjkMediaPlayer *mp)
{
assert(mp);
MP_RET_IF_FAILED(ikjmp_chkst_start_l(mp->mp_state));
ffp_remove_msg(mp->ffplayer, FFP_REQ_START);
ffp_remove_msg(mp->ffplayer, FFP_REQ_PAUSE);
ffp_notify_msg1(mp->ffplayer, FFP_REQ_START);
return 0;
}
When the player was initialized, ijkplayer started a message loop to process messages from all parties; here ffp_notify_msg1 posts the FFP_REQ_START message. Let's go straight to where the message is handled to see how it is processed:
case FFP_REQ_START:
MPTRACE("ijkmp_get_msg: FFP_REQ_START\n");
continue_wait_next_msg = 1;
pthread_mutex_lock(&mp->mutex);
if (0 == ikjmp_chkst_start_l(mp->mp_state)) {
// FIXME: 8 check seekable
if (mp->restart) {
if (mp->restart_from_beginning) {
av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_START: restart from beginning\n");
retval = ffp_start_from_l(mp->ffplayer, 0);
if (retval == 0)
ijkmp_change_state_l(mp, MP_STATE_STARTED);
} else {
av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_START: restart from seek pos\n");
retval = ffp_start_l(mp->ffplayer);
if (retval == 0)
ijkmp_change_state_l(mp, MP_STATE_STARTED);
}
mp->restart = 0;
mp->restart_from_beginning = 0;
} else {
av_log(mp->ffplayer, AV_LOG_DEBUG, "ijkmp_get_msg: FFP_REQ_START: start on fly\n");
retval = ffp_start_l(mp->ffplayer);
if (retval == 0)
ijkmp_change_state_l(mp, MP_STATE_STARTED);
}
}
pthread_mutex_unlock(&mp->mutex);
break;
You can see that ffp_start_l (or ffp_start_from_l) is eventually invoked to play the video.
8. Summary
- In this chapter we only briefly introduced the player's playback flow, as well as the main files and functions executed along the way;
- The principles of each module will be covered in later updates.