
Preface

Small Box's live-streaming feature was designed to switch between two engines, so both had to be wrapped behind a common interface. In addition, the official documentation for Agora and Ali's live SDK is not very comprehensive, and some of it is even wrong (possibly just out of date), which caused many problems during integration even for normal operations. This post therefore records the problems encountered during integration, along with their solutions.

Defining the interface

First, since we need to switch between the two engines, we define an interface that captures their common behavior:

public interface RtcEngine {
    void init(Context context, RtcInfo config);
    void join();
    void leave();
    void setRtcListener(RtcListener rtcListener);
}

Here RtcInfo holds the parameters required by both SDKs, provided by the server. We supply it once at initialization, but it could also be supplied at join time; in that case the join function would need some extra parameters.

RtcInfo is defined as follows:

public class RtcInfo {
    public AgoraConfig agoraConfig;
    public AliConfig aliConfig;
    public String rtcType;
}
public class AgoraConfig {
    public String liveChannel;
    public String appId;
    public int avatarUID;
    public int liveUID;
    public String liveToken;
}
public class AliConfig {
    public String liveChannel;
    public String appId;
    public String avatarUID;
    public String liveUID;
    public String liveToken;
    public String gslb;
    public List<String> gslbList;
    public long timeStamp;
    public String nonce;
}
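One small detail worth calling out: the Ali SDK's setGslb expects a String[], while gslbList arrives from the server as a List&lt;String&gt;. A minimal sketch of the conversion (the class name here is made up for illustration; in the real code it happens inline in join()):

    import java.util.List;

    public class GslbDemo {
        // Mirrors the List -> array conversion done before info.setGslb(...) in join()
        public static String[] toGslbArray(List<String> gslbList) {
            return gslbList.toArray(new String[gslbList.size()]);
        }
    }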

There is also a listener, RtcListener, which unifies the callbacks of the two SDKs and can be extended as needed:

public interface RtcListener {
    // Called when the remote stream arrives; add remoteView to the page to display it
    void remoteOnline(View remoteView);
    void remoteOffline();
}

Accessing Agora

Agora's wrapper class implements the RtcEngine interface:

public class AgoraEngine implements RtcEngine {
    private final String TAG = this.getClass().getSimpleName();
    private Context mContext;
    private io.agora.rtc.RtcEngine engine;
    private RtcInfo mConfig;
    private RtcListener listener;
    private SurfaceView mRemoteView;

    private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() {
        @Override
        public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
            super.onJoinChannelSuccess(channel, uid, elapsed);
        }

        @Override
        public void onLeaveChannel(RtcStats rtcStats) {
            super.onLeaveChannel(rtcStats);
        }

        @Override
        public void onUserOffline(int uid, int reason) {
            super.onUserOffline(uid, reason);
        }

        @Override
        public void onWarning(int warn) {
            super.onWarning(warn);
        }

        @Override
        public void onError(int err) {
            super.onError(err);
        }

        @Override
        public void onUserJoined(final int uid, int elapsed) {
            super.onUserJoined(uid, elapsed);
            // A remote stream has arrived. There are two streams, but we only need the
            // teacher's, so filter by uid and mute everything else.
            if (uid != mConfig.agoraConfig.avatarUID && uid < xxxx) {
                if (engine != null) {
                    engine.muteRemoteAudioStream(uid, true);
                    engine.muteRemoteVideoStream(uid, true);
                }
                return;
            }
            if (uid == mConfig.agoraConfig.avatarUID) {
                // The uid matches the teacher's ID: create the remote view and display it
                new Handler(mContext.getMainLooper()).post(new Runnable() {
                    @Override
                    public void run() {
                        mRemoteView = io.agora.rtc.RtcEngine.CreateRendererView(mContext);
                        mRemoteView.setActivated(true);
                        mRemoteView.setEnabled(true);
                        if (listener != null) {
                            // Hand the view to the page for display
                            listener.remoteOnline(mRemoteView);
                        }
                        engine.setupRemoteVideo(new VideoCanvas(mRemoteView, VideoCanvas.RENDER_MODE_HIDDEN, uid));
                        // Set render and mirror modes (mirroring is off by default).
                        // This must be called after setupRemoteVideo and can be called multiple times.
                        engine.setRemoteRenderMode(uid, VideoCanvas.RENDER_MODE_HIDDEN, Constants.VIDEO_MIRROR_MODE_ENABLED);
                    }
                });
            }
        }

        @Override
        public void onFirstRemoteVideoFrame(final int uid, int width, int height, int elapsed) {
            // The official documentation says the first frame arrives here and that the
            // remote view should be set up and displayed in this callback. In practice
            // this callback never fires, so the remote view is handled in onUserJoined.
        }
    };

    @Override
    public void init(Context context, RtcInfo config) {
        mConfig = config;
        mContext = context.getApplicationContext();
        try {
            engine = io.agora.rtc.RtcEngine.create(mContext, config.agoraConfig.appId, iRtcEngineEventHandler);
            engine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
            engine.setVideoProfile(Constants.VIDEO_PROFILE_240P_4, false);
            engine.setClientRole(Constants.CLIENT_ROLE_AUDIENCE);
            engine.enableVideo();
            engine.setParameters("{\"che.audio.keep.audiosession\":true}");
        } catch (Exception e) {
            Log.e(TAG, TAG, e);
            engine = null;
        }
    }

    @Override
    public void join() {
        if (engine != null) {
            engine.joinChannel(mConfig.agoraConfig.liveToken, mConfig.agoraConfig.liveChannel, "", mConfig.agoraConfig.liveUID);
        }
    }

    @Override
    public void leave() {
        if (engine != null) {
            engine.leaveChannel();
            io.agora.rtc.RtcEngine.destroy();
            engine = null;
        }
    }

    @Override
    public void setRtcListener(RtcListener rtcListener) {
        listener = rtcListener;
    }
}

Note that the official documentation says the first frame is received in onFirstRemoteVideoFrame, and that the remote view should be set up and displayed in that callback. In practice the callback never fired at all, so the remote view is handled in onUserJoined instead, which is also what the official demo does. (I don't know whether this has since been fixed.)

onUserOffline is not handled in the code above; that logic was added later. One important note: the uid must be verified there, otherwise problems can occur. For example, we need to make some page adjustments when the teacher leaves the room, but without a uid check, that code would also run when other users (with special roles) leave.
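The uid guard described above boils down to a single comparison; a minimal sketch (the helper class name is made up for illustration, the real check would live inside onUserOffline):

    public class UidGuard {
        // avatarUID is the teacher's uid from the server config;
        // only react when the departing uid matches it
        public static boolean isTeacher(int uid, int avatarUID) {
            return uid == avatarUID;
        }
    }

In onUserOffline, the page adjustments would then run only when isTeacher(uid, mConfig.agoraConfig.avatarUID) is true; anyone else leaving is ignored.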

Accessing Ali Live

Ali's wrapper class also implements the RtcEngine interface:

public class AliEngine implements RtcEngine {
    private final String TAG = this.getClass().getSimpleName();
    private Context mContext;
    private AliRtcEngine mEngine;
    private RtcInfo mConfig;
    //private SophonSurfaceView mRemoteView;
    private AliRtcEngine.AliVideoCanvas mCanvas;
    private RtcListener listener;

    private AliRtcEngineEventListener aliRtcEngineEventListener = new AliRtcEngineEventListener() {
        @Override
        public void onJoinChannelResult(int result) {
            super.onJoinChannelResult(result);
        }

        @Override
        public void onLeaveChannelResult(int result) {
            super.onLeaveChannelResult(result);
        }

        @Override
        public void onNetworkQualityChanged(String uid, AliRtcEngine.AliRtcNetworkQuality upQuality, AliRtcEngine.AliRtcNetworkQuality downQuality) {
            super.onNetworkQualityChanged(uid, upQuality, downQuality);
        }

        @Override
        public void onOccurWarning(int warn) {
            super.onOccurWarning(warn);
        }

        @Override
        public void onOccurError(int error) {
            super.onOccurError(error);
        }
    };

    private AliRtcEngineNotify aliRtcEngineNotify = new AliRtcEngineNotify() {
        @Override
        public void onRemoteUserOnLineNotify(String uid) {
            super.onRemoteUserOnLineNotify(uid);
        }

        @Override
        public void onRemoteUserOffLineNotify(String uid) {
            super.onRemoteUserOffLineNotify(uid);
        }

        @Override
        public void onRemoteTrackAvailableNotify(final String uid, AliRtcEngine.AliRtcAudioTrack audioTrack, final AliRtcEngine.AliRtcVideoTrack videoTrack) {
            super.onRemoteTrackAvailableNotify(uid, audioTrack, videoTrack);
            // First notification that a remote stream is available;
            // check whether it is the teacher's stream
            if (uid.equals(mConfig.aliConfig.avatarUID)) {
                // mEngine.configRemoteAudio(mConfig.aliConfig.avatarUID, true);
                // mEngine.configRemoteScreenTrack(mConfig.aliConfig.avatarUID, true);
                // mEngine.configRemoteCameraTrack(mConfig.aliConfig.avatarUID, true, true);
                // mEngine.subscribe(mConfig.aliConfig.avatarUID);
                new Handler(mContext.getMainLooper()).post(new Runnable() {
                    @Override
                    public void run() {
                        if (mEngine == null) {
                            return;
                        }
                        AliRtcRemoteUserInfo info = mEngine.getUserInfo(uid);
                        if (info == null) {
                            return;
                        }
                        AliRtcEngine.AliVideoCanvas cameraCanvas = info.getCameraCanvas();
                        AliRtcEngine.AliVideoCanvas screenCanvas = info.getScreenCanvas();
                        if (videoTrack == AliRtcEngine.AliRtcVideoTrack.AliRtcVideoTrackNo) {
                            screenCanvas = null;
                            cameraCanvas = null;
                        } else if (videoTrack == AliRtcEngine.AliRtcVideoTrack.AliRtcVideoTrackCamera) {
                            // We only need the camera stream: create the remote view and display it
                            mCanvas = new AliRtcEngine.AliVideoCanvas();
                            SophonSurfaceView mRemoteView = new SophonSurfaceView(mContext);
                            if (listener != null) {
                                listener.remoteOnline(mRemoteView);
                            }
                            mRemoteView.setZOrderOnTop(true);
                            mRemoteView.setZOrderMediaOverlay(true);
                            mCanvas.view = mRemoteView;
                            // Set the render mode
                            mCanvas.renderMode = AliRtcEngine.AliRtcRenderMode.AliRtcRenderModeFill;
                            // Set the mirror mode (mirroring is off by default)
                            mCanvas.mirrorMode = AliRtcEngine.AliRtcRenderMirrorMode.AliRtcRenderMirrorModeAllEnabled;
                            screenCanvas = null;
                            cameraCanvas = mCanvas;
                            mEngine.setRemoteViewConfig(cameraCanvas, uid, AliRtcEngine.AliRtcVideoTrack.AliRtcVideoTrackCamera);
                        }
                    }
                });
            }
        }

        @Override
        public void onFirstRemoteVideoFrameDrawn(String uid, AliRtcEngine.AliRtcVideoTrack videoTrack) {
            super.onFirstRemoteVideoFrameDrawn(uid, videoTrack);
        }

        @Override
        public void onBye(int code) {
            super.onBye(code);
        }

        @Override
        public void onAliRtcStats(AliRtcEngine.AliRtcStats aliRtcStats) {
            super.onAliRtcStats(aliRtcStats);
        }

        @Override
        public void onMessage(String tid, String contentType, String content) {
            super.onMessage(tid, contentType, content);
        }
    };

    @Override
    public void init(Context context, RtcInfo config) {
        mContext = context;
        mConfig = config;
        mEngine = AliRtcEngine.getInstance(context);
        mEngine.setRtcEngineEventListener(aliRtcEngineEventListener);
        mEngine.setRtcEngineNotify(aliRtcEngineNotify);
        mEngine.setClientRole(AliRtcEngine.AliRTCSDK_Client_Role.AliRTCSDK_live);
        mEngine.setChannelProfile(AliRtcEngine.AliRTCSDK_Channel_Profile.AliRTCSDK_Interactive_live);
        // Do not auto-publish; auto-subscribe to remote streams
        mEngine.setAutoPublishSubscribe(false, true);
    }

    @Override
    public void join() {
        AliRtcAuthInfo info = new AliRtcAuthInfo();
        info.setConferenceId(mConfig.aliConfig.liveChannel);
        info.setAppid(mConfig.aliConfig.appId);
        info.setNonce(mConfig.aliConfig.nonce);
        info.setTimestamp(mConfig.aliConfig.timeStamp);
        info.setUserId(mConfig.aliConfig.liveUID);
        int size = mConfig.aliConfig.gslbList.size();
        info.setGslb(mConfig.aliConfig.gslbList.toArray(new String[size]));
        info.setToken(mConfig.aliConfig.liveToken);
        if (mEngine != null) {
            mEngine.joinChannel(info, "");
        }
    }

    @Override
    public void leave() {
        if (mEngine != null) {
            mEngine.leaveChannel();
        }
        mEngine = null;
    }

    @Override
    public void setRtcListener(RtcListener rtcListener) {
        listener = rtcListener;
    }
}

The Ali integration is similar to the Agora one, and so are the caveats; since the key parts are already commented in the code, I won't go into more detail here.

Conclusion

With this in place, before entering the live room the client fetches the live configuration from the backend and initializes the appropriate engine according to its type.
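As a minimal sketch of that selection step: the rtcType values "agora" and "ali" below are assumptions (use whatever strings the backend actually sends), and the returned names stand in for constructing new AgoraEngine() or new AliEngine():

    public class RtcEngineChooser {
        // Hypothetical chooser: pick the wrapper by the server-provided rtcType
        public static String engineFor(String rtcType) {
            if ("agora".equalsIgnoreCase(rtcType)) {
                return "AgoraEngine"; // wraps the Agora SDK
            }
            if ("ali".equalsIgnoreCase(rtcType)) {
                return "AliEngine";   // wraps the Ali live SDK
            }
            throw new IllegalArgumentException("unknown rtcType: " + rtcType);
        }
    }

Since both wrappers implement RtcEngine, the calling page only ever sees init/join/leave/setRtcListener and never cares which SDK is underneath.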