Let me take a deep breath first. I actually started watching content about video calls and live streaming quite early on, but these topics felt esoteric, involving so much encoding and decoding knowledge that many of us programmers assumed they were out of reach. Later I read up on WebRTC and some of its applications, and for the moment the bar seems a little lower. So I overestimated myself, put together a video call demo as a small introduction, and planted a flag for studying Android further in the future. All right, back to the point. The call scheme I am building today is extremely simplified. Why? Because there is no backend and no signaling server; in plain language, it is pure P2P video transmission. Someone is going to ask what the point of that is. I see this as a pure learning blog: we still have to truly absorb the material ourselves so we can hook it up to the features our own apps already have and implement the relevant business. For example, if your app already does text chat, you do not need a separate signaling server to wire up a peer-to-peer video call.
Without further ado, let's get started on the WebRTC peer-to-peer video call demo.
Before we start coding, I want to say a few words about signaling. The first piece of signaling we need to pass is the Candidate, which is produced after we create the PeerConnection. The second is the SDP, which is produced by createOffer and createAnswer. Since there is no signaling server for these two pieces of data, we temporarily dump them into an EditText after they are generated, send them to the other party through WeChat or any other messaging tool, and the other party pastes them into their own copy of the app to complete the exchange. I will explain later what needs to change for a production environment.
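To make it easier to picture what will eventually replace the copy-and-paste, here is a minimal sketch (my own illustration, not part of the demo code) of the interface a real signaling channel needs to offer; any transport you already have, such as a chat long connection, could implement it.

//Hypothetical interface, for illustration only: in this post the "implementation"
//is you, copying the strings through WeChat or another chat tool.
public interface SignalingChannel {
    //type is "offer" or "answer"; description is the SDP text from createOffer/createAnswer
    void sendSdp(String type, String description);

    //candidateJson is the JSON string built in onIceCandidate (type/label/id/candidate)
    void sendCandidate(String candidateJson);
}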
I won't walk through every step in detail; let's go straight to the code.
1. Import dependencies
implementation 'org.webrtc:google-webrtc:1.0.+'
implementation 'com.google.code.gson:gson:2.6.2'
The first dependency is the WebRTC library, which is required. The second is Gson, which is optional; it is only here to make passing the signaling data around more convenient.
2. Set the minimum SDK version
minSdkVersion 21
3. Set permissions
Add the following permissions to the manifest file
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-feature android:glEsVersion="0x00020000" android:required="true" />

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_SETTINGS" tools:ignore="ProtectedPermissions" />
<uses-permission android:name="android.permission.NFC" />
<!-- Temporarily used for LAN communication -->
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.READ_LOGS" tools:ignore="ProtectedPermissions" />
Don't forget to add the tools namespace to the manifest root node
xmlns:tools="http://schemas.android.com/tools"
4. Next, let’s build the layout file
<?xml version="1.0" encoding="utf-8"?>
<ScrollView xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:scrollbars="none"
    android:background="@android:color/white">

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical"
        android:background="@android:color/white">

        <org.webrtc.SurfaceViewRenderer
            android:id="@+id/LocalSurfaceView"
            android:layout_width="match_parent"
            android:layout_height="400dp" />

        <View
            android:layout_width="match_parent"
            android:layout_height="1dp"
            android:background="@android:color/white" />

        <org.webrtc.SurfaceViewRenderer
            android:id="@+id/RemoteSurfaceView"
            android:layout_width="match_parent"
            android:layout_height="400dp" />

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/et_offer"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_offer"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set offer" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/et_answer"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_answer"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set answer" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/tv_candicate1"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_candidate1"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set candidate" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/tv_candicate2"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_candidate2"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set candidate" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/tv_candicate3"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_candidate3"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set candidate" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/tv_candicate4"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_candidate4"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set candidate" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/tv_candicate5"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_candidate5"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set candidate" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="match_parent" android:layout_height="50dp"
            android:orientation="horizontal">
            <EditText
                android:id="@+id/tv_candicate6"
                android:layout_width="0dp" android:layout_height="match_parent" android:layout_weight="1" />
            <Button
                android:id="@+id/bt_candidate6"
                android:layout_width="wrap_content" android:layout_height="wrap_content"
                android:text="Set candidate" />
        </LinearLayout>
    </LinearLayout>
</ScrollView>
There is a whole pile of EditTexts and Buttons here that you can mostly ignore; I only use them to paste in and set the Candidate and SDP by hand, and they would not exist in a real project. What we mainly care about are the two org.webrtc.SurfaceViewRenderer controls. This is the ready-made view that will display our local and remote video; just adjust the size of the two controls to suit your layout.
5. Declare the member variables of MainActivity, the call page
private EglBase mRootEglBase;
private SurfaceViewRenderer mLocalSurfaceView;
private SurfaceViewRenderer mRemoteSurfaceView;
private PeerConnectionFactory mPeerConnectionFactory;
private VideoCapturer mVideoCapturer;
private SurfaceTextureHelper mSurfaceTextureHelper;
private VideoTrack mVideoTrack;
private AudioTrack mAudioTrack;
public static final String VIDEO_TRACK_ID = "1"; //"ARDAMSv0";
public static final String AUDIO_TRACK_ID = "2"; //"ARDAMSa0";
private static final int VIDEO_RESOLUTION_WIDTH = 1280;
private static final int VIDEO_RESOLUTION_HEIGHT = 720;
private static final int VIDEO_FPS = 30;
private PeerConnection mPeerConnection;
private AudioManager audioManager;

private static String TAG = "aaaaaaaaaaaaaaaaaaaa";
private EditText et_offer;
private Button bt_offer;
private EditText et_answer;
private Button bt_answer;
private TextView tv_candicate1;
private TextView tv_candicate2;
private TextView tv_candicate3;
private TextView tv_candicate4;
private TextView tv_candicate5;
private TextView tv_candicate6;
private int count = 0;
private Button bt_candidate1;
private Button bt_candidate2;
private Button bt_candidate3;
private Button bt_candidate4;
private Button bt_candidate5;
private Button bt_candidate6;
The fields above the blank line are the WebRTC-related properties the call actually needs; the fields below it are only used by this demo's UI and have nothing to do with the video call itself.
6. Next, request the runtime permissions on this page in whatever way you prefer. Here I show one approach using Guo Lin's PermissionX framework.
Adding framework dependencies
implementation 'com.permissionx.guolindev:permission-support:1.4.0'
Call it in onCreate
PermissionX.init(this)
        .permissions(Manifest.permission.READ_PHONE_STATE,
                Manifest.permission.WRITE_EXTERNAL_STORAGE,
                Manifest.permission.READ_EXTERNAL_STORAGE,
                Manifest.permission.CAMERA,
                Manifest.permission.RECORD_AUDIO)
        .onExplainRequestReason(new ExplainReasonCallbackWithBeforeParam() {
            @Override
            public void onExplainReason(ExplainScope scope, List<String> deniedList, boolean beforeRequest) {
                scope.showRequestReasonDialog(deniedList, "The permissions about to be requested are required for the app to work", "I understand");
            }
        })
        .onForwardToSettings(new ForwardToSettingsCallback() {
            @Override
            public void onForwardToSettings(ForwardScope scope, List<String> deniedList) {
                scope.showForwardToSettingsDialog(deniedList, "You need to grant these permissions manually in the app settings", "I understand");
            }
        })
        .request(new RequestCallback() {
            @Override
            public void onResult(boolean allGranted, List<String> grantedList, List<String> deniedList) {
                if (allGranted) {
                    // TODO:
                } else {
                    Toast.makeText(MainActivity.this, "You denied the following permissions: " + deniedList, Toast.LENGTH_SHORT).show();
                }
            }
        });
The TODO branch is where we will add the next step's code later, calling init() once every permission has been granted.
There are five main dynamic permissions applied here.
7. Add the following code to the Activity
/**
* Start the WebRTC initialization
*/
private void init() {
//Objects required for video initialization
mRootEglBase = EglBase.create();
//The two video views
mLocalSurfaceView = findViewById(R.id.LocalSurfaceView);
mRemoteSurfaceView = findViewById(R.id.RemoteSurfaceView);
et_offer = findViewById(R.id.et_offer);
bt_offer = findViewById(R.id.bt_offer);
et_answer = findViewById(R.id.et_answer);
bt_answer = findViewById(R.id.bt_answer);
tv_candicate1 = findViewById(R.id.tv_candicate1);
tv_candicate2 = findViewById(R.id.tv_candicate2);
tv_candicate3 = findViewById(R.id.tv_candicate3);
tv_candicate4 = findViewById(R.id.tv_candicate4);
tv_candicate5 = findViewById(R.id.tv_candicate5);
tv_candicate6 = findViewById(R.id.tv_candicate6);
bt_candidate1 = findViewById(R.id.bt_candidate1);
bt_candidate2 = findViewById(R.id.bt_candidate2);
bt_candidate3 = findViewById(R.id.bt_candidate3);
bt_candidate4 = findViewById(R.id.bt_candidate4);
bt_candidate5 = findViewById(R.id.bt_candidate5);
bt_candidate6 = findViewById(R.id.bt_candidate6);
//Initialize the local video view
mLocalSurfaceView.init(mRootEglBase.getEglBaseContext(), null);
mLocalSurfaceView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FILL);
mLocalSurfaceView.setMirror(true);
mLocalSurfaceView.setEnableHardwareScaler(false /* enabled */);
//Initialize the remote video view
mRemoteSurfaceView.init(mRootEglBase.getEglBaseContext(), null);
mRemoteSurfaceView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FILL);
mRemoteSurfaceView.setMirror(true);
mRemoteSurfaceView.setEnableHardwareScaler(false /* enabled */);
mRemoteSurfaceView.setZOrderMediaOverlay(true);
//Create the PeerConnectionFactory object
mPeerConnectionFactory = createPeerConnectionFactory(this);
//Logging output settings
Logging.enableLogToDebugOutput(Logging.Severity.LS_VERBOSE);
//Initialize the VideoCapturer object
mVideoCapturer = createVideoCapturer();
//Initialize the video source; required
mSurfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", mRootEglBase.getEglBaseContext());
VideoSource videoSource = mPeerConnectionFactory.createVideoSource(false);
mVideoCapturer.initialize(mSurfaceTextureHelper, getApplicationContext(), videoSource.getCapturerObserver());
//Initialize the video track and attach it to the local view
mVideoTrack = mPeerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
mVideoTrack.setEnabled(true);
mVideoTrack.addSink(mLocalSurfaceView);
//Initialize the audio track
AudioSource audioSource = mPeerConnectionFactory.createAudioSource(new MediaConstraints());
mAudioTrack = mPeerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
mAudioTrack.setEnabled(true);
//Initialize the audio manager
audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
setSpeakerphoneOn(true);
bt_offer.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
//Set the remote description after receiving the peer's signaling (the offer)
try {
String description = et_offer.getText().toString();
et_offer.setText("");
mPeerConnection.setRemoteDescription(
new SimpleSdpObserver() {
@Override
public void onSetSuccess() {
Log.i(TAG, "set remote Description ok");
doAnswerCall();
}
@Override
public void onSetFailure(String msg) {
Log.i(TAG, "set remote Description unok");
}
},
new SessionDescription(
SessionDescription.Type.OFFER,
description));
} catch (Exception e) {
e.printStackTrace();
}
}
});
bt_answer.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
String answer = et_answer.getText().toString();
et_answer.setText("");
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
mPeerConnection.setRemoteDescription(
new SimpleSdpObserver() {
@Override
public void onSetSuccess() {
}
@Override
public void onSetFailure(String msg) {
}
},
new SessionDescription(
SessionDescription.Type.ANSWER,
answer));
}
});
bt_candidate1.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
String json = tv_candicate1.getText().toString().trim();
tv_candicate1.setText("");
CandidateBean candidateBean = new Gson().fromJson(json, CandidateBean.class);
IceCandidate remoteIceCandidate =
new IceCandidate(candidateBean.getId(),
candidateBean.getLabel(),
candidateBean.getCandidate());
mPeerConnection.addIceCandidate(remoteIceCandidate);
}
});
bt_candidate2.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
String json = tv_candicate2.getText().toString().trim();
tv_candicate2.setText("");
CandidateBean candidateBean = new Gson().fromJson(json, CandidateBean.class);
IceCandidate remoteIceCandidate =
new IceCandidate(candidateBean.getId(),
candidateBean.getLabel(),
candidateBean.getCandidate());
mPeerConnection.addIceCandidate(remoteIceCandidate);
}
});
bt_candidate3.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
String json = tv_candicate3.getText().toString().trim();
tv_candicate3.setText("");
CandidateBean candidateBean = new Gson().fromJson(json, CandidateBean.class);
IceCandidate remoteIceCandidate =
new IceCandidate(candidateBean.getId(),
candidateBean.getLabel(),
candidateBean.getCandidate());
mPeerConnection.addIceCandidate(remoteIceCandidate);
}
});
bt_candidate4.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
String json = tv_candicate4.getText().toString().trim();
tv_candicate4.setText("");
CandidateBean candidateBean = new Gson().fromJson(json, CandidateBean.class);
IceCandidate remoteIceCandidate =
new IceCandidate(candidateBean.getId(),
candidateBean.getLabel(),
candidateBean.getCandidate());
mPeerConnection.addIceCandidate(remoteIceCandidate);
}
});
bt_candidate5.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
String json = tv_candicate5.getText().toString().trim();
tv_candicate5.setText("");
CandidateBean candidateBean = new Gson().fromJson(json, CandidateBean.class);
IceCandidate remoteIceCandidate =
new IceCandidate(candidateBean.getId(),
candidateBean.getLabel(),
candidateBean.getCandidate());
mPeerConnection.addIceCandidate(remoteIceCandidate);
}
});
bt_candidate6.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
String json = tv_candicate6.getText().toString().trim();
tv_candicate6.setText("");
CandidateBean candidateBean = new Gson().fromJson(json, CandidateBean.class);
IceCandidate remoteIceCandidate =
new IceCandidate(candidateBean.getId(),
candidateBean.getLabel(),
candidateBean.getCandidate());
mPeerConnection.addIceCandidate(remoteIceCandidate);
}
});
mVideoCapturer.startCapture(VIDEO_RESOLUTION_WIDTH, VIDEO_RESOLUTION_HEIGHT, VIDEO_FPS);
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
// doStartCall();
}
/**
* Answer the call
*/
public void doAnswerCall() {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
MediaConstraints sdpMediaConstraints = new MediaConstraints();
mPeerConnection.createAnswer(new SimpleSdpObserver() {
@Override
public void onCreateSuccess(final SessionDescription sessionDescription) {
mPeerConnection.setLocalDescription(new SimpleSdpObserver() {
@Override
public void onSetFailure(String msg) {
}
@Override
public void onSetSuccess() {
runOnUiThread(new Runnable() {
@Override
public void run() {
et_answer.setText(sessionDescription.description);
}
});
}
},
sessionDescription);
}
@Override
public void onCreateFailure(String msg) {
}
}, sdpMediaConstraints);
}
/**
* Create the PeerConnectionFactory object
*
* @param context
* @return
*/
public PeerConnectionFactory createPeerConnectionFactory(Context context) {
final VideoEncoderFactory encoderFactory;
final VideoDecoderFactory decoderFactory;
encoderFactory = new DefaultVideoEncoderFactory(
mRootEglBase.getEglBaseContext(),
false /* enableIntelVp8Encoder */,
true);
decoderFactory = new DefaultVideoDecoderFactory(mRootEglBase.getEglBaseContext());
PeerConnectionFactory.initialize(PeerConnectionFactory.InitializationOptions.builder(context)
.setEnableInternalTracer(true)
.createInitializationOptions());
PeerConnectionFactory.Builder builder = PeerConnectionFactory.builder()
.setVideoEncoderFactory(encoderFactory)
.setVideoDecoderFactory(decoderFactory);
builder.setOptions(null);
return builder.createPeerConnectionFactory();
}
/**
* Create the VideoCapturer object
*
* @return
*/
private VideoCapturer createVideoCapturer() {
if (Camera2Enumerator.isSupported(this)) {
return createCameraCapturer(new Camera2Enumerator(this));
} else {
return createCameraCapturer(new Camera1Enumerator(true));
}
}
/**
* Initialize the VideoCapturer for the given camera enumerator, preferring the front-facing camera
*
* @param enumerator
* @return
*/
private VideoCapturer createCameraCapturer(CameraEnumerator enumerator) {
final String[] deviceNames = enumerator.getDeviceNames();
for (String deviceName : deviceNames) {
if (enumerator.isFrontFacing(deviceName)) {
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
if (videoCapturer != null) {
return videoCapturer;
}
}
}
for (String deviceName : deviceNames) {
if (!enumerator.isFrontFacing(deviceName)) {
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
if (videoCapturer != null) {
return videoCapturer;
}
}
}
return null;
}
@Override
protected void onDestroy() {
super.onDestroy();
// TODO: 2021/1/6 still missing a hang-up action
mLocalSurfaceView.release();
mRemoteSurfaceView.release();
mVideoCapturer.dispose();
mSurfaceTextureHelper.dispose();
PeerConnectionFactory.stopInternalTracingCapture();
PeerConnectionFactory.shutdownInternalTracer();
mPeerConnectionFactory.dispose();
}
@Override
protected void onResume() {
super.onResume();
// mVideoCapturer.startCapture(VIDEO_RESOLUTION_WIDTH, VIDEO_RESOLUTION_HEIGHT, VIDEO_FPS);
// //Start creating the PeerConnection
// //By default, create the PeerConnection as soon as local video is started
// if (mPeerConnection == null) {
// mPeerConnection = createPeerConnection();
// }
// doStartCall();
}
/**
* Create the PeerConnection
*
* @return
*/
public PeerConnection createPeerConnection() {
LinkedList<PeerConnection.IceServer> iceServers = new LinkedList<PeerConnection.IceServer>();
// PeerConnection.IceServer ice_server =
// PeerConnection.IceServer.builder("turn:xxxx:3478")
// .setPassword("xxx")
// .setUsername("xxx")
// .createIceServer();
PeerConnection.IceServer ice_server = PeerConnection
.IceServer
.builder("stun:stun.l.google.com:19302")
.createIceServer();
iceServers.add(ice_server);
PeerConnection.RTCConfiguration rtcConfig = new PeerConnection.RTCConfiguration(iceServers);
//TCP candidate policy switch
rtcConfig.tcpCandidatePolicy = PeerConnection.TcpCandidatePolicy.DISABLED;
//Use relay (TURN) mode only, no direct P2P connection
//rtcConfig.iceTransportsType = PeerConnection.IceTransportsType.RELAY;
//rtcConfig.bundlePolicy = PeerConnection.BundlePolicy.MAXBUNDLE;
//rtcConfig.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE;
rtcConfig.continualGatheringPolicy = PeerConnection.ContinualGatheringPolicy.GATHER_CONTINUALLY;
// Use ECDSA encryption.
//rtcConfig.keyType = PeerConnection.KeyType.ECDSA;
// Enable DTLS for normal calls and disable for loopback calls.
rtcConfig.enableDtlsSrtp = true;
//rtcConfig.sdpSemantics = PeerConnection.SdpSemantics.UNIFIED_PLAN;
PeerConnection connection =
mPeerConnectionFactory.createPeerConnection(rtcConfig,
mPeerConnectionObserver);
if (connection == null) {
return null;
}
List<String> mediaStreamLabels = Collections.singletonList("ARDAMS");
connection.addTrack(mVideoTrack, mediaStreamLabels);
connection.addTrack(mAudioTrack, mediaStreamLabels);
connection.setAudioPlayout(true);
return connection;
}
/**
* Callbacks for the created PeerConnection
*/
private PeerConnection.Observer mPeerConnectionObserver = new PeerConnection.Observer() {
@Override
public void onSignalingChange(PeerConnection.SignalingState signalingState) {
}
@Override
public void onIceConnectionChange(PeerConnection.IceConnectionState iceConnectionState) {
}
@Override
public void onIceConnectionReceivingChange(boolean b) {
}
@Override
public void onIceGatheringChange(PeerConnection.IceGatheringState iceGatheringState) {
}
/**
* Called when a Candidate for this device is gathered
* @param iceCandidate
*/
@Override
public void onIceCandidate(IceCandidate iceCandidate) {
JSONObject message = new JSONObject();
try {
message.put("type", "candidate");
message.put("label", iceCandidate.sdpMLineIndex);
message.put("id", iceCandidate.sdpMid);
message.put("candidate", iceCandidate.sdp);
Log.i(TAG, "candidate:" + message.toString());
runOnUiThread(new Runnable() {
@Override
public void run() {
count++;
if (count == 1) {
tv_candicate1.setText(message.toString());
} else if (count == 2) {
tv_candicate2.setText(message.toString());
} else if (count == 3) {
tv_candicate3.setText(message.toString());
} else if (count == 4) {
tv_candicate4.setText(message.toString());
} else if (count == 5) {
tv_candicate5.setText(message.toString());
} else if (count == 6) {
tv_candicate6.setText(message.toString());
}
}
});
} catch (JSONException e) {
e.printStackTrace();
}
}
@Override
public void onIceCandidatesRemoved(IceCandidate[] iceCandidates) {
if (iceCandidates != null) {
for (int i = 0; i < iceCandidates.length; i++) {
}
mPeerConnection.removeIceCandidates(iceCandidates);
}
}
@Override
public void onAddStream(MediaStream mediaStream) {
// VideoTrack remoteVideoTrack = mediaStream.videoTracks.get(0);
// remoteVideoTrack.setEnabled(true);
// remoteVideoTrack.addSink(mRemoteSurfaceView);
// VideoTrack videoTrack = mediaStream.videoTracks.get(0);
// videoTrack.setEnabled(true);
// videoTrack.addSink(mRemoteSurfaceView);
}
@Override
public void onRemoveStream(MediaStream mediaStream) {
}
@Override
public void onDataChannel(DataChannel dataChannel) {
}
@Override
public void onRenegotiationNeeded() {
}
/**
* Called when the peer's video is received
* @param rtpReceiver
* @param mediaStreams
*/
@Override
public void onAddTrack(RtpReceiver rtpReceiver, MediaStream[] mediaStreams) {
MediaStreamTrack track = rtpReceiver.track();
if (track instanceof VideoTrack) {
Log.i(TAG, "有视频");
VideoTrack remoteVideoTrack = (VideoTrack) track;
remoteVideoTrack.setEnabled(true);
remoteVideoTrack.addSink(mRemoteSurfaceView);
}
}
};
@Override
protected void onPause() {
super.onPause();
// try {
// mVideoCapturer.stopCapture();
// } catch (InterruptedException e) {
// e.printStackTrace();
// }
}
/**
* Start the call
*/
public void doStartCall() {
if (mPeerConnection == null) {
mPeerConnection = createPeerConnection();
}
MediaConstraints mediaConstraints = new MediaConstraints();
mediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
mediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
mediaConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
mPeerConnection.createOffer(new SimpleSdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
mPeerConnection.setLocalDescription(new SimpleSdpObserver() {
@Override
public void onSetSuccess() {
runOnUiThread(new Runnable() {
@Override
public void run() {
et_offer.setText(sessionDescription.description);
}
});
}
@Override
public void onSetFailure(String msg) {
}
}, sessionDescription);
}
}, mediaConstraints);
}
public static class SimpleSdpObserver implements SdpObserver {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
}
@Override
public void onSetSuccess() {
}
@Override
public void onCreateFailure(String msg) {
}
@Override
public void onSetFailure(String msg) {
}
}
private void setSpeakerphoneOn(boolean on) {
if (on) {
audioManager.setSpeakerphoneOn(true);
} else {
audioManager.setSpeakerphoneOn(false);//Turn off the speakerphone
//Route audio to the earpiece and set the audio mode to "in call"
audioManager.setMode(AudioManager.MODE_IN_CALL);
}
}
8. You may have noticed that the CandidateBean class is still missing. Let's add it; the code is as follows
public class CandidateBean {
    private String type;
    private int label;
    private String id;
    private String candidate;

    public String getType() { return type; }
    public void setType(String type) { this.type = type; }
    public int getLabel() { return label; }
    public void setLabel(int label) { this.label = label; }
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getCandidate() { return candidate; }
    public void setCandidate(String candidate) { this.candidate = candidate; }
}
9. Enable Java 8 support under the android node of the app module's build.gradle
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
10. There should be no errors at this point. Finally, call init() at the point where all the runtime permissions have been granted, which is where the TODO sat in step 6.
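For clarity, here is a minimal sketch of what the request callback from step 6 looks like once init() is wired in (only the allGranted branch changes):

.request(new RequestCallback() {
    @Override
    public void onResult(boolean allGranted, List<String> grantedList, List<String> deniedList) {
        if (allGranted) {
            //All permissions granted: safe to start the camera and the WebRTC setup
            init();
        } else {
            Toast.makeText(MainActivity.this, "You denied the following permissions: " + deniedList, Toast.LENGTH_SHORT).show();
        }
    }
});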
11. That is the end of the code. When we run it, one device calls doStartCall() and the other does not; the device that calls it is the caller. If you use my demo as-is, without a signaling server, pay attention to two points. First, the order of operations: the caller generates its Candidates and offer SDP and sends them to the called party, and the called party later sends back its own Candidates and answer SDP; on the receiving side, always set the SDP first and only then add the Candidates. Second, when you send the SDP to the other party through a chat tool, make sure it ends with a carriage return; if the trailing newline is missing, add one, otherwise an error will be reported.
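One simple way to make a device the caller (my suggestion, not something enforced by the demo) is to uncomment the doStartCall() line at the end of init() on that device only, and leave it commented out on the other one:

//At the end of init(), on the calling device only:
mVideoCapturer.startCapture(VIDEO_RESOLUTION_WIDTH, VIDEO_RESOLUTION_HEIGHT, VIDEO_FPS);
if (mPeerConnection == null) {
    mPeerConnection = createPeerConnection();
}
doStartCall(); //Generates the offer SDP and puts it into et_offer for you to copy out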
At this point our WebRTC point-to-point video call setup is complete. Here we use a STUN server provided by Google. If you want to take this to a production environment, two things need to change: first, replace the manual signaling with your own channel, for example chat over a long connection; second, configure a relay (TURN) server in the IceServer list.
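As a hint for that second change, the demo already contains the shape of it in the commented-out block inside createPeerConnection(); a sketch with placeholder values (your own TURN address, username and password would go here) looks like this:

//Placeholder values; replace with your own TURN server and credentials
PeerConnection.IceServer turnServer = PeerConnection.IceServer
        .builder("turn:your.turn.server:3478")
        .setUsername("username")
        .setPassword("password")
        .createIceServer();
iceServers.add(turnServer);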
If you have any comments or questions, please leave a message in the comments section.