In the previous article on WebRTC video calls, we covered the basics of WebRTC and how to implement a P2P call. But in a real project, how do we troubleshoot problems and monitor the quality of audio and video calls? Commercial WebRTC SDKs show nice monitoring charts no matter where you access them from. How does that work?

In fact, the underlying RTCPeerConnection provides a getStats interface that returns a snapshot of the current state of the RTCPeerConnection. Its content is very rich: from it you can read or calculate the resolution, bit rate, frame rate, packet loss rate and other information. Chrome also integrates this nicely in its webrtc-internals page, so you can see these core metrics without writing any code, including getUserMedia calls and RTCPeerConnection state changes.

Chrome’s WebRTC Internals feature is very powerful, so here’s a quick guide to its basic features and how to use it to troubleshoot problems.

1. Overview

Open Chrome and enter chrome://webrtc-internals/ in the address bar to open the debugging tool. A page similar to the one below is displayed.

If your project does not use WebRTC, you can use this Demo to simulate an audio and video call locally.

1.1 Create Dump

This saves the logs of this page to a file (the file contains all of the information described in the sections below), so you can archive a problem for later or hand it to someone else to analyze.


1.2 Read Stats From



There are two standards by which WebRTC reads the stats:

  • Standardized: conforms to the current W3C standard; the corresponding code calls are Promise-based
  • Legacy Non-Standard: the deprecated, Google-defined old standard; the corresponding code calls are callback-based
// Standardized
pc.getStats().then(stats => console.log(stats))

// Legacy Non-Standard
pc.getStats(stats => console.log(stats))

The two standards return quite different data structures; this article uses the new Promise-based standard.
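
The Promise-based call resolves to an RTCStatsReport, a Map-like object whose values are individual stat records distinguished by a type field. A minimal sketch of dumping the record types discussed later, assuming pc is an existing RTCPeerConnection:

// Sketch: iterate a Promise-based RTCStatsReport and log the record
// types discussed below.
const stats = await pc.getStats();
stats.forEach((report) => {
  if (['inbound-rtp', 'outbound-rtp', 'remote-inbound-rtp'].includes(report.type)) {
    console.log(report.type, report.kind, report);
  }
});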

1.3 GetUserMedia Requests

getUserMedia is the interface through which the browser obtains media streams from the camera and microphone. The GetUserMedia Requests tab shows the recent calls to this API along with their parameters. Try calling navigator.mediaDevices.getUserMedia to start local video and watch a record appear.

You can see that each record contains:

  • The domain that made the call
  • The time of the call
  • The audio constraints
  • The video constraints
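
As a concrete example, a call like the following minimal sketch (the exact constraints are up to your application) will show up in the tab together with its audio and video constraints:

// Sketch: request camera + microphone; this call and its constraints
// will appear as a record under GetUserMedia Requests.
const stream = await navigator.mediaDevices.getUserMedia({
  audio: true,
  video: { width: 1280, height: 720 },
});
document.querySelector('video').srcObject = stream;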

2. RTCPeerConnection monitoring information

You can see that in addition to the GetUserMedia Requests tab there are other tabs, each corresponding to one PeerConnection object.

When a tab is expanded, it is divided into four parts:

2.1 Construction parameters of the PeerConnection

In this section you can see the parameters used to construct the PeerConnection. They are the result of merging the parameters you passed in with the default parameters. For details on the PeerConnection constructor, see MDN.

2.2 Operations and Events of PeerConnection

Some operations and callback events on PeerConnection are recorded in chronological order

2.2.1 Operations

  • createOffer, createAnswer: generate the offer and answer; click to expand to see the call parameters (also merged with the defaults)
  • setLocalDescription, setRemoteDescription: set the local SDP and the remote SDP
  • addIceCandidate: adds a candidate received from the peer to the PeerConnection
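
These entries correspond one-to-one to calls in your signaling code. A minimal sketch of the caller side, where signaling.send and signaling.on stand in for an application-defined signaling channel:

// Sketch of caller-side negotiation; each call below produces a
// matching entry in the webrtc-internals event log.
const pc = new RTCPeerConnection();
const offer = await pc.createOffer();          // logged as createOffer
await pc.setLocalDescription(offer);           // logged as setLocalDescription
signaling.send({ type: 'offer', sdp: offer.sdp });

signaling.on('answer', async (answer) => {
  await pc.setRemoteDescription(answer);       // logged as setRemoteDescription
});
signaling.on('candidate', async (candidate) => {
  await pc.addIceCandidate(candidate);         // logged as addIceCandidate
});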

2.2.2 Callback Events

  • createOfferOnSuccess, createAnswerOnSuccess: since createOffer and createAnswer are asynchronous, the results of successful calls are shown here
  • setLocalDescriptionOnSuccess, setRemoteDescriptionOnSuccess: likewise, setLocalDescription and setRemoteDescription are asynchronous; these events indicate they completed successfully
  • signalingstatechange: the signaling state changes as a result of calling setLocalDescription, setRemoteDescription and related APIs. The detailed transitions can be seen in the figure below (the signaling state matters: for example, a remote answer cannot be applied while the connection is in the ‘stable’ state)

The signaling state transitions can be a bit convoluted, but with the diagram below, problems with the order or timing of SDP operations become easy to spot.

  • icecandidate: a local candidate has been gathered. Gathering is usually done automatically by WebRTC after setLocalDescription, and each candidate is then surfaced to the JS layer through this callback
  • iceconnectionstatechange: the ICE connection state has changed. For details, see MDN
  • connectionstatechange: the overall connection state of the PeerConnection has changed. For details, see MDN
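
If you want the same visibility in your own logs, you can listen for these events directly; a minimal sketch:

// Sketch: mirror the webrtc-internals event log in the console.
pc.addEventListener('signalingstatechange', () =>
  console.log('signaling:', pc.signalingState));
pc.addEventListener('iceconnectionstatechange', () =>
  console.log('ice:', pc.iceConnectionState));
pc.addEventListener('connectionstatechange', () =>
  console.log('connection:', pc.connectionState));
pc.addEventListener('icecandidate', (e) =>
  console.log('local candidate:', e.candidate && e.candidate.candidate));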

2.3 Streaming Data (Numerical format & Chart format)

This is where you can actually see the upstream and downstream media data. Pay special attention to the following report types, four entries in total since audio and video each have one (the numeric tables in the upper part and the charts in the lower part are two views of the same underlying data):

  • inbound-rtp: downstream (received) data, for both audio and video
  • outbound-rtp: upstream (sent) data, for both audio and video

2.3.1 Audio outbound-rtp / remote-inbound-rtp

Comparing the two views again confirms that they are different representations of the same data source. From the numeric version you can see:

  • kind: audio
  • ssrc: 2840249774, corresponding to the m-line of the SDP
  • [codec]: opus (111, minptime=10;useinbandfec=1), the audio codec

From the chart version you can see:

  • bytesSent_in_bits/s: the uplink bit rate. For audio it is usually around 30 kbps (halved when muted). This value is not directly present in the getStats return value; it has to be calculated from two consecutive stats snapshots:

bit rate = (Δ bytesSent × 8) / (Δ timestamp)

  • packetsSent/s: packets sent per second, also calculated per interval
  • retransmittedPacketsSent: the number of packets that were retransmitted

In the latest versions of Chrome, the bytesSent_in_bits/s series is no longer available.
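
In that case you can compute it yourself; a minimal sketch, assuming pc is your RTCPeerConnection:

// Sketch: compute the outbound audio bit rate from two getStats snapshots.
let prev = null;
setInterval(async () => {
  const stats = await pc.getStats();
  stats.forEach((report) => {
    if (report.type === 'outbound-rtp' && report.kind === 'audio') {
      if (prev) {
        const bits = (report.bytesSent - prev.bytesSent) * 8;
        const seconds = (report.timestamp - prev.timestamp) / 1000; // timestamps are in ms
        console.log('audio bitrate:', Math.round(bits / seconds), 'bps');
      }
      prev = report;
    }
  });
}, 1000);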

If you see upstream packet loss, look at the remote-inbound-rtp entry that matches the outbound-rtp: it represents what the remote receiver observed. It also comes in the two display forms; only the numeric version is shown here:

  • packetsLost: the cumulative number of lost packets. The packet loss rate is the number of lost packets divided by the total number of packets sent, calculated per interval:
// packetsLost comes from remote-inbound-rtp; packetsSent from the matching outbound-rtp
const deltaLost = currentLost - prevLost;
const deltaSent = (currentPacketsSent + currentLost) - (prevPacketsSent + prevLost);
const lostRate = deltaLost / deltaSent;
  • roundTripTime: the round-trip time, in seconds
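
A minimal sketch of pulling these fields out of a stats snapshot:

// Sketch: read remote-inbound-rtp to see what the receiver reports back.
const stats = await pc.getStats();
stats.forEach((report) => {
  if (report.type === 'remote-inbound-rtp' && report.kind === 'audio') {
    console.log('packetsLost:', report.packetsLost,
                'roundTripTime:', report.roundTripTime, 's');
  }
});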

2.3.2 Video outbound-rtp / remote-inbound-rtp

Video uplink is similar to audio. You can focus on the following fields:

ssrc, [codec], packetsSent and bytesSent_in_bits/s have the same meaning as in the audio statistics. In addition:

  • frameWidth: the width of the outgoing video
  • frameHeight: the height of the outgoing video

The packet loss rate and roundTripTime can likewise be calculated from remote-inbound-rtp, which shows the receiving status reported by the remote end.

2.3.3 Audio inbound-rtp and video inbound-rtp

These represent downstream data. They are broadly similar to outbound-rtp, except that they contain receiver-side fields such as packetsLost and audioLevel.

3. Specific problem analysis

In real applications, a call between two endpoints is usually not connected directly but is relayed through a server, so in some cases you will need the server team’s cooperation to troubleshoot.

3.1 The received picture is blurred

In a typical call, the video uplink carries the most data. Just after the call is set up you can see a clear ramp-up in its bit rate: when the connection is first established, WebRTC does not yet know the network conditions, so it releases the uplink bit rate cautiously to keep the video fluent. The receiver therefore sees a blurred picture at the very beginning; this is a normal phenomenon.

If the picture stays blurred during a call and never recovers, it may be because the sender’s network is poor, or because the capture resolution is very high while the RTC uplink bit rate is capped. To check for a cap, look for b=AS: lines in the SDP; or, if a sender-side restriction was applied, look for a transceiverModified event to determine whether a maxBitrate limit exists.
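
You can also check the sender-side cap in code; a minimal sketch using the standard RTCRtpSender API:

// Sketch: inspect a sender-side maxBitrate cap on the video sender.
const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'video');
if (sender) {
  const params = sender.getParameters();
  (params.encodings || []).forEach((enc, i) =>
    console.log('encoding', i, 'maxBitrate:', enc.maxBitrate));
}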

3.2 The pulled stream shows a black screen

First check whether the video inbound-rtp shows a non-zero downstream bit rate. If it does, investigate the rendering path. If it does not, check the sender’s uplink bit rate to rule out an abnormal sender; if the sender looks normal, the server side needs to investigate.
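
A minimal sketch of the first check, verifying that video data is arriving and being decoded:

// Sketch: confirm the video inbound-rtp is receiving and decoding frames.
const stats = await pc.getStats();
stats.forEach((report) => {
  if (report.type === 'inbound-rtp' && report.kind === 'video') {
    console.log('bytesReceived:', report.bytesReceived,
                'framesDecoded:', report.framesDecoded);
  }
});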

3.3 The pulled stream is silent

Look at the audioLevel and totalAudioEnergy of the audio inbound-rtp. audioLevel can be derived from totalAudioEnergy and lies in the range [0, 1]; normally it should be above 0.1.
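
A minimal sketch of the derivation between two snapshots (per the WebRTC stats spec, totalAudioEnergy accumulates the square of audioLevel over totalSamplesDuration):

// Sketch: derive the average audio level between two stats snapshots.
const deltaEnergy = currentTotalAudioEnergy - prevTotalAudioEnergy;
const deltaDuration = currentTotalSamplesDuration - prevTotalSamplesDuration;
const averageAudioLevel = Math.sqrt(deltaEnergy / deltaDuration); // in [0, 1]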

You may hit all sorts of problems in practice. For example: open the call Demo and start a call, and audioLevel looks normal; but type the following in the console before starting the call:

Array(50).fill(0).forEach(() => new AudioContext());

Then start the call again. What happens this time, and what causes the exception? You are welcome to discuss it in the comments.

4. How to debug in Firefox
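
Firefox can be instrumented with the webrtc-externals extension. Assuming Node.js is installed, the steps are: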

git clone https://github.com/fippo/webrtc-externals
cd webrtc-externals
sudo npm i web-ext -g
web-ext run

After running this, Firefox opens automatically and you can see a WebRTC plug-in on the right side. Once a call is running normally, you can use it much like Chrome’s Internals.



The above is just the tip of the iceberg of the WebRTC Internals debugging tools; using them effectively in a project comes through continuous practice.