Since its launch in June 2016, Qiniu Cloud Live has helped customers troubleshoot all kinds of problems: live stream stuttering, mosaic artifacts, corrupted frames, black screens, noise, audio/video out of sync, and so on. Some of these are caused by the network, some by the way developers use the SDK, some by misconfigured parameters, and of course some are issues in the SDK itself.

To sum up, if developers have a deeper understanding of the fundamentals of live streaming and master some basic troubleshooting methods, many problems can be resolved quickly by themselves, or even prevented in the first place.

Therefore, following our earlier series on live streaming technology, we are launching a new series, "Troubleshooting Live Streaming Problems". In it we will share our experience from helping customers solve live streaming issues, together with some audio/video development experience, background knowledge, and optimization tips, in the hope that they are useful to developers working in this field.


Topics covered in this series include, but are not limited to, the following:

  • Playback failure

  • Live stream stuttering

  • Slow first frame (slow startup)

  • High latency

  • Audio/video out of sync

  • Severe mosaic artifacts (pixelation)

  • Black screen, corrupted screen, or green screen during playback

  • Noise, static, or echo during playback

  • Inaccurate seeking during on-demand (VOD) playback

  • Device overheating during live streaming

  • Other issues (to be continued)

This is the fourth article in the series "Troubleshooting Live Streaming Problems", and here we take a look at live streaming latency.


Measuring latency

Generally, the simplest way to measure latency is to point both the push (streaming) end and the playback end at the same clock: subtract the clock time shown in the played-back picture from the current time on that clock, and you get a rough figure for the end-to-end live latency.
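As a rough illustration, here is a minimal Python sketch of that calculation; the timestamps are made-up values, not taken from any real stream:

```python
from datetime import datetime

def estimate_latency(clock_time_shown_in_player: datetime, wall_clock_now: datetime) -> float:
    """Rough live latency estimate, assuming both ends watch the same clock.

    clock_time_shown_in_player: the clock time visible in the rendered frame
    wall_clock_now: the current time read from the same (synchronized) clock
    """
    return (wall_clock_now - clock_time_shown_in_player).total_seconds()

# Example: the frame on screen shows 12:00:01.200, the real clock reads 12:00:03.700
shown = datetime(2016, 6, 1, 12, 0, 1, 200_000)
now = datetime(2016, 6, 1, 12, 0, 3, 700_000)
print(f"approximate latency: {estimate_latency(shown, now):.1f} s")  # -> 2.5 s
```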

Analyzing high latency

First, let’s take a look at the modules that can cause latency:

  1. Image processing delay, e.g. cropping, beauty filters, and special effects
  2. Video encoding/decoding delay
  3. Network transmission delay
  4. Buffers in business code

In general, image processing, data copying, and encoding/decoding delays are all at the millisecond level. The two sources that really produce significant latency are network transmission over the Internet and the buffers in business code.

Network transmission delay

When data travels across the network from one node to another, forwarded through multiple levels of servers, a certain physical delay is unavoidable. The table below gives theoretical transmission times over an optical fiber network (actual delays are often much larger, because bandwidth limits and network jitter also come into play):

As the table shows, the closer the physical distance between the player end and the streaming end (or the edge server node), the smaller the delay.
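For intuition, here is a small Python sketch of the theoretical one-way propagation delay over fiber, assuming light travels at roughly 200,000 km/s inside the fiber (real-world latency is much higher once routing, bandwidth limits, and jitter are added):

```python
# Rough one-way propagation delay over optical fiber (theory only).
SPEED_IN_FIBER_KM_PER_S = 200_000  # light travels at roughly 2/3 c inside fiber

def propagation_delay_ms(distance_km: float) -> float:
    return distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

for d in (100, 1000, 5000, 10000):
    print(f"{d:>6} km -> {propagation_delay_ms(d):6.2f} ms one-way")
```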

Buffers in business code

Buffers in business code exist mainly on the push (streaming) side and on the player side. For a 30 FPS video stream, every 30 frames of data sitting in a buffer adds roughly 1 s of latency. So how does data accumulate in these buffers?
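A quick sketch of that relationship (the numbers are illustrative only):

```python
def buffer_latency_s(buffered_frames: int, fps: float = 30.0) -> float:
    """Latency contributed by buffered video data: frames waiting / frame rate."""
    return buffered_frames / fps

print(buffer_latency_s(30))   # 1.0 s (30 buffered frames at 30 FPS)
print(buffer_latency_s(90))   # 3.0 s
```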

>>>> How does data "accumulate" on the push (streaming) end?

Capture -> Encode -> Send -> [Server]. When the network jitters, the "send" step slows down and becomes partially blocked, so data "accumulates" in the send buffer on the push end.
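The toy simulation below illustrates this accumulation; the frame rates and "uplink capacity" numbers are made up for illustration and do not reflect any particular SDK:

```python
from collections import deque

send_buffer = deque()
fps = 30
produced_per_s = fps                            # encoder keeps producing 30 frames/s
uplink_frames_per_s = [30, 30, 10, 10, 30, 30]  # network jitter: 2 s of congestion

for second, capacity in enumerate(uplink_frames_per_s):
    for _ in range(produced_per_s):
        send_buffer.append(object())            # newly encoded frame enters the buffer
    for _ in range(min(capacity, len(send_buffer))):
        send_buffer.popleft()                   # frames the network managed to send
    print(f"t={second}s backlog={len(send_buffer)} frames "
          f"(~{len(send_buffer)/fps:.1f} s extra latency)")
```

Note that once the backlog has built up, it stays there even after the network recovers, unless the push end sends faster than it produces.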

>>>> How does data "accumulate" on the player end?

[Server] -> Receive -> Decode -> Render. When the network jitters, the server cannot deliver data to the player "in time"; because TCP is a reliable protocol, all of that data is held back by the server, and once the network recovers it is delivered to the player in a burst. This data then passively "accumulates" in the player's receive buffer.

>>>> How can the accumulated latency in these business buffers be eliminated?

On the push end, the send buffer can eliminate this accumulated delay by sending faster than real time once the network has recovered.

On the player end, the receive buffer can be drained quickly by dropping frames or playing back slightly faster, which eliminates the accumulated delay.
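Here is a minimal sketch of one possible catch-up policy on the player side, dropping the oldest frames once the buffered backlog exceeds a threshold; the function and threshold are hypothetical, not the behaviour of any specific player:

```python
from collections import deque

def drain_receive_buffer(buffer: deque, fps: float = 30.0, max_latency_s: float = 0.5) -> int:
    """Drop the oldest buffered frames until at most max_latency_s of data remains."""
    max_frames = int(max_latency_s * fps)
    dropped = 0
    while len(buffer) > max_frames:
        buffer.popleft()          # discard the oldest frame to catch up to "live"
        dropped += 1
    return dropped

buf = deque(range(120))           # 120 buffered frames ~= 4 s behind live at 30 FPS
print(drain_receive_buffer(buf))  # drops 105 frames, leaving ~0.5 s of buffer
```

Playing back slightly faster (e.g. 1.1x) achieves the same catch-up effect without visibly skipping frames, at the cost of draining the buffer more slowly.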

Protocol latency

Generally, there are three common live streaming protocols: RTMP, HTTP-FLV, and HLS. The latency of RTMP/HTTP-FLV is typically 1 to 3 s, while HLS latency is considerably larger, so most latency-sensitive live streaming applications choose RTMP/HTTP-FLV. These protocols are all carried over TCP, and several characteristics of TCP make its latency higher than that of private UDP-based protocols, including the following (a rough sketch of the extra delay follows this list):

  • Three-way handshake to establish a connection
  • ACK mechanism
  • Packet retransmission
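As a back-of-the-envelope illustration of how these features translate into extra delay, here is a simplified Python sketch; the one-RTT handshake cost and the 200 ms retransmission timeout are assumptions used for illustration, not exact protocol figures:

```python
# Simplified view of the extra delay TCP can add compared with a UDP-based protocol.
def tcp_extra_delay_ms(rtt_ms: float, lost_packets: int = 0, rto_ms: float = 200.0) -> float:
    handshake = rtt_ms                   # three-way handshake costs ~1 RTT before data flows
    retransmits = lost_packets * rto_ms  # each loss waits at least one retransmission timeout
    return handshake + retransmits

print(tcp_extra_delay_ms(rtt_ms=50))                   # 50 ms just to establish the connection
print(tcp_extra_delay_ms(rtt_ms=50, lost_packets=2))   # 450 ms once packets are lost
```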

Therefore, to fundamentally solve the live streaming latency problem, you still need to switch to a private UDP-based protocol for data transmission.

Summary

That wraps up our investigation of high playback latency. In the next article, we will discuss audio/video synchronization problems.


About the author: Lu Jun @ Qiniu Cloud. If you have questions you are interested in that are not on the list above, you can also write to [email protected], or follow @lu_Jun on Sina Weibo or @jhuster on WeChat to get the latest articles and news.