### Live broadcast code

- Real Time Streaming Protocol (RTSP): defines how a one-to-many application can efficiently transmit multimedia data over an IP network.
- Real-time Transport Protocol (RTP): a transport protocol for multimedia data streams. It specifies the standard packet format for delivering audio and video over the Internet (see the header sketch after this list). RTP is commonly used in streaming media systems (together with RTCP), in video conferencing, and in push-to-talk systems (together with H.323 or SIP), making it a technical foundation of the IP telephony industry.
- Real-time Transport Control Protocol (RTCP): a companion to RTP that provides out-of-band control for an RTP media stream. While RTP packages and delivers the multimedia data, RTCP periodically exchanges control data between the participants in a streaming session. Its main function is to provide feedback on the quality of service (QoS) RTP is delivering, collecting statistics on the media connection such as bytes sent, packets sent, packets lost, and one-way and round-trip network latency. Applications can use this information to improve their quality of service, for example by limiting traffic or switching to a codec with a lower compression ratio.
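To make RTP's "standard packet format" concrete, here is a minimal sketch (not from the original articles) of parsing the 12-byte fixed RTP header defined by RFC 3550; the field layout is the standard one, while the `RTPHeader` type itself is illustrative.

```swift
// Fixed RTP header (RFC 3550): V(2) P(1) X(1) CC(4) | M(1) PT(7) |
// sequence number (16) | timestamp (32) | SSRC (32).
struct RTPHeader {
    let version: UInt8        // protocol version, should be 2
    let payloadType: UInt8    // identifies the carried codec (e.g. an H.264 or AAC payload)
    let sequenceNumber: UInt16
    let timestamp: UInt32
    let ssrc: UInt32          // synchronization source identifier

    init?(packet: [UInt8]) {
        guard packet.count >= 12 else { return nil }
        version        = packet[0] >> 6
        payloadType    = packet[1] & 0x7F
        sequenceNumber = UInt16(packet[2]) << 8 | UInt16(packet[3])
        timestamp      = packet[4...7].reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
        ssrc           = packet[8...11].reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
    }
}
```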
#### Procedure

To capture live video on the client and push it over RTMP, perform the following steps:

1. Capture the audio and video;
2. Encode the video as H.264 and the audio as AAC;
3. Package the encoded audio and video data into FLV;
4. Establish the RTMP connection and push the stream to the server.

There are two options for encoding the raw video: one is to use the third-party library FFmpeg; the other is to use iOS's AVAssetWriter, or VTCompressionSession from the VideoToolbox framework. FFmpeg is widely used because it is cross-platform and feature-rich. Encoding with AVAssetWriter requires writing the video to a local file, then reading the file back and packaging it by monitoring changes to the file's contents in real time. Starting with iOS 8, VideoToolbox exposes hardware encoding, which can be driven through VTCompressionSession, as sketched below.
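As a concrete illustration of the VideoToolbox route, here is a minimal sketch (assuming iOS 8+) of creating and tuning a hardware H.264 encoder session; the 1280x720 size, the property values, and the empty output callback are placeholder choices, not settings from the original post.

```swift
import VideoToolbox

// Minimal VTCompressionSession setup for hardware H.264 encoding.
var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1280, height: 720,                 // placeholder capture size
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: { _, _, status, _, sampleBuffer in
        // Each encoded frame arrives here as a CMSampleBuffer; extract the
        // NAL units (and SPS/PPS on keyframes) and hand them to the FLV packager.
        guard status == noErr, sampleBuffer != nil else { return }
    },
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    // Typical live-streaming tuning: real-time mode, baseline profile,
    // and one keyframe per second at 25 fps.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Baseline_AutoLevel)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_MaxKeyFrameInterval,
                         value: 25 as CFNumber)
    VTCompressionSessionPrepareToEncodeFrames(session)
    // Feed each captured CVPixelBuffer with VTCompressionSessionEncodeFrame(...).
}
```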
#### H.264 encoding and AAC encoding
- H.264 encoding: I still remember an earlier interview where H.264 encoding came up and I didn't know it at the time (-_-'). H.264 is a high-compression coding standard. How does it compress? Video is typically captured at 25 frames per second, i.e., 25 pictures per second, but in practice consecutive pictures differ very little, so the compression approach is to use an algorithm that stores only the parts of each picture that changed, which makes the video file much smaller. A few articles for further study: "Understanding H.264 coding", "iOS8 system H264 video hardware codec description", and "H.264 hard coding code".
- AAC encoding: see "iOS audio AAC video H264 coding push stream best solution". A minimal AudioConverter sketch follows this list.
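For the AAC side, here is a minimal sketch using AudioToolbox's AudioConverter: describe the captured PCM and the desired AAC output, then create the converter. The 44.1 kHz, stereo, 16-bit input format is an assumption; match it to your capture session.

```swift
import AudioToolbox

// Input: interleaved 16-bit stereo PCM at 44.1 kHz (assumed capture format).
var inFormat = AudioStreamBasicDescription(
    mSampleRate: 44100,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
    mBytesPerPacket: 4, mFramesPerPacket: 1, mBytesPerFrame: 4,
    mChannelsPerFrame: 2, mBitsPerChannel: 16, mReserved: 0)

// Output: AAC; packet sizes are variable, and one AAC packet holds 1024 frames.
var outFormat = AudioStreamBasicDescription(
    mSampleRate: 44100,
    mFormatID: kAudioFormatMPEG4AAC,
    mFormatFlags: 0,
    mBytesPerPacket: 0, mFramesPerPacket: 1024, mBytesPerFrame: 0,
    mChannelsPerFrame: 2, mBitsPerChannel: 0, mReserved: 0)

var converter: AudioConverterRef?
let status = AudioConverterNew(&inFormat, &outFormat, &converter)
// On success, feed PCM through AudioConverterFillComplexBuffer(...) and
// collect the AAC packets for FLV packaging.
```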
#### Server setup

Having said so much, we still need a server. Here are two guides for setting up a streaming media server, which you can follow to build your own server to push streams to. Because their steps and screenshots are quite detailed, I will not repeat them here. Portals: "iOS video push stream based on RTMP and HLS - build Nginx streaming media server (3)" and "iOS mobile phone live Demo technology introduction". What we need to know is that most live broadcast applications today use the RTMP protocol, and some use HLS. RTMP's default port is 1935 and it runs over TCP. You also need to understand the FLV packaging format.

#### Pull stream: live playback

The general architecture for a live test implementation is roughly (an open-source push stream framework: Tencent, Youku, etc.) + (Nginx + RTMP) + (ijkplayer). To play our live broadcast, we must decompress the received data and restore the original data: decoding turns H.264 into YUV and AAC into PCM. Decoding can be done in software or hardware. Soft decoding uses CPU resources to decompress the data, typically with FFmpeg. For hard decoding on iOS, the VideoToolbox framework can be used (it is only available on iOS 8.0 or later); a decoder-session sketch follows below. To make implementation easier, third-party SDKs are generally adopted: NetEase Cloud Live, Storm Cloud Live, Tencent Cloud Live, Sina Cloud, VideoCore, and Bilibili's ijkplayer. There are a number of SDKs available to help implement playback and push streams.
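For the hard-decoding path just mentioned, here is a minimal VideoToolbox sketch (assuming iOS 8+): build a format description from the stream's SPS/PPS parameter sets, then create a decoder session that outputs YUV (NV12) pixel buffers. The `sps`/`pps` arrays are hypothetical inputs parsed out of the H.264 stream.

```swift
import VideoToolbox

// Create a VTDecompressionSession from H.264 SPS/PPS parameter sets.
func makeDecodeSession(sps: [UInt8], pps: [UInt8]) -> VTDecompressionSession? {
    var formatDesc: CMFormatDescription?

    // Build a format description (AVCC stream with 4-byte NAL length headers).
    let status: OSStatus = sps.withUnsafeBufferPointer { spsBuf in
        pps.withUnsafeBufferPointer { ppsBuf in
            var pointers: [UnsafePointer<UInt8>] = [spsBuf.baseAddress!, ppsBuf.baseAddress!]
            var sizes = [sps.count, pps.count]
            return CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: &pointers,
                parameterSetSizes: &sizes,
                nalUnitHeaderLength: 4,
                formatDescriptionOut: &formatDesc)
        }
    }
    guard status == noErr, let desc = formatDesc else { return nil }

    // Ask for NV12 output, i.e. the YUV data mentioned above.
    let attrs = [kCVPixelBufferPixelFormatTypeKey as String:
                     kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] as CFDictionary
    var session: VTDecompressionSession?
    VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: desc,
        decoderSpecification: nil,
        imageBufferAttributes: attrs,
        outputCallback: nil,   // decoded frames come back via a per-frame output handler
        decompressionSessionOut: &session)
    return session
}
// Then decode each frame with VTDecompressionSessionDecodeFrame(...).
```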
#### Noun introduction

- HLS: HTTP Live Streaming. HLS is part of Apple's QuickTime X and iPhone software systems. It works by splitting the whole stream into small HTTP-based files and downloading them a few at a time. While the media stream is playing, the client can choose to download the same resource from several different alternate sources at different rates, allowing the streaming session to adapt to varying data rates. When starting a streaming session, the client downloads an extended M3U (M3U8) playlist file containing metadata to find the available media streams; a playback sketch follows below. Further reading: "How to use iOS to realize the Legend of Miyue live broadcast, on-demand and caching? - HTTP Live Streaming (HLS) (1)" and "(2)".
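Since iOS plays HLS natively, an M3U8 playlist can be handed straight to AVPlayer; a minimal sketch, with a placeholder URL and frame size:

```swift
import AVFoundation
import UIKit

// AVPlayer understands M3U8 playlists natively and handles the
// rate-adaptive segment downloading described above.
let url = URL(string: "https://example.com/live/stream.m3u8")!   // placeholder URL
let player = AVPlayer(url: url)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = CGRect(x: 0, y: 0, width: 375, height: 211)
// Add playerLayer as a sublayer of some view's layer, then start playback:
player.play()
```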
#### Postscript

I originally wanted to write more, but after reading a Zhihu article I felt I had really gained a lot of knowledge, so here is the link instead: "How to build a complete video live broadcasting system?" I am going to go study it carefully myself, and I will update this post later as my study and research progress. Please indicate the source and website when reprinting.