With the development of 4G and 5G network technology and the popularity of smartphones, live streaming has gradually become a main communication mode of new media, widely used in social entertainment, product display, government affairs publicity, exhibition release, and other fields.
How can we ensure a clear, smooth, and real-time viewing and interactive experience for users across different cities and operators nationwide? This article shares the application of edge computing in live video scenarios. The core content is as follows:
- What is live streaming?
- The system architecture of live broadcasting
- Edge computing accelerates live streaming
- Summary
01 What Is Live Streaming?
According to the definition in the Dictionary of Radio and Television, live broadcast refers to a broadcast mode in which radio and television programs are synthesized and aired simultaneously: the post-production of the program is itself the broadcast of the program. Nothing is recorded in advance; production and broadcast are completed at the same time in the field or in the studio.
Live broadcast fully reflects the communication advantages of radio and television media:
- Through live broadcast, the latest news can be broadcast at any time to ensure the timeliness of news reports.
- The occurrence and development of news events can be reported synchronously.
- It conveys a strong sense of presence, achieving a good broadcast effect.
With the development of 4G and 5G network technology and the popularity of smartphones, online live streaming based on the Internet and streaming media technology has begun to develop. Relying on the Internet's extensive and convenient network resources, online live streaming encodes, compresses, and encapsulates audio and video signals, then transmits and plays them in real time over the Internet via streaming media protocols. Compared with radio and television broadcasting, online live streaming is more diverse and flexible in anchors' professional skills, forms of expression, and broadcast content, and offers more interactive, two-way communication with the audience. It has therefore gradually become a main communication mode of new media, widely used in social entertainment, product display, government affairs publicity, exhibition release, and other fields.
Live streaming can be divided by content and scenario into e-commerce, gaming, reality show, concert, and sports live streaming. According to the 48th Statistical Report on Internet Development in China released by CNNIC, as of June 2021, the number of live streaming users in China had reached 638 million, a year-on-year increase of 75.39 million, accounting for 63.1% of all Internet users. Benefiting from this broad user base, live streaming will continue to demonstrate its economic and social value.
02 Architecture of the Live Broadcast System
Infrastructure
The live broadcast infrastructure consists of three parts: the anchor side, the server side, and the audience side:
- Anchor end: video production source. The hardware device collects audio and video data, encodes, compresses, and encapsulates it, and streams it to the corresponding media service on the server over the network.
- Audience side: pulls the target video stream from the server's media service over the network, decodes it, and plays the audio and video with the local player.
- Server side: includes a series of media servers that provide unified access, management, and scheduling for both the control plane of live video (user authentication, room management, etc.) and the data plane (pushing and pulling of audio and video streams). Media services can also perform stream mixing, transcoding, content review, and recording of video streams according to the business requirements of the live streaming platform.
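As a minimal sketch of this three-tier data flow, the snippet below builds the push URL an anchor publishes to and the pull URL an audience player fetches from. The domain names, app name, and the RTMP/FLV URL scheme are illustrative assumptions, not the API of any specific platform.

```python
# Sketch of the anchor -> server -> audience addressing scheme.
# Domains and the "app/stream_key" layout are hypothetical examples.

def make_push_url(domain: str, app: str, stream_key: str) -> str:
    """Anchor side: the URL the push-stream SDK publishes to (RTMP here)."""
    return f"rtmp://{domain}/{app}/{stream_key}"

def make_pull_url(domain: str, app: str, stream_key: str, fmt: str = "flv") -> str:
    """Audience side: the URL the player pulls the stream from."""
    return f"https://{domain}/{app}/{stream_key}.{fmt}"

push = make_push_url("push.example.com", "live", "room1001")
pull = make_pull_url("pull.example.com", "live", "room1001")
print(push)  # rtmp://push.example.com/live/room1001
print(pull)  # https://pull.example.com/live/room1001.flv
```

Separating the push domain from the pull domain, as sketched here, is what lets the server side scale the ingest and distribution paths independently.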
Business architecture
Generally speaking, the live broadcast business architecture consists of the anchor side, media services, the audience side, and the live streaming business platform.
- Anchor side: by integrating the push-stream SDK into the live streaming app, it implements capture, encoding, stream pushing, and related value-added services.
- Audience side: by integrating the playback SDK into the live streaming app, it implements stream pulling, decoding, playback, and experience optimization.
- Live streaming business platform: anchors and audiences rely on the live streaming app to publish and subscribe to live content. For example, an anchor can create a room and start streaming only after registering with and being approved by the corresponding platform; audiences can then enter the anchor's room through the app to watch and interact in real time. The back end of the live streaming app is the business platform, which implements user registration and management, room management, and authentication and authorization.
- Media services: provide access, publishing, and forwarding of live streams, and perform stream mixing, content review, transcoding, and recording based on business requirements.
Commercial live streaming apps also include business modules such as bullet-screen comments, tipping, and product recommendation to provide a more engaging and novel live interaction experience.
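The control-plane flow described above — anchor approval, room creation, audience entry — can be sketched as follows. The class and method names are purely illustrative and do not correspond to any real platform API.

```python
# Toy sketch of the live streaming business platform's control plane:
# anchors must be approved before creating rooms; viewers join rooms.
# All names here are hypothetical.

class LivePlatform:
    def __init__(self):
        self.approved_anchors = set()
        self.rooms = {}  # room_id -> {"anchor": anchor_id, "viewers": set()}

    def approve_anchor(self, anchor_id: str) -> None:
        """Registration review: only approved anchors may broadcast."""
        self.approved_anchors.add(anchor_id)

    def create_room(self, anchor_id: str, room_id: str) -> None:
        if anchor_id not in self.approved_anchors:
            raise PermissionError("anchor not approved by the platform")
        self.rooms[room_id] = {"anchor": anchor_id, "viewers": set()}

    def enter_room(self, viewer_id: str, room_id: str) -> None:
        if room_id not in self.rooms:
            raise KeyError("room does not exist")
        self.rooms[room_id]["viewers"].add(viewer_id)

platform = LivePlatform()
platform.approve_anchor("anchor42")
platform.create_room("anchor42", "room1001")
platform.enter_room("viewerA", "room1001")
```

A real platform would back this with persistent storage and issue signed tokens for the push/pull URLs; the sketch only shows the authorization ordering the text describes.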
03 Edge Computing Accelerates Live Streaming
As the business grows, live streaming platforms need to serve users across different cities and operators nationwide, which raises the following problems:
- High concurrency bottleneck: It is difficult for the live broadcast center to carry concurrent requests of millions or tens of millions of users.
- Bandwidth bottleneck: The network resources of the live broadcast center cannot meet the requirements of video access and distribution for a large number of users.
- Inconsistent experience: Users in different places experience inconsistent network delays due to different physical distances.
Edge computing provides standard computing power and IT services closer to the edge of the user's network. On one hand, delay-sensitive services and traffic access can be deployed locally to reduce response latency for nearby users; on the other hand, processing traffic locally at the edge relieves the bottleneck at the center and increases overall service capacity.
Push-pull streaming acceleration
If all anchors' live streams are pushed to the live broadcast center and all audiences also pull their streams from it, the center inevitably comes under great business pressure. Meanwhile, the network link between anchors/viewers and the center involves long-distance transmission, whose bandwidth limitations and instability also degrade the end-user experience.
With edge instances provided by edge computing, a platform can build its own live network and deploy push/pull streaming capability closer to end users, so that live stream data is accessed and forwarded nearby and user latency is reduced. Meanwhile, the nodes hosting edge instances have high-quality operator dedicated-line egress, which effectively guarantees network transmission quality between users, edge nodes, and the live broadcast center, improving service stability.
- Push-stream acceleration: based on location affinity, users (anchors) are intelligently scheduled to the nearest edge computing node for the push-stream service, which reduces access response delay and improves the user experience.
- Pull-stream acceleration: based on location affinity, users (audiences) are intelligently scheduled to the nearest edge computing node for the pull-stream service. If the live stream is cached locally, it is served directly; if not, the node pulls it back from the source live broadcast center. This effectively reduces the user's pull latency and greatly saves backhaul bandwidth cost for the live broadcast center.
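The two mechanisms above can be sketched in a few lines: nearest-node scheduling by geographic affinity, and an edge cache that falls back to the source center on a miss. Node names, coordinates, and the distance-only scheduling policy are simplifying assumptions; real schedulers also weigh operator, node load, and link quality.

```python
# Sketch of location-affinity scheduling plus edge pull-stream caching.
# Node locations are assumed known as (lat, lon) pairs; values are examples.
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_node(user_loc, nodes):
    """Schedule a user (anchor or viewer) to the geographically nearest node."""
    return min(nodes, key=lambda name: haversine_km(user_loc, nodes[name]))

class EdgeNode:
    """Pull-stream path: serve from local cache, else go back to source once."""
    def __init__(self):
        self.cache = set()
    def pull(self, stream_id):
        if stream_id in self.cache:
            return "edge-cache"        # later viewers are served locally
        self.cache.add(stream_id)      # first viewer triggers back-to-source
        return "back-to-source"

nodes = {"beijing": (39.9, 116.4), "shanghai": (31.2, 121.5), "guangzhou": (23.1, 113.3)}
print(nearest_node((31.0, 121.0), nodes))  # shanghai

edge = EdgeNode()
print(edge.pull("room1001"))  # back-to-source
print(edge.pull("room1001"))  # edge-cache
```

Note how the cache makes backhaul cost per stream roughly constant per node regardless of the number of local viewers, which is the bandwidth saving the text describes.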
Media processing acceleration
Edge computing provides general-purpose computing power to process users' live video data locally: media services such as transcoding, slicing, and stream mixing can be deployed directly at the edge. On one hand, processing media data locally improves the response efficiency of interaction-related business; on the other hand, compressing and optimizing data at the source improves edge-to-center transmission efficiency and reduces cost. From the perspective of the overall system architecture, an edge computing architecture with geographically dispersed nodes enables distributed deployment of live streaming services, achieving higher concurrency and more stable service capability.
Consider implementing the following media service processing on edge nodes:
- Narrowband HD: compress the live stream while preserving perceived video quality, reducing backhaul bandwidth demand.
- Stream mixing: multiple video streams from users are mixed at the edge, reducing backhaul bandwidth demand.
- Repackaging: dynamically convert the streaming media encapsulation protocol for different terminal types (iOS, Android, and HTML5).
- Transcoding: convert video streams between different encoding formats.
- Super resolution: use algorithms to enhance the original picture quality, meeting users' demand for high-quality pictures.
- Dynamic resolution: dynamically switch between streams at different bit rates based on the user's real-time network conditions to ensure smooth playback.
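The dynamic-resolution item above can be sketched as picking the highest rung of a bitrate ladder that fits the viewer's measured bandwidth, with some headroom so short dips do not cause stalls. The ladder values and the 0.8 headroom factor are illustrative assumptions, not recommendations from any spec.

```python
# Sketch of dynamic-resolution selection against a hypothetical bitrate ladder.

LADDER = [  # (rendition name, required kbps) -- example values only
    ("1080p", 4000),
    ("720p", 2000),
    ("480p", 1000),
    ("360p", 500),
]

def pick_rendition(measured_kbps: float, headroom: float = 0.8) -> str:
    """Choose the highest rendition whose bitrate fits within the
    measured bandwidth discounted by a safety headroom factor."""
    budget = measured_kbps * headroom
    for name, need in LADDER:
        if need <= budget:
            return name
    return LADDER[-1][0]  # fall back to the lowest rung

print(pick_rendition(6000))  # 1080p  (6000 * 0.8 = 4800 >= 4000)
print(pick_rendition(1800))  # 480p   (1440 < 2000, but >= 1000)
print(pick_rendition(300))   # 360p   (below every rung; lowest fallback)
```

In practice the player re-measures throughput continuously and re-runs this decision per segment, which is what makes the switching "dynamic".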
It is worth mentioning that more and more interesting gameplay and special effects are emerging in live broadcast scenarios, such as the popular "Ant Ah Hey" and "Comic Style" video effects in Douyin, which greatly enrich users' interest and improve engagement. However, high-quality video effects also place higher resource demands on user terminals, and many mid-range and low-end devices are limited by hardware performance bottlenecks, making it difficult for users to get the intended effect and hurting the experience. Edge computing nodes provide general-purpose GPU computing power that can effectively assist terminals with high-quality video rendering, ensuring users get the expected live broadcast experience.
AI application acceleration
With the development and maturity of AI technology, more and more AI techniques are applied in live broadcast scenarios; for example, AI algorithms are used in content understanding, content creation, and content transmission to deliver a better and more innovative live experience.
From the perspective of computing power sources, the central cloud, edge nodes and user terminals can all provide computing power resources required by AI services:
- Central cloud: with large-scale, high-performance AI computing power, it can meet the resource needs of various AI business scenarios; however, limited by its distance from users, it suffers from lower timeliness when serving remote users.
- User terminals: some mid-range and high-end mobile devices can provide a certain amount of AI computing power. Terminal-side AI compute has the lowest latency and the best experience for the user, and costs the business side nothing in server resources. However, because terminal models are varied and performance is uneven, the business side must invest considerable resources in compatibility and adaptation, and the final effect is limited: the hardware often cannot achieve the expected result.
- Edge computing: provides general-purpose computing power closer to users. For example, the GPU cards are consistent with those in the data center, meeting the computing power needs of AI stream pushing or rendering, while proximity to users guarantees timely service response. It can thus balance the trade-offs between central and terminal-side AI compute in latency, compatibility, capacity, and cost.
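The three-way trade-off above can be sketched as a toy placement decision: run the AI task on the terminal if the device is capable enough, otherwise on an edge node if its round-trip time meets the latency budget, otherwise fall back to the central cloud. The capability units, field names, and thresholds are illustrative assumptions, not a real scheduler.

```python
# Toy sketch of choosing where to run an AI task in a live streaming
# scenario: terminal vs. edge node vs. central cloud. All numbers and
# names are hypothetical.

def place_ai_task(device_tops: float, task_tops: float,
                  edge_rtt_ms: float, max_rtt_ms: float) -> str:
    """Pick the cheapest placement that satisfies capability and latency."""
    if device_tops >= task_tops:
        return "terminal"       # lowest latency, no server-side cost
    if edge_rtt_ms <= max_rtt_ms:
        return "edge"           # data-center-grade GPU near the user
    return "central-cloud"      # abundant compute, but highest latency

print(place_ai_task(device_tops=8, task_tops=4, edge_rtt_ms=20, max_rtt_ms=50))  # terminal
print(place_ai_task(device_tops=2, task_tops=4, edge_rtt_ms=20, max_rtt_ms=50))  # edge
print(place_ai_task(device_tops=2, task_tops=4, edge_rtt_ms=80, max_rtt_ms=50))  # central-cloud
```

The ordering encodes the text's point: terminals win on latency and cost when capable, edge nodes cover the many devices that are not, and the central cloud remains the capacity backstop.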
04 Summary
Edge computing provides nationwide coverage across provinces, cities, and operators. In live video scenarios, it deploys traffic access and user-interaction services, such as push/pull streaming, transcoding, and stream mixing, at the edge closer to users, so that anchors and audiences push and pull live streams via the nearest node, ensuring low-latency business response and improving the smoothness and clarity of the live broadcast experience. At the same time, as demand grows for interesting, innovative, and immersive live video content, edge computing's low-latency, standardized, and heterogeneous computing power resources will continue to help live video scenarios deliver a better business experience.
Volcano Engine Edge Computing Node is committed to providing users with a stable, high-performance, and feature-rich new generation of edge computing cloud platform services. Through edge nodes covering all provinces and carriers across the country, Volcano Engine Edge Computing Node facilitates the rapid deployment of business to every edge layer between users and the cloud center.
At present, Volcano Engine edge computing has been deployed in many applications, such as Douyin Live, Toutiao, Watermelon Video, and Feishu. You are welcome to follow Volcano Engine edge computing node and its services.