Author: Xianyu (Idle Fish) Technology — Chu Rui
1. Background
The mobile Internet has evolved with each generation of mobile communication technology, and every generation has reshaped daily life: 1G networks put mobile phones in people's hands; 2G let people get online from their phones; 3G drove the rapid growth of the mobile Internet; and 4G brought the explosion of live streaming and short video.
Short video is more than a new way to display content: through editing and special effects, creators make content richer and more vivid, and viewers get a better experience.
Xianyu (Idle Fish) is a C2C marketplace, and only a small share of ordinary users attach videos to their listings. How can listing content be made more video-like and the browsing experience improved? This article presents Xianyu's answer.
2. Why Xianyu did not go with short video
The first solution we considered was compositing the listing images into short videos. On closer examination, it has several problems:
• Guiding users to composite videos hurts the publish success rate
Weak guidance is ineffective and falls short of expectations; strong guidance interrupts the publishing flow and lowers the publish success rate. Xianyu became the industry leader because of its huge pool of listings, and the volume of newly published items is the cornerstone of the Xianyu ecosystem. Xianyu has always advocated lightweight publishing, so that even novice users can list an item easily. Our goal is to leave the simplicity to the user and keep the complexity for ourselves.
• Asynchronous video generation in the cloud
After an item is successfully published, the video is generated in the cloud and a push message asks the user whether to apply it. The problem is that users may never see the push message and may not use the generated video, which wastes server resources.
• Existing inventory is not covered
The main reason we rejected composite video is that it cannot solve the problem of existing listings. Newly published items are, after all, only a small fraction of the whole listing pool, so handling only new listings does not address the vast majority of items.
Compositing photos into video shares many of the same problems. The approach we chose instead is a "dynamic image display scheme based on motion-effect templates".
3. The Chaplin project
Film began with silent movies, and their most famous figure is Charlie Chaplin. He not only advanced the history of film, he represented an era. Through the Chaplin project, we hope to drive the overall content of Xianyu toward video.
The Chaplin project was inspired by video editors. During editing, various special effects can make the content richer and more vivid. We separated the material from the effects and abstracted the entire editing process into a template; at presentation time, different materials are combined with templates to achieve a video-like playback effect, as sketched below.
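To make the separation concrete, here is a minimal Kotlin sketch (class and field names are hypothetical, not the real protocol, which appears later in this article) of how materials, effects, and templates stay independent and are only combined at presentation time:

// Minimal sketch of the material / effect / template separation.
// All names here are hypothetical; the real protocol is shown later.

// A material is the raw content supplied per item, e.g. a listing image.
data class Material(val imageUrl: String)

// An effect node describes how something is shown, not what is shown.
data class EffectNode(
    val type: String,        // e.g. "image" or "transition"
    val durationSec: Double, // how long this node plays
    val shaderUrl: String    // OpenGL shader implementing the effect
)

// A template is an ordered list of effect nodes, designed once.
data class Template(val nodes: List<EffectNode>)

// A playback item binds one material to one effect node at presentation time.
data class PlaybackItem(val material: Material, val effect: EffectNode)

// Combination happens only when the content is about to be displayed,
// so the same template can be reused for any set of listing images.
fun combine(template: Template, materials: List<Material>): List<PlaybackItem> {
    val imageNodes = template.nodes.filter { it.type == "image" }
    return materials.zip(imageNodes) { material, node -> PlaybackItem(material, node) }
}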
Demo
GIF
Video: Wantu-xm4-xianyu-video-hz.oss-cn-hangzhou.aliyuncs.com/aus/xianyu…
Design
The business implementation plan for the Chaplin project:
• Template creation
The template editor borrows from general-purpose video editors and gives designers a visual authoring tool. The effects library is extensible; the eye-catching effects are mainly implemented by developers writing OpenGL shaders. Because OpenGL is a standard, consistency between the editor and the client can be guaranteed, as sketched below.
The resource management platform relies mainly on the group's "Vertical and Horizontal" platform (a material management center) to manage the music library, effects library, and template library. Templates support feature tagging and playback control. Because templates are delivered to the client, a complete online release process is required, isolated from the pre-release (staging) environment.
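Since both sides use standard OpenGL, a shader written in the editor can be compiled unchanged on the client. A minimal Android-side sketch (illustrative only, not Xianyu's actual implementation; must run on the GL thread):

import android.opengl.GLES20

// Compile one shader stage from source text downloaded from the effects library.
fun compileShader(type: Int, source: String): Int {
    val shader = GLES20.glCreateShader(type)
    GLES20.glShaderSource(shader, source)
    GLES20.glCompileShader(shader)
    val status = IntArray(1)
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0)
    if (status[0] == 0) {
        val log = GLES20.glGetShaderInfoLog(shader)
        GLES20.glDeleteShader(shader)
        error("Shader compile failed: $log")
    }
    return shader
}

// Link the vertex/fragment pair referenced by an effect's vertexURL / fragmentURL.
fun linkProgram(vertexSrc: String, fragmentSrc: String): Int {
    val program = GLES20.glCreateProgram()
    GLES20.glAttachShader(program, compileShader(GLES20.GL_VERTEX_SHADER, vertexSrc))
    GLES20.glAttachShader(program, compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentSrc))
    GLES20.glLinkProgram(program)
    return program
}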
• The client
Template fetching uses incremental synchronization, which requires local template library management as well as shader and material resource management (one possible shape of the sync is sketched below).
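A minimal sketch of the incremental sync; the manifest format, versioning, and names are assumptions, since the article does not describe the actual service interface:

// Sketch of incremental template synchronization (hypothetical manifest format).
data class TemplateMeta(val id: String, val version: Int, val downloadUrl: String)

class TemplateStore(private val local: MutableMap<String, TemplateMeta>) {

    // Only download templates that are new or newer than the local copy.
    fun sync(remoteManifest: List<TemplateMeta>, download: (TemplateMeta) -> Unit) {
        for (remote in remoteManifest) {
            val cached = local[remote.id]
            if (cached == null || cached.version < remote.version) {
                download(remote)          // fetch template + shaders + materials
                local[remote.id] = remote // update the local template library index
            }
        }
        // Drop templates that have been taken offline on the platform.
        local.keys.retainAll(remoteManifest.map { it.id }.toSet())
    }
}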
The template recommendation engine works from the business scenario: it understands the item and its content, and analyzes the subject and color gamut of the images, to recommend templates personalized to each user. Fatigue and exposure controls keep the displayed templates varied.
Dynamic display (playback): the first step is to combine the material with the template; the combined playback source is a non-linear playback sequence on a timeline, and during playback it is composed and rendered in real time along that timeline (see the sketch below). Visual composition is based on OpenGL on both ends. Audio composition on iOS is based on AVMutableAudioMix (AVFoundation); Android has no audio composition step and relies on mixed playback instead.
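Conceptually, the combined playback source can be pictured as tracks of clips on a shared timeline that are sampled every frame. A simplified sketch (not the actual player; names are hypothetical):

// Sketch of timeline-driven composition: the playback source is a set of
// tracks, each holding clips positioned on a shared timeline.
data class Clip(val startSec: Double, val durationSec: Double, val effectId: String) {
    fun contains(t: Double) = t >= startSec && t < startSec + durationSec
    // Progress in [0, 1], used to drive shader uniforms (scale, move, mix, ...).
    fun progress(t: Double) = ((t - startSec) / durationSec).coerceIn(0.0, 1.0)
}

data class Track(val clips: List<Clip>)

class TimelinePlayer(private val tracks: List<Track>) {
    // Called once per frame by the render loop: collect everything active at
    // time t, then composite the layers in track order.
    fun renderFrame(t: Double, draw: (Clip, Double) -> Unit) {
        for (track in tracks) {
            track.clips.filter { it.contains(t) }
                .forEach { clip -> draw(clip, clip.progress(t)) }
        }
    }
}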
Circular template
During the business rollout, the number of materials to play is not fixed, so a circular (looping) template was designed.
The benefits of circular templates are obvious: (1) the number of template files is greatly reduced, since the same visual design no longer needs multiple templates; (2) template files become much smaller, so they download faster.
At the design stage there were two candidate schemes: (1) handle the looping when material and template are combined; (2) make the player support a loop configuration. To keep each sub-module independent and pure and to reduce coupling between modules, the material-and-template composition module was abstracted to take on this work.
Circular templates exist to absorb this uncertainty in the business, so the design must consider both the extensibility of the protocol and an abstraction of the uncertainty itself (a sketch of the expansion follows the rules below):
• Playback protocol
• Playback starts with an image; • playback ends with an image; • there must be a transition node between two images (the transition duration can be zero).
• Head and tail design of circular templates
• It must start with an image node; • it may end with an image; • it may end with a transition.
• Definition of looping and non-looping sections
• Every node supports once (executed only once); on the second pass such a node is skipped. • If skipping a node leaves two neighboring nodes of the same type, the later one is skipped as well.
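The sketch referenced above: one way a circular template could be unrolled for an arbitrary number of images under these rules. Node kinds and the once flag are modeled with hypothetical names; this illustrates the rules, it is not the actual composition module:

// Unroll a circular template for `imageCount` images, honoring the rules above:
// a `once` node plays only in the first pass, and if skipping leaves two
// neighboring nodes of the same kind, the later one is skipped as well.
data class Node(val kind: String, val once: Boolean = false) // kind: "image" | "transition"

fun expand(template: List<Node>, imageCount: Int): List<Node> {
    require(template.any { it.kind == "image" }) { "template needs at least one image node" }
    val result = mutableListOf<Node>()
    var images = 0
    var pass = 0
    while (images < imageCount) {
        for (node in template) {
            if (images >= imageCount) break
            if (node.once && pass > 0) continue          // once-nodes only in the first pass
            if (node.kind == "transition" &&
                result.lastOrNull()?.kind == "transition") continue // no adjacent transitions
            result += node
            if (node.kind == "image") images++
        }
        pass++
    }
    // Playback must end with an image, so trim any trailing transition.
    while (result.lastOrNull()?.kind == "transition") result.removeAt(result.lastIndex)
    return result
}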
The entire playback process is fairly complex, with events and data passed between different threads. We would rather not manage this much multithreading, but without asynchronous handling the pipeline would block and stutter. The threads involved are listed below, followed by a sketch of the layout.
• Play-control operation queue
The player depends heavily on its state machine, and lock-based control would stall callers (the Stop operation, for example, releases threads and other resources), so an asynchronous operation queue is used instead.
• GL render thread
Because playback is multi-track, the tracks have to be composed in real time and only then drawn to the screen.
• Resource loading threads
Resource loading is an I/O operation and can be time-consuming (especially for fonts and large images). The image is first loaded into memory by the resource loading thread, then turned into a texture by the GL rendering thread, and then rendered.
• UI refresh thread
The refresh rate is currently set to 30 FPS, which leaves roughly 33 ms per frame; in practice all loading and composition work must finish within about 30 ms, otherwise playback stutters.
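A rough sketch of how this thread layout could look on Android, assuming a serial executor for play control, GLSurfaceView's GL thread (in RENDERMODE_WHEN_DIRTY) for rendering, a separate executor for I/O, and a ~33 ms tick approximating 30 FPS. This illustrates the division of labor, not the actual player code:

import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.opengl.GLES20
import android.opengl.GLSurfaceView
import android.opengl.GLUtils
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit

class PlayerThreads(private val glView: GLSurfaceView) {

    // 1. Play-control operation queue: state-machine operations (start/stop/seek)
    //    run one after another on a single thread, so no locks are needed and a
    //    slow stop() cannot stall the caller.
    private val playControl = Executors.newSingleThreadExecutor()

    // 2. Resource loading thread: file/network I/O such as decoding images.
    private val ioExecutor = Executors.newSingleThreadExecutor()

    // 3. UI refresh: a ~33 ms tick approximates 30 FPS; each tick asks the GL
    //    thread to composite and render the current frame.
    private val ticker = Executors.newSingleThreadScheduledExecutor()

    fun start() {
        playControl.execute {
            // drive the player state machine to PLAYING here
        }
        ticker.scheduleAtFixedRate({
            // 4. GL render thread: composition and on-screen rendering happen on
            //    GLSurfaceView's own GL thread, triggered via requestRender().
            glView.requestRender()
        }, 0, 33, TimeUnit.MILLISECONDS)
    }

    fun loadImage(path: String) {
        ioExecutor.execute {
            val bitmap = BitmapFactory.decodeFile(path) ?: return@execute // decode off the GL thread
            glView.queueEvent { uploadTexture(bitmap) }                   // texture work on the GL thread
        }
    }

    private fun uploadTexture(bitmap: Bitmap) {
        val ids = IntArray(1)
        GLES20.glGenTextures(1, ids, 0)
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, ids[0])
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0)
    }
}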
Multi-track playback
Multi-track playback (also called non-linear playback) requires the composition and rendering of multi-track content on the screen in real time.
PS: If resources load or compose too slowly, playback stutters, so resources are preloaded and textures are preprocessed (one approach is sketched below).
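One way to do the preloading, sketched with hypothetical names: look a little ahead of the current playback position and warm up the resources for clips that are about to start, so the render loop never waits on I/O or texture upload:

// Sketch of lookahead preloading for multi-track playback.
data class PendingClip(val startSec: Double, val imagePath: String)

class Preloader(
    private val clips: List<PendingClip>,
    private val lookaheadSec: Double = 1.0,
    private val loadTexture: (String) -> Unit // decode + upload, done off the render path
) {
    private val loaded = mutableSetOf<String>()

    // Called periodically (e.g. from the play-control queue) with the current position.
    fun onProgress(positionSec: Double) {
        clips.filter { it.startSec - positionSec in 0.0..lookaheadSec }
            .filter { loaded.add(it.imagePath) } // each resource is loaded only once
            .forEach { loadTexture(it.imagePath) }
    }
}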
The template protocol
OpenGL vertex coordinates range over [-1.0, 1.0] and texture coordinates over [0.0, 1.0]. For convenience during development, the coordinates are unified: every coordinate in the protocol uses the range [0.0, 1.0], and the conversion is done at the lowest layer.
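A sketch of that unification: a protocol rect in [0.0, 1.0] is converted once, at the lowest layer, into OpenGL vertex coordinates in [-1.0, 1.0], while texture coordinates pass through unchanged (names are illustrative):

// Protocol rect: x/y/width/height, all normalized to [0, 1].
data class ProtocolRect(val x: Float, val y: Float, val width: Float, val height: Float)

// Convert one protocol value in [0, 1] to a normalized device coordinate in [-1, 1].
fun toNdc(v: Float): Float = v * 2f - 1f

// Vertex coordinates for the rect as a triangle strip, in OpenGL's [-1, 1] range.
fun ProtocolRect.toVertexCoords(): FloatArray {
    val left = toNdc(x)
    val right = toNdc(x + width)
    val bottom = toNdc(y)
    val top = toNdc(y + height)
    return floatArrayOf(
        left, bottom,   // bottom-left
        right, bottom,  // bottom-right
        left, top,      // top-left
        right, top      // top-right
    )
}

// Texture coordinates already share the [0, 1] range, so they pass through as-is.
fun ProtocolRect.toTextureCoords(): FloatArray =
    floatArrayOf(x, y, x + width, y, x, y + height, x + width, y + height)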
All effects are delivered from the cloud. Each effect object follows the shader protocol, which includes vertexURL, fragmentURL, and uniforms. Examples:
• Material (images)
{
  "name": "zoom",
  "type": "image",
  "duration": 2.0,
  "shader": {
    "vertexURL": "https://xianyu-chaplin-bucket.oss-cn-beijing.aliyuncs.com/ShaderV2/ImageEffect_Move2_V2/ImageEffect_Move2_V2.vert",
    "fragmentURL": "https://xianyu-chaplin-bucket.oss-cn-beijing.aliyuncs.com/ShaderV2/ImageEffect_Move2_V2/ImageEffect_Move2_V2.frag",
    "uniforms": {
      "u_FirstScaleBegin": 1.00,
      "u_FirstScaleRange": 0.04,
      "u_FirstMoveBeginX": 0.00,
      "u_FirstMoveRangeX": 0.00,
      "u_FirstMoveBeginY": 0.00,
      "u_FirstMoveRangeY": 0.00
    }
  }
}
• Transition
{
  "name": "dissolve",
  "type": "transition",
  "duration": 0.5,
  "shader": {
    "vertexURL": "https://xianyu-chaplin-bucket.oss-cn-beijing.aliyuncs.com/ShaderV2/TransitionEffect_Mix2_V2/TransitionEffect_Mix2_V2.vert",
    "fragmentURL": "https://xianyu-chaplin-bucket.oss-cn-beijing.aliyuncs.com/ShaderV2/TransitionEffect_Mix2_V2/TransitionEffect_Mix2_V2.frag",
    "uniforms": {
      "u_FirstScaleBegin": 1.04,
      "u_FirstScaleRange": 0.01,
      "u_SecondScaleBegin": 1.00,
      "u_SecondScaleRange": 0.01,
      "u_FirstMoveBeginX": 0.00,
      "u_FirstMoveRangeX": 0.00,
      "u_FirstMoveBeginY": 0.00,
      "u_FirstMoveRangeY": 0.00,
      "u_SecondMoveBeginX": 0.00,
      "u_SecondMoveRangeX": 0.00,
      "u_SecondMoveBeginY": 0.00,
      "u_SecondMoveRangeY": 0.00
    }
  }
}
• Stickers
"tracks":[
{
"type":"ShaderTrack",
"list":[
{
"type":"PopImage",
"position":0,
"duration":3600,
"data":{
},
"shader":{
"vertexURL":"https://xianyu-chaplin-bucket.oss-cn-beijing.aliyuncs.com/ShaderV3/PopImage/PopImageV3.vert",
"fragmentURL":"https://xianyu-chaplin-bucket.oss-cn-beijing.aliyuncs.com/ShaderV3/PopImage/PopImageV3.frag",
"vertexCoordinatesName":"a_Position",
"vertexCoordinatesValue":{
"x":0,
"y":0,
"width":1,
"height":1,
"rect":{
"top":"8dp",
"right":"8dp",
"width":"60dp",
"height":"24dp"
}
},
"images":[
{
"textureCoordinatesName":"a_FirstTexCoord",
"textureCoordinatesValue":{
"x":0,
"y":0,
"width":1,
"height":1
},
"textureName":"u_FirstTexture",
"imageURL":"https://gw.alicdn.com/tfs/TB1QOt3nIVl614jSZKPXXaGjpXa-120-48.png"
}
],
"uniforms":{
"u_MoveX":0,
"u_MoveY":0,
"u_VertexScaleBegin":1,
"u_VertexScaleRange":0
}
}
}
]
}
]
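For illustration, here is a sketch of how a client might parse the shader section of this protocol and push its uniforms into a compiled program (using org.json on Android; the actual parser is not described in the article, and the field names simply follow the examples above):

import android.opengl.GLES20
import org.json.JSONObject

// One effect's shader description: vertexURL, fragmentURL and a flat map of float uniforms.
data class ShaderSpec(
    val vertexUrl: String,
    val fragmentUrl: String,
    val uniforms: Map<String, Float>
)

fun parseShaderSpec(effectJson: JSONObject): ShaderSpec {
    val shader = effectJson.getJSONObject("shader")
    val uniformsJson = shader.getJSONObject("uniforms")
    val uniforms = uniformsJson.keys().asSequence()
        .associateWith { key -> uniformsJson.getDouble(key).toFloat() }
    return ShaderSpec(
        vertexUrl = shader.getString("vertexURL"),
        fragmentUrl = shader.getString("fragmentURL"),
        uniforms = uniforms
    )
}

// Must run on the GL thread with the program already linked and bound via glUseProgram.
fun applyUniforms(program: Int, spec: ShaderSpec) {
    for ((name, value) in spec.uniforms) {
        val location = GLES20.glGetUniformLocation(program, name)
        if (location >= 0) GLES20.glUniform1f(location, value)
    }
}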
With the rise of short video, Xianyu offers a new answer to making its content more video-like. This article has explained why we did not composite photos into videos, and has introduced how motion-effect templates can play and display image content dynamically, breaking through the limitations of image-to-video composition.
Although the solution is up and running, there are still some ideas that have not been implemented.
• Create templates intelligently
A single template quickly causes user fatigue, so templates need to be produced continuously. At present, template production relies on designers for authoring and on developers for writing effects, which is costly. There are intelligent creation solutions in the industry, and we will look at how to combine them later.
• A combination of intelligent recognition and special effects
Recognize the subject area of an image in real time and apply effects relative to it. For example, with the current zoom-in transition, the ideal is to zoom around the subject area rather than the center of the image. Materials are currently played in their original order; intelligent ordering could be added.
• Scene expansion
The current solution is mainly used in the presentation scenario, and the UGC production scenario will be considered later.