Sharing a Screen Capture Solution for Flutter on Mobile
As Flutter sees wider adoption, more and more projects are built purely with Flutter. This article shares an implementation of screen capture for Flutter on mobile (iOS + Android).
Overview
Screen sharing is one of the most common features in video conferencing, online classes, and game live-streaming. Screen sharing means sharing the screen image in real time, and the end-to-end pipeline consists of several main steps: screen capture, video encoding and packaging, real-time transmission, video unpacking and decoding, and video rendering.
Generally, in real-time screen sharing, the sharing end captures a specified screen source (a particular display, a designated region, a specific application, and so on) at a fixed sampling rate (typically 8 to 15 frames per second), compresses the frames with a video encoder (preferably with settings that preserve the edges of text and graphics), and then distributes them over the real-time network at the corresponding frame rate.
Screen capture is therefore the foundation of real-time screen sharing, and it has a wide range of application scenarios.
Implementation
Preparation
First, let's take a look at what the native systems provide for screen recording.
- iOS 11.0 provides ReplayKit 2 for capturing global screen content across apps, but it can only be started from Control Center; iOS 12.0 builds on this with the ability to launch ReplayKit from within the app.
- Android 5.0 provides MediaProjection, which can capture global screen content after the user grants permission in a system popup.
Let’s take a look at the differences in Android/iOS screen capture capabilities.
- iOS ReplayKit captures screen data by starting a Broadcast Upload Extension child process, so communication and interaction between the main app process and the screen capture child process have to be handled; in addition, the child process is subject to restrictions such as a maximum of 50 MB of runtime memory.
- Android's MediaProjection runs directly inside the main app process and can easily obtain the screen data through a Surface.
Solution
While native code cannot be avoided entirely, we can implement Flutter screen capture with as little native code as possible. By abstracting the screen capture capabilities of both platforms into a common Dart-layer interface, you can happily start and stop screen capture from the Dart layer after integrating it just once.
iOS
Open the Runner Xcode project and create a Broadcast Upload Extension target to host the business logic of the ReplayKit child process.
First, we need to handle cross-process communication between the main app process and the ReplayKit child process. Since the captured audio/video buffers are called back at high frequency, handling them on the native side is clearly the best choice for both performance and compatibility with the Flutter plugin ecosystem; all that remains for the channel is the start/stop signaling and the transfer of the necessary configuration information.
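To make this concrete, here is a rough Swift sketch of what the native iOS end of that signaling channel could look like. The channel name `screen_capture` and the method names `startScreenCapture`/`stopScreenCapture` are hypothetical placeholders, not part of any particular SDK.

```swift
import Flutter
import UIKit

@UIApplicationMain
@objc class AppDelegate: FlutterAppDelegate {
  override func application(
    _ application: UIApplication,
    didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
  ) -> Bool {
    GeneratedPluginRegistrant.register(with: self)

    let controller = window?.rootViewController as! FlutterViewController
    // Hypothetical channel and method names: the Dart layer only sends
    // start/stop signaling and configuration; buffer handling stays native.
    let channel = FlutterMethodChannel(name: "screen_capture",
                                       binaryMessenger: controller.binaryMessenger)
    channel.setMethodCallHandler { call, result in
      switch call.method {
      case "startScreenCapture":
        // Show the ReplayKit broadcast picker (see the next step).
        result(nil)
      case "stopScreenCapture":
        // Post a Darwin notification so the extension can finish (see below).
        result(nil)
      default:
        result(FlutterMethodNotImplemented)
      }
    }
    return super.application(application, didFinishLaunchingWithOptions: launchOptions)
  }
}
```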
To start ReplayKit, the main app side, triggered through the Flutter MethodChannel, creates an RPSystemBroadcastPickerView. This is a system-provided view that contains a button which, when tapped, pops up the window for starting screen capture. By traversing its subviews to find that button and triggering its tap action, the problem of launching ReplayKit from within the app is solved.
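A minimal Swift sketch of this trick, assuming iOS 12+; the extension bundle identifier below is a placeholder for your own Broadcast Upload Extension.

```swift
import ReplayKit
import UIKit

/// Programmatically opens the system broadcast picker (iOS 12+).
func launchBroadcastPicker(in view: UIView) {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
    picker.preferredExtension = "com.example.app.BroadcastExtension" // hypothetical bundle ID
    picker.showsMicrophoneButton = false
    view.addSubview(picker)
    picker.isHidden = true // we only need the button's action, not the UI

    // RPSystemBroadcastPickerView only exposes a button; find it and
    // trigger its tap so the system "start broadcast" sheet pops up.
    for subview in picker.subviews {
        if let button = subview as? UIButton {
            button.sendActions(for: .allTouchEvents)
        }
    }
}
```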
Then there is the synchronization of configuration information, for which there are two options.
- The first is to use the iOS App Group capability to share configuration between processes through persistent NSUserDefaults. Enable the App Group capability in both the Runner target and the Broadcast Upload Extension target and set the same App Group ID; then either process can read and write configuration within the App Group via -[NSUserDefaults initWithSuiteName:] (see the sketch after this list).
- The second is to use cross-process Darwin notifications, via CFNotificationCenterGetDarwinNotifyCenter, to signal the other process, for example to tell it that the configuration has been updated. (Darwin notifications cannot carry a payload themselves, so the actual data is still read through the App Group.)
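As an illustration of the first option, here is a small Swift sketch with a hypothetical App Group ID and keys: the main app writes the configuration and the extension reads it back.

```swift
import Foundation

// Hypothetical App Group ID and keys; both targets must enable this same group.
let appGroupID = "group.com.example.screenshare"

// Main app: persist the configuration the extension will need.
func writeShareConfig(roomID: String, token: String) {
    let defaults = UserDefaults(suiteName: appGroupID)
    defaults?.set(roomID, forKey: "room_id")
    defaults?.set(token, forKey: "token")
}

// Broadcast Upload Extension: read the configuration back.
func readShareConfig() -> (roomID: String?, token: String?) {
    let defaults = UserDefaults(suiteName: appGroupID)
    return (defaults?.string(forKey: "room_id"), defaults?.string(forKey: "token"))
}
```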
Next is stopping ReplayKit. The main Flutter app again uses the Darwin notification mechanism described above to post a notification that screen capture should end; when the ReplayKit child process receives it, it calls -[RPBroadcastSampleHandler finishBroadcastWithError:] to end the screen capture.
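A Swift sketch of this stop path, with a hypothetical notification name; since Darwin notifications carry no payload, the extension only learns that a stop was requested and then finishes the broadcast.

```swift
import ReplayKit

let stopNotification = "com.example.screenshare.stop" as CFString // hypothetical name

// Main app side: ask the extension to finish.
func requestStopBroadcast() {
    CFNotificationCenterPostNotification(
        CFNotificationCenterGetDarwinNotifyCenter(),
        CFNotificationName(rawValue: stopNotification), nil, nil, true)
}

// Extension side: the RPBroadcastSampleHandler subclass listens and finishes.
class SampleHandler: RPBroadcastSampleHandler {
    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        CFNotificationCenterAddObserver(
            CFNotificationCenterGetDarwinNotifyCenter(),
            Unmanaged.passUnretained(self).toOpaque(),
            { _, observer, _, _, _ in
                guard let observer = observer else { return }
                let handler = Unmanaged<SampleHandler>.fromOpaque(observer).takeUnretainedValue()
                // End the broadcast; the NSError message is shown to the user.
                handler.finishBroadcastWithError(NSError(
                    domain: "ScreenShare", code: 0,
                    userInfo: [NSLocalizedDescriptionKey: "Screen sharing was stopped by the host app"]))
            },
            stopNotification, nil, .deliverImmediately)
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        // Forward video/audio buffers to the consumer (e.g. an RTC SDK) here.
    }
}
```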
Android
To start screen capture, the Flutter side uses its MethodChannel to have the native side show the system popup that asks the user for screen capture permission via MediaProjectionManager. Once the user confirms, the native side can call MediaProjectionManager.getMediaProjection() to obtain a MediaProjection object.
Then, depending on the business scenario, obtain a Surface from whatever will consume the captured screen buffers. For example, to save a screen recording, get the Surface from a MediaRecorder; to live-stream the screen, get the Surface from the live audio/video SDK's interface.
With the MediaProjection object and the consumer's Surface in hand, call MediaProjection.createVirtualDisplay(), passing in the Surface, to create a VirtualDisplay instance; the screen capture buffers are then delivered to that Surface.
Finally, to finish screen capture, Android only needs to release the VirtualDisplay and MediaProjection instances, which is much simpler than the corresponding flow on iOS.
Practical example
The ZEGO Express Audio and Video Flutter SDK can be used to capture the iOS/Android screen.
Github.com/zegoim/zego…