1. Overview

Screen sharing is one of the most common features in video conferencing, online classes, and live game streaming. It is the real-time sharing of on-screen images, and the end-to-end pipeline consists of several main steps: screen capture, video encoding and packaging, real-time transmission, video unpacking and decoding, and video rendering.

In general, for real-time screen sharing, the sharing side captures the specified screen source (an entire screen, a designated area, a specific application, and so on) at a fixed sampling rate (typically 8 to 15 frames per second), compresses it with a video encoder configured to preserve text and graphic edge detail without distortion, and distributes it over the real-time network at the corresponding frame rate.

Screen capture is therefore the foundation of real-time screen sharing, and its application scenarios are very broad.

As Flutter is adopted more widely and more projects are built purely with Flutter, this article focuses on how to implement screen capture in Flutter.

2. Implementation process

Before going into the details of the implementation process, let's take a look at what the native platforms provide for screen recording.

1. iOS 11.0 provides ReplayKit 2 for capturing global screen content across apps, but it can only be launched from the Control Center; iOS 12.0 builds on this with the ability to launch ReplayKit from within the App.

2. Android 5.0 provides MediaProjection, which can capture global screen content after popping up a dialog and obtaining the user's consent.

Let’s take a look at the differences in Android/iOS screen capture capabilities.

1. iOS ReplayKit captures screen data by starting a Broadcast Upload Extension child process, so communication and interaction between the main App process and the screen capture child process must be handled. In addition, the child process has limitations such as a maximum of 50 MB of runtime memory.

2. Android's MediaProjection runs directly in the main App process and can easily obtain screen data through a Surface.

Although native code cannot be avoided entirely, we can implement Flutter screen capture with as little native code as possible: by abstracting the screen capture capabilities of both platforms into a common Dart-layer interface, you can start and stop screen capture from the Dart layer after a single integration.

Next, we will introduce the implementation process of iOS and Android respectively.

1. iOS

Open the Runner Xcode Project in the iOS folder of the Flutter App Project and create a Broadcast Upload Extension Target to process the business logic of the ReplayKit subprocess.

First, we need to handle the cross-process communication between the main App process and the ReplayKit child process. Because the audio/video buffers captured from the screen arrive in frequent callbacks, handling them on the native side is clearly the better choice for both performance and the Flutter plugin ecosystem; all that remains to pass across is the start/stop signaling and the necessary configuration information.

To start ReplayKit, we can use a Flutter MethodChannel to create an RPSystemBroadcastPickerView on the native side. This is a system-provided view containing a button that, when tapped, pops up the window for starting screen capture. By traversing the subviews to find that button and triggering its click action, the problem of launching ReplayKit is solved.

static Future<bool?> launchReplayKitBroadcast(String extensionName) async {
    return await _channel.invokeMethod(
        'launchReplayKitBroadcast', {'extensionName': extensionName});
}
- (void)handleMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
    if ([@"launchReplayKitBroadcast" isEqualToString:call.method]) {
        [self launchReplayKitBroadcast:call.arguments[@"extensionName"] result:result];
    } else {
        result(FlutterMethodNotImplemented);
    }
}

- (void)launchReplayKitBroadcast:(NSString *)extensionName result:(FlutterResult)result {
    if (@available(iOS 12.0, *)) {
        RPSystemBroadcastPickerView *broadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
        NSString *bundlePath = [[NSBundle mainBundle] pathForResource:extensionName ofType:@"appex" inDirectory:@"PlugIns"];
        if (!bundlePath) {
            NSString *nullBundlePathErrorMessage = [NSString stringWithFormat:@"Can not find path for bundle `%@.appex`", extensionName];
            NSLog(@"%@", nullBundlePathErrorMessage);
            result([FlutterError errorWithCode:@"NULL_BUNDLE_PATH" message:nullBundlePathErrorMessage details:nil]);
            return;
        }

        NSBundle *bundle = [NSBundle bundleWithPath:bundlePath];
        if (!bundle) {
            NSString *nullBundleErrorMessage = [NSString stringWithFormat:@"Can not find bundle at path: `%@`", bundlePath];
            NSLog(@"%@", nullBundleErrorMessage);
            result([FlutterError errorWithCode:@"NULL_BUNDLE" message:nullBundleErrorMessage details:nil]);
            return;
        }

        broadcastPickerView.preferredExtension = bundle.bundleIdentifier;
        for (UIView *subView in broadcastPickerView.subviews) {
            if ([subView isMemberOfClass:[UIButton class]]) {
                UIButton *button = (UIButton *)subView;
                [button sendActionsForControlEvents:UIControlEventAllEvents];
            }
        }
        result(@(YES));
    } else {
        NSString *notAvailiableMessage = @"RPSystemBroadcastPickerView is only available on iOS 12.0 or above.";
        NSLog(@"%@", notAvailiableMessage);
        result([FlutterError errorWithCode:@"NOT_AVAILIABLE" message:notAvailiableMessage details:nil]);
    }
}

Then there is the synchronization of configuration information:

The first option is to use the iOS App Group capability to share configuration information between processes through NSUserDefaults persistent configuration. Enable the App Group capability in Runner Target and Broadcast Upload Extension Target respectively and set the same App Group ID. Then you can read and write the configuration within the App Group by using -[NSUserDefaults initWithSuiteName].

Future<void> setParamsForCreateEngine(int appID, String appSign, bool onlyCaptureVideo) async {
    await SharedPreferenceAppGroup.setInt('ZG_SCREEN_CAPTURE_APP_ID', appID);
    await SharedPreferenceAppGroup.setString('ZG_SCREEN_CAPTURE_APP_SIGN', appSign);
    await SharedPreferenceAppGroup.setInt("ZG_SCREEN_CAPTURE_SCENARIO".0);
    await SharedPreferenceAppGroup.setBool("ZG_SCREEN_CAPTURE_ONLY_CAPTURE_VIDEO", onlyCaptureVideo);
}
- (void)syncParametersFromMainAppProcess {
    // Get parameters for [createEngine]
    self.appID = [(NSNumber*) [self.userDefaults valueForKey:@"ZG_SCREEN_CAPTURE_APP_ID"] unsignedIntValue];
    self.appSign = (NSString*) [self.userDefaults valueForKey:@"ZG_SCREEN_CAPTURE_APP_SIGN"];
    self.scenario = (ZegoScenario)[(NSNumber*) [self.userDefaults valueForKey:@"ZG_SCREEN_CAPTURE_SCENARIO"] intValue];
}

Option 2 uses the cross-process notification CFNotificationCenterGetDarwinNotifyCenter to carry the configuration information and achieve inter-process communication.

Next is stopping ReplayKit. The CFNotification mechanism described above is used again: the main Flutter App posts a notification to end screen capture, and when the ReplayKit child process receives it, it calls -[RPBroadcastSampleHandler finishBroadcastWithError:] to end the capture.

static Future<bool?> finishReplayKitBroadcast(String notificationName) async {
    return await _channel.invokeMethod(
        'finishReplayKitBroadcast', {'notificationName': notificationName});
}
- (void)handleMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
    if ([@"finishReplayKitBroadcast" isEqualToString:call.method]) {
        NSString *notificationName = call.arguments[@"notificationName"];
        CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                             (CFStringRef)notificationName, NULL, nil, YES);
        result(@(YES));
    } else {
        result(FlutterMethodNotImplemented);
    }
}

// Add an observer for stop broadcast notification
CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                (__bridge const void *)(self),
                                onBroadcastFinish,
                                (CFStringRef)@"ZGFinishReplayKitBroadcastNotificationName",
                                NULL,
                                CFNotificationSuspensionBehaviorDeliverImmediately);

// Handle stop broadcast notification from main app process
static void onBroadcastFinish(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo) {

    // Stop broadcast
    [[ZGScreenCaptureManager sharedManager] stopBroadcast:^{
        RPBroadcastSampleHandler *handler = [ZGScreenCaptureManager sharedManager].sampleHandler;
        if (handler) {
            // Finish broadcast extension process with no error
            #pragma clang diagnostic push
            #pragma clang diagnostic ignored "-Wnonnull"
            [handler finishBroadcastWithError:nil];
            #pragma clang diagnostic pop
        } else {
            NSLog(@"⚠️ RPBroadcastSampleHandler is null, can not stop broadcast upload extension process");
        }
    }];
}

(diagram of iOS implementation process)

2. Android

The Android implementation is relatively simple compared with iOS. To start screen capture, you can use a Flutter MethodChannel to have the native side pop up a dialog through MediaProjectionManager asking the user for screen capture permission. After the user confirms, you can call the MediaProjectionManager.getMediaProjection() function to obtain the MediaProjection object.
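For reference, here is a minimal sketch of what that native-side permission request might look like, assuming the code lives in an Activity-aware plugin class; the class name, request code, and method names are illustrative and not part of the original article or any SDK.

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjectionManager;

public class ScreenCapturePermissionHelper {
    // Illustrative request code, chosen arbitrarily for this sketch
    private static final int REQUEST_MEDIA_PROJECTION = 1001;

    // Pops up the system dialog asking the user to allow screen capture
    public void requestMediaProjectionPermission(Activity activity) {
        MediaProjectionManager manager =
                (MediaProjectionManager) activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        activity.startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_MEDIA_PROJECTION);
    }

    // Forwarded from the activity's onActivityResult; on success the result code and
    // data Intent are what the createMediaProjection() logic shown later in this section needs
    public void handleActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == REQUEST_MEDIA_PROJECTION && resultCode == Activity.RESULT_OK) {
            // createMediaProjection(resultCode, data);
        }
    }
}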

Note that because of the increasingly strict permission restrictions on Android, if your App's target API level is 29 (Android Q, 10.0) or higher, you need to start an additional foreground service. According to the Android Q migration documentation, features such as MediaProjection that require a foreground service must run in a separate foreground service.

First you need to implement a class that extends android.app.Service and call the getMediaProjection() function in the onStartCommand callback to retrieve the MediaProjection object.

@Override
public int onStartCommand(Intent intent, int flags, int startId) {

    int resultCode = intent.getIntExtra("code", -1);
    Intent resultData = intent.getParcelableExtra("data");

    String notificationText = intent.getStringExtra("notificationText");
    int notificationIcon = intent.getIntExtra("notificationIcon", -1);
    createNotificationChannel(notificationText, notificationIcon);

    MediaProjectionManager manager = (MediaProjectionManager)getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    MediaProjection mediaProjection = manager.getMediaProjection(resultCode, resultData);
    RequestMediaProjectionPermissionManager.getInstance().onMediaProjectionCreated(mediaProjection, RequestMediaProjectionPermissionManager.ERROR_CODE_SUCCEED);

    return super.onStartCommand(intent, flags, startId);
}

You also need to register this class in AndroidManifest.xml.

<service
    android:name=".internal.MediaProjectionService"
    android:enabled="true"
    android:foregroundServiceType="mediaProjection"
/>

If the app is running on Android Q or later, start the foreground service; otherwise, the MediaProjection object can be retrieved directly.

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
private void createMediaProjection(int resultCode, Intent intent) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        service = new Intent(this.context, MediaProjectionService.class);
        service.putExtra("code", resultCode);
        service.putExtra("data", intent);
        service.putExtra("notificationIcon", this.foregroundNotificationIcon);
        service.putExtra("notificationText", this.foregroundNotificationText);
        this.context.startForegroundService(service);
    } else {
        MediaProjectionManager manager = (MediaProjectionManager) context.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        MediaProjection mediaProjection = manager.getMediaProjection(resultCode, intent);
        this.onMediaProjectionCreated(mediaProjection, ERROR_CODE_SUCCEED);
    }
}

Next, the consumer of the screen capture buffer provides a Surface according to the business scenario. For example, to save a screen recording, the Surface is obtained from MediaRecorder; to live-stream the screen, the Surface is obtained by calling the interface of an audio/video live-streaming SDK.
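As an illustration of the first case, here is a minimal sketch of obtaining a Surface from MediaRecorder in a save-to-file scenario; the output path, resolution, frame rate, and encoder settings are placeholder assumptions, and exception handling is omitted.

// Minimal sketch: MediaRecorder acts as the consumer of the screen capture buffer
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(width, height);
recorder.setVideoFrameRate(15);
recorder.setOutputFile(new File(context.getExternalFilesDir(null), "screen_capture.mp4").getAbsolutePath());
recorder.prepare(); // throws IOException

// This Surface is what gets passed to createVirtualDisplay() below
Surface surface = recorder.getSurface();
recorder.start();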

With the MediaProjection object and the consumer's Surface in hand, the next step is to call the MediaProjection.createVirtualDisplay() function, passing in the Surface, to create a VirtualDisplay instance and thereby obtain the screen capture buffer.

VirtualDisplay virtualDisplay = mediaProjection.createVirtualDisplay("ScreenCapture", width, height, 1, DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, handler);

Finally, to stop screen capture, Android only needs to release the VirtualDisplay and MediaProjection instances, which is much simpler than the corresponding operation on iOS.
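A minimal sketch of that teardown might look like the following, assuming the MediaProjection, VirtualDisplay, service Intent, and Context from the earlier snippets are kept as fields; the method name is illustrative.

private void stopScreenCapture() {
    if (virtualDisplay != null) {
        virtualDisplay.release();
        virtualDisplay = null;
    }
    if (mediaProjection != null) {
        mediaProjection.stop();
        mediaProjection = null;
    }
    // If capture was started through the foreground service on Android Q or later,
    // stop that service as well (the `service` Intent created in createMediaProjection)
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q && service != null) {
        context.stopService(service);
        service = null;
    }
}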

3. Practical examples

Here is a Demo that implements iOS/Android screen capture and streams live with the Zego RTC Flutter SDK.

Download link: github.com/zegoim/zego…

The Zego RTC Flutter SDK provides an entry point for video frame data on the native side, so the screen capture buffer obtained in the process above can be fed to the RTC SDK to quickly implement screen sharing and stream publishing.

On the iOS side, after obtaining the SampleBuffer from the system, you can send it directly to the RTC SDK, which automatically handles the video and audio frames.

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    [[ZGScreenCaptureManager sharedManager] handleSampleBuffer:sampleBuffer withType:sampleBufferType];
}

On the Android side, a SurfaceTexture is obtained from the RTC SDK and used to initialize a Surface and a Handler. A VirtualDisplay object is then created from the MediaProjection object obtained in the process above, at which point the RTC SDK can receive the screen capture video frame data.

SurfaceTexture texture = ZegoCustomVideoCaptureManager.getInstance().getSurfaceTexture(0);
texture.setDefaultBufferSize(width, height);
Surface surface = new Surface(texture);
HandlerThread handlerThread = new HandlerThread("ZegoScreenCapture");
handlerThread.start();
Handler handler = new Handler(handlerThread.getLooper());

VirtualDisplay virtualDisplay = mediaProjection.createVirtualDisplay("ScreenCapture", width, height, 1,
    DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, handler);

4. Summary and outlook

Finally, let's summarize the main content of this article on implementing screen capture with Flutter.

This article first introduced the principles of the screen capture capabilities provided natively by iOS and Android, then described the interaction between Flutter and the native side and how to control the start and stop of screen capture from the Flutter layer, and finally gave an example of integrating the Zego RTC SDK to implement screen sharing and stream publishing.

Currently, Flutter on Desktop is stable. The Zego RTC Flutter SDK has provided initial support for Flutter on Windows. We will continue to explore Flutter on Desktop.