More and more apps need to share the phone screen for others to view, especially in the online education industry. Android has supported MediaProjection since 5.0 (API 21), which makes screen capture and recording possible.
This library encapsulates the screen-capture code: a simple call takes care of requesting the MediaProjection permission, H264 hardware encoding, error handling, and more.
Features
- Supports higher Android versions
- Asynchronous hardware encoding with MediaCodec
- Configurable encoding parameters
- Foreground notification display
- Chained API calls
Usage
ScreenShareKit.init(this)
    .onH264 { buffer, isKeyFrame, ts ->
    }
    .start()
Github
The source address
implementation
1. Request the user to authorize screen capture
@TargetApi(Build.VERSION_CODES.M)
fun requestMediaProjection(encodeBuilder: EncodeBuilder) {
    this.encodeBuilder = encodeBuilder
    mediaProjectionManager = activity?.getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
    startActivityForResult(mediaProjectionManager?.createScreenCaptureIntent(), 90000)
}
startActivityForResult must be called from an Activity or Fragment, and the authorization result comes back in onActivityResult. So we encapsulate this step so that the result can be delivered as a callback. Here we use a Fragment without any UI (a headless Fragment); many permission libraries use this approach.
private val invisibleFragment: InvisibleFragment
    get() {
        val existedFragment = fragmentManager.findFragmentByTag(FRAGMENT_TAG)
        return if (existedFragment != null) {
            existedFragment as InvisibleFragment
        } else {
            val invisibleFragment = InvisibleFragment()
            fragmentManager.beginTransaction()
                .add(invisibleFragment, FRAGMENT_TAG)
                .commitNowAllowingStateLoss()
            invisibleFragment
        }
    }

fun start() {
    invisibleFragment.requestMediaProjection(this)
}
This allows us to receive the authorization result, and with it the data needed for the MediaProjection object, inside onActivityResult of the headless Fragment.
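As a minimal sketch, that callback might look like the following (the 90000 request code comes from requestMediaProjection above; the RESULT_CODE and DATA extra names anticipate the service in step 2, while the ForegroundService class name is an assumption):

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == 90000) {
        if (resultCode == Activity.RESULT_OK && data != null) {
            // Hand the capture grant over to the foreground service (see step 2).
            val intent = Intent(requireContext(), ForegroundService::class.java).apply {
                putExtra(RESULT_CODE, resultCode)
                putExtra(DATA, data)
            }
            ContextCompat.startForegroundService(requireContext(), intent)
        } else {
            // The user declined the screen capture request.
        }
    }
}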
2. Adapting to Android 10
If targetSdkVersion is 29 or above, an exception is thrown when createVirtualDisplay is called after the MediaProjection is obtained:
java.lang.SecurityException: Media projections require a foreground service of type ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION
This means the operation has to run inside a foreground service declared with the mediaProjection foreground service type.
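Concretely, the <service> entry in the manifest needs android:foregroundServiceType="mediaProjection", and the service must call startForeground before the projection starts. A minimal sketch (NOTIFICATION_ID is an assumed constant; NotificationUtils.getNotification follows the service code below):

override fun onCreate() {
    super.onCreate()
    val notification = NotificationUtils.getNotification(this)
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        // On API 29+ the media-projection type must be declared explicitly.
        startForeground(NOTIFICATION_ID, notification, ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION)
    } else {
        startForeground(NOTIFICATION_ID, notification)
    }
}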
Let's write a Service and pass in everything that onActivityResult received.
override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
    intent?.let {
        if (isStartCommand(it)) {
            val notification = NotificationUtils.getNotification(this)
            startProjection(
                it.getIntExtra(RESULT_CODE, RESULT_CANCELED),
                it.getParcelableExtra(DATA)!!
            )
        } else if (isStopCommand(it)) {
            stopProjection()
            stopSelf()
        }
    }
    return super.onStartCommand(intent, flags, startId)
}
In the startProjection method we get the MediaProjectionManager, obtain the MediaProjection from it, and then create a virtual display.
private fun startProjection(resultCode: Int, data: Intent) {
    val mpManager = getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
    if (mMediaProjection == null) {
        mMediaProjection = mpManager.getMediaProjection(resultCode, data)
        if (mMediaProjection != null) {
            mDensity = Resources.getSystem().displayMetrics.densityDpi
            val windowManager = getSystemService(WINDOW_SERVICE) as WindowManager
            mDisplay = windowManager.defaultDisplay
            createVirtualDisplay()
            mMediaProjection?.registerCallback(MediaProjectionStopCallback(), mHandler)
        }
    }
}

private fun createVirtualDisplay() {
    mVirtualDisplay = mMediaProjection!!.createVirtualDisplay(
        SCREENCAP_NAME,
        encodeBuilder.encodeConfig.width,
        encodeBuilder.encodeConfig.height,
        mDensity,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY or DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC,
        surface,
        null,
        mHandler
    )
}
The createVirtualDisplay method takes a Surface parameter, and everything shown on the screen is rendered onto that Surface. So here we use MediaCodec to create an input Surface, letting the encoder consume the screen output directly.
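In condensed form (details in the next section), the wiring between the encoder and the virtual display looks like this; note that createInputSurface() is only valid after configure() and before start():

codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
val surface = codec.createInputSurface() // after configure(), before start()
codec.start()
// The virtual display draws the screen contents into the encoder's input surface.
mMediaProjection!!.createVirtualDisplay(
    SCREENCAP_NAME, width, height, mDensity,
    DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY or DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC,
    surface, null, mHandler
)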
3. MediaCodec encoding
private fun initMediaCodec() {
    val format = MediaFormat.createVideoFormat(MIME, encodeBuilder.encodeConfig.width, encodeBuilder.encodeConfig.height)
    format.apply {
        setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface) // color format
        setInteger(MediaFormat.KEY_BIT_RATE, encodeBuilder.encodeConfig.bitrate) // bitrate
        setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR)
        setInteger(MediaFormat.KEY_FRAME_RATE, encodeBuilder.encodeConfig.frameRate) // frame rate
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1) // key frame interval, in seconds
    }
    codec = MediaCodec.createEncoderByType(MIME)
    codec.apply {
        setCallback(object : MediaCodec.Callback() {
            override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
                // Not used: input comes from the encoder's input surface.
            }

            override fun onOutputBufferAvailable(
                codec: MediaCodec,
                index: Int,
                info: MediaCodec.BufferInfo
            ) {
                val outputBuffer: ByteBuffer?
                try {
                    outputBuffer = codec.getOutputBuffer(index)
                    if (outputBuffer == null) {
                        return
                    }
                } catch (e: IllegalStateException) {
                    return
                }
                // BUFFER_FLAG_CODEC_CONFIG marks the SPS/PPS configuration data;
                // cache it instead of delivering it as a frame.
                val keyFrame = (info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0
                if (keyFrame) {
                    configData = ByteBuffer.allocate(info.size)
                    configData.put(outputBuffer)
                } else {
                    val data = createOutputBufferInfo(info, index, outputBuffer!!)
                    encodeBuilder.h264CallBack?.onH264(data.buffer, data.isKeyFrame, data.presentationTimestampUs)
                }
                codec.releaseOutputBuffer(index, false)
            }

            override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
                encodeBuilder.errorCallBack?.onError(ErrorInfo(-1, e.message.toString()))
            }

            override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
            }
        })
        configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        surface = createInputSurface()
        codec.start()
    }
}
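The post does not show createOutputBufferInfo, but the cached configData (the H264 SPS/PPS) typically has to be prepended to each key frame so that a decoder can join mid-stream. A plausible sketch, with the OutputBufferInfo shape inferred from the callback above (this is an assumption, not the library's actual code):

// Hypothetical sketch: names and shapes inferred from the callback above.
data class OutputBufferInfo(
    val buffer: ByteBuffer,
    val isKeyFrame: Boolean,
    val presentationTimestampUs: Long
)

private fun createOutputBufferInfo(
    info: MediaCodec.BufferInfo,
    index: Int,
    buffer: ByteBuffer
): OutputBufferInfo {
    val isKeyFrame = (info.flags and MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0
    val payload = if (isKeyFrame) {
        // Prepend the cached SPS/PPS so decoders can start from this frame.
        configData.rewind()
        ByteBuffer.allocate(configData.remaining() + info.size).apply {
            put(configData)
            put(buffer)
            flip()
        }
    } else {
        buffer
    }
    return OutputBufferInfo(payload, isKeyFrame, info.presentationTimeUs)
}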
MediaFormat lets us configure encoder parameters such as the bitrate, frame rate, and key frame interval.
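One related trick (standard MediaCodec API, not specific to this library): when a new viewer joins mid-stream, a key frame can be requested on demand instead of waiting for the next interval:

// Ask the running encoder to produce a key frame as soon as possible.
val params = Bundle().apply {
    putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0)
}
codec.setParameters(params)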
MediaCodec offers both synchronous and asynchronous modes; here we use the asynchronous mode by setting a callback (the asynchronous API is available on API 21 and above).
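For reference, the synchronous alternative drains the encoder in a loop on a dedicated thread instead of registering a callback; a minimal sketch (the running flag is an assumption):

val bufferInfo = MediaCodec.BufferInfo()
while (running) {
    // Block up to 10 ms waiting for encoded output.
    val index = codec.dequeueOutputBuffer(bufferInfo, 10_000)
    if (index >= 0) {
        val buffer = codec.getOutputBuffer(index)
        if (buffer != null) {
            // Hand the encoded data to the consumer here.
        }
        codec.releaseOutputBuffer(index, false)
    }
}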
4. Encapsulation
In the onOutputBufferAvailable callback, the encoded data is delivered along with a flag telling you whether it is a key frame or an ordinary frame. 😂 From there you can plug in a third-party audio/video SDK and push the encoded screen stream through it directly to implement screen sharing.
Here we take the pushExternalVideoFrame method of the anyRTC audio/video SDK as an example:
val rtcEngine = RtcEngine.create(this, "", RtcEvent())
rtcEngine.enableVideo()
rtcEngine.setExternalVideoSource(true, false, true)
rtcEngine.joinChannel("", "111", "", "")
ScreenShareKit.init(this)
    .onH264 { buffer, isKeyFrame, ts ->
        rtcEngine.pushExternalVideoFrame(ARVideoFrame().apply {
            val array = ByteArray(buffer.remaining())
            buffer.get(array)
            bufType = ARVideoFrame.BUFFER_TYPE_H264_EXTRA
            timeStamp = ts
            buf = array
            height = Resources.getSystem().displayMetrics.heightPixels
            stride = Resources.getSystem().displayMetrics.widthPixels
        })
    }
    .start()
A few lines of code give us screen capture, encoding, and transmission ~ very convenient.