Recently, while using WebRTC for live streaming, we ran into a problem where the broadcaster's video appeared overexposed. It took us a long time to track down the cause, so we're sharing it here to help others avoid the same pitfall.

  1. Pixel formats in iOS

iOS video capture supports three pixel formats: 420v, 420f, and BGRA.

kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange  = '420f', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_32BGRA                       = 'BGRA', /* 32 bit BGRA */

In kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, YpCbCr denotes the Y, U (Cb), and V (Cr) components, i.e. data in a YUV format; the 8 means each component is stored in 8 bits; 420 means the YUV data uses 4:2:0 chroma subsampling; BiPlanar means the bi-planar layout, in which the Y plane and the interleaved UV plane are stored separately; and VideoRange describes the color range.

420f and 420v are both YUV formats. YUV is a color encoding split into three components: Y is luminance (luma), i.e. the grayscale value, while U and V are chrominance (chroma), describing hue and saturation. The pixel format WebRTC uses for encoding and decoding is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. The difference between 420f and 420v is the color range: f is Full Range, v is Video Range. In Full Range the Y component spans [0,255]; in Video Range it spans [16,235]. The range setting must be consistent across the entire pipeline, from capture and encoding through decoding and rendering. If capture produces Full Range data but the player treats it as Video Range, the picture can appear overexposed.

  2. Overexposure caused by the WebRTC 70 branch

In the WebRTC 70 branch, the newer iOS activeFormat API is used to configure the capture format and FPS. When selecting an AVCaptureDeviceFormat, the code only checked resolution and frame rate, not the output pixel format. As a result, the first format matching the resolution and frame rate was selected, and that happened to be a Video Range format.

Format: <AVCaptureDeviceFormat: 0x174006680 'vide'/'420v' 1280x 720, {3-120 fps}, fov:58.080, binned, supports, max zoom:52.00 (upscales @1.16), AF System:1, ISO:34.0-1088.0, SS:0.000012-0.333333>
Format: <AVCaptureDeviceFormat: 0x174006680 'vide'/'420f' 1280x 720, {3-120 fps}, fov:58.080, binned, supports, max zoom:52.00 (upscales @1.16), AF System:1, ISO:34.0-1088.0, SS:0.000012-0.333333>

The device's formats array contains both 420v and 420f entries for the same resolution, so we must choose between them. Since WebRTC internally uses kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, the 420f format should be selected. If the wrong one is chosen, the different Y ranges of Full Range and Video Range mean that later stages (the beauty filter and the display's exposure handling) operate on incorrect values, producing abnormal exposure.

The fix: also check the output pixel format when selecting a format.

for (AVCaptureDeviceFormat *format in formats) {
  CMVideoDimensions dimension =
      CMVideoFormatDescriptionGetDimensions(format.formatDescription);
  if ((dimension.width == targetWidth && dimension.height == targetHeight) ||
      (dimension.width == targetHeight && dimension.height == targetWidth)) {
    for (AVFrameRateRange *frameRateRange in
         [format videoSupportedFrameRateRanges]) {
      if (frameRateRange.minFrameRate <= self.fps &&
          self.fps <= frameRateRange.maxFrameRate) {
        // Only accept a format whose pixel format matches what WebRTC
        // expects (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, '420f').
        if (CMFormatDescriptionGetMediaSubType(format.formatDescription) ==
            [self preferredOutputPixelFormat]) {
          selectedFormat = format;
        }
        break;
      }
    }
  }
  if (selectedFormat) {
    break;
  }
}