How are images displayed on the screen
Image display is handled by the CPU and the GPU together. The CPU is responsible for image decoding (video decoding is done by the GPU); the GPU is responsible for texture blending, vertex calculation, pixel shading, and rendering into the frame buffer. Image display falls into two cases.

Normal display mode

The normal flow:
- The CPU decodes the image
- The GPU renders it
- The result is written to the frame buffer
- The video controller reads the data from the frame buffer, performs digital-to-analog conversion, and displays it line by line on the screen
As shown below:
During GPU rendering, the painter's algorithm is followed: layers are rendered from back to front, and the result of each layer is put into the frame buffer. Once the frame buffer's data has been rendered to the screen, that data is discarded, as shown in the figure.
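To make the back-to-front order concrete, here is a rough sketch of painter's-algorithm compositing for a single pixel. This is only an illustration of the idea, not Core Animation's actual implementation:

```objc
#include <stdlib.h>
#include <stdint.h>
#include <stddef.h>

// One pixel's worth of painter's-algorithm compositing: sort layers
// far-to-near, then draw in order so nearer layers overdraw farther ones.
typedef struct { float depth; uint32_t color; } Layer;

static int farthestFirst(const void *a, const void *b) {
    float da = ((const Layer *)a)->depth, db = ((const Layer *)b)->depth;
    return (da < db) - (da > db); // larger depth (farther away) sorts first
}

void compositePixel(Layer *layers, size_t count, uint32_t *frameBufferPixel) {
    qsort(layers, count, sizeof(Layer), farthestFirst);
    for (size_t i = 0; i < count; i++) {
        *frameBufferPixel = layers[i].color; // each nearer layer overwrites
    }
}
```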
Off-screen rendering
Will the rendering described above satisfy all our needs? This is where a problem arises. If we want to crop a layer, but the frame buffer's data is discarded after each frame, there is no layer data left to manipulate. So we need some extra memory to hold the layer data; this new memory space is called the off-screen buffer. Now the result of each layer is not placed directly into the frame buffer but stored in the off-screen buffer first; when the cropping operation is complete, the data is moved from the off-screen buffer to the frame buffer and then read out for display by the video controller. The process looks like the following:
Off-screen rendering limitations:
- Space: the off-screen buffer is limited to 2.5 times the screen's pixel size; beyond that it cannot be used
- Time: data not used within 100 ms is destroyed
- Rendering that cannot be completed in a single pass triggers off-screen rendering
Exploring off-screen rendering
How can we check whether off-screen rendering is triggered in an iOS project? The Simulator has a debug option to detect off-screen rendering (Color Off-screen Rendered). Turn it on, and any area that is rendered off-screen is highlighted in yellow.
```objc
// 1. Button with a background image
UIButton *btn1 = [UIButton buttonWithType:UIButtonTypeCustom];
btn1.frame = CGRectMake(100, 30, 100, 100);
btn1.layer.cornerRadius = 50;
[self.view addSubview:btn1];
[btn1 setImage:[UIImage imageNamed:@"btn.png"] forState:UIControlStateNormal];
btn1.clipsToBounds = YES;

// 2. Button without a background image
UIButton *btn2 = [UIButton buttonWithType:UIButtonTypeCustom];
btn2.frame = CGRectMake(100, 180, 100, 100);
btn2.layer.cornerRadius = 50;
btn2.backgroundColor = [UIColor blueColor];
[self.view addSubview:btn2];
btn2.clipsToBounds = YES;

// 3. UIImageView with both an image and a background color
UIImageView *img1 = [[UIImageView alloc] init];
img1.frame = CGRectMake(100, 320, 100, 100);
img1.backgroundColor = [UIColor blueColor];
[self.view addSubview:img1];
img1.layer.cornerRadius = 50;
img1.layer.masksToBounds = YES;
img1.image = [UIImage imageNamed:@"btn.png"];

// 4. UIImageView with only an image, no background color
UIImageView *img2 = [[UIImageView alloc] init];
img2.frame = CGRectMake(100, 480, 100, 100);
[self.view addSubview:img2];
img2.layer.cornerRadius = 50;
img2.layer.masksToBounds = YES;
img2.image = [UIImage imageNamed:@"btn.png"];
```
Run this and you will see that the first button (image set, then clipped) turns yellow, while the second button (no image) does not: off-screen rendering is triggered when multiple layers have to be rendered and then operated on together (clipped, in this case). Likewise, the first UIImageView sets both an image and a background color, so it triggers an off-screen render, while the second UIImageView sets no background color and does not. But then why does the first button trigger an off-screen render even though it has no background color? Look at UIButton's properties:
```objc
@property(nullable, nonatomic, readonly, strong) UILabel     *titleLabel API_AVAILABLE(ios(3.0));
@property(nullable, nonatomic, readonly, strong) UIImageView *imageView  API_AVAILABLE(ios(3.0));
```
The button carries an imageView layer of its own, so when we round the button's corners and set layer.masksToBounds = YES, both layers must be clipped together and an off-screen render is triggered. Note that cornerRadius by itself only rounds the backgroundColor and the border, not the layer's contents; the contents are only clipped to the rounded shape when layer.masksToBounds = YES.
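One common way to sidestep this (a minimal sketch, not the only option) is to pre-render the rounded image with Core Graphics so that no mask is needed at display time; `btn.png` and the 100×100 size are the values from the example above:

```objc
// Bake the rounding into the bitmap once, so the layer needs no
// cornerRadius or masksToBounds and no off-screen pass is triggered.
UIImage *source = [UIImage imageNamed:@"btn.png"];
CGRect rect = CGRectMake(0, 0, 100, 100);
UIGraphicsBeginImageContextWithOptions(rect.size, NO, 0);
[[UIBezierPath bezierPathWithRoundedRect:rect cornerRadius:50] addClip];
[source drawInRect:rect];
UIImage *rounded = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

UIImageView *img3 = [[UIImageView alloc] initWithFrame:CGRectMake(100, 630, 100, 100)];
img3.image = rounded; // already rounded; no clipping needed
[self.view addSubview:img3];
```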
Can the off-screen buffer be infinitely large? When is the data in the off-screen buffer released?
The size of the off-screen buffer is limited: according to Apple's documentation, it can be at most 2.5 times the size of the screen in pixels. Data in the off-screen buffer is discarded if it is not used within 100 ms.
Several situations that trigger off-screen rendering
- Using a mask (layer.mask)
- Clipping (layer.masksToBounds / view.clipsToBounds)
- Group opacity set to YES with an opacity less than 1 (layer.allowsGroupOpacity / layer.opacity)
- Shadows (layer.shadow*; see the sketch after this list)
- Rasterization (layer.shouldRasterize)
- Layers that draw text (UILabel, CATextLayer, Core Text, etc.)
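Shadows in particular can often be kept cheap: if you give Core Animation an explicit shadowPath, it no longer has to derive the shadow's shape from the layer's contents in an off-screen pass. A minimal sketch, where someView is a hypothetical view:

```objc
// Assumed: someView is any view that needs a simple rectangular shadow.
CALayer *layer = someView.layer;
layer.shadowColor = [UIColor blackColor].CGColor;
layer.shadowOpacity = 0.5;
layer.shadowOffset = CGSizeMake(0, 2);
// An explicit path means the shadow shape need not be computed from the
// layer's (possibly transparent) contents in an off-screen buffer.
layer.shadowPath = [UIBezierPath bezierPathWithRect:someView.bounds].CGPath;
```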
Rasterization (shouldRasterize)
```objc
/* When true, the layer is rendered as a bitmap in its local coordinate
 * space ("rasterized"), then the bitmap is composited into the
 * destination (with the minificationFilter and magnificationFilter
 * properties of the layer applied if the bitmap needs scaling).
 * Rasterization occurs after the layer's filters and shadow effects
 * are applied, but before the opacity modulation. As an implementation
 * detail the rendering engine may attempt to cache and reuse the
 * bitmap from one frame to the next. (Whether it does or not will have
 * no affect on the rendered output.)
 *
 * When false the layer is composited directly into the destination
 * whenever possible (however, certain features of the compositing
 * model may force rasterization, e.g. adding filters).
 *
 * Defaults to NO. Animatable. */

@property BOOL shouldRasterize;
```
In short: after the layer's filters and shadow effects are applied, the layer is rendered as a bitmap ("rasterized") and stored in an off-screen buffer, and the rendering engine may cache and reuse that bitmap from one frame to the next. So rasterization pays off when a layer is reused and does not change frequently; for example, turning it on for reused cells whose content stays the same can improve performance.
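A minimal usage sketch (cell is a hypothetical reused table view cell). Setting rasterizationScale matters because it defaults to 1.0, which would make the cached bitmap blurry on a Retina screen:

```objc
// Assumed: cell is a UITableViewCell whose content rarely changes.
cell.layer.shouldRasterize = YES;
// Match the screen scale, otherwise the cached bitmap is rendered at 1x.
cell.layer.rasterizationScale = [UIScreen mainScreen].scale;
```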
YYKit's handling of rounded corners
```objc
- (UIImage *)imageByRoundCornerRadius:(CGFloat)radius
                              corners:(UIRectCorner)corners
                          borderWidth:(CGFloat)borderWidth
                          borderColor:(UIColor *)borderColor
                       borderLineJoin:(CGLineJoin)borderLineJoin {
    // The drawing below flips the context, so mirror the requested
    // corners vertically to compensate.
    if (corners != UIRectCornerAllCorners) {
        UIRectCorner tmp = 0;
        if (corners & UIRectCornerTopLeft) tmp |= UIRectCornerBottomLeft;
        if (corners & UIRectCornerTopRight) tmp |= UIRectCornerBottomRight;
        if (corners & UIRectCornerBottomLeft) tmp |= UIRectCornerTopLeft;
        if (corners & UIRectCornerBottomRight) tmp |= UIRectCornerTopRight;
        corners = tmp;
    }

    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGRect rect = CGRectMake(0, 0, self.size.width, self.size.height);
    // Flip the context so CGContextDrawImage draws the image right side up.
    CGContextScaleCTM(context, 1, -1);
    CGContextTranslateCTM(context, 0, -rect.size.height);

    CGFloat minSize = MIN(self.size.width, self.size.height);
    if (borderWidth < minSize / 2) {
        // Clip to the rounded rect, then draw the source image once.
        UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:CGRectInset(rect, borderWidth, borderWidth)
                                                   byRoundingCorners:corners
                                                         cornerRadii:CGSizeMake(radius, borderWidth)];
        [path closePath];
        CGContextSaveGState(context);
        [path addClip];
        CGContextDrawImage(context, rect, self.CGImage);
        CGContextRestoreGState(context);
    }

    if (borderColor && borderWidth < minSize / 2 && borderWidth > 0) {
        // Inset by a pixel-aligned half border width so the stroke stays
        // inside the image bounds.
        CGFloat strokeInset = (floor(borderWidth * self.scale) + 0.5) / self.scale;
        CGRect strokeRect = CGRectInset(rect, strokeInset, strokeInset);
        CGFloat strokeRadius = radius > self.scale / 2 ? radius - self.scale / 2 : 0;
        UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:strokeRect
                                                   byRoundingCorners:corners
                                                         cornerRadii:CGSizeMake(strokeRadius, borderWidth)];
        [path closePath];
        path.lineWidth = borderWidth;
        path.lineJoinStyle = borderLineJoin;
        [borderColor setStroke];
        [path stroke];
    }

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```
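For reference, a usage sketch of the category method above (it ships in YYKit's UIImage+YYAdd category); imageView is a hypothetical UIImageView, and the image name and radius are just the values from the earlier examples:

```objc
UIImage *avatar = [UIImage imageNamed:@"btn.png"];
UIImage *rounded = [avatar imageByRoundCornerRadius:50
                                            corners:UIRectCornerAllCorners
                                        borderWidth:2
                                        borderColor:[UIColor whiteColor]
                                     borderLineJoin:kCGLineJoinMiter];
// The rounding is baked into the bitmap on the CPU, so the image view
// needs no cornerRadius/masksToBounds and triggers no off-screen render.
imageView.image = rounded;
```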