This is the 12th day of my participation in the November Gwen Challenge. Check out the event details: The Last Gwen Challenge of 2021.
Images are an indispensable resource in almost every iOS app, and improper image handling is an easy way to end up with poor performance. Don't let images become a performance killer for your app.
Between being loaded from disk and displayed on the screen, an image goes through a series of complex steps, one of which is decoding.
The image display process
We usually display an image with code like this:
```objectivec
UIImage *image = [UIImage imageNamed:@"icon"];
self.imageView.image = image;
```
Two simple lines of code actually contain the following steps:
- Image I/O loads the compressed image data from disk; at this point the image has not yet been decoded.
- The image is assigned to the imageView’s image property.
- An implicit CATransaction then captures the layer tree changes
- When the next iteration of the main run loop arrives, Core Animation commits this implicit transaction, which copies the image and, depending on the image, may involve some or all of the following steps:
1. Allocate memory buffers to manage the file I/O and decompression operations.
2. Read the file data from disk into memory.
3. Decode the compressed image data into uncompressed bitmap data; this is usually a very frequent and time-consuming CPU operation.
4. Core Animation renders the uncompressed bitmap data to the layer.
As these steps show, decoding is the main source of the time cost. Setting a few images this way is of course no problem, but when a TableView scrolls through a large number of images, decoding each one as it is displayed will inevitably make the interface stutter.
Why images need to be decoded
In fact, the JPEG and PNG formats we use are compressed image formats: PNG is lossless and supports an alpha channel, whereas JPEG is lossy and lets you specify a compression ratio. Here is how to obtain image data in these formats on iOS:
```objectivec
UIKIT_EXTERN NSData * __nullable UIImagePNGRepresentation(UIImage * __nonnull image);  // return image as PNG. May return nil if image has no CGImageRef or invalid bitmap format
UIKIT_EXTERN NSData * __nullable UIImageJPEGRepresentation(UIImage * __nonnull image, CGFloat compressionQuality);  // return image as JPEG. May return nil if image has no CGImageRef or invalid bitmap format. compression is 0(most)..1(least)
```
Next we’ll look at bitmaps:
A bitmap image (or sampled image) is an array of pixels (or samples). Each pixel represents a single point in the image. JPEG, TIFF, and PNG graphics files are examples of bitmap images.
In fact, a bitmap is an array of pixels; each pixel represents a single point in the image, and each pixel is described by the following properties:
1. Bits per component: the number of bits used by each individual color component of a pixel;
2. Bits per pixel: the total number of bits used by one pixel;
3. Bytes per row: the number of bytes used by each row of the bitmap.
I won’t go into more detail here; see the pixel format documentation if you are interested.
We know that the stored image data is compressed. We can obtain the raw, decompressed bitmap data with:
```objectivec
UIImage *image = [UIImage imageNamed:@"icon"];
CFDataRef rawData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
```
and check the raw data’s size. In general, that size is calculated as:

raw data size = pixel width × pixel height × 4 (bytes per pixel)
Thus the size of the image file is not the same as its size after decoding, and only the decoded raw data can be used to display the image correctly.
The right way to decode
We have seen that displaying an image decompresses it on the main thread before it is rendered to the screen. To avoid this, we can perform the decoding on a background thread: redraw the image there to obtain an already-decoded copy, and then render that to the screen.
Here is the code:
```objectivec
- (void)decodeImage:(UIImage *)image completion:(void (^)(UIImage *image))completion {
    if (!image) return;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        CGImageRef imageRef = image.CGImage;
        size_t width = CGImageGetWidth(imageRef);
        size_t height = CGImageGetHeight(imageRef);
        if (width == 0 || height == 0) return;

        // Determine whether the image contains an alpha channel
        CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef) & kCGBitmapAlphaInfoMask;
        BOOL hasAlpha = NO;
        if (alphaInfo == kCGImageAlphaPremultipliedLast ||
            alphaInfo == kCGImageAlphaPremultipliedFirst ||
            alphaInfo == kCGImageAlphaLast ||
            alphaInfo == kCGImageAlphaFirst) {
            hasAlpha = YES;
        }

        // iOS is little-endian and macOS is big-endian. For compatibility we use
        // kCGBitmapByteOrder32Host, a 32-bit byte order that expands to the right
        // value on each platform:
        //
        // #ifdef __BIG_ENDIAN__
        //     #define kCGBitmapByteOrder16Host kCGBitmapByteOrder16Big
        //     #define kCGBitmapByteOrder32Host kCGBitmapByteOrder32Big
        // #else  // Little endian.
        //     #define kCGBitmapByteOrder16Host kCGBitmapByteOrder16Little
        //     #define kCGBitmapByteOrder32Host kCGBitmapByteOrder32Little
        // #endif
        CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Host;
        // If the image has an alpha channel, use premultiplied ARGB;
        // otherwise use RGB and skip the alpha byte.
        bitmapInfo |= hasAlpha ? kCGImageAlphaPremultipliedFirst : kCGImageAlphaNoneSkipFirst;

        // Create a bitmap context and draw the image into it, which forces decoding
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace, bitmapInfo);
        CGColorSpaceRelease(colorSpace);
        if (!context) return;
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);

        // Create a new bitmap image from the decompressed data
        CGImageRef newImage = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        UIImage *originImage = [UIImage imageWithCGImage:newImage
                                                   scale:[UIScreen mainScreen].scale
                                             orientation:image.imageOrientation];
        CGImageRelease(newImage);

        // Deliver the decoded image on the main thread
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(originImage);
        });
    });
}
```
Performance comparison
Image size | Render time, undecoded (ms) | Render time, decoded (ms) |
---|---|---|
128×96.jpg | 0.99 | 0.08 |
128×96.png | 0.80 | 0.08 |
256×192.jpg | 3.01 | 0.14 |
256×192.png | 2.30 | 0.17 |
512×384.jpg | 4.83 | 0.28 |
512×384.png | 6.03 | 0.28 |
1024×768.jpg | 13.83 | 1.43 |
1024×768.png | 18.31 | 1.12 |
2048×1536.jpg | 31.72 | 3.99 |
2048×1536.png | 75.05 | 5.16 |
The comparison shows that decoded images render far faster than undecoded ones, and since the decoding is carried out on a background thread, the main thread’s UI no longer stutters.