Session 202 – What’s New in Cocoa Touch

  • Performance optimizations for existing UIKit features
    • Data prefetching for scrolling
    • Reduced memory overhead for image rendering
    • Improved time complexity of Auto Layout calculations
  • API tweaks to existing UIKit features
    • The framework’s older global variables, constants, and functions are nested into objects, making them look more like idiomatic Swift
    • The old encode/decode API is deprecated in favor of a new, more secure one
  • New UIKit features
    • Notification interactions, grouping, and settings
    • Dynamic stickers for Messages and responses to the iPhone X home-bar swipe
    • Automatic password generation and AutoFill
    • Siri Shortcuts

UIKit performance improvements


  • UIKit optimizes scrolling with “data preloading” and “CPU performance tuning,” both of which work for UITableView, UICollectionView, and even your own custom UIScrollView.
  • UIKit reduces the memory overhead of the image-rendering APIs.
  • UIKit reduces the cost of Auto Layout calculations.

The story behind “data preloading” is this:

Taking UITableView as an example, a complete cell-loading pass looks like this:

  1. In the UITableViewDataSource method tableView(_:cellForRowAt:), the engineer dequeues a cell instance from the reuse queue or initializes a new one directly
  2. The engineer fetches the data and hands it to the cell instance
  3. UIKit calls layoutSubviews to lay out the cell’s child views
  4. UIKit draws the cell and displays it

As long as all four steps finish within a single frame, the UITableView scrolls without dropping frames.
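As a concrete reference, here is a minimal sketch of steps 1 and 2 in code; the Item model and items array are hypothetical stand-ins for your own data:

import UIKit

struct Item { let title: String }   // hypothetical model type

class FeedViewController: UITableViewController {
    var items: [Item] = []

    override func tableView(_ tableView: UITableView,
                            numberOfRowsInSection section: Int) -> Int {
        return items.count
    }

    override func tableView(_ tableView: UITableView,
                            cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // Step 1: pull a cell from the reuse queue (or UIKit creates a new one).
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
        // Step 2: hand the data to the cell; this is the step whose cost
        // varies with your business logic.
        cell.textLabel?.text = items[indexPath.row].title
        // Steps 3 and 4 (layoutSubviews and drawing) happen inside UIKit.
        return cell
    }
}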

Of the four steps above, the cost of step 2 depends on the business logic at hand. Sometimes the data used to populate the cell has already been prepared and can be used directly at little cost. Other times the data comes from a database, a local file, or somewhere else, so there is an extra step to collect it, and that extra cost is what tends to cause dropped frames.

As the figure above shows, earlier implementations already prefetched data, and did so asynchronously. But because the fetch started at the same time as the cell load, the CPU had to prefetch data while also loading and rendering the cell. Doing both in the same frame could cause dropped frames.

This time, prefetching starts after the cell has finished rendering. This frees the CPU from data-preloading work at the start of the frame and lets the cell render as early as possible; the CPU then preloads the data asynchronously.

In this round of optimizations, UIKit provides a new protocol for data preloading. For UITableView it is UITableViewDataSourcePrefetching, which contains two methods:

func tableView(_ tableView: UITableView,
               prefetchRowsAt indexPaths: [IndexPath])
func tableView(_ tableView: UITableView,
               cancelPrefetchingForRowsAt indexPaths: [IndexPath])
- (void)tableView:(UITableView *)tableView
        prefetchRowsAtIndexPaths:(NSArray<NSIndexPath *> *)indexPaths;
- (void)tableView:(UITableView *)tableView
        cancelPrefetchingForRowsAtIndexPaths:(NSArray<NSIndexPath *> *)indexPaths;

Only prefetchRowsAt is required when adopting the protocol; cancelPrefetchingForRowsAt is optional and is called by the system, for example when the user scrolls away from rows whose data is still downloading. If your cells fetch their data over the network, combining resumable downloads with this protocol lets you handle cell loading very gracefully.
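Continuing the FeedViewController sketch above, a minimal adoption could look like this; the startDownload(for:) and cancelDownload(for:) helpers are hypothetical placeholders for your own loading code:

import UIKit

extension FeedViewController: UITableViewDataSourcePrefetching {
    func tableView(_ tableView: UITableView,
                   prefetchRowsAt indexPaths: [IndexPath]) {
        // Start loading data asynchronously before these rows scroll on screen.
        indexPaths.forEach { startDownload(for: $0) }
    }

    func tableView(_ tableView: UITableView,
                   cancelPrefetchingForRowsAt indexPaths: [IndexPath]) {
        // Optional: the user scrolled away, so cancel work that is no longer needed.
        indexPaths.forEach { cancelDownload(for: $0) }
    }
}

extension FeedViewController {
    // Hypothetical helpers standing in for your own loading code.
    func startDownload(for indexPath: IndexPath) { /* kick off an async fetch */ }
    func cancelDownload(for indexPath: IndexPath) { /* cancel the fetch */ }
}

Remember to set tableView.prefetchDataSource = self, for example in viewDidLoad, or the protocol methods will never be called.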


Here’s the story behind “CPU performance tuning”:

While doing continuous performance testing, Apple engineers found another situation: if the user had been looking at cells that were cheap to load and then suddenly scrolled to a cell that was expensive to load, UIKit could still drop frames. The system itself was not under load and no expensive background tasks were running, which ruled out CPU overload.

So why does the CPU stutter when there is no load on it? After a round of investigation, they found the cause in the CPU performance-scheduling logic, which went like this:

  1. While the user is looking at cells with a low load cost, which need little CPU performance, the CPU performance scheduler prefers to stay at a low performance level to save power.
  2. When the user suddenly reaches an expensive cell, the scheduler does not know an expensive cell is coming, so it keeps loading it in the low-performance state.
  3. By the time the scheduler realizes how much work the cell actually needs and switches to high performance, it is often too late, and there is a good chance of a stutter.

This tends to happen in feed-style apps. For example, the cells you have been looking at are all text, which is very cheap, and suddenly you hit a video cell, which is very expensive. The CPU scheduler does not raise the performance level until the cell is halfway loaded, so it is easy to stutter.

Once the cause was found, it could be optimized. Apple engineers reworked the process so that UIKit can pass information to the CPU performance scheduler. When UIKit is about to load the next cell, if that cell is expensive to load, UIKit tells the scheduler to raise performance ahead of time. Because CPU performance is already ramped up, the system can avoid dropped frames as much as possible.

In other words, the previous CPU scheduling situation looked like this:

After the optimization, CPU scheduling looks like this:

Because UIKit predicted that the next cell would be expensive, it notified the CPU performance scheduler in advance and performance rose immediately. Stutters are much less likely than before, when CPU performance ramped up slowly.


Memory optimizations for the image-rendering APIs

When the system is low on memory and your application needs a large amount of it, the system has to reclaim memory or force-quit background apps to free some up for you. The overhead of freeing that memory makes your app wait longer, which in turn hurts its UI performance.

To mitigate this, iOS 12 introduces the Automatic Backing Store. It mainly benefits drawing apps and has no effect elsewhere. In essence, it uses less data to represent each pixel’s color while keeping the color faithful.

As shown in the figure below, with 64-bit color the left and right images take the same amount of data, about 2.2 MB. Even though the right image contains only black and white, 64 bits are still used to store each pixel’s color value.

So Apple reasoned: if the image on the right is only black and white, why spend 64 bits per color? Eight is enough. Stored at 8 bits per pixel, the right image shrinks to about 275 KB, one-eighth of the data, saving memory:

So why not just use 1 bit, since it’s only black and white anyway? Because the standard color pipeline supports a minimum of 8 bits per channel; 1-bit color would not be supported by the display. The Automatic Backing Store also doesn’t help the image on the left, because it is not an image-compression algorithm: it simply stores each pixel’s color in 8 bits instead of 64 while the image is being drawn and rendered, and that can distort colors that actually need the extra depth. So the Automatic Backing Store only suits drawing scenarios, and it is enabled by default in the following three cases (see the sketch after this list):

  • UIView.draw()
  • UIGraphicsImageRenderer
  • UIGraphicsImageRendererFormat.Range
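A minimal sketch of one of these paths, drawing through UIGraphicsImageRenderer; the preferredRange format property is the iOS 12 addition, and the sizes and colors here are arbitrary examples:

import UIKit

let format = UIGraphicsImageRendererFormat()
format.preferredRange = .automatic   // let the system pick a suitable color range

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200),
                                       format: format)
let image = renderer.image { context in
    // A pure black-and-white drawing needs far fewer bits per pixel,
    // so the automatic backing store can choose a smaller format.
    UIColor.white.setFill()
    context.fill(CGRect(x: 0, y: 0, width: 200, height: 200))
    UIColor.black.setFill()
    context.fill(CGRect(x: 0, y: 0, width: 200, height: 100))
}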

For more information on the Automatic Backing Store, see Session 219 – Image and Graphics Best Practices


Optimization of Auto Layout calculation

In iOS 12, Auto Layout has been heavily optimized.

When a view contains N independent child views, the Auto Layout calculation time grows linearly with the number of child views. Thanks to Apple’s engineers, iOS 12 spends a little less time on the computation than iOS 11, as shown here:

When a view contains N child views whose layouts depend on one another, the Auto Layout calculation time in iOS 11 grows exponentially with the number of child views. After the iOS 12 optimization it grows linearly, which is a big win. (I always thought this was a linear-time task; how the iOS 11 engineers made it exponential is beyond me, and the engineer responsible has some explaining to do.) As shown below:

When N views are nested inside each other, the Auto Layout calculation time in iOS 11 likewise grows exponentially with the number of child views, and after the iOS 12 optimization it grows linearly. Again, this seems like it should always have been a linear-time task. The diagram below:

See Session 220 – High Performance Auto Layout for more information


API tweaks to existing UIKit features


  • A Swift UIKit
  • More secure encoding and decoding

A Swift UIKit

In Swift 4.2, some global types were moved into objects, including but not limited to:

Other global constants have also been moved into related objects, including but not limited to:

Other global functions have become methods on related objects, including but not limited to:
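A few representative renames from the Swift 4.2 SDK show the pattern (far from exhaustive):

import UIKit

// Global types nested into a parent type:
//   UIAlertControllerStyle.alert  ->  UIAlertController.Style.alert
//   NSAttributedStringKey.font    ->  NSAttributedString.Key.font
let style = UIAlertController.Style.alert
let key = NSAttributedString.Key.font

// Global constants moved into related objects:
//   Notification.Name.UIApplicationDidEnterBackground
//     -> UIApplication.didEnterBackgroundNotification
let name = UIApplication.didEnterBackgroundNotification

// Global functions replaced by methods:
//   UIEdgeInsetsInsetRect(rect, insets)  ->  rect.inset(by: insets)
let rect = CGRect(x: 0, y: 0, width: 100, height: 100)
let inset = rect.inset(by: UIEdgeInsets(top: 10, left: 10, bottom: 10, right: 10))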

As a result, UIKit does look more like idiomatic Swift, but updating to 4.2 produces a lot of compile errors.


More secure encoding and decoding

The speaker only mentioned here that iOS 12 provides a more secure encode/decode API and that the old API is being deprecated. For details, see Session 222 – Data You Can Trust
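As a sketch of the new pair of calls (both added in iOS 12), which replace the deprecated archivedData(withRootObject:) and unarchiveObject(with:):

import Foundation

let names: NSArray = ["Alice", "Bob"]
do {
    // Archiving now requires opting into secure coding explicitly.
    let data = try NSKeyedArchiver.archivedData(withRootObject: names,
                                                requiringSecureCoding: true)
    // Unarchiving now requires naming the expected class up front,
    // so a malicious archive can't smuggle in an unexpected type.
    let decoded = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSArray.self,
                                                         from: data)
    print(decoded ?? "decoding failed")
} catch {
    print("Secure coding error: \(error)")
}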


New UIKit features


  • Notification
  • Messages
  • Automatic Passwords and Security Code AutoFill
  • Siri Shortcuts

Notification API changes

  • Interaction

    • iOS 12 enhances interaction with notifications, allowing developers to customize most of the interface.
  • Grouping

    • iOS 12 groups notifications when presenting them, and provides APIs that let developers decide, to some extent, how they are grouped.
  • Settings

    • iOS 12 provides an interface that lets users turn an app’s notifications on or off directly (see the sketch after this list).
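A sketch of the grouping and settings points; the thread identifier string here is a hypothetical example:

import UserNotifications

// Grouping: notifications that share a threadIdentifier are grouped together.
let content = UNMutableNotificationContent()
content.title = "New message"
content.threadIdentifier = "chat-42"   // hypothetical conversation identifier

// Settings: request the option that surfaces a link to your in-app
// notification settings; when the user taps it, the system calls the
// UNUserNotificationCenterDelegate method userNotificationCenter(_:openSettingsFor:).
UNUserNotificationCenter.current().requestAuthorization(
    options: [.alert, .sound, .providesAppNotificationSettings]) { granted, error in
    // Handle the authorization result here.
}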

The speaker only skimmed over the above three points. For more, see:

Session 710 – What’s New in User Notifications

Session 711 – Using Grouped Notifications


Messages

In iOS 12, stickers from your Messages app can be brought into the camera, for example placed on recognized faces. To support this, add the following to your Info.plist; MSMessagesAppPresentationContextMedia refers to the camera:

<key>MSSupportedPresentationContexts</key>
<array>
    <string>MSMessagesAppPresentationContextMessages</string>
    <string>MSMessagesAppPresentationContextMedia</string>
</array>

You can also check which context your Messages extension is presented in using the following API:


var presentationContext: MSMessagesAppPresentationContext

enum MSMessagesAppPresentationContext : UInt {
    case messages
    case media
}
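For instance, a minimal sketch of reading the context inside an MSMessagesAppViewController subclass (the class name is a hypothetical example):

import Messages

class StickerViewController: MSMessagesAppViewController {
    override func willBecomeActive(with conversation: MSConversation) {
        super.willBecomeActive(with: conversation)
        if presentationContext == .media {
            // Running inside the camera; adapt the sticker UI accordingly.
        }
    }
}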

iOS 12 also adds a new interaction to Messages: swiping horizontally on the bottom home bar switches between apps. Your Messages app can now receive this event, and you can choose to override the system’s app-switching behavior or perform your own action.


Automatic Passwords and Security Code AutoFill

Previous versions of iOS already offered password AutoFill:

In iOS 12, engineers can also prompt users to save their passwords in the iCloud Keychain when changing passwords:

iOS 12 can also generate passwords for you, both during registration and when changing a password, as long as the developer tags the text field as a password field:

For the flow of entering a phone number, receiving a verification code, and typing it in, iOS 12 can surface the incoming code right on the keyboard, so the user no longer has to switch back and forth between apps:
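A sketch of the text-field hints involved; .newPassword and .oneTimeCode are both new in iOS 12, and the password-rules descriptor is a hypothetical example:

import UIKit

// Marking a field as a new-password field triggers automatic
// strong-password generation.
let passwordField = UITextField()
passwordField.textContentType = .newPassword
// Optionally describe your password rules so generated passwords fit them.
passwordField.passwordRules = UITextInputPasswordRules(
    descriptor: "minlength: 8; required: lower; required: digit;")

// Marking a field as a one-time-code field lets the QuickType bar
// offer the incoming SMS verification code.
let codeField = UITextField()
codeField.textContentType = .oneTimeCode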

Session 204 – Automatic Strong Passwords and Security Code AutoFill


Safe Area

The speaker went over the existing Safe Area insets without mentioning anything new. I hope you adopt Safe Area insets to adapt your UI to different devices as soon as possible.

Finally, the speaker pointed to Session 235 – UIKit: Apps for Every Size and Shape. According to the speaker, there may be new content in that session; take a look if you are interested.


Siri Shortcuts

Siri Shortcuts let your app respond to being invoked by Siri. This works by responding to Intents from each app; you can also define Intents for your own app. Each Intent has a type:

Siri finds the app from the Intent, parses the user’s instruction to determine what kind of action to take, and the developer responds accordingly.
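One common path is donating an NSUserActivity so Siri can suggest it; the activity type string below is a hypothetical example that would need to be listed under NSUserActivityTypes in Info.plist:

import UIKit

let activity = NSUserActivity(activityType: "com.example.order-coffee")
activity.title = "Order Coffee"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true              // new in iOS 12: exposes it to Siri
activity.suggestedInvocationPhrase = "Coffee time"   // phrase hint shown to the user

// Attaching it to the current view controller donates it to the system:
// viewController.userActivity = activity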

For details, see the following sessions:

Session 211 – Introduction to Siri Shortcuts

Session 214 – Building for Voice with Siri Shortcuts

Session 217 – Siri Shortcuts on the Siri Watch Face


WWDC 2018 – Session 202 – What’s New in Cocoa Touch