When you tap the screen, the whole system wakes up like a sleeping creature, sets off on an eventful journey, and finally falls silent again.

The full lifecycle of an iOS touch event, from generation to disposal, is roughly as follows:

System response stage

  1. When your finger touches the screen, the screen hardware senses the input event and sends it to the IOKit driver for processing.

I/O Kit is a collection of system frameworks, libraries, tools, and other resources for creating device drivers. It implements an object-oriented programming model based on a restricted subset of C++ (mainly inheritance and overloading), simplifying driver development. Related driver-development command-line tools include kextload/kextunload, kextstat, kextcache, iostat (displays kernel I/O statistics for terminals, disks, and CPU), ioalloccount, and GCC/GDB.

  1. The user-space IOKit.framework encapsulates the touch into an IOHIDEvent object and passes it to the SpringBoard.app process through a Mach port.

SpringBoard.app is the program that manages the home screen on iOS and iPadOS: it starts the window server, launches applications (programs that do this are called app launchers), and applies certain settings when the device boots. "SpringBoard" is sometimes used as shorthand for the home screen itself. It primarily handles events such as buttons (lock screen/mute, etc.), touches, the accelerometer, and the proximity sensor, then forwards them to the appropriate app via Mach-port interprocess communication. macOS uses Launchpad instead, which lets users launch applications by clicking icons in an iOS-like interface modeled on SpringBoard. Before Launchpad, users launched apps from the Dock, Finder, Spotlight, or the terminal. Launchpad does not take over the entire home screen; it behaves more like a Space (similar to Dashboard).

Desktop response stage

  1. The SpringBoard.app main-thread RunLoop wakes up when it receives the message from IOKit, and its Source1 callback __IOHIDEventSystemClientQueueCallback() is executed;
  2. The SpringBoard.app process checks whether a foreground application is running. If there is one, the event is forwarded directly to that application; if there is none (e.g., swiping between home-screen pages), a Source0 event callback is triggered on SpringBoard.app's main-thread RunLoop and the event is consumed internally by the desktop application;

APP response stage

  1. When the application launches, it starts the com.apple.uikit.eventfetch-thread thread, whose RunLoop registers a Source1 event used to receive the Mach-port Source1 messages sent by SpringBoard.app;
  2. When the com.apple.uikit.eventfetch-thread thread receives a Source1 message, it executes the __IOHIDEventSystemClientQueueCallback callback, sets the corresponding __handleEventQueue Source0 on the main RunLoop to the signalled = Yes state, and wakes the main thread's RunLoop; the main thread then calls __handleEventQueue to process the event queue, as shown in the figure below:

    The main-thread RunLoop's key event source related to touch events is as follows:

<CFRunLoopSource 0x600001608240 [0x7fff8062ce40]>{signalled = No, valid = Yes, order = -1, context = <CFRunLoopSource context>{version = 0, info = 0x6000018101a0, callout = __handleEventQueue (0x7fff48c64d04)}}

Its callback is __handleEventQueue and its type is Source0, so the main thread does not itself receive the Mach messages sent by the SpringBoard.app process;
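You can observe this source yourself. A minimal sketch (assuming it runs on the main thread, e.g. early in app launch): dumping the main RunLoop's description lists its sources, including the Source0 whose callout is __handleEventQueue.

```objectivec
// Sketch: print the main RunLoop's full description. The output includes
// every registered source; look for the Source0 with callout __handleEventQueue.
// (The addresses in the output differ per run.)
NSLog(@"%@", (__bridge id)CFRunLoopGetMain());
```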

The com.apple.uikit.eventfetch-thread RunLoop, by contrast, registers a Source1 (in DefaultMode/CommonMode) whose callback is __IOHIDEventSystemClientQueueCallback.

  1. Processing the event queue adds the touch event to UIApplication's event queue. Once the event is queued, UIApplication begins hit-testing to find the best responder. The process is as follows:

In outline, the event is passed down the view hierarchy, recursively asking each subview whether it can respond to the event. UIWindow inherits from UIView and can be treated as a view here. Among siblings, a subview added later has higher priority (for UIWindow, a window displayed later has higher priority). The specific process is as follows:

  • UIApplication passes the UIEvent to the window object UIWindow. If there are multiple windows at the same level, the one displayed later has higher priority, i.e., the topmost window is asked first; sibling views follow the same rule.
  • If a window cannot respond to the event, the event is passed to the next window; if the window can respond, it asks its subviews in turn whether they can respond to the event.
  • If a subview can respond, the query continues down through its own subviews until no deeper subview can respond; that view is then the best responder.

The UIView hitTest:withEvent: method determines whether a view can respond to the event. The specific rules are as follows:

  • If the current view cannot respond to the event, return nil;

    A view cannot respond to events in any of these states:

    • Interaction disabled: userInteractionEnabled = NO;
    • Hidden: hidden = YES. If the parent view is hidden, its subviews are hidden too, and a hidden view cannot receive events;
    • Transparent: alpha < 0.01. If a view's alpha is below 0.01, its subviews are effectively transparent as well and will not receive events.
  • If the current view can respond to the event but no subview can, the view returns itself as the event receiver for the current view hierarchy;

  • If the current view can respond to the event and a subview can also respond, the responder found in the subview hierarchy is returned.

The internal logic of hitTest:withEvent: is roughly as follows:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // The three states in which a view cannot respond to events
    if (self.userInteractionEnabled == NO || self.hidden == YES || self.alpha <= 0.01) return nil;
    // The touch point must lie within the view's own bounds
    if ([self pointInside:point withEvent:event] == NO) return nil;
    // Traverse subviews from front (added last) to back
    int count = (int)self.subviews.count;
    for (int i = count - 1; i >= 0; i--) {
        UIView *childView = self.subviews[i];
        // Convert the point into the subview's coordinate system
        CGPoint childP = [self convertPoint:point toView:childView];
        UIView *fitView = [childView hitTest:childP withEvent:event];
        if (fitView) {
            // A more suitable subview was found
            return fitView;
        }
    }
    // No subview can respond, so the view itself is the best responder
    return self;
}

The pointInside:withEvent: method determines whether the touch point lies within the view's own coordinate range; the default implementation returns YES if it does and NO otherwise. You can therefore redirect the flow of events by overriding UIView's hitTest:withEvent: and pointInside:withEvent: methods.
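As an illustration (a hedged sketch, not from the original article; the subclass name is hypothetical), overriding pointInside:withEvent: can enlarge a small button's touch area:

```objectivec
// Hypothetical UIButton subclass that expands its hit area by 10pt on each edge.
@interface EnlargedHitButton : UIButton
@end

@implementation EnlargedHitButton
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // A negative inset grows the rect outward, so touches slightly
    // outside the visual bounds still hit-test to this button.
    CGRect hitArea = CGRectInset(self.bounds, -10, -10);
    return CGRectContainsPoint(hitArea, point);
}
@end
```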

  1. After the best responder is found, UIApplication passes the event via sendEvent: to the UIWindow that owns it; UIWindow in turn passes the event, through the same sendEvent:, to the hit-tested view, i.e., the best responder. The process is as follows:

Response to events

// Touch began
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;
// Touch moved
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;
// Touch ended
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;
// Before the touch ends, a system event (such as an incoming phone call) interrupts the touch
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;

Each touch-handling method receives two parameters: the set of UITouch objects and the UIEvent object.
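As an example of using those two parameters (assumed code, not from the article; the subclass name is hypothetical), touchesMoved:withEvent: can read the UITouch's current and previous locations to drag a view under the finger:

```objectivec
// Hypothetical draggable view: follow the finger by applying the touch delta.
@interface DraggableView : UIView
@end

@implementation DraggableView
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Both points are expressed in the superview's coordinate system.
    CGPoint current  = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    // Move the view by the amount the finger moved since the last callback.
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}
@end
```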

The hit-tested view, as the best responder, has the highest priority and full control over the event: it can monopolize the event, or pass it along the response chain (event delivery). The specific options are as follows:

  • No interception (default)

    The event is automatically passed along the default response chain.

  • Intercept, and stop distributing the event

    Override touchesBegan:withEvent: to handle the event without calling the superclass's touchesBegan:withEvent:;

  • Intercept, and continue distributing the event

    Override touchesBegan:withEvent: to handle the event, then call the superclass's touchesBegan:withEvent: to pass the event along.
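The two interception strategies above can be sketched as follows (hypothetical subclass names; a hedged illustration, not the article's own code):

```objectivec
// Intercept and stop: handle the touch and do NOT call super,
// so the event does not continue along the responder chain.
@interface InterceptingView : UIView
@end

@implementation InterceptingView
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"handled here; chain stops");
    // Deliberately no [super touchesBegan:touches withEvent:event];
}
@end

// Intercept and forward: handle the touch, then call super so the
// nextResponder also receives the event.
@interface ForwardingView : UIView
@end

@implementation ForwardingView
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"handled here; chain continues");
    [super touchesBegan:touches withEvent:event];
}
@end
```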

Note that passing the event along the responder chain is different from the earlier traversal of the view hierarchy: during hit-testing the event was passed while looking for the best responder, whereas here it is passed from responder to responder as it is handled. Hit-testing is a "search"; the responder chain is a "response".

The response chain relationship is shown in the figure below: each UIResponder forwards unhandled events to its nextResponder, in the order UIView → UIViewController → UIWindow → UIApplication.

  • UIView If the view is the root view of a controller, its nextResponder is the controller object; otherwise, its nextResponder is its superview.

  • UIViewController If the controller's view is the root view of the window, its nextResponder is the window object; if the controller was presented by another controller, its nextResponder is the presenting view controller.

  • UIWindow Its nextResponder is the UIApplication object.

  • UIApplication If the app delegate is a UIResponder and is not a UIView, UIViewController, or the app object itself, the nextResponder of UIApplication is the app delegate.

Printing response chain objects can be done by:

- (void)printResponderChain {
    UIResponder *responder = self;
    printf("%s", [NSStringFromClass([responder class]) UTF8String]);
    while (responder.nextResponder) {
        responder = responder.nextResponder;
        printf(" --> %s", [NSStringFromClass([responder class]) UTF8String]);
    }
}
  1. If the view has a gesture recognizer attached, then because gesture recognizers have higher event priority than UIResponder objects, UIWindow delivers the event to the gesture recognizer first and then to the hit-tested view. Once the gesture recognizer successfully recognizes its gesture, UIApplication cancels the hit-tested view's response to the event and the view receives no further events; if the recognizer fails to recognize the gesture and the touch has not ended, events stop being delivered to the recognizer and go only to the hit-tested view.

    If the gesture recognizer's cancelsTouchesInView = NO (default YES), events continue to be delivered to the hit-tested view even after the recognizer successfully recognizes the gesture.

    If delaysTouchesBegan = YES (default NO), the gesture recognizer withholds events while recognition is in progress, i.e., touches are not sent to the hit-tested view until recognition has failed.

    If delaysTouchesEnded = NO (default YES), then when the touch ends the UIApplication object immediately delivers the ending UITouch event to the hit-tested view, calling touchesEnded:withEvent: to end the event response instead of briefly withholding it while the recognizer decides.
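The three properties discussed above can be configured like this (a sketch; the view and handler names are assumed):

```objectivec
// Hypothetical setup, e.g. inside viewDidLoad. onTap: is an assumed handler.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(onTap:)];
// Let the hit-tested view keep receiving touches after the tap is recognized.
tap.cancelsTouchesInView = NO;   // default is YES
// Withhold touchesBegan: from the view until recognition has failed.
tap.delaysTouchesBegan = YES;    // default is NO
// Deliver touchesEnded: immediately instead of briefly withholding it.
tap.delaysTouchesEnded = NO;     // default is YES
[someView addGestureRecognizer:tap];
```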

  2. If the view hierarchy contains UIControl objects (UIView subclasses such as UIButton, UISegmentedControl, or UISwitch), then when a UIControl tracks a touch event it sends an action message to the targets registered on it.

    Since UIControl inherits from UIView, it also has UIResponder's event handling, but it tracks touches through its own methods:

    - (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
    - (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
    - (void)endTrackingWithTouch:(nullable UITouch *)touch withEvent:(nullable UIEvent *)event;
    - (void)cancelTrackingWithEvent:(nullable UIEvent *)event;

    In fact, the tracking methods above are called from within UIControl's touch methods; for example, beginTrackingWithTouch:withEvent: is called inside touchesBegan:withEvent:. When UIControl tracks the event and determines that the interaction meets its response conditions, it triggers the target-action.

    When UIControl detects an interaction event it needs to handle, it calls sendAction:to:forEvent: to hand the target, action, and event objects to the UIApplication object; UIApplication then dispatches the action to the target via sendAction:to:from:forEvent:. You can therefore override this method to customize how target-action events are executed.

    If a gesture recognizer is attached to a UIControl and recognizes its gesture first, the control cannot respond with its target-action event;
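For example (a hedged sketch; the subclass name is hypothetical), overriding sendAction:to:from:forEvent: in a UIApplication subclass lets you log every target-action dispatch before letting it proceed:

```objectivec
// Hypothetical UIApplication subclass; install it by passing its class name
// as the third argument of UIApplicationMain() in main.m.
@interface LoggingApplication : UIApplication
@end

@implementation LoggingApplication
- (BOOL)sendAction:(SEL)action to:(id)target from:(id)sender forEvent:(UIEvent *)event {
    // Observe every target-action dispatch flowing through UIApplication.
    NSLog(@"action %@ -> target %@ (sender %@)",
          NSStringFromSelector(action), target, sender);
    // Forward to the default implementation so the action still fires.
    return [super sendAction:action to:target from:sender forEvent:event];
}
@end
```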

Reference

  1. Flow of iOS touch events
  2. iOS event handling mechanism and image rendering process
  3. iOS rendering: full rendering parsing
  4. iOS touch event family bucket
  5. Using Responders and the Responder Chain to Handle Events
  6. SpringBoard
  7. /System/Library/CoreServices/SpringBoard.app
  8. IOKit Fundamentals
  9. Complete guide to iOS RunLoop
  10. OS X and iOS Kernel Programming