1. How iOS events are generated

When it comes to events, it is natural to think of the RunLoop, because the RunLoop is the event-handling framework on iOS: the events we commonly encounter in development are also consumed through the RunLoop.

There are several types of events in iOS:

  1. Touch events, generated when a finger touches the screen
  2. Motion events, generated by the phone’s gyroscope and accelerometer
  3. Remote-control events, sent by remote-control hardware such as Bluetooth devices

Taking the iOS touch-event family as an example, the complete life cycle of an event is as follows:

  1. A finger touches the screen, and the touch is packaged into an IOHIDEvent object by the IOKit framework
  2. The system finds the foreground app, which has registered a Mach-port Source1 in its RunLoop to listen for such events
  3. The Source1 callback registered on that Mach port, __IOHIDEventSystemClientQueueCallback(), fires and adds the IOHIDEvent object to the application’s event-processing queue (EventQueue)
  4. When an event is waiting in the EventQueue, the Source0 callback __processEventQueue() is triggered; it wraps the queued event as a UIEvent and distributes it to find the appropriate responder
  5. The best responder object is found through hit-testing, by calling UIWindow’s hitTest:withEvent: method
  6. Once the hit-tested view is found, the UIEvent is passed along the responder chain and is eventually consumed or discarded by a responder
  7. Besides UIResponder, a UIEvent can also be captured by a UIGestureRecognizer or by the target-action mechanism, which raises the question of priority among the three

During step 5, while UIWindow’s hitTest:withEvent: executes, the UIEvent carries almost no key content.

Only after hit-testing completes and the best responder is found does the UIEvent object become fully populated.

The complete life cycle of a UIEvent in an iOS app is shown below:

The life cycle of a UIEvent involves the following key (and tricky) points:

  1. UITouch, UIEvent, UIResponder
  2. The hit-testing process
  3. The response to the event and its passing through the responder chain
  4. The priority among UIResponder, UIGestureRecognizer, and UIControl when handling events

1. UITouch, UIEvent, UIResponder

UITouch:

As the name suggests, a finger touching the screen creates a UITouch object. The touch object records some key information, including the following:

@interface UITouch : NSObject

@property(nonatomic, readonly) NSTimeInterval timestamp;
@property(nonatomic, readonly) UITouchPhase phase;
@property(nonatomic, readonly) NSUInteger tapCount;   // e.g. a quick double-tap
@property(nonatomic, readonly) UITouchType type API_AVAILABLE(ios(9.0));   // finger / pencil
@property(nullable, nonatomic, readonly, strong) UIWindow *window;
@property(nullable, nonatomic, readonly, strong) UIView *view;
@property(nullable, nonatomic, readonly, copy) NSArray<UIGestureRecognizer *> *gestureRecognizers;
// ...

@end

typedef NS_ENUM(NSInteger, UITouchPhase) {
    UITouchPhaseBegan,        // whenever a finger touches the surface.
    UITouchPhaseMoved,        // whenever a finger moves on the surface.
    UITouchPhaseStationary,   // whenever a finger is touching the surface but hasn't moved since the previous event.
    UITouchPhaseEnded,        // whenever a finger leaves the surface.
    UITouchPhaseCancelled,    // whenever a touch doesn't end but we need to stop tracking (e.g. putting device to face)
    UITouchPhaseRegionEntered API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos),   // whenever a touch is entering the region of a user interface
    UITouchPhaseRegionMoved   API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos),   // when a touch is inside the region of a user interface, but hasn't yet made contact or left the region
    UITouchPhaseRegionExited  API_AVAILABLE(ios(13.4), tvos(13.4)) API_UNAVAILABLE(watchos),   // when a touch is exiting the region of a user interface
};

UIEvent:

The point of a touch is to produce an event, a UIEvent, which is delivered to a UIResponder for consumption.

In addition, a UIEvent can also be consumed by a UIGestureRecognizer or a UIControl.

UIResponder:

Every responder is a UIResponder object; the familiar UIView, UIViewController, UIApplication, and the app delegate are all UIResponder subclasses. They can respond to a UIEvent because they implement the four key methods of UIResponder:

// Generally, all responders which do custom touch handling should override all four of these methods.
// Your responder will receive either touchesEnded:withEvent: or touchesCancelled:withEvent: for each
// touch it is handling (those touches it received in touchesBegan:withEvent:).
// *** You must handle cancelled touches to ensure correct behavior in your application. Failure to
// do so is very likely to lead to incorrect behavior or crashes.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(nullable UIEvent *)event;

These methods describe how a UIResponder object responds when it receives a touch event.

You can also call the super implementation to pass the event along the responder chain.

2. Hit-testing process

When the app receives a touch event, the event is put into the application’s event-distribution queue, and hit-testing then finds the most suitable UIResponder to respond to it. The UIView found this way is called the hit-tested view.

The search proceeds from the bottom up: the UIWindow sits at the bottom of the hierarchy, and the search moves from it toward the views on top.

Each view recursively asks its subviews whether they can respond to the event; when there are multiple subviews, the traversal starts from subviews.lastObject and works backwards.

The key nodes of the judgment process are two methods, two methods of UIView:

- (nullable UIView *)hitTest:(CGPoint)point withEvent:(nullable UIEvent *)event;   // recursively calls -pointInside:withEvent:. point is in the receiver's coordinate system

- (BOOL)pointInside:(CGPoint)point withEvent:(nullable UIEvent *)event;   // default returns YES if point is in bounds

The concrete implementation is as follows:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // A view that disallows interaction, is hidden, or is (nearly) transparent cannot respond to events
    if (self.userInteractionEnabled == NO || self.hidden == YES || self.alpha <= 0.01) {
        return nil;
    }
    // If the point is not inside this view, neither it nor its subviews can respond
    if ([self pointInside:point withEvent:event] == NO) {
        return nil;
    }
    // Traverse the subview array from back to front
    int count = (int)self.subviews.count;
    for (int i = count - 1; i >= 0; i--) {
        UIView *childView = self.subviews[i];
        // Convert the point into the subview's coordinate system
        CGPoint childP = [self convertPoint:point toView:childView];
        UIView *fitView = [childView hitTest:childP withEvent:event];
        if (fitView) {
            // The best view lives somewhere inside this subview
            return fitView;
        }
    }
    // No subview can respond, so this view itself is the hit-tested view
    return self;
}

For example, given the view hierarchy below, tap view E:

A
├── B
└── C
    ├── E
    └── F

So the order of hit-testing is:

A->C->F->E

If we override - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event in a UIView subclass, calling super recurses the hit-test into its subviews.

A typical scenario: a custom tab bar has a circular center button that bulges above the bar’s bounds. By overriding pointInside:withEvent: on the tab bar, a touch on the protruding part still enters the subview hit-test and can be handled:

// Override pointInside: in the custom TabBar
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Convert the touch point into the CircleButton's coordinate system
    CGPoint pointTemp = [self convertPoint:point toView:_CircleButton];
    // If the point falls on the CircleButton, report it as inside the TabBar
    if ([_CircleButton pointInside:pointTemp withEvent:event]) {
        return YES;
    }
    return [super pointInside:point withEvent:event];
}

(Source: silent encoder, https://juejin.cn/post/6954549838474117127)

UIEvent response and the UIResponder chain

Once the hit-tested view is found, the system calls UIApplication’s sendEvent: method, which calls UIWindow’s method of the same name, which in turn delivers the event to the hit-tested view:

UIApplication -> UIWindow -> hit-tested View
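To see this delivery path in practice, a common trick (and the likely origin of the GLApplication entries in the logs later in this article) is to subclass UIApplication and override sendEvent:. A minimal sketch, assuming the subclass name is passed as the third argument of UIApplicationMain() in main.m:

```objectivec
#import <UIKit/UIKit.h>

@interface GLApplication : UIApplication
@end

@implementation GLApplication

// Every UIEvent passes through here before UIWindow and the hit-tested view.
- (void)sendEvent:(UIEvent *)event {
    NSLog(@"-[GLApplication sendEvent:] type = %ld", (long)event.type);
    [super sendEvent:event];  // without this, no touch ever reaches the app's views
}

@end
```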

A question here: how does UIApplication know which UIWindow to use? Because during hit-testing, the key participants of the whole process are recorded in the UIEvent: the UITouch inside the UIEvent stores the UIWindow and UIView that were bound when the touch occurred.

In addition, only a UIResponder object can respond to a UIEvent; it handles touch events by implementing methods like these:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event{
    NSLog(@"%s",__func__);
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s",__func__);
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s",__func__);
    [super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s",__func__);
    [super touchesCancelled:touches withEvent:event];
}

The hit-tested view, as the best responder, has the highest response priority: it can either respond to the event itself or pass the event along the responder chain. This response phase runs top-down from the best responder. The earlier hit-testing phase is the search process; this phase is the response process.

A UIResponder receives the event through touchesBegan:withEvent:. Its response can take one of three forms:

  1. The default implementation calls [super touchesBegan:withEvent:], so the event continues along the responder chain
  2. Intercept and swallow the event: override the method without calling super
  3. Intercept and keep distributing: override the method, handle the event, and also call [super touchesBegan:withEvent:]
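The three options can be sketched as alternative implementations of touchesBegan:withEvent: in a UIView subclass (only one would be compiled at a time; this is an illustrative sketch, not code from the original project):

```objectivec
// Option 1 - default: forward the event up the responder chain
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
}

// Option 2 - swallow the event: handle it and do NOT call super
// - (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
//     NSLog(@"handled here; the chain stops");
// }

// Option 3 - handle the event AND keep it moving up the chain
// - (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
//     NSLog(@"handled here, then forwarded");
//     [super touchesBegan:touches withEvent:event];
// }
```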

UIResponder has a nextResponder property that returns the next responder; the responder chain is already determined by the time hit-testing finishes.

The responder chain is built as follows:

  1. UIView: if it is a view controller’s root view, the nextResponder is that view controller; otherwise it is the superview
  2. UIViewController:
    1. If it is the window’s rootViewController, the nextResponder is the UIWindow
    2. If the view controller was presented by another view controller, the nextResponder is the presenting view controller
    3. If the view controller is the child of another container (e.g. a UINavigationController), the nextResponder is the view in the parent view controller that contains its view
  3. UIWindow: the nextResponder is the UIApplication object
  4. UIApplication: the nextResponder is the app delegate, provided the delegate is a UIResponder that is not a view or view controller

Responder Chains:

CircleButton --> CustomeTabBar --> UIView --> UIViewController --> UIViewControllerWrapperView --> UINavigationTransitionView --> UILayoutContainerView --> UINavigationController --> UIWindow --> UIApplication --> AppDelegate
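A chain like the one above can be printed from any responder with a small helper (a sketch; logResponderChain is an illustrative name, not an API):

```objectivec
// Call from, say, a view's touchesBegan:withEvent: to dump its responder chain.
- (void)logResponderChain {
    UIResponder *responder = self;
    NSMutableString *chain = [NSMutableString stringWithFormat:@"%@", [responder class]];
    while ((responder = responder.nextResponder) != nil) {
        [chain appendFormat:@" --> %@", [responder class]];
    }
    NSLog(@"%@", chain);
}
```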

How UIGestureRecognizer relates to UIResponder

Gesture recognizers (UIGestureRecognizer) can also respond to touch events.

Gestures in iOS fall into two categories:

  1. Discrete gestures, e.g. UITapGestureRecognizer and UISwipeGestureRecognizer; their success/failure state transitions are:
    1. Recognized successfully: Possible -> Recognized
    2. Recognition failed: Possible -> Failed
  2. Continuous (persistent) gestures, e.g. UILongPressGestureRecognizer and UIPanGestureRecognizer; their state transitions are:
    1. Recognized successfully: Possible -> Began -> [Changed] -> Ended
    2. Recognition not completed: Possible -> Began -> [Changed] -> Cancelled
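For a continuous gesture, the action method fires once per state change, so it typically switches on the recognizer’s state. A sketch (handlePan: is an illustrative selector):

```objectivec
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    switch (pan.state) {
        case UIGestureRecognizerStateBegan:
            NSLog(@"pan began");
            break;
        case UIGestureRecognizerStateChanged:   // fires repeatedly while the finger moves
            NSLog(@"pan changed: %@", NSStringFromCGPoint([pan translationInView:pan.view]));
            break;
        case UIGestureRecognizerStateEnded:
            NSLog(@"pan ended");
            break;
        case UIGestureRecognizerStateCancelled:
        case UIGestureRecognizerStateFailed:
            NSLog(@"pan cancelled or failed");
            break;
        default:
            break;
    }
}
```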

Priority of UIResponder and UIGestureRecognizer

UIGestureRecognizer also has the same four touchesXXXX methods as UIResponder:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event;

A discrete gesture: TapGestureRecognizer

Add a TapGestureRecognizer to the parent view and tap the YellowView; the following log appears:

// Gesture on the parent view: tap the YellowView
2021-09-07 12:29:46.232375+0800 TouchEventLib-master[62247:2632529] -[GLWindow hitTest:withEvent:]
2021-09-07 12:29:46.232951+0800 TouchEventLib-master[62247:2632529] -[GLWindow hitTest:withEvent:]
2021-09-07 12:29:46.233875+0800 TouchEventLib-master[62247:2632529] -[GLApplication sendEvent:]
2021-09-07 12:29:46.234152+0800 TouchEventLib-master[62247:2632529] -[LXFTapGestureRecognizer touchesBegan:withEvent:]
2021-09-07 12:29:46.234515+0800 TouchEventLib-master[62247:2632529] -[YellowView touchesBegan:withEvent:]
2021-09-07 12:29:46.298401+0800 TouchEventLib-master[62247:2632529] -[GLApplication sendEvent:]
2021-09-07 12:29:46.298667+0800 TouchEventLib-master[62247:2632529] -[LXFTapGestureRecognizer touchesEnded:withEvent:]
2021-09-07 12:29:46.298903+0800 TouchEventLib-master[62247:2632529] View Taped
2021-09-07 12:29:46.299070+0800 TouchEventLib-master[62247:2632529] -[YellowView touchesCancelled:withEvent:]

Implement touchesBegan: in the custom YellowView and add a TapGestureRecognizer to the YellowView itself, then tap it; the following log appears:

2021-09-07 12:31:50.317633+0800 TouchEventLib-master[63563:2642415] -[GLWindow hitTest:withEvent:]
2021-09-07 12:31:50.318237+0800 TouchEventLib-master[63563:2642415] -[GLWindow hitTest:withEvent:]
2021-09-07 12:31:50.319273+0800 TouchEventLib-master[63563:2642415] -[GLApplication sendEvent:]
2021-09-07 12:31:50.319619+0800 TouchEventLib-master[63563:2642415] -[LXFTapGestureRecognizer touchesBegan:withEvent:]
2021-09-07 12:31:50.319939+0800 TouchEventLib-master[63563:2642415] -[YellowView touchesBegan:withEvent:]
2021-09-07 12:31:50.384460+0800 TouchEventLib-master[63563:2642415] -[GLApplication sendEvent:]
2021-09-07 12:31:50.384791+0800 TouchEventLib-master[63563:2642415] -[LXFTapGestureRecognizer touchesEnded:withEvent:]
2021-09-07 12:31:50.385104+0800 TouchEventLib-master[63563:2642415] View Taped
2021-09-07 12:31:50.385339+0800 TouchEventLib-master[63563:2642415] -[YellowView touchesCancelled:withEvent:]

The official document explains:

A window delivers touch events to a gesture recognizer before it delivers them to the hit-tested view attached to the gesture recognizer.

Generally, if a gesture recognizer analyzes the stream of touches in a multi-touch sequence and doesn’t recognize its gesture, the view receives the full complement of touches. 

If a gesture recognizer recognizes its gesture, the remaining touches for the view are cancelled. The usual sequence of actions in gesture recognition follows a path determined by default values of the cancelsTouchesInView, delaysTouchesBegan, delaysTouchesEnded properties.

Summary:

  1. A gesture recognizer on a view’s parent (or on the view itself) has higher event-response priority than the view’s own UIResponder methods
  2. UIWindow delivers the event to these gesture recognizers first, and then to the hit-tested view
  3. Once a gesture recognizer successfully recognizes its gesture, UIApplication cancels the hit-tested view’s response by sending it touchesCancelled:withEvent:

A continuous gesture: PanGestureRecognizer

Add a PanGestureRecognizer to the parent view and pan across the YellowView; the following log appears:

2021-09-07 12:42:18.759136+0800 TouchEventLib-master[67569:2666886] -[GLWindow hitTest:withEvent:]
2021-09-07 12:42:18.759402+0800 TouchEventLib-master[67569:2666886] -[GLWindow hitTest:withEvent:]
2021-09-07 12:42:18.759896+0800 TouchEventLib-master[67569:2666886] -[GLApplication sendEvent:]
2021-09-07 12:42:18.760113+0800 TouchEventLib-master[67569:2666886] -[LXPanGestureRecognizer touchesBegan:withEvent:]
2021-09-07 12:42:18.760293+0800 TouchEventLib-master[67569:2666886] -[YellowView touchesBegan:withEvent:]
2021-09-07 12:42:18.781836+0800 TouchEventLib-master[67569:2666886] -[GLApplication sendEvent:]
2021-09-07 12:42:18.782021+0800 TouchEventLib-master[67569:2666886] -[LXPanGestureRecognizer touchesMoved:withEvent:]
2021-09-07 12:42:18.782159+0800 TouchEventLib-master[67569:2666886] -[YellowView touchesMoved:withEvent:]
2021-09-07 12:42:18.805533+0800 TouchEventLib-master[67569:2666886] -[GLApplication sendEvent:]
2021-09-07 12:42:18.805856+0800 TouchEventLib-master[67569:2666886] -[LXPanGestureRecognizer touchesMoved:withEvent:]
2021-09-07 12:42:18.806108+0800 TouchEventLib-master[67569:2666886] -[YellowView touchesMoved:withEvent:]
2021-09-07 12:42:18.840644+0800 TouchEventLib-master[67569:2666886] -[GLApplication sendEvent:]
2021-09-07 12:42:18.840874+0800 TouchEventLib-master[67569:2666886] -[LXPanGestureRecognizer touchesMoved:withEvent:]
2021-09-07 12:42:18.841192+0800 TouchEventLib-master[67569:2666886] View Panned
2021-09-07 12:42:18.841367+0800 TouchEventLib-master[67569:2666886] -[YellowView touchesCancelled:withEvent:]
2021-09-07 12:42:18.861238+0800 TouchEventLib-master[67569:2666886] -[GLApplication sendEvent:]
2021-09-07 12:42:18.861484+0800 TouchEventLib-master[67569:2666886] -[LXPanGestureRecognizer touchesMoved:withEvent:]
2021-09-07 12:42:18.861646+0800 TouchEventLib-master[67569:2666886] View Panned

Here is what happens while the finger slides:

  1. As the finger starts sliding, the recognizer is still in the recognizing state; each movement generates events, which UIApplication sends first to the recognizer and then to the YellowView

  2. After a short distance the recognizer recognizes the gesture: its action fires, and UIApplication cancels the YellowView’s event response (touchesCancelled:); from then on only the recognizer responds to the events

  3. If instead the gesture were never recognized during the slide, the events would keep being delivered to the hit-tested view until the touch ends

Summary of the recognizer and UIResponder phases

Whenever a touch occurs or a touch’s state changes, the window delivers the event looking for a response:

  1. UIWindow first delivers the UIEvent bound to a UITouch to the gesture recognizers bound to that UITouch, and only then to the UITouch’s hit-tested view
  2. While a recognizer is still recognizing (each time the UITouch’s state changes), UIApplication keeps sending the event to the recognizer first and then to the hit-tested view
  3. If the recognizer successfully recognizes the gesture (a state change), it has UIApplication call the hit-tested view’s touchesCancelled:withEvent: and stop sending it subsequent events
  4. If the recognizer fails to recognize the gesture and the touch is not yet over, events stop being sent to the recognizer and are only sent to the hit-tested view
  5. If the recognizer fails to recognize the gesture and the touch has already ended, the end-state touch event is sent to the hit-tested view so that it stops responding

(Source: Lotheve, juejin.cn/post/684490…)

Key properties of UIGestureRecognizer

UIGestureRecognizer has three key properties worth setting:

// a UIGestureRecognizer receives touches hit-tested to its view and any of that view's subviews
@property(nullable, nonatomic, readonly) UIView *view;   // the view the gesture is attached to. set by adding the recognizer to a UIView using the addGestureRecognizer: method

@property(nonatomic) BOOL cancelsTouchesInView;   // default is YES. causes touchesCancelled:withEvent: or pressesCancelled:withEvent: to be sent to the view for all touches or presses recognized as part of this gesture immediately before the action method is called.

@property(nonatomic) BOOL delaysTouchesBegan;   // default is NO. causes all touch or press events to be delivered to the target view only after this gesture has failed recognition. set to YES to prevent views from processing any touches or presses that may be recognized as part of this gesture

@property(nonatomic) BOOL delaysTouchesEnded;   // default is YES. causes touchesEnded or pressesEnded events to be delivered to the target view only after this gesture has failed recognition. this ensures that a touch or press that is part of the gesture can be cancelled if the gesture is recognized

All three properties affect how the hit-tested view’s UIResponder methods behave while the recognizer handles events; see the header comments above.
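For example, to let the hit-tested view keep receiving its touches even when the gesture is recognized, the default cancellation can be switched off. A sketch (handleTap: is an illustrative selector):

```objectivec
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
tap.cancelsTouchesInView = NO;  // the view now gets touchesEnded: instead of touchesCancelled:
[self.view addGestureRecognizer:tap];
```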

UIControl’s connection to UIResponder

UIControl is another system-provided control type that handles events via the target-action mechanism. It inherits from UIView (and therefore from UIResponder); iOS controls such as UIButton, UISegmentedControl, and UISwitch are UIControl subclasses. When a UIControl tracks a touch event, it sends the configured action to the targets added to it.

Here are two things to understand:

  1. When and how the target-action fires
  2. The priority of touch events

The target-action mechanism

UIControl is a control that responds to events, but only when its interaction conditions are met, so it too must track the touch process. Unlike UIResponder and UIGestureRecognizer, which use the touches methods, UIControl has its own unique tracking methods:

// Called when a touch event enters the control’s bounds.
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
// Called when a touch event for the control updates.
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
// Called when a touch event associated with the control ends.
- (void)endTrackingWithTouch:(nullable UITouch *)touch withEvent:(nullable UIEvent *)event; // touch is sometimes nil if cancelTracking calls through to this.
// Tells the control to cancel tracking related to the specified event.
- (void)cancelTrackingWithEvent:(nullable UIEvent *)event;   // event may be nil if cancelled for non-event reasons, e.g. removed from window

These four methods closely mirror UIResponder’s touchesXXX methods.

That is because the system’s UIControl overrides all of the related UIResponder methods.

The tracking methods are called from inside the touches methods; for example, beginTrackingWithTouch:withEvent: is called inside touchesBegan:withEvent:.
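This relationship can be verified with a UIControl subclass that logs both families of callbacks (a sketch along the lines of the CLButton used in the demos later in this article):

```objectivec
@interface CLButton : UIButton
@end

@implementation CLButton

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);                       // logged first...
    [super touchesBegan:touches withEvent:event]; // ...then super calls beginTrackingWithTouch:
}

- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
    return [super beginTrackingWithTouch:touch withEvent:event];
}

- (void)endTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
    [super endTrackingWithTouch:touch withEvent:event];
}

- (void)cancelTrackingWithEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
    [super cancelTrackingWithEvent:event];
}

@end
```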

When the tracking detects the specified control event, the target-action response fires as follows:

  1. UIControl caches the target and action passed to addTarget:action:forControlEvents:
  2. When the control event fires, UIControl calls sendAction:to:forEvent: to notify UIApplication
  3. UIApplication calls sendAction:to:from:forEvent:, which invokes the action on the target

Therefore, to implement app-wide analytics (automatic event tracking), you can hook UIApplication’s sendAction:to:from:forEvent: method to monitor every UIControl tap.
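A minimal sketch of such a hook, using method swizzling in a category (the gl_ prefix and the category name are illustrative, not part of any framework):

```objectivec
#import <objc/runtime.h>
#import <UIKit/UIKit.h>

@implementation UIApplication (GLActionTracking)

+ (void)load {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        Method original = class_getInstanceMethod(self, @selector(sendAction:to:from:forEvent:));
        Method swizzled = class_getInstanceMethod(self, @selector(gl_sendAction:to:from:forEvent:));
        method_exchangeImplementations(original, swizzled);
    });
}

- (BOOL)gl_sendAction:(SEL)action to:(id)target from:(id)sender forEvent:(UIEvent *)event {
    NSLog(@"action %@ -> %@ (sender %@)", NSStringFromSelector(action), target, [sender class]);
    // Calls the original implementation, because the IMPs were exchanged.
    return [self gl_sendAction:action to:target from:sender forEvent:event];
}

@end
```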

Priority of UIControl and UIGestureRecognizer

The official document explains:

In iOS 6.0 and later, default control actions prevent overlapping gesture recognizer behavior. For example, the default action for a button is a single tap. If you have a single tap gesture recognizer attached to a button's parent view, and the user taps the button, then the button's action method receives the touch event instead of the gesture recognizer.

Summary:

  1. A UIControl blocks a gesture recognizer attached to its parent view
  2. A gesture recognizer attached to the UIControl itself has higher priority than the control’s target-action

There are two demos in juejin.cn/post/684490…:

// Scenario: a Button is added to the parent BlueView, with a target-action attached to the button.

// Example 1 -- add a TapGestureRecognizer to the parent BlueView
-[CLTapGestureRecognizer touchesBegan:withEvent:]
-[CLButton touchesBegan:withEvent:]
-[CLButton beginTrackingWithTouch:withEvent:]
-[CLTapGestureRecognizer touchesEnded:withEvent:]
after called state = 5   // UIGestureRecognizerStateFailed, gesture recognition fails
-[CLButton touchesEnded:withEvent:]
-[CLButton endTrackingWithTouch:withEvent:]
Button clicked

// Example 2 -- add a TapGestureRecognizer to the Button itself
-[CLTapGestureRecognizer touchesBegan:withEvent:]
-[CLButton touchesBegan:withEvent:]
-[CLButton beginTrackingWithTouch:withEvent:]
-[CLTapGestureRecognizer touchesEnded:withEvent:]
after called state = 3   // UIGestureRecognizerStateEnded, gesture recognized, action fires
-[CLButton touchesCancelled:withEvent:]
-[CLButton cancelTrackingWithEvent:]

From the behavior above:

  1. Important (whether the gesture is added to the parent or to the control itself): after tapping the Button, the UIEvent is still sent to the recognizer first and then passed to the hit-tested view
  2. UIControl’s touchesBegan: calls its beginTrackingWithTouch:withEvent: method
  3. In example 1, the UIControl blocks the parent view’s gesture recognizer: recognition fails with state = 5, i.e. UIGestureRecognizerStateFailed
  4. Once the UIControl has blocked the gesture, the remaining events reach touchesEnded:, which calls endTrackingWithTouch:withEvent:, and the control responds
  5. In example 2, the UIControl cannot block a gesture recognizer attached to itself, so the gesture is recognized and the control’s touchesCancelled: is called

A special case: gestures inside UIScrollView / UITableView

UIScrollView and UITableView may install special system gesture recognizers of their own, such as UIScrollViewDelayedTouchesBeganGestureRecognizer, which delays the delivery of touchesBegan: to the content views.
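The public knobs for this behavior are UIScrollView’s delaysContentTouches property and its touchesShouldCancelInContentView: method. A sketch of a subclass that delivers touches to embedded controls immediately while still letting a drag cancel them (GLScrollView is an illustrative name):

```objectivec
@interface GLScrollView : UIScrollView
@end

@implementation GLScrollView

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.delaysContentTouches = NO;  // don't hold touchesBegan: back from subviews
    }
    return self;
}

// Return YES so scrolling can still cancel a touch that landed on a UIControl.
- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    if ([view isKindOfClass:[UIControl class]]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end
```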

reference

smnh.me/hit-testing…

juejin.cn/post/684490…

www.jianshu.com/p/c294d1bd9…