Responder chains and event delivery

Events

How are events generated and transmitted?
  1. When a touch occurs, the system adds the event to an event queue managed by UIApplication. UIApplication takes the frontmost event from the queue and dispatches it for processing, usually first to the application’s main window (keyWindow).
  2. The event object is then passed down the view hierarchy, from parent to child.
  3. Once the most appropriate control is found, its touches… methods (touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent:) are called. If one of these methods calls [super touches…], the event moves up the responder chain to the next responder, whose touches… methods are called in turn.
  4. If no suitable control is found to handle the event, the event is passed back to the window; if the window does not handle it, it is passed to UIApplication, and if UIApplication cannot handle it either, the event is discarded.
  • How do I find the best control to handle events?
    1. Can the view receive touch events at all? A UIView does not receive touch events (or any user interaction) in three cases:
      • userInteractionEnabled = NO;
      • hidden = YES;
      • alpha <= 0.01 (effectively transparent);
    2. Is the touch point inside the view? Every view has a pointInside:withEvent: method; if it returns NO, the touch is not inside the view and its subviews are not traversed. If it returns YES, the touch is inside the view and its subviews are traversed.
    3. Traverse the subviews from back to front (front-most subview first), repeating the previous two steps on each.
    4. If no subview matches the criteria, the view itself is the most appropriate one to handle the event, and its touchesBegan:, touchesMoved:, and touchesEnded: methods are called.
Event passing example

Touch events are passed from parent control to child control. If the parent control cannot receive touch events, then the child control cannot receive touch events.

  1. Tap the green view: UIApplication -> UIWindow -> white -> green
  2. Tap the blue view: UIApplication -> UIWindow -> white -> orange -> blue
  3. Tap the yellow view: UIApplication -> UIWindow -> white -> orange -> blue -> yellow
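The parent-blocks-child rule above can be sketched as follows (a minimal illustration; the white/green view names are taken from the example, everything else is assumed):

```objc
// Hypothetical hierarchy for illustration: a white parent view
// containing a green child view, as in the example above.
UIView *white = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
UIView *green = [[UIView alloc] initWithFrame:CGRectMake(50, 50, 100, 100)];
[white addSubview:green];

// Hit-testing never descends into a view whose userInteractionEnabled
// is NO, so the green child can no longer receive touches once its
// parent opts out:
white.userInteractionEnabled = NO;
// A tap inside green's frame is now handled by white's superview
// (or discarded) instead of by green.
```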

Event response

Concepts
  • Responder objects: not all objects in iOS can handle events; only objects that inherit from UIResponder can receive and handle events. These are called “responder objects.” UIApplication, UIViewController, and UIView all inherit from UIResponder, so they are all responder objects that can receive and handle events.

  • What is a responder? Any object that inherits from UIResponder is a responder.

  • What is the next responder? If the current view is the controller’s view, the next responder is the controller; if the current view is not a controller’s view, the next responder is its superview.

  • What is a responder chain? It is an event-processing mechanism: a hierarchy of responder objects linked together so that events can be passed along them. Using the responder chain, multiple responders can respond to the same event by calling the super implementation of the touches… methods.

  • The responder chain enables multiple controls to handle the same touch event. After the most appropriate control handles the event, it can call [super touchesBegan:withEvent:] to pass the event to the next responder, which can then process it as well.
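As a minimal sketch of these definitions (the category and method names are assumptions for illustration), a view's owning controller can be found by walking nextResponder up the chain:

```objc
#import <UIKit/UIKit.h>

// Hypothetical category for illustration.
@interface UIView (Responder)
- (UIViewController *)owningViewController;
@end

@implementation UIView (Responder)

// Walks the responder chain upward until it meets a UIViewController,
// which by the rules above is the next responder of the controller's view.
- (UIViewController *)owningViewController {
    UIResponder *responder = self.nextResponder;
    while (responder != nil) {
        if ([responder isKindOfClass:[UIViewController class]]) {
            return (UIViewController *)responder;
        }
        responder = responder.nextResponder; // superview, controller, window, application…
    }
    return nil; // the view is not installed under any controller
}

@end
```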

The event passing process of the responder chain

If the view’s controller exists, the event is passed to the controller; if not, it is passed to the view’s superview. At the top of the view hierarchy, if the event or message still cannot be processed, it is passed to the window object. If the window object does not process it either, it is passed to the UIApplication object, and if UIApplication also cannot handle it, the event is discarded.
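A minimal sketch of this passing process (the class name is an assumption): by calling the super implementation, a view handles a touch and still forwards it up the chain, so the next responder's touchesBegan: also runs.

```objc
#import <UIKit/UIKit.h>

// Hypothetical subview used for illustration.
@interface InnerView : UIView
@end

@implementation InnerView

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"InnerView handled the touch");
    // Forward the event to the next responder (superview or controller).
    // Without this call, propagation along the responder chain stops here.
    [super touchesBegan:touches withEvent:event];
}

@end
```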

Responder chain diagram

  1. The system checks whether the current view responds to the event. If it does, event delivery ends; if not, go to step 2.
  2. The system checks the controller of the currently touched view. If the controller responds, event delivery ends. If the view has no controller, or the controller does not respond to the event, go to step 3.
  3. The system checks the parent view, and then the parent view’s controller, and so on…
  4. Finally, if the top-level view and its controller do not respond either, the event is passed to the window.
Event response chain hitTest:withEvent: pseudocode
// When it is called: as soon as an event reaches a control —
// UIApplication -> [UIWindow hitTest:withEvent:] — to find the most appropriate view
// point: the point the finger is touching, in the method caller's coordinate system
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // 1. Can this view receive events at all?
    if (self.userInteractionEnabled == NO || self.hidden == YES || self.alpha <= 0.01) return nil;
    // 2. Is the touch point inside this view?
    if ([self pointInside:point withEvent:event] == NO) return nil;
    // 3. Traverse the subviews from back to front (front-most subview first)
    int count = (int)self.subviews.count;
    for (int i = count - 1; i >= 0; i--) {
        UIView *childView = self.subviews[i];
        // Convert the point into the subview's coordinate system
        CGPoint childP = [self convertPoint:point toView:childView];
        UIView *fitView = [childView hitTest:childP withEvent:event];
        if (fitView) {
            // A subview (or one of its descendants) is the most appropriate view
            return fitView;
        }
    }
    // 4. No subview qualifies, so this view itself is the most appropriate one
    return self;
}

Events in iOS

Events can be divided into three main types

  • Touch events
  • Accelerometer (motion) events:
    - (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event;
    - (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event;
    - (void)motionCancelled:(UIEventSubtype)motion withEvent:(UIEvent *)event;
  • Remote-control events:
    - (void)remoteControlReceivedWithEvent:(UIEvent *)event;

Touch events (UITouch)

The properties of UITouch
  • @property(nonatomic,readonly,retain) UIWindow *window; — the window in which the touch occurred
  • @property(nonatomic,readonly,retain) UIView *view; — the view in which the touch occurred
  • @property(nonatomic,readonly) NSUInteger tapCount; — the number of taps within a short period; use tapCount to distinguish single taps, double taps, or more
  • @property(nonatomic,readonly) NSTimeInterval timestamp; — the time, in seconds, when the touch occurred or last changed
  • @property(nonatomic,readonly) UITouchPhase phase; — the current phase of the touch
The methods of UITouch
  • - (CGPoint)locationInView:(UIView *)view; — returns the touch’s position in the given view’s coordinate system (with the origin (0, 0) at the view’s upper-left corner). If called with a nil view argument, it returns the touch’s position in the UIWindow.
  • - (CGPoint)previousLocationInView:(UIView *)view; — returns the touch’s previous position in the given view’s coordinate system.
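As a minimal sketch of how these two methods combine (the DraggableView subclass is an assumption for illustration), the difference between the current and previous locations gives a drag offset:

```objc
#import <UIKit/UIKit.h>

// Hypothetical draggable view for illustration.
@interface DraggableView : UIView
@end

@implementation DraggableView

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Current and previous positions in the superview's coordinate system
    CGPoint current  = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    // Move the view by the finger's offset since the last update
    CGPoint center = self.center;
    center.x += current.x - previous.x;
    center.y += current.y - previous.y;
    self.center = center;
}

@end
```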
UIEvent

UIEvent, called the event object, records the time and type of an event. Each time an event occurs, a UIEvent object is generated. UIEvent also provides a way to get the touch objects (UITouch) on a view. Common properties:

  • @property(nonatomic,readonly) UIEventType type; and @property(nonatomic,readonly) UIEventSubtype subtype; — the event type and subtype
  • @property(nonatomic,readonly) NSTimeInterval timestamp; — the time when the event occurred
The role of UITouch

It holds information about one finger, such as the touch’s position, time, and phase. When the finger moves, the system updates the same UITouch object so that it always holds the finger’s current touch information. When the finger leaves the screen, the system destroys the corresponding UITouch object. Tip: avoid relying on double-tap events in iPhone development!

The touch process

A complete touch process goes through three phases (with a possible fourth):

  • Touch began: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
  • Touch moved: - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
  • Touch ended: - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
  • Touch cancelled (may occur): - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event

Each of the four touch-handling methods takes two parameters: NSSet *touches and UIEvent *event. Detail: when the user touches the screen with a finger, a UITouch object is created and associated with that finger; one finger corresponds to one UITouch object.

  • In one complete touch process, only one event object is generated, and all four touch methods receive the same event parameter.
  • If two fingers touch a view at the same time, the view’s touchesBegan:withEvent: is called only once, and the touches argument contains two UITouch objects.
  • If the two fingers touch the same view one after the other, the view’s touchesBegan:withEvent: is called twice, and the touches argument contains only one UITouch object on each call.
  • Single touch versus multi-touch can therefore be judged by the number of UITouch objects in touches.
  • Hint: the elements of touches are all UITouch objects.
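The bullet points above can be sketched directly inside any UIResponder subclass (an illustrative fragment, not a complete class):

```objc
// Inside a UIView or UIViewController subclass (illustrative sketch).
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    if (touches.count == 1) {
        NSLog(@"single touch (or the first of several staggered touches)");
    } else {
        NSLog(@"multi-touch: %lu fingers landed simultaneously",
              (unsigned long)touches.count);
    }
    // Note: multipleTouchEnabled must be YES on the view for the set
    // to ever contain more than one UITouch.
}
```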
UITouch code examples
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    _imageView = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
    _imageView.image = [UIImage imageNamed:@"player2"];
    [self.view addSubview:_imageView];
    // userInteractionEnabled is a property inherited from UIView. It defaults to
    // YES for most views, but to NO for a few, including UIImageView.
    _imageView.userInteractionEnabled = YES;
    // 1. For a view to interact with the user, both its own userInteractionEnabled
    //    and its superview's must be YES.
    // 2. If part of a view lies outside its superview's bounds, that part cannot
    //    interact with the user.
    UIButton *btn = [UIButton buttonWithType:UIButtonTypeInfoLight];
    btn.frame = CGRectMake(20, 20, 50, 50);
    [_imageView addSubview:btn];
    [btn addTarget:self action:@selector(onBtnClick) forControlEvents:UIControlEventTouchUpInside];
    //btn.userInteractionEnabled = NO;
}

- (void)onBtnClick {
    NSLog(@"button was tapped");
}

// The touch methods are called automatically on the touched view or its view controller
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touch began");
    for (UITouch *touch in touches) {
        CGPoint p = [touch locationInView:self.view];
        _imageView.center = p;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touch moved");
    for (UITouch *touch in touches) {
        NSLog(@"touch.view: %@", touch.view);
        if (touch.view == _imageView) {
            NSLog(@"touching the image view");
            CGPoint p = [touch locationInView:self.view];
            _imageView.center = p;
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touch ended");
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
}
UIGestureRecognizer

Using UIGestureRecognizer, you can easily recognize common gestures the user makes on a view. UIGestureRecognizer is an abstract class that defines the basic behavior of all gestures; only its subclasses handle concrete gestures: UITapGestureRecognizer (tap), UIPinchGestureRecognizer (pinch, used for scaling), UIPanGestureRecognizer (pan/drag), UISwipeGestureRecognizer (swipe), UIRotationGestureRecognizer (rotation), and UILongPressGestureRecognizer (long press).

  • The state of gesture recognition
typedef NS_ENUM(NSInteger, UIGestureRecognizerState) {
    // No touch event has occurred; the default state of every gesture recognizer
    UIGestureRecognizerStatePossible,
    // The gesture has started but has not yet changed or completed
    UIGestureRecognizerStateBegan,
    // The gesture is changing
    UIGestureRecognizerStateChanged,
    // The gesture completed
    UIGestureRecognizerStateEnded,
    // The gesture was cancelled; the recognizer returns to the Possible state
    UIGestureRecognizerStateCancelled,
    // The gesture failed to be recognized; the recognizer returns to the Possible state
    UIGestureRecognizerStateFailed,
    // The gesture was recognized (an alias for Ended)
    UIGestureRecognizerStateRecognized = UIGestureRecognizerStateEnded
};
  • Example code
- (void)creatGestureRecognizer {
    // Tap gesture
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(onTap:)];
    //tap.numberOfTapsRequired = 2;    // taps required before it fires; the default is 1
    //tap.numberOfTouchesRequired = 2;
    // Add the gesture to the view
    [_view addGestureRecognizer:tap];

    // Pinch gesture (scaling)
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(onPinch:)];
    [_view addGestureRecognizer:pinch];
    // To have the system recognize pinch and rotation at the same time, set the delegate
    pinch.delegate = self;

    // Rotation gesture
    UIRotationGestureRecognizer *rot = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(onRotata:)];
    [_view addGestureRecognizer:rot];
    rot.delegate = self;

    // Swipe gestures
    UISwipeGestureRecognizer *swipeDown = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(onSwipe:)];
    swipeDown.direction = UISwipeGestureRecognizerDirectionDown; // swipe direction: down
    [_view addGestureRecognizer:swipeDown];

    UISwipeGestureRecognizer *swipeUp = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(onSwipe:)];
    swipeUp.direction = UISwipeGestureRecognizerDirectionUp; // swipe direction: up
    [_view addGestureRecognizer:swipeUp];

    // Pan (drag) gesture
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(onPan:)];
    [_view addGestureRecognizer:pan];

    // Long-press gesture
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(onLongPress:)];
    [_view addGestureRecognizer:longPress];
}

- (void)onTap:(UITapGestureRecognizer *)sender {
    NSLog(@"tap");
    // Get the tap location relative to the recognizer's view (sender.view)
    CGPoint p = [sender locationInView:sender.view];
    NSLog(@"%f,%f", p.x, p.y);
}

- (void)onPinch:(UIPinchGestureRecognizer *)sender {
    NSLog(@"pinch, %f", sender.scale);
    sender.view.transform = CGAffineTransformScale(sender.view.transform, sender.scale, sender.scale);
    // Use the current size as the new baseline (by default the baseline
    // is the size when the gesture started)
    sender.scale = 1;
}

- (void)onRotata:(UIRotationGestureRecognizer *)sender {
    NSLog(@"rotation gesture: %f", sender.rotation);
    sender.view.transform = CGAffineTransformRotate(sender.view.transform, sender.rotation);
    sender.rotation = 0;
}

- (void)onSwipe:(UISwipeGestureRecognizer *)sender {
    NSLog(@"swipe");
    CGPoint p = sender.view.center;
    if (sender.direction == UISwipeGestureRecognizerDirectionUp) {
        p.y -= 10;
    } else {
        p.y += 10;
    }
    sender.view.center = p;
}

- (void)onPan:(UIPanGestureRecognizer *)sender {
    // sender.state indicates the current phase of the gesture
    if (sender.state == UIGestureRecognizerStateBegan) {
        NSLog(@"gesture began");
    } else if (sender.state == UIGestureRecognizerStateChanged) {
        NSLog(@"gesture changed");
    } else if (sender.state == UIGestureRecognizerStateEnded) {
        NSLog(@"gesture ended");
    }
    CGPoint p = [sender locationInView:self.view];
    sender.view.center = p;
}

- (void)onLongPress:(UILongPressGestureRecognizer *)sender {
    NSLog(@"long press");
    if (sender.state == UIGestureRecognizerStateBegan) {
        NSLog(@"gesture began");
    } else if (sender.state == UIGestureRecognizerStateChanged) {
        NSLog(@"gesture changed");
    } else if (sender.state == UIGestureRecognizerStateEnded) {
        NSLog(@"gesture ended");
    }
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES; // allow any two gestures to be recognized simultaneously
}