About Event Distribution

When the user touches the screen or presses a button, the hardware driver is the first to know: it writes the corresponding event to the input device node, producing the most primitive kernel event. The input system then reads out that raw event, which becomes a KeyEvent or MotionEvent after layer upon layer of encapsulation. Finally, the input event is delivered to the appropriate target window to be consumed.

A set of events (a gesture): everything from the finger touching the screen to the finger leaving it.

Event distribution at the process level

Back in 2016, Gityuan wrote about the underlying source code for the Input system, detailing the Android Input event handling mechanism.

Input events are produced in the system process system_server and ultimately consumed in the app process:

InputManagerService

When the system process SystemServer starts, it launches its core and other services, including InputManagerService, which handles input events such as touch events, key presses, and peripheral (e.g. keyboard) input.

As you can see, the touch events we care about are only part of the many input events.

InputChannel

An InputChannel is the file descriptor used to send input events to a window in another process.

InputChannel creates a socket pair so that the two ends can communicate with each other.
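To make "a pair of connected, bidirectional endpoints" concrete, here is a rough single-process JVM analogy using two pipes. This is only an illustration of the idea; the real InputChannel uses a native Unix socketpair to cross the process boundary, and the names below (Endpoint, open, send, receive) are hypothetical:

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.charset.StandardCharsets;

// Sketch: a socketpair is a pair of connected, bidirectional endpoints.
// Two JVM pipes approximate the idea inside one process for illustration.
public class ChannelPairSketch {
    static class Endpoint {
        final PipedOutputStream out;
        final PipedInputStream in;
        Endpoint(PipedOutputStream out, PipedInputStream in) { this.out = out; this.in = in; }

        void send(String msg) throws IOException {
            out.write(msg.getBytes(StandardCharsets.UTF_8));
            out.flush();
        }

        String receive(int maxLen) throws IOException {
            byte[] buf = new byte[maxLen];
            int n = in.read(buf);
            return new String(buf, 0, n, StandardCharsets.UTF_8);
        }
    }

    // Create the two connected endpoints: the "server" side would live in
    // system_server, the "client" side in the app process.
    static Endpoint[] open() throws IOException {
        PipedOutputStream serverOut = new PipedOutputStream();
        PipedInputStream clientIn = new PipedInputStream(serverOut);
        PipedOutputStream clientOut = new PipedOutputStream();
        PipedInputStream serverIn = new PipedInputStream(clientOut);
        return new Endpoint[] { new Endpoint(serverOut, serverIn), new Endpoint(clientOut, clientIn) };
    }

    public static void main(String[] args) throws IOException {
        Endpoint[] pair = open();
        pair[0].send("MotionEvent");             // server writes an event
        System.out.println(pair[1].receive(64)); // client reads it
        pair[1].send("finished");                // client signals the event was handled
        System.out.println(pair[0].receive(64));
    }
}
```

The key property the sketch mirrors is that either end can both send and receive, which is how the app can report back that an event was consumed.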

During Activity launch, ActivityThread’s handleResumeActivity creates the ViewRootImpl and calls its setView method.

ViewRootImpl.setView creates the InputChannel, initializes the channel in the native layer, and registers a WindowInputEventReceiver to listen for events.

The InputChannel in ViewRootImpl is the client side of the socket, and the InputChannel held by the system service IMS (InputManagerService) is the server side.

Event distribution at the in-app View level

For views inside an app, event distribution starts in the Activity and, after a recursive descent through the view tree, the final handling result returns to the Activity.

Since this topic has been covered many times, a lengthy code walkthrough is unnecessary; for details, see Carson_Ho’s article on the Android event distribution mechanism. Here’s a summary of the Activity, ViewGroup, and View levels, as well as the overall workflow, in just a few images:

The Activity level

The ViewGroup level

The View level

Overall workflow

Event distribution in terms of specific event actions

When distributing touch events, different actions take different logical routes, which can be roughly divided into three kinds: ACTION_DOWN, ACTION_MOVE & ACTION_UP, and ACTION_CANCEL. Each kind follows its own route:

Dispatch routing in ViewGroup

Dispatch routing in View

Event Distribution Key Points

If a child doesn’t raise its hand on ACTION_DOWN, it gets no chance at this set of events

As the figure above shows, when distributing the current set of events, the ViewGroup searches for the TouchTarget (the child View that wants to handle this set of events) only on ACTION_DOWN. So if a child View doesn’t “raise its hand” on ACTION_DOWN, it never gets a chance to handle the rest of that set of events.
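The rule can be seen in a stripped-down simulation (plain Java, not the real ViewGroup; Parent, Child, and dispatchTouchEvent here are simplified stand-ins):

```java
import java.util.ArrayList;
import java.util.List;

// Simplified simulation: the parent searches for a touch target only on DOWN.
// A child that returns false for DOWN never sees the rest of the gesture.
public class DownDecides {
    enum Action { DOWN, MOVE, UP }

    interface Child { boolean onTouchEvent(Action a); }

    static class Parent {
        final List<Child> children = new ArrayList<>();
        Child touchTarget; // plays the role of mFirstTouchTarget

        boolean dispatchTouchEvent(Action a) {
            if (a == Action.DOWN) {
                touchTarget = null;
                for (Child c : children) {
                    if (c.onTouchEvent(a)) { touchTarget = c; return true; }
                }
                return false;
            }
            // MOVE/UP go straight to the target chosen on DOWN
            return touchTarget != null && touchTarget.onTouchEvent(a);
        }
    }

    public static void main(String[] args) {
        Parent p = new Parent();
        List<Action> seenByLazy = new ArrayList<>();
        // This child ignores DOWN, i.e. it "didn't raise its hand"
        p.children.add(a -> { seenByLazy.add(a); return false; });
        List<Action> seenByEager = new ArrayList<>();
        // This child consumes DOWN and becomes the touch target
        p.children.add(a -> { seenByEager.add(a); return true; });

        p.dispatchTouchEvent(Action.DOWN);
        p.dispatchTouchEvent(Action.MOVE);
        p.dispatchTouchEvent(Action.UP);

        System.out.println("lazy saw:  " + seenByLazy);  // [DOWN]
        System.out.println("eager saw: " + seenByEager); // [DOWN, MOVE, UP]
    }
}
```

Note the lazy child is asked only once, at DOWN; after that the parent never consults it again for this gesture.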

How ViewGroup finds the TouchTarget

When a set of events is distributed, the ViewGroup first looks for a TouchTarget; if it finds one, the set of events is handed to that TouchTarget. While searching, the ViewGroup filters down to the final child View through the following steps:

Two points to note here:

  • A child View added later receives the event first; that is, the higher a View sits in the z-order, the earlier it receives the event

  • Changing a child view’s z-axis offset affects the receiving order: the larger the z value, the earlier it receives the event
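The two ordering rules above can be sketched as a sort: highest z first, and among equal z, latest-added first. This is an approximation of the behavior described, not the real buildTouchDispatchChildList() implementation, and all names here are illustrative:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch of the traversal order when looking for a touch target:
// children are visited from the top of the z-order down.
public class ChildOrderSketch {
    static class ChildInfo {
        final String name; final int addIndex; final float z;
        ChildInfo(String name, int addIndex, float z) {
            this.name = name; this.addIndex = addIndex; this.z = z;
        }
    }

    static List<String> dispatchOrder(List<ChildInfo> children) {
        List<ChildInfo> sorted = new ArrayList<>(children);
        // Ascending by z, then by add order; then walk backwards so the
        // highest-z / latest-added child is asked first.
        sorted.sort(Comparator.comparingDouble((ChildInfo c) -> c.z)
                .thenComparingInt(c -> c.addIndex));
        List<String> order = new ArrayList<>();
        for (int i = sorted.size() - 1; i >= 0; i--) order.add(sorted.get(i).name);
        return order;
    }

    public static void main(String[] args) {
        List<ChildInfo> children = new ArrayList<>();
        children.add(new ChildInfo("first-added", 0, 0f));
        children.add(new ChildInfo("second-added", 1, 0f));
        children.add(new ChildInfo("raised-z", 2, 10f)); // e.g. setTranslationZ(10)
        System.out.println(dispatchOrder(children));
        // [raised-z, second-added, first-added]
    }
}
```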

About FLAG_DISALLOW_INTERCEPT

  1. When ACTION_DOWN arrives, this flag is reset, ensuring that the ViewGroup’s onInterceptTouchEvent can be called in every set of events
  2. While a child View is handling the current set of events, it can forbid the parent View from intercepting the remaining events of that set:

```java
// forbid interception
getParent().requestDisallowInterceptTouchEvent(true);
// allow interception again
getParent().requestDisallowInterceptTouchEvent(false);
```
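Both points can be modeled in a few lines of plain Java (a simulation of the flag’s lifecycle, not the real ViewGroup; interceptChecks is an illustrative counter):

```java
// Plain-Java simulation of FLAG_DISALLOW_INTERCEPT: the flag is reset on DOWN,
// and while a child has set it the parent skips onInterceptTouchEvent.
public class DisallowInterceptSketch {
    enum Action { DOWN, MOVE, UP }

    boolean disallowIntercept;  // stands in for FLAG_DISALLOW_INTERCEPT
    int interceptChecks;        // how many times onInterceptTouchEvent ran

    boolean onInterceptTouchEvent(Action a) { interceptChecks++; return false; }

    void requestDisallowInterceptTouchEvent(boolean disallow) { disallowIntercept = disallow; }

    void dispatchTouchEvent(Action a) {
        if (a == Action.DOWN) disallowIntercept = false; // reset for each new gesture
        if (a == Action.DOWN || !disallowIntercept) onInterceptTouchEvent(a);
        // ...child dispatch omitted; a child would call
        // getParent().requestDisallowInterceptTouchEvent(true) from inside here
    }

    public static void main(String[] args) {
        DisallowInterceptSketch g = new DisallowInterceptSketch();
        g.dispatchTouchEvent(Action.DOWN);           // intercept check runs
        g.requestDisallowInterceptTouchEvent(true);  // child forbids interception
        g.dispatchTouchEvent(Action.MOVE);           // skipped
        g.dispatchTouchEvent(Action.UP);             // skipped
        System.out.println(g.interceptChecks);       // 1
        g.dispatchTouchEvent(Action.DOWN);           // new gesture resets the flag
        System.out.println(g.interceptChecks);       // 2
    }
}
```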

ViewGroup’s super.dispatchTouchEvent()

As the earlier diagram shows, when no touchTarget is found or the ViewGroup decides to intercept, the ViewGroup considers whether to consume the event itself, acting as a plain View via super.dispatchTouchEvent().

View.canReceivePointerEvents

This method also plays a decisive role when the ViewGroup traverses its children looking for the TouchTarget.

```java
/**
 * @hide
 */
protected boolean canReceivePointerEvents() {
    // Eligible if visible, or if an animation is still attached
    return (mViewFlags & VISIBILITY_MASK) == VISIBLE || getAnimation() != null;
}
```

So remember to call View.clearAnimation() when an animation finishes; otherwise an invisible view can still be chosen as a touch target.
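A minimal plain-Java model of that rule (not the real View class; the fields here just stand in for visibility state and getAnimation()):

```java
// Minimal model: a view is a touch-target candidate if it is VISIBLE
// or still has an animation attached.
public class PointerEligibilitySketch {
    boolean visible;
    Object animation; // stands in for View.getAnimation()

    boolean canReceivePointerEvents() { return visible || animation != null; }

    public static void main(String[] args) {
        PointerEligibilitySketch v = new PointerEligibilitySketch();
        v.visible = false;
        v.animation = new Object();                      // animation still attached
        System.out.println(v.canReceivePointerEvents()); // true: invisible but animated
        v.animation = null;                              // the clearAnimation() step
        System.out.println(v.canReceivePointerEvents()); // false
    }
}
```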

Multi-touch processing

In some scenarios users interact with multiple fingers, so multi-touch behavior needs to be handled properly.

TouchTarget

Before we do our multi-touch analysis, let’s take a look at TouchTarget:

```java
private static final class TouchTarget {
    @UnsupportedAppUsage
    public View child;          // the child View consuming the events

    public int pointerIdBits;   // bitmask of the pointer ids this target tracks

    public TouchTarget next;    // next target in the linked list

    // ... some code omitted
}
```

A TouchTarget holds the child View consuming the events, the ids of the fingers that child is currently tracking (pointerIdBits), and a next pointer to the next TouchTarget.
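To make the pointerIdBits bookkeeping concrete, here is a tiny sketch of the bitmask idea (the helper method names are hypothetical, not the real TouchTarget API):

```java
// Sketch of how pointerIdBits works: each tracked finger's pointer id sets
// one bit, so a single int records every pointer a TouchTarget follows.
public class PointerIdBitsSketch {
    int pointerIdBits;

    void addPointer(int pointerId)    { pointerIdBits |= (1 << pointerId); }
    void removePointer(int pointerId) { pointerIdBits &= ~(1 << pointerId); }
    boolean tracks(int pointerId)     { return (pointerIdBits & (1 << pointerId)) != 0; }

    public static void main(String[] args) {
        PointerIdBitsSketch t = new PointerIdBitsSketch();
        t.addPointer(0);                  // first finger down on this child
        t.addPointer(2);                  // another finger down on the same child
        System.out.println(t.tracks(0));  // true
        System.out.println(t.tracks(1));  // false
        t.removePointer(0);               // first finger lifted
        System.out.println(t.tracks(0));  // false
        System.out.println(Integer.toBinaryString(t.pointerIdBits)); // 100
    }
}
```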

We know the top-level ViewGroup resets some state and also searches for a TouchTarget when a finger goes down. The reset happens only on an ACTION_DOWN event, but the TouchTarget search also runs when the touch event is ACTION_POINTER_DOWN.

ACTION_POINTER_DOWN means a finger went down while other fingers in the event sequence were already down (not yet lifted). So it can happen that the first finger is pressed on one Button while a second finger is pressed on another Button. When the second finger goes down, a second TouchTarget is found; its next points to mFirstTouchTarget, and then mFirstTouchTarget is updated to point at the new TouchTarget.
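That prepend-to-head behavior can be shown with a small linked-list sketch (a simulation mirroring the shape of addTouchTarget, with a String standing in for the View reference):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the mFirstTouchTarget linked list: a new target is prepended,
// so the most recently added target becomes the head.
public class TouchTargetListSketch {
    static class TouchTarget {
        final String child;  // stands in for the View reference
        TouchTarget next;
        TouchTarget(String child) { this.child = child; }
    }

    TouchTarget mFirstTouchTarget;

    void addTouchTarget(String child) {
        TouchTarget t = new TouchTarget(child);
        t.next = mFirstTouchTarget;  // point at the previous head
        mFirstTouchTarget = t;       // new target becomes the head
    }

    List<String> targets() {
        List<String> out = new ArrayList<>();
        for (TouchTarget t = mFirstTouchTarget; t != null; t = t.next) out.add(t.child);
        return out;
    }

    public static void main(String[] args) {
        TouchTargetListSketch g = new TouchTargetListSketch();
        g.addTouchTarget("buttonA"); // first finger: ACTION_DOWN on buttonA
        g.addTouchTarget("buttonB"); // second finger: ACTION_POINTER_DOWN on buttonB
        System.out.println(g.targets()); // [buttonB, buttonA]
    }
}
```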

If you use the Juejin app, try it: long-press the share button in the upper right corner to bring up the share panel, then long-press the article content area with another finger to bring up the menu.

MotionEvent.getActionMasked

MotionEvent is the main character in event distribution, and you need to understand it to properly handle touch events.

Check out GcsSloop’s MotionEvent article for details.

  • Getting the event type

    1. getAction(): fine for single-touch events

    2. getActionMasked(): multi-touch code must use this method to read the action

  • When tracking events, identify fingers by pointerId

```java
int index = event.getActionIndex();
int pointerId = event.getPointerId(index);
```

As multiple fingers press and lift, each finger’s pointerId stays fixed while its index can change; indices are reassigned so that they remain contiguous.
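The id-vs-index distinction can be simulated with a plain list (position = index, value = pointerId). This is an illustration of the contiguity rule only; it deliberately ignores details such as Android reusing freed pointer ids, and all method names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Simulation: pointer ids stay fixed per finger while indices are kept
// contiguous, so lifting a middle finger shifts the indices after it.
public class PointerIndexSketch {
    final List<Integer> pointers = new ArrayList<>(); // position = index, value = pointerId
    int nextId = 0;

    int fingerDown() { int id = nextId++; pointers.add(id); return id; }
    void fingerUp(int pointerId) { pointers.remove(Integer.valueOf(pointerId)); }
    int findPointerIndex(int pointerId) { return pointers.indexOf(pointerId); }

    public static void main(String[] args) {
        PointerIndexSketch e = new PointerIndexSketch();
        int a = e.fingerDown(); // id 0, index 0
        int b = e.fingerDown(); // id 1, index 1
        int c = e.fingerDown(); // id 2, index 2
        e.fingerUp(b);          // the middle finger lifts
        System.out.println(e.findPointerIndex(a)); // 0 (unchanged)
        System.out.println(e.findPointerIndex(c)); // 1 (index shifted, id still 2)
    }
}
```

This is why tracking code should store pointerIds and call findPointerIndex each time, never cache an index across events.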

Multi-finger press:

When lifting with multiple fingers:

Multi-touch processing solution

  • Relay

    Track only one finger at a time (e.g. always track the movement of the most recently placed finger).

  • Cooperative

    Track the movement of all touching fingers together to judge the user’s gesture (pinch-to-zoom, multi-finger pan, etc.).

  • Independent

    Track the movements of all touching fingers at the same time, but independently of one another (e.g. each finger draws its own stroke).
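As one example, the "relay" strategy can be sketched with a stack of active pointer ids: the newest finger drives the interaction, and when it lifts, control passes back to the previous one. The class and method names are illustrative, not real MotionEvent plumbing:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the relay strategy: only the most recent finger is tracked;
// when it lifts, tracking falls back to the finger placed before it.
public class RelayTrackerSketch {
    private final Deque<Integer> activePointers = new ArrayDeque<>();

    void onPointerDown(int pointerId) { activePointers.push(pointerId); }

    void onPointerUp(int pointerId) { activePointers.remove(Integer.valueOf(pointerId)); }

    // Only this pointer's coordinates would be used for, e.g., dragging
    Integer trackedPointer() { return activePointers.peek(); }

    public static void main(String[] args) {
        RelayTrackerSketch t = new RelayTrackerSketch();
        t.onPointerDown(0);
        System.out.println(t.trackedPointer()); // 0
        t.onPointerDown(1);                     // the newest finger takes over
        System.out.println(t.trackedPointer()); // 1
        t.onPointerUp(1);                       // relay back to finger 0
        System.out.println(t.trackedPointer()); // 0
    }
}
```

The cooperative and independent strategies differ only in the bookkeeping: cooperative code combines all active pointers into one gesture, while independent code keeps per-pointer state (e.g. one path per finger).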

ViewGroup.setMotionEventSplittingEnabled(false)

Calling this method disables event splitting for that ViewGroup: instead of splitting pointers across different children, all pointers are funneled to a single child. Let’s look at ViewGroup’s dispatchTouchEvent:

```java
public boolean dispatchTouchEvent(MotionEvent ev) {
    final boolean split = (mGroupFlags & FLAG_SPLIT_MOTION_EVENTS) != 0;
    // ... some code omitted
    if (actionMasked == MotionEvent.ACTION_DOWN
            // ACTION_POINTER_DOWN starts a TouchTarget search only when split is true
            || (split && actionMasked == MotionEvent.ACTION_POINTER_DOWN)
            || actionMasked == MotionEvent.ACTION_HOVER_MOVE) {
        // ... find TouchTarget (omitted)
    }
}
```
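The effect of the flag can be distilled into a few lines (a simulation of the condition above, not real ViewGroup code; the method name is hypothetical):

```java
// Simulation of the split flag's effect: with splitting disabled,
// ACTION_POINTER_DOWN does not start a new target search, so every finger
// is funneled to the target found on ACTION_DOWN.
public class SplitFlagSketch {
    enum Action { DOWN, POINTER_DOWN }

    boolean split = true; // ViewGroup's default

    String touchTargetForSecondFinger(String downTarget, String childUnderSecondFinger) {
        // DOWN already picked downTarget; now a second finger lands elsewhere
        Action a = Action.POINTER_DOWN;
        boolean searchForNewTarget = (a == Action.DOWN) || (split && a == Action.POINTER_DOWN);
        return searchForNewTarget ? childUnderSecondFinger : downTarget;
    }

    public static void main(String[] args) {
        SplitFlagSketch g = new SplitFlagSketch();
        System.out.println(g.touchTargetForSecondFinger("buttonA", "buttonB")); // buttonB
        g.split = false; // setMotionEventSplittingEnabled(false)
        System.out.println(g.touchTargetForSecondFinger("buttonA", "buttonB")); // buttonA
    }
}
```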

GestureDetector makes handling user touch events easier

Using a GestureDetector makes it much easier to process user gestures. For details, see Carson_Ho’s blog post Android GestureDetector.
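To give a feel for the kind of work GestureDetector hides, here is a tiny timestamp-based double-tap check. This is not the real GestureDetector API, just an illustration; the 300 ms constant mirrors ViewConfiguration's default double-tap timeout:

```java
// Illustration only: detect a double tap by comparing tap timestamps
// against a timeout, the core idea behind GestureDetector's onDoubleTap.
public class DoubleTapSketch {
    static final long DOUBLE_TAP_TIMEOUT_MS = 300;
    private long lastTapTime = -1; // -1 means no pending first tap

    /** Returns true if this tap completes a double tap. */
    boolean onTap(long eventTimeMs) {
        boolean isDoubleTap = lastTapTime >= 0
                && eventTimeMs - lastTapTime <= DOUBLE_TAP_TIMEOUT_MS;
        lastTapTime = isDoubleTap ? -1 : eventTimeMs; // a double tap consumes the pair
        return isDoubleTap;
    }

    public static void main(String[] args) {
        DoubleTapSketch d = new DoubleTapSketch();
        System.out.println(d.onTap(1000)); // false: first tap
        System.out.println(d.onTap(1200)); // true: within 300 ms
        System.out.println(d.onTap(2000)); // false: starts a new pair
    }
}
```

In real code, prefer the framework GestureDetector, which also handles slop distances, long presses, and flings for you.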