| commit | be5e6fd660c8723eb042ce0af247e990622b57f8 | [log] [tgz] |
|---|---|---|
| author | Sairah Amuthan <[email protected]> | Mon Jul 28 23:35:33 2025 |
| committer | Chromeos LUCI <[email protected]> | Mon Aug 11 16:37:12 2025 |
| tree | d1687e2cbb0728b3fa45bc80dfe77a68d2b33f28 | |
| parent | 96a898bb950ec4c6061091407c5e02396399ec45 [diff] |
Introduce Drag-and-Scroll gesture handling

Enable users to scroll with two fingers while a physical button is held down. This allows navigating scrollable content after initiating a click.

The interpreter logic is updated to check for multiple fingers on the touchpad during a button press. It validates that these fingers display a scrolling motion before recognizing the action as a Scroll gesture instead of a standard Move, using the two fastest-moving fingers to decide whether a scroll was performed.

A user can lift all scrolling fingers to initiate a Fling, mirroring normal scroll behavior. A user can also lift a single scrolling finger to revert the gesture back to a standard drag.

A new `drag_scroll_enable` property is added to control this feature, along with unit tests to cover the new interaction scenarios.

BUG=b:322199922
TEST=cros_sdk cros_workon_make --test --board=rex chromeos-base/gestures
TEST=touchtests
TEST=Manual testing

Change-Id: Id9a21fb8ce46b18c3e882ffdaffecc98bbf84e9b
Reviewed-on: https://chromium-review.googlesource.com/c/chromiumos/platform/gestures/+/6795666
Tested-by: Sairah Amuthan <[email protected]>
Commit-Queue: Sairah Amuthan <[email protected]>
Reviewed-by: Harry Cutts <[email protected]>
Code-Coverage: Zoss <[email protected]>
Reviewed-by: Henry Barnor <[email protected]>
The Gestures library takes input events from touchpads, mice, and multitouch mice, detects gestures, and applies quality improvements to the incoming data. It is used on ChromeOS and Android.
The Gestures library consumes hardware states, which describe the state of an input device at one instant in time. (These are represented by struct HardwareState.) For a touchpad, this is the set of touches currently down (including coordinates, dimensions, and orientations, if available) and the set of buttons currently pressed.
Hardware states flow up a stack of interpreter objects, each of which can modify them to improve data quality or report gestures that they have detected, which are represented by struct Gesture. Once a gesture is detected it passes back down the stack until it reaches the bottom, with each interpreter able to make modifications to it before it is reported to the user of the library.
The actual stacks used for different device types can be found in the relevant GestureInterpreter::Initialize... functions (e.g. InitializeTouchpad) in gestures.cc.
For example, a simplified touchpad stack might be made up of the following interpreters:
- ImmediateInterpreter (the core of the touchpad stack, which detects gestures)
- PalmClassifyingFilterInterpreter
- ScalingFilterInterpreter
- LoggingFilterInterpreter

When a new hardware state is passed in to the library, it will first be passed to LoggingFilterInterpreter, which will add it to its activity log, and then pass it on unmodified to ScalingFilterInterpreter. ScalingFilterInterpreter will scale the values in the hardware state to be in mm, rather than whatever units and resolution the touchpad reported them in. Next, PalmClassifyingFilterInterpreter will look for any touches that look like they could be palms, and mark them with a flag. Lastly, ImmediateInterpreter will detect gestures from the remaining touches.
Any gestures that ImmediateInterpreter detects will then pass back down the stack. ScalingFilterInterpreter may apply a scale factor to them, and LoggingFilterInterpreter will add them to its activity log. Finally, the Gesture struct will be passed to the user of the library via a callback.
In addition to hardware states, the library also takes a struct HardwareProperties, which describes the unchanging capabilities of an input device. For a touchpad, this includes the ranges and resolutions of its X, Y, and orientation axes, as well as the maximum number of touches it can report simultaneously. This information is accessible to all interpreters in the stack.
Each interpreter can declare configurable parameters using gesture properties. These can include everything from simple booleans controlling whether scrolling is reversed, to arrays describing acceleration curves to be applied to pointer movement.