The goal is to implement touch input on the input-overhaul branch, which will ship in 1.10. Preferably this would be done in an abstract way so that mouse input, touch input, and graphics tablets (and possibly other pointing devices) can be handled with the same user code.
This spec is only concerned with the low-level device/events API for now.
The implementation needs to satisfy these requirements (a rough sketch of the resulting pointer API follows the list):
Needs to support multiple touches at once
Needs to have an API for querying active pointers
Needs to generate events in a useful way
Events need to uniquely identify different fingers (for gesture detection)
Make it possible to handle clicks from any pointing device without special-casing each one
Distinguish types of pointer: mouse, stylus, finger, eraser, Kinect, Oculus Touch finger point, etc.
Expose pressure, size and tilt angle
Keep track of pointer velocity
Scrolling with fingers needs to be distinguishable from click-and-drag
Support tracking pens that are hovering over the surface
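To make these requirements concrete, here is a minimal sketch of what the queryable pointer state might look like. Everything here (PointerType, Pointer, getActivePointers) is a hypothetical illustration rather than a committed API; TypeScript is used only for brevity.

```typescript
// Hypothetical sketch only; names and field shapes are illustrative.
type PointerType = "mouse" | "finger" | "stylus" | "eraser" | "other";

interface Pointer {
  id: number;         // unique while the pointer is active; lets gesture code track one finger
  type: PointerType;  // distinguishes mouse, finger, stylus, eraser, ...
  x: number;          // position in window coordinates
  y: number;
  pressed: boolean;   // false for a hovering stylus or an idle mouse pointer
  pressure: number;   // 0..1 where the device supports it
  size: number;       // contact size; 0 if the device cannot report it
  tiltX: number;      // stylus tilt in degrees; 0 for other devices
  tiltY: number;
  vx: number;         // velocity, tracked by the implementation
  vy: number;
}

// Requirement: an API for querying all currently active pointers.
declare function getActivePointers(): Pointer[];
```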
These requirements are driven by concrete use cases (the thumbstick case is sketched after this list):
Clicking buttons with a finger / stylus
Dragging to pan the camera
Making a drag-and-drop interface
Scrolling through GUI panels with a finger (but not with mouse drag)
On-screen “thumbsticks” like in many mobile FPS games
Detecting multi-finger gestures
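As one illustration of why persistent identifiers matter, an on-screen thumbstick must keep following the finger that grabbed it even while other fingers touch the screen. A minimal sketch, reusing the hypothetical Pointer shape from above:

```typescript
// Hypothetical sketch: a virtual thumbstick that latches onto one finger.
class Thumbstick {
  private ownerId: number | null = null; // id of the finger controlling the stick
  dx = 0; dy = 0;                        // current stick deflection, -1..1

  constructor(private centerX: number, private centerY: number,
              private radius: number) {}

  onPointerPressed(p: Pointer): void {
    const within = Math.hypot(p.x - this.centerX, p.y - this.centerY) <= this.radius;
    if (this.ownerId === null && within) this.ownerId = p.id; // claim this finger
  }

  onPointerMoved(p: Pointer): void {
    if (p.id !== this.ownerId) return;   // ignore every other finger
    this.dx = (p.x - this.centerX) / this.radius;
    this.dy = (p.y - this.centerY) / this.radius;
  }

  onPointerReleased(p: Pointer): void {
    if (p.id === this.ownerId) { this.ownerId = null; this.dx = this.dy = 0; }
  }
}
```

The key point is that the stick latches onto one pointer id at press time and ignores every other pointer until that id is released, which only works if the id stays stable for the lifetime of the touch.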
We wish to create an abstraction that supports the following devices, each of which behaves differently (a sketch of how these differences might be modeled follows the list):
Mouse (incl. laptop trackpad)
Only one pointer
Always has a position whether pressed or not
Touchscreen
Arbitrary number of pointers
No buttons (or one “button”, in a sense)
Each touch only has a position while pressed
Pen / stylus (for digitizer / graphics tablet)
Usually only one
May report a position if hovering over surface
Can have buttons
VR laser pointer (for future consideration)
Can be two, enabling e.g. scaling gestures
Similar to mouse
Buttons are actually gamepad buttons
Virtual pointer
Controlled by the app
Useful for interacting with GUI elements when only relative-motion devices are available (e.g. joysticks, raw USB mice) or when the app implements its own pointing paradigm (e.g. OpenCV-based vision processing)
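One way the per-device differences above could be modeled is a device-kind tag plus a small set of capability flags. The names and the table below are assumptions for illustration, not part of any committed design:

```typescript
// Hypothetical sketch of per-device capabilities, not a committed API.
type DeviceKind = "mouse" | "touchscreen" | "stylus" | "vr-laser" | "virtual";

interface DeviceCaps {
  maxPointers: number;   // 1 for mouse, effectively unbounded for touch
  hasButtons: boolean;   // mouse/stylus yes, touch no
  reportsHover: boolean; // position available while not pressed
}

// Rough capability table for the devices listed above.
const CAPS: Record<DeviceKind, DeviceCaps> = {
  mouse:       { maxPointers: 1,        hasButtons: true,  reportsHover: true  },
  touchscreen: { maxPointers: Infinity, hasButtons: false, reportsHover: false },
  stylus:      { maxPointers: 1,        hasButtons: true,  reportsHover: true  },
  "vr-laser":  { maxPointers: 2,        hasButtons: true,  reportsHover: true  },
  virtual:     { maxPointers: 1,        hasButtons: true,  reportsHover: true  },
};
```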
In order to support all of these, we create the following abstractions:
Active pointers
This is the list of pointers that are making physical contact and that participate in gesture recognition. It includes all touch contacts, and includes the mouse pointer if and only if at least one mouse button is held.
When a pointer becomes “active” it is assigned an identifier that persists until it stops being “active”, so that gesture-handling code can track individual fingers (a sketch of such tracking follows).
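A minimal sketch of how an implementation might maintain the active-pointer list and hand out persistent identifiers, reusing the hypothetical Pointer shape from earlier (all names are illustrative):

```typescript
// Hypothetical sketch: maintaining the active-pointer list with persistent ids.
class ActivePointers {
  private nextId = 1;
  private byId = new Map<number, Pointer>();

  // Called when a finger touches down or a mouse button is first pressed.
  begin(p: Omit<Pointer, "id">): Pointer {
    const pointer: Pointer = { ...p, id: this.nextId++ }; // id persists until release
    this.byId.set(pointer.id, pointer);
    return pointer;
  }

  // Called on motion; the id stays stable so gesture code can follow one finger.
  update(id: number, x: number, y: number): void {
    const p = this.byId.get(id);
    if (p) { p.x = x; p.y = y; }
  }

  // Called on release; the id is retired and not reused within the session.
  end(id: number): void {
    this.byId.delete(id);
  }

  all(): Pointer[] {
    return [...this.byId.values()];
  }
}
```

Retiring identifiers on release and never reusing them within a session keeps gesture code from confusing a new touch with an old one.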