Maybe the gesture system (say gesture.c) should sit closer to the touchpad driver than to the button system: after all, a button press can be regarded as a gesture in its own right, and the gesture code should be able to differentiate between button presses (taps) and other gestures. If a tap is recognized, the coordinates of the tap can be passed to button_<target>.c.
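A minimal sketch of that idea in C (all names here, gesture_process(), button_post_tap(), struct touch_event and the two thresholds, are made up for illustration and are not existing Rockbox APIs): the layer consumes raw driver events, and a short press with little movement is classified as a tap and handed to the button code.

/* Hypothetical sketch only: none of these names exist in Rockbox;
 * they just illustrate where such a gesture layer could sit. */
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

enum touch_state { TOUCH_DOWN, TOUCH_MOVE, TOUCH_UP };

struct touch_event {
    enum touch_state state;
    int x, y;
    long tick;                 /* timestamp in system ticks */
};

#define TAP_MAX_TICKS 20       /* max press duration for a tap (assumption) */
#define TAP_MAX_MOVE   8       /* max finger movement for a tap (assumption) */

/* Stand-in for the hook into button_<target>.c. */
static void button_post_tap(int x, int y)
{
    printf("tap at (%d, %d) -> button layer\n", x, y);
}

static struct touch_event down_ev;
static bool touching = false;

/* Returns true if the event was consumed as (part of) a gesture. */
bool gesture_process(const struct touch_event *ev)
{
    switch (ev->state) {
    case TOUCH_DOWN:
        down_ev = *ev;         /* hold back until we know what this is */
        touching = true;
        return true;
    case TOUCH_MOVE:
        return touching;       /* swipe/drag detection would hook in here */
    case TOUCH_UP:
        if (touching
            && ev->tick - down_ev.tick <= TAP_MAX_TICKS
            && abs(ev->x - down_ev.x) <= TAP_MAX_MOVE
            && abs(ev->y - down_ev.y) <= TAP_MAX_MOVE) {
            /* short press with little movement: treat it as a tap */
            button_post_tap(down_ev.x, down_ev.y);
        }
        touching = false;
        return true;
    }
    return false;
}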
The gesture layer would "eat" all gesture-related touch events and pass only the remaining events on to the touch key layer. This way the gesture layer could be seamlessly integrated into the Rockbox structure and switched on/off, and no code change would be needed in the touch key layer. The only problem I see is that some key events might be delayed (if the gesture layer "waits" to see whether a consecutive event arrives or not...). One would have to try it in real life.
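To make the delay concern concrete, here is a hypothetical sketch of that "eat or pass through" filter (gesture_filter(), touchkey_post_event(), gesture_tick() and GESTURE_WINDOW_TICKS are all invented names, not Rockbox code): an event is buffered for a short window; if no follow-up arrives, it is flushed unchanged to the touch key layer, so that layer needs no changes.

/* Hypothetical sketch of a pass-through gesture filter with a
 * bounded hold-back window. All names are assumptions. */
#include <stdbool.h>
#include <stdio.h>

#define GESTURE_WINDOW_TICKS 15    /* how long to wait for a follow-up (assumption) */

struct touch_event { int x, y; long tick; };

/* Stand-in for the existing touch key layer entry point. */
static void touchkey_post_event(const struct touch_event *ev)
{
    printf("key event at (%d, %d)\n", ev->x, ev->y);
}

static struct touch_event pending;
static bool have_pending = false;

/* Called for every raw event from the touchpad driver. */
void gesture_filter(const struct touch_event *ev, bool gestures_enabled)
{
    if (!gestures_enabled) {       /* layer switched off: pure pass-through */
        touchkey_post_event(ev);
        return;
    }
    if (have_pending && ev->tick - pending.tick <= GESTURE_WINDOW_TICKS) {
        /* A consecutive event arrived in time: recognize a gesture from
         * the pair and "eat" both events (recognition itself omitted). */
        have_pending = false;
        return;
    }
    if (have_pending)              /* window expired: release the delayed event */
        touchkey_post_event(&pending);
    pending = *ev;                 /* hold this one back in turn */
    have_pending = true;
}

/* Called periodically (e.g. from the tick task) so a lone event is
 * never delayed longer than the window. */
void gesture_tick(long now)
{
    if (have_pending && now - pending.tick > GESTURE_WINDOW_TICKS) {
        touchkey_post_event(&pending);
        have_pending = false;
    }
}

With this shape, the worst-case key delay is bounded by GESTURE_WINDOW_TICKS, which is exactly the trade-off that would have to be tried out in real life.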