Rockbox Development > Feature Ideas

Gestures for Touchpad / Touchscreen targets


NarcoticV:
Discussion started in: http://forums.rockbox.org/index.php/topic,43431.0.html

Gestures such as flicking through songs and lists would be a nice addition to the existing control scheme for touchscreens and touchpads.

A few ways to fit this into Rockbox were suggested in the topic above. In this topic we can brainstorm about ways to do it, what to use it for, etc.


 - Would there have to be a setting for it?
                 - Yes, because people will strongly disagree about (some) gestures
                 - No, because having customizable controls is a lot of hassle and there should be a "best" way
                 - etc

 - How to combine gesture code with the existing button system?
                 - Have the gesture support at a low, target-specific level, and convert gestures to button presses
                 - Implement a whole new control system parallel to the existing button system, starting from scratch
                 - etc

 - How to give plug-ins access to gesture support and also directly to touchpad input data?

 - Ideas for gestures that would be cool to have?
                 - flicking through items
                 - a virtual scrollwheel a la old IPods
                 - zooming with multiple fingers for devices that support it
                 - an unlock sequence like smartphones have
                 - etc

metaphys:
Regarding how many targets with a touchpad are in the pipeline, I think the Fuze+ is pretty much alone for the moment. There are other HAVE_TOUCHPAD targets, but on closer inspection it turns out they are not actual touchpads in the sense of a plain rectangular sensitive surface: some have only a virtual scroll strip, others a collection of separate touch buttons, and so on.

Regarding a clean, general way to add gestures for all touchpad targets: in my opinion it would start with a mandatory function in every touchpad driver providing at least the absolute coordinates of each touch and the number of touches.
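A minimal sketch of what such a mandatory driver function could look like. Rockbox has no such interface today; the names (`struct touch_point`, `touchpad_read_touches`) and the fake driver data are purely illustrative assumptions:

```c
/* Hypothetical generic touchpad interface, a sketch only. */
struct touch_point {
    int x;   /* absolute x coordinate on the pad */
    int y;   /* absolute y coordinate on the pad */
};

#define MAX_TOUCHES 2

/* Each touchpad driver would implement this: fill 'out' with up to
 * 'max' current touches and return how many are active. This fake
 * driver reports a single finger resting at (120, 40). */
static int touchpad_read_touches(struct touch_point *out, int max)
{
    if (max >= 1) {
        out[0].x = 120;
        out[0].y = 40;
        return 1;   /* one active touch */
    }
    return 0;
}
```

With an interface like this, gesture code and plugins could poll touch data uniformly without caring which physical pad is underneath.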

Regarding the simulator: when I started on the idea of implementing gestures, I realized it would be almost mandatory for testing purposes. One idea would be to control the simulator through the device's USB interface. Pamaury had some thoughts about it, but he hasn't had time, and it would be a lot of work.

I don't have a lot of time either, but I would be happy to help from time to time if someone starts putting time into this.

NarcoticV:
@metaphys: I agree about the touchpad driver providing generic touchpad data, which every touchpad configuration should be able to comply with: coordinates, touchpad size, and number of fingers, so that plugins and gesture code can at least poll this data.

Tinkering with the code, I am also starting to think that mapping gestures to buttons (i.e. a flick gesture would return BUTTON_UP to button_read_device()) is a really bad idea. In combination with button modifiers such as BUTTON_REPEAT and BUTTON_REL, it quickly becomes a puzzle to "emulate" a normal button press to fool the button system.

I think there should be a gesture system parallel to the button system. They would come together only in two places: in the low-level, target-specific code that processes touchpad data, and in the keymap file. Gestures would be reported parallel to buttons, the common code would know that they are gestures and not buttons, and they would be directly mapped to actions in the keymap file.

Maybe the gesture system (say gesture.c) should be closer to the touchpad driver than the button system: after all, a button press can be regarded as a gesture in its own right, and the gesture code should be able to differentiate between button presses (taps) and other gestures. If a tap is recognized, the coordinates of the tap can get passed to button_<target>.c .
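A rough sketch of how such a gesture.c might tell taps apart from other gestures. The thresholds and names here are invented for illustration, not existing Rockbox constants; a real implementation would tune them per target:

```c
#include <stdlib.h>   /* abs() */

enum gesture { GESTURE_NONE, GESTURE_TAP, GESTURE_FLICK };

/* Illustrative thresholds (pixels / ticks), not Rockbox constants. */
#define TAP_MAX_MOVE    10
#define FLICK_MIN_MOVE  30
#define FLICK_MAX_TICKS 20

/* Classify a completed touch from its total displacement and duration.
 * A tap (small movement) would have its coordinates handed on to
 * button_<target>.c; a fast, long movement becomes a flick. */
static enum gesture classify(int dx, int dy, int duration_ticks)
{
    int move = abs(dx) + abs(dy);

    if (move <= TAP_MAX_MOVE)
        return GESTURE_TAP;
    if (move >= FLICK_MIN_MOVE && duration_ticks < FLICK_MAX_TICKS)
        return GESTURE_FLICK;
    return GESTURE_NONE;   /* ambiguous: neither tap nor flick */
}
```

The point of placing this below the button layer is that taps fall out of the same classifier for free, so button_<target>.c only ever sees clean press coordinates.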

Lorenzo92:
I also like the idea of gestures very much! Nowadays there are more and more touchscreen devices, and Rockbox needs to be part of that future too :)
(personally I don't like touch-based players, but I have one and I'm willing to improve Rockbox support for it!)
Unfortunately I don't have much time to spend on this yet, but I want to help as much as possible.

monoid:

--- Quote from: NarcoticV on September 09, 2013, 06:35:26 PM ---Maybe the gesture system (say gesture.c) should be closer to the touchpad driver than the button system: after all, a button press can be regarded as a gesture in its own right, and the gesture code should be able to differentiate between button presses (taps) and other gestures. If a tap is recognized, the coordinates of the tap can get passed to button_<target>.c .

--- End quote ---

I do not know the Rockbox code and structure, but this sounds correct. The gesture layer should IMO be closer to the physical layer, with the touch-key layer above it. The gesture layer would "eat" all gesture-related touch events and pass only the remaining events on to the touch-key layer.

This way the gesture layer could be seamlessly integrated into the Rockbox structure and switched on/off, and no code changes would be needed in the touch-key layer.
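The layering described above can be sketched as a simple event filter. Everything here (the event names, the handler functions) is made up to illustrate the idea, not taken from Rockbox:

```c
#include <stdbool.h>

/* Hypothetical touch event types for the sketch. */
enum touch_event { EV_TAP, EV_SWIPE, EV_HOLD };

static int keys_delivered;   /* counts events reaching the key layer */

/* Touch-key layer: unchanged by the gesture feature. */
static void key_layer_handle(enum touch_event ev)
{
    (void)ev;
    keys_delivered++;
}

/* Gesture layer sits between the driver and the key layer: when
 * enabled, it consumes ("eats") gesture events and forwards only
 * the rest, so the key layer needs no modification. */
static void gesture_layer_handle(enum touch_event ev, bool gestures_on)
{
    if (gestures_on && ev == EV_SWIPE)
        return;               /* handled as a gesture, not forwarded */
    key_layer_handle(ev);     /* everything else reaches the key layer */
}
```

Switching gestures off then just means forwarding everything, which is exactly the "no code change in the touch-key layer" property.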

The only problem I see is that some key events might be delayed (if the gesture layer "waits" to see whether a consecutive event comes or not...). One would have to try it in real life.
