Gestures in short

After Brad blogged about multi-touch support in Qt, it became perfectly clear that there is a need for a better description of the high-level API that should be built on top of user input (mouse-based, touch-based and possibly others) and provide support for complex input events known as "gestures".

I think the gesture API should give widget developers the ability to drop in gesture support without modifying the widget source code much. Some modification is still required - at the very least the widget should announce that it supports gesture input and implement a gesture event handler. The good news is that for the standard Qt widgets this can be handled by cute Qt developers.
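As a rough sketch of what that could look like (grabGesture(), Qt::PanGesture, QEvent::Gesture and the QGestureEvent accessors are illustrative names here, not a finalized API), a widget would announce the gesture it is interested in and handle the resulting gesture events in its event handler:

// Illustrative sketch only - the gesture-related names are assumptions.
class ImageView : public QWidget
{
    Q_OBJECT
public:
    explicit ImageView(QWidget *parent = 0)
        : QWidget(parent)
    {
        // Announce that this widget supports pan gestures.
        grabGesture(Qt::PanGesture);
    }

protected:
    bool event(QEvent *event)
    {
        if (event->type() == QEvent::Gesture) {
            QGestureEvent *ge = static_cast<QGestureEvent *>(event);
            if (QGesture *pan = ge->gesture(Qt::PanGesture)) {
                // Scroll the view by whatever offset the gesture reports.
                // ...
                return true;
            }
        }
        return QWidget::event(event);
    }
};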

Thinking about gesture events - I'd say they should be as context-free as possible. For instance, instead of a QZoomGesture there should be a QPinchGesture with a distance delta, and QGoToNextImageGesture should be called QSlideGesture with an argument that specifies the slide direction. It should be up to the application developer to decide which action is executed as a result of a gesture. Of course, for the most commonly used gestures a recommended action could be documented somewhere - for example Pinch is usually used for zooming and rotating an item, and TwoFingerTap (as well as Tap-And-Hold) usually triggers a context menu.
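For example (again with assumed names - a hypothetical QPinchGesture with a scaleFactor() delta), two different widgets could react to exactly the same pinch in different ways, while the gesture itself stays context-free:

// The gesture only reports what the fingers did; each widget decides
// what that motion means in its own context. The member variables and
// accessors here are purely illustrative.
void ImageView::handleGesture(QPinchGesture *pinch)
{
    m_zoomLevel *= pinch->scaleFactor();   // an image viewer zooms
    update();
}

void DocumentView::handleGesture(QPinchGesture *pinch)
{
    m_fontSize *= pinch->scaleFactor();    // a text viewer scales the font
    relayout();
}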

Native and Non-native Gestures

The gesture API is intended to be extensible so that application developers can implement support for custom gestures (for example the Linja Zax gesture). The idea is that all you need to do to add a new gesture is to subclass the QGestureRecognizer class, implement a state machine that recognizes a gesture out of a stream of QMouseEvents/QTouchEvents, and "register" it in the gesture framework.

Some OSes also have native gesture support - for example Windows 7 with its new WM_GESTURE window message, which covers the most common gestures like panning (with either one or two fingers), pinching (zoom and rotate), etc. Of course we would also want Qt to benefit from native gesture support and translate those gestures to QGestureEvents - this is done automatically by Qt itself, so the application developer doesn't even need to care what the source of the gesture was.

However, implementing native gesture support adds some limitations to our gesture API. For example, there are some tricks on Windows 7 - if it decides that a gesture has started or is about to start, we will not get the raw mouse/touch events that resulted in the gesture, but only WM_GESTURE messages. Moreover, a single window cannot receive both WM_TOUCH and WM_GESTURE messages. That means that there might be no direct mapping between mouse/touch events and gesture events, hence an application developer who uses the Qt gesture API cannot rely on receiving a QTouchEvent before a QGestureEvent. But I suppose in most gesture use cases this should be acceptable.

While looking into gesture handling on Windows 7, I've noticed that it delays the WM_LBUTTONDOWN message when you touch the screen until either the finger is released or it has moved far enough to tell whether this is the beginning of a gesture or not. The current implementation of the Qt Gesture API follows the same concept: it delays QMouseEvent delivery to gesture-enabled widgets and replays those events back to the widget if either a mouse release event is received or the gesture recognizers don't come to a solid conclusion within a fixed amount of time. That allows gesture support in a complex widget like QWebView without modifying the widget source code to work around the issues that arise when a gesture starts over a link or an image (telling apart a gesture, a drag-and-drop sequence and a click on a link). This might not suit all applications, which is why the delay is configurable and can be turned off completely, allowing a widget to receive both mouse/touch events and gesture events.

Custom gestures

To add support for a custom gesture the developer needs to subclass the QGestureRecognizer class and add it to the gesture framework. For widgets that subscribe to gestures provided by that recognizer, the recognizer object becomes an input event filter which decides whether the input event should be forwarded to the widget or a QGestureEvent should be sent instead. A sketch of a simple custom recognizer follows the class declaration below.

[Image: gestures.png]

The source code for QGestureRecognizer should be self-explanatory:

/*!
    \class QGestureRecognizer

    \brief The QGestureRecognizer class is the base class for
    implementing custom gestures.

    This is a base class; to create a custom gesture type, you should
    subclass it and implement its pure virtual functions.

    Usually a gesture recognizer implements a state machine, storing its
    state internally in the recognizer object. The recognizer receives
    input events through the \l{QGestureRecognizer::}{filterEvent()}
    virtual function and decides whether the parsed event should
    change the state of the recognizer - i.e. whether the event starts or
    ends a gesture or isn't related to a gesture at all.
*/
class Q_GUI_EXPORT QGestureRecognizer : public QObject
{
    Q_OBJECT

public:
    /*!
        This enum type defines the state of the gesture recognizer.

        \value NotGesture Not a gesture.

        \value GestureStarted The continuous gesture has started. When the
        recognizer is in this state, a \l{QGestureEvent}{gesture event}
        containing the QGesture objects returned by
        \l{QGestureRecognizer::}{getGesture()} will be sent to a widget.

        \value GestureFinished The gesture has ended. A
        \l{QGestureEvent}{gesture event} will be sent to a widget.

        \value MaybeGesture The gesture recognizer hasn't decided yet whether
        a gesture has started, but it might start soon after the following
        events are received by the recognizer. This means that the gesture
        manager shouldn't reset() the internal state of the gesture
        recognizer.
    */
    enum Result
    {
        NotGesture,
        GestureStarted,
        GestureFinished,
        MaybeGesture
    };

    explicit QGestureRecognizer(const QString &type, QObject *parent = 0);

    /*!
        Returns the name of the gesture that is handled by the recognizer.
    */
    QString gestureType() const;

    /*!
        The main method that processes input events and changes the
        internal state accordingly.
        Returns the current state of the gesture recognizer.
    */
    virtual QGestureRecognizer::Result filterEvent(const QEvent *event) = 0;

    /*!
        Returns a gesture object that will be sent to the widget. This
        function is called when the gesture recognizer changes its state
        to QGestureRecognizer::GestureStarted or
        QGestureRecognizer::GestureFinished.
    */
    virtual QGesture* getGesture() = 0;

    /*!
        Resets the internal state of the gesture recognizer.
    */
    virtual void reset() = 0;

signals:
    /*!
        The gesture recognizer might emit the stateChanged() signal when
        the gesture state changes asynchronously, i.e. without any event
        being filtered through filterEvent().
    */
    void stateChanged(QGestureRecognizer::Result result);
};
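
With the base class above, a minimal custom recognizer could look like the sketch below: a "slide to the right" gesture that is recognized when the pointer is pressed and dragged far enough to the right. The thresholds, the gesture name and the way the QGesture object is constructed in getGesture() are assumptions made for the sake of the example, not part of the actual API.

// A minimal sketch of a custom recognizer built on the class above.
class RightSlideRecognizer : public QGestureRecognizer
{
public:
    explicit RightSlideRecognizer(QObject *parent = 0)
        : QGestureRecognizer(QLatin1String("RightSlide"), parent),
          m_tracking(false)
    { }

    Result filterEvent(const QEvent *event)
    {
        if (event->type() == QEvent::MouseButtonPress) {
            m_startPos = static_cast<const QMouseEvent *>(event)->pos();
            m_tracking = true;
            return MaybeGesture;            // could become a slide
        }
        if (event->type() == QEvent::MouseMove && m_tracking) {
            const QPoint delta =
                static_cast<const QMouseEvent *>(event)->pos() - m_startPos;
            if (delta.x() > 100 && qAbs(delta.y()) < 30) {
                m_tracking = false;
                return GestureFinished;     // slide recognized
            }
            return MaybeGesture;            // still undecided
        }
        if (event->type() == QEvent::MouseButtonRelease && m_tracking) {
            m_tracking = false;
            return NotGesture;              // released before sliding far enough
        }
        return NotGesture;
    }

    QGesture *getGesture()
    {
        // How a QGesture is constructed is an assumption here.
        return new QGesture(this, gestureType());
    }

    void reset()
    {
        m_tracking = false;
    }

private:
    bool m_tracking;
    QPoint m_startPos;
};

An instance of such a recognizer would then be registered with the gesture framework (the exact registration call is not shown in this post), after which any widget subscribed to the "RightSlide" gesture would start receiving the corresponding gesture events.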

