Simon was gracious enough to start mirroring the clone that Denis and I made to add gesture and touch support to Qt. This clone also includes the gesture API that Denis has been working on. At the moment, Windows 7 is the only operating system out there that provides a multi-point touch API to build on top of, so multi-point touch only works on Windows 7 for now. We're investigating how to support other platforms as well as third-party multi-point touch solutions. I'm hoping that someone out there with experience in this area can help us integrate support for these systems.
In my last blog, I asked the question "what about touch events that affect multiple widgets or QGraphicsItems?" The answer I gave at the time was the easiest to implement, but as discussions here in the hallway and via email showed, confining touch events to a single widget or item at a time is just too limiting. Multi-point touch is not just about pinching and zooming. People want to use both hands to interact with the UI.
So we changed that. The branch we have supports multiple, implicit touch "grabs." This means that it's possible to interact with several widgets and/or QGraphicsItems on the screen at the same time. There are several gotchas though:
No explicit grab support. We really don't have a good answer yet as to what should happen if you call QWidget::grabMouse().
No popup support (see above).
Recursing into the event loop (i.e. calling any of the exec() functions in Qt) from a touch event handler can, and most likely will, break touch event delivery. This is because we may need to deliver several touch events at the same time, and recursing essentially prevents the other events from being sent. Again, we don't yet have a good answer as to whether or how this should work.
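To give a flavor of what the implicit grab means on the widget side, here is a minimal sketch based on the API as it currently stands in the branch. Names like Qt::WA_AcceptTouchEvents, QTouchEvent, and touchPoints() reflect what we have today and may well change before any of this ships:

```cpp
#include <QtGui>

// A widget that opts in to touch events and paints a dot
// under every active touch point.
class TouchArea : public QWidget
{
public:
    TouchArea(QWidget *parent = 0) : QWidget(parent)
    {
        // Touch events are opt-in; without this attribute the
        // widget would never see them.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *event)
    {
        switch (event->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent *>(event);
            points = touch->touchPoints();
            update();
            // Accepting TouchBegin is what creates the implicit
            // "grab": all updates for these touch points now come
            // to this widget until they are released.
            return true;
        }
        default:
            return QWidget::event(event);
        }
    }

    void paintEvent(QPaintEvent *)
    {
        QPainter painter(this);
        foreach (const QTouchEvent::TouchPoint &point, points)
            painter.drawEllipse(point.pos(), 10, 10);
    }

private:
    QList<QTouchEvent::TouchPoint> points;
};
```

Because the grab is per-widget, two TouchArea instances side by side can each be tracking their own fingers simultaneously, which is exactly the multi-widget interaction described above.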
So, the good stuff. Here are a few teasers of multi-point touch working in Qt on the Windows 7 Release Candidate. The first video shows a bunch of modified QDials (I hope to add this example to Qt soon), and the second shows two custom QGraphicsItems in a QGraphicsView (see examples/touch/knobs).
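For QGraphicsItems the pattern is much the same, just routed through sceneEvent() instead of event(). The following is only a sketch in the spirit of the knobs example; setAcceptTouchEvents() on the item and the exact delivery path are how the branch works today, and the rotation helper used here is an assumption that may not match the real example code:

```cpp
#include <QtGui>

// A circular item that turns to follow the first touch point,
// roughly what the knobs in the second video do.
class KnobItem : public QGraphicsEllipseItem
{
public:
    KnobItem() : QGraphicsEllipseItem(-50, -50, 100, 100)
    {
        // Like widgets, items must opt in to touch events.
        setAcceptTouchEvents(true);
    }

protected:
    bool sceneEvent(QEvent *event)
    {
        switch (event->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent *>(event);
            const QTouchEvent::TouchPoint &point =
                touch->touchPoints().first();
            // Rotate the knob so it tracks the angle of the touch
            // point around the item's origin.
            QLineF line(QPointF(0, 0), point.pos());
            setRotation(-line.angle());
            return true;
        }
        default:
            return QGraphicsEllipseItem::sceneEvent(event);
        }
    }
};
```

Since each item holds its own implicit grab, dropping two of these into a QGraphicsScene lets you turn both knobs at once with both hands.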