Here in sunny Brisvegas, 'stralia, we have been doing more than not wearing shoes all year; we have been working all kinds of hard on QtDeclarative, QtMultimedia, QtLocation, QtSystems and QtSensors... oh yeah, and we can't forget the Quality Assurance/CI system.
Of course, the most exciting one of these is QSensorGestures!
With QtDeclarative, Qt3D, QtMultimedia and QtSensorGestures, you can have an awesome cross-platform gaming platform with massive potential. Crikey mate, that one is a ripper!
Apparently a hot topic for console games, tablets and mobile devices (as well as robots) is gesture recognition. Since I work in the Qt Sensors team, I decided to investigate gesture recognition using sensors such as the accelerometer, but also incorporating the light and proximity sensors, and to share it with the world of Qt.
There are heaps of research papers out there detailing gesture recognition. Most use advanced algorithms that need a large amount of training data and time to teach the software the gestures. These systems are usually complicated and very specific about the gestures they can support.
The Qt Sensors library in Qt 5 was originally developed as part of Qt Mobility. The Qt Sensor Gesture API was developed after the sensors code was integrated into the Qt 5 development branch.
Qt's Sensor Gestures use a plugin system for recognizer integration, which means that if the system allows installing plugins, you can write your very own recognizers. You could even go so far as to use the more complicated and popular approach of writing gesture recognizers with machine learning and computer vision techniques.
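To give you an idea of what writing your own recognizer looks like, here is a rough sketch of a QSensorGestureRecognizer subclass. The "jolt" gesture, its id string and the acceleration threshold are all made up for illustration; a real recognizer would be tuned against actual hardware:

```cpp
#include <QSensorGestureRecognizer>
#include <QAccelerometer>

// Hypothetical recognizer: fires when the device gets a hard
// jolt along the X axis. Threshold is invented for illustration.
class QJoltSensorGestureRecognizer : public QSensorGestureRecognizer
{
    Q_OBJECT
public:
    QJoltSensorGestureRecognizer(QObject *parent = 0)
        : QSensorGestureRecognizer(parent), active(false) {}

    void create() {
        accel = new QAccelerometer(this);
        connect(accel, SIGNAL(readingChanged()), this, SLOT(accelChanged()));
    }

    QString id() const { return QString("QtSensors.jolt"); }
    bool start() { active = accel->start(); return active; }
    bool stop()  { accel->stop(); active = false; return true; }
    bool isActive() { return active; }

private slots:
    void accelChanged() {
        // Made-up threshold of 40 m/s^2 - tune for real devices.
        if (qAbs(accel->reading()->x()) > 40.0)
            emit detected(id());
    }

private:
    QAccelerometer *accel;
    bool active;
};
```

A plugin would then hand an instance of this class to the recognizer manager, and any application asking for "QtSensors.jolt" would get it.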
I have developed a few simple gesture recognizers in the QtSensorGesture plugin - pickup, twist, cover, hover, whip, turnover, shake and slam. This plugin uses an ad-hoc method of detecting gestures, but recognizer plugins can also use more advanced methods. The ad-hoc approach has the advantages of speed, low memory usage and skipping the lengthy process of machine training, but it has the disadvantages that users cannot create their own gestures, and that recognizer collisions can occur - where one gesture is recognized when another is performed. But that is always a hazard with sensor gestures.
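On the application side, using one of these gestures is just a matter of creating a QSensorGesture with the recognizer id and connecting to its detected() signal. A minimal sketch (the receiver object and its slot name are placeholders for whatever your app uses):

```cpp
#include <QSensorGesture>
#include <QStringList>

// Somewhere in your app setup: listen for the shake gesture.
// 'parent' and 'receiver' are whatever QObjects your app provides.
QSensorGesture *gesture =
    new QSensorGesture(QStringList() << "QtSensors.shake", parent);
QObject::connect(gesture, SIGNAL(detected(QString)),
                 receiver, SLOT(onGestureDetected(QString)));
gesture->startDetection();
```

Call stopDetection() when you no longer care, and the recognizer stops its sensors.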
These gestures are documented here (until the Qt5 release, anyway):
The current QSensorGestures plugins are most likely device specific, which means they might have to be adapted to any real hardware that wants to use them.
In the future, as well as adding attributes/properties for device-specific configuration, there could be QtSensors backends for certain game controllers that have accelerometers, along with a library to access them and other devices. I have also been looking into using image sensors and audio sensors for gesture recognition, which would require more advanced methods of gesture detection.
I don't have any flashy cool videos as I had hoped for, so you will all have to make do with words, documentation and a few images.
And of course, the source code is available in the Qt 5 repo.