To address this trend, "One of the most useful long-term and short-term tactics will be outsourced experiences," according to Gartner. "An outsourced UX is an experience that's running on a device separate from the one being controlled. [...] Although we expect smartphones to be the most popular outsourced UX device, there will also be opportunities to create low-cost custom hardware devices for some interactions."
Fortunately, Qt lets you build software for both mobile and embedded devices fast, based on your existing code, and supports a variety of no-touch UI technologies – see below.
Eye-tracking is an innovative technology that uses dedicated devices to accurately estimate, moment by moment, where a person is looking. Today the technology is spreading rapidly thanks to falling costs, the miniaturization of the devices, and the high degree of reliability achieved.
DeepGlance Quick enables your traditional Qt UI to be controlled by gaze. Eye-tracking combines with other technologies to make interactions more effective and natural. It can also be used to monitor the attention of an operator and adapt the information shown based on the user's interest.
EyeTech Digital Systems used Qt to develop a hands-free computing device that offers superior user experience for the disabled. The module can be easily mounted to Augmentative and Alternative Communication (AAC) speech devices and shared across multiple computers, enabling people to communicate from anywhere, at any time of day.
Gesture control enables users to interact with and control devices without physically touching them. The machine uses mathematical algorithms to interpret gestures, which can originate from any state or motion of any given body part. Today, most use cases focus on hands and faces to facilitate an intuitive touch-free input method.
Amazon Web Services combined a Qt user interface and MXNet deep learning to build a gesture-controlled robot arm.
BrightOne shows how smooth gesture control on an automotive HMI built with Qt can be!
Speech recognition is used in anything from creating text documents to remotely controlling devices inside your smart home or your car. The main benefits of using speech over traditional input methods are that it's generally faster and that it allows for hands-free operation. While modern speech recognition is continually improving and fairly accurate to date, occasional inaccuracies and inconsistencies caused by unique dialects or speech patterns still occur.
Besides simple convenience and hygiene, it's also safer to control your in-vehicle infotainment system via speech. In this demo, you can see how media players, climate control, navigation, phones, and more can be remotely controlled via Amazon Alexa voice integration.
This demo features two ways to remotely control your UI, using Qt Speech as an Alexa wrapper and via smartphone screen mirroring.
Bluetooth is a short-range (less than 100 meters) wireless technology. It has a reasonably high data transfer rate of 2.1 Mbit/s, which makes it ideal for transferring data between devices. Bluetooth connectivity builds on basic device management: scanning for nearby devices, gathering information about them, and exchanging data between them.
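To put the 2.1 Mbit/s figure into perspective, a quick back-of-envelope sketch (in Python, purely illustrative – protocol overhead and retransmissions are ignored, so real transfers take longer):

```python
def transfer_time_seconds(payload_bytes, link_rate_mbit_s=2.1):
    """Rough lower bound on transfer time: payload size in bits
    divided by the nominal link rate (overhead ignored)."""
    return payload_bytes * 8 / (link_rate_mbit_s * 1_000_000)

# Sending a 1 MB sensor log at the nominal 2.1 Mbit/s rate
print(f"{transfer_time_seconds(1_000_000):.1f} s")  # ~3.8 s
```

At that rate, periodic sensor readings of a few bytes each transfer effectively instantly, which is why Bluetooth suits the device-to-gateway hop in the demos below.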
Qt MQTT enables IoT connectivity on different operating systems and hardware. A sensor sends environmental data to a Raspberry Pi via Qt Bluetooth. The tablet receives the data via Qt MQTT and visualizes it, while the smartwatch runs the same Qt application on Windows 10 IoT Core. All data is transmitted, logged, and distributed through the Azure Cloud integration.
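The reason one sensor feed can drive a tablet, a smartwatch, and a cloud logger at once is MQTT's topic-based publish/subscribe model: each client subscribes with a topic filter, and the broker fans messages out to every matching subscriber. The sketch below (plain Python, independent of Qt MQTT) implements the standard `+` and `#` wildcard matching rules from the MQTT 3.1.1 specification; the topic names are made up for illustration:

```python
def topic_matches(pattern, topic):
    """Return True if an MQTT topic filter matches a topic name.

    Implements the single-level '+' and multi-level '#' wildcards
    from MQTT 3.1.1 (the leading-'$' special cases are omitted).
    """
    p_levels = pattern.split("/")
    t_levels = topic.split("/")
    for i, p in enumerate(p_levels):
        if p == "#":                       # '#' matches this level and everything below
            return True
        if i >= len(t_levels):             # topic ran out of levels
            return False
        if p != "+" and p != t_levels[i]:  # '+' matches exactly one level
            return False
    return len(p_levels) == len(t_levels)

# A tablet showing one sensor vs. a cloud logger capturing everything:
print(topic_matches("sensors/+/temperature", "sensors/rpi1/temperature"))  # True
print(topic_matches("sensors/#", "sensors/rpi1/humidity"))                 # True
print(topic_matches("sensors/+/temperature", "sensors/rpi1/humidity"))     # False
```

With Qt MQTT, the same filters are passed to `QMqttClient::subscribe()`, and the broker does this matching on the clients' behalf.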
This demo shows how MQTT enables communication throughout the supply chain. The Qt UI provides a real-time live preview in any browser. A sensor sends environmental data via Qt Bluetooth, which is then transmitted to the cloud over MQTT, with the Ethereum blockchain used to secure the data of decentralized edge devices.
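The core idea behind using a blockchain to secure edge-device data is tamper evidence: each record is hashed together with the hash of the record before it, so altering any reading invalidates every hash that follows. The sketch below illustrates that hash-chain principle with Python's standard library – it is a conceptual model, not the demo's actual Ethereum integration:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block

def chain_readings(readings, genesis=GENESIS):
    """Link sensor readings into a SHA-256 hash chain: each block's
    hash covers its payload plus the previous block's hash."""
    chain, prev = [], genesis
    for r in readings:
        payload = json.dumps(r, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        chain.append({"data": r, "prev": prev, "hash": digest})
        prev = digest
    return chain

def verify(chain, genesis=GENESIS):
    """Recompute every hash; any tampered reading breaks the chain."""
    prev = genesis
    for block in chain:
        payload = json.dumps(block["data"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

readings = [{"temp": 21.5}, {"temp": 21.7}, {"temp": 21.6}]
chain = chain_readings(readings)
print(verify(chain))                 # True
chain[0]["data"]["temp"] = 99.9      # tamper with the first reading
print(verify(chain))                 # False
```

A public blockchain like Ethereum adds decentralized consensus on top of this, so no single edge device or server can rewrite the history unilaterally.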