Posted on: 16 June 2015
Ph.D - Principal Experience Architect
Google Goes Gaga for Gestures
“For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive. You merely had to brush the panels with your fingers; now all you have to do is wave your hand in the general direction of the components and hope. It saved a lot of muscle expenditure of course, but it meant that you had to sit infuriatingly still if you wanted to keep listening to the same programme.”
– The Hitchhiker's Guide to the Galaxy, Douglas Adams
By now you may have heard the announcements from Google around Project Soli. If not, you can get up to speed by watching the video here. While I am always excited about new technologies and paradigms for interaction, I think any predictions of the death of touchscreens are far off the mark, for three reasons: incompatible mental models, an inconsistent gesture lexicon, and the limits of the human body.
In the video, the researchers demonstrate several ways that gestures can be used to interact with the system. What is amusing is that all the gestures mimic interactions with their physical counterparts. Because I grew up with mechanical watches that had to be set, I am familiar with the paradigm of pulling out a knob and turning it to update the time. Similarly, sliders and buttons are all virtual constructs of physical objects. With our current standard set of icons (print, save, phone), we already see generational differences in comprehension because the graphics no longer represent the objects or behaviours they support.
The next generation grew up surrounded by touchscreens, and may not have the same mental models of how to interact with objects.
Ask someone to demonstrate the gestures they are familiar with on touchscreens and you will be shown the standard set: pinch, spread, swipe (up, down, left, right), drag, tap and double tap. These have become universal. However, not all gestures are universal, or convey the same meaning in every culture. Giving a thumbs-up in Western countries signals that everything is ok, but it has a rude and offensive meaning in some Islamic countries.
Gestures may also have different meanings depending on the context. Putting your hand up, palm facing away from you, could be an indication to someone to stop, or you may simply be trying to catch their attention. To turn lights on in my house I can use up to three different gestures: flick a switch, press a button, and pull a string. So which is the appropriate “turn on light” gesture?
Our hands are incredible mechanisms that can support both large, gross motor functions, like grabbing a door handle, and very fine, minute actions, like untying a knot. However, while we can perform a wide range of activities with just our hands, we often need to augment them with tools to perform our tasks more efficiently and accurately. So, while you can paint with your fingers as a child, to paint the Mona Lisa you will need a brush. Steve Jobs famously said, “If you see a stylus, they blew it,” but the success of the Samsung Galaxy Note and the Wacom Bamboo Pen & Touch demonstrates that users like to have tangible ways of interacting, particularly when performing precise actions.
So while it is exciting to see the work coming out of the Google research labs, I think the limits of gesture-based interaction ensure that its use will be confined to specific use cases: where previous user mental models are not required, where the gesture is universally understood, and where the design takes into account the limits of human performance. Gestures are simply another tool in the designer’s kit, alongside other tangible (touchscreens, physical controls) and intangible (voice recognition) interaction techniques.
Dan firmly believes that technology must be created with the user in mind. Never shy to critique a bad design, Dan uses the Akendi blog to shine a spotlight on usability mistakes…and their solutions. Leveraging his background in engineering, computer science, psychology, and anthropology, Dan offers a unique perspective on the latest UX trends and techniques.