10. We’re using bodies evolved for hunting, gathering, and gratuitous violence for information-age tasks like word processing and spreadsheet tweaking. —David Liddle
11. We’re in the midst of an interaction design revolution.
13. What we’re going to talk about
Sensors and touchscreen types
Kinesiology and physiology
Touch targets
Communicating
Choosing appropriate gestures
Case study: Canesta Entertainment Center
14. Gesture: any physical movement that can be sensed and responded to by a digital system without the aid of a traditional input device such as a mouse or stylus.
21. Two types of interactive gestures
Touchscreen
aka TUI
Single and multi-touch (MT)
Free-form
Wide variety of forms
22. Why not to have a gestural interface
Heavy data input
Relies heavily on the visual (for now)
Can be inappropriate for context
More physically demanding
23. Why have a gestural interface?
More flexible
Less visible hardware
Hardware fits context better
More “natural”
More fun
29. The ergonomics of human gestures
Avoid hyperextension or extreme stretches
Avoid repetition
Utilize relaxed, neutral positions
Avoid staying in a static position
No “Gorilla Arm”
30. Gorilla arm
Humans not designed to hold their arms in front of their faces, making small gestures
OK for short-term use, not so much for repeated, long-term use
Fun fact: telegraph operators had “glass arm”
Sorry, Minority Report-style UIs
34. The more challenging and complicated the gesture, the fewer people will be able to perform it.
35. What about accessibility?
No good, clear answer
Improving via the addition of haptics (and hopefully, eventually, speech)
Some touchscreen systems much better than traditional WIMP systems
Special care when designing touch targets
39. Fingers
Fingernails: blessing and curse
Fake fingernails: evil
Finger oil
Fingerprints
(Left) handedness
Wrist support
Gloves
Inaccurate (when compared to a cursor)
Attached to a hand, aka screen coverage
42. Avoid putting essential features or information, like a label, below an interface element that can be touched, as it may become hidden by the user’s own hand.
44. Touch target size
Remember Fitts’ Law! Movement time grows with distance to the target and shrinks with target size (T = a + b·log₂(D/W + 1))
Place targets as close to the user as possible so users don’t cover the screen with their hands
Put space between the targets (when possible)
Create reasonably sized targets: no smaller than 1 cm in diameter/square (the size of a finger pad)
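As a rough illustration of the distance/size trade-off, the Shannon formulation of Fitts’ Law can be sketched in a few lines of Python. The constants `a` and `b` below are illustrative placeholders, not measured values; real coefficients must be fit empirically for a given device and user population.

```python
import math

def fitts_time(distance_mm: float, width_mm: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time in seconds, Shannon formulation:
    T = a + b * log2(D/W + 1).
    a, b are illustrative placeholder coefficients."""
    return a + b * math.log2(distance_mm / width_mm + 1)

# At the same 100 mm distance, a 10 mm (1 cm) target is
# predicted to be faster to acquire than a 5 mm target:
print(fitts_time(100, 10))
print(fitts_time(100, 5))
```

This is why the slide’s 1 cm minimum matters: halving target width raises the index of difficulty, and predicted acquisition time, for every tap.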
73. Turning gestures into code
Variables: what are you measuring?
Data: get the data in from the sensor
Computation: determine difference between data
Patterns: what do the sums mean?
Action: if a pattern is matched, do something
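The five steps above can be sketched as a toy swipe detector. The sensor interface, names, and threshold here are all illustrative assumptions, not any real SDK:

```python
# Variables: what are we measuring? Here, the x position of one touch.
SWIPE_THRESHOLD = 50  # pattern threshold in pixels (illustrative)

def detect_swipe(samples):
    """samples: successive x positions read from the sensor (Data).
    Returns 'swipe-right', 'swipe-left', or None."""
    if len(samples) < 2:
        return None
    # Computation: difference between first and last reading
    delta = samples[-1] - samples[0]
    # Patterns: does the summed movement match a known gesture?
    if delta > SWIPE_THRESHOLD:
        return "swipe-right"
    if delta < -SWIPE_THRESHOLD:
        return "swipe-left"
    return None

# Action: if a pattern is matched, do something
gesture = detect_swipe([10, 40, 90, 130])
if gesture == "swipe-right":
    print("next slide")
```

A real recognizer would add time-stamping, noise filtering, and multi-touch tracking, but the Variables → Data → Computation → Patterns → Action pipeline stays the same shape.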
77. Architectural wireframes
[ Wireframe: a “Master UI” live touchscreen and projection area run by the presenter, alongside an “Individual UI” used by show attendees; both panels annotated with typical arm’s reach for a 6’ tall user. ]
96. Four-part equation
1. The task that needs to be performed
2. The available sensors and input devices
3. The physiology of the human body
4. The context
This can be pretty straightforward
Or not
98. Usability issues
Avoid unintentional triggers via everyday actions!
Wide variation in how people perform gestures: need requisite variety
Pick one: select then act, or selecting performs the action
Gestures as command keys: provide a normal means of performing the action (buttons, etc.), but offer “advanced” gestures as shortcuts
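The “gestures as command keys” pattern might be sketched like this, with a hypothetical registry in which a visible button and a gesture shortcut invoke the same handler; all names are made up for illustration:

```python
actions = {}    # action name -> handler (the normal, discoverable path)
shortcuts = {}  # gesture name -> action name (the "advanced" path)

def register_action(name, handler):
    actions[name] = handler

def register_shortcut(gesture, action_name):
    shortcuts[gesture] = action_name

def on_button_press(action_name):
    # Normal means of performing the action: a visible button
    return actions[action_name]()

def on_gesture(gesture):
    # Advanced shortcut: an unrecognized gesture safely does nothing,
    # which also guards against unintentional triggers
    action_name = shortcuts.get(gesture)
    if action_name is None:
        return None
    return actions[action_name]()

register_action("delete-item", lambda: "deleted")
register_shortcut("two-finger-flick", "delete-item")
```

Routing both paths through one handler keeps the gesture a pure accelerator: the feature remains fully usable by anyone who never discovers the gesture.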
104. The complexity of the gesture should match the complexity of the task at hand.
105. The best designs are those that “dissolve into behavior.” (Naoto Fukasawa)
106. The best, most natural designs, then, are those that match the behavior of the system to the gesture humans might already do to enable that behavior.