This is a presentation for the defense of my capstone project for a Master of Science in Human-Computer Interaction from Rochester Institute of Technology. For this project I created an application for navigating a geospatial display through gaze input on a 2D user interface overlay.
1. Using Gaze Input to Navigate a Virtual Geospatial Environment
Mark Hazlewood
Committee
Anne Haake (chair)
Reynold Bailey
2. Defense Outline
• Capstone project details
• Prior work
• Software and user interface design
• User testing details and results
• Conclusions and future work
4. Objectives
• Primary objective
• Develop a software application allowing users to navigate in a virtual geospatial environment using their gaze as input
• Secondary objectives
• Attempt to use the Kinect sensor as a remote eye tracker
• Conduct preliminary user evaluations of the developed application
5. Planned Timeline
[Gantt chart: 15-week schedule covering Development, Integration + Test, Participant Recruitment, User Testing, and Analysis/Documentation/Write-up]
6. Planned Timeline (by phase)
[Gantt chart: the same 15-week schedule divided into four phases]
Phase 1 – Exploratory Development with Kinect
Phase 2 – Selection of an Eye Tracking System
Phase 3 – Development of Geospatial Application
Phase 4 – User Testing
8. Stellmach et al.
• Designing Gaze-based User Interfaces for Steering in Virtual Environments (ETRA '12)
• Evaluated several techniques for gaze-based navigation
• Proposed a taxonomy of gaze-based UI activation methods for navigation
• Environment was a 3D virtual "maze"
• My project built on some of the general goals of Stellmach's research, but applied them specifically to a geospatial context
9. Stellmach et al.
• Stellmach's proposed taxonomy (the two activation styles are sketched in code below the table):

Input Technique x Activation Speed | Description
Discrete x Constant (DC) | Input activated through fixed UI regions at a constant view change rate. Once activated, a particular movement action remains active until toggled.
Continuous x Gradient-based (CG) | Input activated through fixed UI regions at a variable view change rate. Movement actions are only active when gaze is within the UI element.
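The contrast between the two cells is easier to see in code. The following is a minimal sketch, not the project's implementation; the region geometry, rates, and class names are illustrative assumptions.

```python
# Illustrative sketch only: contrasts the DC and CG taxonomy cells.
class Region:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


class DiscreteConstantPan:
    """Discrete x Constant (DC): gazing at the fixed UI region toggles the pan
    action on or off; while active, the view changes at a constant rate."""
    RATE_PX_PER_S = 200.0  # assumed constant view change rate

    def __init__(self, region):
        self.region = region
        self.active = False

    def on_fixation(self, gx, gy):
        if self.region.contains(gx, gy):
            self.active = not self.active  # stays active until toggled again

    def pan_rate(self):
        return self.RATE_PX_PER_S if self.active else 0.0


class ContinuousGradientPan:
    """Continuous x Gradient-based (CG): the pan action is only active while
    gaze is inside the region, and the rate scales with gaze position."""
    MAX_RATE_PX_PER_S = 400.0  # assumed maximum view change rate

    def __init__(self, region):
        self.region = region

    def pan_rate(self, gx, gy):
        if not self.region.contains(gx, gy):
            return 0.0  # deactivates as soon as gaze leaves the region
        depth = (gx - self.region.x) / self.region.w  # 0.0 .. 1.0 across the region
        return self.MAX_RATE_PX_PER_S * depth
```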
11. Adams et al.
• The Inspection of Very Large Images by Eye-gaze Control (AVI '08)
• Proposed multiple methods for gaze-based zooming
• Maintained a fixed method for gaze-based panning
• Used a geospatial application as a test-bed, but the research focus was on image viewing
• My project referenced Adams' work for the "edge-of-screen" panning UI
12. Adams et al.
• Adams' zooming techniques (Stare-to-Zoom is sketched below the table):

Technique | Description
Stare-to-Zoom (STZ) | Sustained gaze in the central region of the display causes the image to zoom inward. Requires an extended stationary gaze (> 420 ms). Zooming continues while gaze remains stationary.
Head-to-Zoom (HTZ) | Zooming is initiated by movements of the user's head (calculated from eye-to-screen distance). Leaning forward a small amount (~40 mm) initiates zooming in; leaning backward the same amount zooms out.
Dual-to-Zoom (DTZ) | Zooming is initiated using the mouse: the left button zooms in, the right button zooms out. Panning is still done through gaze.
Mouse-to-Zoom (MTZ) | Used as a baseline for comparison with the other techniques. Both zoom and pan are accomplished with the mouse.
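Stare-to-Zoom is the most relevant technique for gaze-only interaction, so here is a minimal sketch of its dwell logic. It assumes a caller that supplies timestamped gaze samples; the dispersion threshold and names are assumptions, while the 420 ms dwell and the central-region condition come from the table above.

```python
import math

DWELL_MS = 420            # from Adams et al.: stationary gaze > 420 ms starts zooming
DISPERSION_PX = 40        # assumed limit on gaze wander that still counts as "stationary"

class StareToZoom:
    """Sketch of STZ: sustained, stationary gaze in the central region keeps
    zooming in for as long as the dwell persists."""

    def __init__(self, in_central_region):
        self.in_central_region = in_central_region  # callable (x, y) -> bool
        self.anchor = None                          # (x, y, start_ms) of current dwell

    def update(self, gx, gy, t_ms):
        """Feed one gaze sample; returns True while zooming should be active."""
        if not self.in_central_region(gx, gy):
            self.anchor = None                      # left the central region: stop
            return False
        if self.anchor is None:
            self.anchor = (gx, gy, t_ms)            # start a new dwell
            return False
        ax, ay, start_ms = self.anchor
        if math.hypot(gx - ax, gy - ay) > DISPERSION_PX:
            self.anchor = (gx, gy, t_ms)            # gaze moved: restart the dwell
            return False
        return (t_ms - start_ms) > DWELL_MS         # zoom while gaze stays put
```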
14. User Interface Design
• Hypothesis
• When navigating large geospatial areas, the current zoom level can be used as an indicator of the level of detail a user wishes to view
• Zoomed out: assume the user is interested in navigating over large geographic areas, from one broad region to another
• Zoomed in: assume the user is interested in "fine searching" among smaller geographic landmarks
• Design Goal
• Provide an adaptive UI that supports multiple user levels of interest (sketched below)
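As a rough illustration of the design goal, the choice between the two interest levels can be keyed off the current zoom level. The threshold value and mode names below are assumptions for the example, not values from the project.

```python
# Illustrative sketch of the adaptive-UI hypothesis.
BROAD_TO_FINE_ZOOM = 10   # assumed zoom level separating "broad" from "fine" navigation

def select_pan_ui(zoom_level):
    """Zoomed out: broad navigation between large regions -> edge-of-screen pan UI.
    Zoomed in: fine searching among nearby landmarks -> central pan UI."""
    return "edge_pan" if zoom_level < BROAD_TO_FINE_ZOOM else "central_pan"

# Example: re-evaluate the UI mode whenever the map's zoom changes
# current_mode = select_pan_ui(map_view.zoom_level)
```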
24. User Interface Design
• Gaze cursor
• Displays the current (filtered) gaze point to the user
• Helpful in maintaining orientation when navigating the UI
26. Gaze Point Filter
• Problem
• Even when calibrated, the raw output from the eye tracker is noisy
• Brief (but valid) fixations contribute to the noisiness
• This makes UI activation difficult and the gaze cursor very distracting
28. Gaze Point Filter
• Solution
• Filter the tracker’s output prior to processing by the client application
29. Gaze Point Filter
• Moving Average Filter (a code sketch follows)
• As 2D points are received from the tracker, samples are added to a queue (the "window")
• The average of all samples currently in the window is returned
• The window size is configurable; the optimum was found to be 15-20 samples
• After the window initially fills, the resulting output is greatly improved
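A minimal sketch of the moving average filter described above, assuming samples arrive as (x, y) screen coordinates; the class and parameter names are my own, not the project's code.

```python
from collections import deque

class MovingAverageGazeFilter:
    """Sliding-window average over raw 2D gaze samples. A window of 15-20
    samples was found to work well in this project."""

    def __init__(self, window_size=15):
        self.window = deque(maxlen=window_size)  # oldest samples drop off automatically

    def add_sample(self, x, y):
        """Add one raw tracker sample and return the filtered (smoothed) point."""
        self.window.append((x, y))
        n = len(self.window)
        fx = sum(px for px, _ in self.window) / n
        fy = sum(py for _, py in self.window) / n
        return fx, fy

# Usage: feed samples as they arrive from the tracker
# gaze_filter = MovingAverageGazeFilter(window_size=20)
# smoothed_x, smoothed_y = gaze_filter.add_sample(512.3, 388.9)
```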
45. User Testing
• Goals
• Evaluate the effectiveness of the proposed designs
• Quantitative
• Are participants able to use the UI?
• How effective are they in navigating to geographic regions?
• Qualitative
• How natural or intuitive is the experience?
• Do participants feel like the system responds to their intent?
46. Participants & Recruiting
• Planned goal was 5-10 test participants
• Recruited via online posts (graduate forum) and email solicitation
• Prospective participants completed an online screener
• Ended up with eight (8) participants
51. Test Procedures
1. Background questionnaire
2. Introduction to the eye tracking system, calibration procedure, and tasks
3. Initial calibration
• 9-point automatic, using iViewX Experiment Center
4. Introduction to the geospatial application and user interface
5. Navigation to a practice point (with moderator support)
6. Sequential navigation to test regions (A, B, C, D):
1. Pan to the general area of the region
2. Zoom to the region
3. Activate sub-points in the region
4. Zoom out to the furthest level
54. Test Procedures – Initial Calibration
• Targeted an angular error of < 1° (X and Y); a worked example of what 1° means on screen follows
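For a sense of scale, the back-of-the-envelope calculation below converts a 1° angular error into an on-screen distance. The viewing distance and pixel pitch are assumptions chosen for illustration, not values reported in the project.

```python
import math

# Assumed setup values for illustration only (not reported in the project)
viewing_distance_mm = 700     # eye-to-screen distance
pixel_pitch_mm = 0.28         # physical size of one pixel on the monitor

error_deg = 1.0
error_mm = viewing_distance_mm * math.tan(math.radians(error_deg))
error_px = error_mm / pixel_pitch_mm
print(f"{error_deg} deg of angular error = {error_mm:.1f} mm = {error_px:.0f} px on screen")
# Under these assumptions: roughly 12 mm, or about 44 pixels
```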
55. Test Procedures – Task Ordering

Participant | Region Sequence
1 | A B C D
2 | B C D A
3 | C D A B
4 | D A B C
5 | A B C D
6 | B C D A
7 | C D A B
8 | D A B C
56. Test Results – Quantitative

Region Label | Average task time (seconds)
A | 138.41
B | 141.43
C | 153.21
D | 151.13
Overall Average | 146.05
57. Test Results – Quantitative

Task Number | Average task time (seconds)
1 | 179.48
2 | 146.93
3 | 135.52
4 | 122.23
Overall Average | 146.05
58. Test Results – Qualitative
• Participants were given two surveys after task completion
• Qualitative gaze input survey
• System Usability Scale (SUS)
• Participants were then debriefed with directed questions from the moderator
63. User Testing Observations
• Response to the adaptive pan UI
• Initially somewhat disruptive
• The edge pan provided a larger target surface
• Preferred when calibration had a large angular error
• The central pan provided finer control and a better view of the map
• Generally preferred, except when calibration made activation difficult
• Expectation of dwell-based operation
• Before initial exposure to the UI, participants expected a dwell-based solution
• They expected the map to pan/zoom to wherever they were looking
64. User Testing Observations
• "Opposite Pan Problem"
• Many users had a tendency to pan in the exact opposite direction
• The error was relatively frequent and consistent across participants
• Debriefing revealed an opposite expectation of pan behavior
65-66. User Testing Observations – "Opposite Pan Problem" (continued)
[Diagram builds: the intended pan target on the map, the correct pan direction toward it, and the observed error of panning in the opposite direction]
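One plausible reading of the debriefing comments is that participants and the UI disagreed about the pan convention. The sketch below contrasts the two mental models; the function names and vector math are illustrative assumptions, not the project's implementation.

```python
# Two opposite pan conventions that could explain the observed errors.
def camera_toward_gaze(camera_center, gaze_offset):
    """Convention where gazing toward a screen edge moves the viewpoint in that
    direction, so new map content scrolls in from that edge."""
    cx, cy = camera_center
    dx, dy = gaze_offset          # vector from screen center toward the gaze point
    return cx + dx, cy + dy

def map_dragged_toward_gaze(camera_center, gaze_offset):
    """The opposite mental model: the map itself is 'dragged' toward where the
    user looks, which moves the viewpoint the other way."""
    cx, cy = camera_center
    dx, dy = gaze_offset
    return cx - dx, cy - dy       # same gaze input, opposite sign

# The same gaze input produces exactly opposite view motion under the two models,
# which matches the observed opposite-direction panning errors.
```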
67. Ideas for Future Work
• More expansive and rigorous user testing
• There was large variation in the qualitative survey responses
• A larger sample size could yield more concrete, significant results
• Comparative study of the effectiveness of various design alternatives
• Fixed vs. adaptive pan UI
• Dwell-based activation vs. UI-based activation
• Implementation of Adams' zoom techniques with the adaptive pan UI