Listing's and Donders' Laws and the Estimation of the Point-of-Gaze

Elias D. Guestrin, University of Toronto and Toronto Rehabilitation Institute, elias.guestrin@utoronto.ca
Moshe Eizenman, University of Toronto, eizenm@ecf.utoronto.ca

ETRA 2010, Austin, TX, March 22-24, 2010. Copyright © 2010 ACM 978-1-60558-994-7/10/0003.

Abstract

This paper examines the use of Listing's and Donders' laws for the calculation of the torsion of the eye in the estimation of the point-of-gaze. After describing Listing's and Donders' laws and providing their analytical representation, experimental results obtained while subjects looked at a computer screen are presented. The experimental results show that when the point-of-gaze was estimated using Listing's and Donders' laws there was no significant accuracy improvement relative to when eye torsion was ignored. While for a larger range of eye rotation the torsion would be more significant and should be taken into account, the torsion predicted by Listing's and Donders' laws may be inaccurate, even in ideal conditions. Moreover, eye torsion resulting from lateral head tilt can be significantly larger than the torsion predicted by Listing's and Donders' laws, and can even have the opposite direction. To properly account for eye torsion, it should be measured independently (e.g., by tracking the iris pattern and/or the scleral blood vessels).

Keywords: kinematics of the eye, eye torsion, Listing's law, Donders' law, gaze estimation accuracy

1 Introduction

The point-of-gaze can be defined as the intersection of the visual axis with the scene. The visual axis is the line defined by the center of the fovea and the nodal point of the eye, and it exhibits a subject-specific deviation from the optic axis of the eye of as much as 5º. Gaze estimation methods based on 3-D models (e.g., [Shih and Liu 2004; Guestrin and Eizenman 2006; Villanueva and Cabeza 2007; Guestrin and Eizenman 2008; Nagamatsu et al. 2008; Guestrin 2010]) follow three steps: (i) the reconstruction of the optic axis of the eye in 3-D space from a set of eye features (e.g., the pupil and corneal reflections); (ii) the reconstruction of the visual axis in 3-D space from the optic axis of the eye and the 3-D angular deviation between the visual and optic axes (obtained through a personal calibration procedure); (iii) the estimation of the point-of-gaze as the intersection of the visual axis with the 3-D scene.
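
As an illustration of step (ii), the sketch below applies a fixed subject-specific angular deviation to a reconstructed optic axis under the common simplification, examined later in this paper, that eye torsion is ignored. It is a minimal sketch only: the frame convention (x right, y up, z from the eye toward the scene) and the offset-angle parametrization (alpha horizontal, beta vertical) are assumptions made for the example, not part of any specific method cited above.

```python
import numpy as np

def visual_axis_ignoring_torsion(omega, alpha_deg, beta_deg):
    """Approximate the visual-axis direction by applying fixed horizontal
    (alpha) and vertical (beta) offsets to the unit optic-axis vector omega,
    ignoring eye torsion.  Frame: x right, y up, z from the eye to the scene."""
    omega = omega / np.linalg.norm(omega)
    pan = np.arctan2(omega[0], omega[2])                  # horizontal angle
    tilt = np.arcsin(np.clip(omega[1], -1.0, 1.0))        # vertical angle
    pan += np.deg2rad(alpha_deg)                          # subject-specific
    tilt += np.deg2rad(beta_deg)                          # angular deviation
    nu = np.array([np.cos(tilt) * np.sin(pan),
                   np.sin(tilt),
                   np.cos(tilt) * np.cos(pan)])
    return nu / np.linalg.norm(nu)

# Example: optic axis pointing straight at the scene, with a 5º horizontal
# deviation between the optic and visual axes (the order of magnitude above).
print(visual_axis_ignoring_torsion(np.array([0.0, 0.0, 1.0]), 5.0, 0.0))
```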

Different gaze directions are associated with different amounts of torsion of the eye about the line connecting the center of rotation of the eye and the point-of-gaze (the fixation line, line of fixation or fixation axis). Consequently, the torsion of the eye should be taken into account to accurately reconstruct the visual axis in 3-D space from the optic axis of the eye and the 3-D angle between the visual and optic axes that was obtained through personal calibration. Recent publications (e.g., [Villanueva and Cabeza 2007; Nagamatsu et al. 2008]) suggested using Listing's and Donders' laws to estimate eye torsion with the objective of improving the point-of-gaze estimation accuracy. The goals of this paper are (a) to quantify the difference in the point-of-gaze estimation accuracy when eye torsion is estimated with Listing's and Donders' laws relative to when eye torsion is ignored, and (b) to discuss the limitations of Listing's and Donders' laws in general gaze estimation applications.

2 Listing's and Donders' Laws

Under a certain set of conditions (the head is erect and stationary, the point-of-gaze is stationary, and viewing is monocular with the gaze at infinity), the torsion of the eye approximately follows Listing's and Donders' laws [Carpenter 1977; Alpern 1969b; Ferman et al. 1987a]. Before stating these laws, it is necessary to define the primary position of the eye. The primary position can be defined as the position of the eyes in the skull when the head is in a natural erect position and the fixation line is horizontal and perpendicular to the line connecting the centers of rotation of both eyes. Note that, for most practical situations, the fixation line is very close to the visual axis.

Listing's law states that each movement of the eyes from the primary position to any other position is described by a single rotation about an axis that is perpendicular to the plane that contains the initial and final positions of the fixation line. Since all possible axes of rotation are perpendicular to the fixation line in the primary position, they are all contained in a vertical plane called the equatorial plane or Listing's plane. In this context, any position of the eye can be fully described by specifying the orientation of the axis of rotation in Listing's plane and the magnitude of the rotation from the primary position. Therefore, two degrees of freedom suffice to describe any eye position completely.

A consequence of Listing's law is that, except for purely horizontal or purely vertical movements, the meridian of the eye that is vertical in the primary position is systematically tilted with respect to the objective vertical (gravity) in any other position (tertiary position). Based on this, the primary position can be defined as the position from which purely horizontal movements or purely vertical movements are not associated with any tilt of the vertical meridian of the eye with respect to the objective vertical [Carpenter 1977; Alpern 1969a].

Donders' law states that the angle of the tilt of the vertical meridian of the eye with respect to the objective vertical at any given eye position is always the same regardless of the way the eyes reach that position. This implies that the torsion of the eye in any position is the same as would be observed if the eye moved to it from the primary position by rotation about a single axis in Listing's plane.

To quantify the difference in the point-of-gaze estimation accuracy when eye torsion is estimated using Listing's and Donders' laws relative to when eye torsion is ignored, Listing's law has to be expressed analytically.

3 Analytical Expressions

In order to derive the analytical expressions, assume that the fixation line is parallel to the visual axis (or, equivalently, redefine the fixation line as the line that goes through the center of rotation of the eye and is parallel to the visual axis). This is strictly true when the gaze is at infinity, in which case the fixation line and the visual axis are about 0.4-0.8 mm apart. When the point-of-gaze is at a finite but relatively large distance, the assumption that the fixation line and the visual axis are parallel is reasonable (e.g., for a point-of-gaze at 57 cm from the eye, the directions of the fixation line and the visual axis differ by less than 5 minutes of arc).

Let c be the nodal point of the eye/center of curvature of the cornea, ω be the unit vector in the direction of the optic axis of the eye, ν be the unit vector in the direction of the visual axis, and the subscript "p.p." represent the primary position of the eye (all coordinates are with respect to a world coordinate system).

The visual axis in 3-D space is then given by

    g = c + k_g \nu, \quad k_g \in \mathbb{R},                                  (1)

and the point-of-gaze is given by the value of k_g corresponding to the intersection of the visual axis with the scene.
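
When the scene is a plane, such as the computer screen used in the experiments of Section 4, k_g has a simple closed form. The sketch below is only an illustration under that assumption; the plane parametrization (a point p0 on the screen and its unit normal n) and the function name are hypothetical, not taken from the paper.

```python
import numpy as np

def point_of_gaze_on_plane(c, nu, p0, n):
    """Intersect the visual axis g = c + k_g * nu (eq. 1) with the plane
    through p0 with unit normal n, and return the point-of-gaze."""
    denom = np.dot(n, nu)
    if abs(denom) < 1e-9:
        raise ValueError("visual axis is (nearly) parallel to the scene plane")
    k_g = np.dot(n, p0 - c) / denom        # solves n . (c + k_g*nu - p0) = 0
    return c + k_g * nu

# Example (all coordinates in mm): eye about 65 cm in front of a vertical
# screen at z = 0, gazing slightly to the right and downward.
c = np.array([0.0, 0.0, -650.0])
nu = np.array([0.15, -0.10, 1.0])
nu /= np.linalg.norm(nu)
print(point_of_gaze_on_plane(c, nu, p0=np.zeros(3), n=np.array([0.0, 0.0, 1.0])))
```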

The rotation from the primary position (ν_p.p.) to any other position (ν) as prescribed by Listing's law is described by the rotation vector Γ = ε λ, where

    \lambda = \frac{\nu_{p.p.} \times \nu}{\| \nu_{p.p.} \times \nu \|}                                  (2)

is the unit vector in the direction of the axis of rotation according to the right-hand rule and

    \varepsilon = \arccos(\nu_{p.p.} \cdot \nu)                                  (3)

is the magnitude of the rotation relative to the primary position (eccentricity angle). The corresponding rotation matrix, R_Listing, can be readily obtained using Rodrigues' rotation formula [Belongie]. Then,

    \nu = R_{Listing} \, \nu_{p.p.},                                  (4)

    \omega = R_{Listing} \, \omega_{p.p.}.                                  (5)
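
A minimal numerical sketch of (2)-(5), with Rodrigues' rotation formula written out explicitly, might look as follows. The function names and the example gaze direction are illustrative only, and the sketch assumes ν is not parallel to ν_p.p. (otherwise the rotation axis in (2) is undefined).

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix for a rotation of `angle` radians about `axis`
    (Rodrigues' rotation formula)."""
    k = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])                 # cross-product matrix
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def listing_rotation(nu_pp, nu):
    """R_Listing taking the primary position to the position whose
    visual-axis direction is nu, per eqs. (2)-(3)."""
    lam = np.cross(nu_pp, nu)                          # eq. (2), unnormalized
    eps = np.arccos(np.clip(np.dot(nu_pp, nu), -1.0, 1.0))   # eq. (3)
    return rodrigues(lam, eps)

# Example: primary position along +z; gaze turned right and down.
nu_pp = np.array([0.0, 0.0, 1.0])
nu = np.array([np.sin(np.deg2rad(15.0)), -np.sin(np.deg2rad(20.0)), 1.0])
nu /= np.linalg.norm(nu)
R = listing_rotation(nu_pp, nu)
print(np.allclose(R @ nu_pp, nu))                      # eq. (4): prints True
```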
                                                                              The head of each subject was placed at 65 cm (nominal dis-
                                                                              tance), 60 cm (minimum distance) and 70 cm (maximum dis-
To be able to estimate the point-of-gaze, a set of subject-specific           tance) from the screen. For each head position, each subject
eye parameters has to be obtained through a personal calibration              completed a number of trials (2 for Subjects 1 and 2, and 7 for
procedure. Suppose now that the point-of-gaze estimation me-                  Subject 3). In each trial, the subject was asked to sequentially
thod being used can estimate c and ω without knowing the value                fixate 25 target points (arranged in a 5-by-5 rectangular grid) on
of any subject-specific eye parameter [Shih and Liu 2004; Gues-               the computer screen. For each target point, 50 point-of-gaze
trin and Eizenman 2006; Guestrin and Eizenman 2008; Naga-                     estimates (2.5 seconds @ 20 estimates/second) were obtained
matsu et al. 2008; Guestrin 2010]. In such case, only the subject-            for each eye (the point-of-gaze was estimated for both eyes si-
specific deviation of the visual axis from the optic axis of the              multaneously) using one of the methods described in [Guestrin
eye needs to be determined. Finding this deviation is equivalent              2010] (referred to as the CNoPlInt-3CR + PB-EF method). The
to finding the direction of the optic axis of the eye when the eye            subject-specific deviation of the visual axis from the optic axis



For the estimation of the point-of-gaze, the next step after determining c and ω consists of finding ν given ω, ω_p.p. and ν_p.p.. Following the reasoning from [Nagamatsu et al. 2008], the direction and magnitude of the rotation vector prescribed by Listing's law can be found from ω, ω_p.p. and ν_p.p. as

    \lambda = \frac{\nu_{p.p.} \times (\omega - \omega_{p.p.})}{\| \nu_{p.p.} \times (\omega - \omega_{p.p.}) \|},                                  (8)

    \varepsilon = \arccos\!\left( \frac{\omega_{p.p.} \cdot \omega - (\omega_{p.p.} \cdot \lambda)^{2}}{1 - (\omega_{p.p.} \cdot \lambda)^{2}} \right).                                  (9)

As before, the corresponding rotation matrix, R_Listing, can be readily obtained using Rodrigues' rotation formula. Having found R_Listing, ν is obtained with (4). Having c and ν, the point-of-gaze is found using (1).
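
Combining (1)-(9), the single-point calibration and the per-frame point-of-gaze estimation can be sketched as below. This is a minimal illustration rather than the authors' implementation: ν_p.p. is assumed known (e.g., the normal of a screen that the subject faces frontally, as in Section 4), the scene is assumed to be a plane, and the degenerate case ω = ω_p.p. (gaze at the primary position) is not handled.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix about `axis` by `angle` (Rodrigues' rotation formula)."""
    k = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def calibrate_omega_pp(g_cal, c, omega, nu_pp):
    """Single-point calibration, eqs. (6)-(7): recover the primary-position
    optic-axis direction from one fixation on the known target g_cal."""
    nu = (g_cal - c) / np.linalg.norm(g_cal - c)               # eq. (6)
    lam = np.cross(nu_pp, nu)                                  # eq. (2)
    eps = np.arccos(np.clip(np.dot(nu_pp, nu), -1.0, 1.0))     # eq. (3)
    return rodrigues(lam, eps).T @ omega                       # eq. (7)

def estimate_point_of_gaze(c, omega, omega_pp, nu_pp, p0, n):
    """Per-frame estimation, eqs. (8), (9), (4) and (1): reconstruct the
    visual axis from the measured optic axis and intersect it with the
    plane through p0 with unit normal n (the screen)."""
    d = np.cross(nu_pp, omega - omega_pp)
    lam = d / np.linalg.norm(d)                                # eq. (8)
    eps = np.arccos((np.dot(omega_pp, omega) - np.dot(omega_pp, lam) ** 2)
                    / (1.0 - np.dot(omega_pp, lam) ** 2))      # eq. (9)
    nu = rodrigues(lam, eps) @ nu_pp                           # eq. (4)
    k_g = np.dot(n, p0 - c) / np.dot(n, nu)
    return c + k_g * nu                                        # eq. (1)
```

In the "torsion ignored" alternative compared in Section 4, the subject-specific deviation would instead be applied to ω directly, without the torsional component implied by Listing's law (e.g., along the lines of the sketch in Section 1).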

4 Experimental Results

Experiments were performed with a system that uses two video cameras and multiple infrared light sources to estimate the point-of-gaze on a computer screen from the pupil and corneal reflections after each subject completes a single-point personal calibration procedure [Guestrin and Eizenman 2008; Guestrin 2010]. The plane of the screen was vertical and, in the primary position, the subject's face was parallel to the screen (consequently, ν_p.p. was perpendicular to the screen).

The experiments were carried out with three adults without eyeglasses or contact lenses in binocular conditions (binocular conditions correspond to most gaze estimation applications). The head of each subject was placed at 65 cm (nominal distance), 60 cm (minimum distance) and 70 cm (maximum distance) from the screen. For each head position, each subject completed a number of trials (2 for Subjects 1 and 2, and 7 for Subject 3). In each trial, the subject was asked to sequentially fixate 25 target points (arranged in a 5-by-5 rectangular grid) on the computer screen. For each target point, 50 point-of-gaze estimates (2.5 seconds at 20 estimates/second) were obtained for each eye (the point-of-gaze was estimated for both eyes simultaneously) using one of the methods described in [Guestrin 2010] (referred to as the CNoPlInt-3CR + PB-EF method). The subject-specific deviation of the visual axis from the optic axis was determined when the head was at the nominal distance from the screen with the plane of the face parallel to the screen and the subject fixated the target point at the center of the screen.

The RMS point-of-gaze estimation error obtained when the point-of-gaze was estimated ignoring the torsion of the eye and when it was estimated using Listing's and Donders' laws is provided in Table 1 for the three subjects. It can be observed that although in most cases the RMS error is smaller when Listing's and Donders' laws are used, the differences are on the order of 0.01º of visual angle, which is insignificant for any practical purpose (and certainly much lower than the noise level of any existing remote gaze estimation system). Figure 1 shows the mean point-of-gaze estimates obtained in each trial for both eyes. Only minor differences can be observed for the point-of-gaze estimates in the bottom corners of the array of target fixation points.

    Subject   Eye   Torsion ignored    Listing's & Donders' laws
    1         L     4.82 mm (0.40º)    4.81 mm (0.40º)
    1         R     4.84 mm (0.40º)    4.80 mm (0.39º)
    2         L     7.26 mm (0.58º)    7.29 mm (0.58º)
    2         R     6.62 mm (0.55º)    6.50 mm (0.54º)
    3         L     6.79 mm (0.56º)    6.69 mm (0.55º)
    3         R     5.99 mm (0.49º)    5.94 mm (0.49º)

Table 1: Experimental RMS point-of-gaze estimation error.
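
The errors in Table 1 are reported both in millimetres on the screen and in degrees of visual angle. The conversion sketched below is a rough illustration only: it assumes the error is measured at the nominal 65 cm viewing distance with the displacement perpendicular to the line of sight, whereas the values in Table 1 follow from the actual 3-D geometry of each estimate, so the numbers differ slightly.

```python
import math

def mm_error_to_visual_angle(err_mm, distance_mm=650.0):
    """Approximate on-screen error as degrees of visual angle, assuming the
    error is perpendicular to the line of sight at the given distance."""
    return math.degrees(math.atan2(err_mm, distance_mm))

# Subject 1, left eye, torsion ignored: 4.82 mm -> about 0.42º here,
# versus the 0.40º reported in Table 1 (different per-point geometry).
print(round(mm_error_to_visual_angle(4.82), 2))
```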

5 Discussion and Conclusions

The experimental results presented in the previous section show that when the point-of-gaze was estimated using Listing's and Donders' laws there was no significant accuracy improvement relative to when eye torsion was ignored. This is largely due to the relatively small amount of eye torsion expected under our experimental conditions. For reference, note that, using Fick's system of axes [Carpenter 1977; Ferman et al. 1987a], when the head of the subject was at 65 cm from the screen and the subject fixated the bottom right target point, the longitude (pan) angle of the fixation line of the right eye was about 10-11º, the longitude angle of the fixation line of the left eye was about 15-16º, and the latitude (tilt) angle of the fixation line of both eyes was about -20º of visual angle. In such a case, the magnitude of the torsion of the eye about the fixation line predicted by Listing's law [Ferman et al. 1987a; Guestrin 2010] is about 1.9º and 2.7º, respectively. For a 5º deviation between the visual and optic axes, a 2.7º torsion would result in a difference of less than 3 mm between the point-of-gaze estimates obtained when eye torsion is calculated with Listing's and Donders' laws and when eye torsion is ignored. It is recognized, however, that for a larger range of eye rotation the torsion would be more significant and should be taken into account.
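
The torsion values quoted above can be cross-checked numerically by comparing the eye orientation prescribed by Listing's law with a zero-torsion Fick gimbal that reaches the same gaze direction; the residual rotation between the two is a rotation about the line of sight whose angle is the predicted torsion. The sketch below is only an illustrative check: the axis conventions, the signs, and the 10.5º/15.5º example angles are assumptions, and the printed magnitudes come out close to (though not exactly equal to) the roughly 1.9º and 2.7º cited in the text.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix about `axis` by `angle` (Rodrigues' rotation formula)."""
    k = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def listing_torsion_deg(longitude_deg, latitude_deg):
    """Torsion about the line of sight, relative to a zero-torsion Fick
    gimbal, of the orientation prescribed by Listing's law."""
    th, ph = np.deg2rad(longitude_deg), np.deg2rad(latitude_deg)
    # Fick gimbal: rotate about the vertical axis, then the rotated horizontal axis.
    R_fick = (rodrigues(np.array([0.0, 1.0, 0.0]), th)
              @ rodrigues(np.array([1.0, 0.0, 0.0]), ph))
    z = np.array([0.0, 0.0, 1.0])               # primary gaze direction
    g = R_fick @ z                              # gaze direction for these angles
    # Listing's law: single rotation taking the primary direction to g.
    R_listing = rodrigues(np.cross(z, g), np.arccos(np.clip(z @ g, -1.0, 1.0)))
    D = R_fick.T @ R_listing                    # residual rotation about z
    return np.degrees(np.arctan2(D[1, 0], D[0, 0]))

print(abs(listing_torsion_deg(10.5, -20.0)))    # right eye: close to 1.9º
print(abs(listing_torsion_deg(15.5, -20.0)))    # left eye: close to 2.7º
```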

In general, the torsion of the eye depends on the position and orientation of the head (especially the orientation of the head with respect to gravity), the point-of-gaze and whether vision is monocular or binocular. The torsion of the eye is also affected by disorders of the visual and oculomotor systems.

Listing's and Donders' laws are, in the best case, only approximate. Listing's law does not apply when the head is not erect, when the head is non-stationary [Carpenter 1977] or in binocular conditions, especially when viewing a relatively close object (convergence of the fixation lines, vergence eye movements) [Carpenter 1977; Alpern 1969b]. In addition, Listing's and Donders' laws do not hold during smooth pursuit (eye movements that follow a moving target) [Ferman et al. 1987b].

Even under monocular viewing conditions when the head is erect and stationary and the point-of-gaze is also stationary, Listing's law is only approximate: physiological eye movements show considerable stochastic as well as systematic deviations from this law [Ferman et al. 1987a]. Some of the observations from [Ferman et al. 1987a] are: (a) even when the eye is in the primary position there can be considerable fluctuations of torsion; (b) there is asymmetry between the torsion in the nasal quadrants and the torsion in the temporal quadrants; (c) there are variations in torsion between the left and right eyes of any subject, and among subjects.

From the preceding discussion it is clear that Listing's and Donders' laws have several limitations for practical use. For example, while certain applications in ophthalmology and neurology require the estimation of gaze in monocular conditions (the eye that is not being tested is covered or patched) with a stationary head, most gaze estimation applications need to be carried out in binocular conditions under natural translational and rotational head movements. Moreover, the torsion of the eye resulting from lateral head tilt (observed often in applications with infants) can be significantly larger than the torsion predicted by Listing's law, and can even have the opposite direction. All this suggests that, to properly account for eye torsion, the torsion of the eye should be measured independently, for example, by tracking the iris pattern and/or the scleral blood vessels.



References

ALPERN, M. 1969a. Specification of the Direction of Regard. In The Eye, 2nd ed., vol. 3, H. Davson, Ed. Academic Press Inc. (London) Ltd.

ALPERN, M. 1969b. Kinematics of the Eye. In The Eye, 2nd ed., vol. 3, H. Davson, Ed. Academic Press Inc. (London) Ltd.

BELONGIE, S. Rodrigues' Rotation Formula. MathWorld – A Wolfram Web Resource, created by Eric W. Weisstein. http://mathworld.wolfram.com/RodriguesRotationFormula.html

CARPENTER, R. H. S. 1977. Movements of the Eyes. Pion.

FERMAN, L., COLLEWIJN, H., AND VAN DEN BERG, A. V. 1987a. A Direct Test of Listing's Law – I. Human Ocular Torsion Measured in Static Tertiary Positions. Vision Research 27, 6, 929–938.

FERMAN, L., COLLEWIJN, H., AND VAN DEN BERG, A. V. 1987b. A Direct Test of Listing's Law – II. Human Ocular Torsion Measured under Dynamic Conditions. Vision Research 27, 6, 939–951.

GUESTRIN, E. D., AND EIZENMAN, M. 2006. General Theory of Remote Gaze Estimation Using the Pupil Center and Corneal Reflections. IEEE Transactions on Biomedical Engineering 53, 6, 1124–1133.

GUESTRIN, E. D., AND EIZENMAN, M. 2008. Remote Point-of-Gaze Estimation Requiring a Single-Point Calibration for Applications with Infants. In Proceedings of ETRA 2008, ACM Press, 267–274.

GUESTRIN, E. D. 2010. Remote, Non-Contact Gaze Estimation with Minimal Subject Cooperation. PhD thesis, University of Toronto.

NAGAMATSU, T., KAMAHARA, J., IKO, T., AND TANAKA, N. 2008. One-Point Calibration Gaze Tracking Based on Eyeball Kinematics Using Stereo Cameras. In Proceedings of ETRA 2008, ACM Press, 95–98.

SHIH, S.-W., AND LIU, J. 2004. A Novel Approach to 3-D Gaze Tracking Using Stereo Cameras. IEEE Transactions on Systems, Man, and Cybernetics B 34, 1, 234–245.

VILLANUEVA, A., AND CABEZA, R. 2007. Models for Gaze Tracking Systems. EURASIP Journal on Image and Video Processing 2007, article ID 23570.

[Figure 1 (not reproduced here) shows, for Subject 1 (6 trials in total), Subject 2 (6 trials in total) and Subject 3 (21 trials in total), the mean point-of-gaze estimates obtained with torsion ignored and with torsion predicted by Listing's and Donders' laws.]

Figure 1: Mean point-of-gaze estimates (+: target fixation point; separate markers for the left and right eyes). All coordinates are in mm. (Figures © 2009 Elias D. Guestrin.)





Mais conteúdo relacionado

Mais procurados

introduction to learn laser in refractive erros
introduction to learn laser in refractive errosintroduction to learn laser in refractive erros
introduction to learn laser in refractive errosGazy Khatmi
 
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORIRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORcscpconf
 

Mais procurados (6)

Ophthalmology 5th year, 4th lecture (Dr. Tara)
Ophthalmology 5th year, 4th lecture (Dr. Tara)Ophthalmology 5th year, 4th lecture (Dr. Tara)
Ophthalmology 5th year, 4th lecture (Dr. Tara)
 
L15 chapter 13 keratometry and keratoscopy 2 2007 2008
L15 chapter 13 keratometry and keratoscopy 2 2007 2008L15 chapter 13 keratometry and keratoscopy 2 2007 2008
L15 chapter 13 keratometry and keratoscopy 2 2007 2008
 
introduction to learn laser in refractive erros
introduction to learn laser in refractive errosintroduction to learn laser in refractive erros
introduction to learn laser in refractive erros
 
Keratometry
KeratometryKeratometry
Keratometry
 
Defending
DefendingDefending
Defending
 
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORIRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
 

Destaque

Estatal comercio convenio_alec_2010_12
Estatal comercio convenio_alec_2010_12Estatal comercio convenio_alec_2010_12
Estatal comercio convenio_alec_2010_12oscargaliza
 
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Kalle
 
Es3 Pifferati Valentina
Es3 Pifferati ValentinaEs3 Pifferati Valentina
Es3 Pifferati Valentinaguest48e709
 
F2F 2015 - Client SDK (Specific Plataform Android)
F2F 2015 - Client SDK (Specific Plataform Android)F2F 2015 - Client SDK (Specific Plataform Android)
F2F 2015 - Client SDK (Specific Plataform Android)Daniel Passos
 
Galiciasindial11
Galiciasindial11Galiciasindial11
Galiciasindial11oscargaliza
 
M. Holovaty, Концепции автоматизированного тестирования
M. Holovaty, Концепции автоматизированного тестированияM. Holovaty, Концепции автоматизированного тестирования
M. Holovaty, Концепции автоматизированного тестированияAlex
 
Sana's lab report
Sana's lab reportSana's lab report
Sana's lab reportSana Samad
 
Yaşamboyu Öğrenme
Yaşamboyu ÖğrenmeYaşamboyu Öğrenme
Yaşamboyu Öğrenmenazzzy
 
Clase accesos venosos
Clase accesos venososClase accesos venosos
Clase accesos venososMANUEL RIVERA
 
Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...
Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...
Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...TDR d.o.o Rovinj
 
Y2e rewardsclub-payplan-presentation
Y2e rewardsclub-payplan-presentationY2e rewardsclub-payplan-presentation
Y2e rewardsclub-payplan-presentationabnercash
 
Galicia sindical 1 maio
Galicia sindical 1 maioGalicia sindical 1 maio
Galicia sindical 1 maiooscargaliza
 
Ana cristina comments
Ana cristina commentsAna cristina comments
Ana cristina commentscriszamu
 
Continuous Interop Testing
Continuous Interop TestingContinuous Interop Testing
Continuous Interop TestingJohn Cocke
 
Trip to arizona
Trip to arizonaTrip to arizona
Trip to arizonarachit5609
 

Destaque (20)

Estatal comercio convenio_alec_2010_12
Estatal comercio convenio_alec_2010_12Estatal comercio convenio_alec_2010_12
Estatal comercio convenio_alec_2010_12
 
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
 
Web 2 0
Web 2 0Web 2 0
Web 2 0
 
Es3 Pifferati Valentina
Es3 Pifferati ValentinaEs3 Pifferati Valentina
Es3 Pifferati Valentina
 
Cd covers
Cd coversCd covers
Cd covers
 
F2F 2015 - Client SDK (Specific Plataform Android)
F2F 2015 - Client SDK (Specific Plataform Android)F2F 2015 - Client SDK (Specific Plataform Android)
F2F 2015 - Client SDK (Specific Plataform Android)
 
Outlook Express
Outlook ExpressOutlook Express
Outlook Express
 
Galiciasindial11
Galiciasindial11Galiciasindial11
Galiciasindial11
 
M. Holovaty, Концепции автоматизированного тестирования
M. Holovaty, Концепции автоматизированного тестированияM. Holovaty, Концепции автоматизированного тестирования
M. Holovaty, Концепции автоматизированного тестирования
 
Sana's lab report
Sana's lab reportSana's lab report
Sana's lab report
 
Yaşamboyu Öğrenme
Yaşamboyu ÖğrenmeYaşamboyu Öğrenme
Yaşamboyu Öğrenme
 
Clase accesos venosos
Clase accesos venososClase accesos venosos
Clase accesos venosos
 
Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...
Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...
Predstavljanje rezultata poslovanja za 2011. godinu na tržištu Bosne i Herceg...
 
Y2e rewardsclub-payplan-presentation
Y2e rewardsclub-payplan-presentationY2e rewardsclub-payplan-presentation
Y2e rewardsclub-payplan-presentation
 
Galicia sindical 1 maio
Galicia sindical 1 maioGalicia sindical 1 maio
Galicia sindical 1 maio
 
Vpn
VpnVpn
Vpn
 
Ana cristina comments
Ana cristina commentsAna cristina comments
Ana cristina comments
 
Continuous Interop Testing
Continuous Interop TestingContinuous Interop Testing
Continuous Interop Testing
 
Trip to arizona
Trip to arizonaTrip to arizona
Trip to arizona
 
Presentatie lastoevoegmaterialen
Presentatie lastoevoegmaterialenPresentatie lastoevoegmaterialen
Presentatie lastoevoegmaterialen
 

Semelhante a Estimating Point-of-Gaze Accuracy Using Listing's and Donders' Laws

Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeKalle
 
Model User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation SystemModel User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation SystemKalle
 
Daugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye Tracker
Daugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye TrackerDaugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye Tracker
Daugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye TrackerKalle
 
Topic stereoscopy, Parallax, Relief displacement
Topic  stereoscopy, Parallax, Relief displacementTopic  stereoscopy, Parallax, Relief displacement
Topic stereoscopy, Parallax, Relief displacementsrinivas2036
 
ELEVATION BASED CORNEAL TOPOGRAPHY.pptx
ELEVATION BASED  CORNEAL TOPOGRAPHY.pptxELEVATION BASED  CORNEAL TOPOGRAPHY.pptx
ELEVATION BASED CORNEAL TOPOGRAPHY.pptxBipin Koirala
 
Ncert class-12-physics-part-2
Ncert class-12-physics-part-2Ncert class-12-physics-part-2
Ncert class-12-physics-part-2RAHUL SINGH
 
Coutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye TrackingCoutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye TrackingKalle
 
Aspheric IOLs for CRGH
Aspheric IOLs for CRGHAspheric IOLs for CRGH
Aspheric IOLs for CRGHperfectvision
 
Introduction to accommodative and binocular anomalies
Introduction to accommodative and binocular anomaliesIntroduction to accommodative and binocular anomalies
Introduction to accommodative and binocular anomaliesHammed Sherifdeen
 
Eie426 eye tracking-v2 (1)
Eie426 eye tracking-v2 (1)Eie426 eye tracking-v2 (1)
Eie426 eye tracking-v2 (1)manu8722487734
 
Corneal topography and wavefront imaging
Corneal topography and wavefront imagingCorneal topography and wavefront imaging
Corneal topography and wavefront imagingSocrates Narvaez
 
Understanding corneal-topography
Understanding corneal-topographyUnderstanding corneal-topography
Understanding corneal-topographymanojraut125
 
Correcting presbyopia - Modern Options
Correcting presbyopia - Modern OptionsCorrecting presbyopia - Modern Options
Correcting presbyopia - Modern OptionsJason Higginbotham
 
New microsoft power point presentation
New microsoft power point presentationNew microsoft power point presentation
New microsoft power point presentationJainull Abedin
 
Characterization of unknown space objects
Characterization of unknown space objectsCharacterization of unknown space objects
Characterization of unknown space objectsClifford Stone
 

Semelhante a Estimating Point-of-Gaze Accuracy Using Listing's and Donders' Laws (20)

Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
 
A scan biometry
A scan biometryA scan biometry
A scan biometry
 
Model User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation SystemModel User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation System
 
Daugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye Tracker
Daugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye TrackerDaugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye Tracker
Daugherty Measuring Vergence Over Stereoscopic Video With A Remote Eye Tracker
 
Accomodative convergance and accommodation
Accomodative convergance and accommodationAccomodative convergance and accommodation
Accomodative convergance and accommodation
 
Topic stereoscopy, Parallax, Relief displacement
Topic  stereoscopy, Parallax, Relief displacementTopic  stereoscopy, Parallax, Relief displacement
Topic stereoscopy, Parallax, Relief displacement
 
ELEVATION BASED CORNEAL TOPOGRAPHY.pptx
ELEVATION BASED  CORNEAL TOPOGRAPHY.pptxELEVATION BASED  CORNEAL TOPOGRAPHY.pptx
ELEVATION BASED CORNEAL TOPOGRAPHY.pptx
 
Ncert class-12-physics-part-2
Ncert class-12-physics-part-2Ncert class-12-physics-part-2
Ncert class-12-physics-part-2
 
Coutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye TrackingCoutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
 
Synoptophore Information Guide
Synoptophore Information GuideSynoptophore Information Guide
Synoptophore Information Guide
 
Aspheric IOLs for CRGH
Aspheric IOLs for CRGHAspheric IOLs for CRGH
Aspheric IOLs for CRGH
 
Introduction to accommodative and binocular anomalies
Introduction to accommodative and binocular anomaliesIntroduction to accommodative and binocular anomalies
Introduction to accommodative and binocular anomalies
 
Eie426 eye tracking-v2 (1)
Eie426 eye tracking-v2 (1)Eie426 eye tracking-v2 (1)
Eie426 eye tracking-v2 (1)
 
Corneal topography and wavefront imaging
Corneal topography and wavefront imagingCorneal topography and wavefront imaging
Corneal topography and wavefront imaging
 
Understanding corneal-topography
Understanding corneal-topographyUnderstanding corneal-topography
Understanding corneal-topography
 
Correcting presbyopia - Modern Options
Correcting presbyopia - Modern OptionsCorrecting presbyopia - Modern Options
Correcting presbyopia - Modern Options
 
New microsoft power point presentation
New microsoft power point presentationNew microsoft power point presentation
New microsoft power point presentation
 
09.Ray optics.pdf
09.Ray optics.pdf09.Ray optics.pdf
09.Ray optics.pdf
 
Ch 9 images
Ch 9  imagesCh 9  images
Ch 9 images
 
Characterization of unknown space objects
Characterization of unknown space objectsCharacterization of unknown space objects
Characterization of unknown space objects
 

Mais de Kalle

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsKalle
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Kalle
 
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Kalle
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Kalle
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Kalle
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlKalle
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Kalle
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingKalle
 
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Kalle
 
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Kalle
 
Skovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneSkovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneKalle
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerKalle
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksKalle
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingKalle
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchKalle
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyKalle
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Kalle
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Kalle
 
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Kalle
 
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Kalle
 

Mais de Kalle (20)

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
 
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
 
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
 
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
 
Skovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneSkovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze Alone
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem Solving
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement Study
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
 
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
 
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
 

Estimating Point-of-Gaze Accuracy Using Listing's and Donders' Laws

Keywords: kinematics of the eye, eye torsion, Listing’s law, Donders’ law, gaze estimation accuracy

1 Introduction

Gaze estimation methods based on 3-D models (e.g., [Shih and Liu 2004; Guestrin and Eizenman 2006; Villanueva and Cabeza 2007; Guestrin and Eizenman 2008; Nagamatsu et al. 2008; Guestrin 2010]) follow three steps: (i) the reconstruction of the optic axis of the eye in 3-D space from a set of eye features (e.g., the pupil and corneal reflections); (ii) the reconstruction of the visual axis in 3-D space from the optic axis of the eye and the 3-D angular deviation between the visual and optic axes (obtained through a personal calibration procedure); (iii) the estimation of the point-of-gaze as the intersection of the visual axis with the 3-D scene.

Different gaze directions are associated with different amounts of torsion of the eye about the line connecting the center of rotation of the eye and the point-of-gaze (the fixation line, line of fixation or fixation axis). Consequently, the torsion of the eye should be taken into account to accurately reconstruct the visual axis in 3-D space from the optic axis of the eye and the 3-D angle between the visual and optic axes obtained through personal calibration.

The primary position of the eye can be defined as the position of the eyes in the skull when the head is in a natural erect position and the fixation line is horizontal and perpendicular to the line connecting the centers of rotation of both eyes. Note that, for most practical situations, the fixation line is very close to the visual axis. The point-of-gaze can be defined as the intersection of the visual axis with the scene. The visual axis is the line defined by the center of the fovea and the nodal point of the eye, and it exhibits a subject-specific deviation from the optic axis of the eye of as much as 5º.

Listing’s law states that each movement of the eyes from the primary position to any other position is described by a single rotation about an axis that is perpendicular to the plane that contains the initial and final positions of the fixation line. Since all possible axes of rotation are perpendicular to the fixation line in the primary position, they are all contained in a vertical plane called the equatorial plane or Listing’s plane. In this context, any position of the eye can be fully described by specifying the orientation of the axis of rotation in Listing’s plane and the magnitude of the rotation from the primary position. Therefore, two degrees of freedom suffice to describe any eye position completely.

A consequence of Listing’s law is that, except for purely horizontal or purely vertical movements, the meridian of the eye that is vertical in the primary position is systematically tilted with respect to the objective vertical (gravity) in any other position (tertiary position). Based on this, the primary position can also be defined as the position from which purely horizontal or purely vertical movements are not associated with any tilt of the vertical meridian of the eye with respect to the objective vertical [Carpenter 1977; Alpern 1969a].

Donders’ law states that the angle of the tilt of the vertical meridian of the eye with respect to the objective vertical at any given eye position is always the same, regardless of the way the eyes reach that position. This implies that the torsion of the eye in any position is the same as would be observed if the eye had moved to it from the primary position by a rotation about a single axis in Listing’s plane.
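As a concrete illustration of the kinematics described above, the short sketch below (not part of the original paper; the head-fixed frame, with the primary fixation direction along +z and the objective vertical along +y, and the numerical gaze direction are assumptions made only for this example) constructs the single rotation prescribed by Listing’s law for an oblique gaze direction and measures the resulting tilt of the eye’s vertical meridian with respect to the objective vertical. The tilt vanishes for purely horizontal or purely vertical movements from the primary position and is non-zero for tertiary positions, which is the consequence of Listing’s law stated above.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix for a rotation by `angle` (radians) about the unit vector
    `axis` (Rodrigues' rotation formula)."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

# Assumed head-fixed frame: primary fixation direction along +z, objective vertical along +y.
nu_pp = np.array([0.0, 0.0, 1.0])
up = np.array([0.0, 1.0, 0.0])

# An oblique (tertiary) gaze direction, to the side of and below the primary position.
nu = np.array([0.35, -0.25, 1.0])
nu /= np.linalg.norm(nu)

# Listing's law: a single rotation whose axis lies in Listing's plane (perpendicular to nu_pp).
axis = np.cross(nu_pp, nu)
axis /= np.linalg.norm(axis)
eps = np.arccos(np.clip(np.dot(nu_pp, nu), -1.0, 1.0))   # eccentricity angle
R = rodrigues(axis, eps)
assert np.allclose(R @ nu_pp, nu)   # the rotation takes the primary direction to the gaze direction

# Tilt of the eye's vertical meridian with respect to the objective vertical, measured
# about the line of sight: zero only for purely horizontal or purely vertical movements.
meridian = R @ up                        # eye-fixed vertical meridian after the movement
up_proj = up - np.dot(up, nu) * nu       # objective vertical as seen along the line of sight
up_proj /= np.linalg.norm(up_proj)
tilt = np.degrees(np.arctan2(np.dot(np.cross(up_proj, meridian), nu),
                             np.dot(up_proj, meridian)))
print(f"tilt of the vertical meridian: {tilt:.2f} deg")
```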
To quantify the difference in the point-of-gaze estimation accuracy when eye torsion is estimated using Listing’s and Donders’ laws relative to when eye torsion is ignored, Listing’s law has to be expressed analytically.

3 Analytical Expressions

In order to derive the analytical expressions, assume that the fixation line is parallel to the visual axis (or, equivalently, redefine the fixation line as the line that goes through the center of rotation of the eye and is parallel to the visual axis). This is strictly true when the gaze is at infinity, in which case the fixation line and the visual axis are about 0.4–0.8 mm apart. When the point-of-gaze is at a finite but relatively large distance, the assumption that the fixation line and the visual axis are parallel is reasonable (e.g., for a point-of-gaze at 57 cm from the eye, the directions of the fixation line and the visual axis differ by less than 5 minutes of arc).

Let c be the nodal point of the eye/center of curvature of the cornea, ω be the unit vector in the direction of the optic axis of the eye, ν be the unit vector in the direction of the visual axis, and the subscript “p.p.” denote the primary position of the eye (all coordinates are with respect to a world coordinate system). The visual axis in 3-D space is then given by

    g = c + k_g \nu, \quad k_g \in \mathbb{R},    (1)

and the point-of-gaze is given by the value of k_g corresponding to the intersection of the visual axis with the scene.

The rotation from the primary position (ν_p.p.) to any other position (ν) as prescribed by Listing’s law is described by the rotation vector Γ = ε λ, where

    \lambda = \frac{\nu_{p.p.} \times \nu}{\| \nu_{p.p.} \times \nu \|}    (2)

is the unit vector in the direction of the axis of rotation according to the right-hand rule and

    \varepsilon = \arccos(\nu_{p.p.} \cdot \nu)    (3)

is the magnitude of the rotation relative to the primary position (eccentricity angle). The corresponding rotation matrix, R_Listing, can be readily obtained using Rodrigues’ rotation formula [Belongie]. Then,

    \nu = R_{Listing} \, \nu_{p.p.},    (4)

    \omega = R_{Listing} \, \omega_{p.p.}.    (5)

To be able to estimate the point-of-gaze, a set of subject-specific eye parameters has to be obtained through a personal calibration procedure. Suppose now that the point-of-gaze estimation method being used can estimate c and ω without knowing the value of any subject-specific eye parameter [Shih and Liu 2004; Guestrin and Eizenman 2006; Guestrin and Eizenman 2008; Nagamatsu et al. 2008; Guestrin 2010]. In such a case, only the subject-specific deviation of the visual axis from the optic axis of the eye needs to be determined. Finding this deviation is equivalent to finding the direction of the optic axis of the eye when the eye is in the primary position (ω_p.p.).

In the personal calibration procedure, c and ω are determined while the subject fixates a known point g (not necessarily in the primary position). The unit vector in the direction of the visual axis is then given by

    \nu = \frac{g - c}{\| g - c \|}.    (6)

Substituting (6) into (2)–(3) provides the direction and magnitude of the rotation vector prescribed by Listing’s law when the subject fixated the target calibration point. After obtaining the corresponding rotation matrix, R_Listing, using Rodrigues’ rotation formula, ω_p.p. is obtained from (5) as

    \omega_{p.p.} = R_{Listing}^{T} \, \omega,    (7)

using the fact that rotation matrices satisfy R^{-1} = R^T.

For the estimation of the point-of-gaze, the next step after determining c and ω consists of finding ν given ω, ω_p.p. and ν_p.p.. Following the reasoning from [Nagamatsu et al. 2008], the direction and magnitude of the rotation vector prescribed by Listing’s law can be found from ω, ω_p.p. and ν_p.p. as

    \lambda = \frac{\nu_{p.p.} \times (\omega - \omega_{p.p.})}{\| \nu_{p.p.} \times (\omega - \omega_{p.p.}) \|},    (8)

    \varepsilon = \arccos\!\left( \frac{\omega_{p.p.} \cdot \omega - (\omega_{p.p.} \cdot \lambda)^2}{1 - (\omega_{p.p.} \cdot \lambda)^2} \right).    (9)

As before, the corresponding rotation matrix, R_Listing, can be readily obtained using Rodrigues’ rotation formula. Having found R_Listing, ν is obtained with (4). Having c and ν, the point-of-gaze is found using (1).
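To make the sequence of expressions (1)–(9) concrete, the sketch below (an illustrative reconstruction, not code from the paper) implements the calibration and estimation steps in Python/numpy: `calibrate_omega_pp` recovers ω_p.p. from c, ω and a known calibration target using (6), (2)–(3) and (7); `visual_axis_direction` recovers ν from ω using (8)–(9) and (4); and `point_of_gaze_on_plane` intersects the visual axis (1) with a plane standing in for the screen. It assumes, as in the experiments described next, that ν_p.p. is perpendicular to the screen, and that c and ω are supplied by an underlying 3-D gaze estimation method; all function names and numerical values are invented for the example. The synthetic check at the end simulates an eye that follows Listing’s law exactly and verifies that the estimated point-of-gaze coincides with the simulated fixation target.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix for a rotation by `angle` (radians) about the unit vector `axis`."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def unit(v):
    return v / np.linalg.norm(v)

def calibrate_omega_pp(c, omega, g, nu_pp):
    """Single-point calibration: recover the optic-axis direction in the primary position,
    omega_pp, from c, omega and a known fixation target g (eqs. 6, 2, 3 and 7).
    Assumes the calibration direction is not exactly the primary direction."""
    nu = unit(g - c)                                        # eq. (6)
    lam = unit(np.cross(nu_pp, nu))                         # eq. (2)
    eps = np.arccos(np.clip(np.dot(nu_pp, nu), -1.0, 1.0))  # eq. (3)
    return rodrigues(lam, eps).T @ omega                    # eq. (7)

def visual_axis_direction(omega, omega_pp, nu_pp):
    """Estimation step: recover the visual-axis direction nu from the measured optic-axis
    direction omega using Listing's law (eqs. 8, 9 and 4). The special case
    omega == omega_pp (gaze in the primary position) is not handled."""
    lam = unit(np.cross(nu_pp, omega - omega_pp))           # eq. (8)
    cos_eps = ((np.dot(omega_pp, omega) - np.dot(omega_pp, lam) ** 2)
               / (1.0 - np.dot(omega_pp, lam) ** 2))        # eq. (9)
    eps = np.arccos(np.clip(cos_eps, -1.0, 1.0))
    return rodrigues(lam, eps) @ nu_pp                      # eq. (4)

def point_of_gaze_on_plane(c, nu, p0, n):
    """Point-of-gaze: intersection of the visual axis g = c + k*nu (eq. 1) with the
    plane through p0 with normal n (here, the plane of the screen)."""
    k = np.dot(p0 - c, n) / np.dot(nu, n)
    return c + k * nu

# --- Synthetic consistency check (illustrative numbers, in mm; screen plane z = 0) ---
nu_pp = np.array([0.0, 0.0, -1.0])          # primary fixation direction: toward the screen
c = np.array([10.0, -20.0, 650.0])          # nodal point / center of corneal curvature
# A "true" optic axis in the primary position, deviating about 5 deg from nu_pp.
omega_pp_true = rodrigues(np.array([0.0, 1.0, 0.0]), np.radians(5.0)) @ nu_pp

def simulated_omega(g_fix):
    """Optic-axis direction that would be measured while fixating g_fix, for an eye
    that obeys Listing's law exactly."""
    nu = unit(g_fix - c)
    R = rodrigues(unit(np.cross(nu_pp, nu)),
                  np.arccos(np.clip(np.dot(nu_pp, nu), -1.0, 1.0)))
    return R @ omega_pp_true

g_cal = np.array([0.0, 0.0, 0.0])           # calibration target at the center of the screen
omega_pp_est = calibrate_omega_pp(c, simulated_omega(g_cal), g_cal, nu_pp)

g_fix = np.array([150.0, -100.0, 0.0])      # an arbitrary fixation target on the screen
nu_est = visual_axis_direction(simulated_omega(g_fix), omega_pp_est, nu_pp)
pog = point_of_gaze_on_plane(c, nu_est, np.zeros(3), np.array([0.0, 0.0, 1.0]))
print(np.round(pog, 6))                     # ≈ [150, -100, 0]: recovers the fixation target
```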
4 Experimental Results

Experiments were performed with a system that uses two video cameras and multiple infrared light sources to estimate the point-of-gaze on a computer screen from the pupil and corneal reflections after each subject completes a single-point personal calibration procedure [Guestrin and Eizenman 2008; Guestrin 2010]. The plane of the screen was vertical and, in the primary position, the subject’s face was parallel to the screen (consequently, ν_p.p. was perpendicular to the screen).

The experiments were carried out with three adults without eyeglasses or contact lenses in binocular conditions (binocular conditions correspond to most gaze estimation applications). The head of each subject was placed at 65 cm (nominal distance), 60 cm (minimum distance) and 70 cm (maximum distance) from the screen. For each head position, each subject completed a number of trials (2 for Subjects 1 and 2, and 7 for Subject 3). In each trial, the subject was asked to sequentially fixate 25 target points (arranged in a 5-by-5 rectangular grid) on the computer screen. For each target point, 50 point-of-gaze estimates (2.5 seconds @ 20 estimates/second) were obtained for each eye (the point-of-gaze was estimated for both eyes simultaneously) using one of the methods described in [Guestrin 2010] (referred to as the CNoPlInt-3CR + PB-EF method). The subject-specific deviation of the visual axis from the optic axis was determined when the head was at the nominal distance from the screen with the plane of the face parallel to the screen and the subject fixated the target point at the center of the screen.

The RMS point-of-gaze estimation error obtained when the point-of-gaze was estimated ignoring the torsion of the eye and when it was estimated using Listing’s and Donders’ laws is provided in Table 1 for the three subjects. It can be observed that although in most cases the RMS error is smaller when Listing’s and Donders’ laws are used, the differences are in the order of 0.01º of visual angle, which is insignificant for any practical purpose (certainly much lower than the noise level of any existing remote gaze estimation system). Figure 1 shows the mean point-of-gaze estimates obtained in each trial for both eyes. Only minor differences can be observed for the point-of-gaze estimates in the bottom corners of the array of target fixation points.

Subject   Eye   Torsion ignored     Listing’s & Donders’ laws
1         L     4.82 mm (0.40º)     4.81 mm (0.40º)
1         R     4.84 mm (0.40º)     4.80 mm (0.39º)
2         L     7.26 mm (0.58º)     7.29 mm (0.58º)
2         R     6.62 mm (0.55º)     6.50 mm (0.54º)
3         L     6.79 mm (0.56º)     6.69 mm (0.55º)
3         R     5.99 mm (0.49º)     5.94 mm (0.49º)

Table 1: Experimental RMS point-of-gaze estimation error.

5 Discussion and Conclusions

The experimental results presented in the previous section show that when the point-of-gaze was estimated using Listing’s and Donders’ laws there was no significant accuracy improvement relative to when eye torsion was ignored. This is largely due to the relatively small amount of eye torsion expected under our experimental conditions.

For reference, note that, using Fick’s system of axes [Carpenter 1977; Ferman et al. 1987a], when the head of the subject was at 65 cm from the screen and the subject fixated the bottom right target point, the longitude (pan) angle of the fixation line of the right eye was about 10–11º, the longitude angle of the fixation line of the left eye was about 15–16º, and the latitude (tilt) angle of the fixation line of both eyes was about –20º of visual angle. In such a case, the magnitude of the torsion of the eye about the fixation line predicted by Listing’s law [Ferman et al. 1987a; Guestrin 2010] is about 1.9º and 2.7º, respectively. For a 5º deviation between the visual and optic axes, a 2.7º torsion would result in a difference of less than 3 mm between the point-of-gaze estimates obtained when eye torsion is calculated with Listing’s and Donders’ laws and when eye torsion is ignored. It is recognized, however, that for a larger range of eye rotation the torsion would be more significant and should be taken into account.
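The numbers quoted above can be checked with a short calculation. The sketch below (not from the paper; it assumes the primary fixation direction is perpendicular to the screen and, following the paper’s use of Fick’s system of axes, takes a Fick gimbal, i.e. a horizontal rotation about the head-fixed vertical axis followed by a vertical rotation about the carried horizontal axis, as the zero-torsion reference) computes the torsion about the line of fixation implied by Listing’s law for the quoted pan and tilt angles, and the corresponding shift of the point-of-gaze for a 5º deviation between the visual and optic axes at roughly 65 cm. The results come out near the 1.9º and 2.7º torsion values and the sub-3 mm difference mentioned above.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix for a rotation by `angle` (radians) about the unit vector `axis`."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def listing_torsion_wrt_fick(pan_deg, tilt_deg):
    """Torsion about the line of fixation of the Listing's-law eye orientation, relative
    to a Fick (zero-torsion) reference reaching the same fixation direction."""
    x, y, z = np.eye(3)                     # z: assumed primary fixation direction; y: vertical
    pan, tilt = np.radians([pan_deg, tilt_deg])
    R_fick = rodrigues(y, pan) @ rodrigues(x, tilt)     # pan, then tilt about the carried axis
    nu = R_fick @ z                                     # resulting line of fixation
    lam = np.cross(z, nu)
    lam /= np.linalg.norm(lam)
    R_listing = rodrigues(lam, np.arccos(np.clip(np.dot(z, nu), -1.0, 1.0)))
    m_fick, m_listing = R_fick @ y, R_listing @ y       # images of the vertical meridian
    return np.degrees(np.arctan2(np.dot(np.cross(m_fick, m_listing), nu),
                                 np.dot(m_fick, m_listing)))

for pan in (10.5, 15.5):                    # approximate pan of the right and left eyes
    tau = abs(listing_torsion_wrt_fick(pan, -20.0))
    # Approximate point-of-gaze shift: a torsion tau about the optic axis swings the visual
    # axis (about 5 deg away from it) by roughly 2*sin(5 deg)*sin(tau/2), at about 650 mm.
    shift = 2 * np.sin(np.radians(5.0)) * np.sin(np.radians(tau) / 2.0) * 650.0
    print(f"pan {pan} deg, tilt -20 deg: torsion ≈ {tau:.2f} deg, POG shift ≈ {shift:.1f} mm")
```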
In general, the torsion of the eye depends on the position and orientation of the head (especially the orientation of the head with respect to gravity), the point-of-gaze and whether vision is monocular or binocular. The torsion of the eye is also affected by disorders of the visual and oculomotor systems.

Listing’s and Donders’ laws are, in the best case, only approximate. Listing’s law does not apply when the head is not erect, when the head is non-stationary [Carpenter 1977] or in binocular conditions, especially when viewing a relatively close object (convergence of the fixation lines, vergence eye movements) [Carpenter 1977; Alpern 1969b]. In addition, Listing’s and Donders’ laws do not hold during smooth pursuit (eye movements that follow a moving target) [Ferman et al. 1987b].

Even under monocular viewing conditions when the head is erect and stationary and the point-of-gaze is also stationary, Listing’s law is only approximate: physiological eye movements show considerable stochastic as well as systematic deviations from this law [Ferman et al. 1987a]. Some of the observations from [Ferman et al. 1987a] are: (a) even when the eye is in the primary position there can be considerable fluctuations of torsion; (b) there is asymmetry between the torsion in the nasal quadrants and the torsion in the temporal quadrants; (c) there are variations in torsion between the left and right eyes of any subject, and among subjects.

From the preceding discussion it is clear that Listing’s and Donders’ laws have several limitations for practical use. For example, while certain applications in ophthalmology and neurology require the estimation of gaze in monocular conditions (the eye that is not being tested is covered or patched) with a stationary head, most gaze estimation applications need to be carried out in binocular conditions under natural translational and rotational head movements. Moreover, the torsion of the eye resulting from lateral head tilt (observed often in applications with infants) can be significantly larger than the torsion predicted by Listing’s law, and even have opposite direction. All this suggests that, to properly account for eye torsion, the torsion of the eye should be measured independently, for example, by tracking the iris pattern and/or the scleral blood vessels.

References

ALPERN, M. 1969a. Specification of the Direction of Regard. In The Eye, 2nd ed., vol. 3, H. Davson, Ed. Academic Press Inc. (London) Ltd.

ALPERN, M. 1969b. Kinematics of the Eye. In The Eye, 2nd ed., vol. 3, H. Davson, Ed. Academic Press Inc. (London) Ltd.

BELONGIE, S. Rodrigues’ Rotation Formula. MathWorld – A Wolfram Web Resource, created by Eric W. Weisstein. http://mathworld.wolfram.com/RodriguesRotationFormula.html

CARPENTER, R. H. S. 1977. Movements of the Eyes. Pion.

FERMAN, L., COLLEWIJN, H., AND VAN DEN BERG, A. V. 1987a. A Direct Test of Listing’s Law – I. Human Ocular Torsion Measured in Static Tertiary Positions. Vision Research 27, 6, 929–938.

FERMAN, L., COLLEWIJN, H., AND VAN DEN BERG, A. V. 1987b. A Direct Test of Listing’s Law – II. Human Ocular Torsion Measured under Dynamic Conditions. Vision Research 27, 6, 939–951.

GUESTRIN, E. D., AND EIZENMAN, M. 2006. General Theory of Remote Gaze Estimation Using the Pupil Center and Corneal Reflections. IEEE Transactions on Biomedical Engineering 53, 6, 1124–1133.

GUESTRIN, E. D., AND EIZENMAN, M. 2008. Remote Point-of-Gaze Estimation Requiring a Single-Point Calibration for Applications with Infants. In Proceedings of ETRA 2008, ACM Press, 267–274.

GUESTRIN, E. D. 2010. Remote, Non-Contact Gaze Estimation with Minimal Subject Cooperation. PhD thesis, University of Toronto.

NAGAMATSU, T., KAMAHARA, J., IKO, T., AND TANAKA, N. 2008. One-Point Calibration Gaze Tracking Based on Eyeball Kinematics Using Stereo Cameras. In Proceedings of ETRA 2008, ACM Press, 95–98.

SHIH, S.-W., AND LIU, J. 2004. A Novel Approach to 3-D Gaze Tracking Using Stereo Cameras. IEEE Transactions on Systems, Man, and Cybernetics B 34, 1, 234–245.

VILLANUEVA, A., AND CABEZA, R. 2007. Models for Gaze Tracking Systems. EURASIP Journal on Image and Video Processing 2007, article ID 23570.

[Figure 1: Mean point-of-gaze estimates obtained in each trial, shown with torsion ignored and with torsion predicted by Listing’s and Donders’ laws, for Subject 1 (6 trials in total), Subject 2 (6 trials in total) and Subject 3 (21 trials in total). + : target fixation point; separate markers for the left and right eyes. All coordinates are in mm.]