SlideShare uma empresa Scribd logo
1 de 4
Baixar para ler offline
Gaze Estimation Method based on an Aspherical Model of the Cornea:
                  Surface of Revolution about the Optical Axis of the Eye
        Takashi Nagamatsu∗                   Yukina Iwamoto†    Junzo Kamahara‡                           Naoki Tanaka§                                 Michiya Yamamoto¶
                                                    Kobe University                                                                                   Kwansei Gakuin University


Abstract                                                                                           cornea was not spherical or light from an LED was reflected from
                                                                                                   the sclera that was not modeled.
A novel gaze estimation method based on a novel aspherical model
of the cornea is proposed in this paper. The model is a surface of                                        Rotation Center           Center of Corneal Curvature
revolution about the optical axis of the eye. The calculation method                                                                                                                          POG
                                                                                                                                                             Cornea
is explained on the basis of the model. A prototype system for
estimating the point of gaze (POG) has been developed using this                                                                                                       l Axis
                                                                                                                                                  R               Visua
method. The proposed method has been found to be more accurate                                                                 E                                    α,β
than the gaze estimation method based on a spherical model of the                                                                           A                     Optical Axis
cornea.
                                                                                                         F                                                        B Center of the Pupil
                                                                                                                         Fovea
CR Categories: H.5.2 [Information Interfaces and Presentation]:                                                                                                   Pupil
User Interfaces—Ergonomics; I.4.9 [Image Processing and Com-                                                                                      K
puter Vision]: Applications
                                                                                                         Figure 1: Eye model with spherical model of cornea.
Keywords:          Gaze tracking, calibration-free, eye movement, eye
model
                                                                                                                    0   128   256     384   512       640   768   896   1024 1152 1280
1 Introduction                                                                                                 0

                                                                                                             102

The use of a physical model of the eye for remote gaze estima-                                               205

tion has gained considerable importance in recent times because                                              307

this technique does not require a large number of calibration points.                                        410
                                                                                                                                                                                          a
Most of the studies use a spherical model of the cornea [Shih and                                            512                                                                          b
                                                                                                                                                                                          c
Liu 2004; Guestrin and Eizenman 2007]. However, Shih et al. and                                              614

Guestrin et al. pointed out that a spherical model may not be suit-                                          717

able for modeling the boundary region of the cornea.                                                         819
                                                                                                             922
In this paper, we propose a novel physical model of the cornea,                                              1024
which is a surface of revolution about the optical axis of the eye,
and an estimation method based on the model.
                                                                                                   Figure 2: Evaluation results of our previously proposed system
                                                                                                   based on spherical model of cornea, in display coordinate sysytem.
2 Gaze estimation based on spherical model
  of cornea
                                                                                                   3 Novel aspherical model of cornea: surface
In our previous studies, we had proposed systems for the estima-                                     of revolution about optical axis of eye
tion of the point of gaze (POG) on the computer display, which
were based on the spherical model of the cornea as shown in Fig-
ure 1. [Nagamatsu et al. 2008a; Nagamatsu et al. 2008b]. Figure                                    3.1 Novel aspherical model
2 shows the evaluation results of the system that used two cameras
and two light sources; the evaluation involved three subjects (a, b,                               We propose an aspherical model of the cornea for remote gaze es-
c) [Nagamatsu et al. 2008b]. The system had low accuracy of esti-                                  timation. This model is a surface of revolution about the optical
mating the POG around the top left and right corners of the display.                               axis of the eye as shown in Figure 3. In our model, the boundary
The results could be because the boundary region of the modeled                                    between the cornea and the sclera can be joined smoothly.
   ∗ e-mail:nagamatu@kobe-u.ac.jp
    † e-mail:0667286w@stu.kobe-u.ac.jp                                                             3.2 Determination of optical axis of eye on the basis of
    ‡ e-mail:kamahara@maritime.kobe-u.ac.jp                                                            novel model
    § e-mail:ntanaka@maritime.kobe-u.ac.jp
   ¶ e-mail:michiya.yamamoto@kwansei.ac.jp                                                         We use a special arrangement of cameras and light sources to de-
Copyright © 2010 by the Association for Computing Machinery, Inc.
                                                                                                   termine the optical axis of the eye on the basis of the novel model.
Permission to make digital or hard copies of part or all of this work for personal or              We use two cameras with a light source attached to each camera.
classroom use is granted without fee provided that copies are not made or distributed              The position of each light source is supposed to be the same as the
for commercial advantage and that copies bear this notice and the full citation on the             nodal point of the camera.
first page. Copyrights for components of this work owned by others than ACM must be
honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on         Figure 4 shows a cross section of the eyeball. A is the center of
servers, or to redistribute to lists, requires prior specific permission and/or a fee.
Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail
                                                                                                   corneal curvature near the optical axis. Lj and Cj denote the po-
permissions@acm.org.                                                                               sition of the light source j and the nodal point of camera j, respec-
ETRA 2010, Austin, TX, March 22 – 24, 2010.
© 2010 ACM 978-1-60558-994-7/10/0003 $10.00

                                                                                             255
is                     planes when we use two cameras (j = 0, 1). The optical axis of
                                                                                  a   l ax
                                                                           O ptic                                 the eye can be determined from the intersection of the two planes.
                                                                                                                  The two planes must not be coplanar (i.e., the optical axis of the eye
                                                                                                                  must not be coplanar with the nodal points of both cameras).
                                                                                                                  Irrespective of the part on the surface of the eye (central region area
                                                                                                                  of the corneal surface, boundary region of the cornea, scleral region,
                                                                                                                  etc.) from where light is reflected, the optical axis of the eye can
                                                                                                                  be determined mathematically. Actually, the scleral surface is not
                                                                                                                  smooth, making it difficult to estimate the glint positions.
                                                                                                                  By using the method, only the optical axis of the eye can be deter-
                                                                                                                  mined. However, it is necessary to determine A to determine the
                                                                                                                  visual axis of the eye, because we have assumed that the visual axis
     Figure 3: Surface of revolution about optical axis of eye.                                                   and the optical axis of the eye intersect at A. The visual axis of the
                                                                                                                  eye is described as X = A+tc in a parametric form, where c is the
                                                                                                                  unit direction vector of the visual axis of the eye, which can be esti-
     Scleral Surface
                                                                                                                  mated by the method described in Section 5.2. The estimation error
                                                                                                                  of A leads to a parallel shift of the visual axis of the eye. Therefore,
                                   Corneal Surface                                                                as the distance between a user and the gazed object increases, the
                                                                                                                  estimation error of the POG on the object in terms of view angle
                                                         d                                                        decreases. The intersection of two lines (X = Cj + tj (Cj − Bj ),
                                                       Optical Axis
                               Pupil
                                                                                                                  j = 0, 1 ) gives an approximation of A; the approximation error is
                  Center of Pupil B             B''j                                                              less than approximately 7.8 mm (the average value of the radius of
                          A                                                                                       the cornea).
                                                              Light Source j                                      However, when the distance between the user and the object is
                       Center of Corneal        Pj                    Lj         Image Plane
                       Curvature
                                                                                  of Camera j                     small, the estimation error of POG caused by estimation error of
                                                                                                                  A is relatively large in terms of view angle. In order to solve this
                                                                                         Purkinje Image
                                                                      Cj              P'j on Image Plane          problem, we propose a method for estimating A and determining
                                                             Nodal Point         B'j    Center of Pupil
                                                                                                                  the visual axis of the eye accurately.
                                                             of Camera j                on Image Plane

                                                                           Camera j                               4 Estimation of user dependent parameters
                                              Scleral Surface                                                       (user calibration)
                                                                                                                  We have to estimate the following user dependent parameters: the
Figure 4: Cross section of eyeball showing center of corneal curva-                                               radius of corneal curvature near the optical axis of the eye, R; the
ture, along with center of pupil, position of light source, and nodal                                             distance between the centers of corneal curvature and the pupil, K;
point of camera.                                                                                                  and the offset between optical and visual axes, α and β. In order
                                                                                                                  to estimate these user dependent parameters, the user is instructed
                                                                                                                  to gaze at a single point (calibration point) the position of which is
tively; Cj is assumed to the same as Lj . The value of Cj (= Lj )                                                 known. The position of the calibration point is selected such that
is determined by calibrating the camera beforehand.                                                               the light from the camera is reflected from the corneal surface that
                                                                                                                  is approximated as a sphere. It is assumed that the pupil can be ob-
A ray originating from the center of the pupil B gets refracted at                                                served through the corneal surface that is approximated as a sphere.
point Bj , passes through the nodal point of the camera j, Cj , and                                               Therefore, the refraction at the corneal surface can be determined
intersects the camera image plane at a point Bj .                                                                 on the basis of the spherical model of the cornea wherein the pupil
A ray from Lj is reflected at a point Pj on the corneal surface                                                    is observed by using the cameras.
back along its incident path. It passes through Cj and intersects
the camera image plane at a point Pj . If the cornea is perfectly                                                 4.1 Estimation of radius of corneal curvature on the
spherical, the line connecting Lj and Pj would pass through A,                                                        basis of spherical model of cornea
and A can be determined by using the two cameras. However, the
position of A cannot be estimated accurately when light is reflected                                               When a user gazes at an object near the camera in the user-
from an aspherical surface of the eye.                                                                            calibration process, light is reflected from the spherical corneal sur-
                                                                                                                  face. Hence, we can use the spherical model of the cornea in this
Because we use the model of a surface of revolution about the op-
                                                                                                                  case.
tical axis of the eye, the ray from Lj is reflected from the surface
of the eye back in the plane including the optical axis of the eye.                                               We estimate the position of the center of corneal curvature, A. Fig-
Therefore, A, B, Bj , Bj , Cj , Lj , Pj , Pj , and the optical axis                                               ure 5 shows a cross section of the cornea including the center of
of the eye are coplanar. The normal vector of a plane that includes                                               corneal curvature A; the position of the light source i, Li ; the posi-
the optical axis is {(Cj − Bj ) × (Pj − Cj )}, and the plane is                                                   tion of the light source j, Lj ; the nodal point of camera i, Ci ; and
expressed as                                                                                                      the nodal point of the camera j, Cj . The positions of Ci (= Li ) and
                                                                                                                  Cj (= Lj ) are known. A ray from Li reflected from the corneal
                                                                                                                  surface returns to Ci and reaches Pii . The extension of the path
             {(Cj − Bj ) × (Pj − Cj )} · (X − Cj ) = 0,                                               (1)         includes A, because the corneal suface is supposed to be a sphere.
                                                                                                                  Similarly, the line connectiong Cj and Pjj includes A. Therefore,
where X (= (x, y, z)T ) is a point on the plane. We obtain two                                                    A can be estimated from the intersection of two lines as follows:


                                                                                                            256
where the incident vector vj = (Cj −Bj )/||Cj −Bj ||, the normal
                                                                                 vector at the point of refraction, nj = (Bj − A)/||Bj − A||, and
                 X = Ci + tii Ci − Pii ,                             (2)         ρ = n1 /n2 (n1 : refractive index of air ≈ 1; n2 : effective refractive
                X = Cj + tjj Cj − Pjj ,                              (3)         index ≈ 1.3375).
                                                                                 The center of the pupil, B, can be determined from the intersection
where tii and tjj are parameters.                                                of two rays from the two cameras, as follows:
A ray from Li is reflected at a point Pji on the corneal surface such
that the reflected ray passes through Cj and intersects the camera                                     X = Bj + sj tj      (j = 0, 1),                 (9)
image plane at a point Pji . Similarly, a ray from Lj is reflected at
a point Pij on the corneal surface such that the reflected ray passes             where sj is a parameter. Therefore, the distance between the centers
through Ci and intersects the camera image plane at a point Pij . In             of corneal curvature and the pupil, K, is determined as K = ||B −
order to estimate the radius of the cornea, we estimate the reflection            A||.
point Pji (= Pij ), that is, the intersection of the lines as follows:


                X = Ci + tij Ci − Pij ,                              (4)                                               Optical Axis
                                                                                              Pupil
                X = Cj + tji Cj − Pji ,                              (5)

                                                                                               B          tj B''j
where tij and tji are parameters. Therefore, the radius of corneal                                                      nj
curvature, R, is determined as R = ||Pji − A||.
                                                                                        A                         vj
          Corneal Surface                    Ci         Image Plane                Center of
                                                              P'ij                 Corneal Curvature
                                             Li                                                                                            Image Plane
              Pji = Pij                                       P'ii
                                                                                                Corneal Surface                       Cj
       A                                                                                                                                      B'j
 Center of
 Corneal Curvature                                   Image Plane
                                                                                               Figure 6: Refraction on corneal surface.

                                       Lj              P'jj
                                                                                 4.3 Estimation of offset between optical and visual
                                       Cj            P'ji                            axes

                                                                                 The offset between optical and visual axes is expressed by two pa-
Figure 5: Cross section of cornea showing containing center of                   rameters; e.g., horizontal and vertical angles. For the case of a
corneal curvature, position of light sources, and nodal points of                user gazing at a known position, the offset between optical and vi-
cameras.                                                                         sual axes is calculated by the method described in Nagamatsu et al.
                                                                                 [2008b].
4.2   Estimation of distance between centers of corneal
      curvature and pupil                                                        5 Estimation of visual axis of eye after user
                                                                                   calibration
As shown in Figure 6, a ray originating from the center of the pupil
B gets refracted at point Bj , passes through the nodal point of the             After the user calibration, the user moves his/her eyes freely. The
camera j, Cj , and intersects the camera image plane at a point Bj .             optical axis of the eye can be calculated by the method described in
Bj can be determined by solving the equations given below:                       Section 3. R, K, and the offset between optical and visual axes of
                                                                                 the eye are known from the user calibration.
                                                                                 The position of the center of corneal curvature, A, and the unit
                  X    =     Cj + tj Cj − Bj ,                       (6)         direction vector along the visual axis of the eye, c, are required for
                  R    =     ||X − A||.                              (7)         the calculation of the visual axis of the eye.

These equations may have two solutions; we select the one closer                 5.1 Estimation of center of corneal curvature
to Cj .
                                                                                 We suppose that the corneal surface where the pupil is observed can
The equation of the vector tj (the refracted vector at Bj shown in               be approximated as a spherical surface. The algorithm for searching
Figure 6) can be obtained by using Snell’s law as follows:                       the position of A is as follows:
                                                                                 1) Set the position of A on the optical axis; select the position that is
                                                                                 nearest to the intersection of the two lines, X = Cj + tj (Cj − Bj )
 tj = −ρnj · vj −           1 − ρ2 (1 − (nj · vj )2 ) nj + ρvj , (8)             (j = 0, 1).


                                                                           257
2) Calculate Bj and tj by using Equations 6, 7, and 8, where R is              connecting the corneal center and the virtual pupil on the basis of
known from the user calibration.                                               the spherical model of the cornea.
3) Calculate B, the position of the center of the pupil, from the              Figure 8 shows the evaluation results. The crosses and triangles
intersection of the two lines described by Equation 9.                         indicate the POG obtained by our method and Chen’s method, re-
                                                                               spectively. Our method appears to be more accurate than Chen’s
4) Calculate the distance between B and A, and compare it to K                 method in determining POG at the top left and right corners of the
that was estimated during the user calibration.                                display. The estimated R and K were 8.04 mm and 4.43 mm, re-
                                                                               spectively.
5) Shift the position of A toward the rotation center of the eye along
the optical axis of the eye and repeat steps 1–4 to determine the                         0   128   256      384   512   640   768   896   1024   1152   1280
accurate position of A. It is sufficient to search the position of A                   0
for a length of 10 mm, because the average radius of the cornea is                  102
approximately 7.8 mm. The search is finished when ||B − A|| =
K.                                                                                  205

                                                                                    307

                                                                                    410
5.2   Estimation of visual axis of eye and POG
                                                                                    512

The unit direction vector of the visual axis of the eye, c, is deter-               614

mined from the unit direction vector of the optical axis of the eye,                717
d, and the offset between optical and visual axes of the eye by using               819
the method described in Nagamatsu et al. [2008b].
                                                                                    922

The intersection point between the visual axis of the eye (X =                     1024

A + tc) and the object gives the POG.                                                                     Our method             Chen's method



6     Implementation                                                           Figure 8: Comparison of our method and Chen’s method in display
                                                                               coordinate system.
A prototype system for the estimation of the POG on a display has
been implemented, as shown in Figure 7. This system consists of
two synchronized monochrome IEEE-1394 digital cameras (Firefly                  7 Conclusion
MV, Point Grey Research Inc.), a 17 LCD, and a Windows-based
PC (Windows XP). The software was developed using OpenCV 1.0                   We proposed a novel physical model of the eye for remote gaze
[Intel]. Each camera is equipped with a 1/3 CMOS image sensor                  tracking. This model is a surface of revolution about the optical
whose resolution is 752 × 480 pixels. A 35-mm lens and an IR filter             axis of the eye. We determined the mathematical expression for
are attached to each camera. Two infrared LEDs are attached to                 estimating the POG on the basis of the model. We evaluated the
each camera such that the midpoint of the two LEDs coincides with              prototype system developed on the basis of our method and found
the nodal point of the camera. These cameras are positioned under              that the system could be used to estimate the POG on the entire
the display. The intrinsic parameters of the cameras are determined            computer display.
before setting up the system.
                                                                               References
                                                                               C HEN , J., T ONG , Y., G RAY, W., AND J I , Q. 2008. A robust 3D
                                                                                  eye gaze tracking system using noise reduction. In Proceedings
                                                                                  of the 2008 Symposium on Eye Tracking Research & Applica-
                                                                                  tions, 189–196.
                                                                               G UESTRIN , E. D., AND E IZENMAN , M. 2007. Remote point-
                                                                                  of-gaze estimation with free head movements requiring a single-
                                                                                  point calibration. In Proceedings of the 29th Annual Interna-
                                                                                  tional Conference of the IEEE EMBS, 4556–4560.
                                                                               I NTEL.           Open source computer vision                             library.
                                                                                  http://sourceforge.net/projects/opencvlibrary/.
                                                                               NAGAMATSU , T., K AMAHARA , J., I KO , T., AND TANAKA , N.
                    Figure 7: Prototype system.                                  2008. One-point calibration gaze tracking based on eyeball kine-
                                                                                 matics using stereo cameras. In Proceedings of the 2008 Sympo-
                                                                                 sium on Eye Tracking Research & Applications, 95–98.
The evaluation of the prototype system in a laboratory involved an
adult subject who does not wear glasses or contact lenses. The                 NAGAMATSU , T., K AMAHARA , J., AND TANAKA , N. 2008.
subject’s eye (right) was approximately 500 mm from the display.                 3D gaze tracking with easy calibration using stereo cameras for
She was asked to stare at 25 points on the display. More than 10                 robot and human communication. In Proceedings of IEEE RO-
data points were recorded for each point.                                        MAN 2008, 59–64.
In order to confirm the effectiveness of our method, we compared                S HIH , S.-W., AND L IU , J. 2004. A novel approach to 3-D gaze
our method to the method described in Sections 3.2 and 3.3 in Chen                tracking using stereo cameras. IEEE Transactions on Systems,
et al. [2008], in which the optical axis was determined as a line                 Man, and Cybernetics, Part B 34, 1, 234–245.



                                                                         258

Mais conteúdo relacionado

Mais procurados

Advances in iol calculation
Advances in iol calculationAdvances in iol calculation
Advances in iol calculationMehdi Khanlari
 
Laser Blended Vision for Presbyopia:
Laser Blended Vision for Presbyopia: Laser Blended Vision for Presbyopia:
Laser Blended Vision for Presbyopia: London Vision Clinic
 
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORIRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORcscpconf
 
Goldberg Scanpath Clustering And Aggregation
Goldberg Scanpath Clustering And AggregationGoldberg Scanpath Clustering And Aggregation
Goldberg Scanpath Clustering And AggregationKalle
 
Vaishno medisales: AOE 2021
Vaishno medisales: AOE 2021Vaishno medisales: AOE 2021
Vaishno medisales: AOE 2021Mero Eye
 
Guestrin Listings And Donders Laws And The Estimation Of The Point Of Gaze
Guestrin Listings And Donders Laws And The Estimation Of The Point Of GazeGuestrin Listings And Donders Laws And The Estimation Of The Point Of Gaze
Guestrin Listings And Donders Laws And The Estimation Of The Point Of GazeKalle
 
Goldberg Visual Scanpath Representation
Goldberg Visual Scanpath RepresentationGoldberg Visual Scanpath Representation
Goldberg Visual Scanpath RepresentationKalle
 
Optical System Of Eye
Optical System Of EyeOptical System Of Eye
Optical System Of EyeNaman Jain
 
My Personal Account of Laser Blended Vision
My Personal Account of Laser Blended VisionMy Personal Account of Laser Blended Vision
My Personal Account of Laser Blended VisionLondon Vision Clinic
 
2009 0359ogidigbe Arvo Final
2009 0359ogidigbe Arvo Final2009 0359ogidigbe Arvo Final
2009 0359ogidigbe Arvo Finalogidigbe
 
My Cv In Slides 06 02 09
My Cv In Slides 06 02 09My Cv In Slides 06 02 09
My Cv In Slides 06 02 09magarills
 
laser lenses for retinal diseases
 laser lenses for retinal diseases laser lenses for retinal diseases
laser lenses for retinal diseasesnalini2218
 
Visionbf 1h-fin
Visionbf 1h-finVisionbf 1h-fin
Visionbf 1h-finMUBOSScz
 
Achieving accurate iol power calculations
Achieving accurate iol power calculationsAchieving accurate iol power calculations
Achieving accurate iol power calculationsAbdelmonem Hamed
 

Mais procurados (20)

Advances in iol calculation
Advances in iol calculationAdvances in iol calculation
Advances in iol calculation
 
Laser Blended Vision for Presbyopia:
Laser Blended Vision for Presbyopia: Laser Blended Vision for Presbyopia:
Laser Blended Vision for Presbyopia:
 
Iq3116211626
Iq3116211626Iq3116211626
Iq3116211626
 
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORIRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
 
Goldberg Scanpath Clustering And Aggregation
Goldberg Scanpath Clustering And AggregationGoldberg Scanpath Clustering And Aggregation
Goldberg Scanpath Clustering And Aggregation
 
Vaishno medisales: AOE 2021
Vaishno medisales: AOE 2021Vaishno medisales: AOE 2021
Vaishno medisales: AOE 2021
 
Guestrin Listings And Donders Laws And The Estimation Of The Point Of Gaze
Guestrin Listings And Donders Laws And The Estimation Of The Point Of GazeGuestrin Listings And Donders Laws And The Estimation Of The Point Of Gaze
Guestrin Listings And Donders Laws And The Estimation Of The Point Of Gaze
 
Optical
OpticalOptical
Optical
 
Goldberg Visual Scanpath Representation
Goldberg Visual Scanpath RepresentationGoldberg Visual Scanpath Representation
Goldberg Visual Scanpath Representation
 
Optical System Of Eye
Optical System Of EyeOptical System Of Eye
Optical System Of Eye
 
My Personal Account of Laser Blended Vision
My Personal Account of Laser Blended VisionMy Personal Account of Laser Blended Vision
My Personal Account of Laser Blended Vision
 
2009 0359ogidigbe Arvo Final
2009 0359ogidigbe Arvo Final2009 0359ogidigbe Arvo Final
2009 0359ogidigbe Arvo Final
 
Aniseikonia
Aniseikonia Aniseikonia
Aniseikonia
 
Optics of eye
Optics of eyeOptics of eye
Optics of eye
 
My Cv In Slides 06 02 09
My Cv In Slides 06 02 09My Cv In Slides 06 02 09
My Cv In Slides 06 02 09
 
laser lenses for retinal diseases
 laser lenses for retinal diseases laser lenses for retinal diseases
laser lenses for retinal diseases
 
Visionbf 1h-fin
Visionbf 1h-finVisionbf 1h-fin
Visionbf 1h-fin
 
Hypermetropia
HypermetropiaHypermetropia
Hypermetropia
 
Achieving accurate iol power calculations
Achieving accurate iol power calculationsAchieving accurate iol power calculations
Achieving accurate iol power calculations
 
Reduced eye
Reduced eyeReduced eye
Reduced eye
 

Destaque

ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...
ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...
ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...ZFConf Conference
 
Invisible Public Debt (Presentation)
Invisible Public Debt (Presentation)Invisible Public Debt (Presentation)
Invisible Public Debt (Presentation)Kassymkhan Kapparov
 
MobileConf 2013 - Aerogear Android
MobileConf 2013 - Aerogear AndroidMobileConf 2013 - Aerogear Android
MobileConf 2013 - Aerogear AndroidDaniel Passos
 
How to make your destination pintastic! (Part 1)
How to make your destination pintastic! (Part 1)How to make your destination pintastic! (Part 1)
How to make your destination pintastic! (Part 1)Stephanie Lynch
 
Sentencia champion
Sentencia championSentencia champion
Sentencia championoscargaliza
 
การพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศ
การพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศการพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศ
การพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศPrincess Chulabhorn's College, Chiang Rai Thailand
 
The Dynamite of Next Generation (Y) Attack
The Dynamite of Next Generation (Y) AttackThe Dynamite of Next Generation (Y) Attack
The Dynamite of Next Generation (Y) AttackPrathan Phongthiproek
 

Destaque (20)

SEB
SEBSEB
SEB
 
TEMA 5A Vocabulario
TEMA 5A VocabularioTEMA 5A Vocabulario
TEMA 5A Vocabulario
 
Stc charc carrf
Stc charc carrfStc charc carrf
Stc charc carrf
 
Content statbyschool 2553_m3_1057012007
Content statbyschool 2553_m3_1057012007Content statbyschool 2553_m3_1057012007
Content statbyschool 2553_m3_1057012007
 
เศรษฐศาสตร์เบื้องต้น
เศรษฐศาสตร์เบื้องต้นเศรษฐศาสตร์เบื้องต้น
เศรษฐศาสตร์เบื้องต้น
 
Darth vader lego in steps for facebook
Darth vader lego in steps for facebookDarth vader lego in steps for facebook
Darth vader lego in steps for facebook
 
ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...
ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...
ZFConf 2011: Как может помочь среда разработки при написании приложения на Ze...
 
Invisible Public Debt (Presentation)
Invisible Public Debt (Presentation)Invisible Public Debt (Presentation)
Invisible Public Debt (Presentation)
 
สงครามโลก..[1]
สงครามโลก..[1]สงครามโลก..[1]
สงครามโลก..[1]
 
Triangle Gives Back October Webinar
Triangle Gives Back October WebinarTriangle Gives Back October Webinar
Triangle Gives Back October Webinar
 
MobileConf 2013 - Aerogear Android
MobileConf 2013 - Aerogear AndroidMobileConf 2013 - Aerogear Android
MobileConf 2013 - Aerogear Android
 
ฟ้เดล กัสโตร
ฟ้เดล กัสโตรฟ้เดล กัสโตร
ฟ้เดล กัสโตร
 
Acta ci 121106
Acta ci 121106Acta ci 121106
Acta ci 121106
 
How to make your destination pintastic! (Part 1)
How to make your destination pintastic! (Part 1)How to make your destination pintastic! (Part 1)
How to make your destination pintastic! (Part 1)
 
Cere tom neurologica
Cere tom neurologicaCere tom neurologica
Cere tom neurologica
 
Age
AgeAge
Age
 
Sentencia champion
Sentencia championSentencia champion
Sentencia champion
 
การพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศ
การพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศการพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศ
การพํมนาเศรษฐกิจและเศรษฐกิจระหว่างประเทศ
 
The Dynamite of Next Generation (Y) Attack
The Dynamite of Next Generation (Y) AttackThe Dynamite of Next Generation (Y) Attack
The Dynamite of Next Generation (Y) Attack
 
กำแพงเบอร น(สมบร ณ_)-1
กำแพงเบอร น(สมบร ณ_)-1กำแพงเบอร น(สมบร ณ_)-1
กำแพงเบอร น(สมบร ณ_)-1
 

Semelhante a Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea Surface Of Revolution About The Optical Axis Of The Eye

Coutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye TrackingCoutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye TrackingKalle
 
Mulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head MovementMulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head MovementKalle
 
Model User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation SystemModel User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation SystemKalle
 
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...Waqas Tariq
 
Eye gaze tracking with a web camera
Eye gaze tracking with a web cameraEye gaze tracking with a web camera
Eye gaze tracking with a web camerajpstudcorner
 
194Martin LeungUnerd Poster
194Martin LeungUnerd Poster194Martin LeungUnerd Poster
194Martin LeungUnerd PosterMartin Leung
 
NEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTIONNEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTIONcscpconf
 
Eye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop EnvironmentEye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop Environment1crore projects
 
Research on Iris Region Localization Algorithms
Research on Iris Region Localization AlgorithmsResearch on Iris Region Localization Algorithms
Research on Iris Region Localization AlgorithmsIJERA Editor
 
3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx
3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx
3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptxsajalmenon99
 
The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...
The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...
The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...CSCJournals
 
An optimized rubber sheet model for normalization phase of iris recognition
An optimized rubber sheet model for normalization phase of iris recognitionAn optimized rubber sheet model for normalization phase of iris recognition
An optimized rubber sheet model for normalization phase of iris recognitionCSITiaesprime
 
Ijcse13 05-01-001
Ijcse13 05-01-001Ijcse13 05-01-001
Ijcse13 05-01-001vital vital
 
Ijcse13 05-01-001
Ijcse13 05-01-001Ijcse13 05-01-001
Ijcse13 05-01-001vital vital
 
SciVerse ScienceDirect Domed Fresnel
SciVerse ScienceDirect Domed FresnelSciVerse ScienceDirect Domed Fresnel
SciVerse ScienceDirect Domed FresnelBob Harshaw
 
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORIRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORcsitconf
 

Semelhante a Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea Surface Of Revolution About The Optical Axis Of The Eye (20)

Coutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye TrackingCoutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
Coutinho A Depth Compensation Method For Cross Ratio Based Eye Tracking
 
Mulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head MovementMulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head Movement
 
Model User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation SystemModel User Calibration Free Remote Gaze Estimation System
Model User Calibration Free Remote Gaze Estimation System
 
Ijetcas14 315
Ijetcas14 315Ijetcas14 315
Ijetcas14 315
 
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ...
 
Eye gaze tracking with a web camera
Eye gaze tracking with a web cameraEye gaze tracking with a web camera
Eye gaze tracking with a web camera
 
194Martin LeungUnerd Poster
194Martin LeungUnerd Poster194Martin LeungUnerd Poster
194Martin LeungUnerd Poster
 
NEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTIONNEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTION
 
Eye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop EnvironmentEye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop Environment
 
Biometry.pptx
Biometry.pptxBiometry.pptx
Biometry.pptx
 
Research on Iris Region Localization Algorithms
Research on Iris Region Localization AlgorithmsResearch on Iris Region Localization Algorithms
Research on Iris Region Localization Algorithms
 
3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx
3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx
3c31fd00-8ac4-4e1e-82c6-ee6f24a54210-492.pptx
 
The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...
The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...
The Biometric Algorithm based on Fusion of DWT Frequency Components of Enhanc...
 
An optimized rubber sheet model for normalization phase of iris recognition
An optimized rubber sheet model for normalization phase of iris recognitionAn optimized rubber sheet model for normalization phase of iris recognition
An optimized rubber sheet model for normalization phase of iris recognition
 
Progresivos
ProgresivosProgresivos
Progresivos
 
Ijcse13 05-01-001
Ijcse13 05-01-001Ijcse13 05-01-001
Ijcse13 05-01-001
 
Ijcse13 05-01-001
Ijcse13 05-01-001Ijcse13 05-01-001
Ijcse13 05-01-001
 
SciVerse ScienceDirect Domed Fresnel
SciVerse ScienceDirect Domed FresnelSciVerse ScienceDirect Domed Fresnel
SciVerse ScienceDirect Domed Fresnel
 
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATORIRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
IRIS BIOMETRIC RECOGNITION SYSTEM EMPLOYING CANNY OPERATOR
 
K0966468
K0966468K0966468
K0966468
 

Mais de Kalle

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsKalle
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Kalle
 
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Kalle
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Kalle
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Kalle
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlKalle
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Kalle
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingKalle
 
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Kalle
 
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeKalle
 
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Kalle
 
Skovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneSkovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneKalle
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerKalle
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksKalle
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingKalle
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchKalle
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyKalle
 
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Kalle
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Kalle
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Kalle
 

Mais de Kalle (20)

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
 
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
 
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
 
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
 
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
 
Skovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneSkovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze Alone
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem Solving
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement Study
 
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
 

Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea Surface Of Revolution About The Optical Axis Of The Eye

  • 1. Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye Takashi Nagamatsu∗ Yukina Iwamoto† Junzo Kamahara‡ Naoki Tanaka§ Michiya Yamamoto¶ Kobe University Kwansei Gakuin University Abstract cornea was not spherical or light from an LED was reflected from the sclera that was not modeled. A novel gaze estimation method based on a novel aspherical model of the cornea is proposed in this paper. The model is a surface of Rotation Center Center of Corneal Curvature revolution about the optical axis of the eye. The calculation method POG Cornea is explained on the basis of the model. A prototype system for estimating the point of gaze (POG) has been developed using this l Axis R Visua method. The proposed method has been found to be more accurate E α,β than the gaze estimation method based on a spherical model of the A Optical Axis cornea. F B Center of the Pupil Fovea CR Categories: H.5.2 [Information Interfaces and Presentation]: Pupil User Interfaces—Ergonomics; I.4.9 [Image Processing and Com- K puter Vision]: Applications Figure 1: Eye model with spherical model of cornea. Keywords: Gaze tracking, calibration-free, eye movement, eye model 0 128 256 384 512 640 768 896 1024 1152 1280 1 Introduction 0 102 The use of a physical model of the eye for remote gaze estima- 205 tion has gained considerable importance in recent times because 307 this technique does not require a large number of calibration points. 410 a Most of the studies use a spherical model of the cornea [Shih and 512 b c Liu 2004; Guestrin and Eizenman 2007]. However, Shih et al. and 614 Guestrin et al. pointed out that a spherical model may not be suit- 717 able for modeling the boundary region of the cornea. 819 922 In this paper, we propose a novel physical model of the cornea, 1024 which is a surface of revolution about the optical axis of the eye, and an estimation method based on the model. Figure 2: Evaluation results of our previously proposed system based on spherical model of cornea, in display coordinate sysytem. 2 Gaze estimation based on spherical model of cornea 3 Novel aspherical model of cornea: surface In our previous studies, we had proposed systems for the estima- of revolution about optical axis of eye tion of the point of gaze (POG) on the computer display, which were based on the spherical model of the cornea as shown in Fig- ure 1. [Nagamatsu et al. 2008a; Nagamatsu et al. 2008b]. Figure 3.1 Novel aspherical model 2 shows the evaluation results of the system that used two cameras and two light sources; the evaluation involved three subjects (a, b, We propose an aspherical model of the cornea for remote gaze es- c) [Nagamatsu et al. 2008b]. The system had low accuracy of esti- timation. This model is a surface of revolution about the optical mating the POG around the top left and right corners of the display. axis of the eye as shown in Figure 3. In our model, the boundary The results could be because the boundary region of the modeled between the cornea and the sclera can be joined smoothly. ∗ e-mail:nagamatu@kobe-u.ac.jp † e-mail:0667286w@stu.kobe-u.ac.jp 3.2 Determination of optical axis of eye on the basis of ‡ e-mail:kamahara@maritime.kobe-u.ac.jp novel model § e-mail:ntanaka@maritime.kobe-u.ac.jp ¶ e-mail:michiya.yamamoto@kwansei.ac.jp We use a special arrangement of cameras and light sources to de- Copyright © 2010 by the Association for Computing Machinery, Inc. termine the optical axis of the eye on the basis of the novel model. Permission to make digital or hard copies of part or all of this work for personal or We use two cameras with a light source attached to each camera. classroom use is granted without fee provided that copies are not made or distributed The position of each light source is supposed to be the same as the for commercial advantage and that copies bear this notice and the full citation on the nodal point of the camera. first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on Figure 4 shows a cross section of the eyeball. A is the center of servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail corneal curvature near the optical axis. Lj and Cj denote the po- permissions@acm.org. sition of the light source j and the nodal point of camera j, respec- ETRA 2010, Austin, TX, March 22 – 24, 2010. © 2010 ACM 978-1-60558-994-7/10/0003 $10.00 255
  • 2. is planes when we use two cameras (j = 0, 1). The optical axis of a l ax O ptic the eye can be determined from the intersection of the two planes. The two planes must not be coplanar (i.e., the optical axis of the eye must not be coplanar with the nodal points of both cameras). Irrespective of the part on the surface of the eye (central region area of the corneal surface, boundary region of the cornea, scleral region, etc.) from where light is reflected, the optical axis of the eye can be determined mathematically. Actually, the scleral surface is not smooth, making it difficult to estimate the glint positions. By using the method, only the optical axis of the eye can be deter- mined. However, it is necessary to determine A to determine the visual axis of the eye, because we have assumed that the visual axis Figure 3: Surface of revolution about optical axis of eye. and the optical axis of the eye intersect at A. The visual axis of the eye is described as X = A+tc in a parametric form, where c is the unit direction vector of the visual axis of the eye, which can be esti- Scleral Surface mated by the method described in Section 5.2. The estimation error of A leads to a parallel shift of the visual axis of the eye. Therefore, Corneal Surface as the distance between a user and the gazed object increases, the estimation error of the POG on the object in terms of view angle d decreases. The intersection of two lines (X = Cj + tj (Cj − Bj ), Optical Axis Pupil j = 0, 1 ) gives an approximation of A; the approximation error is Center of Pupil B B''j less than approximately 7.8 mm (the average value of the radius of A the cornea). Light Source j However, when the distance between the user and the object is Center of Corneal Pj Lj Image Plane Curvature of Camera j small, the estimation error of POG caused by estimation error of A is relatively large in terms of view angle. In order to solve this Purkinje Image Cj P'j on Image Plane problem, we propose a method for estimating A and determining Nodal Point B'j Center of Pupil the visual axis of the eye accurately. of Camera j on Image Plane Camera j 4 Estimation of user dependent parameters Scleral Surface (user calibration) We have to estimate the following user dependent parameters: the Figure 4: Cross section of eyeball showing center of corneal curva- radius of corneal curvature near the optical axis of the eye, R; the ture, along with center of pupil, position of light source, and nodal distance between the centers of corneal curvature and the pupil, K; point of camera. and the offset between optical and visual axes, α and β. In order to estimate these user dependent parameters, the user is instructed to gaze at a single point (calibration point) the position of which is tively; Cj is assumed to the same as Lj . The value of Cj (= Lj ) known. The position of the calibration point is selected such that is determined by calibrating the camera beforehand. the light from the camera is reflected from the corneal surface that is approximated as a sphere. It is assumed that the pupil can be ob- A ray originating from the center of the pupil B gets refracted at served through the corneal surface that is approximated as a sphere. point Bj , passes through the nodal point of the camera j, Cj , and Therefore, the refraction at the corneal surface can be determined intersects the camera image plane at a point Bj . on the basis of the spherical model of the cornea wherein the pupil A ray from Lj is reflected at a point Pj on the corneal surface is observed by using the cameras. back along its incident path. It passes through Cj and intersects the camera image plane at a point Pj . If the cornea is perfectly 4.1 Estimation of radius of corneal curvature on the spherical, the line connecting Lj and Pj would pass through A, basis of spherical model of cornea and A can be determined by using the two cameras. However, the position of A cannot be estimated accurately when light is reflected When a user gazes at an object near the camera in the user- from an aspherical surface of the eye. calibration process, light is reflected from the spherical corneal sur- face. Hence, we can use the spherical model of the cornea in this Because we use the model of a surface of revolution about the op- case. tical axis of the eye, the ray from Lj is reflected from the surface of the eye back in the plane including the optical axis of the eye. We estimate the position of the center of corneal curvature, A. Fig- Therefore, A, B, Bj , Bj , Cj , Lj , Pj , Pj , and the optical axis ure 5 shows a cross section of the cornea including the center of of the eye are coplanar. The normal vector of a plane that includes corneal curvature A; the position of the light source i, Li ; the posi- the optical axis is {(Cj − Bj ) × (Pj − Cj )}, and the plane is tion of the light source j, Lj ; the nodal point of camera i, Ci ; and expressed as the nodal point of the camera j, Cj . The positions of Ci (= Li ) and Cj (= Lj ) are known. A ray from Li reflected from the corneal surface returns to Ci and reaches Pii . The extension of the path {(Cj − Bj ) × (Pj − Cj )} · (X − Cj ) = 0, (1) includes A, because the corneal suface is supposed to be a sphere. Similarly, the line connectiong Cj and Pjj includes A. Therefore, where X (= (x, y, z)T ) is a point on the plane. We obtain two A can be estimated from the intersection of two lines as follows: 256
  • 3. where the incident vector vj = (Cj −Bj )/||Cj −Bj ||, the normal vector at the point of refraction, nj = (Bj − A)/||Bj − A||, and X = Ci + tii Ci − Pii , (2) ρ = n1 /n2 (n1 : refractive index of air ≈ 1; n2 : effective refractive X = Cj + tjj Cj − Pjj , (3) index ≈ 1.3375). The center of the pupil, B, can be determined from the intersection where tii and tjj are parameters. of two rays from the two cameras, as follows: A ray from Li is reflected at a point Pji on the corneal surface such that the reflected ray passes through Cj and intersects the camera X = Bj + sj tj (j = 0, 1), (9) image plane at a point Pji . Similarly, a ray from Lj is reflected at a point Pij on the corneal surface such that the reflected ray passes where sj is a parameter. Therefore, the distance between the centers through Ci and intersects the camera image plane at a point Pij . In of corneal curvature and the pupil, K, is determined as K = ||B − order to estimate the radius of the cornea, we estimate the reflection A||. point Pji (= Pij ), that is, the intersection of the lines as follows: X = Ci + tij Ci − Pij , (4) Optical Axis Pupil X = Cj + tji Cj − Pji , (5) B tj B''j where tij and tji are parameters. Therefore, the radius of corneal nj curvature, R, is determined as R = ||Pji − A||. A vj Corneal Surface Ci Image Plane Center of P'ij Corneal Curvature Li Image Plane Pji = Pij P'ii Corneal Surface Cj A B'j Center of Corneal Curvature Image Plane Figure 6: Refraction on corneal surface. Lj P'jj 4.3 Estimation of offset between optical and visual Cj P'ji axes The offset between optical and visual axes is expressed by two pa- Figure 5: Cross section of cornea showing containing center of rameters; e.g., horizontal and vertical angles. For the case of a corneal curvature, position of light sources, and nodal points of user gazing at a known position, the offset between optical and vi- cameras. sual axes is calculated by the method described in Nagamatsu et al. [2008b]. 4.2 Estimation of distance between centers of corneal curvature and pupil 5 Estimation of visual axis of eye after user calibration As shown in Figure 6, a ray originating from the center of the pupil B gets refracted at point Bj , passes through the nodal point of the After the user calibration, the user moves his/her eyes freely. The camera j, Cj , and intersects the camera image plane at a point Bj . optical axis of the eye can be calculated by the method described in Bj can be determined by solving the equations given below: Section 3. R, K, and the offset between optical and visual axes of the eye are known from the user calibration. The position of the center of corneal curvature, A, and the unit X = Cj + tj Cj − Bj , (6) direction vector along the visual axis of the eye, c, are required for R = ||X − A||. (7) the calculation of the visual axis of the eye. These equations may have two solutions; we select the one closer 5.1 Estimation of center of corneal curvature to Cj . We suppose that the corneal surface where the pupil is observed can The equation of the vector tj (the refracted vector at Bj shown in be approximated as a spherical surface. The algorithm for searching Figure 6) can be obtained by using Snell’s law as follows: the position of A is as follows: 1) Set the position of A on the optical axis; select the position that is nearest to the intersection of the two lines, X = Cj + tj (Cj − Bj ) tj = −ρnj · vj − 1 − ρ2 (1 − (nj · vj )2 ) nj + ρvj , (8) (j = 0, 1). 257
  • 4. 2) Calculate Bj and tj by using Equations 6, 7, and 8, where R is connecting the corneal center and the virtual pupil on the basis of known from the user calibration. the spherical model of the cornea. 3) Calculate B, the position of the center of the pupil, from the Figure 8 shows the evaluation results. The crosses and triangles intersection of the two lines described by Equation 9. indicate the POG obtained by our method and Chen’s method, re- spectively. Our method appears to be more accurate than Chen’s 4) Calculate the distance between B and A, and compare it to K method in determining POG at the top left and right corners of the that was estimated during the user calibration. display. The estimated R and K were 8.04 mm and 4.43 mm, re- spectively. 5) Shift the position of A toward the rotation center of the eye along the optical axis of the eye and repeat steps 1–4 to determine the 0 128 256 384 512 640 768 896 1024 1152 1280 accurate position of A. It is sufficient to search the position of A 0 for a length of 10 mm, because the average radius of the cornea is 102 approximately 7.8 mm. The search is finished when ||B − A|| = K. 205 307 410 5.2 Estimation of visual axis of eye and POG 512 The unit direction vector of the visual axis of the eye, c, is deter- 614 mined from the unit direction vector of the optical axis of the eye, 717 d, and the offset between optical and visual axes of the eye by using 819 the method described in Nagamatsu et al. [2008b]. 922 The intersection point between the visual axis of the eye (X = 1024 A + tc) and the object gives the POG. Our method Chen's method 6 Implementation Figure 8: Comparison of our method and Chen’s method in display coordinate system. A prototype system for the estimation of the POG on a display has been implemented, as shown in Figure 7. This system consists of two synchronized monochrome IEEE-1394 digital cameras (Firefly 7 Conclusion MV, Point Grey Research Inc.), a 17 LCD, and a Windows-based PC (Windows XP). The software was developed using OpenCV 1.0 We proposed a novel physical model of the eye for remote gaze [Intel]. Each camera is equipped with a 1/3 CMOS image sensor tracking. This model is a surface of revolution about the optical whose resolution is 752 × 480 pixels. A 35-mm lens and an IR filter axis of the eye. We determined the mathematical expression for are attached to each camera. Two infrared LEDs are attached to estimating the POG on the basis of the model. We evaluated the each camera such that the midpoint of the two LEDs coincides with prototype system developed on the basis of our method and found the nodal point of the camera. These cameras are positioned under that the system could be used to estimate the POG on the entire the display. The intrinsic parameters of the cameras are determined computer display. before setting up the system. References C HEN , J., T ONG , Y., G RAY, W., AND J I , Q. 2008. A robust 3D eye gaze tracking system using noise reduction. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applica- tions, 189–196. G UESTRIN , E. D., AND E IZENMAN , M. 2007. Remote point- of-gaze estimation with free head movements requiring a single- point calibration. In Proceedings of the 29th Annual Interna- tional Conference of the IEEE EMBS, 4556–4560. I NTEL. Open source computer vision library. http://sourceforge.net/projects/opencvlibrary/. NAGAMATSU , T., K AMAHARA , J., I KO , T., AND TANAKA , N. Figure 7: Prototype system. 2008. One-point calibration gaze tracking based on eyeball kine- matics using stereo cameras. In Proceedings of the 2008 Sympo- sium on Eye Tracking Research & Applications, 95–98. The evaluation of the prototype system in a laboratory involved an adult subject who does not wear glasses or contact lenses. The NAGAMATSU , T., K AMAHARA , J., AND TANAKA , N. 2008. subject’s eye (right) was approximately 500 mm from the display. 3D gaze tracking with easy calibration using stereo cameras for She was asked to stare at 25 points on the display. More than 10 robot and human communication. In Proceedings of IEEE RO- data points were recorded for each point. MAN 2008, 59–64. In order to confirm the effectiveness of our method, we compared S HIH , S.-W., AND L IU , J. 2004. A novel approach to 3-D gaze our method to the method described in Sections 3.2 and 3.3 in Chen tracking using stereo cameras. IEEE Transactions on Systems, et al. [2008], in which the optical axis was determined as a line Man, and Cybernetics, Part B 34, 1, 234–245. 258