SlideShare uma empresa Scribd logo
1 de 4
Baixar para ler offline
Development of Eye-Tracking Pen Display Based on
                                   Stereo Bright Pupil Technique
              Michiya Yamamoto*                                           Takashi Nagamatsu†                                     Tomio Watanabe‡
        School of Science and Technology                          Graduate School of Maritime Sciences                    Department of Systems Engineering
           Kwansei Gakuin University                                        Kobe University                                Okayama Prefectural University

Abstract                                                                                           we could analyze how a presenter indicates or emphasizes a
                                                                                                   slide in presentation by using intuitive pen display. In addition,
The intuitive user interfaces of PCs and PDAs, such as pen dis-                                    such an eye-tracking pen display could become a gadget for
play and touch panel, have become widely used in recent times.                                     realizing a new mode of interaction between humans and com-
In this study, we have developed an eye-tracking pen display                                       puters.
based on the stereo bright pupil technique. First, the bright pupil
camera was developed by examining the arrangement of cam-                                          In this study, we have developed an eye-tracking pen display
eras and LEDs for pen display. Next, the gaze estimation                                           based on the stereo bright pupil technique, and have made an
method was proposed for the stereo bright pupil camera, which                                      eye-tracking tabletop as its application.
enables one point calibration. Then, the prototype of the eye-
tracking pen display was developed. The accuracy of the system                                     2       Technical Requirements
was approximately 0.7° on average, which is sufficient for hu-
man interaction support. We also developed an eye-tracking                                         There are several eye-trackers which we can be listed as de facto
tabletop as an application of the proposed stereo bright pupil                                     standards such as Tobii X120. However, they are not suitable for
technique.                                                                                         use with pen display. The biggest problem of such eye-trackers
                                                                                                   is that they have cameras and IR LEDs under their displays
CR Categories: H.5.2 [Information Interfaces and Presenta-                                         (Figure 1). When a right handed person use a pen on the display,
tion]:User Interfaces—Input devices and strategies; I.4.9 [Image                                   the right arm may hide the camera or LED.
Processing and Computer Vision]: Applications

Keywords: embodied interaction, eye-tracking, pen display,
bright pupil technique

1       Introduction
                                                                                                                      IR LEDs
Today, the intuitive user interfaces of PCs and PDAs, such as
pen display and touch panel, have bcome widely used. These
devices are expected to open up a new embodied interaction and
communication as well as interaction between humans and com-
puters.
                                                                                                                        Cameras
By focusing on the importance of embodied interaction, the
authors have developed a CG-embodied communication support                                                   Figure 1: Typical layout of cameras and LEDs.
system [Yamamoto el al. 2005]. Especially, the importance of
timing control in generating embodied motions and actions is                                       The tracking distance and gaze angle may also cause a problem
made clear for supporting natural, familiar, and polite interaction                                when a user draws on a pen display. Because, the tracking dis-
via CG and robot agent [Yamamoto et al. 2008]. However, for                                        tance of existing eye-trackers is approximately 50 cm or more,
making further uses of embodiment, it is required to analyze the                                   and the gaze angle is approximately 30° in many cases. If we put
relationships between body motion and attention.                                                   an eye-tracker at the left bottom of the display and use a pen on
                                                                                                   the display, the tracking distance becomes too close and gaze
If we could integrate pen display and eye-tracker, it becomes                                      angle becomes too wide.
possible to analyze various embodied interactions. For example,
                                                                                                   In addition, easy calibration is required for eye-tracking pen
    *
      michiya.yamamoto@kwansei.ac.jp                                                               display, so that intuitive interface can be realized. Thus, we can
    †
      nagamatu@kobe-u.ac.jp                                                                        summarize the technical requirements as follows:
    ‡
      watanabe@cse.oka-pu.ac.jp
                                                                                                       •   Free arrangement of cameras and LEDs to prevent ob-
                                                                                                           struction by the right hand
Copyright © 2010 by the Association for Computing Machinery, Inc.
Permission to make digital or hard copies of part or all of this work for personal or                  •   Robust gaze estimation with short distance & wide gaze
classroom use is granted without fee provided that copies are not made or distributed                      angle
for commercial advantage and that copies bear this notice and the full citation on the
first page. Copyrights for components of this work owned by others than ACM must be                    •   Easy calibration
honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on
servers, or to redistribute to lists, requires prior specific permission and/or a fee.
Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail
permissions@acm.org.
ETRA 2010, Austin, TX, March 22 – 24, 2010.
© 2010 ACM 978-1-60558-994-7/10/0003 $10.00

                                                                                             165
3     Reviews of Previous Studies to Decide Ar-                            right hand and the eyelid. Therefore, we reviewed the arrange-
      rangement of Cameras and IR LEDs                                     ments proposed in previous studies again. Some researchers
                                                                           have proposed camera-LED integrated systems. For example,
                                                                           Ohno developed a system that involved the use of one camera
As the first step of this study, we analyzed the body motions
                                                                           and two LEDs [Ohno 2006]. Chen et al. developed a system
involved in using pen display of a right handed user. For this, we
                                                                           that involved the use of two cameras and two LEDs mounted
used motion capture system (Vicon Motion Systems, Vicon 512)
                                                                           near the camera centers; in this arrangement, the camera and the
and measured a subject’s body motion; i.e., movement of head,
                                                                           LED were integrated into one component [Chen 2008]. We can
right shoulder, and arm. As shown in Figure 2, the posture of the
                                                                           arrange such a system to the left of the pen display; however,
subject and the angle of pen display were assumed to be limited
                                                                           such a system would be inadequate if the pen display is to be
to 3 cases to avoid hiding the cameras and IR LEDs.
                                                                           used at various angles. The two cameras should be separated for
                                                                           the eye tracking pen display system.
We developed a software for analyzing the arrangement. Figure
3 shows the screen shot of the software which draws the results
of measurement of 10 subjects. It can be seen that there is an             4     Stereo Bright Pupil Technique for Pen Dis-
unavailable volume for arranging cameras and IR LEDs at the                      play
left bottom.
                                                                           4.1     Bright Pupil Camera

                                                                           On the basis of the reviews of previous papers, we decided to
                                Sitting,                                   use the stereo bright pupil technique. We integrated an IR LED
                                Pen display at the angle of 60°            at the center of the camera (POINT GRAY, FFMV-03MTM,
                                                                           752x480 pixels) lens, as shown in Figure 5; this modified cam-
       Markers                                                             era is called the bright pupil camera. A 35-mm lens and an IR
                                                                           filter are attached. We positioned two bright pupil cameras sepa-
                                Standing,                                  rately to the left of the pen display (Figure 4 (c)). When these
                                Pen display at the angle of 60 °           cameras are used, the light from the LED reflects on the retina
                                                                           and a bright pupil can be observed in the camera image.

                                Standing,
                                Pen display at the angle of 15 °

Figure 2: Measurement of body motion while using a pen display.

                                Pen display




                                                                                              Figure 5: Bright pupil camera.

                                                                           4.2     Eye Model
              Unavailable volume
              for arranging cameras and IR LEDs Right arm                  Figure 6 shows the eye model in this study, which is typical in
     Figure 3: Arrangement volume of cameras and LEDs.                     model-based approaches. An eye consists of two balls. It has
                                                                           two axes: one is the optical axis of the eye that is the geometric
 Next, we reviewed previous studies and developed a prototype              center line of the eye, and the other is the visual axis that is the
of the system by considering its technical requirements. The 3D            line of sight connecting the fovea. These axes intersect at the
gaze-tracking approach was selected for accuracy [Shih et al.              center of the corneal curvature. The average of horizontal and
2004; Guestrin et al. 2007; Nagamatsu et al. 2008a]. This ap-              vertical angles between the optical and visual axes are 5.5° and
proach involves the use of two cameras and three or four LEDs.             1.0°, respectively [OSAKA, 1993].
Figure 4 (a) shows the arrangement of the system proposed by
Nagamatsu et al. In this study, we first developed a prototype of                                  Cornea
                                                                                                  Pupil
the system by positioning the cameras and LEDs: two cameras                                                     A Center of Corneal Curvature
are placed to the left of the pen display, and one LED each is                           Visual Axis              Rotation
                                                                                                                   Center E
placed on the top, left, and bottom frames of the pen display                      Optical Axis
(Figure 4 (b)). However, even with such an arrangement, stable
eye-tracking cannot be realized due to the obstructions by the                                     B Pupil                     Fovea
              LED                                                                                    Center

                                                                                                       Figure 6: Eye model.

                Camera                                                     4.3      Image Processing
        (a)                        (b)                      (c)
                                                                           By using two bright pupil cameras, the light from one of the
         Figure 4: Arrangement of cameras and LEDs.                        LED reflects at the retina and the camera image of the pupil




                                                                     166
Edge detection




                           Binarlize


            Figure 7: Example of image processing.
                                                                                     Figure 9: Prototype of the eye-tracking pen display.
becomes bright. In addition, there are two reflections of light
sources from the outer surface of the cornea called the first Pur-
kinje image (Figure 7, left). First, we carried out edge detection
in order to detect the position of the pupil. Next, we fitted an
ellipse to the edge, and calculated the pupil center. To detect the
position of the Purkinje image, we trimmed the neighborhood of
the pupil center, and binarlized the image. We considered the
two bright points as a Purkinje image (Figure 7, right). This                                                            60 °
image processing was performed using Open CV 1.0.

4.4     Estimation of the Optical Axis of the Eye                                                         300 mm

                                                                                                   Figure 10: Experimental setup.
We estimated the optical axis on the basis of the results of image
processing. We initially calculated the relationship between each              can be realized while a user is drawing a line while looking at
pixel on the image plane and the corresponding 3D position by                  the tip of the pen. The white cross is the estimated point. We can
calibrating the camera. We assumed that the light source and the               confirm that the center of the white cross and the tip of the pen
camera center are at the same position. Then, we obtained a                    is almost the same. We developed this system on an HP xw4600
plane that contains A and B by using the expression                            Workstation with MS Windows XP. The frame rate was ap-
 (C − Β') × (C − P')i( X − C) = 0 , where X is a point on the                  proximately 10 fps.
plane (Figure 8). One bright pupil camera was used to determine
one plane that contains the optical axis. Therefore, the optical               We then evaluated the prototype. Figure 10 shows the experi-
axis can be obtained as the intersection of two planes obtained                mental setup. The left part is the eye-tracking pen display and a
using the two cameras. While Chen estimated the optical axis by                subject. The minimum distance between the subject and pen
determining Virtual B − A in Figure 8, we determined the exact                 display was 30 cm. The angle of pen display was 60°. The right
optical axis [Nagamatsu et al. 2010]. After that, the user gazes at            LCD is displaying a captured and processed image.
a point on the pen display for calibration. The difference be-
tween optical axis and visual axis is revised by doing this cali-              In the experiment, we asked the user to gaze at the marker at the
bration [Nagamatsu et al. 2008b]. The cross point of optical axis              left side of the pen display for calibration. We next displayed a
and pen display is estimated as the gaze point.                                white cross on the pen display, and asked him to gaze at the
                                                                               center of the white cross for 10 frames. The cross was displayed
                                 Corneal                                       on each of the 128 pixels. Because of the narrow range of view
                                 Surface    Virtual B                          angle and focus of the cameras, the area where a user can move
                                                                               is limited. 3 students participated in the experiment.
                      Optical Axis           B Pupil Center
                                     Β''                                       5.2     Results
                                               Α
             Light / Camera                Center of Corneal Curvature
                                       P                                       Figure 11 shows the results. The accuracy was average 17.4
                          C                                                    pixels (5.2 mm) on the screen, which means about 0.71°. It was
Purkinje Image                    Purkinje Image
on Image Plage   P'                                                            equivalent to Tobii, etc. In other words, the pen display can
                                                                               recognize 22 horizontal lines.
              B'
                        Image Plane                                                            0    128    256 384 512    640 768   896 1024
                                                                                           0
           Figure 8: Estimation of the Optical Axes.
                                                                                         128

5     Evaluation                                                                         256                                                   Subject1
                                                                                                                                               Subject2
                                                                                         384
5.1     Method                                                                                                                                 Subject3

                                                                                         512
We integrated the bright pupil camera and pen display (Wacom,
                                                                                         640
DTI-520, 15 inch (380 mm), 1024 × 768 pixels) and developed a
prototype of the eye-tracking pen display. Figure 9 shows a pro-                         768
totype of the eye-tracking pen display. Here, the gaze estimation
                                                                                          Figure 11: Result of evaluation experiment.




                                                                         167
a prototype of an eye-tracking tabletop as an application of the
                                                                             proposed stereo bright pupil technique, and confirmed effective-
                                                                             ness of the system.

                                                                             Acknowledgement

                                                                             This work under our project “Embodied Communication Inter-
                                                                             face for Mind Connection” has been supported by “New IT In-
         Figure 12: Purkinje image on edge of cornea.                        frastructure for the Information-explosion Era” of MEXT Grant-
                                                                             in-Aid for Scientific Research on Priority Areas. Also, our pro-
In the case of some subjects, the Purkinje image was reflected               ject "Generation and Control Technology of Human-entrained
on the edge of the cornea, and the gaze point could not be cor-              Embodied Media" has been supported by CREST (Core Re-
rectly estimated, as shown in Figure 12. However, this problem               search for Evolution Science and Technology) of JST (Japan
can be solved by using one or more bright pupil cameras in a                 Science and Technology Agency).
layout-free arrangement.
                                                                             References
6     Application
                                                                             CHEN, J., TONG, Y., GRAY, W., AND JI. Q. 2008. A Robust 3D
The proposed method can be applied to develop various types of               Eye Gaze Tracking System using Noise Reduction. In Proceed-
eye-tracking systems.                                                        ings of the 2008 symposium on Eye tracking research & appli-
                                                                             cations, 189–196.
For example, we have developed a prototype of an eye-tracking
tabletop interface as shown in Figure 13. We integrated two                  GUESTRIN, E. D., AND EIZENMAN, M. 2007. Remote Point-of-
bright pupil cameras and a projector. The image is projected on              Gaze Estimation with Free Head Movements Requiring a Sin-
the tabletop. This interface is used to realize both eye-gaze inter-         gle-Point Calibration. In Proceedings of the 29th Annual Inter-
action and physical interaction. For example, the red square                 national Conference of the IEEE EMBS, 4556–4560.
indicates the gaze point on the tabletop. When a user is looking
at a physical pointer on the tabletop, the red square moves ap-              NAGAMATSU, T., KAMAHARA, J., IKO, T., AND TANAKA, N. 2008.
propriately, followed by physical movement of the pointer. We                One-Point Calibration Gaze Tracking Based on Eyeball Kine-
can enlarge the tabletop interface and extend the interaction area           matics Using Stereo Cameras. In Proceedings of the 2008 sym-
to include off-surface areas.                                                posium on Eye tracking research & applications, 95–98.

                                                                             NAGAMATSU, T., KAMAHARA, J., AND TANAKA, N. 2008. 3D Gaze
                      Britht Pupil Camera       Projector                    Tracking with Easy Calibration Using stereo Cameras for Ro-
                                                                             bot and Human Communication. In Proceedings of IEEE RO-
                                                                             MAN 2008, 59–64.

                                                                             NAGAMATSU, T., IWAMOTO, Y., KAMAHARA, J., TANAKA, N., AND
         User                                                                YAMAMOTO, Y. 2010. Gaze Estimation Method based on an
                                                                             Aspherical Model of the Cornea Surface of Revolution about the
                                              Eye-Gaze
                                                                             Optical Axis of the Eye. In Proceedings of Eye Tracking Re-
                                              Interaction
                                                                             search & Applications Symposium ETRA 2010. (to appear).

                     Tabletop                                                OHNO, T. 2006. One-point calibration gaze tracking method. In
                                               Physical                      Proceedings of the 2006 symposium on Eye tracking research &
                                              Interaction                    applications, 34.
       Figure 13: Prototype of an eye-tracking tabletop.
                                                                             OSAKA, R. 1993. Experimental Psychology of Eye Movements
                                                                             (in Japanese). The University of Nagoya Press, Nagoya, Japan.
In this manner, bright pupil cameras enable flexible arrangement
of cameras, which can lead to the developments of various hu-
                                                                             SHIH, S.-W., AND LIU, J. 2004. A Novel Approach to 3-D
man-computer interfaces such as pen displays and tabletops as
                                                                             Gaze Tracking using Stereo Cameras. IEEE Transactions on
well as interaction analysis of laptops.
                                                                             Systems, Man, and Cybernetics Part B 34, 1, 234–245.

7      Conclusion                                                            YAMAMOTO, M., AND WATANABE, T. 2005. Development of an
                                                                             Embodied Interaction System with InterActor by Speech and
In this study, we have developed an eye-tracking pen display                 Hand Motion Input. In CD-ROM of the 2005 IEEE International
based on the stereo bright pupil technique. First, the bright pupil          Workshop on Robots and Human Interactive Communication,
camera was developed by reviewing and examining the ar-                      323–328.
rangement of cameras and LEDs for pen display. Next, the gaze
estimation method was proposed for the bright pupil camera,                  YAMAMOTO, M., AND WATANABE, T. 2008. Timing Control Ef-
which enables one-point calibration and wide-angle accuracy.                 fects of Utterance to Communicative Actions on Embodied In-
Then, the prototype of the eye-tracking pen display was devel-               teraction with a Robot and CG Character. International Journal
oped. The accuracy of the system was approximately 0.7° on                   of Human-Computer Interaction 24, 1, 103–112.
average, which is sufficient for pen display. We also developed




                                                                       168

Mais conteúdo relacionado

Mais procurados

Saksham presentation
Saksham presentationSaksham presentation
Saksham presentationSakshamTurki
 
GSOC proposal
GSOC proposalGSOC proposal
GSOC proposallavanya
 
Istance Designing Gaze Gestures For Gaming An Investigation Of Performance
Istance Designing Gaze Gestures For Gaming An Investigation Of PerformanceIstance Designing Gaze Gestures For Gaming An Investigation Of Performance
Istance Designing Gaze Gestures For Gaming An Investigation Of PerformanceKalle
 
Reading System for the Blind PPT
Reading System for the Blind PPTReading System for the Blind PPT
Reading System for the Blind PPTBinayak Ghosh
 
Human activity recognition
Human activity recognition Human activity recognition
Human activity recognition srikanthgadam
 
Maltoni 1
Maltoni 1Maltoni 1
Maltoni 1odtylu
 
Camera-based sensory substitution and augmented reality for the blind (ACIVS ...
Camera-based sensory substitution and augmented reality for the blind (ACIVS ...Camera-based sensory substitution and augmented reality for the blind (ACIVS ...
Camera-based sensory substitution and augmented reality for the blind (ACIVS ...Peter Meijer
 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...NidhinRaj Saikripa
 
Dip 4 ece-1 & 2
Dip 4 ece-1 & 2Dip 4 ece-1 & 2
Dip 4 ece-1 & 2hayhadiabbas
 
Seminar report on blue eyes
Seminar report on blue eyesSeminar report on blue eyes
Seminar report on blue eyesRoshmi Sarmah
 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...NidhinRaj Saikripa
 
General introduction to computer vision
General introduction to computer visionGeneral introduction to computer vision
General introduction to computer visionbutest
 

Mais procurados (20)

streoscopy ppt
streoscopy pptstreoscopy ppt
streoscopy ppt
 
Saksham presentation
Saksham presentationSaksham presentation
Saksham presentation
 
GSOC proposal
GSOC proposalGSOC proposal
GSOC proposal
 
Istance Designing Gaze Gestures For Gaming An Investigation Of Performance
Istance Designing Gaze Gestures For Gaming An Investigation Of PerformanceIstance Designing Gaze Gestures For Gaming An Investigation Of Performance
Istance Designing Gaze Gestures For Gaming An Investigation Of Performance
 
58 towards a new gaze tracker
58 towards a new gaze tracker58 towards a new gaze tracker
58 towards a new gaze tracker
 
Monocular Human Pose Estimation with Bayesian Networks
Monocular Human Pose Estimation with Bayesian NetworksMonocular Human Pose Estimation with Bayesian Networks
Monocular Human Pose Estimation with Bayesian Networks
 
Infrared -
Infrared - Infrared -
Infrared -
 
Reading System for the Blind PPT
Reading System for the Blind PPTReading System for the Blind PPT
Reading System for the Blind PPT
 
Human activity recognition
Human activity recognition Human activity recognition
Human activity recognition
 
Pdf4
Pdf4Pdf4
Pdf4
 
Maltoni 1
Maltoni 1Maltoni 1
Maltoni 1
 
Camera-based sensory substitution and augmented reality for the blind (ACIVS ...
Camera-based sensory substitution and augmented reality for the blind (ACIVS ...Camera-based sensory substitution and augmented reality for the blind (ACIVS ...
Camera-based sensory substitution and augmented reality for the blind (ACIVS ...
 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...
 
Dip 4 ece-1 & 2
Dip 4 ece-1 & 2Dip 4 ece-1 & 2
Dip 4 ece-1 & 2
 
A digital camera
A digital cameraA digital camera
A digital camera
 
Seminar report on blue eyes
Seminar report on blue eyesSeminar report on blue eyes
Seminar report on blue eyes
 
Sony
SonySony
Sony
 
Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...Gesture recognition using artificial neural network,a technology for identify...
Gesture recognition using artificial neural network,a technology for identify...
 
Gesture phones final
Gesture phones  finalGesture phones  final
Gesture phones final
 
General introduction to computer vision
General introduction to computer visionGeneral introduction to computer vision
General introduction to computer vision
 

Destaque

Ecobuild Londres
Ecobuild LondresEcobuild Londres
Ecobuild LondresEva Cajigas
 
Невидимый гос долг в Казахстане
Невидимый гос долг в КазахстанеНевидимый гос долг в Казахстане
Невидимый гос долг в КазахстанеKassymkhan Kapparov
 
ZFConf 2010: Proposal Lifecycle
ZFConf 2010: Proposal LifecycleZFConf 2010: Proposal Lifecycle
ZFConf 2010: Proposal LifecycleZFConf Conference
 
Doing Business With Aboriginal People
Doing Business With Aboriginal PeopleDoing Business With Aboriginal People
Doing Business With Aboriginal PeopleLee_Ahenakew
 
הגברת תודעת השימוש באינטרנט באוכלוסיה הנשית
הגברת תודעת השימוש באינטרנט באוכלוסיה הנשיתהגברת תודעת השימוש באינטרנט באוכלוסיה הנשית
הגברת תודעת השימוש באינטרנט באוכלוסיה הנשיתhaimkarel
 
M&A Integration Check Lists and Benchmarks From Our Most Successful
M&A Integration Check Lists and Benchmarks From Our Most SuccessfulM&A Integration Check Lists and Benchmarks From Our Most Successful
M&A Integration Check Lists and Benchmarks From Our Most Successfulperegoff
 
Logistiek 30 Okt 2003
Logistiek 30 Okt 2003Logistiek 30 Okt 2003
Logistiek 30 Okt 2003guest2f17d3
 
שיעור רביעי התעדכנות התמצאות ודואר אלקטרוני
שיעור רביעי   התעדכנות התמצאות ודואר אלקטרונישיעור רביעי   התעדכנות התמצאות ודואר אלקטרוני
שיעור רביעי התעדכנות התמצאות ודואר אלקטרוניhaimkarel
 
JudCon Brazil 2014 - Mobile push for all platforms
JudCon Brazil 2014 - Mobile push for all platformsJudCon Brazil 2014 - Mobile push for all platforms
JudCon Brazil 2014 - Mobile push for all platformsDaniel Passos
 
Plataforma Windows Azure (Cloud Computing)
Plataforma Windows Azure (Cloud Computing)Plataforma Windows Azure (Cloud Computing)
Plataforma Windows Azure (Cloud Computing)Marcelo Paiva
 
Homophones Lesson
Homophones LessonHomophones Lesson
Homophones Lessonjgd7971
 
DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...
DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...
DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...Dealin Doug
 

Destaque (20)

News
News News
News
 
Ecobuild Londres
Ecobuild LondresEcobuild Londres
Ecobuild Londres
 
Statby school 2554_m6_1057012007
Statby school 2554_m6_1057012007Statby school 2554_m6_1057012007
Statby school 2554_m6_1057012007
 
Невидимый гос долг в Казахстане
Невидимый гос долг в КазахстанеНевидимый гос долг в Казахстане
Невидимый гос долг в Казахстане
 
ZFConf 2010: Proposal Lifecycle
ZFConf 2010: Proposal LifecycleZFConf 2010: Proposal Lifecycle
ZFConf 2010: Proposal Lifecycle
 
Doing Business With Aboriginal People
Doing Business With Aboriginal PeopleDoing Business With Aboriginal People
Doing Business With Aboriginal People
 
הגברת תודעת השימוש באינטרנט באוכלוסיה הנשית
הגברת תודעת השימוש באינטרנט באוכלוסיה הנשיתהגברת תודעת השימוש באינטרנט באוכלוסיה הנשית
הגברת תודעת השימוש באינטרנט באוכלוסיה הנשית
 
Pajaros
PajarosPajaros
Pajaros
 
TEMA 5B Vocabulario
TEMA 5B VocabularioTEMA 5B Vocabulario
TEMA 5B Vocabulario
 
M&A Integration Check Lists and Benchmarks From Our Most Successful
M&A Integration Check Lists and Benchmarks From Our Most SuccessfulM&A Integration Check Lists and Benchmarks From Our Most Successful
M&A Integration Check Lists and Benchmarks From Our Most Successful
 
Logistiek 30 Okt 2003
Logistiek 30 Okt 2003Logistiek 30 Okt 2003
Logistiek 30 Okt 2003
 
שיעור רביעי התעדכנות התמצאות ודואר אלקטרוני
שיעור רביעי   התעדכנות התמצאות ודואר אלקטרונישיעור רביעי   התעדכנות התמצאות ודואר אלקטרוני
שיעור רביעי התעדכנות התמצאות ודואר אלקטרוני
 
Adverts
AdvertsAdverts
Adverts
 
JudCon Brazil 2014 - Mobile push for all platforms
JudCon Brazil 2014 - Mobile push for all platformsJudCon Brazil 2014 - Mobile push for all platforms
JudCon Brazil 2014 - Mobile push for all platforms
 
India Horizontal Plant
India Horizontal PlantIndia Horizontal Plant
India Horizontal Plant
 
Full MSSQL Injection PWNage
Full MSSQL Injection PWNageFull MSSQL Injection PWNage
Full MSSQL Injection PWNage
 
Plataforma Windows Azure (Cloud Computing)
Plataforma Windows Azure (Cloud Computing)Plataforma Windows Azure (Cloud Computing)
Plataforma Windows Azure (Cloud Computing)
 
Homophones Lesson
Homophones LessonHomophones Lesson
Homophones Lesson
 
Jvum2013s niftycloud
Jvum2013s niftycloudJvum2013s niftycloud
Jvum2013s niftycloud
 
DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...
DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...
DealinDougCommunity.com - ArapahoeOnline.com; 2009 AAA Aggressive Driving Res...
 

Semelhante a Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil Technique

IRJET- Sixth Sense Technology in Image Processing
IRJET-  	  Sixth Sense Technology in Image ProcessingIRJET-  	  Sixth Sense Technology in Image Processing
IRJET- Sixth Sense Technology in Image ProcessingIRJET Journal
 
Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...
Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...
Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...Kalle
 
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Kalle
 
IRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually ImpairedIRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually ImpairedIRJET Journal
 
Droege Pupil Center Detection In Low Resolution Images
Droege Pupil Center Detection In Low Resolution ImagesDroege Pupil Center Detection In Low Resolution Images
Droege Pupil Center Detection In Low Resolution ImagesKalle
 
Schneider.2011.an open source low-cost eye-tracking system for portable real-...
Schneider.2011.an open source low-cost eye-tracking system for portable real-...Schneider.2011.an open source low-cost eye-tracking system for portable real-...
Schneider.2011.an open source low-cost eye-tracking system for portable real-...mrgazer
 
CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...
CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...
CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...Rong-Hao Liang
 
IRJET- Smart Mirror using Eye Gaze Tracking
IRJET- Smart Mirror using Eye Gaze TrackingIRJET- Smart Mirror using Eye Gaze Tracking
IRJET- Smart Mirror using Eye Gaze TrackingIRJET Journal
 
10.1109@ecs.2015.7124874
10.1109@ecs.2015.712487410.1109@ecs.2015.7124874
10.1109@ecs.2015.7124874Ganesh Raja
 
Implementing Deep Learning Model in Human Computer Interaction with Face Reco...
Implementing Deep Learning Model in Human Computer Interaction with Face Reco...Implementing Deep Learning Model in Human Computer Interaction with Face Reco...
Implementing Deep Learning Model in Human Computer Interaction with Face Reco...IRJET Journal
 
Cursor Movement with Eyeball
Cursor Movement with EyeballCursor Movement with Eyeball
Cursor Movement with EyeballIRJET Journal
 
IRJET - Human Eye Pupil Detection Technique using Center of Gravity Method
IRJET - Human Eye Pupil Detection Technique using Center of Gravity MethodIRJET - Human Eye Pupil Detection Technique using Center of Gravity Method
IRJET - Human Eye Pupil Detection Technique using Center of Gravity MethodIRJET Journal
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET Journal
 
Eye tracking – an innovative monitor
Eye tracking – an innovative monitorEye tracking – an innovative monitor
Eye tracking – an innovative monitorSakthi Sivaraman S
 
Eye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual KeyboardEye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual KeyboardIRJET Journal
 
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALAHUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALASaikiran Panjala
 
IRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind AssistanceIRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind AssistanceIRJET Journal
 
Van der kamp.2011.gaze and voice controlled drawing
Van der kamp.2011.gaze and voice controlled drawingVan der kamp.2011.gaze and voice controlled drawing
Van der kamp.2011.gaze and voice controlled drawingmrgazer
 

Semelhante a Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil Technique (20)

IRJET- Sixth Sense Technology in Image Processing
IRJET-  	  Sixth Sense Technology in Image ProcessingIRJET-  	  Sixth Sense Technology in Image Processing
IRJET- Sixth Sense Technology in Image Processing
 
Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...
Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...
Mc Kenzie An Eye On Input Research Challenges In Using The Eye For Computer I...
 
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Envir...
 
IRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually ImpairedIRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually Impaired
 
Droege Pupil Center Detection In Low Resolution Images
Droege Pupil Center Detection In Low Resolution ImagesDroege Pupil Center Detection In Low Resolution Images
Droege Pupil Center Detection In Low Resolution Images
 
Schneider.2011.an open source low-cost eye-tracking system for portable real-...
Schneider.2011.an open source low-cost eye-tracking system for portable real-...Schneider.2011.an open source low-cost eye-tracking system for portable real-...
Schneider.2011.an open source low-cost eye-tracking system for portable real-...
 
CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...
CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...
CHI'15 - WonderLens: Optical Lenses and Mirrors for Tangible Interactions on ...
 
IRJET- Smart Mirror using Eye Gaze Tracking
IRJET- Smart Mirror using Eye Gaze TrackingIRJET- Smart Mirror using Eye Gaze Tracking
IRJET- Smart Mirror using Eye Gaze Tracking
 
10.1109@ecs.2015.7124874
10.1109@ecs.2015.712487410.1109@ecs.2015.7124874
10.1109@ecs.2015.7124874
 
Implementing Deep Learning Model in Human Computer Interaction with Face Reco...
Implementing Deep Learning Model in Human Computer Interaction with Face Reco...Implementing Deep Learning Model in Human Computer Interaction with Face Reco...
Implementing Deep Learning Model in Human Computer Interaction with Face Reco...
 
Cursor Movement with Eyeball
Cursor Movement with EyeballCursor Movement with Eyeball
Cursor Movement with Eyeball
 
IRJET - Human Eye Pupil Detection Technique using Center of Gravity Method
IRJET - Human Eye Pupil Detection Technique using Center of Gravity MethodIRJET - Human Eye Pupil Detection Technique using Center of Gravity Method
IRJET - Human Eye Pupil Detection Technique using Center of Gravity Method
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language Interpreter
 
Eye tracking – an innovative monitor
Eye tracking – an innovative monitorEye tracking – an innovative monitor
Eye tracking – an innovative monitor
 
Eye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual KeyboardEye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual Keyboard
 
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALAHUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
 
2010TDC_light
2010TDC_light2010TDC_light
2010TDC_light
 
IRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind AssistanceIRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind Assistance
 
Van der kamp.2011.gaze and voice controlled drawing
Van der kamp.2011.gaze and voice controlled drawingVan der kamp.2011.gaze and voice controlled drawing
Van der kamp.2011.gaze and voice controlled drawing
 
EyePhone.ppt
EyePhone.pptEyePhone.ppt
EyePhone.ppt
 

Mais de Kalle

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsKalle
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Kalle
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Kalle
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Kalle
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlKalle
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Kalle
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingKalle
 
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeKalle
 
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Kalle
 
Skovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneSkovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneKalle
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerKalle
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksKalle
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingKalle
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchKalle
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyKalle
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Kalle
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Kalle
 
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Kalle
 
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Kalle
 
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...Kalle
 

Mais de Kalle (20)

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
 
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
 
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
Stellmach Advanced Gaze Visualizations For Three Dimensional Virtual Environm...
 
Skovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze AloneSkovsgaard Small Target Selection With Gaze Alone
Skovsgaard Small Target Selection With Gaze Alone
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem Solving
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement Study
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
 
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
 
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
 
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
 

Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil Technique

  • 1. Development of Eye-Tracking Pen Display Based on Stereo Bright Pupil Technique Michiya Yamamoto* Takashi Nagamatsu† Tomio Watanabe‡ School of Science and Technology Graduate School of Maritime Sciences Department of Systems Engineering Kwansei Gakuin University Kobe University Okayama Prefectural University Abstract we could analyze how a presenter indicates or emphasizes a slide in presentation by using intuitive pen display. In addition, The intuitive user interfaces of PCs and PDAs, such as pen dis- such an eye-tracking pen display could become a gadget for play and touch panel, have become widely used in recent times. realizing a new mode of interaction between humans and com- In this study, we have developed an eye-tracking pen display puters. based on the stereo bright pupil technique. First, the bright pupil camera was developed by examining the arrangement of cam- In this study, we have developed an eye-tracking pen display eras and LEDs for pen display. Next, the gaze estimation based on the stereo bright pupil technique, and have made an method was proposed for the stereo bright pupil camera, which eye-tracking tabletop as its application. enables one point calibration. Then, the prototype of the eye- tracking pen display was developed. The accuracy of the system 2 Technical Requirements was approximately 0.7° on average, which is sufficient for hu- man interaction support. We also developed an eye-tracking There are several eye-trackers which we can be listed as de facto tabletop as an application of the proposed stereo bright pupil standards such as Tobii X120. However, they are not suitable for technique. use with pen display. The biggest problem of such eye-trackers is that they have cameras and IR LEDs under their displays CR Categories: H.5.2 [Information Interfaces and Presenta- (Figure 1). When a right handed person use a pen on the display, tion]:User Interfaces—Input devices and strategies; I.4.9 [Image the right arm may hide the camera or LED. Processing and Computer Vision]: Applications Keywords: embodied interaction, eye-tracking, pen display, bright pupil technique 1 Introduction IR LEDs Today, the intuitive user interfaces of PCs and PDAs, such as pen display and touch panel, have bcome widely used. These devices are expected to open up a new embodied interaction and communication as well as interaction between humans and com- puters. Cameras By focusing on the importance of embodied interaction, the authors have developed a CG-embodied communication support Figure 1: Typical layout of cameras and LEDs. system [Yamamoto el al. 2005]. Especially, the importance of timing control in generating embodied motions and actions is The tracking distance and gaze angle may also cause a problem made clear for supporting natural, familiar, and polite interaction when a user draws on a pen display. Because, the tracking dis- via CG and robot agent [Yamamoto et al. 2008]. However, for tance of existing eye-trackers is approximately 50 cm or more, making further uses of embodiment, it is required to analyze the and the gaze angle is approximately 30° in many cases. If we put relationships between body motion and attention. an eye-tracker at the left bottom of the display and use a pen on the display, the tracking distance becomes too close and gaze If we could integrate pen display and eye-tracker, it becomes angle becomes too wide. possible to analyze various embodied interactions. For example, In addition, easy calibration is required for eye-tracking pen * michiya.yamamoto@kwansei.ac.jp display, so that intuitive interface can be realized. Thus, we can † nagamatu@kobe-u.ac.jp summarize the technical requirements as follows: ‡ watanabe@cse.oka-pu.ac.jp • Free arrangement of cameras and LEDs to prevent ob- struction by the right hand Copyright © 2010 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or • Robust gaze estimation with short distance & wide gaze classroom use is granted without fee provided that copies are not made or distributed angle for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be • Easy calibration honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. ETRA 2010, Austin, TX, March 22 – 24, 2010. © 2010 ACM 978-1-60558-994-7/10/0003 $10.00 165
  • 2. 3 Reviews of Previous Studies to Decide Ar- right hand and the eyelid. Therefore, we reviewed the arrange- rangement of Cameras and IR LEDs ments proposed in previous studies again. Some researchers have proposed camera-LED integrated systems. For example, Ohno developed a system that involved the use of one camera As the first step of this study, we analyzed the body motions and two LEDs [Ohno 2006]. Chen et al. developed a system involved in using pen display of a right handed user. For this, we that involved the use of two cameras and two LEDs mounted used motion capture system (Vicon Motion Systems, Vicon 512) near the camera centers; in this arrangement, the camera and the and measured a subject’s body motion; i.e., movement of head, LED were integrated into one component [Chen 2008]. We can right shoulder, and arm. As shown in Figure 2, the posture of the arrange such a system to the left of the pen display; however, subject and the angle of pen display were assumed to be limited such a system would be inadequate if the pen display is to be to 3 cases to avoid hiding the cameras and IR LEDs. used at various angles. The two cameras should be separated for the eye tracking pen display system. We developed a software for analyzing the arrangement. Figure 3 shows the screen shot of the software which draws the results of measurement of 10 subjects. It can be seen that there is an 4 Stereo Bright Pupil Technique for Pen Dis- unavailable volume for arranging cameras and IR LEDs at the play left bottom. 4.1 Bright Pupil Camera On the basis of the reviews of previous papers, we decided to Sitting, use the stereo bright pupil technique. We integrated an IR LED Pen display at the angle of 60° at the center of the camera (POINT GRAY, FFMV-03MTM, 752x480 pixels) lens, as shown in Figure 5; this modified cam- Markers era is called the bright pupil camera. A 35-mm lens and an IR filter are attached. We positioned two bright pupil cameras sepa- Standing, rately to the left of the pen display (Figure 4 (c)). When these Pen display at the angle of 60 ° cameras are used, the light from the LED reflects on the retina and a bright pupil can be observed in the camera image. Standing, Pen display at the angle of 15 ° Figure 2: Measurement of body motion while using a pen display. Pen display Figure 5: Bright pupil camera. 4.2 Eye Model Unavailable volume for arranging cameras and IR LEDs Right arm Figure 6 shows the eye model in this study, which is typical in Figure 3: Arrangement volume of cameras and LEDs. model-based approaches. An eye consists of two balls. It has two axes: one is the optical axis of the eye that is the geometric Next, we reviewed previous studies and developed a prototype center line of the eye, and the other is the visual axis that is the of the system by considering its technical requirements. The 3D line of sight connecting the fovea. These axes intersect at the gaze-tracking approach was selected for accuracy [Shih et al. center of the corneal curvature. The average of horizontal and 2004; Guestrin et al. 2007; Nagamatsu et al. 2008a]. This ap- vertical angles between the optical and visual axes are 5.5° and proach involves the use of two cameras and three or four LEDs. 1.0°, respectively [OSAKA, 1993]. Figure 4 (a) shows the arrangement of the system proposed by Nagamatsu et al. In this study, we first developed a prototype of Cornea Pupil the system by positioning the cameras and LEDs: two cameras A Center of Corneal Curvature are placed to the left of the pen display, and one LED each is Visual Axis Rotation Center E placed on the top, left, and bottom frames of the pen display Optical Axis (Figure 4 (b)). However, even with such an arrangement, stable eye-tracking cannot be realized due to the obstructions by the B Pupil Fovea LED Center Figure 6: Eye model. Camera 4.3 Image Processing (a) (b) (c) By using two bright pupil cameras, the light from one of the Figure 4: Arrangement of cameras and LEDs. LED reflects at the retina and the camera image of the pupil 166
  • 3. Edge detection Binarlize Figure 7: Example of image processing. Figure 9: Prototype of the eye-tracking pen display. becomes bright. In addition, there are two reflections of light sources from the outer surface of the cornea called the first Pur- kinje image (Figure 7, left). First, we carried out edge detection in order to detect the position of the pupil. Next, we fitted an ellipse to the edge, and calculated the pupil center. To detect the position of the Purkinje image, we trimmed the neighborhood of the pupil center, and binarlized the image. We considered the two bright points as a Purkinje image (Figure 7, right). This 60 ° image processing was performed using Open CV 1.0. 4.4 Estimation of the Optical Axis of the Eye 300 mm Figure 10: Experimental setup. We estimated the optical axis on the basis of the results of image processing. We initially calculated the relationship between each can be realized while a user is drawing a line while looking at pixel on the image plane and the corresponding 3D position by the tip of the pen. The white cross is the estimated point. We can calibrating the camera. We assumed that the light source and the confirm that the center of the white cross and the tip of the pen camera center are at the same position. Then, we obtained a is almost the same. We developed this system on an HP xw4600 plane that contains A and B by using the expression Workstation with MS Windows XP. The frame rate was ap- (C − Β') × (C − P')i( X − C) = 0 , where X is a point on the proximately 10 fps. plane (Figure 8). One bright pupil camera was used to determine one plane that contains the optical axis. Therefore, the optical We then evaluated the prototype. Figure 10 shows the experi- axis can be obtained as the intersection of two planes obtained mental setup. The left part is the eye-tracking pen display and a using the two cameras. While Chen estimated the optical axis by subject. The minimum distance between the subject and pen determining Virtual B − A in Figure 8, we determined the exact display was 30 cm. The angle of pen display was 60°. The right optical axis [Nagamatsu et al. 2010]. After that, the user gazes at LCD is displaying a captured and processed image. a point on the pen display for calibration. The difference be- tween optical axis and visual axis is revised by doing this cali- In the experiment, we asked the user to gaze at the marker at the bration [Nagamatsu et al. 2008b]. The cross point of optical axis left side of the pen display for calibration. We next displayed a and pen display is estimated as the gaze point. white cross on the pen display, and asked him to gaze at the center of the white cross for 10 frames. The cross was displayed Corneal on each of the 128 pixels. Because of the narrow range of view Surface Virtual B angle and focus of the cameras, the area where a user can move is limited. 3 students participated in the experiment. Optical Axis B Pupil Center Β'' 5.2 Results Α Light / Camera Center of Corneal Curvature P Figure 11 shows the results. The accuracy was average 17.4 C pixels (5.2 mm) on the screen, which means about 0.71°. It was Purkinje Image Purkinje Image on Image Plage P' equivalent to Tobii, etc. In other words, the pen display can recognize 22 horizontal lines. B' Image Plane 0 128 256 384 512 640 768 896 1024 0 Figure 8: Estimation of the Optical Axes. 128 5 Evaluation 256 Subject1 Subject2 384 5.1 Method Subject3 512 We integrated the bright pupil camera and pen display (Wacom, 640 DTI-520, 15 inch (380 mm), 1024 × 768 pixels) and developed a prototype of the eye-tracking pen display. Figure 9 shows a pro- 768 totype of the eye-tracking pen display. Here, the gaze estimation Figure 11: Result of evaluation experiment. 167
  • 4. a prototype of an eye-tracking tabletop as an application of the proposed stereo bright pupil technique, and confirmed effective- ness of the system. Acknowledgement This work under our project “Embodied Communication Inter- face for Mind Connection” has been supported by “New IT In- Figure 12: Purkinje image on edge of cornea. frastructure for the Information-explosion Era” of MEXT Grant- in-Aid for Scientific Research on Priority Areas. Also, our pro- In the case of some subjects, the Purkinje image was reflected ject "Generation and Control Technology of Human-entrained on the edge of the cornea, and the gaze point could not be cor- Embodied Media" has been supported by CREST (Core Re- rectly estimated, as shown in Figure 12. However, this problem search for Evolution Science and Technology) of JST (Japan can be solved by using one or more bright pupil cameras in a Science and Technology Agency). layout-free arrangement. References 6 Application CHEN, J., TONG, Y., GRAY, W., AND JI. Q. 2008. A Robust 3D The proposed method can be applied to develop various types of Eye Gaze Tracking System using Noise Reduction. In Proceed- eye-tracking systems. ings of the 2008 symposium on Eye tracking research & appli- cations, 189–196. For example, we have developed a prototype of an eye-tracking tabletop interface as shown in Figure 13. We integrated two GUESTRIN, E. D., AND EIZENMAN, M. 2007. Remote Point-of- bright pupil cameras and a projector. The image is projected on Gaze Estimation with Free Head Movements Requiring a Sin- the tabletop. This interface is used to realize both eye-gaze inter- gle-Point Calibration. In Proceedings of the 29th Annual Inter- action and physical interaction. For example, the red square national Conference of the IEEE EMBS, 4556–4560. indicates the gaze point on the tabletop. When a user is looking at a physical pointer on the tabletop, the red square moves ap- NAGAMATSU, T., KAMAHARA, J., IKO, T., AND TANAKA, N. 2008. propriately, followed by physical movement of the pointer. We One-Point Calibration Gaze Tracking Based on Eyeball Kine- can enlarge the tabletop interface and extend the interaction area matics Using Stereo Cameras. In Proceedings of the 2008 sym- to include off-surface areas. posium on Eye tracking research & applications, 95–98. NAGAMATSU, T., KAMAHARA, J., AND TANAKA, N. 2008. 3D Gaze Britht Pupil Camera Projector Tracking with Easy Calibration Using stereo Cameras for Ro- bot and Human Communication. In Proceedings of IEEE RO- MAN 2008, 59–64. NAGAMATSU, T., IWAMOTO, Y., KAMAHARA, J., TANAKA, N., AND User YAMAMOTO, Y. 2010. Gaze Estimation Method based on an Aspherical Model of the Cornea Surface of Revolution about the Eye-Gaze Optical Axis of the Eye. In Proceedings of Eye Tracking Re- Interaction search & Applications Symposium ETRA 2010. (to appear). Tabletop OHNO, T. 2006. One-point calibration gaze tracking method. In Physical Proceedings of the 2006 symposium on Eye tracking research & Interaction applications, 34. Figure 13: Prototype of an eye-tracking tabletop. OSAKA, R. 1993. Experimental Psychology of Eye Movements (in Japanese). The University of Nagoya Press, Nagoya, Japan. In this manner, bright pupil cameras enable flexible arrangement of cameras, which can lead to the developments of various hu- SHIH, S.-W., AND LIU, J. 2004. A Novel Approach to 3-D man-computer interfaces such as pen displays and tabletops as Gaze Tracking using Stereo Cameras. IEEE Transactions on well as interaction analysis of laptops. Systems, Man, and Cybernetics Part B 34, 1, 234–245. 7 Conclusion YAMAMOTO, M., AND WATANABE, T. 2005. Development of an Embodied Interaction System with InterActor by Speech and In this study, we have developed an eye-tracking pen display Hand Motion Input. In CD-ROM of the 2005 IEEE International based on the stereo bright pupil technique. First, the bright pupil Workshop on Robots and Human Interactive Communication, camera was developed by reviewing and examining the ar- 323–328. rangement of cameras and LEDs for pen display. Next, the gaze estimation method was proposed for the bright pupil camera, YAMAMOTO, M., AND WATANABE, T. 2008. Timing Control Ef- which enables one-point calibration and wide-angle accuracy. fects of Utterance to Communicative Actions on Embodied In- Then, the prototype of the eye-tracking pen display was devel- teraction with a Robot and CG Character. International Journal oped. The accuracy of the system was approximately 0.7° on of Human-Computer Interaction 24, 1, 103–112. average, which is sufficient for pen display. We also developed 168