1. JIEMS, Akkalkuwa Department of Computer Engineering
Eye Gaze Communication Page 1
APPROVED BY AICTE, NEW DELHI & AFFILIATED TO NORTH MAHARASHTRA
UNIVERSITY, JALGAON CERTIFIED BY ISO 9001:2008
Jamia Institute of Engineering & Management
Studies, Akkalkuwa
Department of Computer Engineering
Seminar report on
“Eye Gaze” by
Mr. Saba Karim
[BE COMPUTER]
Jamia Institute of Engineering & Management
Studies, Akkalkuwa
CERTIFICATE
This is to certify that the seminar report entitled “Eye Gaze”, being submitted by Mr. Saba Karim to the Computer Engineering Department, is a record of bonafide work carried out by him under my supervision and guidance during the year 2018-2019.
Seminar Guide Head of Department
[Mohammad Asif] [Prof. Patel Suhel Ishaq]
I/c Principal
[Prof. Saiyed Irfan]
CHAPTER 1
INTRODUCTION
1.1 Introduction
Imagine yourself as an intelligent, motivated, working person in the fiercely competitive market of
information technology, with just one problem: you can't use your hands, or you can't speak. How do
you do your job? How do you stay employed? You can, thanks to a remarkable gift from the computer
industry: the Eyegaze, a communication and control system you run with your eyes. In humans, gaze
direction and ocular behaviour are probably among the first distant means of communication developed.
Parents often try to understand what their baby looks at, deducing that the observed object attracts his
or her interest. This ability to interact with someone through a transitional object is called
joint attention.
The Eyegaze System is a direct-select, vision-controlled communication and control system. It was
developed in Fairfax, Virginia, by LC Technologies.
Fig:1.1 Eye gaze communication
This system is mainly developed for those who lack the use of their hands or voice. The only
requirements to operate the Eyegaze are control of at least one eye with good vision and the ability to
keep the head fairly still. Eyegaze Systems are in use around the world. Its
users are adults and children with cerebral palsy, spinal cord injuries, brain injuries, ALS, multiple
sclerosis, brainstem strokes, muscular dystrophy, and Werdnig Hoffman syndrome. Eye gaze Systems
are being used in homes, offices, schools, hospitals, and long term care facilities. By looking at control
keys displayed on a screen, a person can synthesize speech, control his environment (lights, appliances,
etc.), type, operate a telephone, run computer software, operate a computer mouse, and access the Internet
and e-mail. Eye gaze Systems are being used to write books, attend school and enhance the quality of life
of people with disabilities all over the world.
1.2 The skills needed by the user:
1.2.1 Good control of one eye:
The user must be able to look up, down, left and right. He must be able to fix his gaze on all areas
of a 15-inch screen that is about 24 inches in front of his face. He must be able to focus on one spot for at
least 1/2 second. Several common eye movement problems may interfere with Eye gaze use. These
include:
Nystagmus (constant, involuntary movement of the eyeball): The user may not be able to fix his gaze long
enough to make eye gaze selections.
Alternating strabismus (eyes cannot be directed to the same object, either one deviates): The Eye gaze
System is constantly tracking the same single eye. If, for example, a user with alternating strabismus is
operating the Eyegaze System with the right eye, and that eye begins to deviate, the left eye will take over
and focus on the screen. The Eye gaze camera, however, will continue to take pictures of the right eye,
and the System will not be able to determine where the user's left eye is focused. When the left eye
deviates and the right eye is again fixed on the screen the Eyegaze System will resume predicting the gaze
point. Putting a partial eye patch over the nasal side of the eye not being observed by the camera often
solves this tracking problem. Since only the unpatched eye can see the screen, it will continuously focus
on the screen. By applying only a nasal-side patch to the other eye, the user will retain peripheral vision on
that side.
1.2.2 Adequate vision:
Several common vision problems may affect a user's ability to see text clearly on the Eyegaze
monitor. These include the following:
1.2.3 Inadequate visual acuity:
The user must be able to see text on the screen clearly. If, prior to his injury or the onset of his
illness he wore glasses, he may need corrective lenses to operate the Eyegaze System. If he's over 40
years old and has not had his vision checked recently, he might need reading glasses in order to see the
screen clearly. In most cases, eyetracking works well with glasses. The calibration procedure
accommodates for the refractive properties of most lenses. Hard-line bifocals can be a problem if the lens
boundary splits the image of the pupil, making it difficult for the system's image processing software to
determine the pupil center accurately. Graded bifocals, however, typically do not interfere with
eyetracking. Soft contact lenses that cover all or most of the cornea generally work well with the Eyegaze
System. The corneal reflection is obtained from the contact lens surface rather than the cornea itself.
Small, hard contacts can interfere, if the lens moves around considerably on the cornea and causes the
corneal reflection to move across the discontinuity between the contact lens and the cornea.
Diplopia (double vision):
Diplopia may be the result of an injury to the brain, or a side effect of many commonly prescribed
medications, and may make it difficult for the user to fix his gaze on a given point. Partially patching the
eye not being tracked may alleviate double vision during Eyegaze System operation.
1.2.4 Blurred vision:
Another occurrence associated with some brain injuries, as well as a side effect of medications, a
blurred image on the screen decreases the accuracy of eye fixations.
1.2.5 Cataracts (clouding of the lens of the eye):
If a cataract has formed on the portion of the lens that covers the pupil, it may prevent light from
passing through the pupil to reflect off the retina. Without a good retinal reflection the Eyegaze System
cannot accurately predict the user's eye fixations. The clouded lens may also make it difficult for a user to
see text on the screen clearly. Surgical removal of the cataracts will normally solve the problem and make
Eyegaze use possible.
1.2.6 Homonymous hemianopsia
(blindness or defective vision in the right or left halves of the visual fields of both eyes): This
may make calibration almost impossible if the user cannot see calibration points on one side of the screen.
1.3 Ability to maintain a position in front of the Eyegaze monitor:
It is generally easiest to run the System from an upright, seated position, with the head centered in
front of the Eyegaze monitor. However, the Eyegaze System can be operated from a semi-reclined position
if necessary. Continuous, uncontrolled head movement can make Eyegaze operation difficult, since the
Eyegaze System must relocate the eye each time the user moves away from the camera's field of view and
then returns. Even though the System's eye search is completed in just a second or two, it will be more
tiring for a user with constant head movement to operate the System.
Absence of medication side effects that affect Eyegaze operation:
Many commonly prescribed medications have potential side effects that can make it difficult to operate
Eyegaze. Anticonvulsants (seizure drugs) can cause: nystagmus, blurred vision, diplopia, dizziness,
drowsiness, headache and confusion. Some antidepressants can cause blurred vision and mydriasis
(abnormally dilated pupil.) And Baclofen, a drug commonly used to decrease muscle spasms, can cause
dizziness, drowsiness, headache, disorientation, blurred vision and mydriasis. Mydriasis can be severe
enough to block eyetracking. If the retinal reflection is extremely bright, and the corneal reflection is
sitting on top of a big, bright pupil, the corneal reflection may be indistinguishable and therefore
unreadable by the computer.
1.4 Mental abilities that improve the probability for successful Eyegaze use:
Cognition: Cognitive level may be difficult to assess in someone who is locked in, especially if a
rudimentary communication system has not been established. In general, a user with average intelligence
will best maximize the capabilities of an Eyegaze System.
Ability to read: At present, the Eyegaze System is configured for users who are literate. The System is
text-based. A young child with average intelligence may not be reading yet, but probably has the
capability to learn to read at an average age. He may be able to recognize words, and may be moving his
eyes in a left-to-right pattern in preparation for reading. As an interim solution, many teachers and
parents stick pictures directly onto the screen. When the child looks at the picture he activates the
Eyegaze key that is located directly underneath it.
1.4.1 Memory:
Memory deficits are a particular concern in considering the Eyegaze System for someone with a
brain injury. A user who can't remember from one day to the next how to operate the system may find it
too difficult to use effectively.
CHAPTER 2
LITERATURE SURVEY
2.1 HISTORY
This section introduces the latest uses of eye gaze tracking in applications, with a special focus on
interactive applications and/or video games; but in order to understand the present, let's have a look at
the past. After the Second World War, one of the first measures of gaze direction was made in 1947 by
the group of Fitts, Jones and Milton. They published technical
reports in the late 1940s that are considered to be the seminal research on visual sampling and represent
the largest collection of eye movement data collected in a visual monitoring task. The data encompass
over 500,000 frames of movie film of over 40 pilots taken under various flight conditions. The general
conclusion was that: It is reasonable to assume that the frequency of eye fixations is an indication of the
relative importance of that instrument. The length of fixations, on the contrary, may more properly be
considered as an indication of the relative difficulty of checking and interpreting particular instruments.
[...] If we know where a pilot is looking, we do not necessarily know what he is thinking, but we know
something of what he is thinking about.
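The conclusion above implies a concrete measurement: count fixations per instrument (frequency suggests relative importance) and average their durations (length suggests difficulty of interpretation). A minimal sketch of such an analysis over a hypothetical gaze log of (instrument, duration) records:

```python
from collections import defaultdict

def fixation_stats(fixations):
    """Summarize gaze fixations per area of interest.

    fixations: iterable of (instrument_name, duration_in_seconds).
    Returns {instrument: (fixation_count, mean_duration)}.
    """
    totals = defaultdict(lambda: [0, 0.0])   # name -> [count, total time]
    for name, duration in fixations:
        totals[name][0] += 1
        totals[name][1] += duration
    return {name: (n, round(t / n, 3)) for name, (n, t) in totals.items()}

# Toy gaze log: the airspeed indicator is checked often but briefly;
# the directional gyro rarely but for longer (harder to interpret).
log = [("airspeed", 0.3), ("airspeed", 0.4), ("gyro", 0.9),
       ("airspeed", 0.2), ("gyro", 1.1)]
print(fixation_stats(log))
# → {'airspeed': (3, 0.3), 'gyro': (2, 1.0)}
```

The instrument names and timings here are invented for illustration; the 1940s studies did the equivalent analysis by hand over frames of movie film.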
Following this work, authors were able to propose a more efficient arrangement of instruments and
identified those which were difficult to read, for a possible redesign of the actual instrument. This was the
first time a survey allowed interaction between an application (an airplane cockpit) and a manual gaze
tracking system. It was also the first time video was used to perform measures. Before video, measures
were mainly performed using a medical technique that allowed registration of eyeball movements using a
number of electrodes positioned around the eye.
Fig: Left: the gaze tends to come and go between the eyes and mouth in the picture of a face. Right:
depending on where you look, you see either a musician or a woman's face.
Most of the described techniques required the viewer's head to be motionless during eye tracking and
used a variety
of invasive devices. The major innovation in eye tracking was the invention of a head-mounted eye
tracker ([13], [23], [24], [30]), this technique is still widely used. Another reference work in the gaze
tracking world is the one done by Yarbus. Yarbus was a Russian psychologist who studied eye
movements and saccadic exploration of complex images in the 1950s and 1960s. He recorded the eye
movements performed by observers while viewing natural objects and scenes. Here again, this work tends
to show that gaze direction is crucial in interactivity: Yarbus showed that the gaze trajectories followed
depend on the task that the observer has to perform (cf. the classical experiment in Figure 2.2).
Eyes would concentrate on areas of the images of relevance to the questions. Much of the relevant work
in the 1970s focused on technical improvements to increase accuracy and precision and reduce the impact
of the trackers. Jacob proposes a brief history of eye tracking systems during these years. Since this
period, work has been deeply correlated with the performance of computers: the more computing power
progresses, the more it provides the necessary resources for real-time and/or complex applications. For
instance, it is nowadays possible to develop a human-computer interface using
gaze tracking. During the '80s, interest in gaze tracking persisted; the incredible boom of personal
computers allowed the design of new interfaces and new ways of thinking about our relation with the
computer [5], [4] and [31]. Another pioneer was Dr. Levine, who was one of the first to see the potential
of eye tracking in
interactive applications. Over the last 15 years, eye and gaze tracking have become an industrial stake;
many manufacturers have developed products in this field of research. The dramatically increasing
number of publications around this topic prevents an exhaustive state of the art; indeed, many journals,
conferences and publications have been created around this topic (conferences: ECEM - the European
Conference on Eye Movements, SWAET - the Scandinavian Workshop on Applied Eye-tracking, ETRA -
Eyetracking Research and Applications...). At least seven sponsors are represented in the last ETRA
conference. Nowadays, many systems, invasive or not, make it possible to measure, follow, analyse, and
log in real time numerous data coming from cameras. Applications are varied, from military uses to
advertisement analysis via medical applications. A complete overview of theories and applications is
presented in the literature. With the advances in eye gaze sensing technologies, these systems are now
much more precise and far less intrusive. As a consequence, research using eye gaze as an input stream
has grown steadily. This
reality leads to the existence of several ways of tracking the direction of eye-gaze. [3], [27] and [10]
provide the following list of requirements of an ideal tracking device, which are still not fully satisfied by
current techniques.
1. Offer an unobstructed field of view with good access to the face and head
2. Make no contact with the subject
3. Meet the practical challenge of being capable of artificially stabilising the retinal image if necessary
4. Possess an accuracy of at least one percent or a few minutes of arc. Accuracy is limited by the
cumulative effects of non-linearity, distortion, noise, lag and other sources of error
5. Offer a resolution of 1 minute of arc, and thus be capable of detecting the smallest changes in
eye position; resolution is limited only by instrumental noise
6. Offer a wide dynamic range of 1 minute of arc to 45° for eye position and 1 minute of arc per second
to 800° per second for eye velocity
7. Offer good temporal dynamics and speed of response (e.g. good gain and small phase shift to 100 Hz,
or a good step response)
8. Possess a real-time response (to allow physiological manoeuvres)
9. Measure all three degrees of angular rotation and be insensitive to ocular translation
10. Be easily extended to binocular recording
11. Be compatible with head and body recordings
12. Be easy to use on a variety of subjects
To summarize this brief history: even though eye gaze tracking systems have existed for a long time,
tracking and measuring eye behaviour and gaze direction was until recently a very complex and
expensive task reserved for research or military labs. The eye tracking devices were uncomfortable head-
mounted systems, and thus were mainly used as pointing devices for a very narrow range of applications
(mainly military). However, rapid technological advancements (increased processor speed, advanced
digital video processing) have both lowered the cost and dramatically increased the efficiency of eye and
gaze tracking equipment. In the next two sections, we present a review of eye movement tracking systems
and applications.
2.2 A review of eye and gaze tracking systems
The most widely used current designs are video-based eye trackers. Even though these techniques are
predominant, we have to mention, in order to be complete, the electro-oculography tracking technique. It
is based on the fact that an electrostatic field exists when the eyes rotate. By recording small differences
in the skin potential around the eye, the position of the eye can be estimated. Also, since this is done with
electrodes placed on the skin around the eye, this technique does not require a clear view of the eye. The
technique is rather troublesome, though, and is not well suited for everyday use, since it requires close
contact of electrodes with the user; yet a recent application can be found in [1].
Fig: 2.2 Eye tracking system. Left: the Czech anatomist Jan Evangelista Purkyně (1787-1869) and his
four images. Right: representation of a bright corneal reflection of near-infrared diode light.
Concerning video-based eye trackers, one or two cameras focus on one or both eyes and record and/or
analyse their movements. In this section, we present the very simple and well-known taxonomy in which
we separate systems into two main categories:
• head-mounted systems;
• non-intrusive systems.
Each one splits into two other categories depending on the kind of light they use:
• ambient light;
• infrared or near-infrared light.
Head-mounted systems are commonly composed of cameras (1, 2 or 3) and diodes which provide light.
These systems have followed the same path as computers: smaller and faster (Figure 2.4).
Nowadays, it is quite easy to follow and analyse the four Purkinje images. Purkinje images are reflections
of objects from the structures of the eye. There are at least four Purkinje images that are visible when looking
at an eye. The first Purkinje image (P1) is the reflection from the outer surface of the cornea. The second
one (P2) is the reflection from the inner surface of the cornea. The third one (P3) is the reflection from the
anterior surface of the lens, and the last one (P4) is the reflection from the posterior surface of the lens
[10]. Using light-emitting diodes on a head-mounted system, it is possible to record several images which
represent the reflections of the emitted light in the eyes.
Fig:2.2.1 Eye tracking system
The first and fourth Purkinje images are the ones used in the Dual Purkinje Image method. If the
illumination is coaxial with the optical path then it
produces a bright-pupil effect similar to red eye. If the illumination source is offset from the optical path,
then the pupil appears dark. Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust
eye tracking and more reliable tracking in lighting conditions ranging from total darkness to very bright.
However, bright-pupil techniques are not effective for tracking outdoors, as extraneous infrared sources
interfere with monitoring. Regarding technology, some eye tracking systems require the head to be stable
(for example, with a chin rest), and some function remotely and automatically track the head during
motion. Concerning frame-rate acquisition, most use a sampling rate of 30 Hz, up to 50/60 Hz. In
the field of neurobiology or in order to capture the detail of the very rapid eye movements during reading
some of them can run at 240, 350 or even 1000/1250 Hz.
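The bright-pupil effect described above lends itself to very simple image processing: threshold the infrared frame and take the centroid of the bright blob as the pupil centre. A minimal NumPy sketch, with a synthetic frame standing in for a camera image (real trackers add blob filtering and sub-pixel refinement):

```python
import numpy as np

def pupil_center(frame, threshold=200):
    """Estimate the pupil centre in a bright-pupil infrared frame.

    frame: 2-D uint8 grayscale image in which the retro-reflected
    pupil shows up as the brightest large region.
    Returns (row, col) of the bright-blob centroid, or None if no
    pixel exceeds the threshold.
    """
    mask = frame >= threshold            # bright-pupil pixels
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic frame: dark background with a bright "pupil" disc at (40, 60).
frame = np.zeros((120, 160), dtype=np.uint8)
yy, xx = np.ogrid[:120, :160]
frame[(yy - 40) ** 2 + (xx - 60) ** 2 <= 100] = 255

print(pupil_center(frame))   # → (40.0, 60.0)
```

The threshold value and frame size here are arbitrary; the point is that the bright-pupil illumination turns pupil localisation into a cheap thresholding problem, which is exactly why it enables the high frame rates mentioned above.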
The other main category of eye and gaze tracking system is the non-intrusive one. It provides some
advantages compared to head-mounted systems, the most obvious being: it should allow natural head
movements; it should be able to perform with a wide variety of eye shapes, contact lenses or glasses; and
it should run in real time. With the increasing processing speed of computers, it is now possible to
analyse digital videos of face and eye movements in order to provide reliable measures. We can mention
three main approaches to detecting and measuring eye and gaze non-invasively, depending on the kind
of features they use: the glint; a 3D model; a local linear map network. The first and most commonly
used approach is to calculate the angle of the visual axis and the location of the fixation point on the
display surface by tracking the relative position of the pupil and a point of light reflected from the cornea,
i.e. the glint. Infrared-light-enhanced measures take advantage of the bright-pupil effect (a recent study of
such a system may be found in the literature). The second approach consists in the use of serialized
image-processing methods to detect the face, pupils, mouth and nostrils; once these treatments are done,
a 3D model is used to evaluate face orientation, and finally gaze direction is estimated using eye images
(a very recent work can be found in the literature). The last one is more marginal: it consists in the use of
a neural network of the local-linear-map type, which enables a computer to identify the head orientation
of a user by learning from examples. In each case two comments can be made: firstly, infrared light may
improve results; secondly, calibration is a real problem: either a model is adapted in real time, or it is
built before the tracking and adapted for a single person. The price range of most commercially available
eye trackers is between $5,000 and $60,000. In the next section, we present the most popular applications
using eye and gaze tracking systems.
2.3 A review of eye and gaze tracking applications
CHAPTER 3
System Development
3.1 Eyegaze Communication
In a culture dominated by visual images, most people use their eyes to obtain vast amounts of
information without the need for direct or close physical contact. A human recognizes the outside world
through the ability of nervous systems which construct internal visual representations of the outside
world. The eye is like a camera in that it has a set of lenses in the front (the cornea and the lens) that focus
images on a light-sensitive film (the retina) in the back [see photography]. The retina contains several
layers of nerve cells that analyze visual information before it ever leaves the eye. Signals from the retina
are transmitted via the optic nerve to a way station in the core of the brain called the geniculate body, then
to the primary visual cortex at the back of the brain. Our image of the world is mapped topographically
onto the visual cortex. It is important to note that the internal perception of visual media reflects not only
its physical properties, but also the changes induced by its transduction, filtering, and transformation by
the nervous system. It is the brain, and not the eye, that is the true organ of visual perception. Given the
brain's integral interpretive role in the construction of any complex visual impression, it is necessary to be
aware of how a human understands his or her physical environment as a perceived environment.
The term "gaze" is broadly used by media theorists to refer both to the ways in which viewers look at
images of people in any visual medium and to the gaze of those depicted in visual texts. The "gaze" is a
double-sided term. There must be someone to gaze and there may be someone to gaze back. To give the
gaze is to perceive that one is looking at an object. To set oneself at gaze is to expose oneself to view or
display oneself. Words for the agent of gazing are beholder, viewer, and occasionally spectator or
audience. Like a person, gaze also can be exchanged in a medium. Several key forms of gaze can be
identified in photographic, filmic or televisual texts, or in figurative graphic art, based on who is doing the
looking: the spectator's gaze, the intra-diegetic gaze, the direct or extra-diegetic address to the viewer, and
the look of the camera. The antiquity of the discourse on gaze can be seen in such myths as that of the evil
eye and the gorgon Medusa, whose gaze could turn its object to stone. Folkloric representations of eyes
sought to protect their wearers from the power of the evil gaze. In the nineteenth century, the discourse on
the visually perceptual object was centered on an opposition between the optical and tactile senses. The
tactile sense placed us in contact with reality while the optical sense was regarded as the sense of the
intellect, the spirit, and the imagination. Impressionists and symbolists were attracted by the fact that
optical perception seemed to unite the subjectivity of artistic vision with the objectivity of the external
world. It survived in the work of critics of the mid-twentieth century who used formal criteria to interpret
such artistic movements as abstract expressionism. This discourse was continued, but replaced to a large
extent by the term "gaze." In the early twentieth century, German expressionism exploited the sense of power
in images that stared out at the viewer menacingly. The charisma of the gaze came to its peak in Hitler,
who prided himself on his hypnotic gaze. Jean-Paul Sartre's almost paranoid treatment of " le regard " (the
look) in his treatise on existential philosophy, Being and Nothingness , portrayed the state of being
watched as a threat to the self.
A late-twentieth century interest in the eye and the gaze has been largely investigated so far in terms of
psychoanalysis. According to Jacques Lacan, human recognition of the visual object is overlaid with
misrecognition. In "Of the Gaze as Objet Petit a", Lacan indicates some sort of outside observer; the
objet petit a is the lure for the subject's desire. The embodiment of the objet petit a is what we may call
the gaze.
According to Lacan, the subject's attempt to view the other must pass through the intermediary. The plane
mirror provides a virtual image that covers up the fundamental lack in the real image. Thus, the gaze
corresponds to desire, the desire for self-completion through the other. "The eye and the gaze--this is for
us the split in which the drive is manifested at the level of the scopic field." In this permutation the gaze is
the unattainable object of desire that seemed to make the other complete. However, for Lacan, it is
important to understand that the eye and gaze, although split, are part of the same person. Marshall
McLuhan, in his Understanding Media: The Extensions of Man, refers to the tragedy of Narcissus caused by
the misrecognition of his own image: "The Greek myth of Narcissus is directly concerned with a fact of
human experience, as the word Narcissus indicates. It is from the Greek word narcosis, or numbness. The
youth Narcissus mistook his own reflection in the water for another person. This extension of himself by
mirror numbed his perceptions until he became the servomechanism of his own extended or repeated
image. The nymph Echo tried to win his love with fragments of his own speech, but in vain. He was
numb. He had adapted to his extension of himself and had become a closed system." Lev Manovich notes
that Lacan emphasizes that perspective extends beyond the domain of the visible. Manovich points out
that Lacan reminds us that an image is anything defined "by the correspondences from one point to
another in space" and the idea that perspective is not only limited to sight but also functions in other
senses defines the classical discourses on perception: "The whole trick, the hey presto!, of the classic
dialectic around perception, derives from the fact that it deals with geometric vision, that is to say, with
vision in so far as it is situated in a space that is not in its essence the visual."
CHAPTER 4
Performance Analysis
4.1 How does the Eyegaze System work?
As a user sits in front of the Eyegaze monitor, a specialized video camera mounted below the
monitor observes one of the user's eyes. Sophisticated image-processing software in the Eyegaze System's
computer continually analyzes the video image of the eye and determines where the user is looking on the
screen. Nothing is attached to the user's head or body.
Fig:4.1 Eyegaze control monitor
In detail the procedure can be described as follows: The Eyegaze System uses the pupil-
center/corneal-reflection method to determine where the user is looking on the screen. An infrared-
sensitive video camera, mounted beneath the System's monitor, takes 60 pictures per second of the user's
eye. A low-power infrared light-emitting diode (LED), mounted in the center of the camera's lens,
illuminates the eye. The LED reflects a small bit of light off the surface of the eye's cornea. The light also
shines through the pupil and reflects off of the retina, the back surface of the eye, and causes the pupil to
appear white. The bright-pupil effect enhances the camera's image of the pupil and makes it easier for the
image processing functions to locate the center of the pupil. The computer calculates the person's
gazepoint, i.e., the coordinates of where he is looking on the screen, based on the relative positions of the
pupil center and corneal reflection within the video image of the eye. Typically the Eyegaze System
predicts the gazepoint with an average accuracy of a quarter inch or better. Prior to operating the
eyetracking applications, the Eyegaze System must learn several physiological properties of a user's eye
in order to be able to project his gazepoint accurately. The system learns these properties by performing a
calibration procedure. The user calibrates the system by fixing his gaze on a small yellow circle displayed
on the screen, and following it as it moves around the screen. The calibration procedure usually takes
about 15 seconds, and the user does not need to recalibrate if he moves away from the Eyegaze System
and returns later.
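The pupil-center/corneal-reflection calculation and the calibration step can be sketched as follows. This is a simplified per-axis linear model fitted by least squares; the actual Eyegaze System accounts for further physiological properties of the eye, and all function names here are illustrative assumptions:

```python
# Sketch of the pupil-center/corneal-reflection (PCCR) method with a
# per-axis linear calibration.  The linear model and all names are
# illustrative assumptions, not the Eyegaze System's real algorithm.

def gaze_vector(pupil_center, glint):
    """Pupil-center minus corneal-reflection position, in image pixels."""
    return (pupil_center[0] - glint[0], pupil_center[1] - glint[1])

def _fit_axis(vs, ps):
    """Least-squares fit of p = slope * v + intercept for one axis."""
    n = len(vs)
    mean_v, mean_p = sum(vs) / n, sum(ps) / n
    num = sum((v - mean_v) * (p - mean_p) for v, p in zip(vs, ps))
    den = sum((v - mean_v) ** 2 for v in vs)
    slope = num / den
    return slope, mean_p - slope * mean_v

def fit_calibration(vectors, screen_points):
    """Fit the screen mapping from calibration-target observations."""
    x_map = _fit_axis([v[0] for v in vectors], [p[0] for p in screen_points])
    y_map = _fit_axis([v[1] for v in vectors], [p[1] for p in screen_points])
    return x_map, y_map

def predict_gazepoint(calibration, vector):
    """Map a pupil-glint vector to estimated screen coordinates."""
    (a, b), (c, d) = calibration
    return (a * vector[0] + b, c * vector[1] + d)
```

After calibrating with a few known on-screen targets (the moving yellow circle), `predict_gazepoint` maps each new pupil-glint vector to screen coordinates.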
4.2 How to run the Eyegaze System?
A user operates the Eyegaze System by looking at rectangular keys that are displayed on the control
screen. To "press" an Eyegaze key, the user looks at the key for a specified period of time. The gaze
duration required to visually activate a key, typically a fraction of a second, is adjustable. An array of menu
keys and exit keys allows the user to navigate around the Eyegaze programs independently.
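The dwell-time activation described above can be sketched as a small state machine per key. The class name, rectangle format, and default dwell time below are illustrative assumptions, not the system's actual interface:

```python
# Sketch of dwell-time key activation.  The rectangle is (x, y, width,
# height) in screen coordinates; timestamps are in seconds.

class DwellButton:
    def __init__(self, rect, dwell_time=0.5):
        self.rect = rect              # (x, y, width, height) on screen
        self.dwell_time = dwell_time  # seconds of gaze needed to "press"
        self.enter_time = None        # when the gaze entered the key

    def contains(self, x, y):
        rx, ry, rw, rh = self.rect
        return rx <= x < rx + rw and ry <= y < ry + rh

    def update(self, gaze_x, gaze_y, now):
        """Feed one gaze sample; return True when the key fires."""
        if self.contains(gaze_x, gaze_y):
            if self.enter_time is None:
                self.enter_time = now
            elif now - self.enter_time >= self.dwell_time:
                self.enter_time = None  # re-arm (allows repeated presses)
                return True
        else:
            self.enter_time = None      # gaze left the key: reset the timer
        return False
```

Feeding the button one gaze sample per video frame, it fires once when the gaze has rested on the key for the configured duration, and resets whenever the gaze leaves the key.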
CHAPTER 5
CONCLUSION
5.1 Conclusion
Today, the human eye-gaze can be recorded by relatively unobtrusive techniques. This report
argues that it is possible to use the eye-gaze of a computer user in the interface to aid the control of the
application. Care must be taken, though, that eye-gaze tracking data is used in a sensible way, since the
nature of human eye-movements is a combination of several voluntary and involuntary cognitive
processes.
The main reason eye-gaze based user interfaces are attractive is that the direction of the eye-gaze
can express the interests of the user (a potential window into the current cognitive processes), and
communication through the direction of the eyes is faster than any other mode of human communication.
It is argued that eye-gaze tracking data is best used in multimodal interfaces where the user interacts with
the data instead of the interface, in so-called non-command user interfaces.
5.2 Advantages
1. Eye movement is faster than other current input media
2. No training or particular coordination is required for normal users
3. Can determine where the user’s interest is focused automatically
4. Helpful for usability studies to understand how users interact with their environment
5.3 Disadvantages
1. The equipment is expensive
2. Some users can't work with the equipment (for example if they wear contact lenses or have
long eye lashes)
3. Calibrating the equipment takes time, which may discourage the user from using the device.
5.4 Application
Every year more than 100,000 people are diagnosed with motor neurone diseases. Typically, even
when all other ways of communicating are either severely damaged or completely lost, the eyes still
function. Communication by Gaze Interaction (COGAIN) is a Network of Excellence designed
specifically to help people with these disabilities to communicate more effectively with eye gaze. At the
COGAIN stand you can see how this technology is used by a person who relies on it. Current eye
tracking equipment allows users to generate text on a computer by using eye gaze. Users are able to select
letters and numbers by looking at a keyboard on a screen with their eyes, and can construct words and
sentences that can be spoken aloud by the system. Using these systems both empowers and enables
people with disabilities as they can now communicate without the need for an assistant or helper, giving
the users greater freedom in their lives. A wide variety of disciplines use eye tracking techniques,
including cognitive science, psychology (notably psycholinguistics, the visual world paradigm), human-
computer interaction (HCI), marketing research and medical research (neurological diagnosis). Specific
applications include tracking eye movements in language reading, music reading, human activity
recognition, the perception of advertising, and the playing of sport. Uses include:
Cognitive Studies
Medical Research
Laser refractive surgery
Human Factors
Computer Usability
Translation Process Research
Vehicle Simulators
In-vehicle Research
Training Simulators
5.5 SCOPE FOR FUTURE DEVELOPMENT
A non-intrusive system to localize the eyes and monitor fatigue was developed. Information about
the head and eyes position are obtained through various self-developed image processing algorithms.
During the monitoring, the system is able to decide whether the eyes are opened or closed. When the eyes
have been closed for two seconds, a warning signal is issued. In addition, during monitoring, the system is
able to automatically detect any eye localizing error that might have occurred. In case of this type of error,
the system is able to recover and properly localize the eyes. The proposed system was tested on real
driver images: video images (480 × 640 pixels) of 75 different test persons were recorded under day,
night, and complex-background conditions at different places. The proposed system has two key phases,
preprocessing and eye detection from video images, described in Chapters 5 and 6 respectively. In
preprocessing, a new enhancement technique is used to improve the contrast of dark regions, and it was
tested against an existing algorithm. As per the results obtained in section 5.5, all the noise in the video
image is removed successfully. In the second phase, new techniques are used to extract the eye from the
preprocessed image. These were also tested against a standard existing method, and the comparison
results are shown in Tables 6.1 and 6.2. The eye pair can be selected successfully in most cases, as shown
in Figs 6.1, 6.2 and 6.3. Chapter 6 also reports the false detection rate of drowsiness for the colour-cue
and projection-function methods.
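The two-second eye-closure warning described above can be sketched as follows. The function name, the sampling format, and the threshold default are illustrative assumptions rather than the system's actual code:

```python
def closure_warnings(samples, threshold=2.0):
    """samples: list of (timestamp_seconds, eyes_open) pairs in time order.
    Returns the timestamps at which a drowsiness warning should be issued:
    one warning per closure episode lasting at least `threshold` seconds."""
    warnings = []
    closed_since = None   # start time of the current closure episode
    fired = False         # whether this episode already triggered a warning
    for t, eyes_open in samples:
        if eyes_open:
            closed_since = None
            fired = False
        else:
            if closed_since is None:
                closed_since = t
            if not fired and t - closed_since >= threshold:
                warnings.append(t)
                fired = True
    return warnings
```

Each per-frame open/closed decision would come from the eye-localization stage; this sketch only shows the duration logic that turns those decisions into warnings.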
5.5.1 Achievements
DDDS achieves highly accurate and reliable detection of drowsiness, and offers a non-intrusive
approach that avoids annoyance and interference. The processing judges the driver's alertness level on the
basis of continuous eye closures. The proposed system works in both daytime and night-time conditions,
and all the drawbacks mentioned in section 2.5 have been eliminated. In future, this prototype can be
extended to give an alarm before the driver falls asleep by measuring the heart beat without physical
disturbance, i.e., non-intrusively, using modified ECG methods. Usually in the ECG method, key points of
the body (for example the chest, head, and wrist) are attached with wires; in the extended method, the
wires may be avoided. This will lead to a way of finding the optimum level of drowsiness. Further, this
prototype can be extended to monitor the ray reflected from the eye using a nano camera: if the reflected
ray is absent, the eye is closed; otherwise it is open. We believe that this will create a better opportunity to
detect drowsiness.