Action as a Window to Perception:
Measuring Attention with Mouse Movements

   A Validation Study of the MetrixLab FocusTracker


     Prof. dr. A. Johnson and dr. ir. L.J.M. Mulder

Introduction

Attention has been described as the interface between memory and events in the world.
We have to attend to information if we are to encode it, and retrieval of past experiences
depends on attention to appropriate cues in the environment (Logan & Compton, 1998).
In visual processing, such as when shopping for a product on the internet or scanning
supermarket shelves, attention is needed to locate relevant information and to guide
action. Although it is possible to move the focus of attention at least a few degrees of
visual angle away from the focus of the eyes (von Helmholtz, 1894), we almost always
attend where we look (Johnson & Proctor, 2004). Therefore, if we want to know whether
someone has attended to information, we will want to know if they have looked at it. Eye
movement tracking is one means of measuring attention to scenes (Duchowski, 2002). In
practice, however, the inconvenience and cost of collecting and analyzing eye-movement
data limit the effectiveness of the technique for evaluating visual information displays.

Recently, market research institute MetrixLab has developed a computer-based tool for
tracking visual attention. This tool, the FocusTracker™, is based on the assumption that
mouse movements provide a reliable indication of when and where attention is allocated
on a computer screen. Because it is internet based, the tool can be used with hundreds of
participants from any location in the world, allowing maximal freedom in targeting
specific groups. It has the advantage that participants stay in a natural setting and that no
laboratory or specialized equipment is necessary. Participants are first trained to point the
mouse at high speeds, moving over images or texts on the computer screen. After this
short training period, the displays of interest are presented with the instruction to the
participant to “point to whatever catches your eye.” “Scan path” data from the
FocusTracker can then be replayed using MetrixLab’s online FT Replay™ program to
determine how attention is allocated to objects in the scene.

The FocusTracker is based on the assumption that there is a one-to-one relationship
between where we fix our gaze, where we point via a mouse, and what we are attending
to. The question addressed in this research is thus whether the hand can be trained to
follow visual spatial attention and whether attentional processing can be measured by
tracking pointing movements with a handheld computer mouse. A related question is
whether viewers can be adequately instructed to perform the task from their own homes,
without the direct intervention of a researcher. In brief, the results of the research are very
promising, showing high correlations between the scan paths for the eye and the mouse
as well as high correlations between the percentage of time spent in designated regions of
interest for the mouse and the eye. An additional comparison of the data with that of a
group who did not use a mouse while viewing the experimental stimuli showed that
viewing patterns were not disrupted by mouse use.




The Experiment: Tracking the Focus of Attention

In order to determine whether the hand can effectively follow the eye while viewing
visual information, an experiment was conducted in which 21 advertisements were
viewed for 5 seconds each. Participants were instructed to view the advertisements and to
attempt to move the mouse in the same way as the eyes. Eye movements were registered
with an eye tracker and hand movements were registered by logging the position of a
handheld computer mouse. Two conditions were compared: one in which the participants
received a short, verbal instruction and demonstration of how the mouse should be
moved, and one in which participants followed the FocusTracker instruction program.

Method
Participants. Each group included 15 participants. The FocusTracker instruction group
included 5 men and 10 women, 11 of whom were university students. The mean age in
this group was 25 years (SD = 7.15). The verbal instruction group included 3 men and
12 women, 11 of whom were university students. The mean age in this group was 25
years (SD = 9.58). All but one participant had completed either a VWO or HBO program
(Dutch pre-university secondary education and higher professional education, respectively).

Stimuli. The stimuli were largely unfamiliar advertisements1 taken from relatively
expensive magazines. Advertisements were 25.5 cm high and 16-20.5 cm wide and were
presented on a 17-inch computer screen. Each advertisement was made up of four regions
of interest (ROIs): a headline, an illustration, text, and a trademark (see Figure 1). The
position of each ROI varied across the advertisements, and in some cases the ROIs overlapped.

Apparatus. Eye movements were recorded with an Applied Science Laboratories model
504 eye tracker equipped with a pan/tilt camera.2 Eye position was sampled 50 times
per second. A Logitech infrared mouse was used to record hand position. Mouse position
was also sampled 50 times per second. Participants were tested individually in a dimly lit room.

Procedure. Participants were randomly assigned to either the verbal instruction or
FocusTracker training groups. Participants in both groups were told that they should
move the mouse to follow their eye movements. Participants in the verbal instruction
group were also given a demonstration of how the mouse should be moved along with the
eyes. Participants in the FocusTracker training group performed the tasks in the
FocusTracker training: following a moving butterfly with the mouse, moving the mouse
to each of a series of sequentially presented objects of the same type, moving the mouse
to each of a series of sequentially presented objects of different types, and moving the
mouse along with the eyes while viewing each of two advertisements. The training was
completed independently by each participant, without the intervention of the researcher,
and lasted approximately 1.5 – 2 min.

1 The familiarity of the advertisements was tested in a pilot study in which 15 people (aged 17-40) were
asked whether they had ever seen each of the advertisements. Only advertisements that were recognized by
no more than 3 of the 15 participants in the pilot study were used as stimuli in the experiment.
2 Eye position is determined by comparing the pupil and the corneal reflection of infrared light emitted
from the camera.

Figure 1. A sample advertisement showing the four regions of interest (ROIs): the
headline (a), illustration (b), text (c) and trademark (d). ROIs were defined for analysis by
enclosing them in rectangles. In some cases (see ROI d), two rectangles were used to
define the region. In case of overlap, the smaller ROI was subtracted from the larger ROI.

The session began with the calibration of the eye tracker. The training was then given and
was immediately followed by the experimental trials. Trials were separated by a light
gray screen with the mouse cursor centered on it. This screen was shown
for 6 s before the first trial and for 2 s between subsequent trials. Participants were
instructed to look at the mouse cursor until the advertisement appeared and were told that
it would be impossible to move the mouse during these 2 s. The 21 advertisements were
presented for 5 s each in the same order for all participants.
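
For concreteness, the trial schedule described in this paragraph can be written out as a small set of timing constants and a presentation loop. This is only a schematic of the reported procedure, not the software actually used in the study:

```python
# Schematic of the reported trial schedule (illustrative only).
N_ADS           = 21     # advertisements, same fixed order for everyone
AD_DURATION_S   = 5.0    # each advertisement shown for 5 s
FIRST_GAP_S     = 6.0    # gray screen with centered cursor before trial 1
GAP_S           = 2.0    # gray screen between subsequent trials (mouse locked)
SAMPLE_RATE_HZ  = 50     # eye and mouse position both sampled at 50 Hz

def session_schedule():
    """Yield (event, duration_s) pairs for the whole session."""
    for trial in range(N_ADS):
        yield ("gray_screen", FIRST_GAP_S if trial == 0 else GAP_S)
        yield (f"advertisement_{trial + 1}", AD_DURATION_S)

SAMPLES_PER_AD = int(AD_DURATION_S * SAMPLE_RATE_HZ)   # 250 samples per advertisement
```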

After participants had viewed all of the advertisements, a surprise memory test was
given. In Part 1 of the memory test, participants were shown either the headline,
illustration or trademark from one of the advertisements and were asked to recall the
other two attributes of the advertisement (e.g., if the trademark was shown, participants
should report the illustration and the headline; recall of the text was not tested). Each cue
was used seven times. Part 2 of the memory test was a recognition test in which 42
trademarks were presented, 21 from the advertisements used in the experiment and 21
from related products. The participant’s task was to classify each trademark as having
been presented or not. Finally, participants were asked how familiar they were with each
of the advertisements.

Figure 2. Eye and mouse scan paths from a single participant (eye scan path on the left,
mouse scan path on the right).

At the end of the experimental session, personal data (e.g., age, level of education,
familiarity with computers) were collected and participants were debriefed. The entire
experiment lasted approximately 1 hour.

Data analysis. Not all trials could be included in the analysis because of missing or
unreliable eye movement data. If more than 0.5 s of data was unusable, the trial was not
analyzed. Missing data resulted from the camera losing track of the eye, from extreme
position readings caused by the camera misreading the reflection point, and from the
correction of eye blinks. Approximately 33 trials per group (9.6% of the data) were
excluded from analysis. Additionally, occasional extreme values (outliers) were removed
and replaced by the average of the values immediately before and after the outlier. The
same procedure was applied to brief eye blinks.
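
As a rough sketch of this cleaning step (the report does not specify how bad samples were flagged, so the boolean mask and the 0.5-s exclusion check below are assumptions about one way it could be implemented):

```python
import numpy as np

def clean_trace(xy, bad, sample_rate_hz=50, max_bad_s=0.5):
    """Replace flagged samples by the mean of the nearest good neighbours.

    xy  : (n, 2) array of eye (or mouse) screen coordinates.
    bad : boolean array marking unusable samples (blinks, extreme readings);
          how samples are flagged is left to the caller.
    Returns the cleaned trace, or None if the trial should be discarded
    because more than max_bad_s of data is unusable.
    """
    if bad.sum() / sample_rate_hz > max_bad_s:
        return None                                  # trial excluded
    xy = xy.astype(float).copy()
    for i in np.flatnonzero(bad):
        prev_i, next_i = i - 1, i + 1
        while prev_i >= 0 and bad[prev_i]:
            prev_i -= 1
        while next_i < len(xy) and bad[next_i]:
            next_i += 1
        if prev_i >= 0 and next_i < len(xy):
            xy[i] = (xy[prev_i] + xy[next_i]) / 2.0  # average of neighbours
        else:                                        # bad run at a trial edge
            xy[i] = xy[next_i] if prev_i < 0 else xy[prev_i]
    return xy
```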

The relation between eye and hand movements was tested (1) by comparing the scan
paths for the eye and hand using bidimensional regression techniques and (2) by
comparing the percentages of time spent by the eye and hand, respectively, in each of the
four ROIs. Because mouse movements lagged behind eye movements, it was necessary to
compensate for the lag on each trial. This was done by determining the best fit
(bidimensional r) between the mouse and eye data. On average, 0.63 s of the mouse data
at the beginning of the trial was discarded, as was a corresponding amount of the eye
movement data at the end of the trial. The FocusTracker training group took on average
0.58 s to move the mouse, whereas the verbal instruction group needed 0.69 s (F(1, 25) =
4.37, p = .047). Both groups were slower to move the mouse on the first advertisement
(m = 1.17 s) than on all other advertisements (m = 0.61 s; F(1, 25) = 22.63, p < .001).
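
Bidimensional regression fits a translation, rotation and uniform scaling of one path onto the other; treating each (x, y) sample as a complex number gives a compact least-squares solution. The sketch below is an illustrative reimplementation of the general technique together with a lag search like the per-trial compensation just described; it is not the analysis code used in the study.

```python
import numpy as np

def bidimensional_regression(eye_xy, mouse_xy):
    """Euclidean bidimensional regression of the mouse path on the eye path.

    Returns (r, rotation_deg, expansion, (alpha1, alpha2)): the overall fit r,
    the rotation and scaling of the mouse path relative to the eye path, and
    the left-right / up-down translation.
    """
    e = eye_xy[:, 0] + 1j * eye_xy[:, 1]          # paths as complex numbers
    m = mouse_xy[:, 0] + 1j * mouse_xy[:, 1]
    e_c, m_c = e - e.mean(), m - m.mean()
    beta = np.vdot(e_c, m_c) / np.vdot(e_c, e_c)  # least squares: m ≈ alpha + beta * e
    alpha = m.mean() - beta * e.mean()
    resid = m - (alpha + beta * e)
    r2 = 1.0 - np.sum(np.abs(resid) ** 2) / np.sum(np.abs(m_c) ** 2)
    return (np.sqrt(max(r2, 0.0)), np.degrees(np.angle(beta)),
            np.abs(beta), (alpha.real, alpha.imag))

def best_lag(eye_xy, mouse_xy, max_lag_samples=50):
    """Find the shift (in samples; 50 samples = 1 s at 50 Hz) that maximises
    the bidimensional r when the first `lag` mouse samples and the last `lag`
    eye samples are discarded, as in the lag compensation described above."""
    scored = [(bidimensional_regression(eye_xy[:len(eye_xy) - lag],
                                        mouse_xy[lag:])[0], lag)
              for lag in range(max_lag_samples + 1)]
    return max(scored)[1]
```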

Table 1
Average Values from the Bidimensional Regression as a Function of Group (standard
error in parentheses)

                                             Group

Value                           FocusTracker training     Verbal Instruction

Correlation (r)                     0.87   (0.04)             0.89   (0.04)
Rotation (θ)                        0.14   (0.06)             0.09   (0.05)
Expansion (φ)                       0.80   (0.02)             0.79   (0.02)
Left-right translation (α1)         90.2   (13.3)             87.0   (12.0)
Up-down translation (α2)            79.8    (9.2)             70.9    (6.1)



Results

The relation of mouse and eye scan paths. Sample eye and mouse scan paths are shown
in Figure 2. The degree to which the mouse followed the position of the eyes was
assessed with bidimensional regression. The overall correlation between the scan paths
(r), and the rotation (θ), expansion (φ), and translation (to the right or left or up or down;
α) of the mouse scan path relative to that of the eye were determined individually for
each participant and each advertisement (see Table 1). Correlations were Fisher
transformed for analysis.
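
For reference, the Fisher transform used here is z = arctanh(r) = ½ ln((1 + r)/(1 − r)), which makes correlation coefficients approximately normally distributed so that they can be averaged and compared. A minimal illustration using the two group means from Table 1:

```python
import numpy as np

r = np.array([0.87, 0.89])     # group-level correlations from Table 1
z = np.arctanh(r)              # Fisher z = 0.5 * ln((1 + r) / (1 - r))
mean_r = np.tanh(z.mean())     # back-transformed mean, approximately 0.88
```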

Large correlations (ranging from .83 to .92 across advertisements) were found between the
mouse and eye scan paths. These correlations did not significantly differ as a function of
group. Differences between the advertisements were also minimal.3 Analysis of the
rotation parameter, θ, revealed that the mouse scan path showed a slight (m = 11.5°)
rotation to the right relative to the eye scan path. The expansion parameter, φ, with an
average value of 0.8, indicated that the mouse scan path covered a somewhat smaller area
than that covered by the eye. Finally, the mouse scan path was shifted, on average, about
4 cm above and to the right of the eye scan path.



3 One advertisement significantly differed from two others; no other differences between
advertisements were found.


Table 2
  Percent Time Spent in Each Region of Interest by Eye and Mouse as a Function of Group
  (standard error in parentheses)

                                                        Group

                               FocusTracker Training                   Verbal Instruction

  Region of Interest        % time eye       % time mouse       % time eye        % time mouse

  Illustration             41.2   (3.12)     47.1   (3.51)      46.3   (2.32)     50.1      (2.16)
  Headline                 30.1   (1.83)     28.0   (2.15)      27.7   (1.61)     26.9      (1.75)
  Text                     14.8   (1.89)     14.8   (2.10)      11.6   (1.76)     10.9      (1.69)
  Trademark                11.3   (1.27)      9.5   (1.27)      11.0   (1.22)     10.9      (1.44)



Percentage time in each region of interest. The overall percent time spent in each of the
ROIs by the eye and mouse, respectively, is shown in Table 2. Two analyses were carried
out: a correlational analysis in which the correlation between the time spent in each
region by the eye and the mouse was computed across all advertisements for each
participant, and an ANOVA with ROI (illustration, headline, text or trademark), effector
(eye or mouse) and group as factors. Correlations were Fisher transformed for analysis.

Overall, the correlation between time spent in each of the ROIs for the eye and mouse
was high (r = .88). No significant difference in the correlation as a function of group was
found (r = .88 and r = .87 for the FocusTracker and verbal instruction groups,
respectively). Additional analysis carried out per advertisement showed average
correlation coefficients ranging from .68 to .97. Figure 3 shows the time spent in each
ROI by the mouse as a function of the time spent in each ROI by the eye.

An ANOVA with ROI (illustration, headline, text or trademark) and effector (eye or
mouse) as within-subjects factors and group (FocusTracker training or verbal instruction)
as a between-subjects factor showed a main effect of ROI (F(3, 84) = 102.81, p < .001).
Follow-up tests showed that significantly more time was spent on the illustration than the
headline, and on these two ROIs than on the text or trademark. Furthermore, there was a
significant Effector x ROI interaction (F(3, 84) = 14.14, p < .001). As can be seen in
Table 2, the tendency to spend the most time on the illustration was more pronounced for
the mouse than for the eye. No differences between groups were found.

To investigate in more detail the differences in percentage time spent in the ROIs, the
number of cases in which the eye visited a ROI not visited by the mouse, and vice versa,
was computed. These percentages are shown in Table 3. The eye was more likely to visit
an area not visited by the mouse than vice versa (F(1, 28) = 31.98, p < .001). This was
especially the case for the trademark.
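
Given per-trial ROI percentages for both effectors (for instance from a function like percent_time_per_roi above), the "eye only" and "mouse only" cases reduce to a set difference; a small sketch with illustrative names:

```python
def exclusive_visits(eye_pct, mouse_pct):
    """ROIs visited by only one effector on a trial (any time > 0 counts)."""
    eye_rois   = {roi for roi, p in eye_pct.items()   if p > 0}
    mouse_rois = {roi for roi, p in mouse_pct.items() if p > 0}
    return {"eye_only":   eye_rois - mouse_rois,
            "mouse_only": mouse_rois - eye_rois}
```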




[Figure 3: scatter plot of the percent time the mouse spent in each ROI (y-axis, 0–100)
against the percent time the eye spent in each ROI (x-axis, 0–100), with separate markers
for the headline, text, illustration and trademark ROIs.]

Figure 3. Percent time spent in each region of interest (ROI) by the mouse as a function of
percent time spent in each region of interest by the eye.

Table 3
Average Percentage of Trials in Which Either the Eye Visited a Region of Interest Not Visited by the
Mouse or the Mouse Visited a Region of Interest Not Visited by the Eye

                                                                                                                                   Group

                                                                                            FocusTracker Training                                    Verbal Instruction

Region of Interest                                                                      Eye only               Mouse only                      Eye only               Mouse only

Illustration                                                                       1.47 (0.65)                 0.0       (0.0)             0.73 (0.50)                0.33 (0.33)
Headline                                                                           6.27 (1.78)                 0.0       (0.0)             7.73 (1.73)                1.07 (0.57)
Text                                                                               9.47 (2.58)                 5.53 (1.66)              10.40 (2.30)                  8.27 (1.30)
Trademark                                                                         13.00 (2.01)                 1.73 (1.03)                 7.53 (1.62)                0.33 (0.33)




Effects of Mouse Use on Viewing Behavior

The results of the experiment comparing mouse and eye scan paths suggest that “mouse
tracking” can be an excellent substitute for eye tracking. High correlations were found
between the forms of the scan paths for mouse and eye and for the amount of time spent
in each ROI by the mouse and eye. Before using mouse tracking to evaluate observer
behavior, however, it is important to know whether the use of the mouse leads to a
different way of looking. That is, it is important to establish that using the mouse does not
produce viewing behavior different from viewing under normal conditions. In order to
examine this issue, a group of observers comparable to the experimental groups was
tested under the experimental conditions but without use of the mouse. The number and
duration of fixations in each ROI were then computed.

Table 4
Number and Duration of Fixations (in ms) as a Function of Group and Region of Interest
(standard error in parentheses)

                                              Group

                 FocusTracker Training        Verbal Instruction          Eye Tracker Only
Region of        Number of    Duration of   Number of    Duration of   Number of    Duration of
Interest         Fixations     Fixations    Fixations     Fixations    Fixations     Fixations
Illustration     8.5 (0.57)   204 (9.50)    9.7 (0.65)   203 (11.15)   8.3 (0.43)   196 (11.33)
Headline         6.6 (0.38)   191 (9.25)    6.0 (0.40)   189 (3.41)    6.3 (0.43)   171 (8.27)
Text             3.7 (0.49)   168 (5.92)    2.9 (0.43)   181 (7.68)    3.7 (0.41)   163 (8.39)
Trademark        2.6 (0.13)   168 (6.14)    2.3 (0.25)   184 (6.94)    2.8 (0.19)   157 (5.59)

Total           21.3 (0.41)   188 (8.15)   20.8 (0.57)   190 (8.17)   21.1 (0.41)   176 (7.90)

Table 4 shows the average number and duration of fixations within each ROI as a
function of whether or not observers used the mouse while viewing the advertisements.
Separate ANOVAs with ROI (illustration, headline, text or trademark) as a within-
subjects factor and group (FocusTracker training, verbal instruction or eye tracking only)
as a between-subjects factor were conducted on the percentage of time spent in each ROI,
the number of fixations in each ROI, and the average duration of the fixations. No
differences between groups were found, nor did group interact with ROI. In short, effects
of using the mouse on looking behavior are minimal.
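
The report does not state how fixations were identified. One common approach is dispersion-based detection (I-DT), sketched below purely to illustrate how fixation counts and durations of the kind shown in Table 4 could be derived; the dispersion and duration thresholds are assumed values, not parameters from the study.

```python
import numpy as np

def detect_fixations(xy, sample_rate_hz=50, max_dispersion=30.0, min_duration_s=0.1):
    """Dispersion-threshold (I-DT) fixation detection (illustration only).

    xy : (n, 2) gaze samples; max_dispersion is in the same screen units.
    Returns a list of (start_index, duration_s) fixations.
    """
    min_len = int(min_duration_s * sample_rate_hz)
    fixations, start = [], 0
    while start + min_len <= len(xy):
        end = start + min_len
        window = xy[start:end]
        if np.ptp(window[:, 0]) + np.ptp(window[:, 1]) <= max_dispersion:
            # Grow the window while the dispersion stays under the threshold.
            while end < len(xy):
                grown = xy[start:end + 1]
                if np.ptp(grown[:, 0]) + np.ptp(grown[:, 1]) > max_dispersion:
                    break
                end += 1
            fixations.append((start, (end - start) / sample_rate_hz))
            start = end
        else:
            start += 1
    return fixations
```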


Effects of Mouse Use on Recognition and Recall

If the mouse is used to track attention, and attention is subsequently tested with
recognition and recall questions, it is important to know whether mouse use has an effect
on memory for the viewed material. In order to test this, we measured recall of the
illustration, headline and trademark, and recognition of trademarks, for each of the three
groups. For the recall test, one element of the advertisement was shown (e.g., the
headline), and observers were asked to recall the other ROIs, excluding the text. Table 5
shows recall performance as a function of group, ROI and cue (e.g., either the headline or
the trademark could be shown as a cue for the illustration). An ANOVA with ROI
(illustration, headline or trademark) as a within-subjects factor and group (FocusTracker
training, verbal instruction or eye tracking only) as a between-subjects factor showed


                                                                                                ©
                                                    8                                            2006
Table 5
 Percent Correct Recall as a Function of Group, Region of Interest and Cue (standard error
 in parentheses)

                                                       Group

 Region of Interest   FocusTracker training       Verbal Instruction    Eye Tracker Only

 Illustration
   Headline cue            43% (4.83)               23% (5.88)            47% (3.83)
   Trademark cue           41% (3.91)               38% (5.34)            61% (4.98)

 Headline
   Illustration cue       14%    (3.94)              9%   (3.36)          42% (6.87)
   Trademark cue          14%    (4.18)             14% (4.62)            29% (5.05)
 Trademark
   Illustration cue        21% (3.91)               24% (5.34)            36% (5.63)
   Headline cue            18% (4.06)               17% (4.67)            18% (3.19)



.003), as well as a significant ROI x Group interaction (F(4, 88) = 6.17, p < .001). Both
groups who used the mouse remembered fewer elements of the advertisements than the
group who performed the task without the mouse. This effect was significant for the
ROIs “illustration” and “headline”, but not for the ROI “trademark”.

For the recognition test, participants were presented with a list of the 21 trademarks seen
in the advertisements, combined with a list of 21 similar trademarks. On average, 33% of
the presented trademarks were recognized. Of the similar trademarks, 13% were
incorrectly classified as having been seen. Recognition performance did not differ
between groups.


Summary

High correlations between eye and mouse scan paths and between percentage of time
spent in each ROI by the eye and the mouse indicate that the mouse is a viable alternative
to the eye tracker for measuring attention under natural viewing conditions. With scan-path
correlations of .87 to .89, more than 75% of the variability in the eye scan path is captured
by the mouse. Moreover, the lack of differences between the FocusTracker and verbal
instruction groups suggests that instructions can be given remotely without any decrement
to the accuracy of the technique.

Use of the mouse had little influence on the way in which observers viewed the
advertisements. Thus, the results would seem to be generalizable to other viewing


                                                                                        ©
                                              9                                            2006
situations. Use of the mouse did influence how much could be remembered of what was
seen. This suggests that using the mouse does make demands on mental resources.


Recommendations

Several aspects of the data should be taken into account when using the mouse to track
attention. First, both the FocusTracker training and verbal instruction groups showed a
lower correlation between eye and mouse scan paths on the first advertisement than on
subsequent advertisements. This suggests that at least one “practice trial” should be used
before the stimuli of interest are shown. Second, it should be taken into account that the
scan path of the mouse covers a smaller area than that of the eye and is shifted somewhat
to the right and toward the top of the display. The smaller area is due to the fact that
the eye sometimes made movements that were not followed with the mouse. The shift to
the right and to the top of the display may be a result of using the right hand to move the
mouse. Another factor that may have played a role in this shift is that in many of the
advertisements, the trademark was at the bottom of the advertisement. In at least some
cases, the eye moved to the trademark while the mouse lagged behind, perhaps because
the cursor would get in the way of reading the text or because the trial ended before the
mouse movement was completed.

Specifically, the following recommendations are made:

   •   Present a “practice” stimulus after training and before beginning the evaluation of
       the stimuli of interest
   •   Define regions of interest so as to take into account the tendency of the mouse scan
       path to be shifted to the right and top of the display (see the sketch following this
       list)
   •   Either exclude the first 0.63 s of each trial from analysis, or explicitly measure the
       time taken to begin moving the mouse
   •   Do not rely on memory for viewed advertisements as an indication of where
       observers allocated their attention.
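
As an illustration of how the second and third recommendations might be applied when preparing mouse data for analysis (the offset values are placeholders to be calibrated for a given display, and the sign of the vertical offset assumes screen coordinates with y increasing downward):

```python
def shift_roi(roi, dx=20, dy=-20):
    """Offset an ROI rectangle (x0, y0, x1, y1) toward the upper right to
    allow for the average rightward/upward shift of the mouse scan path.
    The 20-pixel offsets are placeholders, not values from the study."""
    x0, y0, x1, y1 = roi
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)

def trim_initial_lag(mouse_xy, sample_rate_hz=50, lag_s=0.63):
    """Discard roughly the first 0.63 s of mouse samples, per the
    recommendation above (about 31 samples at 50 Hz)."""
    return mouse_xy[int(lag_s * sample_rate_hz):]
```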

Mouse tracking is a viable alternative to eye tracking for determining which elements of
advertisements receive attention during a short viewing period. Moreover, the
FocusTracker training is a viable, on-line method of instructing observers to perform the
task.




References
Duchowski, A. T. (2002). A breadth-first survey of eye-tracking applications. Behavior
   Research Methods, Instruments, & Computers, 34, 455-470.
Johnson, A., & Proctor, R. W. (2004). Attention: Theory and practice. Thousand Oaks,
   CA: Sage Publications.
Logan, G. D., & Compton, B. J. (1998). Attention and automaticity. In R. D. Wright
   (Ed.), Visual attention. Vancouver studies in cognitive science (Vol. 8, pp. 108-131).
   New York: Oxford University Press.
von Helmholtz, H. (1894). Über den Ursprung der richtigen Deutung unserer
   Sinneseindrücke (The origin of the correct interpretation of our sensory impressions).
   Translated in R. M. Warren & R. P. Warren (1968). Helmholtz on perception, its
   physiology, and development (pp. 249-260). New York: Wiley.




                                                                                    ©
                                           11                                        2006

Mais conteúdo relacionado

Semelhante a Measuring Attention with Mouse Movements

Consumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_VehiclesConsumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_VehiclesVisteon123
 
Consumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_VehiclesConsumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_VehiclesDr. Alexander van Laack
 
User-Defined-Gesture-for-Flying-Objects
User-Defined-Gesture-for-Flying-ObjectsUser-Defined-Gesture-for-Flying-Objects
User-Defined-Gesture-for-Flying-ObjectsPuchin Chen
 
Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...
Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...
Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...Jordan Lewis
 
Eye Tracking - FEK marketing experiment
Eye Tracking - FEK marketing experimentEye Tracking - FEK marketing experiment
Eye Tracking - FEK marketing experimentHugo Guyader
 
70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...
70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...
70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...ShivalikaGoyal1
 
WebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy testWebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy testSzymon Deja
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...ijujournal
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...ijujournal
 
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...ijujournal
 
IRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile SmartphoneIRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile SmartphoneIRJET Journal
 
Voice based Application as Medicine Spotter for Visually Impaired
Voice based Application as Medicine Spotter for Visually ImpairedVoice based Application as Medicine Spotter for Visually Impaired
Voice based Application as Medicine Spotter for Visually ImpairedIRJET Journal
 
Journal of Computer Science Research | Vol.4, Iss.1 January 2022
Journal of Computer Science Research | Vol.4, Iss.1 January 2022Journal of Computer Science Research | Vol.4, Iss.1 January 2022
Journal of Computer Science Research | Vol.4, Iss.1 January 2022Bilingual Publishing Group
 
The-Digitalization-of-the-Walking-Stick-for-the-Blind
The-Digitalization-of-the-Walking-Stick-for-the-BlindThe-Digitalization-of-the-Walking-Stick-for-the-Blind
The-Digitalization-of-the-Walking-Stick-for-the-BlindVandan Patel
 
Blue Eyes Technology
Blue Eyes TechnologyBlue Eyes Technology
Blue Eyes TechnologyColloquium
 
Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...
Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...
Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...Objective Experience
 

Semelhante a Measuring Attention with Mouse Movements (20)

Consumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_VehiclesConsumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_Vehicles
 
Consumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_VehiclesConsumer_Insights_about_Gesture_Interaction_in_Vehicles
Consumer_Insights_about_Gesture_Interaction_in_Vehicles
 
User-Defined-Gesture-for-Flying-Objects
User-Defined-Gesture-for-Flying-ObjectsUser-Defined-Gesture-for-Flying-Objects
User-Defined-Gesture-for-Flying-Objects
 
Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...
Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...
Journal of Otolaryngology-Head and Neck Surgery Face and Content Validity of ...
 
Eye Tracking - FEK marketing experiment
Eye Tracking - FEK marketing experimentEye Tracking - FEK marketing experiment
Eye Tracking - FEK marketing experiment
 
70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...
70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...
70_2018_Head mounted eye gaze tracking devices An overview of modern devices ...
 
WebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy testWebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy test
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
 
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
 
IRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile SmartphoneIRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile Smartphone
 
Voice based Application as Medicine Spotter for Visually Impaired
Voice based Application as Medicine Spotter for Visually ImpairedVoice based Application as Medicine Spotter for Visually Impaired
Voice based Application as Medicine Spotter for Visually Impaired
 
Posture Patch
Posture PatchPosture Patch
Posture Patch
 
Journal of Computer Science Research | Vol.4, Iss.1 January 2022
Journal of Computer Science Research | Vol.4, Iss.1 January 2022Journal of Computer Science Research | Vol.4, Iss.1 January 2022
Journal of Computer Science Research | Vol.4, Iss.1 January 2022
 
paper
paperpaper
paper
 
Contact lenses
Contact lensesContact lenses
Contact lenses
 
The-Digitalization-of-the-Walking-Stick-for-the-Blind
The-Digitalization-of-the-Walking-Stick-for-the-BlindThe-Digitalization-of-the-Walking-Stick-for-the-Blind
The-Digitalization-of-the-Walking-Stick-for-the-Blind
 
Blue Eyes Technology
Blue Eyes TechnologyBlue Eyes Technology
Blue Eyes Technology
 
Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...
Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...
Eye Tracking & Understanding Effectiveness of Menu Boards in Quick Service Re...
 
icmi2233-bixler
icmi2233-bixlericmi2233-bixler
icmi2233-bixler
 

Mais de MetrixLab - Global Online Consumer Research (6)

Metrix lab wat kan en moet je weten
Metrix lab wat kan en moet je wetenMetrix lab wat kan en moet je weten
Metrix lab wat kan en moet je weten
 
MetrixLab
MetrixLabMetrixLab
MetrixLab
 
Estudio lectura-de-prensa-de-los-internautascmvocento-metrixlab-2-12282209710...
Estudio lectura-de-prensa-de-los-internautascmvocento-metrixlab-2-12282209710...Estudio lectura-de-prensa-de-los-internautascmvocento-metrixlab-2-12282209710...
Estudio lectura-de-prensa-de-los-internautascmvocento-metrixlab-2-12282209710...
 
ML Pre Launch Pack Design Effectiveness
ML Pre Launch Pack Design EffectivenessML Pre Launch Pack Design Effectiveness
ML Pre Launch Pack Design Effectiveness
 
Rhetorical figures in TV commercials
Rhetorical figures in TV commercialsRhetorical figures in TV commercials
Rhetorical figures in TV commercials
 
Gillette a YouTube case Study
Gillette a YouTube case StudyGillette a YouTube case Study
Gillette a YouTube case Study
 

Último

Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘RTylerCroy
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CVKhem
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 

Último (20)

Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slides
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CV
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 

Measuring Attention with Mouse Movements

  • 1. Action as a Window to Perception: Measuring Attention with Mouse Movements A Validation Study of the MetrixLab FocusTracker Prof. dr. A. Johnson and dr. ir. L.J.M. Mulder
  • 2. Action as a Window to Perception: Measuring Attention with Mouse Movements Introduction Attention has been described as the interface between memory and events in the world. We have to attend to information if we are to encode it, and retrieval of past experiences depends on attention to appropriate cues in the environment (Logan & Compton, 1998). In visual processing, such as when shopping for a product on the internet or scanning supermarket shelves, attention is needed to locate relevant information and to guide action. Although it is possible to move the focus of attention at least a few degrees of visual angle away from the focus of the eyes (von Helmholtz, 1894), we almost always attend where we look (Johnson & Proctor, 2004). Therefore, if we want to know whether someone has attended to information, we will want to know if they have looked at it. Eye movement tracking is one means of measuring attention to scenes (Duchowski, 2002). In practice, however, the inconvenience and cost of collecting and analyzing eye-movement data limit the effectiveness of the technique for evaluating visual information displays. Recently, market research institute MetrixLab has developed a computer-based tool for tracking visual attention. This tool, the FocusTracker™, is based on the assumption that mouse movements provide a reliable indication of when and where attention is allocated on a computer screen. Because it is internet based, the tool can be used with hundreds of participants from any location in the world, allowing maximal freedom in targeting specific groups. It has the advantage that participants stay in a natural setting and that no laboratory or specialized equipment is necessary. Participants are first trained to point the mouse at high speeds, moving over images or texts on the computer screen. After this short training period, the displays of interest are presented with the instruction to the participant to “point to whatever catches your eye.” “Scan path” data from the FocusTracker can then be replayed using MetrixLab’s online FT Replay™ program to determine how attention is allocated to objects in the scene. The FocusTracker is based on the assumption that there is a one-to-one relationship between where we fix our gaze, where we point via a mouse, and what we are attending to. The question addressed in this research is thus whether the hand can be trained to follow visual spatial attention and whether attentional processing can be measured by tracking pointing movements with a handheld computer mouse. A related question is whether viewers can be adequately instructed to perform the task from their own homes, without the direct intervention of a researcher. In brief, the results of the research are very promising, showing high correlations between the scan paths for the eye and the mouse as well as high correlations between the percentage of time spent in designated regions of interest for the mouse and the eye. An additional comparison of the data with that of a group who did not use a mouse while viewing the experimental stimuli showed that viewing patterns were not disrupted by mouse use. © 1 2006
  • 3. The Experiment: Tracking the Focus of Attention In order to determine whether the hand can effectively follow the eye while viewing visual information, an experiment was conducted in which 21 advertisements were viewed for 5 seconds each. Participants were instructed to view the advertisements and to attempt to move the mouse in the same way as the eyes. Eye movements were registered with an eye tracker and hand movements were registered by logging the position of a handheld computer mouse. Two conditions were compared: one in which the participants received a short, verbal instruction and demonstration of how the mouse should be moved, and one in which participants followed the FocusTracker instruction program. Method Participants. Each group included 15 participants. The FocusTracker instruction group included 5 men and 10 women, 11 of which were university students. The mean age in this group was 25 years old (sd = 7.15). The verbal instruction group included 3 men and 12 women; 11 of which were university students. The mean age in this group was 25 years old (sd = 9.58). All but one participant had completed either a VWO or HBO program. Stimuli. The stimuli were relatively unknown advertisements1 taken from relatively expensive magazines. Advertisements were 25.5 cm high and 16-20.5 cm wide and were presented on a 17-inch computer screen. Each advertisement was made up of four regions of interest (ROIs): A headline, an illustration, text, and a trademark (see Figure 1). The position of each ROI varied across the advertisements and in some cases they overlapped. Apparatus. Eye movements were recorded with an Applied Science Laboratories model 504 eye tracker equipped with a pan/tilt camera.2 Eye position was determined 50 times per s. A Logitech infrared mouse was used to record hand position. Mouse position was also sampled 50 times per s. Participants were tested individually in a dimly lit room. Procedure. Participants were randomly assigned to either the verbal instruction or FocusTracker training groups. Participants in both groups were told that they should move the mouse to follow their eye movements. Participants in the verbal instruction group were also given a demonstration of how the mouse should be moved along with the eyes. Participants in the FocusTracker training group performed the tasks in the FocusTracker training: following a moving butterfly with the mouse, moving the mouse to each of a series of sequentially presented objects of the same type, moving the mouse to each of a series of sequentially presented objects of different types and moving the 1 The familiarity of the advertisements was tested in a pilot study in which 15 people (aged 17-40) were asked whether they had ever seen each of the advertisements. Only advertisements that were recognized by no more than 3 of the 15 participants in the pilot study were used as stimuli in the experiment. 2 Eye position is determined by comparing the pupil and the corneal reflection of infrared light emitted from the camera. © 2 2006
  • 4. (b) (a) (c) (d) Figure 1. A sample advertisement showing the four regions of interest (ROIs): the headline (a), illustration (b), text (c) and trademark (d). ROIs were defined for analysis by enclosing them in rectangles. In some cases (see ROI d), two rectangles were used to define the region. In case of overlap, the smaller ROI was subtracted from the larger ROI. mouse along with the eyes while viewing each of two advertisements. The training was followed independently by each participant, without the intervention of the researcher, and lasted approximately 1.5 – 2 min. The session began with the calibration of the eye tracker. The training was then given and was immediately followed by the experimental trials. Trials were separated with the presentation of the mouse cursor centered in a light gray screen. This screen was shown for 6 s before the first trial and for 2 s between subsequent trials. Participants were instructed to look at the mouse cursor until the advertisement appeared and were told that it would be impossible to move the mouse during these 2 s. The 21 advertisements were presented for 5 s each in the same order for all participants. After participants had viewed all of the advertisements, a surprise memory test was given. In Part 1 of the memory test, participants were shown either the headline, illustration or trademark from one of the advertisements and were asked to recall the other two attributes of the advertisement (e.g., if the trademark was shown, participants should report the illustration and the headline; recall of the text was not tested). Each cue was used seven times. Part 2 of the memory test was a recognition test in which 42 © 3 2006
Data analysis. Not all trials could be included in the analysis because of missing or unreliable eye movement data. If more than 0.5 s of data was unusable, the trial was not analyzed. Missing data resulted from the camera losing the eye position, from extreme eye-position readings caused by the camera misreading the reflection point, and from the correction for eye blinks. Approximately 33 trials per group (9.6% of the data) were excluded from analysis. Additionally, occasional extreme values (outliers) were removed and replaced by the average of the two values before and after the outlier. The same procedure was applied to brief eye blinks.

The relation between eye and hand movements was tested by (1) comparing the scan paths for the eye and hand using bidimensional regression techniques and (2) comparing the percentages of time spent by the eye and hand, respectively, in each of the four ROIs. Because mouse movements lagged behind eye movements, it was necessary to compensate for the lag on each trial. This was done by determining the best fit (bidimensional r) between the mouse and eye data. On average, 0.63 s of the mouse data at the beginning of the trial was discarded, as was a corresponding amount of the eye movement data at the end of the trial. The FocusTracker training group took on average 0.58 s to move the mouse, whereas the verbal instruction group needed 0.69 s (F(1, 25) = 4.37, p = .047). Both groups were slower to move the mouse on the first advertisement (m = 1.17 s) than on all other advertisements (m = 0.61 s; F(1, 25) = 22.63, p < .001).
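The lag-compensation step can be sketched as follows: shift the mouse trace progressively earlier relative to the eye trace, keep the shift that gives the best eye-mouse fit, and discard the unmatched samples at the beginning of the mouse data and the end of the eye data. The sketch below is an illustrative reconstruction, not the analysis code used in the study; it assumes 50 Hz traces of equal length and uses the mean of the x- and y-correlations as a simple stand-in for the full bidimensional r.

```python
# Sketch (assumptions, not the study's code): compensate for the mouse lagging
# behind the eye by shifting the mouse trace earlier in time, keeping the shift
# that maximizes the eye-mouse fit, then trimming the unmatched ends.

import numpy as np

SAMPLE_RATE = 50  # Hz; both eye and mouse were sampled 50 times per second

def xy_correlation(eye: np.ndarray, mouse: np.ndarray) -> float:
    """Simple proxy for the bidimensional r: mean of the x and y correlations."""
    rx = np.corrcoef(eye[:, 0], mouse[:, 0])[0, 1]
    ry = np.corrcoef(eye[:, 1], mouse[:, 1])[0, 1]
    return (rx + ry) / 2.0

def compensate_lag(eye: np.ndarray, mouse: np.ndarray, max_lag_s: float = 1.5):
    """Return the estimated lag (in s) and the trimmed, aligned traces.

    eye, mouse: (n_samples, 2) arrays of screen coordinates recorded at 50 Hz.
    For a candidate lag of k samples, the first k mouse samples and the last
    k eye samples are discarded, mirroring the procedure described above.
    """
    max_lag = int(max_lag_s * SAMPLE_RATE)
    best = (0, xy_correlation(eye, mouse))
    for k in range(1, max_lag + 1):
        r = xy_correlation(eye[:-k], mouse[k:])
        if r > best[1]:
            best = (k, r)
    k = best[0]
    if k == 0:
        return 0.0, eye, mouse
    return k / SAMPLE_RATE, eye[:-k], mouse[k:]

# Toy demonstration: a mouse trace that trails the eye by about 0.6 s (30 samples).
t = np.linspace(0, 5, 250)
eye = np.column_stack([np.cos(t) * 200 + 400, np.sin(t) * 150 + 300])
mouse = np.roll(eye, 30, axis=0) + np.random.normal(0, 5, eye.shape)
lag, eye_aligned, mouse_aligned = compensate_lag(eye, mouse)
print(f"estimated lag: {lag:.2f} s")  # approximately 0.60 s
```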
Results

The relation of mouse and eye scan paths. Sample eye and mouse scan paths are shown in Figure 2.

Figure 2. Eye and mouse scan paths from a single participant (two panels: eye scan path and mouse scan path).

The degree to which the mouse followed the position of the eyes was assessed with bidimensional regression. The overall correlation between the scan paths (r), and the rotation (θ), expansion (φ), and translation (left-right, α1, and up-down, α2) of the mouse scan path relative to that of the eye were determined individually for each participant and each advertisement (see Table 1). Correlations were Fisher transformed for analysis.

Table 1. Average Values from the Bidimensional Regression as a Function of Group (standard error in parentheses)

  Value                          FocusTracker training    Verbal instruction
  Correlation (r)                0.87 (0.04)              0.89 (0.04)
  Rotation (θ)                   0.14 (0.06)              0.09 (0.05)
  Expansion (φ)                  0.80 (0.02)              0.79 (0.02)
  Left-right translation (α1)    90.2 (13.3)              87.0 (12.0)
  Up-down translation (α2)       79.8 (9.2)               70.9 (6.1)

Large correlations (ranging from .83 to .92 across advertisements) were found between the mouse and eye scan paths. These correlations did not differ significantly as a function of group, and differences between the advertisements were minimal.³ Analysis of the rotation parameter, θ, revealed that the mouse scan path was rotated slightly (m = 11.5°) to the right relative to the eye scan path. The expansion parameter, φ, with an average value of 0.8, indicated that the mouse scan path covered a somewhat smaller area than that covered by the eye. Finally, the mouse scan path was shifted, on average, about 4 cm above and to the right of the eye scan path.

³ One advertisement differed significantly from two others; no other differences between advertisements were found.
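Bidimensional regression itself is not described in detail in the report. The sketch below shows a standard Euclidean (Tobler-style) version that yields the same four kinds of parameters as Table 1 (correlation, rotation, expansion, and the two translations) by treating each scan path as a set of complex numbers. It should be read as an illustration of the technique rather than as the authors' implementation.

```python
# Sketch of a Euclidean bidimensional regression, assuming the eye scan path is
# the independent configuration and the mouse scan path the dependent one.
# Illustrative only; not the analysis code used in the study.

import numpy as np

def bidimensional_regression(eye: np.ndarray, mouse: np.ndarray) -> dict:
    """Fit mouse ~ translation + expansion * rotation(eye) by least squares.

    eye, mouse: (n, 2) arrays of matched samples (already lag-compensated).
    Returns r, theta (degrees), phi (expansion), and the translation (a1, a2).
    """
    z = eye[:, 0] + 1j * eye[:, 1]      # independent configuration
    w = mouse[:, 0] + 1j * mouse[:, 1]  # dependent configuration
    zc, wc = z - z.mean(), w - w.mean()

    # Complex regression coefficient b = phi * exp(i * theta)
    b = np.sum(np.conj(zc) * wc) / np.sum(np.abs(zc) ** 2)
    a = w.mean() - b * z.mean()         # complex translation a1 + i*a2

    residuals = w - (a + b * z)
    r2 = 1.0 - np.sum(np.abs(residuals) ** 2) / np.sum(np.abs(wc) ** 2)

    return {
        "r": float(np.sqrt(max(r2, 0.0))),
        "theta_deg": float(np.degrees(np.angle(b))),
        "phi": float(np.abs(b)),
        "a1": float(a.real),
        "a2": float(a.imag),
    }

# Toy check: a "mouse" path that is the "eye" path shrunk by 0.8, rotated 10
# degrees, and shifted by (90, 80) pixels should recover roughly those values.
rng = np.random.default_rng(0)
eye = rng.uniform(0, 600, size=(250, 2))
ang = np.radians(10)
R = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
mouse = 0.8 * eye @ R.T + np.array([90.0, 80.0]) + rng.normal(0, 10, eye.shape)
print(bidimensional_regression(eye, mouse))
```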
Percentage of time in each region of interest. The overall percentage of time spent in each of the ROIs by the eye and the mouse, respectively, is shown in Table 2.

Table 2. Percent Time Spent in Each Region of Interest by Eye and Mouse as a Function of Group (standard error in parentheses)

                        FocusTracker training           Verbal instruction
  Region of interest    % time eye    % time mouse      % time eye    % time mouse
  Illustration          41.2 (3.12)   47.1 (3.51)       46.3 (2.32)   50.1 (2.16)
  Headline              30.1 (1.83)   28.0 (2.15)       27.7 (1.61)   26.9 (1.75)
  Text                  14.8 (1.89)   14.8 (2.10)       11.6 (1.76)   10.9 (1.69)
  Trademark             11.3 (1.27)    9.5 (1.27)       11.0 (1.22)   10.9 (1.44)

Two analyses were carried out: a correlational analysis in which the correlation between the time spent in each region by the eye and by the mouse was computed across all advertisements for each participant, and an ANOVA with ROI (illustration, headline, text, or trademark), effector (eye or mouse), and group as factors. Correlations were Fisher transformed for analysis.

Overall, the correlation between the time spent in each of the ROIs by the eye and by the mouse was high (r = .88). No significant difference in the correlation as a function of group was found (r = .88 and r = .87 for the FocusTracker and verbal instruction groups, respectively). Additional analyses carried out per advertisement showed average correlation coefficients ranging from .68 to .97. Figure 3 shows the time spent in each ROI by the mouse as a function of the time spent in each ROI by the eye.

An ANOVA with ROI (illustration, headline, text, or trademark) and effector (eye or mouse) as within-subject factors and group (FocusTracker training or verbal instruction) as a between-subject factor showed a main effect of ROI (F(3, 84) = 102.81, p < .001). Follow-up tests showed that significantly more time was spent on the illustration than on the headline, and more time on these two ROIs than on the text or trademark. Furthermore, there was a significant Effector x ROI interaction (F(3, 84) = 14.14, p < .001): as can be seen in Table 2, the tendency to spend the most time on the illustration was more pronounced for the mouse than for the eye. No differences between groups were found.

To investigate the differences in percentage of time spent in the ROIs in more detail, the number of cases in which the eye visited a ROI not visited by the mouse, and vice versa, was computed. These percentages are shown in Table 3. The eye was more likely to visit an area not visited by the mouse than vice versa (F(1, 28) = 31.98, p < .001); this was especially the case for the trademark.
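The percent-time analysis can be reproduced from per-sample ROI labels with a few lines of code. The sketch below is hypothetical: the report computed the eye-mouse correlation across all advertisements for each participant, whereas for brevity this example computes it within a single trial; it also flags ROIs visited by only one effector, the quantity summarized in Table 3.

```python
# Sketch (hypothetical data, not the study's code): given per-sample ROI labels
# for the eye and the mouse on one trial, compute percent time per ROI, the
# eye-mouse correlation over those percentages (Fisher-transformed, as in the
# report), and ROIs visited by only one effector.

import numpy as np

ROIS = ["illustration", "headline", "text", "trademark"]

def percent_time(labels: list) -> dict:
    """Percent of samples spent in each ROI (samples outside every ROI are ignored)."""
    inside = [lab for lab in labels if lab in ROIS]
    n = max(len(inside), 1)
    return {roi: 100.0 * inside.count(roi) / n for roi in ROIS}

def fisher_z(r: float) -> float:
    """Fisher r-to-z transform, applied before averaging or testing correlations."""
    return float(np.arctanh(r))

def compare_trial(eye_labels: list, mouse_labels: list) -> dict:
    eye_pct = percent_time(eye_labels)
    mouse_pct = percent_time(mouse_labels)
    r = float(np.corrcoef([eye_pct[k] for k in ROIS],
                          [mouse_pct[k] for k in ROIS])[0, 1])
    eye_only = [k for k in ROIS if eye_pct[k] > 0 and mouse_pct[k] == 0]
    mouse_only = [k for k in ROIS if mouse_pct[k] > 0 and eye_pct[k] == 0]
    return {"eye_pct": eye_pct, "mouse_pct": mouse_pct,
            "r": r, "fisher_z": fisher_z(r),
            "eye_only": eye_only, "mouse_only": mouse_only}

# Toy trial of 20 samples per effector (None = outside every ROI).
eye_labels = ["illustration"] * 9 + ["headline"] * 6 + ["text"] * 3 + ["trademark"] * 2
mouse_labels = ["illustration"] * 10 + ["headline"] * 6 + ["text"] * 3 + [None]
print(compare_trial(eye_labels, mouse_labels))  # 'trademark' appears under eye_only
```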
Figure 3. Percent time spent in each region of interest (ROI) by the mouse as a function of percent time spent in each ROI by the eye (axes: percent time eye in ROI, percent time mouse in ROI; legend: headline, text, illustration, trademark).

Table 3. Average Percentage of Trials in Which Either the Eye Visited a Region of Interest Not Visited by the Mouse or the Mouse Visited a Region of Interest Not Visited by the Eye

                        FocusTracker training        Verbal instruction
  Region of interest    Eye only      Mouse only     Eye only       Mouse only
  Illustration          1.47 (0.65)   0.0 (0.0)      0.73 (0.50)    0.33 (0.33)
  Headline              6.27 (1.78)   0.0 (0.0)      7.73 (1.73)    1.07 (0.57)
  Text                  9.47 (2.58)   5.53 (1.66)    10.40 (2.30)   8.27 (1.30)
  Trademark             13.00 (2.01)  1.73 (1.03)    7.53 (1.62)    0.33 (0.33)

Effects of Mouse Use on Viewing Behavior

The results of the experiment comparing mouse and eye scan paths suggest that "mouse tracking" can be an excellent substitute for eye tracking: high correlations were found between the forms of the eye and mouse scan paths and between the amounts of time spent in each ROI by the eye and the mouse. Before using mouse tracking to evaluate observer behavior, however, it is important to know whether use of the mouse leads to a different way of looking, that is, whether mouse use results in viewing behavior that differs from looking under normal conditions. In order to examine this issue, a group of observers comparable to the experimental groups was tested under the same experimental conditions but without use of the mouse. The number and duration of fixations in each ROI were then computed.
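The report does not state how fixations were extracted from the 50 Hz eye samples; this is normally handled by the eye tracker's analysis software. Purely as an illustration, a simple dispersion-based detector with assumed thresholds (a minimum duration of 100 ms and a maximum dispersion of 30 pixels) might look like the following sketch.

```python
# Illustrative dispersion-based fixation detection. The algorithm and the
# thresholds are assumptions for demonstration, not the method used in the study.

import numpy as np

SAMPLE_RATE = 50            # Hz
MIN_DURATION_S = 0.100      # minimum fixation duration (assumed)
MAX_DISPERSION_PX = 30.0    # maximum spread of samples within a fixation (assumed)

def dispersion(window: np.ndarray) -> float:
    """Spread of a window of (x, y) samples: (max x - min x) + (max y - min y)."""
    return float(np.ptp(window[:, 0]) + np.ptp(window[:, 1]))

def detect_fixations(samples: np.ndarray):
    """Return a list of (centroid_x, centroid_y, duration_ms) fixations."""
    min_len = int(MIN_DURATION_S * SAMPLE_RATE)
    fixations, i, n = [], 0, len(samples)
    while i + min_len <= n:
        j = i + min_len
        if dispersion(samples[i:j]) > MAX_DISPERSION_PX:
            i += 1
            continue
        # Grow the window while the samples stay tightly clustered.
        while j < n and dispersion(samples[i:j + 1]) <= MAX_DISPERSION_PX:
            j += 1
        window = samples[i:j]
        cx, cy = window.mean(axis=0)
        fixations.append((float(cx), float(cy), 1000.0 * len(window) / SAMPLE_RATE))
        i = j
    return fixations

# Toy trace: two roughly 200-ms fixations separated by a fast, saccade-like sweep.
trace = np.vstack([
    np.random.normal([300, 200], 3, size=(10, 2)),   # fixation 1
    np.linspace([300, 200], [500, 400], 5),          # sweep
    np.random.normal([500, 400], 3, size=(10, 2)),   # fixation 2
])
for fixation in detect_fixations(trace):
    print(fixation)
```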
Table 4 shows the average number and duration of fixations within each ROI as a function of whether or not observers used the mouse while viewing the advertisements.

Table 4. Number and Duration of Fixations as a Function of Group and Region of Interest (standard error in parentheses; durations in ms)

                        FocusTracker training        Verbal instruction           Eye tracker only
  Region of interest    Number       Duration        Number       Duration        Number       Duration
  Illustration          8.5 (0.57)   204 (9.50)      9.7 (0.65)   203 (11.15)     8.3 (0.43)   196 (11.33)
  Headline              6.6 (0.38)   191 (9.25)      6.0 (0.40)   189 (3.41)      6.3 (0.43)   171 (8.27)
  Text                  3.7 (0.49)   168 (5.92)      2.9 (0.43)   181 (7.68)      3.7 (0.41)   163 (8.39)
  Trademark             2.6 (0.13)   168 (6.14)      2.3 (0.25)   184 (6.94)      2.8 (0.19)   157 (5.59)
  Total                 21.3 (0.41)  188 (8.15)      20.8 (0.57)  190 (8.17)      21.1 (0.41)  176 (7.90)

Separate ANOVAs with ROI (illustration, headline, text, or trademark) as a within-subjects factor and group (FocusTracker training, verbal instruction, or eye tracking only) as a between-subjects factor were conducted on the percentage of time spent in each ROI, the number of fixations in each ROI, and the average duration of the fixations. No differences between groups were found, nor did group interact with ROI. In short, the effects of using the mouse on looking behavior are minimal.

Effects of Mouse Use on Recognition and Recall

If the mouse is used to track attention, and attention is subsequently tested with recognition and recall questions, it is important to know whether mouse use affects memory for the viewed material. In order to test this, we measured recall of the illustration, headline, and trademark, and recognition of trademarks, for each of the three groups. For the recall test, one element of the advertisement was shown (e.g., the headline), and observers were asked to recall the other ROIs, excluding the text. Table 5 shows recall performance as a function of group, ROI, and cue (e.g., either the headline or the trademark could be shown as a cue for the illustration). An ANOVA with ROI (illustration, headline, or trademark) as a within-subject factor and group (FocusTracker training, verbal instruction, or eye tracking only) as a between-subject factor showed significant effects of ROI (F(2, 88) = 87.44, p < .001) and group (F(2, 44) = 6.75, p = .003), as well as a significant ROI x Group interaction (F(4, 88) = 6.17, p < .001).
Table 5. Percent Correct Recall as a Function of Group, Region of Interest, and Cue (standard error in parentheses)

  Region of interest / cue    FocusTracker training   Verbal instruction   Eye tracker only
  Illustration
    Headline cue              43% (4.83)              23% (5.88)           47% (3.83)
    Trademark cue             41% (3.91)              38% (5.34)           61% (4.98)
  Headline
    Illustration cue          14% (3.94)               9% (3.36)           42% (6.87)
    Trademark cue             14% (4.18)              14% (4.62)           29% (5.05)
  Trademark
    Illustration cue          21% (3.91)              24% (5.34)           36% (5.63)
    Headline cue              18% (4.06)              17% (4.67)           18% (3.19)

Both groups who used the mouse remembered fewer elements of the advertisements than the group who performed the task without the mouse. This effect was significant for the ROIs "illustration" and "headline", but not for the ROI "trademark".

For the recognition test, participants were presented with the 21 trademarks seen in the advertisements, combined with 21 similar trademarks. On average, 33% of the presented trademarks were recognized; of the similar trademarks, 13% were incorrectly classified as having been seen. Recognition performance did not differ between groups.

Summary

High correlations between eye and mouse scan paths, and between the percentage of time spent in each ROI by the eye and the mouse, indicate that the mouse is a viable alternative to the eye tracker for measuring attention under natural viewing conditions. More than 75% of the variability in eye movements is captured by the mouse. Moreover, the lack of differences between the FocusTracker and verbal instruction groups suggests that instructions can be given remotely without any decrement to the accuracy of the technique. Use of the mouse had little influence on the way in which observers viewed the advertisements; the results would therefore seem to generalize to other viewing situations. Use of the mouse did, however, influence how much could be remembered of what was seen, which suggests that using the mouse makes demands on mental resources.
Recommendations

Several aspects of the data should be taken into account when using the mouse to track attention. First, both the FocusTracker training and verbal instruction groups showed a lower correlation between eye and mouse scan paths on the first advertisement than on subsequent advertisements, which suggests that at least one "practice trial" should be given before the stimuli of interest are shown. Second, consideration should be given to the fact that the scan path of the mouse covers a smaller area than that of the eye and is shifted somewhat toward the right and the top of the display. The smaller area is due to the fact that the eye sometimes made movements that were not followed with the mouse. The shift toward the right and the top of the display may be a result of using the right hand to move the mouse. Another factor that may have played a role in this shift is that in many of the advertisements the trademark was at the bottom of the advertisement: in at least some cases, the eye moved to the trademark while the mouse lagged behind, perhaps because the cursor would have gotten in the way of reading the text or because the trial ended before the mouse movement was completed.

Specifically, the following recommendations are made (a sketch illustrating the first three appears below):

• Present a "practice" stimulus after training and before beginning the evaluation of the stimuli of interest.
• Define regions of interest so as to take into account the tendency of the mouse scan path to be shifted toward the right and top of the display.
• Do not analyze the first 0.63 s of the trial data, or measure the time taken to move the mouse on each trial.
• Do not rely on memory for the viewed advertisements as an indication of where observers allocated their attention.

Mouse tracking is a viable alternative to eye tracking for determining which elements of advertisements receive attention during a short viewing period. Moreover, the FocusTracker training is a viable, online method of instructing observers to perform the task.
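The sketch below, referred to in the recommendations, illustrates how the first three recommendations might be applied in practice. Only the 0.63 s average lag comes from the results above; the function names, the ROI representation, and the shift values are assumptions chosen for the example.

```python
# Sketch applying three of the recommendations to mouse-tracking data.
# Function names, the ROI representation, and the offset values are
# illustrative assumptions; only the 0.63 s average lag comes from the report.

import numpy as np

SAMPLE_RATE = 50          # Hz
DISCARD_S = 0.63          # average time needed to start moving the mouse
ROI_SHIFT_X = 20.0        # assumed rightward shift of mouse ROIs, in pixels
ROI_SHIFT_Y = -15.0       # assumed upward shift (screen y grows downward)

def trim_initial_lag(mouse: np.ndarray) -> np.ndarray:
    """Drop the first 0.63 s of mouse samples (third recommendation)."""
    return mouse[int(DISCARD_S * SAMPLE_RATE):]

def shift_roi(rect):
    """Shift an ROI rectangle (left, top, right, bottom) toward the expected
    mouse position (second recommendation)."""
    left, top, right, bottom = rect
    return (left + ROI_SHIFT_X, top + ROI_SHIFT_Y,
            right + ROI_SHIFT_X, bottom + ROI_SHIFT_Y)

def is_practice_trial(trial_index: int) -> bool:
    """Treat the first presented stimulus as practice (first recommendation)."""
    return trial_index == 0

# Example: a 5 s mouse trace loses its first 31 samples, and an ROI is shifted.
mouse = np.zeros((250, 2))
print(len(trim_initial_lag(mouse)))   # 219 samples remain
print(shift_roi((50, 40, 600, 120)))  # (70.0, 25.0, 620.0, 105.0)
```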