Emotions are a form of nonverbal communication that we use to reflect our physiological and mental state. We express emotions when dealing with everything around us, even with our computers. Since we are becoming more dependent on computers in our lives, we need to design more interactive systems. In other words, we need to adapt computers to our needs as well as to our behavior: make computers emotionally intelligent, so that they can detect our mood and make decisions based on it.
1. Affective Computing: Comparing Computer-Face-Based Emotion Recognition with Human Emotion Perception.
Principal Investigator: Dr. Winslow Burleson.
Researchers: Dr. Kasia Muldner, MC. Javier González Sánchez, MC. María Elena Chávez Echeagaray, BS. Patrick Lu, BS. Natalie Freed.
Developed by the Motivational Environments Team at Arizona State University, the MIT Media Lab, and the Exploratorium, the museum of science, art, and human perception.
January 22, 2010. Javier González Sánchez | María E. Chávez Echeagaray
2. Context
3. Affective Computing
Emotions are a form of nonverbal communication that we use to reflect our physiological and mental state. We express emotions when dealing with everything around us, even with our computers. We need to adapt computers to our needs as well as to our behavior: make computers emotionally intelligent, so that they can detect our mood and make decisions based on it.
4. Vision-Based?
5. Facial Analysis
• Based on an MIT Media Lab project: the MindReader API, software that enables the real-time analysis, tagging, and inference of cognitive-affective mental states from facial video. This framework combines vision-based processing of the face with predictions of mental-state models to interpret the meaning underlying head and facial signals over time.
• (Ekman and Friesen 1978) – Facial Action Coding System: 46 actions (plus head movements).
• A standard to systematically categorize the physical expression of emotions; it has proven useful to psychologists and to animators.
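The inference step described above can be sketched in miniature: detected FACS action units vote, with weights, for candidate mental states. The AU-to-state weights below are hypothetical, chosen only to show the shape of the computation; they are not the MindReader API's actual model.

```python
# Illustrative sketch (not the MindReader API): inferring a mental state
# from detected FACS action units.  The weights are invented for this
# example; only the AU descriptions in the comments are standard FACS.

FACS_WEIGHTS = {
    "concentrating": {4: 0.6, 7: 0.4},        # AU4 brow lowerer, AU7 lid tightener
    "agreeing":      {"nod": 0.8, 12: 0.2},   # head nod, AU12 lip-corner puller
    "unsure":        {2: 0.5, 14: 0.5},       # AU2 outer brow raiser, AU14 dimpler
}

def infer_state(observed_aus):
    """Score each mental state by the total weight of its observed cues."""
    scores = {
        state: sum(w for au, w in cues.items() if au in observed_aus)
        for state, cues in FACS_WEIGHTS.items()
    }
    return max(scores, key=scores.get)

print(infer_state({4, 7}))  # brow lowered + lids tightened -> "concentrating"
```

A real system would track these cues over time, as the slide notes, rather than score a single frame.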
6. MindReader API
Collaboration: Rana el Kaliouby, MIT.
7. MindReader API
8. Knowledge-Based
This is a data-mining application. Support vector machines: given a set of training examples, an SVM training algorithm builds a model that predicts which of two categories a new example falls into. We need data! Our application was exhibited at the Exploratorium, the museum of science, art, and human perception.
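The SVM idea on this slide can be sketched without any library: a miniature linear SVM trained by Pegasos-style sub-gradient descent on the hinge loss. The 2-D data and labels below are invented for illustration; they are not the exhibit's facial data.

```python
# Miniature linear SVM: Pegasos-style sub-gradient descent on the hinge
# loss with L2 regularization.  Toy 2-D data, labels +1 / -1.

def train_linear_svm(data, labels, lam=0.01, epochs=500):
    """Return (w, b) for the separating hyperplane w.x + b = 0."""
    w, b, t = [0.0, 0.0], 0.0, 0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            t += 1
            eta = 1.0 / (lam * t)                    # decaying step size
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            w = [(1 - eta * lam) * wi for wi in w]   # shrink (regularizer)
            if margin < 1:                           # example violates margin
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    """Which side of the learned hyperplane does the new example fall on?"""
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Two clusters: +1 (upper right) vs -1 (lower left).
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
y = [1, 1, -1, -1]
w, b = train_linear_svm(X, y)
print(predict(w, b, [0.85, 0.75]), predict(w, b, [0.15, 0.15]))
```

This is exactly why the slide says "We need data!": the quality of the learned boundary depends entirely on the labeled examples collected at the exhibit.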
9. Users
10. Approach One
Collaboration: Ken Perlin, NYU.
11. Approach One
Collaboration: Ken Perlin, NYU.
12. Approach Two
• Our application was exhibited in the museum for a couple of months.
• At this stage, the exhibit requires two simultaneous users: a subject and an observer.
[Figure: the observer and the subject at the exhibit]
13. Approach Two
14. Uses
We are able to detect the following states: Interested, Agreeing, Concentrating, Disagreement, Thinking, Unsure.
Application areas: Education, User Interfaces.