A report on research into the use of a new video annotation/rating tool in business communication courses to help assess presentation and interviewing skills. The tool facilitates peer and instructor/TA feedback on a student's performance, including continuous rating of the performance. I describe the perceived effects of using this technology on students' learning experiences.
Perceived Value of Peer and Instructor Performance Assessment Using Video Annotation: AECT 2011
1. PERCEIVED VALUE OF PEER AND INSTRUCTOR PERFORMANCE ASSESSMENT USING VIDEO ANNOTATION
Michael C. Johnson
BYU Center for Teaching & Learning
(4103.C0.b)
2. Background: The Technology
Used the online video annotation tool REACT
Live commenting and rating of peer/student performance
Rate throughout the performance and at the end
Provide timeline-specific comments and general comments at the end
Feedback immediately available
6. Background: The Case
Undergraduate Business Communications Course
Group presentation (end of semester)
The only time students used REACT during the semester
250+ students
15+ instructors
7. Background: The Case
The tool was in beta, so some technical difficulties were experienced
The classes only used the tool for this one assignment
Comments from peers were anonymous (instructor comments were identified)
8. Methods
Classroom observations of presentations and rating/commenting
Survey of students (100+ responses)
Survey of instructors (12 responses)
Thematic analysis (van Manen, 1990; 2002) taking a phenomenographic approach (Micari et al., 2007)
9. Issues/Questions
How did students and instructors perceive:
The effects of the commenting/rating process on students as presenters
The effects of the commenting/rating process on students as raters
The usefulness of the comments/ratings students received from their peers and instructors
11. Perceived Effects on Presenters
No real difference
“It was normal, I didn't really pay any attention to the fact that people were rating us.”
“I didn't think much about it. I actually forgot they were rating me. It was a good experience.”
12. Perceived Effects on Presenters
Encouraged them to practice/prepare more
“We might have been a little nervous being filmed but I think it helped us be prepared.” [emphasis added]
13. Perceived Effects on Presenters
Changed the way they presented, trying to really reach their audience, etc.
“I think it improved our presentation. We tried to make our presentation relate to the people watching as much as possible so they would enjoy themselves.”
14. Perceived Effects on Presenters
Tried harder/cared more
“I wanted to do better because I knew class could rate every small thing I did.”
“Being rated made me care more about giving a good presentation.”
15. Perceived Effects on Presenters
The audience was distracted, making it harder to keep their attention
“I could tell some people were typing instead of listening.”
16. Perceived Effects on Presenters
As presenters, they were distracted by the audience's laptops and rating activities
“It was a tad distracting because everyone was typing things while we were talking.”
17. Perceived Effects on Presenters
Created additional pressure
“Nerve racking”
“Stressful”
“Intimidating”
18. Effects on Presenters: Instructor Perspective
Raised the level of expectations for their performance
Students liked receiving feedback
Being rated and recorded caused some student anxiety
20. Perceived Effects on Raters
Paid closer attention, watched for details
“I learned more because I was paying more attention and being critical.”
21. Perceived Effects on Raters
Distracted them
“Rating others I do feel like took away from watching the presentation.”
22. Perceived Effects on Raters
They saw examples of good and bad performance
“I liked rating others because I was able to see what did and did not work for other groups and then apply that to my own presentation.”
“It allowed me to compare what good presenters did or didn't do as opposed to poor presenters.”
23. Perceived Effects on Raters
Critically analyzed performance
“By rating others, I looked at what they did well, and in return thought about how I would act if I was the one presenting. It helped me learn how to better engage the audience by using, or not using, the techniques used by those I critiqued.”
24. Perceived Effects on Raters
They became overly critical
“I saw bad posture, I saw lots of "um's," and I also saw some very effective use of PowerPoint and other presenting skills. By looking for them, I found them.”
“I WAS SUPER CRITICAL AND LOOKED FOR ERRORS.”
25. Perceived Effects on Raters
Didn't make a difference
“I don't think it makes that much difference, because I am an amateur so I only know like 4 or 5 things to look for, so I focus on those.”
26. Perceived Effects on Raters: Instructor Perspective
Helped students focus on the elements of good presentations
Students liked giving feedback
Students (as the audience) were distracted by having to rate
28. Usefulness of Comments/Ratings
Helpful to see what is going well and what needs to be improved
“They picked up things that I would have never noticed with my experience doing it alone. They said nice things and constructive things which was also appreciated.”
29. Usefulness of Comments/Ratings
Helpful to see what you were doing when a comment was made (contextualized, specific comments)
“VERY USEFUL. The comments were more specific than if they had just written comments at the end of the presentation. I can see what I was doing when they made a certain comment.”
30. Usefulness of Comments/Ratings
Instructor comments perceived as more valuable by some students
“My teacher's comments were the most helpful - some of the other comments were unclear or contradictory.”
(However, other students felt that the peer comments added more.)
31. Usefulness of Comments/Ratings
Student comments perceived as biased/overly critical/based on personal opinion
“There were still several comments that were clearly personal opinions, but that will always happen with peer reviews.”
“BIASED AND OVER ANALYTICAL.”
32. Usefulness of Comments/Ratings
Student comments perceived by some as unclear
“It was very useful and good to know, but the comments were at times vague.”
33. Usefulness of Comments/Ratings
Comments were at times contradictory
“I think it was useful, but not so much when contradicting information was given. For example, one person could say that a transition was great, while someone else can say it was not good.”
34. Anonymity of Peer Comments
Some students felt this helped them and/or their peers be more open and honest
Others felt it allowed their peers to be more critical and mean-spirited, or even to purposefully bring their grades down
Note: Comments, average ratings, and overall ratings were not anonymous to the instructor
35. Comment Type Preference
The vast majority of students preferred the timeline comments (specific, actionable)
A few preferred the general comments at the end (an overall feel for how things went)
A few others just liked reviewing the video of their performance
A couple liked the ratings better than the comments
36. Use of Comments
Students and faculty report:
Students have used feedback to improve subsequent individual presentations
Students intend to use the feedback they received
37. Some Implications for Practice?
Give students clear information about the performance criteria
Give students more low-stakes practice in front of cameras and peers, as needed
Train students how to rate, based on the criteria, and how to write clear comments
Provide more low-stakes practice as raters
Hold “common judgment” sessions
38. Some Implications for Practice?
Help students analyze and apply the comments they receive
Provide opportunities to perform again and use the feedback to improve
39. References
Micari, M., Light, G., Calkins, S., & Streitwiesser, B. (2007). Assessment beyond performance: Phenomenography in educational evaluation. American Journal of Evaluation, 28(4), 458-476.
van Manen, M. (1990). Researching lived experience: A human science for action sensitive pedagogy. Albany, NY: State University of New York Press.
van Manen, M. (2002). Phenomenology Online. Retrieved December 23, 2006, from http://www.phenomenologyonline.com/
40. Contact Information
Michael C. Johnson
Email: mc_johnson@byu.edu
Twitter: @michaelcjohnson (follow me on Twitter)
Center for Teaching & Learning
Website: http://ctl.byu.edu
Facebook: http://www.facebook.com/byuctl
Twitter: @byuctl
Editor's Notes
The primary source of data collection was a series of in-depth interviews. The instructor of the course and fourteen students who completed the course were the primary data sources. We also used the course materials (including the readings from the course) and the students' assignments, where possible, as secondary sources of data. We conducted some brief follow-up interviews through e-mail conversations. We conducted several thematic analyses of the data, using a combination of holistic, selective, and detailed approaches (van Manen, 1990; 2002). The secondary forms of data were used to obtain a richer view of the themes arising from the interview data, for negative case analysis, and to elicit discussion in follow-up interviews. As part of the analysis, we wrote brief summaries of the themes that emerged from the data of each individual participant, as well as collective themes. We conducted the conversations about the themes primarily through written correspondence. We also discussed the findings with other researchers and instructional designers in what van Manen terms collaborative analysis (van Manen, 1990).
In this case, students developed a passion for what they learned, beyond just memorizing material for a grade and moving on.