Qualtrics experts will share new, advanced methods to measure leadership traits and highlight individual strengths and weaknesses. Multi-rater assessments such as 360-degree employee or student feedback provide a holistic view of an individual by gathering feedback from peers and direct reports and comparing the results with the individual's own self-evaluation.
Building a Peer Evaluation Program: Best practices for beginners
What is peer evaluation
Why run peer evaluation
Peer evaluation workflow / process
Competencies & items
Reports
What to do with results
2. Housekeeping
SOME GENERAL HOUSEKEEPING RULES
o A copy of the slides is available in the resources panel
o The webinar will be recorded and made available on-demand
o To keep the webinar short, questions will be answered via email later today.
3. Speaker
Sarah Marrs
PRINCIPAL CONSULTANT, PEOPLE INSIGHTS
Sarah has dedicated her career to designing and running feedback programs for
over 7 years. She joined Qualtrics in April 2015, after working in-house at Tesco
overseeing their global colleague insights survey. At Qualtrics, Sarah works with
clients to design feedback solutions that drive real value, through assessment
design, sampling methodologies, communication programs, or interpreting
results.
She’s a big believer in the power of listening to and acting on feedback, and is always energised to talk about new and innovative ways to collect and act on that information.
4. Agenda
TODAY WE WILL BE COVERING
o What is a peer evaluation program
o How to create a peer evaluation program
1. Designing a competency framework and items
2. Designing your reports
3. Designing your workflow
5. What is peer evaluation?
A peer evaluation program collects, quantifies, and reports observations of a student by their classmates or faculty.
COMMONLY INDIVIDUALS ARE RATED BY:
o Themselves
o Cohort / classmates
o Tutor / Professor
o Potentially course director
o Potentially academic advisor
9. What is peer evaluation?
Peer evaluation is ultimately designed to drive lasting behaviour change.
10. What is peer evaluation?
WHAT DO STUDENTS GET FROM PEER EVALUATION?
o Self-awareness (no. 1 leadership trait!)
o Improvement in learning
o Learning how to give feedback
o Learning how to receive feedback
WHAT DOES A TUTOR GET FROM PEER EVALUATION?
o A feedback instrument that works for different student styles
o An instrument that allows them to gauge student dynamics
o An instrument that gives them data-driven feedback aggregation
11. What is peer evaluation?
THREE COMMON USES FOR PEER EVALUATION
1. Measure contributions and teamwork at the end of a specific
project to feed into project grades
2. Mid-year measurement of a student’s behaviours to provide
developmental feedback
3. End-of-placement review of a student work placement to get
feedback from both professional and academic sides
13. How to create a peer evaluation program
THREE STEPS TO CREATE YOUR PEER EVALUATION
1. Design your competency framework and items
2. Design your reports
3. Design your workflow
14. STEP 1 - Design a competency framework & items
[Diagram: Competencies A, B, and C, each broken down into three items (behaviours 1-9).]
15. STEP 1 - Design a competency framework & items
TEAM WORKING SKILLS
o Listens to classmates’ ideas
o Takes on others’ feedback
o Works well with classmates
CONTRIBUTION TO WORKLOAD
o Attends all working sessions
o Participates actively in working sessions
o Is able to carry out tasks independently
PLANNING & ORGANISING
o Arrives on time to working sessions
o Manages their schedule well
o Meets project-related deadlines
PRO-TIP
Frequency scales are the best way to focus on the consistency of behaviours, e.g.:
• Never
• Rarely
• Sometimes
• Often
• All the time
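The competency framework above is just a grouping of behaviour items under named competencies, all rated on one shared frequency scale. A minimal sketch of that structure (competency and item names are taken from the example slide; the 1-5 mapping is an illustrative assumption, not a Qualtrics convention):

```python
# Frequency scale from the PRO-TIP, mapped to 1-5 for aggregation (assumed mapping).
FREQUENCY_SCALE = ["Never", "Rarely", "Sometimes", "Often", "All the time"]

# Each competency groups a few observable, single-behaviour items.
COMPETENCIES = {
    "Team working skills": [
        "Listens to classmates' ideas",
        "Takes on others' feedback",
        "Works well with classmates",
    ],
    "Contribution to workload": [
        "Attends all working sessions",
        "Participates actively in working sessions",
        "Is able to carry out tasks independently",
    ],
    "Planning & organising": [
        "Arrives on time to working sessions",
        "Manages their schedule well",
        "Meets project-related deadlines",
    ],
}

def scale_value(label: str) -> int:
    """Map a frequency label to a 1-5 score so ratings can be averaged."""
    return FREQUENCY_SCALE.index(label) + 1

total_items = sum(len(items) for items in COMPETENCIES.values())
print(total_items)           # → 9
print(scale_value("Often"))  # → 4
```

Keeping the scale separate from the items makes the "do not match the rating scale" failure mode on the next slide easy to avoid: every item must read sensibly against the same frequency labels.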
16. GOOD ITEMS...
o Capture an important aspect of a key competency
o Focus on a single behaviour
o Start with an action verb (e.g., motivates, listens, etc.)
o Are written in simple plain language
o Can be acted upon if they are identified as
weaknesses/ opportunities
BAD ITEMS...
o Loosely measure, or fail to measure, an important competency
o Focus on multiple behaviours (double- or triple-barrelled)
o Include too many unnecessary adverbs (e.g., efficiently, effectively)
o Are too technical or hard to understand
o Do not match the rating scale
o Are culturally biased
STEP 1 - Design a competency framework & items
17. STEP 1 - Design a competency framework & items
OTHER COMPETENCY AND ITEM TIPS
o Avoid including too many items (generally 50-60 total items max)
o Include a few (up to 3) open-ended items that ask students to provide constructive
and actionable feedback to their cohort
o Remind evaluators not to include identifying information in open-ended items to
protect confidentiality
18. STEP 2 - Design your reports
TWO MAIN TYPES OF REPORT
1. The individual report (to go back to the student)
2. The aggregated report (to be seen by the tutor / professor / course director only)
19. STEP 2 - Design your reports
GOOD REPORTS:
AGGREGATE RESULTS WITHIN SOURCE – Aggregate ratings within source for each competency and item
PROVIDE MEANINGFUL COMPARISONS – Compare source-level ratings to internal benchmarks
INCLUDE OPEN-ENDED COMMENTS – Present both positive and constructive comments
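The two numeric report elements above, within-source aggregation and internal benchmarking, amount to grouped averages over the raw ratings. A sketch in plain Python (the data shape and names are illustrative assumptions, not a Qualtrics API):

```python
from collections import defaultdict
from statistics import mean

# Each rating: (subject, source, competency, score on a 1-5 frequency scale).
ratings = [
    ("alice", "Self",      "Team working skills", 4),
    ("alice", "Classmate", "Team working skills", 3),
    ("alice", "Classmate", "Team working skills", 5),
    ("alice", "Tutor",     "Team working skills", 4),
    ("bob",   "Classmate", "Team working skills", 2),
    ("bob",   "Classmate", "Team working skills", 4),
]

def aggregate_within_source(ratings, subject):
    """Average each competency's scores per rating source for one subject."""
    grouped = defaultdict(list)
    for who, source, competency, score in ratings:
        if who == subject:
            grouped[(source, competency)].append(score)
    return {key: mean(scores) for key, scores in grouped.items()}

def internal_benchmark(ratings, source, competency):
    """Cohort-wide average for one source/competency: the comparison point."""
    scores = [s for _, src, comp, s in ratings if src == source and comp == competency]
    return mean(scores)

alice = aggregate_within_source(ratings, "alice")
print(alice[("Classmate", "Self")] if False else alice[("Classmate", "Team working skills")])  # → 4
print(internal_benchmark(ratings, "Classmate", "Team working skills"))                         # → 3.5
```

Averaging within source before reporting also preserves confidentiality: an individual classmate's rating is never shown on its own, only the pooled classmate view next to the self and tutor views.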
22. STEP 3 - Design your workflow
COMMON WORKFLOW DECISIONS
o Who chooses who should evaluate an individual?
o Should the tutor / professor approve nominations for evaluators?
o Should evaluators have an opportunity to opt out?
o Should the individual receive their report directly?
23. STEP 3 - Design your workflow
SCENARIO 1. MEASURE CONTRIBUTIONS AND TEAMWORK AT THE END OF A SPECIFIC PROJECT TO FEED INTO PROJECT GRADES
1. Tutor / professor adds all students in project groups and asks all students in a group to rate each other
2. All students need to complete their project group evaluations (no opting out)
3. Report goes back to the tutor / professor directly for each individual; an aggregate report is also created, allowing them to compare individuals
4. Tutor discusses the report with the student and feeds it into project grades
24. STEP 3 - Design your workflow
SCENARIO 2. MID-YEAR MEASUREMENT OF AN INDIVIDUAL’S BEHAVIOURS TO PROVIDE DEVELOPMENTAL FEEDBACK
1. Individual is asked to select 3-5 peers and up to 3 tutors who they would like to receive feedback from
2. Individual will automatically receive evaluations from their course director and academic advisor
3. Peer evaluations are opt-in, so not mandatory for all classmates to complete
4. Report goes back to the individual student first, for them to digest and later discuss with their academic advisor or tutor; course directors get aggregated feedback reports
25. STEP 3 - Design your workflow
SCENARIO 3. END-OF-PLACEMENT REVIEW OF A STUDENT WORK PLACEMENT TO GET FEEDBACK FROM BOTH PROFESSIONAL AND ACADEMIC SIDES
1. Course director adds placement students, asking the tutor, classmates, and professional coworkers to evaluate them
2. Placement evaluations are ‘nudged’ but not mandatory
3. Individual student report goes to each tutor for a feedback discussion
4. Aggregate report goes to the course director to review the placement program as a whole
5. Course directors reach out to company placement teams to discuss feedback / any red flags
26. In Summary
1. Define Purpose
2. Define Competencies & Behaviours
3. Define Report
4. Define Workflow
5. Launch evaluation
6. Follow-up with students
PEER EVALUATION FACILITATES DATA-DRIVEN OUTCOMES:
Degree to which behaviours are exhibited by the Subjects
Allows for meaningful comparisons…
of evaluator perceptions across multiple Subjects (internal benchmarking)
of evaluator perceptions across sources of ratings (e.g., Subordinate vs. Manager ratings of a Subject)
of ratings over time
Self-awareness – study conducted in 2010 by the Cornell School of Industrial and Labor Relations and Green Peak Partners
Think executive coaching for CEOs – all about self-awareness
Works for different student styles – individual accountability during group work. Students who like it are happy; it gives accountability for those who don’t.
Talk about how, with no. 2, it might potentially tie into overall course evaluations.
The foundation of a 360 Feedback Assessment is a relevant competency model. However, Subjects SHOULD NOT be rated directly on competencies but rather on specific behaviours associated with each competency.
Open-ends – think Stop, Start, Continue.
360 Feedback reports must provide robust data for employees to process, but be simple enough to easily digest the insights so action can be taken
Establish one solid workflow based on best practice.
Point of running this on a technology platform as opposed to forms – admin control over workflow, confidentiality, etc.
Very easy to scale across organisations.