Using engagement surveys to evaluate institutional student enhancement initiatives
1. Using the framework of engagement surveys to evaluate institutional student enhancement initiatives
Prof. Rhona Sharpe,
Berry O’Donovan,
Dr. Metaxia Pavlakou
Oxford Brookes University
June 2014
2. Experience or engagement surveys?
Activity 1
Can you identify different types of questions
from the list?
3. Experience vs engagement surveys: perception vs behaviour
Experience-type questions ask about:
• Students’ perceptions
• Other people’s behaviours
• How you feel
• How satisfied you are
Engagement-type questions ask about:
• Students’ behaviour
• How often
• How much
4.
5. Cognitive Interviewing:
what it is and how to do it
Cognitive interviewing in a nutshell: the ‘puppy word problem’ example (Willis, 2005)
7. Practise your Cognitive Interviewing
Activity 2
Use the script and test the questions!
• Split in groups of 3
(1 interviewer, 1 respondent and 1 observer)
• Keep notes of the responses and be prepared to give feedback!
8. Sharing results from our cognitive interviews
• Cognitive interviews with 7 students (4 female, 3
male; 4 undergrad, 3 grad).
• Ages ranged from 20 to 52.
• Interviews lasted between 45 and 70 minutes.
• Audio recorded and transcribed.
9–13. Results from our cognitive interviews
Before → After:
• Actively engaging with issues of equity and social justice such as the reduction of prejudice, stereotyping and discrimination. → Considering issues of equality and fairness.
• Assessed your own work/your peers’ work. → Formally self-evaluated your own work/your peers’ work.
• Meeting the professional requirement of a role e.g. being assessed on professional skills, duties, behaviors, values or decision making. → Acquiring job or work related knowledge and skills.
• Indicate how many times has your Academic Adviser contacted you. → Dropped.
14. The Brookes Student Engagement Survey
Required UK pilot scales:
HOL = higher order learning
CC = course challenge
AI = academic integration
CL = collaborative learning
Based on NSSE scales:
SB = sense of belonging
CE = co-curricular engagement
SD = skills development
Brookes scales:
AC = Assessment Compact
AA = Academic Advising
GA = Graduate Attributes (AL = Academic Literacy; RL = Research Literacy; DL = Digital and Information Literacy; GC = Global Citizenship; PL = Critical Self-awareness and Personal Literacy)
15. Research Literacy Scale
During the current academic year, how much has your coursework emphasised the following mental activities? (Very much / Quite a bit / Some / Very little)
RL1 (HOL4): Evaluating or judging a point of view, decision, or information source
During the current academic year, about how often have you done each of the following? (Very often / Often / Sometimes / Never)
RL2: Identified, located and gathered information from a variety of sources, e.g. online libraries and databases
RL3: Conducted your own research, using the methods taught in your programme of study
How much has your experience at this institution contributed to your knowledge, skills and personal development in the following areas? (Very much / Quite a bit / Some / Very little)
RL4: Using information you have gathered to make an argument or decision
16. References
Gibbs, G. (2010). Dimensions of quality. York: Higher Education Academy.
Kuh, G. (2009). The National Survey of Student Engagement: Conceptual and empirical foundations. In R. M. Gonyea & G. Kuh (Eds.), New Directions for Institutional Research (Vol. 141, pp. 5-20). San Francisco, CA: Jossey-Bass.
Ouimet, J. A., Bunnage, J. C., Carini, R. M., Kuh, G. & Kennedy, J. (2004). Using focus groups, expert advice and cognitive interviewing to establish the validity of a college student survey. Research in Higher Education, 45(3), 233-250.
Tourangeau, R. (1984). Cognitive science and survey methods: A cognitive perspective. In T. Jabine, M. Straf, J. Tanur & R. Tourangeau (Eds.), Cognitive aspects of survey design: Building a bridge between disciplines (pp. 73-100). Washington, DC: National Academy Press.
Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. London: Sage.
Editor’s Notes
This presentation is aimed at those faced with the challenge of demonstrating the impact of institutional initiatives to enhance the student experience.
This year we have been members of the HEA group piloting student engagement surveys for the UK context;
we have devised, piloted and run a student engagement survey suitable for our institutional context.
This workshop will share some of the decisions and tools we used along the way, and I hope it will be useful to those developing survey tools with which to evaluate the impact of university-wide projects designed to improve the student experience.
What’s the difference between experience or satisfaction and engagement surveys?
I have photocopied a list of possible survey questions for you. Can you identify the different types of questions on this list?
The National Student Survey (NSS) was introduced in 2005 as a measure of the student experience; however, the NSS does not ask students to consider their own behaviours and the role these play in how they experience institutional conditions. In his landmark paper ‘Dimensions of Quality’, Graham Gibbs concluded that ‘satisfaction is not a valid indicator of educational quality’ (Gibbs, 2010, p. 14).
Our project developed a student engagement survey tailored for Oxford Brookes, which uses Kuh’s construct of ‘educationally purposeful activities’ which identified student activities that correlate with educational gain (Kuh, 2009).
We applied this to activities which are being promoted through the institution-wide strategic Programme for Enhancing the Student Experience and other enhancement activities.
Once you have your questions, these need to be cognitively tested…Cognitive interviewing is a technique typically used in survey design and development…to examine the extent to which the students understand the questions being asked and to provide a more contextualized understanding of survey responses…
(Has anyone already done this? Or read this book?)
In a nutshell: Gordon Willis, a cognitive psychologist at the National Institutes of Health, published this book on the technique of cognitive interviewing. He opens with an anecdote that illustrates the value of the approach: when evaluating the effectiveness of maths word problems, Willis read the following word problem to a group of second graders:
A poodle has 9 puppies
A collie has 5 puppies
How many more puppies does the poodle have?
He was surprised to find many children giving the incorrect answer ‘none’ or ‘zero’. To understand the thought process that led to this response, he asked a set of probing questions that revealed the problem: because the comparison between the poodle and the collie was never made explicit, the children considered each dog separately. They were told that the poodle had 9 puppies and were given no further information, so when asked ‘how many more’ the poodle had, they assumed that nothing had changed and answered ‘no more’ (the children interpreted ‘more’ as an increase in quantity, not as a comparison!).
Consequently, a modification was made: ‘How many more puppies does the poodle have than the collie?’ The intended answer is the comparison 9 − 5 = 4.
In most scenarios cognitive interviewing is much more sophisticated than in this example, but the point is the same: identify question wording that confuses respondents, biases their answers, or leads them in the wrong direction.
So, how are survey findings being used? How do you use these types of questions to evaluate a project?
At Oxford Brookes we had some big institutional projects whose progress we wanted to evaluate and monitor: the Assessment Compact (AC), Academic Advising (AA) and Graduate Attributes (GA). The survey went live in March, and we have just received the first data.
So, this is how we used it.
How would you use it? What are you going to do next in your institutions?