1. ROUNDTABLE
Dr Christian Bokhove
University of Southampton
C.Bokhove@soton.ac.uk
Dr Sally Bamber
University of Chester
s.hughes@chester.ac.uk
How can we enhance student
teachers’ evidence-based decision
competences?
2. Aim - Research in Teacher Education (RiTE) project
Promote and support (student) teachers in creating
an evidence-informed teaching practice in science,
technology, engineering and mathematics (STEM)
education.
In this RiTE project, (student) teachers are
encouraged to use evidence from educational and
scientific research to experiment with and innovate
in their teaching and learning processes.
3. This roundtable
In this roundtable we mix statements from existing
research (references on the last slide) with
frequently heard views on evidence-informed
practices. We have turned these into statements for
discussion. You can both raise your hand to say
what you think and post your views in the chat. We
will sometimes probe further interactively to
clarify the views expressed.
4. Biesta (2007, p. 20) challenges ‘what works’
“[r]esearch cannot supply us with rules for action but
only with hypotheses for intelligent problem solving.
Research can only tell us what has worked in a
particular situation, not what will work in any future
situation. The role of the educational professional in
this process is not to translate general rules into
particular lines of action. It is rather to use research
findings to make one’s problem-solving more
intelligent.”
5. Gorard et al. (2020, p. 599)
“Programmes shown not to work, or where there has
been no robust evaluation, should be actively
discouraged, given that there is growing evidence of
programmes that do seem to work.”
6. Breckon and Dodson (2016, p. 28)
“we need more impact evaluations on interventions
to increase evidence-use”
7. Hanley et al. (2016, p. 296)
“Perhaps it is time for the bluntness of the ‘what
works?’ agenda to evolve into one that also
establishes who it works for, through what means,
and in what circumstances”
8. Labaree (1998, p. 11)
“In this sense, educational researchers may not have
the kind of authority that comes with hard-pure
science, but they have a ready rhetorical access to
the public that is lacking in more authoritative fields.
As a result, the lesser form of knowledge produced
by educational researchers may in fact offer them a
political and social opportunity that is largely closed
to the more prestigious realms of the university.”
9. Labaree (2011, p. 621)
“by adopting this rationalized, quantified, abstracted,
statist, and reductionist vision of education,
educational policymakers risk imposing reforms that
will destroy the local practical knowledge that makes
the ecology of the classroom function effectively.”
10. Berliner (2002, p. 18)
“Educational Research: The Hardest Science of All” …
“We face particular problems and must deal with
local conditions that limit generalizations and theory
building—problems that are different from those
faced by the easier-to-do sciences.”
11. Nelson & Campbell (2017, p. 132)
“‘evidence’ constitutes a range of types and sources
of knowledge and information, including professional
expertise and judgement, as well as data and
research. Indeed, despite the considerable debate
about ‘gold standards’ of research methodologies,
the most frequently used sources of ‘evidence’ are
often derived from professional experiences and
colleagues rather than original research studies. […]
we need to consider carefully the accessibility,
appeal and capacity to use a range of evidence from,
in and for practice.”
12. Morgan (2021) citing Prof Becky Francis
“applying the findings from educational research in
classrooms is a challenge. However, it’s one that
teachers know to be worth tackling in order to
maximise the impact of their practice and, in turn,
accelerate pupil progress”.
13. Morgan (2021) citing Dr Tom Perry
“Even if you support the [cognitive science]
principles … there’s still quite a lot we need to know
about what a good CPD programme looks like”
14. Conclusions
We will summarise the views expressed during the
roundtable.
Dr Christian Bokhove
University of Southampton
C.Bokhove@soton.ac.uk
Dr Sally Bamber
University of Chester
s.hughes@chester.ac.uk
15. References
Berliner, D. C. (2002). Comment: Educational research: The hardest science of all. Educational
Researcher, 31(8), 18–20.
Biesta, G. (2007). Why ‘what works’ won’t work: Evidence-based practice and the democratic deficit
in educational research. Educational Theory, 57(1), 1–22.
Breckon, J., & Dodson, J. (2016). Using evidence. London: Alliance for Useful Evidence.
Gorard, S., See, B. H., & Siddiqui, N. (2020). What is the evidence on the best way to get evidence
into use in education? Review of Education, 8(2), 570–610.
Hanley, P., Chambers, B., & Haslam, J. (2016). Reassessing RCTs as the ‘gold standard’: Synergy not
separatism in evaluation designs. International Journal of Research & Method in Education, 39(3),
287–298.
Labaree, D. F. (1998). Educational researchers: Living with a lesser form of knowledge. Educational
Researcher, 27(8), 4–12.
Labaree, D. F. (2011). The lure of statistics for educational researchers. Educational Theory, 61(6),
621–632.
Morgan, J. (2021). Is cognitive science a load of trouble? TES Magazine.
Nelson, J., & Campbell, C. (2017). Evidence-informed practice in education: Meanings and
applications. Educational Research, 59(2), 127–135.