Virtual Patient (VP) evaluation and development has focused on learning outcomes and software design. In this presentation, Janet Tworek proposes that design-based research be used to understand the socio-cultural contributions and the interrelated nature of design and learning. Nabil Zary proposes implementation science as a way to explore the same. Taken together, the authors suggest that learning outcomes with VPs may be better understood by approaching VPs through an iterative design process, in which learning is appreciated simultaneously as an outcome and as a formative activity between learners, teachers and the curriculum.
VPs for Medical Education: A Design-Based Research Approach
1. Virtual Patients as drivers of Innovation in Health
Care Education:
A Framework of Design-based Research
Janet Tworek
Nabil Zary
AMEE 2011
2. Overview
• Virtual Patients & Contemporary Research
• Innovation in Education
• Design-Based Research
– Overview
– For VPs
3. Janet and Nabil
VPs for Med Education
INGREDIENTS:
1 VP software, installed locally
1 Instructional Designer
3-5 Subject Matter Experts
Multiple programmers
COOKING INSTRUCTIONS:
Build a VP in a certain style and add in video, pictures and text, maybe
lab values and costs. Publish to a server and hope medical students learn, as
assessed by RCT.
6. Reeves, T.C. (2006). Design research from a technology perspective. In: van den Akker, J., Gravemeijer, K., McKenney, S., Nieveen, N. (Eds.), Educational Design Research.
7. Gravemeijer, K. & Cobb, P. (2006). Design research from a learning perspective. In: van den Akker, J., Gravemeijer, K., McKenney, S., Nieveen, N. (Eds.), Educational Design Research.
8. Ellaway, R. & Davies, D. (2011). Design for learning: deconstructing virtual patient activities. Medical Teacher, 33:303-310.
9. Program
• Year
• Program objective(s)
Curriculum
• Course
• Course objective(s)
Virtual Patient
• Medical discipline, Clinical Presentation
• Student learning goal(s)
• Sequence, Flow, Multimedia
10. [Diagram: three nested cycles — Program, Course, and Virtual Patient — each iterating through Implementation → Feedback → (Re)design.]
11. • Belief: VPs can facilitate learning
• RCT: VPs do not facilitate learning
• Design-based research: VPs facilitate
learning, but need much more design until
we get it right
Our approach in building VPs has been focused on getting software and VPs built, and on large collections of VPs (the eVIP project). Within building VPs, work has focused on typology (branched, semi-linear and linear). Many ISD models and theories of multimedia design (Mayer, 2003, 2007) are deterministic (e.g. use video to accomplish learning goal X). Learning may not be a “recipe card”, and thus the elements within a VP may not follow one recipe to result in student learning.
VP research has been descriptive or RCT-based, reflecting larger themes in medical education research (Norman & Eva, 2010). RCTs with VPs, and meta-analyses, have demonstrated a lack of strong knowledge outcomes and transfer of learning. The conclusion has been that VPs are poor constructivist learning environments (Cook & Triola, 2010).
Innovations in education are fundamentally long-term undertakings requiring iterative feedback and design (Bereiter, 2002).
Comparing predictive research frames to design-based research: design-based research appreciates cognition in context and feedback iterations, and thought experiments become curriculum experiments that develop and inform local instructional design theory.
Each local design experiment builds into a “project”. Comparing across projects (perhaps two different VP interventions), we learn about the design needed specific to a course as well as across courses.
Beginning the framework: we need to consider the macro and micro layers that impact design (and results). Ellaway & Davies (2011) discussed this with respect to activity theory in designing e-learning interventions (e.g. VPs).
Adding to the framework: we need to consider the feedback and changes at multiple levels.
There is feedback and change at each level. These can be summarized at the end of the “intervention” (course, year, graduation, learning event, etc.) and implemented in preparation for the next time VPs are used. Typically our medical schools run so that a course offered in Year 1 ends and is not offered again until the next year. In this case, the feedback → redesign → implementation cycle follows a linear timeline. However, in true design-based research we would make notes and changes throughout the year as we reflect on issues, and continue to build and refine until the next implementation (more agile than waterfall in method).
We can see how the older model (predictive research) shapes both practice and findings. It shapes VP design, treated as a recipe to be followed, with deterministic design practices. It shapes VP research, where a deterministic approach produced “false” findings; we know there is more going on, and we need new methods of research and analysis to represent it. Design-based research gives us: a mechanism to improve the VP so we have a viable intervention; a way to make VP value visible (the learning impact, the value in students’ eyes, the value to the curriculum and program); and potential learning outcomes that predictive models do not account for.
Corning Glass is a company that had to adapt to survive: it makes dishes for the kitchen (losing profits), but Gorilla Glass has been its highest-grossing product over the last two quarters. eLearning can look like a slippery banana peel: we tried it, the RCT results showed no learning, so we drop the intervention and move on. Instead of slipping on banana peels and assuming failure, we should widen our eyes to see that new interventions may not work at first; we need to keep designing (and re-designing) them until we reach a place of student learning and acceptance. Then we need to find the right assessment tools!