Designing useful evaluations: An online workshop for the Jisc AF programme, Inspire Research Ltd
1. JISC Assessment & Feedback Programme online workshop, 7th November 2011. Designing useful evaluations. Dr Rachel A Harris
6. Action Research. McNiff and Whitehead's (2006) 'Action-reflection cycle': Observe → Reflect → Act → Evaluate → Modify → Move in new directions. Baseline → Pilot → Revisit → Review
21. Programme Evaluation Resources: https://programmesupport.pbworks.com/w/page/42511148/Evaluation%20Resources Dr Rachel A Harris. Twitter: raharris; email: rachel@inspire-research.co.uk; web: www.inspire-research.co.uk
Editor's Notes
13 projects were funded by JISC, and 2 by Becta.
The traditional view would be that by describing the current position, the baseline can "provide a sound basis on which the success of the project could later be evaluated" (Cascade). However, a baseline study can also be seen as the start of the essential project management requirement "to be aware of all factors which may impact positively or negatively on the effectiveness of the project" (KUBE).
Atelier-D (Achieving Transformation, Enhanced Learning & Innovation through Educational Resources in Design), Open University: Investigated web technologies for developing a virtual design studio space to support student learning throughout the Design programme of the Open University.
CASCADE, University of Oxford: Implemented new strategies to improve curriculum delivery models to allow the University of Oxford's Department for Continuing Education to respond more flexibly to stakeholders' needs.
COWL (Coventry Online Writing Laboratory), Coventry University: Developed and extended the pedagogy, reach and diversity of academic writing services, through a technology-enhanced teaching and learning environment.
DUCKLING (Delivering University Curricula: Knowledge, Learning & Innovation Gains), University of Leicester: Developed delivery, presentation and assessment processes to enhance the work-based learning experience of students studying remotely.
ESCAPE (Effecting Sustainable Change in Assessment Practice & Experience), University of Hertfordshire: Responding to national and institutional concerns regarding assessment and feedback, the project worked with two Schools to develop assessment for learning activities to enhance the assessment experience for learners and staff.
ISCC (Information Spaces for Creative Conversations), Middlesex University and City University: Addressed a recurrent problem in design education of students sometimes being disengaged from key creative conversations, a problem that can be exacerbated by learning technologies.
KUBE (Kingston Uplift for Business Education), Kingston College: Set out to enhance the learning experience of students studying on higher-level business education programmes delivered at Kingston College on behalf of Kingston University.
MAC (Making Assessment Count), University of Westminster: Enhanced curriculum delivery through the development of an innovative assessment feedback process, eReflect.
Springboard TV (An internet TV station to enrich teaching & learning), College of West Anglia: Set out to address challenges associated with recruitment, learner satisfaction, engagement, progression and employability by designing an innovative learner journey delivered in a simulated TV production and broadcast environment.
"Action research is a form of enquiry that enables practitioners everywhere to investigate and evaluate their work" [1]. For DUCKLING, the cycle started by 'observing' via a baseline study of staff, students and employers from both disciplines involved in the project. This was conducted using surveys and interviews and addressed the challenges of course delivery. The team 'reflected' by analysing the results of the baseline and feeding these back to course teams to inform the course redesign. The 'action' involved integrating four technologies into the redesign. This was subsequently 'evaluated' by gathering feedback from students and staff, analysing the findings and feeding this back to the course teams to inform any further 'modifications'. The blue circular arrows show the suggested evaluation cycle that projects within the Assessment and Feedback programme are likely to follow. [1] McNiff, J. & Whitehead, J. (2006). All you need to know about action research. London: SAGE Publications Ltd. See p9.
CIPP – Context, Input, Process & Product/Outcome Evaluation. The independent evaluator first devised an Evaluation Plan for Cascade, with evaluation questions, activities and data-collection methods, and defined measures of success. This was developed after reviewing project documents, such as the JISC call and the project plan, and meeting with the project team. Two days were spent identifying aims and key measures of success for the Cascade focus areas. The relationship of project aims to focus areas, and from there to the evaluation areas, is shown above. This demonstrates how the project aims were used as the starting point for the Cascade evaluation.
The programme call notes the focus "is on large-scale changes in assessment and feedback practice, supported by technology, with the aim of enhancing the learning and teaching process and delivering efficiencies and quality improvements". Projects will be expected to address at least some of these areas in their evaluation activities. This will feed into the development of an Evaluation and Synthesis Framework. The Framework will identify broad focus areas, such as those highlighted above. It will include key questions that we anticipate will be answered by project activities, as well as the key evaluation questions projects highlight in their evaluation plans.
Adapted from the HEA Enhancement Academy Evaluation and Impact Assessment Approach.
1. What do you want to happen as a result of your project's activities? (Intended outcomes)
2. Who and/or what might your activities impact on? (Levels of indicators)
3. How will you know you have achieved your intended outcomes? Or: what indicators could you use to demonstrate what you have achieved? (Indicators)
Remember that it may not be possible to evaluate the full range of project activities. (How might you investigate why the impact occurred?)
Graphic downloaded from the American Evaluation Association (AEA) eLibrary: http://bit.ly/evalsignificance
Adapted from Davidson (2009). She also provides a useful evaluation questions "cheat sheet".
1. What was the quality of the project's content/design and how well was it implemented?
2. How valuable were the outcomes to participants (students, lecturing staff, administrators)? To the institution, the community, the economy?
3. What were the barriers and enablers that made the difference between successful and disappointing implementation and outcomes?
4. What else was learned (about how or why the effects were caused/prevented, what went right/wrong, lessons for next time)?
5. Was the project worth implementing? Did the value of the outcomes outweigh the value of the resources used to obtain them?
6. To what extent is the project, or aspects of its content, design or implementation, likely to be valuable in other settings? How reusable is it elsewhere?
7. How strong is the project's sustainability? Can it survive/grow in the future with limited additional resources?
Davidson, E.J. (2009). Improving evaluation questions and answers: Getting actionable answers for real-world decision makers. Presented at the American Evaluation Association conference, Orlando, Florida. Retrieved November 20, 2009, from http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/Default.aspx?DocumentKey=e5bac388-f1e6-45ab-9e78-10e60cea0666
Adapted from the HEA Enhancement Academy Evaluation and Impact Assessment Approach.
This page was run as a poll within the session.
Gill Ferrell spoke to this slide regarding the Curriculum Design projects' experience of baselining.
Graphic downloaded from the American Evaluation Association (AEA) eLibrary: http://bit.ly/evalcorrelation