Adam Dubrowski PhD Models for Assessing Learning and Evaluating Learning Outcomes
www.wordle.net
Acknowledgement Dr. Kathryn Parker, Learning Institute
Introduction
Introduction Is Eli OK?
Introduction Environment Intake Interaction Eli Reaction Function
Content Assessment Evaluation: Models and Frameworks  Working Example: Evaluation of a Hand Hygiene Program Assessment Instruments Standards and Rigor: Assessment Instruments Working Example: OSATS Standards and Rigor: Evaluations Summary and Take-home Message
Introduction Formative Assessments Summative
Introduction Evaluations Assessments
Introduction Process Evaluations Outcome
Introduction Alkin, M.C. & Christie, C.A. (2004). The evaluation theory tree. In Alkin, M.C. (ed.), Evaluation Roots: Tracing Theorists’ Views and Influences. London: Sage Publications.
Introduction
Reflections Where would you sit on the Evaluation Tree? What about others in your “system”?
Introduction Alkin, M.C. & Christie, C.A. (2004). The evaluation theory tree. In Alkin, M.C. (ed.), Evaluation Roots: Tracing Theorists’ Views and Influences. London: Sage Publications.
Introduction
Models and Frameworks CIPP Evaluation Model (Stufflebeam, 1983) Kirkpatrick’s Model of Evaluating Training Programs (Kirkpatrick, 1998) Moore’s Evaluation Framework (Moore et al., 2009) Miller’s Framework for Clinical Assessments (Miller, 1990)
CIPP CIPP: Context, Input, Process, Product Stufflebeam, 1983; Steinert, 2002
CIPP Context What is the relation of the course to other courses? Is the time adequate? Should courses be integrated or separate?
CIPP Inputs What are the entering ability, learning skills and motivation of students? What is the students’ existing knowledge? Are the objectives suitable? Does the content match student abilities? What is the theory/practice balance? What resources/equipment are available? How strong are the teaching skills of teachers? How many students/teachers are there?
CIPP Process What is the workload of students? Are there any problems related to teaching/learning? Is there effective 2-way communication? Is knowledge only transferred to students, or do they use and apply it? Is the teaching and learning process continuously evaluated? Is teaching and learning affected by practical/institutional problems?
CIPP Product Is there one final exam at the end or several during the course? Is there any informal assessment? What is the quality of assessment? What are the students’ knowledge levels after the course? How was the overall experience for the teachers and for the students?
CIPP Methods used to evaluate the curriculum Discussions Informal conversation or observation Individual student interviews Evaluation forms Observation in class by colleagues Performance test Questionnaire Self-assessment Written test
CIPP CIPP focuses on the process and informs the program/curriculum for future improvements. It places limited emphasis on the outcome.
Kirkpatrick’s Model of Evaluating Training Programs  Results Behavior Learning Reaction
Kirkpatrick’s Model of Evaluating Training Programs  Results Behavior Learning Reaction Reaction: The degree to which the expectations of the participants about the setting and delivery of the learning activity were met  Questionnaires completed by attendees after the activity
Kirkpatrick’s Model of Evaluating Training Programs  Results Behavior Learning Reaction Learning: The degree to which participants recall and demonstrate in an educational setting what the learning activity intended them to be able to do  Tests and observation in educational setting
Kirkpatrick’s Model of Evaluating Training Programs  Results Behavior Learning Reaction Behavior: The degree to which participants do what the educational activity intended them to be able to do in their practices  Observation of performance in patient care setting; patient charts; administrative databases
Kirkpatrick’s Model of Evaluating Training Programs  Results: The degree to which the health status of patients as well as the community of patients changes due to changes in the practice Health status measures recorded in patient charts or administrative databases, epidemiological data and reports Results Behavior Learning Reaction
Kirkpatrick’s Model of Evaluating Training Programs  The importance of Kirkpatrick’s Model is in its ability to identify a range of dimensions that need to be evaluated in order to inform us about the educational quality of a specific program. It focuses on the outcomes. However, it provides limited information about the individual learner.
Moore’s Framework Results Behavior Learning Reaction Does Shows How  Knows How  Knows
Miller’s Framework for Clinical Assessments Does (Performance) Shows How  (Competence) Knows How  (Procedural Knowledge) Knows (Declarative Knowledge)
Miller’s Framework for Clinical Assessments Does (Performance) Shows How  (Competence) Knows How  (Procedural Knowledge) Knows (Declarative Knowledge) Knows: Declarative knowledge. The degree to which participants state what the learning activity intended them to know Pre- and posttests of knowledge.
Miller’s Framework for Clinical Assessments Does (Performance) Shows How  (Competence) Knows How  (Procedural Knowledge) Knows (Declarative Knowledge) Knows how: Procedural knowledge. The degree to which participants state how to do what the learning activity intended them to know how to do  Pre- and posttests of knowledge
Miller’s Framework for Clinical Assessments Does (Performance) Shows How  (Competence) Knows How  (Procedural Knowledge) Knows (Declarative Knowledge) Shows how: The degree to which participants show in an educational setting how to do what the learning activity intended them to be able to do  Observation in educational setting
Miller’s Framework for Clinical Assessments Does: Performance. The degree to which participants do what the learning activity intended them to be able to do in their practices. Observation of performance in patient care setting; patient charts; administrative databases  Does (Performance) Shows How  (Competence) Knows How  (Procedural Knowledge) Knows (Declarative Knowledge)
Miller’s Framework for Clinical Assessments The importance of Miller’s Framework is in its ability to identify learning objectives and link them with appropriate testing contexts (where) and instruments (how).
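The link Miller's Framework draws between levels, testing contexts, and instruments can be sketched as a simple lookup table. This is a hypothetical illustration, not part of the original framework; the level names, contexts, and example instruments are taken from the slides above, and the function name `instruments_for` is an invented helper.

```python
# Hypothetical mapping of Miller's levels to testing contexts and instruments,
# following the "where" and "how" pairings described in the slides.
MILLERS_PYRAMID = {
    "knows":     {"construct": "declarative knowledge",
                  "context": "educational setting",
                  "instruments": ["pre- and posttests of knowledge"]},
    "knows how": {"construct": "procedural knowledge",
                  "context": "educational setting",
                  "instruments": ["pre- and posttests of knowledge"]},
    "shows how": {"construct": "competence",
                  "context": "educational setting",
                  "instruments": ["observation in educational setting", "OSATS"]},
    "does":      {"construct": "performance",
                  "context": "patient care setting",
                  "instruments": ["observation of performance",
                                  "patient charts", "administrative databases"]},
}

def instruments_for(level):
    """Return the example instruments matched to a Miller level."""
    return MILLERS_PYRAMID[level]["instruments"]
```

Structuring the framework this way makes the key point explicit: choosing a learning objective (level) also commits you to a testing context and a family of instruments.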
Assumption Moore’s and Kirkpatrick’s models assume a relationship between the different levels. If learners are satisfied, they will learn more, will be able to demonstrate the new skills, transfer them to the clinical setting, and consequently the health of patients and communities will improve!
Models’ assumptions are not met!
Building Skills, Changing Practice: Simulator Training for Hand Hygiene Protocols Canadian Institutes of Health Research Partnerships for Health System Improvement A. McGeer, MA. Beduz, A. Dubrowski
Purpose Current models of knowledge delivery about proper hand hygiene rely on didactic sessions. However, transfer to practice is low.
Purpose Does Shows How  Knows How  Knows Clinical practice Simulation Didactic courses
Purpose The primary purpose was to investigate the effectiveness of simulation-based training of hand hygiene on transfer of knowledge to clinical practice
Project Recruitment Clinical monitoring (audits) Δ in Behavior Simulation Δ in Learning Reaction Clinical monitoring (audits)
Results: Reactions Simulation Design Scale Educational Practices Questionnaire 159 respondents - average response: ‘Agree’ or ‘Strongly Agree’ (4 or 5 on a 5-pt scale) to each of 40 items Rated 39 of these items as ‘Important’ (4 on a 5-pt scale)
Results: Learning Moment 1     Moment 2    Moment 3     Moment 4
Results: Behaviour
Conclusions No transfer of knowledge No clear relationship between the assessments of reactions, learning and behaviour.
Summary
Summary Outcome-based models of program evaluation may be of limited use (i.e., they sit mostly on the research branch). We need new, more complex models that incorporate both processes and outcomes (i.e., span across more branches).
Summary Assessment instruments, which feed these models, are critical for successful evaluations. We need to invest effort in the standardization and rigorous development of these instruments. Evaluations Assessments
Assessment Instruments
Assessment Instruments
Standards and Rigor
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments that apply to measuring three properties: the ability to distinguish between two or more groups, to assess change over time, and to predict future status.
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations The concept to be measured needs to be defined properly and should match its intended use.
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Reliability is the degree to which the instrument is free of random error, which means free from errors in measurement caused by chance factors that influence measurement.
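One common way to quantify an instrument's freedom from random error is an internal-consistency coefficient such as Cronbach's alpha. Below is a minimal sketch in plain Python; the function name and the ratings matrix are hypothetical, invented only to illustrate the computation, and the slides do not prescribe this particular statistic.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a persons-by-items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    """
    k = len(scores[0])  # number of items on the instrument
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical checklist ratings: 4 trainees scored on 3 items (1-5 scale).
ratings = [
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 2],
]
alpha = cronbach_alpha(ratings)
```

Higher values indicate that the items vary together rather than at random; in practice one would also examine inter-rater agreement, which alpha alone does not capture.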
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Validity is the degree to which the instrument measures what it purports to measure.
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Responsiveness is an instrument’s ability to detect change over time.
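Responsiveness is often summarized with a change-score effect size such as the standardized response mean (mean pre-post change divided by the standard deviation of the changes). The sketch below is one illustrative choice, not the SAC's prescribed metric; the function name and the pre/post data are hypothetical.

```python
from statistics import mean, stdev

def standardized_response_mean(pre, post):
    """SRM: mean of the pre-to-post change scores over their sample SD."""
    changes = [after - before for before, after in zip(pre, post)]
    return mean(changes) / stdev(changes)

# Hypothetical scores for 5 learners before and after a training activity.
pre  = [10, 12, 9, 11, 13]
post = [15, 16, 13, 14, 18]
srm = standardized_response_mean(pre, post)
```

A large SRM suggests the instrument registers change in scores over time; a value near zero suggests it cannot detect the change the program is meant to produce.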
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Interpretability is the degree to which one can assign easily understood meaning to an instrument's score.
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Burden refers to the time, effort and other demands placed on those to whom the instrument is administered (respondent burden) or on those who administer the instrument (administrative burden).
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Alternative means of administration include self-report, interviewer-administered, computer-assisted, etc. Often it is important to know whether these modes of administration are comparable.
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Cultural and Language adaptations or translations.
Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations
Scientific Advisory Committee of the Medical Outcomes Trust Do our assessment instruments fit these criteria?
Working Example: OSATS Objective Structured Assessment of Technical Skills Bell-ringer exam (12 minutes per station) Minimum of 6-8 stations Direct or video assessments of technical performance
Working Example: OSATS The Conceptual and Measurement Model Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill  via an innovative "bench station" examination. Am J Surg. 1997 Mar;173(3):226-30. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M.  Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997 Feb;84(2):273-8.
Working Example: OSATS Reliability and Validity Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med. 1996 Dec;71(12):1363-5. Kishore TA, Pedro RN, Monga M, Sweet RM. Assessment of validity of an OSATS for cystoscopic and ureteroscopic cognitive and psychomotor skills. J Endourol. 2008 Dec;22(12):2707-11. Goff B, Mandel L, Lentz G, Vanblaricom A, Oelschlager AM, Lee D, Galakatos A, Davies M, Nielsen P. Assessment of resident surgical skills: is testing feasible? Am J Obstet Gynecol. 2005 Apr;192(4):1331-8; discussion 1338-40. Martin JA, Reznick RK, Rothman A, Tamblyn RM, Regehr G. Who should rate candidates in an objective structured clinical examination? Acad Med. 1996 Feb;71(2):170-5.
Working Example: OSATS Alternative Forms of Administration Maker VK, Bonne S. Novel hybrid objective structured assessment of technical skills/objective structured clinical examinations in comprehensive perioperative breast care: a three-year analysis of outcomes. J Surg Educ. 2009 Nov-Dec;66(6):344-51. Lin SY, Laeeq K, Ishii M, Kim J, Lane AP, Reh D, Bhatti NI. Development and pilot-testing of a feasible, reliable, and valid operative competency assessment tool for endoscopic sinus surgery. Am J Rhinol Allergy. 2009 May-Jun;23(3):354-9. Goff BA, VanBlaricom A, Mandel L, Chinn M, Nielsen P. Comparison of objective, structured assessment of technical skills with a virtual reality hysteroscopy trainer and standard latex hysteroscopy model. J Reprod Med. 2007 May;52(5):407-12. Datta V, Bann S, Mandalia M, Darzi A. The surgical efficiency score: a feasible, reliable, and valid method of skills assessment. Am J Surg. 2006 Sep;192(3):372-8.
Working Example: OSATS Adaptations Setna Z, Jha V, Boursicot KA, Roberts TE. Evaluating the utility of workplace-based assessment tools for specialty training. Best Pract Res Clin Obstet Gynaecol. 2010 Jun 30. Dorman K, Satterthwaite L, Howard A, Woodrow S, Derbew M, Reznick R, Dubrowski A. Addressing the severe shortage of health care providers in Ethiopia: bench model teaching of technical skills. Med Educ. 2009 Jul;43(7):621-7.
Working Example: OSATS The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations
Standards and Rigor Evaluations Assessments
The Joint Committee on Standards for Educational Evaluation (1990)  Utility Standards  The utility standards are intended to ensure that an evaluation will serve the information needs of intended users. 
The Joint Committee on Standards for Educational Evaluation (1990)  Utility Standards Feasibility Standards The feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.
The Joint Committee on Standards for Educational Evaluation (1990)  Utility Standards Feasibility Standards Propriety Standards The propriety standards are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
The Joint Committee on Standards for Educational Evaluation (1990)  Utility Standards Feasibility Standards Propriety Standards Accuracy Standards The accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine worth or merit of the program being evaluated.
Conclusions
Conclusions Process Evaluations Outcome
Conclusions Evaluations Assessments
Conclusions Three philosophies/paradigms:
Methods

Paul Gerrard - Advancing Testing Using Axioms - EuroSTAR 2010
 
DiP committee presentation
DiP committee presentationDiP committee presentation
DiP committee presentation
 
Jeff Alexander Regenstrief Conference Slides
Jeff Alexander Regenstrief Conference SlidesJeff Alexander Regenstrief Conference Slides
Jeff Alexander Regenstrief Conference Slides
 
Evaluating Student Success Initiatives
Evaluating Student Success InitiativesEvaluating Student Success Initiatives
Evaluating Student Success Initiatives
 
Evaluating Student Success Initiatives
Evaluating Student Success InitiativesEvaluating Student Success Initiatives
Evaluating Student Success Initiatives
 
Workshop-Self-Assessment-Process-Oriental-Learning.ppt
Workshop-Self-Assessment-Process-Oriental-Learning.pptWorkshop-Self-Assessment-Process-Oriental-Learning.ppt
Workshop-Self-Assessment-Process-Oriental-Learning.ppt
 
Evaluating the quality of quality improvement training in healthcare
Evaluating the quality of quality improvement training in healthcareEvaluating the quality of quality improvement training in healthcare
Evaluating the quality of quality improvement training in healthcare
 
National university assessment process
National university assessment processNational university assessment process
National university assessment process
 
Reliability And Validity
Reliability And ValidityReliability And Validity
Reliability And Validity
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
A Competency based Management Workshop.pptx
A Competency based Management Workshop.pptxA Competency based Management Workshop.pptx
A Competency based Management Workshop.pptx
 
A Workshop On Competency based Management
A Workshop On Competency based ManagementA Workshop On Competency based Management
A Workshop On Competency based Management
 
Outcomnes-based Education
Outcomnes-based EducationOutcomnes-based Education
Outcomnes-based Education
 

Mais de Adam Dubrowski

Mais de Adam Dubrowski (6)

Sbme state of the science 2
Sbme state of the science 2Sbme state of the science 2
Sbme state of the science 2
 
Dubrowski draft
Dubrowski draftDubrowski draft
Dubrowski draft
 
Dubrowski final 1
Dubrowski final 1Dubrowski final 1
Dubrowski final 1
 
Dubrowski bricks
Dubrowski bricksDubrowski bricks
Dubrowski bricks
 
Methods for Tina
Methods for TinaMethods for Tina
Methods for Tina
 
Dubrowski 1 SSE
Dubrowski 1 SSEDubrowski 1 SSE
Dubrowski 1 SSE
 

Último

microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...fonyou31
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfchloefrazer622
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajanpragatimahajan3
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdfQucHHunhnh
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Disha Kariya
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAssociation for Project Management
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Celine George
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpinRaunakKeshri1
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeThiyagu K
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docxPoojaSen20
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 

Último (20)

microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajan
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpin
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docx
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 

Dubrowski assessment and evaluation

  • 21. CIPP Process What is the workload of students? Are there any problems related to teaching/learning? Is there effective 2-way communication? Is knowledge only transferred to students, or do they use and apply it? Is the teaching and learning process continuously evaluated? Is teaching and learning affected by practical/institutional problems?
  • 22. CIPP Product Is there one final exam at the end or several during the course? Is there any informal assessment? What is the quality of assessment? What are the students’ knowledge levels after the course? How was the overall experience for the teachers and for the students?
  • 23. CIPP Methods used to evaluate the curriculum Discussions Informal conversation or observation Individual student interviews Evaluation forms Observation in class by colleagues Performance test Questionnaire Self-assessment Written test
  • 24. CIPP CIPP focuses on the process and informs the program/curriculum for future improvements. Limited emphasis on the outcome.
  • 25. Kirkpatrick’s Model of Evaluating Training Programs Results Behavior Learning Reaction
  • 26. Kirkpatrick’s Model of Evaluating Training Programs Results Behavior Learning Reaction Reaction: The degree to which the expectations of the participants about the setting and delivery of the learning activity were met Questionnaires completed by attendees after the activity
  • 27. Kirkpatrick’s Model of Evaluating Training Programs Results Behavior Learning Reaction Learning: The degree to which participants recall and demonstrate in an educational setting what the learning activity intended them to be able to do Tests and observation in educational setting
  • 28. Kirkpatrick’s Model of Evaluating Training Programs Results Behavior Learning Reaction Behavior: The degree to which participants do what the educational activity intended them to be able to do in their practices Observation of performance in patient care setting; patient charts; administrative databases
  • 29. Kirkpatrick’s Model of Evaluating Training Programs Results: The degree to which the health status of patients as well as the community of patients changes due to changes in the practice Health status measures recorded in patient charts or administrative databases, epidemiological data and reports Results Behavior Learning Reaction
  • 30. Kirkpatrick’s Model of Evaluating Training Programs The importance of Kirkpatrick’s Model is in its ability to identify a range of dimensions that need to be evaluated in order to inform us about the educational quality of a specific program. It focuses on the outcomes. However, it provides limited information about the individual learner.
  • 31. Moore’s Framework Results Behavior Learning Reaction Does Shows How Knows How Knows
  • 32. Miller’s Framework for Clinical Assessments Does (Performance) Shows How (Competence) Knows How (Procedural Knowledge) Knows (Declarative Knowledge)
  • 33. Miller’s Framework for Clinical Assessments Does (Performance) Shows How (Competence) Knows How (Procedural Knowledge) Knows (Declarative Knowledge) Knows: Declarative knowledge. The degree to which participants state what the learning activity intended them to know Pre- and posttests of knowledge.
  • 34. Miller’s Framework for Clinical Assessments Does (Performance) Shows How (Competence) Knows How (Procedural Knowledge) Knows (Declarative Knowledge) Knows how: Procedural knowledge. The degree to which participants state how to do what the learning activity intended them to know how to do Pre- and posttests of knowledge
  • 35. Miller’s Framework for Clinical Assessments Does (Performance) Shows How (Competence) Knows How (Procedural Knowledge) Knows (Declarative Knowledge) Shows how: The degree to which participants show in an educational setting how to do what the learning activity intended them to be able to do Observation in educational setting
  • 36. Miller’s Framework for Clinical Assessments Does: Performance. The degree to which participants do what the learning activity intended them to be able to do in their practices. Observation of performance in patient care setting; patient charts; administrative databases Does (Performance) Shows How (Competence) Knows How (Procedural Knowledge) Knows (Declarative Knowledge)
  • 37. Miller’s Framework for Clinical Assessments The importance of Miller’s Framework is in its ability to identify learning objectives and link them with appropriate testing contexts (where) and instruments (how).
  • 38. Assumption Moore’s and Kirkpatrick’s models assume a relationship between the different levels. If learners are satisfied, they will learn more, will be able to demonstrate the new skills, transfer them to the clinical setting, and consequently the health of patients and communities will improve!
  • 40. Building Skills, Changing Practice: Simulator Training for Hand Hygiene Protocols Canadian Institutes of Health Research Partnerships for Health System Improvement A. McGeer, MA. Beduz, A. Dubrowski
  • 41. Purpose Current models of knowledge delivery about proper hand hygiene rely on didactic sessions. However, transfer to practice is low
  • 42. Purpose Does Shows How Knows How Knows Clinical practice Simulation Didactic courses
  • 43. Purpose The primary purpose was to investigate the effectiveness of simulation-based training of hand hygiene on transfer of knowledge to clinical practice
  • 44. Project Recruitment Clinical monitoring (audits) Δ in Behavior Simulation Δ in Learning Reaction Clinical monitoring (audits)
  • 45. Results: Reactions Simulation Design Scale Educational Practices Questionnaire 159 respondents - average response: ‘Agree’ or ‘Strongly Agree’ (4 or 5 on a 5-pt scale) to each of 40 items Rated 39 of these items as ‘Important’ (4 on a 5-pt scale)
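As an aside, reaction data of this kind are usually summarized by per-item averages on the 5-point scale. A minimal sketch with invented responses (not the study's actual data):

```python
# Hypothetical Likert-scale reaction data: rows = respondents,
# columns = questionnaire items, values on a 5-point scale.
responses = [
    [5, 4, 4, 5],
    [4, 4, 5, 4],
    [5, 5, 4, 4],
]
n_items = len(responses[0])

# Mean rating for each item across respondents.
item_means = [sum(r[i] for r in responses) / len(responses)
              for i in range(n_items)]

# Count items whose average response is 'Agree' or higher (>= 4).
agree_items = sum(m >= 4 for m in item_means)
```

With these invented numbers, all four items average in the 'Agree' range, mirroring the kind of claim reported on the slide.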
  • 46. Results: Learning Moment 1 Moment 2 Moment 3 Moment 4
  • 48. Conclusions No transfer of knowledge No clear relationship between the assessments of reactions, learning and behaviour.
  • 50. Summary Outcome-based models of program evaluation may be of limited use (i.e. they are mostly on the research branch). Need new, more complex models that incorporate both processes and outcomes (i.e. span across more branches).
  • 51. Summary Assessment instruments, which feed these models, are critical for successful evaluations. We need to invest efforts in standardization and rigorous development of these instruments. Evaluations Assessments
  • 55. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments that apply to three measurement purposes: distinguishing between two or more groups, assessing change over time, and predicting future status.
  • 56. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations The concept to be measured needs to be defined properly and should match its intended use.
  • 57. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Reliability is the degree to which the instrument is free of random error, which means free from errors in measurement caused by chance factors that influence measurement.
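Reliability in this sense is often reported numerically. A minimal sketch of Cronbach's alpha, one common internal-consistency index, using hypothetical checklist ratings (not data from the slides):

```python
# Hypothetical ratings: rows = examinees, columns = checklist items.
def cronbach_alpha(scores):
    """Cronbach's alpha from a list of rows (examinees) of item scores."""
    n_items = len(scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in scores]) for i in range(n_items)]
    total_var = var([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

ratings = [
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
]
alpha = cronbach_alpha(ratings)  # high alpha = items vary together
```

Values near 1 suggest the items measure a single underlying construct with little random error; the invented data here are deliberately consistent.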
  • 58. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Validity is the degree to which the instrument measures what it purports to measure.
  • 59. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Responsiveness is an instrument’s ability to detect change over time.
  • 60. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Interpretability is the degree to which one can assign easily understood meaning to an instrument's score.
  • 61. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Burden refers to the time, effort and other demands placed on those to whom the instrument is administered (respondent burden) or on those who administer the instrument (administrative burden).
  • 62. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Alternative means of administration include self report, interviewer-administered, computer assisted, etc. Often it is important to know whether these modes of administration are comparable.
  • 63. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations Cultural and Language adaptations or translations.
  • 64. Scientific Advisory Committee of the Medical Outcomes Trust The SAC defined a set of eight key attributes of instruments: The Conceptual and Measurement Model Reliability Validity Responsiveness or sensitivity to change Interpretability Burden Alternative Forms of Administration Cultural And Language Adaptations
  • 65. Scientific Advisory Committee of the Medical Outcomes Trust Do our assessment instruments fit these criteria?
  • 66. Working Example: OSATS Objective Structured Assessment of Technical Skills Bell-ringer exam (12 minutes per station) Minimum of 6-8 stations Direct or video assessments of technical performance
  • 67. Working Example: OSATS The Conceptual and Measurement Model Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative "bench station" examination. Am J Surg. 1997 Mar;173(3):226-30. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997 Feb;84(2):273-8.
  • 68. Working Example: OSATS Reliability and Validity Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med. 1996 Dec;71(12):1363-5. Kishore TA, Pedro RN, Monga M, Sweet RM. Assessment of validity of an OSATS for cystoscopic and ureteroscopic cognitive and psychomotor skills. J Endourol. 2008 Dec;22(12):2707-11. Goff B, Mandel L, Lentz G, Vanblaricom A, Oelschlager AM, Lee D, Galakatos A, Davies M, Nielsen P. Assessment of resident surgical skills: is testing feasible? Am J Obstet Gynecol. 2005 Apr;192(4):1331-8; discussion 1338-40. Martin JA, Reznick RK, Rothman A, Tamblyn RM, Regehr G. Who should rate candidates in an objective structured clinical examination? Acad Med. 1996 Feb;71(2):170-5.
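Questions like Martin et al.'s ("who should rate candidates?") hinge on how well raters agree. A minimal sketch of Cohen's kappa, a standard chance-corrected agreement statistic, on invented pass/fail judgments from two raters (not from the cited studies):

```python
# Hypothetical pass/fail judgments from two OSATS raters on 6 candidates.
def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters' categorical labels."""
    labels = sorted(set(r1) | set(r2))
    n = len(r1)
    # Observed proportion of agreement.
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    # Agreement expected by chance from each rater's marginal frequencies.
    p_exp = sum((r1.count(l) / n) * (r2.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

rater1 = ["pass", "pass", "fail", "pass", "fail", "pass"]
rater2 = ["pass", "fail", "fail", "pass", "fail", "pass"]
kappa = cohens_kappa(rater1, rater2)
```

Kappa of 0 means agreement no better than chance, 1 means perfect agreement; the invented judgments above disagree on one candidate.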
  • 69. Working Example: OSATS Alternative Forms of Administration Maker VK, Bonne S. Novel hybrid objective structured assessment of technical skills/objective structured clinical examinations in comprehensive perioperative breast care: a three-year analysis of outcomes. J Surg Educ. 2009 Nov-Dec;66(6):344-51. Lin SY, Laeeq K, Ishii M, Kim J, Lane AP, Reh D, Bhatti NI. Development and pilot-testing of a feasible, reliable, and valid operative competency assessment tool for endoscopic sinus surgery. Am J Rhinol Allergy. 2009 May-Jun;23(3):354-9. Goff BA, VanBlaricom A, Mandel L, Chinn M, Nielsen P. Comparison of objective, structured assessment of technical skills with a virtual reality hysteroscopy trainer and standard latex hysteroscopy model. J Reprod Med. 2007 May;52(5):407-12. Datta V, Bann S, Mandalia M, Darzi A. The surgical efficiency score: a feasible, reliable, and valid method of skills assessment. Am J Surg. 2006 Sep;192(3):372-8.
  • 70. Working Example: OSATS Adaptations Setna Z, Jha V, Boursicot KA, Roberts TE. Evaluating the utility of workplace-based assessment tools for specialty training. Best Pract Res Clin Obstet Gynaecol. 2010 Jun 30. Dorman K, Satterthwaite L, Howard A, Woodrow S, Derbew M, Reznick R, Dubrowski A. Addressing the severe shortage of health care providers in Ethiopia: bench model teaching of technical skills. Med Educ. 2009 Jul;43(7):621-7.
  • 75. Standards and Rigor Evaluations Assessments
  • 76. The Joint Committee on Standards for Educational Evaluation (1990) Utility Standards The utility standards are intended to ensure that an evaluation will serve the information needs of intended users. 
  • 77. The Joint Committee on Standards for Educational Evaluation (1990) Utility Standards Feasibility Standards The feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.
  • 78. The Joint Committee on Standards for Educational Evaluation (1990) Utility Standards Feasibility Standards Propriety Standards The propriety standards are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
  • 79. The Joint Committee on Standards for Educational Evaluation (1990) Utility Standards Feasibility Standards Propriety Standards Accuracy Standards The accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine worth or merit of the program being evaluated.
  • 86. Take-home Need more complex evaluation models that look at both processes and outcomes Choices will depend on the intended use (i.e. the branch on the Evaluation Tree) Both the evaluation models and the assessment instruments need standardization and rigor.
  • 87. Beast Environment Intake Interaction Fluffy Reaction Function