Principles of student assessment in medical education 2017 SATYA

  1. The goal of education is to bring about a desirable change in knowledge, skills, and attitudes in the learner.
  3. Education is a cyclical process.
  5. PRINCIPLES OF STUDENT ASSESSMENT. Prof. Dr. V. Sathyanarayanan, MBBS, MD (FIME), SRM MCH & RC, SRM University, Kattankulathur, India.
  6. Outline: What is assessment? Why is it required? Types of assessment; characteristics of assessment; assessment tools; steps in assessment; limitations of assessment; summary.
  7. Objectives: at the end of this session, participants will be able to (1) name the types of assessment; (2) compare formative and summative assessment; (3) explain the characteristics of an assessment tool; (4) identify the steps in student assessment; (5) choose an appropriate assessment tool; (6) express enthusiasm to apply the principles of student assessment in everyday practice.
  9. Assessment: the process of determining whether predetermined educational objectives have been achieved.
  10. We should assess what we teach and teach what we assess.
  12. Assessment and evaluation are often used interchangeably. For our purposes, however, assessment describes the measurement of learner outcomes, while evaluation describes the measurement of course/program outcomes.
  15. Formative assessment: done at the classroom level; done for planning teaching-learning; for student development; provides feedback for students and teachers; can help to modify teaching-learning methods; done before the final test of competence. Summative assessment: done at a wider level (school, college, university, national); done for scoring/grading or pass/fail decisions; used to establish competence before certification.
  16. A crucial distinction. Assessment OF learning (summative): how much have students learned at a particular point in time? Assessment FOR learning (formative): how can we use assessment to help students learn more?
  17. "When the cook tastes the soup, that's formative assessment;
  18. when the customer tastes the soup, that's summative assessment." (Brookhart, 1999)
  19. Norm-referenced assessment: the student's performance is compared with that of his or her peers; used when only a fixed number of places is available, e.g. entry to a postgraduate course.
  20. Criterion-referenced assessment: performance is compared with certain predetermined criteria; requires standard setting.
  21. Why do we assess? What should we assess? When should we assess? How should we assess?
  23. Linking learning, assessment, and feedback: a cycle of assessment, feedback, and learning.
  24. "Assessment drives learning in the direction you wish." (Evolution of Medical Students, a website by NUS students: http://medicus.tk)
  25. Why assess? Assessment drives learning; provides baseline data; provides summative and formative feedback; allows measurement of individual progress; encourages student reflection; assures the public that providers are competent; and meets licensure/credentialing requirements.
  26. Stakeholders in assessment: students; teachers; departments, faculty, universities, and administrators; the public and governmental agencies. Stakeholders' interests in assessment are not necessarily aligned.
  27. Students, teachers, faculty and university, the public and government.
  29. What should we assess? Effective education spans knowledge, skills, and attitudes.
  31. Competence ('shows how'): measures what the student can do.
  32. Performance ('does'): measures what the student actually does; applicable only to real-life situations; authentic assessment.
  33. Miller's pyramid of clinical competence: 'Knows' and 'Knows how' (cognition) at the base, 'Shows how' and 'Does' (performance) above; professional authenticity increases up the pyramid. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
  36. Assessment of 'Knows' and 'Knows how': (1) long essay questions (LEQ); (2) short answer questions (SAQ); (3) multiple choice questions (MCQ); (4) extended matching items (EMI); (5) oral examination/viva.
  38. Assessment of 'Shows how': (1) long case; (2) short case; (3) Objective Structured Clinical Examination (OSCE); (4) Objective Structured Practical Examination (OSPE).
  39. Assessment of 'Does': (1) Mini Clinical Evaluation Exercise (mini-CEX); (2) Direct Observation of Procedural Skills (DOPS); (3) 360-degree evaluation; (4) logbook; (5) portfolio.
  41. Continuum of performance: the 'learning curve'.
  42. An examination that tests students' mastery at a given point in time is less preferable than one that tests mastery over a span of time.
  46. Validity: the ability of an assessment instrument to test what it is supposed to test.
  47. Validity is the most important characteristic of an assessment tool. Ask: "What is the learning outcome to be measured?" Validity is a matter of degree (less valid or more valid).
  48. Content validity: the ability of the assessment instrument to sample representative content of the course.
  49. Reliability refers to the consistency of test scores: over time, between different examiners, and under different testing conditions. Instruments for student assessment need high reliability to ensure transparency and fairness.
  50. Reliability is a measure of the reproducibility of a test's results: the correlation between two sets of scores obtained when the test is repeated after an interval (test-retest method), or when the test is split into two halves and the results are compared (split-half method).
  51. [Grid of examiners (Ex 1-5) versus questions (Q 1-5): each question is marked by a single examiner.]
  52. [Grid of examiners versus questions: every examiner marks the same single question, Q 1.]
  53. [Grid of examiners versus questions: each question is again marked by a single examiner.]
  54. Relevance: assessment must be appropriate to the needs of the context, and this should be obvious to both the teacher and the student.
  56. Objectivity: the degree to which the assessment adheres to specific criteria. Measures to increase objectivity of scoring: (1) structuring the questions; (2) preparing model answers; (3) agreement on a marking scheme; (4) independent assessment by more than one examiner.
  57. Feasibility: the degree to which the assessment process is practical and possible to implement in the circumstances (time, expertise, cost).
  58. Validity: are we measuring what we are supposed to be measuring? Use the appropriate instrument for the knowledge, skill, or attitude being tested. The major types of validity (content, predictive, and face) should be considered.
  59. Reliability: does the test consistently measure what it is supposed to measure? Types of reliability: inter-rater (consistency over raters), test-retest (consistency over time), and internal consistency (over different items/forms).
  60. Feasibility: is administration of the assessment instrument feasible in terms of time and resources? Time to construct? Time to score? Ease of interpreting the score and producing results? Practical given staffing and organization? Quality of feedback? Learner takeaway? Does it motivate the learner?
  61. Practical considerations: number of students to be assessed, time available for the assessment, number of staff available, resources and equipment available, and special accommodations.
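Internal consistency, one of the reliability types listed above, is usually quantified with Cronbach's alpha. A minimal sketch (not from the slides, with hypothetical item scores for six students):

```python
# Illustrative sketch with hypothetical data: Cronbach's alpha as a
# measure of internal consistency across test items.
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list of per-student scores for each test item."""
    k = len(item_scores)  # number of items
    sum_item_vars = sum(pvariance(item) for item in item_scores)
    # Total score per student across all items.
    totals = [sum(per_student) for per_student in zip(*item_scores)]
    return (k / (k - 1)) * (1 - sum_item_vars / pvariance(totals))

# Five items, each scored 0-5, for six hypothetical students.
items = [
    [4, 5, 2, 5, 3, 4],
    [3, 5, 1, 4, 3, 5],
    [4, 4, 2, 5, 2, 4],
    [2, 5, 1, 4, 3, 4],
    [3, 4, 2, 5, 3, 5],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

High alpha here simply reflects that the hypothetical items rank the students consistently; in practice, high-stakes examinations typically aim for alpha of about 0.8 or above.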
  62. Assessment involves (1) objective measurement: marks, rank, percentile; and (2) value judgement: regarding the desirability of the result of measurement, prompting a relook at the student learning outcomes and the teaching-learning process, and a decision on changes.
  63. Steps in assessment: (1) define learning objectives; (2) provide teaching-learning experiences; (3) select the measuring instrument; (4) administer the test; (5) decide on the marking scheme; (6) score the test; (7) analyse the results; (8) make a final decision; (9) if the instrument proves unsuitable, choose an alternative method.
  64. The process of assessment must measure change in all three domains.
  65. Common tools: long answer or essay questions for theory (assessment of knowledge); long and short cases for clinical examination; oral examination for the assessment of practical skills; OSCE and OSPE to increase reliability; logbook and diary for attitude assessment.
  66. From low-stakes to high-stakes examinations: long essay questions give way to multiple short answer questions, and the traditional long case to a multi-station OSCE.
  67. The correlation between students' performance in examinations and performance on the job is less than ideal.
  71. Any assessment is anxiety-provoking for students (and staff), and assessment has potential positive and negative steering effects on learning and professional development.
  73. [Cycle: student learning outcomes → curriculum → teaching & learning → assessment → evaluation → change/refine as needed.]
  75. Critical questions in assessment: (1) WHY are we doing the assessment? (2) WHAT are we assessing? (3) HOW are we assessing it? (4) HOW WELL is the assessment working?
  76. WHY are we doing the assessment? What is its purpose: formative or summative?
  77. WHAT are we testing? The elements of competence: knowledge (factual, and applied, i.e. clinical reasoning), skills (communication and clinical), and attitudes (professional behaviour). (Tomorrow's Doctors, GMC 2003)
  78. HOW are we doing the assessment? Test formats mapped to Miller's pyramid: 'Knows': factual tests (SBAs/MCQs); 'Knows how': clinical context-based tests (SBAs, EMQs, SAQs); 'Shows how': performance assessment in vitro (OSCEs); 'Does': performance assessment in vivo (video, workplace-based assessment, e.g. mini-CEX, DOPS).
  79. HOW WELL is the assessment working? Evaluating assessment systems: is it valid? Is it reliable? Is it doing what it is supposed to be doing? To answer these questions, we have to consider the characteristics of assessment instruments.
  80. Principles of assessment: there is no perfect assessment; compromise is always required, and the compromise depends on the context of the assessment. The quality of assessment is a matter of the integral assessment programme rather than of the individual instruments.
  82. "No single assessment method can provide all the data required for judgment of anything so complex as the delivery of professional services by a successful physician." (George Miller, 1990)
  83. THANK YOU