3. WHY do we assess?
• To ensure the safety of patients
our responsibility to the public
• To ensure achievement of a minimum standard
our responsibility to the candidate and the University
In principle…
4. WHY do we assess?
• To ensure competence
• Often as a means of academic competition
In practice: the scope
5. WHY do we assess?
• Formative: to give feedback and advice
• Summative: to grade
• Qualifying or licensing
In practice: the purpose
6. WHAT do we measure?
To test not only the presence of knowledge
…but also the application of knowledge
In principle…
7. Several types of clinical assessment
One model is presented in some detail to
provide a framework for later discussion
8. One model of clinical assessment
• Certification of competence: pass/fail
a state (and legal) requirement
• Grading in rank order
for employment / placement purposes
• A competition for the award of a prize
In principle…a three-fold aim
9. One model of clinical assessment
Measurement of:
• adequacy of basic clinical skills
• ability to interpret clinical findings
• ability to communicate in practical settings
• ability to think analytically about diagnosis
• ability to discuss management logically
In practice…
10. Practical steps for assessment
• 6 encounters with different clinical situations
• Two examiners at every encounter, each
examiner giving an individual assessment
• Highly structured examination and detailed
assessment of skills
• Examiners from other Universities for
process evaluation and quality control
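The double-marked, multi-encounter structure lends itself to simple aggregation. The sketch below (Python) shows one possible way of combining the two independent marks per encounter into an overall result; the encounter names, mark scale and pass mark are assumptions for illustration, not the scheme used in this examination.

    # Illustrative sketch only: encounter names, 0-10 mark scale and pass mark
    # are assumptions, not the marking scheme described in these slides.
    from statistics import mean

    # Each encounter is marked independently by two examiners.
    marks = {
        "encounter_1": (7, 8),
        "encounter_2": (6, 6),
        "encounter_3": (8, 7),
        "encounter_4": (5, 6),
        "encounter_5": (7, 7),
        "encounter_6": (6, 8),
    }

    # Average the two independent assessments per encounter, then across encounters.
    encounter_scores = {name: mean(pair) for name, pair in marks.items()}
    overall = mean(encounter_scores.values())

    PASS_MARK = 6.0  # assumed threshold for the pass/fail decision
    print(f"Overall score: {overall:.2f} -> {'pass' if overall >= PASS_MARK else 'fail'}")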
11. Practical steps for assessment
• Encounters with at least six real patients
• Ability to interpret and discuss clinical data
• Management of an emergency scenario
• Appraisal of communication skills and attitude
12. Set-points for testing
• Attitude to patient
• Actual examination skills
• Presentation of findings
• Clinical judgment
Clinical examination of patients
13. Set-points for testing
• Evaluation of data
• Significance of data
• Clinical reasoning
Interpretation of clinical data
14. Set-points for testing
• Ability to solve problems
• Ability to discuss logically
• Clinical judgment and prioritization
• General medical knowledge
Management of emergency situations
15. Set-points for testing
• Attitude to ‘patient’
• Ability to communicate well
• Clinical judgment
• General medical knowledge
Communication skills
18. Evaluating outcome
• Pattern of results
• Consistency of results
• Patterns of marking
• Process shortcomings
• Basis for improvement
Analysis of data to assess effectiveness
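One simple way of examining consistency of marking is to compare the two examiners' marks at each encounter and flag large disagreements for review. The sketch below is illustrative only; the example data and the tolerance are assumptions, not figures taken from these slides.

    # Illustrative sketch only: a crude quality-control check on marking
    # consistency between the two examiners at each encounter.
    marks = {
        "encounter_1": (7, 8),
        "encounter_2": (6, 6),
        "encounter_3": (8, 4),   # large disagreement, worth reviewing
        "encounter_4": (5, 6),
    }

    TOLERANCE = 2  # assumed maximum acceptable difference between paired marks

    for name, (a, b) in marks.items():
        diff = abs(a - b)
        flag = "REVIEW" if diff > TOLERANCE else "ok"
        print(f"{name}: examiner A={a}, examiner B={b}, difference={diff} [{flag}]")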
19. What happens to candidates who fail?
• Review of performance – a formative exercise
• Counselling at a personal level
• Specific attention and individual training
• Repeat assessment after a period of time
20. Points for discussion
• Competence versus performance
• Exam based versus continuous assessment
• Methodology related issues
• Organisational issues
21. Competence versus Performance
• In relation to methods used
• Use of a core curriculum with additional
modules as required
• Eventual medical practice in different
environments
• Impact on mobility
25. Exam based vs continuous assessment
• Assessment of a modular curriculum
• Written exams: MCQs, SAQs, Essays
• Real-time exams
• Orals: clinical / table viva, OSCEs
• Clinical assessment on site: mini-CEX,
DOPS, multi-source feedback
• Log-books, portfolios, CATs
29. Role of grading
• Is there a need for grading?
• Selection process for employment
• Selection for postgraduate training
• Quality of assessment method and
performance of candidates
30. Conclusions
There is wide diversity among European medical schools
regarding methods of assessment of clinical skills:
• some schools aim at pass/fail outcomes, others use systems that
lead to grading
• several quality assurance mechanisms are used to varying degrees
There is scope for widespread application of agreed
standard methods to:
• assess clinical competences in core curricula
• assess additional competences essential to individual practice
31. UNIVERSITY OF MALTA MEDICAL SCHOOL
1676 - 2007
EU UNIVERSITY PARTNER
● joseph.cacciottolo@um.edu.mt ● josanne.vassallo@um.edu.mt