2. On the Agenda
SETTING THE SCENE
• Who? What? Where? When?
WHY?
• Assessment in Graduate Medical Education
FIVE PRACTICAL TIPS
• How we addressed the problem
USE OF ASSESSMENT RESULTS
• How were these methods useful?
DISCUSSION
• Questions?
4. Office of Medical
Education, Research, and Development
(OMERAD)
• Job Description
• Consultation and education
• What was happening with GME at
our institution?
• New Office Structure
• PhD students in ESM brought in
• Office given the reins of the clinical
research skills curricula
6. What is EBM?
Evidence-Based Medicine in GME¹,²
[Diagram: EBM as the intersection of Clinical Epidemiology, Biostatistics, and Critical Appraisal]
7. What Do We Know About Resident
Knowledge of Clinical Research Skills?
• The error rate in reporting and
interpreting statistics in medicine
is estimated at between 30% and 90%³
• Consistent findings:
• Lack of knowledge⁴,⁵
• Lack of confidence
An example…
9. TIP ONE
Know Your Situation
• Learning environment factors
• Statistics and research methods
as a topic
• No formal “courses”, nothing is
“required”
• No previous learning
objectives, syllabus, or
assessment structure
• Work environment factors
• Hospital obligations
• Attending physician buy-in &
priorities
10. TIP ONE
Know Your Situation
• Population-specific factors
• Variable background
experience
• Low average competence
and confidence
• Realities of being a
physician
• Availability of resources
• Limited time
• Limited money
11. TIP TWO
Clarify Your Purpose
• Ask two questions:
• How will the assessment
audience benefit from the
results?
• How will the students benefit
from the assessment results?
• In our case
• Audience (OMERAD, GSM
faculty/administration)
• Students
(residents, fellows, physicians, & staff)
12. TIP THREE
Use What You Have
• Gather the Necessary
Background Data
• Existing content
• Faculty interviews
• Direct observation
• Literature
• Clinical/Work experience
• Three benefits
• What instructors think the
students are learning
• What is being taught
• Where the gaps are in the
curriculum
13. TIP FOUR
Fit the Instrument to Your Purpose, Not
the Other Way Around
• Again, consider situational
factors
• Resources for types of
assessment instruments
• What worked for us
• Background knowledge probe⁶
14. TIP FIVE
Get Consistent and Critical Feedback
[Diagram: the "Feedback Loop" of Assessment Practice — Develop/Modify → Test → Feedback → Develop/Modify…]
Assessment must be viewed as a
never-ending, iterative process:
• An instrument is developed or modified
• The instrument is tested
• Testing generates feedback
• Feedback leads to modifications…
• These modifications are tested,
and the cycle repeats
17. Improvements to the Course:
Learning Objective Development
• BEFORE: a concrete "list of topics"
• AFTER: experiential learning objectives
• Multiple sources of data
• Assessment
• Evaluation
• Data identified
• Salient topics
• Missing content
• Student needs
• Need for a responsive curriculum
18. Improvements to the Assessment:
Test blueprint process used to improve the
assessment instrument
1. Start with course learning objectives
2. Identify test "topics" from learning
objectives
3. Expand each topic to as many
"concepts" as possible
4. Collapse the list of concepts to
remove redundancy
5. Create/modify items
Example — Module 2: Comparing Research
Designs. "After this module, participants
should be able to…" identify major
epidemiologic research designs and apply
each design to their own area of research.
19. Next Steps
• Continue instrument & curriculum
revisions
• Standardized assessment for
residents, fellows, physicians on
clinical research skills and
statistics.
21. References
1. Green, M. L. (2000). Evidence-based medicine training in graduate
medical education: past, present and future. Journal of Evaluation in
Clinical Practice, 6(2), 121–138. Retrieved from
http://www.ncbi.nlm.nih.gov/pubmed/10970006
2. Stewart, M.G. (2001). ACGME Core Competencies. Accreditation
Council for Graduate Medical Education. Retrieved from
http://www.acgme.org/acWebsite/RRC_280/280_coreComp.asp
3. Novack, L., Jotkowitz, A., Knyazer, B., & Novack, V. (2006). Evidence-
based medicine: assessment of knowledge of basic epidemiological and
research methods among medical doctors. Postgraduate Medical
Journal, 82(974), 817–822. Retrieved from
http://pmj.bmj.com/content/82/974/817.abstract
4. West, C. P., & Ficalora, R. D. (2007). Clinician Attitudes Toward
Biostatistics. Mayo Clinic Proceedings, 82(8), 939–943. Retrieved from
http://www.mayoclinicproceedings.com/content/82/8/939.abstract
5. Windish, D. M., Huot, S. J., & Green, M. L. (2007). Medicine Residents’
Understanding of the Biostatistics and Results in the Medical Literature.
JAMA: The Journal of the American Medical Association, 298(9),
1010–1022. Retrieved from
http://jama.ama-assn.org/content/298/9/1010.abstract
6. Angelo, T. & Cross, K.P. (1993). Classroom Assessment Techniques. San
Francisco: Jossey-Bass.
7. Fink, L.D. (2003). Creating Significant Learning Experiences: An
integrated approach to designing college courses. San Francisco, CA:
John Wiley & Sons, Inc.