Writing a good test engages both sides of your brain: test item writing is both an art and a skill, and the good news is that practice can strengthen both. Preparing multiple-choice and alternate-format questions, when done well, challenges you to draw on the sound clinical knowledge gained over years of experience. But rather than merely checking whether a student can recognize basic facts, a well-written test that measures critical thinking reveals whether students can also apply those facts to real-life situations that demand high-level decision-making and problem-solving. Because most health science instructors were clinicians first and became academic faculty members much later in their career paths, the task of constructing critical-thinking test items and reliable, valid tests can seem overwhelming. Join this discussion about honing those item-writing skills, and discover your talents for using both sides of your brain to create a great test!
2. Creating tests that measure critical thinking in nursing: Test item writing that is both art and science
Ainslie T. Nibert, PhD, RN, FAAN
February 17, 2015
3. Resources for Developing Critical Thinking Test Items and Alternate Format Items: National Council Website
• www.ncsbn.org
– NCLEX Test Plans
• 2013 RN
• 2014 PN
– Candidate FAQ
– Alternate item formats FAQ
– Exam Development FAQ
Source: https://www.ncsbn.org/2324.htm
4. Relationship between Testing & the Curriculum
Internal and External Curriculum Evaluation
Outcome Predictors
6. Internal Evaluation: Evaluation of course objectives (faculty designed or outsourced)
q Writing Critical Thinking Test Items
q Item Analysis Software & Blueprinting
q Test Item Banking & Exam Delivery
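Item analysis software, listed above as an internal-evaluation tool, typically reports two statistics per item: a difficulty index (the proportion answering correctly) and an upper-lower discrimination index (how well the item separates strong from weak students). Here is a minimal sketch of those two calculations; the function names and the student data are illustrative assumptions, not taken from any particular product:

```python
# Hypothetical item-analysis sketch (illustrative names and data only).

def difficulty(item_scores):
    """Difficulty index p: proportion of students answering the item correctly."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores, fraction=0.27):
    """Upper-lower discrimination index D = p_upper - p_lower.

    Students are ranked by total exam score; the top and bottom
    `fraction` (commonly 27%) form the comparison groups.
    """
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    n = max(1, int(len(total_scores) * fraction))
    lower = [item_scores[i] for i in ranked[:n]]   # lowest-scoring students
    upper = [item_scores[i] for i in ranked[-n:]]  # highest-scoring students
    return sum(upper) / n - sum(lower) / n

# Illustrative data: one item's 0/1 scores and total exam scores for 10 students
item = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
totals = [88, 52, 90, 75, 48, 95, 80, 55, 70, 85]

print(difficulty(item))              # 0.7 (a moderately easy item)
print(discrimination(item, totals))  # 1.0 (item cleanly separates groups)
```

A positive D means stronger students answered the item correctly more often than weaker students; items with D near zero or negative are candidates for the item revision discussed later in the session.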
7. Five Guidelines for Developing Effective Critical Thinking Exams
q Assemble the “basics.”
q Write critical thinking test items.
q Pay attention to housekeeping duties.
q Develop a test blueprint.
q Scientifically analyze all exams.
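A test blueprint crosses content areas with cognitive levels and assigns item counts to each cell. As a minimal sketch of how a blueprint can be tallied and checked against the planned exam length (the content areas, Bloom levels, and counts below are hypothetical examples, not prescribed values):

```python
# Hypothetical test blueprint: (content area, Bloom level) -> item count.
blueprint = {
    ("Pharmacology", "Applying"): 8,
    ("Pharmacology", "Analyzing"): 4,
    ("Med-Surg", "Applying"): 10,
    ("Med-Surg", "Analyzing"): 8,
    ("Fundamentals", "Remembering"): 5,
    ("Fundamentals", "Applying"): 5,
}
exam_length = 40

# Check that the blueprint accounts for every planned item.
total = sum(blueprint.values())
assert total == exam_length, f"blueprint covers {total} items, exam needs {exam_length}"

# Share of items written at the Application level or above.
higher = sum(count for (area, level), count in blueprint.items()
             if level in ("Applying", "Analyzing", "Evaluating", "Creating"))
print(f"{higher} of {total} items at Applying or above")
```

Tallying the blueprint this way makes it easy to see whether an exam leans too heavily on recall-level items before any questions are written.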
11. Bloom's Taxonomy: Benjamin Bloom, 1956 (revised)
Terminology changes: "The graphic is a representation of the new verbiage associated with the long-familiar Bloom's Taxonomy. Note the change from nouns to verbs [e.g., Application to Applying] to describe the different levels of the taxonomy. Note that the top two levels are essentially exchanged from the old to the new version." (Schultz, 2005) (Evaluation moved from the top down to Evaluating, second from the top; Synthesis moved from second from the top up to the top as Creating.)
Source: http://www.odu.edu/educ/llschult/blooms_taxonomy.htm
27. Critical Thinking Test Items
q Contain Rationale
q Written at the Application Level or Above
q Require Multilogical Thinking to Answer
q Ask for High Level of Discrimination
Source: Morrison, Nibert, & Flick (2006)
29. Written at the Application Level and Above
q Prepare students for NCLEX®
q Promote thinking about clinical problems
q Cause teaching methods to become creative
30. Require Multilogical Thinking to Answer
Definition: Thinking that requires knowledge of more than one fact to logically and systematically apply concepts to a clinical problem
32. Critical-Thinking Questions
Which intervention is most important?
Which intervention, plan, or assessment data is/are most critical to developing a plan of care?
Which intervention should be done first?
What action should the nurse take first?
Which intervention, plan, or nursing action has the highest priority?
What response is best?
33. NCLEX® Alternative Test Item Formats
• Multiple-response items
• Fill-in-the-blank items
• Hot spot
• Chart/exhibit format
• Ordered response items (ranking)
• Audio item format
• Graphic options (graphics embedded as answer options)
• Any item format, including standard multiple-choice items, may include multimedia, charts, tables, or graphic images.
34. Latest NCLEX® Test Item Format Considerations
Units of Measure
• International System of Units (SI)
• Metric
• Imperial measurement
Generic vs. Trade Names for Medications
• Generic names only in most cases
• References to general classifications of medications
36. Item Writing Rules
q Get rid of names
q Get rid of ‘multiple’ multiples
q Use a non-sexist writing style
q Develop a parsimonious writing style
Ø Cross out “of the following”
Ø Delete scenarios
q Write items independent of each other
37. … and More Rules
q Use a question format when possible
q Make distracters plausible and homogeneous
q Keep alternatives equal in length
q Avoid opposites
38. … and More Rules
q Eliminate “all of the above” and “none of the above”
q Rewrite any “all except” questions
q Ensure that alternatives do not overlap
q Vary the position of the correct answer
39. … and the MOST IMPORTANT Rule
Develop written testing policies
Ø A role of the Testing Committee
Ø Guidelines: writing style & format
Ø Exam administration procedures
41. Testing Committee Responsibilities
Recommendations are typically made to the Curriculum and Student Affairs Committees to coordinate policy creation & enforcement:
◦ Writing style
◦ Format
◦ Pilot items
◦ Grades/scores
◦ Review and analysis
◦ Item revision
◦ Track students
◦ Accreditation prep
42. Standardized Testing: Vigilance with Test Security
1. Encourage moral behavior (academic honesty program at your school, with clear language placed in handbooks)
2. Discourage cheating
a. Before testing
1. Minimize access to exams and viewing of exam content
2. Use the highest levels of security available in Blackboard for unit tests and all security features available in the standardized testing platform; protect logins & access codes; use active dashboarding
3. Train proctors for live proctoring activities
b. During testing
1. Establish a secure environment
2. Allow no deviations from test procedures or breakdown of environmental security. Ex: leaving the room means the test is over for that student, regardless of the reason
3. Vigilant proctoring
3. Detect cheating with data forensics and take action as needed
46. Have Questions? Need More Info?
Thanks for your time & attention today!
866-429-8889
47. References
American Psychological Association. (2004). Code of fair testing practices in education. Washington, DC: Joint Committee on Testing Practices. http://www.apa.org/science/programs/testing/fair-code.aspx
Morrison, S., Nibert, A., & Flick, J. (2006). Critical thinking and test item writing (2nd ed.). Houston, TX: Health Education Systems, Inc.
Morrison, S. (2004). Improving NCLEX-RN pass rates through internal and external curriculum evaluation. In M. Oermann & K. Heinrich (Eds.), Annual review of nursing education (Vol. 3). New York: Springer.
National Council of State Boards of Nursing. (2013). 2013 NCLEX-RN test plan. Chicago, IL: National Council of State Boards of Nursing. https://www.ncsbn.org/3795.htm
Nibert, A. (2010). Benchmarking for student progression throughout a nursing program: Implications for students, faculty, and administrators. In L. Caputi (Ed.), Teaching nursing: The art and science (2nd ed., Vol. 3, pp. 45-64). Chicago: College of DuPage Press.
48. For More Information:
Call: 1.866.429.8889
Email: info@examsoft.com
Visit: learn.examsoft.com