QUICK START
User’s Guide
2014
PART OF THE ASSESSMENT LITERACY SERIES
THE RIA GROUP
16407 Highland Club Avenue, Baton Rouge, LA 70817
TABLE OF CONTENTS
Introduction
Purpose
Homeroom
Phase I: Designing the Assessment
1.1 Goal Statement
1.2 Objectives
1.3 Guiding Questions
1.4 Resources
1.5 Procedural Steps
STEP 1: Create a Purpose Statement
STEP 2: Select Targeted Content Standards
STEP 3: Develop a Test Blueprint
1.6 Quality Reviews
Phase II: Building the Assessment
2.1 Goal Statement
2.2 Objectives
2.3 Guiding Questions
2.4 Resources
2.5 Procedural Steps
STEP 4: Item Stems/Task Prompts
STEP 5: Scoring Keys/Scoring Rubrics
STEP 6: Test Forms
2.6 Quality Reviews
Phase III: Reviewing the Assessment
3.1 Goal Statement
3.2 Objectives
3.3 Guiding Questions
3.4 Resources
3.5 Procedural Steps
STEP 7: Item/Task Reviews
STEP 8: Alignment and Performance Level Reviews
STEP 9: Data Reviews
STEP 10: Refinements
3.6 Quality Reviews
Quick Start©
User’s Guide
Introduction
The purpose of this document is to provide guidance for developing measures of student
performance that will meet the criteria within the Performance Measure Rubric. The rubric is a
self-assessment tool used to ascertain the technical quality of locally-developed performance
measures. The process used to “design”, “build”, and “review” teacher-made performance
measures is contained within the Quick Start program. Quick Start delivers a
foundational understanding of the procedures necessary to create these performance measures,
which teachers may then use to assess their students’ skills, knowledge, and concept mastery of
targeted content standards.
Figure 1. Process Components
Design: Purpose Statement, Targeted Content Standards, Test Blueprint
Build: Items/Tasks, Scoring Keys & Scoring Rubrics, Test Forms
Review: Item/Task Reviews, Alignment Reviews, Data Reviews, Refinements
Purpose
This document guides educators in the development of performance measures in three
phases: Design, Build, and Review. Each phase includes customized training and educator-
friendly tools to ensure that the performance measures meet the criteria within the Performance
Measure Rubric. This rubric, which helps determine the technical quality of performance
measures, follows a structure similar to the training process used in developing student learning
objectives (i.e., Design, Build, and Review). Educators have the flexibility to work through
the full process, from Orientation through Review, or to begin with the Review phase alone,
based upon their needs and experience with assessment development.
Homeroom
Homeroom is the learning platform that brings this effective training right to your
fingertips. To access the training and documents necessary for creating high-quality
performance measures, visit www.pdehr.riagroup2013.com. The training is accessible from
any device, whether a tablet, phone, or PC.
When accessing Homeroom for the first time, the user will need to register through the
Homeroom login screen. In the event of a lost password, username, or other questions,
the user may contact the Help Desk through email at helpdesk@riagroup2013.com or call
toll free at 1.855.787.9446 (see Figure 2 below).
Figure 2. Homeroom Login Screen
The home page offers the user the Quick Start icon option as shown below. The first
option, “I am a Teacher”, is oriented to teachers completing the SLO Process. The second
option, “I am a School Leader”, is designed for principals, superintendents, etc. The Quick
Start icon expands as shown in Figure 3 below to offer the user options.
Figure 3. User Options
Each phase of the Quick Start Process (Design, Build, and Review) contains the
components listed below. The TRAINING > VIEW THE TRAINING component provides the
user with PowerPoints and videos instructing the user in assessment creation. The TEMPLATES
> CREATE YOUR OWN component provides templates for the user to download and utilize in
developing effective student learning objectives. The RESOURCES > HELPFUL MATERIALS
component provides guides and other resources to enhance the Quick Start Process
experience (see Figure 4 below).
Figure 4. Quick Start Process Components
Phase I: Designing the Assessment
1.1 Goal Statement
 Understand and apply the techniques used to design measures of student performance.
1.2 Objectives
 The professional will successfully:
o Create a purpose statement for a specific performance measure.
o Identify content standard(s) that represent the Enduring Understanding/Key
Concept within the content area.
o Develop a test blueprint outlining the performance measure’s structure.
1.3 Guiding Questions
 What is the performance measure intended to measure and at what grade?
 What are the developmental characteristics of test-takers?
 Which areas will be targeted among the various content standards?
 How will educators use the results (overall score and “growth” inferences)?
 When will the performance measure be administered?
 Do the items/tasks capture the content standards within the key concept?
 Is the number of items/tasks sufficient so that students at varying levels can demonstrate
their knowledge?
 What are the time demands for both teachers and students?
 How does the design reflect the areas of emphasis in the standards?
1.4 Resources
Training | Templates | Resources
M1-Designing the Assessment | Template #1-Designing the Assessment | HO #1-Designing the Assessment-Examples; Cognitive Demand Crosswalk
1.5 Procedural Steps
Create a Purpose Statement
Step 1. Individually create a statement about the performance measure in terms of the
content standards it will purport to measure.
Step 2. Build consensus by focusing on three components of the statement: What, How,
Why.
Step 3. Draft three sentences reflecting the group’s consensus for each component, and
review.
Step 4. Merge sentences to create a single paragraph “statement”. Again, review to
ensure that the statement reflects the group’s intent.
Step 5. Finalize the statement and double-check for editorial soundness.
Select Targeted Content Standards
Step 1. Place the course/subject’s name and Enduring Understanding/Key Concept
statement above the Targeted Content Standards table.
Step 2. Place the code for each standard/content strand in the Content ID column along
with the description for each content standard in the Content Statement column.
Step 3. Have subject matter experts work collaboratively to identify an initial (i.e., draft)
set of content standards associated with the Enduring Understanding/Key
Concept.
Step 4. Review the list of targeted content standards and look for gaps and/or
redundancies and then finalize the list by placing an “X” in the Final column.
Step 5. Verify that the “final” targeted content standards will be those used to develop the
test blueprint.
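
To make the table's structure concrete, the sketch below represents it as a small data structure. This is illustrative only; the content IDs and statements are hypothetical and are not drawn from the template.

```python
# A minimal sketch of the Targeted Content Standards table (Steps 1-5).
# The content IDs and statements below are hypothetical examples.

targeted_standards = [
    {"content_id": "STD.1.A", "content_statement": "Explain concept X using evidence.", "final": True},
    {"content_id": "STD.1.B", "content_statement": "Apply procedure Y to novel problems.", "final": True},
    {"content_id": "STD.2.C", "content_statement": "Recall terminology for topic Z.", "final": False},
]

# Step 5: only the standards marked "X" in the Final column (final=True here)
# carry forward into the test blueprint.
final_standards = [s for s in targeted_standards if s["final"]]
for s in final_standards:
    print(f'{s["content_id"]}: {s["content_statement"]}')
```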
Develop a Test Blueprint
Step 1. Review the targeted content standards identified in STEP 2: Select Targeted Content Standards.
Step 2. Insert selected Enduring Understanding/Key Concept and targeted content
standards (numeric code only) into the test blueprint table.
Step 3. Determine the number of items/tasks across the four cognitive levels.
Step 4. Tally the rows and place the values in the Total column. Tally each cognitive
level column and place the resultant values in the Grand Totals row.
Step 5. Report the total number of items/tasks and the total possible points available.
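
Because the blueprint is simply a table of item/task counts, the tallies in Steps 4 and 5 can be checked mechanically. Below is a minimal sketch; the four cognitive-level names follow common Depth of Knowledge labels, and the standards and counts are hypothetical.

```python
# A minimal sketch of the test-blueprint tallies in Steps 3-5.
# Cognitive-level names and item counts below are hypothetical.

cognitive_levels = ["Recall", "Skill/Concept", "Strategic Thinking", "Extended Thinking"]

# Items/tasks per targeted content standard (numeric code only), per cognitive level.
blueprint = {
    "1.1": [3, 2, 1, 0],
    "1.2": [2, 3, 1, 1],
    "1.3": [1, 2, 2, 1],
}

# Step 4: tally each row (Total column) and each column (Grand Totals row).
row_totals = {std: sum(counts) for std, counts in blueprint.items()}
grand_totals = [sum(col) for col in zip(*blueprint.values())]

# Step 5: report the total number of items/tasks.
print("Totals per standard:", row_totals)
print("Grand totals per level:", dict(zip(cognitive_levels, grand_totals)))
print("Total items/tasks:", sum(grand_totals))
```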
1.6 Quality Reviews
The Performance Measure Rubric is designed to help the educator review items/tasks,
scoring rubrics, and assessment forms to create high-quality performance measures. Strand 1 of
the Performance Measure Rubric evaluates the Design phase of the assessment process (purpose
statement, targeted content standards, and test blueprint). Refer to Handout #3 Performance
Measure Rubric-Scored Example for more information.
Task ID | Descriptor | Rating | Evidence
1.1 | The purpose of the performance measure is explicitly stated (who, what, why). | |
1.2 | The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate. | |
1.3 | The performance measure's design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills. | |
1.4 | Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure, OR blueprints are used to align items/tasks to targeted content standards. | |
1.5 | Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantities to measure the depth and breadth of the targeted content standards. | |
Strand 1 Summary: ___ out of 5
Phase II: Building the Assessment
2.1 Goal Statement
 Understand and apply the techniques used to build measures of student performance.
2.2 Objectives
 The professional will successfully:
o Create the necessary items/tasks to address the test blueprint.
o Develop scoring keys and/or scoring rubrics.
o Organize items/tasks and administration guidelines into a test form.
2.3 Guiding Questions
 Are the items aligned with targeted content standards?
 Do the selected items/tasks allow students to demonstrate content knowledge by:
o Responding to questions and/or prompts?
o Performing tasks, actions, and/or demonstrations?
 Do the items/tasks measure content knowledge, skill, or process and not an external or
environmental factor (e.g., guessing)?
 Is the number of items/tasks sufficient to sample the targeted content?
 Are the items/tasks developmentally appropriate for the intended test-takers?
 Are the correct answers and/or expected responses clearly identified?
 Do the performance measure’s directions specify:
o What the test-taker should do, read, or analyze?
o Where and how the test-taker should respond or demonstrate the task?
o How many points a correct/complete response is worth towards the overall score?
 Are there directions for different item/task types?
2.4 Resources
Training | Templates | Resources
M2-Building the Assessment | Template #2-Building the Assessment | Performance Task Framework; HO #2-Building the Assessment; Model #1-Art Grade 5-DEMO; Model #2-Grade 8 Pre-Algebra-DEMO; Model #3-Nutrition Culinary, Level III-DEMO; Cognitive Demand Crosswalk
2.5 Procedural Steps
2.5.1 Item Stems/Task Prompts
Multiple Choice (MC) Items
Step 1. Review the targeted content standard.
Step 2. Determine which aspects of the standard can be measured objectively.
Step 3. Select the focused aspect and determine the cognitive demand reflected in the
standard’s description.
Step 4. Create a question (stem), one correct answer, and plausible (realistic) distractors.
Step 5. Review the item and answer options for grammatical soundness.
Short Answer (SA) Items
Step 1. Review the targeted content standard(s).
Step 2. Determine which aspects of the standard(s) can be best measured by having
students “construct” a short response.
Step 3. Select and list aspects of the targeted content standard(s) to be measured.
Step 4. Create a prompt, select a passage, or develop a scenario for students.
Step 5. Develop a clear statement that articulates specific criteria for the test-taker to
provide.
Extended Answer (EA) Tasks
Step 1. Review the targeted content standard(s).
Step 2. Determine which aspects of the standard(s) can be best measured by having
students “construct” an extended response to a given prompt, scenario, or passage.
Step 3. Select and list all aspects of the targeted content standard(s) to be measured.
Step 4. Create a prompt, select a passage, or develop a scenario for students.
Step 5. Develop a clear statement for each subordinate task that articulates specific
criteria for the test-taker to provide.
Extended Performance (EP) Tasks
Step 1. Review the targeted content standard(s).
Step 2. Determine which aspects of the standard(s) can be best measured by having
students develop a complex response, demonstration, or performance over an
extended period of time (e.g., two weeks).
Step 3. Select and list all aspects of the targeted content standard(s) to be measured.
Step 4. Create a project, portfolio, or demonstration expectation statement that includes
subordinate tasks, which are aligned to the test blueprint.
Step 5. Develop a clear statement for each subordinate task that articulates specific
criteria for the test-taker to provide.
2.5.2 Scoring Keys/Scoring Rubrics
MC Items Score Key
Step 1. Enter the assessment information at the top of the Scoring Key.
Step 2. Record the item number, item tag (optional), item type, and point value.
Step 3. Record the MC answers in the Answer column.
Step 4. Repeat Steps 2 and 3 until all items on the test blueprint are reflected within the
Scoring Key.
Step 5. Validate that each question-to-answer relationship is recorded correctly.
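
A scoring key is essentially a lookup from item number to answer and point value, so the validation in Step 5 and the eventual scoring can be automated. A minimal sketch, with hypothetical items, tags, and answers:

```python
# A minimal sketch of an MC scoring key (Steps 1-5). All entries are hypothetical.

scoring_key = {
    # item number: {item tag (optional), item type, point value, answer}
    1: {"tag": "STD.1.A", "type": "MC", "points": 1, "answer": "C"},
    2: {"tag": "STD.1.B", "type": "MC", "points": 1, "answer": "A"},
}

def score_mc(responses, key):
    """Return the raw MC score for one test-taker's responses."""
    return sum(entry["points"]
               for item, entry in key.items()
               if responses.get(item) == entry["answer"])

# Step 5: validate each question-to-answer relationship before scoring.
assert all(entry["answer"] in "ABCD" for entry in scoring_key.values())
print(score_mc({1: "C", 2: "B"}, scoring_key))  # -> 1
```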
Short Answer/Extended Answer/Extended Performance Scoring Rubrics
Step 1. Review the SA, EA, or EP task and the criteria articulated in the stem/directions.
Step 2. Select a “generic” rubric structure (see Template #2: Building the Assessment)
based upon scoring criteria and the number of dimensions being measured.
Step 3. Modify the rubric language using specific criteria expected in the response to
award the maximum number of points.
Step 4. Determine how much the response can deviate from “fully correct” in order to
earn the next (lower) point value. [Continue until the full range of possible scores
is described.]
Step 5. During the review, ensure the response expectation, scoring rubric, and test
blueprint are fully aligned.
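
Conceptually, the rubric produced by Steps 2-4 is a table of descriptors indexed by dimension and point value, with each lower value describing a larger deviation from "fully correct". The sketch below illustrates that structure; the dimensions and descriptors are hypothetical and are not taken from Template #2.

```python
# A minimal sketch of a generic scoring rubric (Steps 2-4).
# Dimensions and descriptors below are hypothetical.

rubric = {
    "Content Accuracy": {
        3: "All required criteria present and correct.",
        2: "Most criteria present; minor errors.",
        1: "Few criteria present; major errors.",
        0: "No relevant response.",
    },
    "Use of Evidence": {
        3: "Specific, relevant evidence supports every claim.",
        2: "Evidence supports most claims.",
        1: "Evidence is vague or mostly missing.",
        0: "No evidence provided.",
    },
}

# The maximum possible score is the top point value summed across dimensions.
max_score = sum(max(levels) for levels in rubric.values())
print("Maximum rubric score:", max_score)  # -> 6
```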
2.5.3 Administration Guidelines
Step 1. Create a series of administrative steps for before, during, and after the assessment
window.
Step 2. Explain any requirements or equipment necessary, including accommodations.
List any ancillary materials (e.g., calculators) that test-takers need or are
allowed to use.
Step 3. Identify the approximate time afforded to complete the assessment, including
each subtask in an EP task.
Step 4. Include detailed “scripts” articulating exactly what is to be communicated to
students, especially when administering performance tasks over a long period of
time.
Step 5. Include procedures for scoring, administering make-ups, and handling completed
assessments.
2.5.4 Test Forms
Step 1. Develop a cover page that identifies the test form and includes any necessary
demographic information (e.g., section number, student name, date administered,
etc.).
Step 2. Organize the items/tasks/prompts in a sequence that will maximize student
engagement.
Step 3. Add item tags and section dividers (optional).
Step 4. Refine the draft form to minimize “blank space”, verify picture, graph, table, and
figure placement in relationship to the item/task, and ensure MC answer options
do not drift from one page to the next.
Step 5. Add scoring rubric or criteria for constructed response tasks.
2.6 Quality Reviews
The Performance Measure Rubric is designed to help the educator review items/tasks,
scoring rubrics, and assessment forms to create high-quality performance measures. Strand 2 of
the Performance Measure Rubric evaluates the Build phase of the assessment process (refer to
Handout #3-Performance Measure Rubric-Scored Example for more information).
Task ID | Descriptor | Rating | Evidence
2.1 | Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (e.g., short constructed response, writing prompts, performance tasks, etc.). | |
2.2 | Items/tasks are created and reviewed in terms of: (a) alignment to the targeted content standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness. | |
2.3 | Administration guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations. | |
2.4 | Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score. | |
2.5 | Summary scores are reported using both raw score points and a performance level. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote performance levels. | |
2.6 | The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school). | |
Strand 2 Summary: ___ out of 6
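
Descriptor 2.5 calls for reporting both raw score points and a performance level. One common way to do this is with cut scores that partition the raw-score range; the sketch below is illustrative only, and the level names and cut points are hypothetical.

```python
# A minimal sketch of mapping raw scores to performance levels (descriptor 2.5).
# Level names and cut scores below are hypothetical.

cut_scores = [(18, "Advanced"), (13, "Proficient"), (8, "Basic"), (0, "Below Basic")]

def performance_level(raw_score):
    """Return the first level whose cut score the raw score meets or exceeds."""
    for cut, level in cut_scores:
        if raw_score >= cut:
            return level
    raise ValueError("raw_score below the lowest cut score")

print(performance_level(15))  # -> "Proficient"
```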
Phase III: Reviewing the Assessment
3.1 Goal Statement
 Understand and apply the techniques used to review and refine measures of student
performance.
3.2 Objectives
 The professional will successfully:
o Review developed items/tasks for validity threats, content match, and cognitive match;
o Examine test alignment to ensure: (a) all items/tasks match the skills, knowledge,
and concepts in the targeted content standards; (b) all rubrics match the targeted
content standards; and, (c) all items/tasks reflect higher order thinking.
Note: A supplement to this document will address alignment (Step 8), data reviews (Step 9), and
refinement (Step 10), which were addressed in the Orientation presentation.
3.3 Guiding Questions
 Does each item/task clearly address the standard?
 Are the reading difficulty and vocabulary appropriate?
 Is the language clear, consistent, and understandable?
 Are charts, tables, graphs, and diagrams clear and understandable?
 Is there only one (1) correct answer?
 Have the items been reviewed for bias and sensitivity?
o Items provide an equal opportunity for all students to demonstrate their
knowledge and skills. The stimulus material (e.g., reading passage, artwork, and
diagram) does not raise bias and/or sensitivity concerns that would interfere with
the performance of a particular group of students.
 Are the items developmentally appropriate for test-takers?
 Does the blueprint reflect the test form?
 Does the scoring rubric provide detailed scoring information?
 Does the assessment have at least two (2) performance levels?
3.4 Resources
Training | Templates | Resources
M3-Reviewing the Assessment | Template #3-Performance Measure Rubric | HO #3-Reviewing the Assessment-Scored Example
3.5 Procedural Steps
Item/Task Reviews
Step 1. Identify at least one other teacher to assist in the review (best accomplished by
department or grade-level committees).
Step 2. Organize the test form, answer key, and/or scoring rubrics, and Handout #3-
Reviewing the Assessment-Scored Example.
Step 3. Read each item/task and highlight any “potential” issues in terms of content
accuracy, potential bias, sensitive materials, fairness, and developmental
appropriateness.
Step 4. After reviewing the entire test form, including scoring rubrics, revisit the
highlighted items/tasks. Determine if the item/tasks can be rewritten or must be
replaced.
Step 5. Print revised assessment documents and conduct an editorial review, ensuring
readability, sentence/passage complexity, and word selection are grammatically
sound. Take corrective actions prior to finalizing the documents.
Alignment and Performance Level Reviews
Step 1. Identify at least one other teacher to assist in the alignment review (best
accomplished by department or grade-level committees).
Step 2. Organize items/tasks, test blueprint, and targeted content standards.
Step 3. Read each item/task to verify that it matches the standards in both content and
cognitive demand. For SA, EA, and EP tasks, ensure that scoring
rubrics are focused on specific content-based expectations. Refine any identified
issues.
Step 4. After reviewing all items/tasks, including scoring rubrics, count the number of
item/task points assigned to each targeted content standard. Determine the
percentage of item/task points per targeted content standard based upon the total
available. Identify any shortfalls in which too few points are assigned to a
standard listed in the test blueprint. Refine if patterns do not reflect those in the
standards.
Step 5. Using the item/task distributions, determine whether the assessment has at least
five (5) points for each targeted content standard and if points are attributed to
only developmentally appropriate items/tasks. Refine if point sufficiency does
not reflect the content standards.
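
The tallies in Steps 4 and 5 above are straightforward to automate. A minimal sketch that counts points per targeted content standard, computes each standard's share of the total, and flags any standard with fewer than five points; the item-to-standard assignments and point values are hypothetical.

```python
# A minimal sketch of the point tallies in Steps 4-5 (hypothetical assignments).

items = [
    # (item/task id, targeted standard, points)
    ("Q1", "1.1", 1), ("Q2", "1.1", 1), ("Q3", "1.2", 1),
    ("T1", "1.1", 4), ("T2", "1.2", 2),
]

points_per_standard = {}
for _, standard, points in items:
    points_per_standard[standard] = points_per_standard.get(standard, 0) + points

total = sum(points_per_standard.values())
for standard, pts in sorted(points_per_standard.items()):
    share = 100 * pts / total
    flag = "  <-- fewer than 5 points; refine" if pts < 5 else ""
    print(f"{standard}: {pts} pts ({share:.0f}%){flag}")
```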
Data Reviews
Step 1. Conduct data reviews after test-takers have engaged in the assessment procedures.
Step 2. Focus on data about the items/tasks, performance levels, score distribution,
administration guidelines, etc.
Step 3. Evaluate technical quality by examining aspects such as: rater reliability, internal
consistency, intra-domain correlations, decision-consistency, measurement error,
etc.
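
Among the technical-quality aspects listed in Step 3, internal consistency is often summarized with Cronbach's alpha. The sketch below computes it from an item-score matrix; the scores are hypothetical, and a real data review would use the full set of test-taker responses.

```python
# A minimal sketch of Cronbach's alpha, one common internal-consistency
# statistic mentioned in Step 3. The score matrix below is hypothetical.
import numpy as np

# Rows = test-takers, columns = items; entries are item scores.
scores = np.array([
    [1, 1, 2, 0],
    [1, 0, 1, 1],
    [0, 1, 2, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 0],
])

k = scores.shape[1]                          # number of items
item_vars = scores.var(axis=0, ddof=1)       # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```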
Refinements
Step 1. Complete refinements prior to the beginning of the next assessment cycle.
Step 2. Analyze results from the prior assessment to identify areas of improvement.
Step 3. Consider item/task replacement or augmentation to address areas of concern.
Step 4. Strive to include at least 20% new items/tasks, or implement an item/task tryout
approach.
Step 5. Create two parallel forms (i.e., Form A and B) for test security purposes.
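
The 20% target in Step 4 is simple arithmetic; a minimal sketch with a hypothetical item count:

```python
import math

# A minimal sketch of the 20% item-refresh target in Step 4 (hypothetical count).
current_items = 32
new_items_needed = math.ceil(0.20 * current_items)  # round up so the target is met
print(f"Replace or add at least {new_items_needed} of {current_items} items/tasks.")
```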
3.6 Quality Reviews
The Performance Measure Rubric is designed to help the educator review items/tasks,
scoring rubrics, and assessment forms to create high-quality performance measures. Strand 3 of
the Performance Measure Rubric evaluates the Review phase of the assessment process (refer to
Handout #3-Performance Measure Rubric-Scored Example for more information).
Task ID | Descriptor | Rating | Evidence
3.1 | The performance measures are reviewed in terms of design fidelity: items/tasks are distributed based upon the design properties found within the specification or blueprint documents; item/task and form statistics are used to examine levels of difficulty, complexity, distracter quality, and other properties; and items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics. | |
3.2 | The performance measures are reviewed in terms of editorial soundness, while ensuring consistency and accuracy of other documents (e.g., administration guidelines): the review identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements; applies Universal Design principles; and ensures linguistic demands and/or readability are developmentally appropriate. | |
3.3 | The performance measures are reviewed in terms of alignment characteristics: pattern consistency (within specifications and/or blueprints), matching the targeted content standards, cognitive demand, and developmental appropriateness. | |
3.4 | Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area. | |
3.5 | As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc. | |
3.6 | The performance measure has score validity evidence demonstrating that item responses are consistent with content specifications. Data suggest the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and the alignment characteristics of the performance measure, are collected. | |
3.7 | Reliability coefficients are reported for the performance measure, which includes estimating internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics such as classification accuracy, rater reliabilities, and others are calculated and reviewed. | |
Strand 3 Summary: ___ out of 7
Note: A supplement to this document will address Tasks 3.5, 3.6, and 3.7 of the Performance Measure
Rubric.

More Related Content

What's hot

Preparing training session
Preparing training sessionPreparing training session
Preparing training session
Naveed Younas
 
Download-manuals-training-trainingdevelopmentmodule
 Download-manuals-training-trainingdevelopmentmodule Download-manuals-training-trainingdevelopmentmodule
Download-manuals-training-trainingdevelopmentmodule
hydrologywebsite1
 
Modular or part time learning program
Modular or part time learning programModular or part time learning program
Modular or part time learning program
nihal dias
 
PMP.Essentials.150121
PMP.Essentials.150121PMP.Essentials.150121
PMP.Essentials.150121
Rakesh P
 

What's hot (14)

Training design document - Template 2
Training design document - Template 2Training design document - Template 2
Training design document - Template 2
 
Preparing training session
Preparing training sessionPreparing training session
Preparing training session
 
E portfolio pedagogy pharmacy2021-september2020
E portfolio pedagogy pharmacy2021-september2020E portfolio pedagogy pharmacy2021-september2020
E portfolio pedagogy pharmacy2021-september2020
 
Basic 1
Basic 1Basic 1
Basic 1
 
Download-manuals-training-trainingdevelopmentmodule
 Download-manuals-training-trainingdevelopmentmodule Download-manuals-training-trainingdevelopmentmodule
Download-manuals-training-trainingdevelopmentmodule
 
MBA760 Chapter 08
MBA760 Chapter 08MBA760 Chapter 08
MBA760 Chapter 08
 
E portfolio pedagogy pharmacy2021-september2020
E portfolio pedagogy pharmacy2021-september2020E portfolio pedagogy pharmacy2021-september2020
E portfolio pedagogy pharmacy2021-september2020
 
management winc help Australia
management winc help Australiamanagement winc help Australia
management winc help Australia
 
Modular or part time learning program
Modular or part time learning programModular or part time learning program
Modular or part time learning program
 
PMP.Essentials.150121
PMP.Essentials.150121PMP.Essentials.150121
PMP.Essentials.150121
 
Project management for instructional designers
Project management for instructional designersProject management for instructional designers
Project management for instructional designers
 
Innovation through evaluation and quality development of in-company training ...
Innovation through evaluation and quality development of in-company training ...Innovation through evaluation and quality development of in-company training ...
Innovation through evaluation and quality development of in-company training ...
 
Stomp e portfolio - P. Akialis
Stomp e portfolio - P. AkialisStomp e portfolio - P. Akialis
Stomp e portfolio - P. Akialis
 
B24 t043 performance_testing
B24 t043 performance_testingB24 t043 performance_testing
B24 t043 performance_testing
 

Similar to Quick Start Users Guide-June 2014-Working Draft

Template #6-Performance Measure Rubric-May 2014-Final
Template #6-Performance Measure Rubric-May 2014-FinalTemplate #6-Performance Measure Rubric-May 2014-Final
Template #6-Performance Measure Rubric-May 2014-Final
Research in Action, Inc.
 
Template #3-Performance Measure Rubric-June 2014-FINAL
Template #3-Performance Measure Rubric-June 2014-FINALTemplate #3-Performance Measure Rubric-June 2014-FINAL
Template #3-Performance Measure Rubric-June 2014-FINAL
Research in Action, Inc.
 
M1-Designing the Assessment-June 2014-FINAL
M1-Designing the Assessment-June 2014-FINALM1-Designing the Assessment-June 2014-FINAL
M1-Designing the Assessment-June 2014-FINAL
Research in Action, Inc.
 
HO #3-Reviewing the Assessment-Scored Example-June 2014-FINAL
HO #3-Reviewing the Assessment-Scored Example-June 2014-FINALHO #3-Reviewing the Assessment-Scored Example-June 2014-FINAL
HO #3-Reviewing the Assessment-Scored Example-June 2014-FINAL
Research in Action, Inc.
 
Template #1-Designing the Assessment-June 2014-FINAL
Template #1-Designing the Assessment-June 2014-FINALTemplate #1-Designing the Assessment-June 2014-FINAL
Template #1-Designing the Assessment-June 2014-FINAL
Research in Action, Inc.
 
Assessment_Basics[1]
Assessment_Basics[1]Assessment_Basics[1]
Assessment_Basics[1]
'Gbenga Aina
 

Similar to Quick Start Users Guide-June 2014-Working Draft (20)

Template #6-Performance Measure Rubric-May 2014-Final
Template #6-Performance Measure Rubric-May 2014-FinalTemplate #6-Performance Measure Rubric-May 2014-Final
Template #6-Performance Measure Rubric-May 2014-Final
 
QS M1-Designing the Assessment-22JAN14
QS M1-Designing the Assessment-22JAN14QS M1-Designing the Assessment-22JAN14
QS M1-Designing the Assessment-22JAN14
 
Template #3-Performance Measure Rubric-June 2014-FINAL
Template #3-Performance Measure Rubric-June 2014-FINALTemplate #3-Performance Measure Rubric-June 2014-FINAL
Template #3-Performance Measure Rubric-June 2014-FINAL
 
M1-Designing the Assessment-June 2014-FINAL
M1-Designing the Assessment-June 2014-FINALM1-Designing the Assessment-June 2014-FINAL
M1-Designing the Assessment-June 2014-FINAL
 
M0-Orientation-June 2014-FINAL
M0-Orientation-June 2014-FINALM0-Orientation-June 2014-FINAL
M0-Orientation-June 2014-FINAL
 
M0-Orientation-JAN2014-Final
M0-Orientation-JAN2014-FinalM0-Orientation-JAN2014-Final
M0-Orientation-JAN2014-Final
 
M0 School Leader Orientation-Demo Site
M0 School Leader Orientation-Demo SiteM0 School Leader Orientation-Demo Site
M0 School Leader Orientation-Demo Site
 
Template #3a performance measure rubric-final-la
Template #3a performance measure rubric-final-laTemplate #3a performance measure rubric-final-la
Template #3a performance measure rubric-final-la
 
HO #3-Reviewing the Assessment-Scored Example-June 2014-FINAL
HO #3-Reviewing the Assessment-Scored Example-June 2014-FINALHO #3-Reviewing the Assessment-Scored Example-June 2014-FINAL
HO #3-Reviewing the Assessment-Scored Example-June 2014-FINAL
 
Template #1-Designing the Assessment-June 2014-FINAL
Template #1-Designing the Assessment-June 2014-FINALTemplate #1-Designing the Assessment-June 2014-FINAL
Template #1-Designing the Assessment-June 2014-FINAL
 
M1-Designing-SLOs-13NOV13
M1-Designing-SLOs-13NOV13M1-Designing-SLOs-13NOV13
M1-Designing-SLOs-13NOV13
 
ePortfolios for Assessment and Measurement
ePortfolios for Assessment and MeasurementePortfolios for Assessment and Measurement
ePortfolios for Assessment and Measurement
 
Assessment_Basics[1]
Assessment_Basics[1]Assessment_Basics[1]
Assessment_Basics[1]
 
SLO for teachers
SLO for teachersSLO for teachers
SLO for teachers
 
M0 school leader orientation-final
M0 school leader orientation-finalM0 school leader orientation-final
M0 school leader orientation-final
 
M3 reviewing the slo-sso-final
M3 reviewing the slo-sso-finalM3 reviewing the slo-sso-final
M3 reviewing the slo-sso-final
 
M3-Reviewing the SLO-SSO-DemoSite
M3-Reviewing the SLO-SSO-DemoSiteM3-Reviewing the SLO-SSO-DemoSite
M3-Reviewing the SLO-SSO-DemoSite
 
TRAINING DESIGN
TRAINING DESIGNTRAINING DESIGN
TRAINING DESIGN
 
M0 Orientation to the SLO-SSO-DemoSite
M0 Orientation to the SLO-SSO-DemoSiteM0 Orientation to the SLO-SSO-DemoSite
M0 Orientation to the SLO-SSO-DemoSite
 
M0 orientation to the slo-sso-final
M0 orientation to the slo-sso-finalM0 orientation to the slo-sso-final
M0 orientation to the slo-sso-final
 

More from Research in Action, Inc.

Performance Task Framework-June 2014-FINAL
Performance Task Framework-June 2014-FINALPerformance Task Framework-June 2014-FINAL
Performance Task Framework-June 2014-FINAL
Research in Action, Inc.
 
Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)
Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)
Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)
Research in Action, Inc.
 
Model #3-Nutrition Culinary-Level III-DEMO-FINAL
Model #3-Nutrition Culinary-Level III-DEMO-FINALModel #3-Nutrition Culinary-Level III-DEMO-FINAL
Model #3-Nutrition Culinary-Level III-DEMO-FINAL
Research in Action, Inc.
 
M3-Reviewing the Assessment-June 2014-FINAL
M3-Reviewing the Assessment-June 2014-FINALM3-Reviewing the Assessment-June 2014-FINAL
M3-Reviewing the Assessment-June 2014-FINAL
Research in Action, Inc.
 
Cognitive Demand Crosswalk-June 2014-FINAL
Cognitive Demand Crosswalk-June 2014-FINALCognitive Demand Crosswalk-June 2014-FINAL
Cognitive Demand Crosswalk-June 2014-FINAL
Research in Action, Inc.
 
HO #2-Building the Assessment-Examples-June 2014-FINAL
HO #2-Building the Assessment-Examples-June 2014-FINALHO #2-Building the Assessment-Examples-June 2014-FINAL
HO #2-Building the Assessment-Examples-June 2014-FINAL
Research in Action, Inc.
 
Template #2-Building the Assessment-June 2014-FINAL
Template #2-Building the Assessment-June 2014-FINALTemplate #2-Building the Assessment-June 2014-FINAL
Template #2-Building the Assessment-June 2014-FINAL
Research in Action, Inc.
 
M2-Building the Assessment-June 2014-FINAL
M2-Building the Assessment-June 2014-FINALM2-Building the Assessment-June 2014-FINAL
M2-Building the Assessment-June 2014-FINAL
Research in Action, Inc.
 

More from Research in Action, Inc. (20)

M2-Building the SLO-SSO-DemoSite
M2-Building the SLO-SSO-DemoSiteM2-Building the SLO-SSO-DemoSite
M2-Building the SLO-SSO-DemoSite
 
M1 Designing the SLO-SSO-Demo Site
M1 Designing the SLO-SSO-Demo SiteM1 Designing the SLO-SSO-Demo Site
M1 Designing the SLO-SSO-Demo Site
 
Template #3 coherency rubric-final-jp
Template #3 coherency rubric-final-jpTemplate #3 coherency rubric-final-jp
Template #3 coherency rubric-final-jp
 
Template #2c building the sso-final
Template #2c building the sso-finalTemplate #2c building the sso-final
Template #2c building the sso-final
 
Template #2a building the slo-final
Template #2a building the slo-finalTemplate #2a building the slo-final
Template #2a building the slo-final
 
Template #1 designing the slo-sso-final
Template #1 designing the slo-sso-finalTemplate #1 designing the slo-sso-final
Template #1 designing the slo-sso-final
 
M2 building the slo-sso-final
M2 building the slo-sso-finalM2 building the slo-sso-final
M2 building the slo-sso-final
 
M1 designing the slo-sso-final
M1 designing the slo-sso-finalM1 designing the slo-sso-final
M1 designing the slo-sso-final
 
Educator evaluation policy overview-final
Educator evaluation policy overview-finalEducator evaluation policy overview-final
Educator evaluation policy overview-final
 
Performance Task Framework-June 2014-FINAL
Performance Task Framework-June 2014-FINALPerformance Task Framework-June 2014-FINAL
Performance Task Framework-June 2014-FINAL
 
Depth of Knowledge Chart
Depth of Knowledge ChartDepth of Knowledge Chart
Depth of Knowledge Chart
 
Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)
Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)
Assessment Selection Paper-Herman_Heritage_Goldschmidt (2011)
 
Model #3-Nutrition Culinary-Level III-DEMO-FINAL
Model #3-Nutrition Culinary-Level III-DEMO-FINALModel #3-Nutrition Culinary-Level III-DEMO-FINAL
Model #3-Nutrition Culinary-Level III-DEMO-FINAL
 
M3-Reviewing the Assessment-June 2014-FINAL
M3-Reviewing the Assessment-June 2014-FINALM3-Reviewing the Assessment-June 2014-FINAL
M3-Reviewing the Assessment-June 2014-FINAL
 
Cognitive Demand Crosswalk-June 2014-FINAL
Cognitive Demand Crosswalk-June 2014-FINALCognitive Demand Crosswalk-June 2014-FINAL
Cognitive Demand Crosswalk-June 2014-FINAL
 
Model #2-Grade 8 Pre-Algebra-DEMO-FINAL
Model #2-Grade 8 Pre-Algebra-DEMO-FINALModel #2-Grade 8 Pre-Algebra-DEMO-FINAL
Model #2-Grade 8 Pre-Algebra-DEMO-FINAL
 
Model #1-Art Grade 5-DEMO-FINAL
Model #1-Art Grade 5-DEMO-FINALModel #1-Art Grade 5-DEMO-FINAL
Model #1-Art Grade 5-DEMO-FINAL
 
HO #2-Building the Assessment-Examples-June 2014-FINAL
HO #2-Building the Assessment-Examples-June 2014-FINALHO #2-Building the Assessment-Examples-June 2014-FINAL
HO #2-Building the Assessment-Examples-June 2014-FINAL
 
Template #2-Building the Assessment-June 2014-FINAL
Template #2-Building the Assessment-June 2014-FINALTemplate #2-Building the Assessment-June 2014-FINAL
Template #2-Building the Assessment-June 2014-FINAL
 
M2-Building the Assessment-June 2014-FINAL
M2-Building the Assessment-June 2014-FINALM2-Building the Assessment-June 2014-FINAL
M2-Building the Assessment-June 2014-FINAL
 

Recently uploaded

Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functions
KarakKing
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
heathfieldcps1
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdfVishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
ssuserdda66b
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
ciinovamais
 

Recently uploaded (20)

How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17
 
On National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsOn National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan Fellows
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functions
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
Unit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptxUnit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptx
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdfVishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 

Quick Start Users Guide-June 2014-Working Draft

  • 1. QUICK START User’s Guide 2014 PART OF THE ASSESSMENT LITERACY SERIES THE RIA GROUP | 16407 Highland Club Avenue Baton Rouge, LA 70817
  • 2. Quick Start© User’s Guide-June 2014-Working Draft 1 TABLE OF CONTENTS Introduction 2 Purpose 3 Homeroom 3 Phase I: Designing the Assessment 5 1.1 Goal Statement 5 1.2 Objectives 5 1.3 Guiding Questions 5 1.4 Resources 5 1.5 Procedural Steps 6 STEP 1: Create a Purpose Statement 6 STEP 2: Select Targeted Content Standards 6 STEP 3: Develop a Test Blueprint 7 1.6 Quality Reviews 7 Phase II: Building the Assessment 8 2.1 Goal Statement 8 2.2 Objectives 8 2.3 Guiding Questions 8 2.4 Resources 9 2.5 Procedural Steps 9 STEP 4: Item Stems/Task Prompts 9 STEP 5: Scoring Keys/Scoring Rubrics 10 STEP 6: Test Forms 11 2.6 Quality Reviews 12 Phase III: Reviewing the Assessment 13 3.1 Goal Statement 13 3.2 Objectives 13 3.3 Guiding Questions 13 3.4 Resources 14 3.5 Procedural Steps 14 STEP 7: Item/Tasks Reviews 14 STEP 8: Alignment and Performance Level Reviews 14 STEP 9: Data Reviews 15 STEP 10: Refinements 15 3.6 Quality Reviews 16
  • 3. Quick Start© User’s Guide-June 2014-Working Draft 2 Quick Start© User’s Guide Introduction The purpose of this document is to provide guidance for developing measures of student performance that will meet the criteria within the Performance Measure Rubric. The rubric is a self-assessment tool used to ascertain the technical quality of locally-developed performance measures. The process used to “design”, “build”, and “review” teacher-made performance measures is contained within the Quick Start program. Quick Start delivers a foundational understanding of the procedures necessary to create these performance measures, which teachers may then use to assess their students’ skills, knowledge, and concept mastery of targeted content standards. Figure 1. Process Components Design • Purpose Statement • Targeted Content Standards • Test Blueprint Build • Items/Tasks • Scoring Keys & Scoring Rubrics • Test Forms Review • Item/Task Reviews • Alignment Reviews • Data Reviews • Refinements
  • 4. Quick Start© User’s Guide-June 2014-Working Draft 3 Purpose This document guides educators in the development of performance measures in three phases: Design, Build, and Review. Each phase includes customized training and educator- friendly tools to ensure that the performance measures meet the criteria within the Performance Measure Rubric. This rubric, which helps determine the technical quality of performance measures, follows a structure similar to the training process used in developing student learning objectives (i.e., Design, Build, and Review). Educators have the flexibility to begin the process from Orientation to Review, or simply the Review phase, based upon their needs and experience in the assessment development process. Homeroom Homeroom is the learning platform that brings this effective training right to your fingertips. To access the training and documents necessary for creating high-quality performance measures visit www.pdehr.riagroup2013.com. It is important to note that the user may access this training from any device whether it be a tablet, phone, or PC. When accessing Homeroom for the first time, the user will need to register through the Homeroom login screen. In the event of a lost password, username, or other questions, the user may contact the Help Desk through email at helpdesk@riagroup2013.com or call toll free at 1.855.787.9446 (see Figure 2 below). Figure 2. Homeroom Login Screen The home page offers the user the Quick Start icon option as shown below. The first option, “I am a Teacher”, is oriented to teachers completing the SLO Process. The second option, “I am a School Leader”, is designed for principals, superintendents, etc. The Quick Start icon expands as shown in Figure 3 below to offer the user options.
  • 5. Quick Start© User’s Guide-June 2014-Working Draft 4 Figure 3. User Options Each phase of the Quick Start Process; Design, Build, and Review contains the components listed below. The TRAINING > VIEW THE TRAINING component provides the user with PowerPoints and videos instructing the user in assessment creation. The TEMPLATES > CREATE YOUR OWN component provides templates for the user to download and utilize in developing effective student learning objectives. The RESOURCES > HELPFUL MATERIALS component provides guides and other resources to enhance the Quick Start Process experience (see Figure 4 below). Figure 4. Quick Start Process Components
  • 6. Quick Start© User’s Guide-June 2014-Working Draft 5 Phase I: Designing the Assessment 1.1 Goal Statement  Understand and apply the techniques used to design measures of student performance. 1.2 Objectives  The professional will successfully: o Create a purpose statement for a specific performance measure. o Identify content standard(s) that represent the Enduring Understanding/Key Concept within the content area. o Develop a test blueprint outlining the performance measure’s structure. 1.3 Guiding Questions  What is the performance measure intended to measure and at what grade?  What are the developmental characteristics of test-takers?  Which areas will be targeted among the various content standards?  How will educators use the results (overall score and “growth” inferences)?  When will the performance measure be administered?  Do the items/tasks capture the content standards within the key concept?  Is the number of items/tasks sufficient so that students at varying levels can demonstrate their knowledge?  What are the time demands for both teachers and students?  How does the design reflect the areas of emphasis in the standards? 1.4 Resources Training Templates Resources  M1-Designing the Assessment  Template #1-Designing the Assessment  HO #1-Designing the Assessment- Examples  Cognitive Demand Crosswalk
  • 7. Quick Start© User’s Guide-June 2014-Working Draft 6 1.5 Procedural Steps Create a Purpose Statement Step 1. Individually create a statement about the performance measure in terms of the content standards it will purport to measure. Step 2. Build consensus by focusing on three components of the statement: What, How, Why. Step 3. Draft three sentences reflecting the group’s consensus for each component, and review. Step 4. Merge sentences to create a single paragraph “statement”. Again, review to ensure that the statement reflects the group’s intent. Step 5. Finalize the statement and double-check for editorial soundness. Select Targeted Content Standards Step 1. Place the course/subject’s name and Enduring Understanding/Key Concept statement above the Targeted Content Standards table. Step 2. Place the code for each standard/content strand in the Content ID column along with the description for each content standard in the Content Statement column. Step 3. Have a subject matter expert work collaboratively to identify initial (i.e., draft) set of content standards associated with the “Enduring Understanding”/Key Concept”. Step 4. Review the list of targeted content standards and look for gaps and/or redundancies and then finalize the list by placing an “X” in the Final column. Step 5. Verify that the “final” targeted content standards will be those used to develop the test blueprint.
  • 8. Quick Start© User’s Guide-June 2014-Working Draft 7 Develop a Test Blueprint Step 1. Review the targeted content standards identified in Step 2. Step 2. Insert selected Enduring Understanding/Key Concept and targeted content standards (numeric code only) into the test blueprint table. Step 3. Determine the number of items/tasks across the four cognitive levels. Step 4. Tally the rows and place the values in the Total column. Tally each cognitive level column and place the resultant values in the Grand Totals row. Step 5. Report the total number of items/tasks and the total possible points available. 1.6 Quality Reviews The Performance Measure Rubric is designed to help the educator review items/tasks, scoring rubrics, and assessment forms to create high-quality performance measures. Strand 1 of the Performance Measure Rubric evaluates the Design phase of the assessment process (purpose statement, targeted content standards, and test blueprint). Refer to Handout #3 Performance Measure Rubric-Scored Example for more information. Task ID Descriptor Rating Evidence 1.1 The purpose of the performance measure is explicitly stated (who, what, why). 1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate. 1.3 The performance measure’s design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills. 1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure - OR- Blueprints are used to align items/tasks to targeted content standards. 1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantities to measure the depth and breadth of the targeted content standards. Strand 1 Summary __out of 5
  • 9. Quick Start© User’s Guide-June 2014-Working Draft 8 Phase II: Building the Assessment 2.1 Goal Statement  Understand and apply the techniques used to build measures of student performance. 2.2 Objectives  The professional will successfully: o Create the necessary items/tasks to address the test blueprint. o Develop scoring keys and/or scoring rubrics. o Organize items/tasks and administration guidelines into a test form. 2.3 Guiding Questions  Are the items aligned with targeted content standards?  Do the selected items/tasks allow students to demonstrate content knowledge by: o Responding to questions and/or prompts? o Performing tasks, actions, and/or demonstrations?  Do the items/tasks measure content knowledge, skill, or process and not an external or environmental factor (e.g., guessing)?  Is the number of items/tasks sufficient to sample the targeted content?  Are the items/tasks developmentally appropriate for the intended test-takers?  Are the correct answers and/or expected responses clearly identified?  Do the performance measure’s directions specify: o What the test-taker should do, read, or analyze? o Where and how the test-taker should respond or demonstrate the task? o How many points a correct/complete response is worth towards the overall score?  Are there directions for different item/task types?
  • 10. Quick Start© User’s Guide-June 2014-Working Draft 9 2.4 Resources Training Templates Resources  M2-Building the Assessment  Template #2-Building the Assessment  Performance Task Framework  HO #2-Building the Assessment  Model #1-Art Grade 5-DEMO  Model #2-Grade-Pre-Algebra- DEMO  Model #3-Nutrition Culinary, Level III-DEMO  Cognitive Demand Crosswalk 2.5 Procedural Steps 2.5.1 Item Stems/Task Prompts Multiple Choice (MC) Items Step 1. Review the targeted content standard. Step 2. Determine which aspects of the standard can be measured objectively. Step 3. Select the focused aspect and determine the cognitive demand reflected in the standard’s description. Step 4. Create a question (stem), one correct answer, and plausible (realistic) distractors. Step 5. Review the item and answer options for grammatical soundness. Short Answer/Extended Answer Items Step 1. Review the targeted content standard(s). Step 2. Determine which aspects of the standard(s) can be best measured by having students “construct” a short response. Step 3. Select and list aspects of the targeted content standard(s) to be measured. Step 4. Create a prompt, select a passage, or develop a scenario for students. Step 5. Develop a clear statement that articulates specific criteria for the test-taker to provide.
  • 11. Quick Start© User’s Guide-June 2014-Working Draft 10 Extended Answer (EA) Tasks Step 1. Review the targeted content standard(s). Step 2. Determine which aspects of the standard(s) can be best measured by having students “construct” an extended response to a given prompt, scenario, or passage. Step 3. Select and list all aspects of the targeted content standard(s) to be measured. Step 4. Create a prompt, select a passage, or develop a scenario for students. Step 5. Develop a clear statement for each subordinate task that articulates specific criteria for the test-taker to provide. Extended Performance (EP) Tasks Step 1. Review the targeted content standard(s). Step 2. Determine which aspects of the standard(s) can be best measured by having students “develop a complex response, demonstration, or performance over an extend period of time (e.g., two weeks). Step 3. Select and list all aspects of the targeted content standard(s) to be measured. Step 4. Create a project, portfolio, or demonstration expectation statement that includes subordinate tasks, which are aligned to the test blueprint. Step 5. Develop a clear statement for each subordinate task that articulates specific criteria for the test-taker to provide. 2.5.2 Scoring Keys/Scoring Rubrics MC Items Score Key Step 1. Enter the assessment information at the top of the Scoring Key. Step 2. Record the item number, item tag (optional), item type, and point value. Step 3. Record the MC answers in the Answer column. Step 4. Repeat Steps 1-4 until all items on the test blueprint are reflected within the Scoring Key. Step 5. Validate that each question-to-answer relationship is recorded correctly
  • 12. Quick Start© User’s Guide-June 2014-Working Draft 11 Short Answer/Extended Answer/Extended Performance Scoring Rubrics Step 1. Review the SA, EA, or EP task and the criteria articulated in the stem/directions. Step 2. Select a “generic” rubric structure (see Template #2: Building the Assessment) based upon scoring criteria and the number of dimensions being measured. Step 3. Modify the rubric language using specific criteria expected in the response to award the maximum number of points. Step 4. Determine how much the response can deviate from “fully correct” in order to earn the next (lower) point value. [Continue until the full range of possible scores is described.] Step 5. During the review, ensure the response expectation, scoring rubric, and test blueprint are fully aligned. Procedural Steps: Administration Guidelines Step 1. Create a series of administrative steps for before, during, and after the assessment window. Step 2. Explain any requirements or equipment necessary, including accommodations. State any ancillary materials (e.g., calculators) needed or allowed by the test- takers. Step 3. Identify the approximate time afforded to complete the assessment, including each subtask in an EP task. Step 4. Include detailed “scripts” articulating exactly what is to be communicated to students, especially when administering performance tasks over a long period of time. Step 5. Include procedures for scoring, administering make-ups, and handling completed assessments. Procedural Steps: Test Forms Step 1. Develop a cover page stating the test form developed and include any necessary demographic information (e.g., section number, student name, date administered, etc.). Step 2. Organize the items/tasks/prompts in a sequence that will maximize student engagement.
Procedural Steps: Test Forms
Step 1. Develop a cover page that identifies the test form and includes any necessary demographic information (e.g., section number, student name, date administered).
Step 2. Organize the items/tasks/prompts in a sequence that will maximize student engagement.
Step 3. Add item tags and section dividers (optional).
Step 4. Refine the draft form to minimize "blank space"; verify picture, graph, table, and figure placement in relationship to the item/task; and ensure MC answer options do not drift from one page to the next.
Step 5. Add the scoring rubric or criteria for constructed-response tasks.

2.6 Quality Reviews

The Performance Measure Rubric is designed to help the educator review items/tasks, scoring rubrics, and assessment forms to create high-quality performance measures. Strand 2 of the Performance Measure Rubric evaluates the Build phase of the assessment process (refer to Handout #3-Performance Measure Rubric-Scored Example for more information). Each task below is rated and supported with evidence; the Rating and Evidence columns are left blank for the reviewer.

Task ID  Descriptor
2.1  Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (e.g., short constructed response, writing prompts, performance tasks, etc.).
2.2  Items/tasks are created and reviewed in terms of: (a) alignment to the targeted content standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
2.3  Administration guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
2.4  Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate the point values for each item/task used to combine results into an overall score.
2.5  Summary scores are reported using both raw score points and a performance level. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote each level.
2.6  The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school).

Strand 2 Summary: __ out of 6
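Descriptors 2.4 and 2.5 above call for combining item/task point values into an overall raw score and reporting that score with a performance level. The sketch below shows one way that mapping could work; the cut scores and level names are hypothetical, since the rubric itself only requires at least two performance levels.

```python
# Hypothetical cut scores for a 20-point measure; level names and thresholds
# are invented for illustration.
CUT_SCORES = [(16, "Advanced"), (12, "Proficient"), (8, "Basic"), (0, "Below Basic")]

def performance_level(raw_score):
    """Return the first performance level whose cut score the raw score meets."""
    for cut, level in CUT_SCORES:
        if raw_score >= cut:
            return level
    raise ValueError("Raw score below all defined levels.")

item_scores = [1, 1, 0, 1, 3, 2, 4, 1]    # points earned per item/task
raw = sum(item_scores)                    # descriptor 2.4: combined overall score
print(raw, performance_level(raw))        # descriptor 2.5: raw score plus level
```

In practice the cut scores themselves would come from a performance level review rather than being chosen arbitrarily (see Task 3.4 in Phase III).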
Phase III: Reviewing the Assessment

3.1 Goal Statement
• Understand and apply the techniques used to review and refine measures of student performance.

3.2 Objectives
• The professional will successfully:
  o Review developed items/tasks for validity threats and for content and cognitive match;
  o Examine test alignment to ensure: (a) all items/tasks match the skills, knowledge, and concepts in the targeted content standards; (b) all rubrics match the targeted content standards; and (c) all items/tasks reflect higher-order thinking.

Note: A supplement to this document will address alignment (Step 8), data reviews (Step 9), and refinement (Step 10), which were addressed in the Orientation presentation.

3.3 Guiding Questions
• Does each item/task clearly address the standard?
• Are the reading difficulty and vocabulary appropriate?
• Is the language clear, consistent, and understandable?
• Are charts, tables, graphs, and diagrams clear and understandable?
• Is there only one (1) correct answer?
• Have the items been reviewed for bias and sensitivity?
  o Items provide an equal opportunity for all students to demonstrate their knowledge and skills. The stimulus material (e.g., reading passage, artwork, diagram) does not raise bias and/or sensitivity concerns that would interfere with the performance of a particular group of students.
• Are the items developmentally appropriate for the test-takers?
• Does the blueprint reflect the test form?
• Does the scoring rubric provide detailed scoring information?
• Does the assessment have at least two (2) performance levels?
3.4 Resources

• Training: M3-Reviewing the Assessment
• Templates: Template #3-Performance Measure Rubric
• Resources: HO #3-Reviewing the Assessment-Scored Example

3.5 Procedural Steps

STEP 7: Item/Task Reviews
Step 1. Identify at least one other teacher to assist in the review (best accomplished by department or grade-level committees).
Step 2. Organize the test form, answer key and/or scoring rubrics, and Handout #3-Reviewing the Assessment-Scored Example.
Step 3. Read each item/task and highlight any "potential" issues in terms of content accuracy, potential bias, sensitive materials, fairness, and developmental appropriateness.
Step 4. After reviewing the entire test form, including scoring rubrics, revisit the highlighted items/tasks. Determine whether each item/task can be rewritten or must be replaced.
Step 5. Print the revised assessment documents and conduct an editorial review, ensuring readability, sentence/passage complexity, and word selection are grammatically sound. Take corrective actions prior to finalizing the documents.

STEP 8: Alignment and Performance Level Reviews
Step 1. Identify at least one other teacher to assist in the alignment review (best accomplished by department or grade-level committees).
Step 2. Organize the items/tasks, test blueprint, and targeted content standards.
Step 3. Read each item/task in terms of matching the standards, both in content reflection and in cognitive demand. For SA, EA, and EP tasks, ensure that scoring rubrics are focused on specific content-based expectations. Refine any identified issues.
Step 4. After reviewing all items/tasks, including scoring rubrics, count the number of item/task points assigned to each targeted content standard. Determine the percentage of item/task points per targeted content standard based upon the total available. Identify any shortfalls in which too few points are assigned to a standard listed in the test blueprint. Refine if the patterns do not reflect those in the standards.
Step 5. Using the item/task distributions, determine whether the assessment has at least five (5) points for each targeted content standard and whether points are attributed only to developmentally appropriate items/tasks. Refine if point sufficiency does not reflect the content standards.
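Steps 4 and 5 above reduce to a simple tally of points per targeted content standard. A minimal sketch follows; the standards, item list, and point values are invented for illustration.

```python
from collections import defaultdict

# Hypothetical item list: (targeted content standard, point value) per item/task.
items = [("S1", 1), ("S1", 1), ("S1", 3), ("S2", 1), ("S2", 4), ("S3", 2)]
blueprint_standards = {"S1", "S2", "S3"}
MIN_POINTS_PER_STANDARD = 5   # Step 5's point-sufficiency threshold

points = defaultdict(int)
for standard, value in items:
    points[standard] += value   # Step 4: count points per standard

total = sum(points.values())
for standard in sorted(blueprint_standards):
    pct = 100 * points[standard] / total   # Step 4: percentage of total points
    flag = " <-- shortfall" if points[standard] < MIN_POINTS_PER_STANDARD else ""
    print(f"{standard}: {points[standard]} pts ({pct:.0f}%){flag}")
```

Running this flags standard S3 (2 points) as a shortfall to refine before the form is finalized.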
STEP 9: Data Reviews
Step 1. Conduct the data review after test-takers have engaged in the assessment procedures.
Step 2. Focus on data about the items/tasks, performance levels, score distribution, administration guidelines, etc.
Step 3. Evaluate technical quality by examining aspects such as rater reliability, internal consistency, intra-domain correlations, decision consistency, and measurement error.

STEP 10: Refinements
Step 1. Complete refinements prior to the beginning of the next assessment cycle.
Step 2. Analyze results from the prior assessment to identify areas of improvement.
Step 3. Consider item/task replacement or augmentation to address areas of concern.
Step 4. Strive to include at least 20% new items/tasks, or implement an item/task tryout approach.
Step 5. Create two parallel forms (i.e., Forms A and B) for test security purposes.

3.6 Quality Reviews

The Performance Measure Rubric is designed to help the educator review items/tasks, scoring rubrics, and assessment forms to create high-quality performance measures. Strand 3 of the Performance Measure Rubric evaluates the Review phase of the assessment process (refer to Handout #3-Performance Measure Rubric-Scored Example for more information). Each task below is rated and supported with evidence; the Rating and Evidence columns are left blank for the reviewer.

Task ID  Descriptor
3.1  The performance measures are reviewed in terms of design fidelity:
  • Items/tasks are distributed based upon the design properties found within the specification or blueprint documents.
  • Item/task and form statistics are used to examine levels of difficulty, complexity, distractor quality, and other properties.
  • Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3.2  The performance measures are reviewed in terms of editorial soundness, while ensuring consistency and accuracy of other documents (e.g., administration guidelines):
  • Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements.
  • Applies Universal Design principles.
  • Ensures linguistic demands and/or readability are developmentally appropriate.
3.3  The performance measures are reviewed in terms of alignment characteristics:
  • Pattern consistency (within specifications and/or blueprints)
  • Match to the targeted content standards
  • Cognitive demand
  • Developmental appropriateness
3.4  Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.
3.5  As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.
3.6  The performance measure has score validity evidence demonstrating that item responses were consistent with the content specifications. Data suggest the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and the alignment characteristics of the performance measure, are collected.
3.7  Reliability coefficients are reported for the performance measure, including an estimate of internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics, such as classification accuracy and rater reliabilities, are calculated and reviewed.

Strand 3 Summary: __ out of 7

Note: A supplement to this document will address Tasks 3.5, 3.6, and 3.7 of the Performance Measure Rubric.
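Task 3.7 above, which the forthcoming supplement will address in detail, asks for an internal-consistency estimate and standard errors for summary scores. As a preview of one common approach, the sketch below computes coefficient (Cronbach's) alpha and the classical standard error of measurement from invented item-score data; it is illustrative only, not the Quick Start procedure.

```python
import statistics

# Invented item-level scores (rows = students, columns = items) for illustration.
scores = [
    [1, 0, 1, 2, 1],
    [1, 1, 1, 3, 2],
    [0, 0, 1, 1, 0],
    [1, 1, 0, 2, 2],
    [1, 1, 1, 3, 1],
]

k = len(scores[0])                                   # number of items
totals = [sum(row) for row in scores]                # each student's raw score
item_vars = [statistics.variance(col) for col in zip(*scores)]
total_var = statistics.variance(totals)

# Cronbach's alpha: a common internal-consistency estimate.
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Classical standard error of measurement: SD * sqrt(1 - reliability).
sem = statistics.stdev(totals) * (1 - alpha) ** 0.5

print(f"alpha = {alpha:.2f}, SEM = {sem:.2f} raw-score points")
```

Higher alpha indicates more consistent scores across items; how to interpret these statistics for locally developed measures is left to the supplement.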