Classroom assessment
 All the processes involved in making
  decisions about students’ learning
  progress.
 It’s more than just grades and tests!!
The role of assessment
There is a cycle of planning, teaching, and assessing that ensures students are learning.

[Diagram: a continuous cycle of Plan → Teach → Assess]

This is the cycle around which I built my Educational Psychology schema (back in session 1).
Formative and summative
             evaluation
 Formative evaluation: occurs before or
  during instruction and is used to provide
  feedback to students and to monitor their
  growth.
 Summative evaluation: occurs after
  instruction on a given topic and is used for
  grading purposes.
How to remember formative vs.
           summative
 Summative happens at the end—like a
  summary.
 Formative happens while instruction is
  taking place (while students’
  understanding is still forming).
Types of Formative Evaluation
 Pretest: formative test for assessing
  students’ knowledge, readiness, and
  abilities.
 Diagnostic test: formative test to
  determine students’ areas of weakness.
Why use formative
          assessment?
 If you want your lessons to be in students’
 “zone of proximal development,” then you
 need to know what they know (pre-assess
 or test) or where they are
 misunderstanding (diagnostic test).
Why use formative
          assessment?
 If you want to know the effect of a lesson
 or series of lessons, then you can do a
 pre-test, teach the lessons, and then do a
 post-test in order to find out how well you
 taught this group of students.
Formative assessment
 Remember that assessment is not just tests, so
  you can do a “pre-test” that is not really a test.
  For example, music teachers can have students
  sight-read a new piece of music as a “pre-test.”
  Math teachers can have students try out a
  worksheet as a "pre-test." A KWL (a
  collaborative chart where students tell what they
  Know, what they Want to learn, and what they
  have Learned) can be a formative assessment.
Measurement and Evaluation
 Measurement: the process of gathering
  information about learning.
 Evaluation: the process of making
  decisions on the basis of measurements.


  These contribute to the planning phase of the assessment cycle.
Types of measurement
 Informal measurement is the process of
  gathering incidental information about
  learning progress during learning
  activities and discussions with students.
 Formal measurement is the process of
  systematically gathering information about
  learning progress.
          Good teachers are always “kid watching”—seeing what is going
          on with individual students as they participate in classroom
          activities. These teachers often make notes of what they
          observe and use this information not only in lesson planning but
          also in parent-teacher conferences.
Systematic assessment
 Students can fall through the cracks. A
  quiet student sitting in the back may not
  attract your attention.
 You need to know what is going on with
  each student in your classroom; this is
  your responsibility as a teacher.
 You need to systematically collect
  assessment information on every student.
Validity
 This is an indicator of the extent to which
  an assessment actually measures what it
  is supposed to measure.
 Is the measure aligned with your goals as
  a teacher?
 Is the measure aligned with what
  happened in the classroom?
 Is the measure based on students’ abilities
  rather than on personality or likability?
Examples of invalid assessment
     Students have been learning music in major
      keys. The teacher tests their sight reading with
      a piece in a minor key.
     Inner city students have been learning about
      writing. The teacher gives them a writing prompt
      about “a day on a farm.”
     Students have been using manipulatives. The
      teacher gives them a test where they cannot use
      these aids.
     Students with learning disabilities (who have difficulty
      reading and writing) have been learning about the
      Civil War. The teacher gives them a written test.
If you were the teacher, how would you improve the validity in each of these situations?
Reliability
   A description of the extent to which
    measurements are consistent and free
    from errors of measurement.
   A bathroom scale is useless
    if it is not reliable
   So are measurements we use
    in schools
Remember, reliability is not everything. A test of toe length
could be reliable but not particularly useful in the education
world.
Reliability on teacher-made
            assessments
 Use a sufficient number of items or tasks
  and look for consistency of student
  performance across tasks.
 Ensure that directions are clear so
  students know what is expected of them.
 Identify specific criteria to evaluate
  students’ performance.
 Administer assessments in similar ways
  for all students.
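To make the first point above concrete, here is a minimal sketch (with hypothetical item-level scores, not data from the book) of one common way to check consistency across tasks: a split-half reliability estimate, correlating each student's score on the odd-numbered items with their score on the even-numbered items and applying the Spearman-Brown correction.

```python
# Illustrative sketch (hypothetical data): split-half reliability for a short test.
# Requires Python 3.10+ for statistics.correlation.
from statistics import correlation

# One row per student; 1 = item answered correctly, 0 = incorrect.
responses = [
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 1, 0, 0, 1],
    [1, 1, 0, 1, 1, 1, 0, 1],
]

odd_half = [sum(row[0::2]) for row in responses]    # score on items 1, 3, 5, 7
even_half = [sum(row[1::2]) for row in responses]   # score on items 2, 4, 6, 8

r_half = correlation(odd_half, even_half)           # consistency between the two halves
reliability = (2 * r_half) / (1 + r_half)           # Spearman-Brown correction to full length

print(f"Split-half reliability estimate: {reliability:.2f}")
```

A low estimate suggests the items are not measuring the same thing consistently, which is a signal to add items, clarify directions, or tighten the scoring criteria listed above.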
Traditional Assessment Strategies
 Patterns of assessment
 Constructing tests
Getting the Most from
       Traditional Assessment
 Planning for Testing
 Objective Testing: multiple-choice, matching,
  true/false, short answer, and fill-in tests; scoring
  answers does not require interpretation.
 Essay Testing
Planning for testing
 When?
 Frequently
 Soon after material is learned
 Cumulative
Judging textbook tests
 Identify your objective first and then see if
  the textbook test helps
 Feel free to adapt to the realities of your
  class
 Tests from the text are less valid if you
  make multiple adaptations.
 Make sure what gets taught matches what
  gets tested.
Making tests
 Each item on a test should be valid—
  students who know the information should
  get the item right and students who do not
  know the information should get it wrong.
 In the next few slides, we will discuss
  some of your favorite types of test items:
  multiple choice, matching, true/false,
  completion, essays, and rubrics.
Multiple choice
 Multiple-choice format is a measurement format
  that consists of a question or statement, called a
  stem, and a series of answer choices. The
  individual responding to the items chooses the
  correct or best answer.
 Distractors: the incorrect alternatives, so-called
  because they are designed to distract students
  who don’t understand the content being
  measured in the item.
 Stem: the question part of a multiple-choice
  item.
Stem and distractors
 Stem: should pose one question or
  problem.
 Distractors: should reflect common
  misconceptions. They should be grammatically
  consistent with the stem; otherwise the
  grammar will give away which choice is the
  correct answer.
 You can assess higher-level thinking with
  multiple choice questions.
Matching
 Matching format: a measurement format that requires learners
    to classify a series of examples using the same alternatives.
   Content should be homogeneous (all material of the same
    type).
   Use more statements than alternatives so students can’t use
    elimination to get items right.
   Let students know that alternatives can be used more than
    once or not at all (keeps students from guessing instead of
    knowing).
   Keep the material all on one page—if you have material for
    more than one page, break it up into two different groups.
   Don’t overload students’ working memories with excessively
    long matching items—if you have more than ten possibilities,
    break the item into two.
True-false
 True-false format is a measurement format that
    includes statements of varying complexity that
    learners have to judge as being correct or
    incorrect.
   Don’t put both a true fact and a false fact in the
    same item.
   “Most” is a hint that the item is true.
   “Never” and “always” are hints that the item is
    false.
   Negative wording can be confusing.
Completion
 Completion format is a measurement
  format that includes a question or an
  incomplete statement that requires the
  learner to supply appropriate words,
  numbers, or symbols.
 It is very difficult to create completion
  items where only one answer is correct.
 These items usually measure low-level
  forms of thinking.
Essay
 Essay format is a measurement format that
    requires students to make extended written
    responses to questions or problems.
   They assess creative and critical thinking
   They measure progress students make in
    creating and defending thesis statements.
   They change how students study and learn.
   Scoring them is a challenge.
   They can be ambiguous.
Rubrics
 A rubric is a scoring scale that describes the
    criteria for grading.
   Rubrics help students to plan the material that
    will be assessed.
   Establish criteria based on essential elements
    that must be present in students’ work.
   Decide on number of levels of achievement for
    each criterion.
   Develop clear descriptors for each level of
    achievement.
    Determine a rating scale for the entire rubric.
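As an illustrative sketch only (the criteria, levels, and descriptors here are hypothetical, not a recommended rubric), the steps above can be thought of as criteria, a descriptor for each achievement level, and a way to total a student's ratings:

```python
# Illustrative sketch: hypothetical criteria, levels, and descriptors for an essay rubric.
rubric = {
    "thesis": {4: "clear, arguable thesis", 3: "clear thesis", 2: "vague thesis", 1: "no thesis"},
    "evidence": {4: "several relevant sources", 3: "some relevant sources", 2: "little evidence", 1: "no evidence"},
    "conventions": {4: "virtually error-free", 3: "minor errors", 2: "frequent errors", 1: "errors impede meaning"},
}

def score(ratings: dict) -> tuple:
    """Total one student's ratings against the rubric: (points earned, points possible)."""
    possible = sum(max(levels) for levels in rubric.values())
    earned = sum(ratings[criterion] for criterion in rubric)
    return earned, possible

earned, possible = score({"thesis": 4, "evidence": 3, "conventions": 2})
print(f"{earned}/{possible}")   # 9/12
```

Because every level has a written descriptor, students can see in advance what each score means, which is what lets them plan and self-assess their work.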
Commercially Prepared Test Items
 Carefully examine your goals before using
  them.
 Select those items that are consistent with
  your goals
 Revise items that need improvement
 Create additional items that help you
  accurately measure your students’
  understanding.
Alternative Assessment
 Direct examination of student performance on significant
  tasks that are relevant to life outside of school.
 In other words: the assessment task is similar to or the
  same as tasks students would be doing in the “real
  world,” (as opposed to using a multiple choice test where
  the assessment process differs significantly from real-
  world performance).
 Performance Assessment: any form of assessment that
  requires students to carry out an activity or produce a
  product in order to demonstrate learning.
 Portfolios
Authentic Assessment: assessment procedures that test skills and abilities as
they would be applied in real-life situations.
Performance Assessment
 A form of assessment in which students
    demonstrate their knowledge and skill by
    carrying out an activity or producing a
    product.
   Specify the type of performance.
   Select the focus.
   Structure the evaluation setting—making it
    realistic but practical.
   Design evaluation procedures
Examples of Performance
                   Assessment
    Students find the area of a classroom in
     order to figure out how much carpet will be
     needed.
    Students write persuasive letters to the
     editor regarding an issue of concern to
     them.
    Students perform experiments to figure
     out a science problem.
Pay attention to and assess not just the product the student produces but also the
process by which students create the product.
Evaluating Performance Assessment Products/Processes
 Use systematic observation, the process of
  specifying criteria for acceptable performance on
  an activity and taking notes during observation
  based on the criteria.
 Checklists are written descriptions of
  dimensions that must be present in an
  acceptable performance of an activity. These
  help you to keep track of student performance.
 Rating scales are written descriptions of the
  evaluative dimensions of an acceptable
  performance of an activity and scales of values
  on which each dimension is rated.
Portfolios
 A collection of the student’s work in an area,
  showing growth, self-reflection, and
  achievement.
 Portfolio assessment usually involves student
  goal setting and showing how the items in the
  portfolio demonstrate the achievement of goals.
 Portfolios provide an ongoing idea of student
  work because they are a collection of items
  completed across several weeks.
Using Portfolios
 Embed portfolios in instructional practice. Refer to them
    frequently as you teach.
   Provide examples of portfolios when introducing them to
    students.
   Involve students in the selection and evaluation of their
    work.
   Require students to provide an overview of each
    portfolio, a rationale for the inclusion of individual works,
    criteria they used to evaluate, and a summary of
    progress.
   Provide students with frequent and detailed feedback
    about their decisions.
Portfolios across the curriculum
 Portfolios began in art. Professional freelance artists develop
    portfolios of their work to show potential employers.
   The first book on the use of portfolios (Tierney, Carter, and
    Desai, 1991, Portfolios in the Reading/Writing Classroom)
    focused on the use of portfolios to show students’ literacy
    development.
   Portfolios can also be used in other subjects. Students can
    collect a variety of artifacts to demonstrate their developing
    knowledge and skills.
   With technology, portfolios can be digitized and they can
    feature a wide range of formats.
   Where possible (e.g., in music programs where the teacher
    has the same students across several years) it would be very
    useful to have portfolios that span several years and show
    student improvement.
Exhibition
 A performance test or demonstration of
  learning that is public and usually takes an
  extended time to prepare.
 Examples: a music recital, an art exhibit,
  a project that is presented to a class.
Evaluating alternative assessments
 Use a scoring rubric: rules that are used
  to determine the quality of a student
  performance.
 Students can contribute to the rubric.
  When they do so, they often “buy into” the
  process of assessment.
 Assessment can also include a student
  component—students should learn to
  assess themselves.
Reliability, validity, generalizability
 Alternative assessments may not be
  reliable—two different teachers may give a
  portfolio very different grades.
 Even validity may be a problem—there’s
  not a lot of research that is helpful yet.
Diversity and equity




Professional orchestras used to be all male (except for the harpist, for some
odd reason). In order to counter this, orchestras began having blind
auditions—potential members tried out for the orchestra behind a screen.
This practice has led to a greater amount of diversity in orchestras.
Diversity and equity
 In terms of exhibition-type assessments, subtle “-isms”
  can creep in when it is impossible to judge work apart
  from the person who does the work. It is important to
  keep this in mind when making judgments on students’
  work.
 Also, it is important to remember what resources students
  have at home for doing their schoolwork. Some
  students have a lot of support at home for their
  schoolwork (a computer, a private space for working,
  social support for doing homework, etc.) and other
  students do not. If there is a lot of out-of-class work that
  goes into a portfolio or an exhibition, one needs to keep
  these potential inequities in mind.
Informal assessments
 Ungraded (formative) assessments that gather
  information from multiple sources to help
  teachers make decisions.
 Good teachers are constantly observing
  students and making informal assessments of
  student understanding.
 Other ways to do informal assessment: student
  journals, “exit tickets” where students answer a
  question in writing before they leave the
  classroom, checklists, rating scales, etc.
Self-assessment
 Alternative assessment can be just as
  dumb and alienating as testing if it is
  carried out in the wrong spirit.
 Most of the proponents of alternative
  assessment call for student involvement in
  assessment.
Self-assessment
 Truly, involving students makes sense; in
 the world of work, we often have to assess
 ourselves. This is part of taking initiative.
 We can’t simply wait for the annual review.
Student-Led Conferences: A
          Growing Trend
 For years parent-teacher conferences have been the primary
  means of parent-teacher communication. But now, many
  schools are trying something new -- student-led conferences
  that communicate not only how a student's doing but also why.
 Parent-teacher conferences -- we all know how they go. Parents
  troop into classrooms to talk with teachers about their children's
  progress in school. Often, the process feels rushed, and parents
  leave feeling vaguely dissatisfied, as if they didn't really get what
  they came for.
 For years that process has been the norm, but now it is changing. In
  more and more schools, students are leading conferences, and,
  overall, the word is that they're doing a fine job.




 http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
           Growing Trend
 Many teachers themselves speak enthusiastically of the advantages
  of student-led conferences over teacher-led ones. "We found the
  [student-led] conferences most beneficial," said Keith Eddinger of
  the Marcus Whitman Middle School in Rushville, New York. "From a
  teacher's perspective, we were able to get a better picture of each
  child. It forced us to sit down with each student and review strengths
  and weaknesses. This conversation often told us the students
  learned more than perhaps we had measured through conventional
  assessments."
 Eddinger added, "Our post-conference reviews with parents and
  students were overwhelmingly positive."
 John Osgood, of C. L. Jones Middle School in Minden, Nebraska,
  found that "comments [about student-led conferences] from parents
  and board members were very positive."



 http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
          Growing Trend
 Another staff member, Dick Philips, said, "Most
  parents listened to their child. It was interesting
  listening to [children] explain low grades to their
  parents. It did open the lines of communication."
 "Several parents really liked it because it gave
  them an opportunity to see their child's work,"
  said Sue Yant, another staff member. Yet "some
  [parents] said they hoped we [would hold] the
  traditional conference once a year."

 http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
           Growing Trend
 STUDENT PREPARATION
 "The format is important, but I believe the
  success of a student-led conference is most determined by how well
  students are prepared," wrote Laura Hayden, a seventh-grade
  communications teacher at Derby Middle School in Derby, Kansas, in
  Letting Students Lead Parent Conferences, an article published by the
  National Association of Elementary School Principals in Middle Matters.
 The conference format at Hayden's school had students show parents some
  of their work and explain their grades in a student-led conference. Each
  team could conduct conferences a bit differently. Hayden's team used an
  open house arrangement in which students and parents visited all team
  members' classrooms, but other teams held the entire conference in one
  classroom.
 The significance of format aside, Hayden focused her students on
  preparation. At the beginning of the school year, she had students set up a
  binder to contain a portfolio as well as graded work. She explained that
  students had to keep their binders orderly because they would use them to
  lead their conferences.


  http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
           Growing Trend
 A week before the conferences, Hayden's team sent home a letter
  informing parents of the conference and the fact that their child
  would lead it. About three days before conferences, she had
  students prepare portfolios of their work to date, including a special
  project, a quiz, a homework assignment, and one assignment from
  which they felt they had learned the most. Students also wrote a
  reflection on their grades and study habits. They set goals for the
  next semester and organized their graded work section.
 The day before conferences, teachers role-played, pretending to be
  the student, with the student playing the teacher or the parent.
  Teachers modeled, for example, how to explain a poor grade to
  parents, and they gave students a checklist of what to cover in the
  conference.




  http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
             Growing Trend
 STUDENT RESPONSIBILITY
 "The preparatory time is worth it," Hayden wrote, "especially when you hear
     a struggling student explaining what he or she learned from an assignment
     and taking responsibility for the score he or she achieved."
    "[Students] need to understand that they are in control of their own efforts to
     learn the material," said Barbara Rommel, superintendent of the David
     Douglas School District in Oregon. (Source: "New Method Puts Student in
     Charge," an article published in the Oregonian newspaper.)
    The Oregon Educational Act for the 21st Century requires students to meet
     higher standards. By having students assess how they are progressing
     toward those standards, educators say, students will know how far they've
     come and how far they have to go to meet the standards.
    "It helps them accept responsibility for their learning," said Patti Kinney,
     principal of Talent Middle School in Oregon.
    "I like being able to tell my side of the story," Josh Whitney-Wise of
     Milwaukie, Oregon's, McLoughlin Middle School told the Oregonian.


    http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
             Growing Trend
    ADVANTAGES AND DISADVANTAGES
    Educators acknowledge that there are disadvantages as well as advantages to
     student-led conferences. Although parent attendance seems higher for student-led
     conferences than for teacher-led ones, a parent's failure to attend a student-led
     conference leads to a great deal of disappointment for a student who has worked
     hard to prepare.
    Another disadvantage is that some parents want to spend more time with their child's
     teacher, receiving his or her viewpoint. Nearly all schools with student-led
     conferences will let parents make separate appointments to confer with teachers.
    For the most part, parents support the concept of student-led conferences, though
     some support them with slight reservations. "My daughter was in a class that did
     student-led conferences a couple of years ago," said one parent of a child at Jones
     Middle School. "I think the object was to make the child feel a part of the whole
     process, to get them in tune with their own progress. … As a parent, I felt like I still
     needed some info from the teachers and wanted more. … But I do think the student
     gets a new perspective on their grades. … Personally, I don't think it would be good
     to do this often, but once a year is good. When you ask if they were 'beneficial,' I can
     say yes and no. They were more beneficial to the student than to the parent."



    http://www.educationworld.com/a_admin/admin/admin112.shtml
Student-Led Conferences: A
          Growing Trend
 But the advantages, say most teachers who
  have participated in student-led conferences,
  outweigh the downside. Student accountability is
  mentioned again and again by educators as a
  plus for student-led conferences. Another plus is
  the way even a struggling student can produce
  something positive for a conference, an art
  project or an essay, perhaps, that wouldn't show
  up in a report card grade. Overall, talks with
  educators indicate, student-led conferences are
  a growing trend.
 http://www.educationworld.com/a_admin/admin/admin112.shtml
Traditional vs. alternative
                assessments
Traditional
 Tends to be low-level (memorization rather than thinking)
 May not apply to the "real world"
 Measures outcome rather than process
 Based on behaviorism
 Easier to implement
 Reliable
 Does not involve students

Alternative
 Tends to be higher level
 Has real-world application
 Reveals process as well as product
 Based on constructivism
 Harder to implement
 Less reliable
 Should involve students in goal setting and assessment
What to do?
 Students have to learn to deal with standardized
  assessment, as that is the gatekeeper to college
  and many careers. (How many standardized
  tests have you taken across your school
  experiences? How many more will you have to
  take to achieve your educational and career
  goals?)
 Yet students also need opportunities to
  demonstrate skills and abilities that multiple
  choice testing fails to assess.
 Good teachers make judicious use of both forms
  of assessment.
A personal take on standardized
           assessment
 When I applied to get a Ph.D., the
 department to which I applied required the
 GRE. At the time, I was working for a
 leader in the area of alternative
 assessment and I therefore was very
 much aware of the criticisms of
 standardized testing. I decided that I
 needed to take a stand on the GRE.
So…
 I went to the library and in about 30 minutes I
  had a small array of research articles that
  showed that the GRE is biased and that the
  Educational Testing Service has a checkered
  past (it was founded by eugenicists—people
  who shared some of Hitler’s ideas about race).
 I wrote a letter to the department that stated that
  it was against my principles to support
  something that was biased and to pay money to
  ETS. I said that I would be willing to provide a
  portfolio of my writing for them to assess.
What happened next…
 The department wrote back and said that
 they would accept my principled stand
 against ETS and would review my
  materials without the GRE. They also said
  that they would consider waiving the
 GRE requirement for any other potential
 student who had the same concerns (this
 was OSU, Dept. of English, in case you
 are interested in testing this out).
But then…
 All during my teaching, I refused to have anything to do with
  standardized testing and the kind of teaching that supports it.
  My idea of being a teacher who made a difference in people’s
  lives was through using whole language and other student-
  centered approaches.
 Then I volunteered for a year at an African American Christian
  school, where the point of the school was to make differences
  in students’ lives. The principal there explained that since the
  Proficiency Test in Ohio was a gatekeeper, one that has been
  biased against her students, she felt the best thing to do was to
  prepare students as best she could to beat the gate.

An example of bias on the Ohio proficiency test: a few years ago, one of the writing
prompts was about taking a camping trip. Urban teachers were frustrated—many of
their students had never traveled more than a few miles from home, much less had the
experience of being in the wilderness and setting up a tent.
So…
 This altered my teaching profoundly. As much
  as I dislike ETS, I have learned that it is my job
  as a teacher to prepare my students to beat the
  gate. It is also my job to speak up against ETS
  and its biases. I can simultaneously be an
  advocate for alternative assessment and
  prepare my students for the realities they will
  face. I can also let my students know that as
  voters, citizens, and students, they can influence
  state and federal laws and university practices
  that allow ETS its monopoly on assessment.
Effective Assessment Practices:
      Instructional Strategies
 Plan systematically
 Prepare students
 Administer assessments under optimal
  conditions
 Analyze results to make sure
  assessments are accurate and valid
Planning for Assessment
 Make sure assessment is in line with
 goals.
Preparing students for assessment
 Give students opportunities to practice assessment
  procedures.
 Help students with test-taking strategies.
 Give students strategies to reduce anxiety.
 When using a test, give students practice with the type
  of item that will be on the test.
The above is what your book says. Here are my 2 cents. Consider using tests for curricular
items that will be tested through state and locally mandated tests; you are using your tests
to help students practice for the BIG test. Use non-testing means of assessing everything as
well, for two reasons: (1) Tests are anxiety-producing and often require and reflect low-level
thinking; they turn struggling and bored students off from learning, whereas well-planned
alternative assessment is interesting and engaging. (2) Many students do not do well on tests,
so you really have no idea whether they are learning unless you do some kind of alternative
assessment. If they know the material but still do not perform well on standardized-style
testing, then you know you need to teach test-taking strategies.
Test-Taking Strategies
     Use time efficiently and pace yourself
     Read directions carefully
     Identify the important information in
      questions
     Understand the demands of different
      testing formats
     Find out how the questions will be scored.
Oh, boy. If only it were that simple. If you are good at taking tests, this list would affirm
and summarize what you know, but if you struggle, this list is worthless. Please look at
the PowerPoint and the Word file on test-taking strategies for better information.
Reducing Test Anxiety
 Test anxiety is a relatively stable,
  unpleasant reaction to testing situations
  that lowers performance.
 Symptoms: increased pulse rate, dry
  mouth, headache, feeling of dread and/or
  helplessness, going blank, worry about
  failure.
Reducing Test Anxiety: strategies
              for teachers
 Use criterion-referenced measures to minimize the
    competitive aspects of tests (these measure student
    against material to be learned rather than student against
    student).
   Avoid social comparisons, such as public displays of test
    scores and grades
   Increase frequency of quizzes and tests
   Discuss test content and procedures before testing
   Give clear directions, and ensure that students
    understand the test format and requirements
   Teach test-taking skills
   Use a variety of measures, including alternative
    assessments, to measure the range of students’
    understanding and skills
   Provide students with ample time to take tests
More strategies (not in your book)
 Think about: what will make it possible for this student to
  succeed (really succeed in demonstrating his/her
  knowledge, really succeed in learning how to deal with
  testing)?
 You may need to break the process of testing down for
  some students—for instance, if the proficiency test is
  mostly multiple-choice, then work with them only on
  multiple-choice (in your tests), showing them how to
  analyze the questions. They may need lots of practice in
  this area. You could create multiple-choice homework.
  This is a lot of work, but it is what is necessary for
  students to succeed.
 You may need to reteach critical information to your
  struggling students. This means figuring out how to
  manage a review of information for one or two students
  while other students work on other things.
Specific test preparation
             procedures
 Specify what will be on the test
 Give students a chance to practice test-
  like items under test-like conditions
 Establish positive expectations
Administering Tests
 Make environment comfortable and free
  from distractions
 Give precise directions about what
  students need to do and what will happen
  (especially, what early finishers should do
  to avoid distracting late finishers)
 Monitor students while they are taking the
  test
Analyzing results
 Grade tests quickly so you can give
  feedback to students as soon as possible
 Go over items that several people missed
  and reteach information that was clearly
  not understood
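A minimal sketch of the "go over items that several people missed" step, using a hypothetical answer key and student responses (not real data) to flag items with a high miss rate:

```python
# Illustrative sketch (hypothetical key and responses): flag test items many students missed.
answer_key = ["B", "D", "A", "C", "B"]

student_answers = [          # one row of answers per student
    ["B", "D", "A", "C", "B"],
    ["B", "A", "A", "C", "D"],
    ["C", "D", "A", "B", "B"],
    ["B", "A", "A", "C", "B"],
]

for item, correct in enumerate(answer_key):
    missed = sum(1 for answers in student_answers if answers[item] != correct)
    pct_missed = 100 * missed / len(student_answers)
    flag = "  <- reteach, and check the wording" if pct_missed >= 50 else ""
    print(f"Item {item + 1}: {pct_missed:.0f}% missed{flag}")
```

An item most of the class missed either points to content that needs reteaching or to a question that needs rewording before the next administration.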
Analyzing results
 Research shows that if students are told
  they did well on a test, then they will tend
  to do better the next time. Be positive.
  (This is also an argument for setting
  students up for success).
 Note on your copy of the test any
  problems with questions or wording that
  was unclear. That way you can edit the
  test for next time.
Reducing bias in testing
 Carefully wording items
 Making provisions for non-native English
  speakers
 Accommodating diversity in scoring
Carefully wording items
 Think about what the students do know and
  have experience with and use that as much as
  possible.
 For instance, you can use school experiences
  that you know students have had instead of
  home experiences that only some students have
  had. “Tanisha has three pencils and Lamar has
  two. How many pencils do they have together?”
  rather than “Bob has a DVD player, a CD player,
   and an iPod. How many pieces of equipment
  does he have?” (Okay, so I know the examples
  are lame, but hopefully you get the idea).
Making provisions for non-native
              speakers
Modify test
 Simplify test language
 Simplify test directions
 Provide visual supports
 Assess students in their native language

Modify test procedure
 Provide extra time to take test
 Allow students to use a glossary and dictionary
 Read directions aloud (in native language would be better)
 Read test aloud and clarify misunderstandings

     These provisions would also work for students with learning disabilities.
Accommodating diversity in scoring
 Multiple-trait scoring: different dimensions
  of a product are judged according to
  different criteria.
 For example, grade an essay on both
  content and grammatical conventions or
  break down a task into three important
  characteristics and score for each one.
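A small sketch of multiple-trait scoring with assumed traits and weights (the 0.6/0.2/0.2 split is purely illustrative); the point is that each dimension is rated separately and then combined, so one weak dimension does not swamp the whole grade:

```python
# Illustrative sketch (assumed traits and weights): each dimension rated on a 0-4 scale, then combined.
weights = {"content": 0.6, "organization": 0.2, "conventions": 0.2}

def combined_score(ratings: dict) -> float:
    """Weighted combination of per-trait ratings."""
    return sum(weights[trait] * rating for trait, rating in ratings.items())

# Strong ideas with weaker grammar are not swamped by the conventions score alone.
print(round(combined_score({"content": 4, "organization": 3, "conventions": 2}), 2))   # 3.4
```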
Grading and Reporting: the total
          assessment system
 What types of assessment will be used?
 How will you use traditional assessment?
 How will you use authentic assessment?
 How will you deal with homework?
 How will you deal with missed work?
 How do you represent affective dimensions
  (cooperation, effort) in your assessment plan?
 How will you report student performance (grade,
  percentage, description)?
Designing a grading system
 Your system needs to be clear and
    understandable and in line with district/school
    policies
   Your system should support learning by
    gathering frequent data on students’ learning
   Grades should be based on hard data
   Grades should be fair to all students
   You should be able to defend your system to
    parents & administrators
A comment from me…
 Just remember that evaluations not only
 tell you what is going on with your
 students but they are also potentially
 evaluations of your teaching. If everyone
 fails to answer a question correctly, then it
 is possible that you failed to teach the
 concept clearly. If everyone does well on
 an evaluation, then give yourself a pat on
 the back when you give the students a pat
 on the back.
Other assessments
 Tests and quizzes
 Alternative assessments
 Homework

You need to think about the role each of these plays in your classroom.
How constructive are these as activities? How do they contribute to your
understanding of students’ performances? How do they contribute to
your planning of lessons from day to day? How motivational are these
activities? Are any of these activities a barrier to student learning? It’s
helpful to think about your own experiences in relation to these. Under
what conditions did these forms of assessment seem to work? When did
they not work?
Assigning grades
 Grades can be motivating or demotivating
 The challenge is to figure out how to
  reduce the complexity of a student into a
  single letter.
 What grade do you give the bright student
  who puts out minimal effort?
 What grade do you give a student who
  works very hard but has difficulty learning
  the material?
My opinion…
 I think a lot about grades.
 Your grade in this class may influence your future: a good grade will help
    you to get into graduate school and a bad grade may be a barrier to your
    career goals.
   At the same time, if you don’t learn the material of this class, it’s not fair to
    the students who do learn the material for you to get a good grade.
   I try to think about what the barriers are for learning and how I can remove
    them. This is why I have tried to create materials to assist different kinds of
    learners. I have tried to make learning the easiest thing to choose to do. I
    feel if I have removed barriers as much as possible, then if a student
    chooses not to learn, I can apply the consequences of that.
   I have also tried to create assessments that give every student a chance to
    demonstrate their knowledge.
   Every year that I teach, I revamp my courses, often after considering the
    issue of grades and assessment. I’m sure this course will look different after
    a few more times of my teaching it.
   I share this with you hoping you will share this journey with me of constantly
    analyzing and trying to improve one’s teaching. It is an exciting journey and
    in all the years I have taught, I have never gotten tired of it.
Failure

  Failure is the nemesis of the self-esteem
  movement. Yet, in order to appreciate and
  understand success, we need to experience
  failure.
Failure

    Some parents cannot stand for their
    children to fail. Unfortunately, parents
    cannot control what happens to their
    adult children—and a certain amount
    of failure helps young people learn to
    cope with adult problems.
Failure is positive when…
 …students have the possibility of learning
  from it.
 …when it helps students to take
  responsibility for their own actions and
  choices.
 …when its consequences are significant
  for the student but not horribly life-
  changing.
Failure
 Remember—allowing students to
 experience and cope with small failures
 can prevent large failures later on.
  Allowing a student to deal with the
  consequences of not studying by failing an
  assignment is a lot better than giving a
  student a false message of success and then
  having the student fail something large later
  on (the Ohio Graduation Test, for example).
The work ethic




No matter what kind of work people do, they need
a work ethic.
The work ethic
 People need to know how to work without
    supervision.
   They need to know how to take initiative and
    solve problems without direction.
   They need to be trustworthy.
   They need to take responsibility for their actions.
   They need to know how to work hard.
   They need to know how to motivate themselves
    to work even when the work is not fun or
    interesting.
How schools can help students
       develop a work ethic
 It should take work to succeed in a classroom.
 The work should be do-able but it should be challenging.
  This means that students with struggles should have
  tasks that are challenging for them and students who are
  gifted should have tasks that are challenging for them.
  No one should get a free ride.
 Students should face consequences for not doing work.
 Students should be given strategies for working,
  particularly when they are struggling with self-discipline.
 When a student fails, the student should be given the
  message to try again and support for doing so. This
  support includes problem-solving techniques to figure
  out why the failure happened and what can be done to
  prevent it in the future.
Feedback
   Providing constructive feedback to students is probably one of the most useful things for
    student learning you can do as a teacher – and most likely to be appreciated by students.
   Feedback is an integral part of all teaching not just assessment. Feedback is constructive
    observations and suggestions on work done, skills acquired, emerging capabilities, ideas for
    directing further learning. Clearly these can be provided throughout a learning program.
   Feedback can be provided by peers and others not just the teachers. So, for example, in a
    lecture you could ask a question, give time for students to write a brief response (2-3 minutes)
    then discuss with their peers, before you discuss features of the question.
   Feedback is not just about weaknesses. Students will respond if you are encouraging as
    well as allowing of mistakes. Positive feedback on the capabilities they have demonstrated
    helps establish a culture where students seek feedback and take greater active ownership over
    their learning.
   Feedback is a starting point for learning – not just confirmation for the students that they
    were right or wrong. In particular, students will appreciate advice which helps them discover
    and address their weaknesses. It is best seen as a trigger for active learning and therefore
    feedback can be challenging, questioning, widening.
   Clearly timing is important to the value of feedback. It is vital to get constructive comments
    back to students as soon as possible as part of student learning tasks. Consider too, a wider
    view of feedback: 'helping my students taking stock of what and how they are learning in a
    timely manner - to help them take constructive steps forward'. It is particularly useful for
    formative assessment activities or during a drafting or development process.
   In summary, feedback is about students learning to self-evaluate, helping them to take greater
    ownership of their learning and increasing the likelihood that they will come to direct their own
    learning.

              http://www.lts.rmit.edu.au/renewal/assess/faq3.htm
Feedback

  This is the great violin teacher, Shinichi
  Suzuki. One of the revolutions in music
  teaching he created is the idea of being
  positive with children. Before correcting a
  student’s performance, Suzuki said that the
  teacher had to find something positive to
  say to the student about his or her
  performance. It’s amazing how much more
  motivated students are when their efforts
  are recognized and feedback is given with
  the clear purpose of helping the student to
  improve.
Teacher: judge or advocate?




The right to assess can either be used as a means of judgment or it can be
used as a means to help students to improve their skills and abilities. Do
your best to be an advocate for students—to use your skills and abilities to
help them to learn and to grow.
Grading
 Norm-referenced grading: assessment of
  students’ achievement in relation to one another.
 Criterion-referenced grading: assessment of
  each student’s mastery of the course objectives.
 Grading on the curve: norm-referenced grading
  that compares students’ performance to an
  average level. (This is one version of norm-
  referenced grading).
Norm-referenced grading

Grading on a curve assumes that a classroom is going to be like a large population: some people will do well, some people will fail, and most people will do average.

[Figure: a bell curve with grade bands labeled, from left to right, F, D, C, B, A]

Norm-referenced grading (of any variety) assumes that only a certain percentage of the class will do well. A's (and the positive message that accompanies a high grade) are a limited commodity.
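As a rough illustration (the z-score cutoffs and class scores below are assumed for the example, not a standard), grading on a curve converts each score to its distance from the class average, so the letter depends on classmates rather than on mastery:

```python
# Illustrative sketch (assumed cutoffs and scores): norm-referenced grading on a curve.
from statistics import mean, stdev

scores = [58, 64, 70, 72, 75, 78, 81, 85, 92]   # hypothetical class scores
mu, sigma = mean(scores), stdev(scores)

def curved_grade(score: float) -> str:
    z = (score - mu) / sigma        # distance from the class average, in standard deviations
    if z >= 1.5:
        return "A"
    if z >= 0.5:
        return "B"
    if z >= -0.5:
        return "C"
    if z >= -1.5:
        return "D"
    return "F"

for s in scores:
    print(s, curved_grade(s))
# The same raw score could earn a different letter in a stronger or weaker class.
```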
Norm-referenced grading




But if we are to "Leave No Child Behind" and our grading system
is based on a certain percentage of the class failing, then isn’t there
something wrong here?
Criterion-referenced grading
 The answer is to grade on mastery of the
  material rather than comparisons of one
  student to another.
 In this class, if you master the material,
  you can receive an A. If all of you master
  the material, then I will be really happy.
Percentage grading
 System of converting class performances to
  percentage scores and assigning grades based
  on predetermined cutoff points.
 This is the “90-100% is an A” type grading
  system.
 The main problem is: which 80% of math is it
  okay to learn and which 20% is it not okay to
  learn? What is 80% of 8th grade math? This
  system does not lead to mastery of the material
  except for the folks who get A’s.
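A minimal sketch of the predetermined-cutoff idea described above (the cutoffs are the familiar 90/80/70/60 scale, assumed here only for illustration):

```python
# Illustrative sketch: percentage grading with predetermined cutoff points (90/80/70/60 assumed).
cutoffs = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]

def percentage_grade(points_earned: float, points_possible: float) -> str:
    percent = 100 * points_earned / points_possible
    for cutoff, letter in cutoffs:
        if percent >= cutoff:
            return letter
    return "F"

print(percentage_grade(43, 50))   # 86% earns a "B" no matter how classmates performed
```

Unlike grading on a curve, the letter here depends only on the student's own percentage, which is why the question of which material it is acceptable to miss becomes the real issue.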
Contract system
 System in which each student works for a
  particular grade according to agreed-upon
  standards.
 In this system, the student has a lot of control.
  You have to, as a teacher, figure out how to
  create a contract you can live with. This
  includes clearly-understood quality measures so
  you don’t end up giving high grades that are not
  really deserved.
 Rubrics can help you and students to deal with
  quality of work issues.
Revise option
 In a contract system, the chance to revise
  and improve work.
 Actually, this is a good policy to have. If a
  student writes a poor paper and has a
  chance to rewrite, a valuable learning
  process can take place. This is part of
  mastery learning.
Grading on effort and improvement
 When you include effort and improvement
 in your grading system, you can challenge
 all students. A very bright student who
 doesn’t put much effort into anything
 should not get a stellar grade—this
 experience does not help the student to
 develop a work ethic.
Individual learning expectation
 Personal average score.
 One way to give points for improvement is
 to award extra points to students who do
 better on a test or homework than their
 personal average.
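A small sketch of the improvement-points idea, assuming a flat five-point bonus (the bonus size is an assumption, not a rule from the book) whenever a student beats his or her own personal average:

```python
# Illustrative sketch (assumed five-point bonus): improvement points against a personal average.
from statistics import mean

def improvement_points(past_scores, new_score, bonus=5):
    """Award bonus points when the new score beats the student's own personal average."""
    return bonus if new_score > mean(past_scores) else 0

print(improvement_points([62, 68, 65], new_score=74))   # 5: the student beat a personal average of 65
print(improvement_points([95, 92, 97], new_score=90))   # 0: high scorer, but below the personal average
```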
Halo effect


Did you ever notice how some kids, no matter what they did, NEVER got in trouble? Other kids, no matter how they behaved, were always in trouble. This is the halo effect: the tendency for a general impression of a person to influence our perception of any aspect of that person.

The halo effect leads to unjustified grading practices. Watch out for it.
Grade pollution


If your directions weren't clear and a bunch of students therefore did badly on a project, that's grade pollution. If a student had poor reading skills and your math test involved a lot of reading, that's grade pollution. Anything that "muddies" the relationship between the student's learning and the grade is pollution.

Stop Grade Pollution!!! You don't like it and neither will your students.
Communicating with parents

           Don’t wait until something bad happens
           to get in touch with parents. Try to find
           reasons to make a positive phone call
           for each student.
Communicating with parents

             Document why you are giving the
             grades you are giving. Your
             documentation is your justification.
             Be sure you document behavioral
             information that leads to your grades
             for deportment.
Documentation: A Reminder
 When you document a student’s behavior,
  describe the behavior as exactly as possible,
  using non-judgmental terms.
 “Sam was running in circles around the
  classroom, shouting obscenities, and waving his
  arms” is far more specific than “Sam was acting
  crazy.” “Crazy” is a judgment on your part. A
  parent will be more likely to accept the first
  description because it is specific and
  nonjudgmental.
Communicating with parents
Most of the time, if you have made an effort to communicate with parents when positive things happen, if you demonstrate that you care about each student, and if you share not only what your grades for a student are but WHY (including showing examples of student work), parents will accept your assessment and will work with you to help the student improve.
Communicating with parents
 Once in a while, you get a parent who is not going to be
  reasonable. This person may have mental health issues going on,
  may be overly involved in his/her student's life, may be under a
  lot of psychological stress for some reason or other, may have had
  negative experiences with school himself or herself, or just may
  be an unpleasant person in general.
Communicating with parents
 If you have a negative conference with a parent,
  immediately tell your supervisor (before the
  parent can call your supervisor). Ask your
  supervisor to help you plan future conferences—
  you may want to have someone from the school
  administration present for future conferences.
  Keep careful documentation on this student so
  you can let your supervisor know what you are
  doing with the student and why.
Technology and assessment
 Technology can be used in planning and
  constructing tests.
 Test software can help you to analyze test
  results.
 Software is helpful for maintaining student
  records.
Technology and authentic
              assessment
 Technology holds great promise for authentic
  assessment.
 With scanning equipment, students can create and keep
  their materials on electronic media (CDs). They can
  also create materials using the vast possibilities on the
  computer (PowerPoint, Movie Maker, etc.).
 Because of the ease of storage, student portfolios could
  potentially span several years of their schooling.
  Samples of student work across multiple years not only
  demonstrate student progress but they also help to
  reveal the areas of interest a student has developed and
  maintained across the years.

 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
Gardella_PRCampaignConclusion Pitch Letter
Gardella_PRCampaignConclusion Pitch LetterGardella_PRCampaignConclusion Pitch Letter
Gardella_PRCampaignConclusion Pitch Letter
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
 

Classroom assessment

  • 1. Classroom assessment  All the processes involved in making decisions about students’ learning progress.  It’s more than just grades and tests!!
  • 2. The role of assessment There is a cycle of planning, teaching, and assessing that insures students are learning. Plan Teach This is the cycle around which I built Assess my Educational Psychology schema (back in session 1)
  • 3. Formative and summative evaluation  Formative evaluation: occurs before or during instruction and is used to provide feedback to students and to monitor their growth.  Summative evaluation: occurs after instruction on a given topic and is used for grading purposes.
  • 4. How to remember formative vs. summative  Summative happens at the end—like a summary.  Formative happens while instruction is taking place (while students’ understanding is still forming).
  • 5. Types of Formative Evaluation  Pretest: formative test for assessing students’ knowledge, readiness, and abilities.  Diagnostic test: formative test to determine students’ areas of weakness.
  • 6. Why use formative assessment?  If you want your lessons to be in students’ “zone of proximal development,” then you need to know what they know (pre-assess or test) or where they are misunderstanding (diagnostic test).
  • 7. Why use formative assessment?  If you want to know the effect of a lesson or series of lessons, then you can do a pre-test, teach the lessons, and then do a post-test in order to find out how well you taught this group of students.
  • 8. Formative assessment  Remember that assessment is not just tests, so you can do a “pre-test” that is not really a test. For example, music teachers can have students sight-read a new piece of music as a “pre-test.” Math teachers can have students try out a worksheet as a “pre-test.” A KWL (a collaborative chart where students tell “what they Know, what they Want to learn, and what they have Learned) can be a formative assessment.
  • 9. Measurement and Evaluation  Measurement: the process of gathering information about learning.  Evaluation: the process of making decisions on the basis of measurements. These contribute to the planning phase of the assessment cycle.
  • 10. Types of measurement  Informal measurement is the process of gathering incidental information about learning progress during learning activities and discussions with students.  Formal measurement is the process of systematically gathering information about learning progress. Good teachers are always “kid watching”—seeing what is going on with individual students as they participate in classroom activities. These teachers often make notes of what they observe and use this information not only in lesson planning but also in parent-teacher conferences.
  • 11. Systematic assessment  Students can get lost in the cracks. A quiet student sitting in the back may not attract your attention.  You need to know what is going on with each student in your classroom; this is your responsibility as a teacher.  You need to systematically collect assessment information on every student.
  • 12. Validity  This is an indicator of the extent to which an assessment actually measures what it is supposed to measure.  Is the measure aligned with your goals as a teacher?  Is the measure aligned with what happened in the classroom?  Is the measure based on students’ abilities rather than on personality or likability?
  • 13. Examples of invalid assessment  Students have been learning music in major keys. The teacher tests their sight reading with a piece in a minor key.  Inner city students have been learning about writing. The teacher gives them a writing prompt about “a day on a farm.”  Students have been using manipulatives. The teacher gives them a test where they cannot use these aids.  Students with learning disabilities (who have difficulty reading and writing) have been learning about the Civil War. The teacher gives them a written test. If you were the teacher, how would you improve the validity in each of these situations?
  • 14. Reliability  A description of the extent to which measurements are consistent and free from errors of measurement.  A bathroom scale is useless if it is not reliable.  So are the measurements we use in schools. Remember, reliability is not everything. A test of toe length could be reliable but not particularly useful in the education world.
  • 15. Reliability on teacher-made assessments  Use a sufficient number of items or tasks and look for consistency of student performance across tasks.  Ensure that directions are clear so students know what is expected of them.  Identify specific criteria to evaluate students’ performance.  Administer assessments in similar ways for all students.
  • 16. Traditional Assessment Strategies  Patterns of assessment  Constructing tests
  • 17. Getting the Most from Traditional Assessment  Planning for Testing  Objective Testing: multiple-choice, matching, true/false, short answer, and fill-in tests; scoring answers does not require interpretation.  Essay Testing
  • 18. Planning for testing  When?  Frequently  Soon after material is learned  Cumulative
  • 19. Judging textbook tests  Identify your objective first and then see if the textbook test helps.  Feel free to adapt to the realities of your class.  Tests from the text are less valid if you make multiple adaptations.  Make sure what gets taught matches what gets tested.
  • 20. Making tests  Each item on a test should be valid— students who know the information should get the item right and students who do not know the information should get it wrong.  In the next few slides, we will discuss some of your favorite types of test items: multiple choice, matching, true/false, completion, essays, and rubrics.
  • 21. Multiple choice  Multiple-choice format is a measurement format that consists of a question or statement, called a stem, and a series of answer choices. The individual responding to the items chooses the correct or best answer.  Distracters: the incorrect alternatives, so-called because they are designed to distract students who don’t understand the content being measured in the item.  Stem: the question part of a multiple-choice item.
  • 22. Stem and distractors  Stem: should pose one question or problem.  Distractors: should reflect common misconceptions and should be grammatically consistent with the stem—otherwise the grammar will give away the correct answer.  You can assess higher-level thinking with multiple-choice questions.
  • 23. Matching  Matching format: a measurement format that requires learners to classify a series of examples using the same alternatives.  Content should be homogeneous (all material of the same type).  Use more statements than alternatives so students can’t use elimination to get items right.  Let students know that alternatives can be used more than once or not at all (keeps students from guessing instead of knowing).  Keep the material all on one page—if you have material for more than one page, break it up into two different groups.  Don’t overload students’ working memories with excessively long matching items—if you have more than ten possibilities, break the item into two.
  • 24. True-false  True-false format is a measurement format that includes statements of varying complexity that learners have to judge as being correct or incorrect.  Don’t put both a true fact and a false fact in the same item.  “Most” is a hint that the item is true.  “Never” and “always” are hints that the item is false.  Negative wording can be confusing.
  • 25. Completion  Completion format is a measurement format that includes a question or an incomplete statement that requires the learner to supply appropriate words, numbers, or symbols.  It is very difficult to create completion items where only one answer is correct.  These items usually measure low-level forms of thinking.
  • 26. Essay  Essay format is a measurement format that requires students to make extended written responses to questions or problems.  They assess creative and critical thinking  They measure progress students make in creating and defending thesis statements.  They change how students study and learn.  Scoring them is a challenge.  They can be ambiguous.
  • 27. Rubrics  A rubric is a scoring scale that describes the criteria for grading.  Rubrics help students to plan the material that will be assessed.  Establish criteria based on essential elements that must be present in students’ work.  Decide on number of levels of achievement for each criterion.  Develop clear descriptors for each level of achievement.  Determine a rating scale for entire rubric.
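To make those steps concrete, here is a minimal sketch (my own illustration, not from the slides) of how a rubric's criteria, levels of achievement, and point values can be combined into a score. The criterion names, level labels, and point values are invented for illustration; a real rubric would also carry the written descriptors for each level.

# A minimal sketch of scoring with a rubric. All criterion names, level
# labels, and point values are made up for illustration.
RUBRIC = {
    # criterion: points awarded for each level of achievement
    "thesis":       {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4},
    "evidence":     {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4},
    "organization": {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4},
}

def score_with_rubric(ratings):
    """ratings maps each criterion to the level the teacher observed."""
    total = sum(RUBRIC[criterion][level] for criterion, level in ratings.items())
    maximum = sum(max(levels.values()) for levels in RUBRIC.values())
    return total, maximum

# Example: one student's essay rated on the three criteria above.
points, out_of = score_with_rubric(
    {"thesis": "proficient", "evidence": "developing", "organization": "exemplary"}
)
print(f"{points} out of {out_of}")   # 9 out of 12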
  • 28. Commercially Prepared Test Items  Carefully examine your goals before using them.  Select those items that are consistent with your goals  Revise items that need improvement  Create additional items that help you accurately measure your students’ understanding.
  • 29. Alternative Assessment  Direct examination of student performance on significant tasks that are relevant to life outside of school.  In other words: the assessment task is similar to or the same as tasks students would be doing in the “real world” (as opposed to using a multiple-choice test where the assessment process differs significantly from real-world performance).  Performance Assessment: any form of assessment that requires students to carry out an activity or produce a product in order to demonstrate learning.  Portfolios  Authentic Assessment: assessment procedures that test skills and abilities as they would be applied in real-life situations.
  • 30. Performance Assessment  A form of assessment in which students demonstrate their knowledge and skill by carrying out an activity or producing a product.  Specify the type of performance.  Select the focus.  Structure the evaluation setting—making it realistic but practical.  Design evaluation procedures
  • 31. Examples of Performance Assessment  Students find the area of a classroom in order to figure out how much carpet will be needed.  Students write persuasive letters to the editor regarding an issue of concern to them.  Students perform experiments to figure out a science problem. Pay attention to and assess not just the product the student produces but also the process by which students create the product.
  • 32. Evaluating Performance Assessment Products/Processes  Use systematic observation, the process of specifying criteria for acceptable performance on an activity and taking notes during observation based on the criteria.  Checklists are written descriptions of dimensions that must be present in an acceptable performance of an activity. These help you to keep track of student performance.  Rating scales are written descriptions of the evaluative dimensions of an acceptable performance of an activity and scales of values on which each dimension is rated.
  • 33. Portfolios  A collection of the student’s work in an area, showing growth, self-reflection, and achievement.  Portfolio assessment usually involves student goal setting and showing how the items in the portfolio demonstrate the achievement of goals.  Portfolios provide an ongoing idea of student work because they are a collection of items completed across several weeks.
  • 34. Using Portfolios  Embed portfolios in instructional practice. Refer to them frequently as you teach.  Provide examples of portfolios when introducing them to students.  Involve students in the selection and evaluation of their work.  Require students to provide an overview of each portfolio, a rationale for the inclusion of individual works, criteria they used to evaluate, and a summary of progress.  Provide students with frequent and detailed feedback about their decisions.
  • 35. Portfolios across the curriculum  Portfolios began in art. Professional freelance artists develop portfolios of their work to show potential employers.  The first book on the use of portfolios (Tierney, Carter, and Desai, 1991, Portfolios in the Reading/Writing Classroom) focused on the use of portfolios to show students’ literacy development.  Portfolios can also be used in other subjects. Students can collect a variety of artifacts to demonstrate their developing knowledge and skills.  With technology, portfolios can be digitized and they can feature a wide range of formats.  Where possible (e.g., in music programs where the teacher has the same students across several years) it would be very useful to have portfolios that span several years and show student improvement.
  • 36. Exhibition  A performance test or demonstration of learning that is public and usually takes an extended time to prepare.  Examples: a music recital, an art exhibit, a project that is presented to a class.
  • 37. Evaluating alternative assessments  Use a scoring rubric: rules that are used to determine the quality of a student performance.  Students can contribute to the rubric. When they do so, they often “buy into” the process of assessment.  Assessment can also include a student component—students should learn to assess themselves.
  • 38. Reliability, validity, generalizability  Alternative assessments may not be reliable—two different teachers may give a portfolio very different grades.  Even validity may be a problem—there’s not a lot of research that is helpful yet.
  • 39. Diversity and equity Professional orchestras used to be all male (except for the harpist, for some odd reason). In order to counter this, orchestras began having blind auditions—potential members tried out for the orchestra behind a screen. This practice has led to a greater amount of diversity in orchestras.
  • 40. Diversity and equity  In terms of exhibition-type assessments, subtle “-isms” can creep in when it is impossible to judge work apart from the person who does the work. It is important to keep this in mind when making judgments on students’ work.  Also, it is important to remember what students have access to at home in terms of the ability to work. Some students have a lot of support at home for their schoolwork (a computer, a private space for working, social support for doing homework, etc.) and other students do not. If there is a lot of out-of-class work that goes into a portfolio or an exhibition, one needs to keep these potential inequities in mind.
  • 41. Informal assessments  Ungraded (formative) assessments that gather information from multiple sources to help teachers make decisions.  Good teachers are constantly observing students and making informal assessments of student understanding.  Other ways to do informal assessment: student journals, “exit tickets” where students answer a question in writing before they leave the classroom, checklists, rating scales, etc.
  • 42. Self-assessment  Alternative assessment can be just as dumb and alienating as testing if it is carried out in the wrong spirit.  Most of the proponents of alternative assessment call for student involvement in assessment.
  • 43. Self-assessment  Truly, involving students makes sense; in the world of work, we often have to assess ourselves. This is part of taking initiative. We can’t simply wait for the annual review.
  • 44. Student-Led Conferences: A Growing Trend  For years parent-teacher conferences have been the primary means of parent-teacher communication. But now, many schools are trying something new -- student-led conferences that communicate not only how a student's doing but also why.  Parent-teacher conferences -- we all know how they go. Parents troop into classrooms to talk with teachers about their children's progress in school. Often, the process feels rushed, and parents leave feeling vaguely dissatisfied, as if they didn't really get what they came for.  For years that process has been the norm, but now it is changing. In more and more schools, students are leading conferences, and, overall, the word is that they're doing a fine job. http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 45. Student-Led Conferences: A Growing Trend  Many teachers themselves speak enthusiastically of the advantages of student-led conferences over teacher-led ones. "We found the [student-led] conferences most beneficial," said Keith Eddinger of the Marcus Whitman Middle School in Rushville, New York. "From a teacher's perspective, we were able to get a better picture of each child. It forced us to sit down with each student and review strengths and weaknesses. This conversation often told us the students learned more than perhaps we had measured through conventional assessments."  Eddinger added, "Our post-conference reviews with parents and students were overwhelmingly positive."  John Osgood, of C. L. Jones Middle School in Minden, Nebraska, found that "comments [about student-led conferences] from parents and board members were very positive." http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 46. Student-Led Conferences: A Growing Trend  Another staff member, Dick Philips, said, "Most parents listened to their child. It was interesting listening to [children] explain low grades to their parents. It did open the lines of communication."  "Several parents really liked it because it gave them an opportunity to see their child's work," said Sue Yant, another staff member. Yet "some [parents] said they hoped we [would hold] the traditional conference once a year." http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 47. Student-Led Conferences: A Growing Trend  STUDENT PREPARATION "The format is important, but I believe the success of a student-led conference is most determined by how well students are prepared," wrote Laura Hayden, a seventh-grade communications teacher at Derby Middle School in Derby, Kansas, in Letting Students Lead Parent Conferences, an article published by the National Association of Elementary School Principals in Middle Matters.  The conference format at Hayden's school had students show parents some of their work and explain their grades in a student-led conference. Each team could conduct conferences a bit differently. Hayden's team used an open house arrangement in which students and parents visited all team members' classrooms, but other teams held the entire conference in one classroom.  The significance of format aside, Hayden focused her students on preparation. At the beginning of the school year, she had students set up a binder to contain a portfolio as well as graded work. She explained that students had to keep their binders orderly because they would use them to lead their conferences. http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 48. Student-Led Conferences: A Growing Trend  A week before the conferences, Hayden's team sent home a letter informing parents of the conference and the fact that their child would lead it. About three days before conferences, she had students prepare portfolios of their work to date, including a special project, a quiz, a homework assignment, and one assignment from which they felt they had learned the most. Students also wrote a reflection on their grades and study habits. They set goals for the next semester and organized their graded work section.  The day before conferences, teachers role-played, pretending to be the student, with the student playing the teacher or the parent. Teachers modeled, for example, how to explain a poor grade to parents, and they gave students a checklist of what to cover in the conference. http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 49. Student-Led Conferences: A Growing Trend  STUDENT RESPONSIBILITY  "The preparatory time is worth it," Hayden wrote, "especially when you hear a struggling student explaining what he or she learned from an assignment and taking responsibility for the score he or she achieved."  "[Students] need to understand that they are in control of their own efforts to learn the material," said Barbara Rommel, superintendent of the David Douglas School District in Oregon. (Source: "New Method Puts Student in Charge," an article published in the Oregonian newspaper.)  The Oregon Educational Act for the 21st Century requires students to meet higher standards. By having students assess how they are progressing toward those standards, educators say, students will know how far they've come and how far they have to go to meet the standards.  "It helps them accept responsibility for their learning," said Patti Kinney, principal of Talent Middle School in Oregon.  "I like being able to tell my side of the story," Josh Whitney-Wise of Milwaukie, Oregon's, McLoughlin Middle School told the Oregonian. http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 50. Student-Led Conferences: A Growing Trend  ADVANTAGES AND DISADVANTAGES  Educators acknowledge that there are disadvantages as well as advantages to student-led conferences. Although parent attendance seems higher for student-led conferences than for teacher-led ones, a parent's failure to attend a student-led conference leads to a great deal of disappointment for a student who has worked hard to prepare.  Another disadvantage is that some parents want to spend more time with their child's teacher, receiving his or her viewpoint. Nearly all schools with student-led conferences will let parents make separate appointments to confer with teachers.  For the most part, parents support the concept of student-led conferences, though some support them with slight reservations. "My daughter was in a class that did student-led conferences a couple of years ago," said one parent of a child at Jones Middle School. "I think the object was to make the child feel a part of the whole process, to get them in tune with their own progress. … As a parent, I felt like I still needed some info from the teachers and wanted more. … But I do think the student gets a new perspective on their grades. … Personally, I don't think it would be good to do this often, but once a year is good. When you ask if they were 'beneficial,' I can say yes and no. They were more beneficial to the student than to the parent." http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 51. Student-Led Conferences: A Growing Trend  But the advantages, say most teachers who have participated in student-led conferences, outweigh the downside. Student accountability is mentioned again and again by educators as a plus for student-led conferences. Another plus is the way even a struggling student can produce something positive for a conference, an art project or an essay, perhaps, that wouldn't show up in a report card grade. Overall, talks with educators indicate, student-led conferences are a growing trend. http://www.educationworld.com/a_admin/admin/admin112.shtml
  • 52. Traditional vs. alternative assessments  Traditional: tends to be low-level (memorization rather than thinking); may not apply to the “real world”; measures outcome rather than process; based on behaviorism; easier to implement; reliable; does not involve students.  Alternative: tends to be higher level; has real-world application; reveals process as well as product; based on constructivism; harder to implement; less reliable; should involve students in goal setting and assessment.
  • 53. What to do?  Students have to learn to deal with standardized assessment, as that is the gatekeeper to college and many careers. (How many standardized tests have you taken across your school experiences? How many more will you have to take to achieve your educational and career goals?)  Yet students also need opportunities to demonstrate skills and abilities that multiple choice testing fails to assess.  Good teachers make judicious use of both forms of assessment.
  • 54. A personal take on standardized assessment  When I applied to get a Ph.D., the department to which I applied required the GRE. At the time, I was working for a leader in the area of alternative assessment and I therefore was very much aware of the criticisms of standardized testing. I decided that I needed to take a stand on the GRE.
  • 55. So…  I went to the library and in about 30 minutes I had a small array of research articles that showed that the GRE is biased and that the Educational Testing Service has a checkered past (it was founded by eugenicists—people who shared some of Hitler’s ideas about race).  I wrote a letter to the department that stated that it was against my principles to support something that was biased and to pay money to ETS. I said that I would be willing to provide a portfolio of my writing for them to assess.
  • 56. What happened next…  The department wrote back and said that they would accept my principled stand against ETS and would review my materials without the GRE. They also said that they would also consider waiving the GRE requirement for any other potential student who had the same concerns (this was OSU, Dept. of English, in case you are interested in testing this out).
  • 57. But then…  All during my teaching, I refused to have anything to do with standardized testing and the kind of teaching that supports it. My idea of being a teacher who made a difference in people’s lives was through using whole language and other student-centered approaches.  Then I volunteered for a year at an African American Christian school, where the point of the school was to make differences in students’ lives. The principal there explained that since the Proficiency Test in Ohio was a gatekeeper, one that has been biased against her students, she felt the best thing to do was to prepare students as best she could to beat the gate. An example of bias on the Ohio proficiency test: a few years ago, one of the writing prompts was about taking a camping trip. Urban teachers were frustrated—many of their students had never traveled more than a few miles from home, much less had the experience of being in the wilderness and setting up a tent.
  • 58. So…  This altered my teaching profoundly. As much as I dislike ETS, I have learned that it is my job as a teacher to prepare my students to beat the gate. It is also my job to speak up against ETS and its biases. I can simultaneously be an advocate for alternative assessment and prepare my students for the realities they will face. I can also let my students know that as voters, citizens, and students, they can influence state and federal laws and university practices that allow ETS its monopoly on assessment.
  • 59. Effective Assessment Practices: Instructional Strategies  Plan systematically  Prepare students  Administer assessments under optimal conditions  Analyze results to make sure assessments are accurate and valid
  • 60. Planning for Assessment  Make sure assessment is in line with goals.
  • 61. Preparing students for assessment  Give students opportunities to practice assessment procedures.  Help students with test-taking strategies.  Give students strategies to reduce anxiety.  When using a test, give students practice with the type of item that will be on the test. The above is what your book says. Here are my 2 cents. Consider using tests for curricular items that will be tested through state and locally-mandated tests. You are using your tests to help students practice for the BIG test. Use non-testing means of assessing everything, as well, for two reasons: 1. tests are anxiety-producing and often require and reflect low-level thinking. They turn struggling and bored students off from learning. Well-planned alternative assessment is interesting and engaging. 2. many students do not do well on tests, so you really have no idea if they are learning or not unless you do some kind of alternative assessment. If they know the material but still do not perform well on standardized-style testing, then you know you need to teach test-taking strategies.
  • 62. Test-Taking Strategies  Use time efficiently and pace yourself  Read directions carefully  Identify the important information in questions  Understand the demands of different testing formats  Find out how the questions will be scored. Oh, boy. If only it were that simple. If you are good at taking tests, this list would affirm and summarize what you know, but if you struggle, this list is worthless. Please look at the powerpoint and the word file on test taking strategies for better information.
  • 63. Reducing Test Anxiety  Test anxiety is a relatively stable, unpleasant reaction to testing situations that lowers performance.  Symptoms: increased pulse rate, dry mouth, headache, feeling of dread and/or helplessness, going blank, worry about failure.
  • 64. Reducing Test Anxiety: strategies for teachers  Use criterion-referenced measures to minimize the competitive aspects of tests (these measure student against material to be learned rather than student against student).  Avoid social comparisons, such as public displays of test scores and grades  Increase frequency of quizzes and tests  Discuss test content and procedures before testing  Give clear directions, and ensure that students understand the test format and requirements  Teach test-taking skills  Use a variety of measures, including alternative assessments, to measure the range of students’ understanding and skills  Provide students with ample time to take tests
  • 65. More strategies (not in your book)  Think about: what will make it possible for this student to succeed (really succeed in demonstrating his/her knowledge, really succeed in learning how to deal with testing)?  You may need to break the process of testing down for some students—for instance, if the proficiency test is mostly multiple-choice, then work with them only on multiple-choice (in your tests), showing them how to analyze the questions. They may need lots of practice in this area. You could create multiple-choice homework. This is a lot of work, but it is what is necessary for students to succeed.  You may need to reteach critical information to your struggling students. This means figuring out how to manage a review of information for one or two students while other students work on other things.
  • 66. Specific test preparation procedures  Specify what will be on the test  Give students a chance to practice test-like items under test-like conditions  Establish positive expectations
  • 67. Administering Tests  Make environment comfortable and free from distractions  Give precise directions about what students need to do and what will happen (especially, what early finishers should do to avoid distracting late finishers)  Monitor students while they are taking the test
  • 68. Analyzing results  Grade tests quickly so you can give feedback to students as soon as possible  Go over items that several people missed and reteach information that was clearly not understood
  • 69. Analyzing results  Research shows that if students are told they did well on a test, then they will tend to do better the next time. Be positive. (This is also an argument for setting students up for success).  Note on your copy of the test any problems with questions or wording that was unclear. That way you can edit the test for next time.
  • 70. Reducing bias in testing  Carefully wording items  Making provisions for non-native English speakers  Accommodating diversity in scoring
  • 71. Carefully wording items  Think about what the students do know and have experience with and use that as much as possible.  For instance, you can use school experiences that you know students have had instead of home experiences that only some students have had. “Tanisha has three pencils and Lamar has two. How many pencils do they have together?” rather than “Bob has a DVD player, a CD player, and an iPod. How many pieces of equipment does he have?” (Okay, so I know the examples are lame, but hopefully you get the idea).
  • 72. Making provisions for non-native speakers  Modify the test: simplify test language; simplify test directions; provide visual supports; assess students in their native language.  Modify the test procedure: provide extra time to take the test; allow students to use a glossary and dictionary; read directions aloud (in the native language would be better); read the test aloud and clarify misunderstandings. These provisions would also work for students with learning disabilities.
  • 73. Accommodating diversity in scoring  Multiple-trait scoring: different dimensions of a product are judged according to different criteria.  For example, grade an essay on both content and grammatical conventions or break down a task into three important characteristics and score for each one.
  • 74. Grading and Reporting: the total assessment system  What types of assessment will be used?  How will you use traditional assessment?  How will you use authentic assessment?  How will you deal with homework?  How will you deal with missed work?  How do you represent affective dimensions (cooperation, effort) in your assessment plan?  How will you report student performance (grade, percentage, description)?
  • 75. Designing a grading system  Your system needs to be clear and understandable and in line with district/school policies  Your system should support learning by gathering frequent data on students’ learning  Grades should be based on hard data  Grades should be fair to all students  You should be able to defend your system to parents & administrators
  • 76. A comment from me…  Just remember that evaluations not only tell you what is going on with your students but they are also potentially evaluations of your teaching. If everyone fails to answer a question correctly, then it is possible that you failed to teach the concept clearly. If everyone does well on an evaluation, then give yourself a pat on the back when you give the students a pat on the back.
  • 77. Other assessments  Tests and quizzes  Alternative assessments  Homework You need to think about the role each of these plays in your classroom. How constructive are these as activities? How do they contribute to your understanding of students’ performances? How do they contribute to your planning of lessons from day to day? How motivational are these activities? Are any of these activities a barrier to student learning? It’s helpful to think about your own experiences in relation to these. Under what conditions did these forms of assessment seem to work? When did they not work?
  • 78. Assigning grades  Grades can be motivating or demotivating  The challenge is to figure out how to reduce the complexity of a student into a single letter.  What grade do you give the bright student who puts out minimal effort?  What grade do you give a student who works very hard but has difficulty learning the material?
  • 79. My opinion…  I think a lot about grades.  Your grade in this class may influence your future: a good grade will help you to get into graduate school and a bad grade may be a barrier to your career goals.  At the same time, if you don’t learn the material of this class, it’s not fair to the students who do learn the material for you to get a good grade.  I try to think about what the barriers are for learning and how I can remove them. This is why I have tried to create materials to assist different kinds of learners. I have tried to make learning the easiest thing to choose to do. I feel if I have removed barriers as much as possible, then if a student chooses not to learn, I can apply the consequences of that.  I have also tried to create assessments that give every student a chance to demonstrate their knowledge.  Every year that I teach, I revamp my courses, often after considering the issue of grades and assessment. I’m sure this course will look different after a few more times of my teaching it.  I share this with you hoping you will share this journey with me of constantly analyzing and trying to improve one’s teaching. It is an exciting journey and in all the years I have taught, I have never gotten tired of it.
  • 80. Failure Failure is the nemesis of the self-esteem movement. Yet, in order to appreciate and understand success, we need to experience failure.
  • 81. Failure Some parents cannot stand for their children to fail. Unfortunately, parents cannot control what happens to their adult children—and a certain amount of failure helps young people learn to cope with adult problems.
  • 82. Failure is positive when…  …students have the possibility of learning from it.  …when it helps students to take responsibility for their own actions and choices.  …when its consequences are significant for the student but not horribly life- changing.
  • 83. Failure  Remember—allowing students to experience and cope with small failures can prevent large failures later on. Allowing a student to deal with the consequences of not studying by failing an assignment is a lot better than giving a student a false message of success and then the student fails something large later on (Ohio Graduation Test, for example).
  • 84. The work ethic No matter what kind of work people do, they need a work ethic.
  • 85. The work ethic  People need to know how to work without supervision.  They need to know how to take initiative and solve problems without direction.  They need to be trustworthy.  They need to take responsibility for their actions.  They need to know how to work hard.  They need to know how to motivate themselves to work even when the work is not fun or interesting.
  • 86. How schools can help students develop a work ethic  It should take work to succeed in a classroom.  The work should be do-able but it should be challenging. This means that students with struggles should have tasks that are challenging for them and students who are gifted should have tasks that are challenging for them. No one should get a free ride.  Students should face consequences for not doing work.  Students should be given strategies for working, particularly when they are struggling with self-discipline.  When a student fails, the student should be given the message to try again and support for doing so. This support includes problem-solving techniques to figure out why the failure happened and what can be done to prevent it in the future.
  • 87. Feedback  Providing constructive feedback to students is probably one of the most useful things for student learning you can do as a teacher – and most likely to be appreciated by students.  Feedback is an integral part of all teaching not just assessment. Feedback is constructive observations and suggestions on work done, skills acquired, emerging capabilities, ideas for directing further learning. Clearly these can be provided throughout a learning program.  Feedback can be provided by peers and others not just the teachers. So, for example, in a lecture you could ask a question, give time for students to write a brief response (2-3 minutes) then discuss with their peers, before you discuss features of the question.  Feedback is not just about weaknesses. Students will respond if you are encouraging as well as allowing of mistakes. Positive feedback on the capabilities they have demonstrated helps establish a culture where students seek feedback and take greater active ownership over their learning.  Feedback is a starting point for learning – not just confirmation for the students that they were right or wrong. In particular, students will appreciate advice which helps them discover and address their weaknesses. It is best seen as a trigger for active learning and therefore feedback can be challenging, questioning, widening.  Clearly timing is important to the value of feedback. It is vital to get constructive comments back to students as soon as possible as part of student learning tasks. Consider too, a wider view of feedback: 'helping my students taking stock of what and how they are learning in a timely manner - to help them take constructive steps forward'. It is particularly useful for formative assessment activities or during a drafting or development process.  In summary, feedback is about students learning to self-evaluate, helping them to take greater ownership of their learning and increasing the likelihood that they will come to direct their own learning. http://www.lts.rmit.edu.au/renewal/assess/faq3.htm
  • 88. Feedback This is the great violin teacher, Shinichi Suzuki. One of the revolutions in music teaching he created is the idea of being positive with children. Before correcting a student’s performance, Suzuki said that the teacher had to find something positive to say to the student about his or her performance. It’s amazing how much more motivated students are when their efforts are recognized and feedback is given with the clear purpose of helping the student to improve.
  • 89. Teacher: judge or advocate? The right to assess can either be used as a means of judgment or it can be used as a means to help students to improve their skills and abilities. Do your best to be an advocate for students—to use your skills and abilities to help them to learn and to grow.
  • 90. Grading  Norm-referenced grading: assessment of students’ achievement in relation to one another.  Criterion-referenced grading: assessment of each student’s mastery of the course objectives.  Grading on the curve: norm-referenced grading that compares students’ performance to an average level. (This is one version of norm- referenced grading).
  • 91. Norm-referenced grading Grading on a curve assumes that a classroom is going to be like a large population: some people will do well, some people will fail, and most people will do average. [The slide shows a bell curve with the grades F, D, C, B, and A spread across it.] Norm-referenced grading (of any variety) assumes that only a certain percentage of the class will do well. A’s (and the positive message that accompanies a high grade) are a limited commodity.
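To see what “a limited commodity” means in practice, here is a minimal sketch (my own illustration, not from the slides) of grading on a curve: students are ranked and letter grades are handed out by fixed fractions of the class. The percentage bands and the scores are made-up assumptions, not a standard scale.

# A minimal sketch of norm-referenced "grading on the curve": grades are
# assigned by a student's rank in the class, not by mastery of criteria.
def curve_grades(scores, bands=(("A", 0.10), ("B", 0.20), ("C", 0.40), ("D", 0.20), ("F", 0.10))):
    # Rank students from highest to lowest score.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    grades, start = {}, 0
    for letter, fraction in bands:
        count = round(fraction * len(ranked))
        for name, _ in ranked[start:start + count]:
            grades[name] = letter
        start += count
    for name, _ in ranked[start:]:   # anyone left over by rounding gets the lowest grade
        grades[name] = bands[-1][0]
    return grades

scores = {f"student{i}": s for i, s in enumerate([98, 93, 90, 87, 85, 80, 78, 74, 70, 55])}
print(curve_grades(scores))   # only one A is available, no matter how well everyone did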
  • 92. Norm-referenced grading But if we are to “Leave No Child Behind” and our grading system is based on a certain percentage of the class failing, then isn’t there something wrong here?
  • 93. Criterion-referenced grading  The answer is to grade on mastery of the material rather than comparisons of one student to another.  In this class, if you master the material, you can receive an A. If all of you master the material, then I will be really happy.
  • 94. Percentage grading  System of converting class performances to percentage scores and assigning grades based on predetermined cutoff points.  This is the “90-100% is an A” type grading system.  The main problem is: which 80% of math is it okay to learn and which 20% is it not okay to learn? What is 80% of 8th grade math? This system does not lead to mastery of the material except for the folks who get A’s.
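As a quick sketch of the mechanics (using the familiar 90/80/70/60 cutoffs from the slide; any district's scale could be substituted):

# A minimal sketch of percentage grading: a score is converted to a letter
# grade by predetermined cutoff points.
CUTOFFS = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]

def percentage_grade(points_earned, points_possible):
    percent = 100 * points_earned / points_possible
    for cutoff, letter in CUTOFFS:
        if percent >= cutoff:
            return letter
    return "F"

print(percentage_grade(43, 50))   # 86% -> B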
  • 95. Contract system  System in which each student works for a particular grade according to agreed-upon standards.  In this system, the student has a lot of control. You have to, as a teacher, figure out how to create a contract you can live with. This includes clearly-understood quality measures so you don’t end up giving high grades that are not really deserved.  Rubrics can help you and students to deal with quality of work issues.
  • 96. Revise option  In a contract system, the chance to revise and improve work.  Actually, this is a good policy to have. If a student writes a poor paper and has a chance to rewrite, a valuable learning process can take place. This is part of mastery learning.
  • 97. Grading on effort and improvement  When you include effort and improvement in your grading system, you can challenge all students. A very bright student who doesn’t put much effort into anything should not get a stellar grade—this experience does not help the student to develop a work ethic.
  • 98. Individual learning expectation  Personal average score.  One way to give points for improvement is to award extra points to students who do better on a test or homework than their personal average.
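Here is a minimal sketch of how those improvement points could be computed against a personal average; the two-point bonus is an assumption for illustration, not a fixed rule.

# A minimal sketch of the individual learning expectation idea: compare a
# new score to the student's personal average and award a bonus for improvement.
def improvement_points(previous_scores, new_score, bonus=2):
    personal_average = sum(previous_scores) / len(previous_scores)
    return bonus if new_score > personal_average else 0

# A student averaging 74 who scores 81 earns the improvement bonus.
print(improvement_points([70, 75, 77], 81))   # 2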
  • 99. Halo effect Did you ever notice how some kids, no matter what they did, NEVER got in trouble? Other kids, no matter how they behaved were always in trouble. This is the halo effect: the tendency for a general impression of a person to influence our perception of any aspect of that person. The halo effect leads to unjustified grading practices. Watch out for it.
  • 100. Grade pollution If your directions weren’t clear and a bunch of students therefore did badly on a project, that’s grade pollution. If a student had poor reading skills and your math test involved a lot of reading, that’s grade pollution. Anything that “muddies” the relationship between the student’s learning and the grade is pollution. Stop Grade Pollution!!! You don’t like it and neither will your students.
  • 101. Communicating with parents Don’t wait until something bad happens to get in touch with parents. Try to find reasons to make a positive phone call for each student.
  • 102. Communicating with parents Document why you are giving the grades you are giving. Your documentation is your justification. Be sure you document behavioral information that leads to your grades for deportment.
  • 103. Documentation: A Reminder  When you document a student’s behavior, describe the behavior as exactly as possible, using non-judgmental terms.  “Sam was running in circles around the classroom, shouting obscenities, and waving his arms” is far more specific than “Sam was acting crazy.” “Crazy” is a judgment on your part. A parent will be more likely to accept the first description because it is specific and nonjudgmental.
  • 104. Communicating with parents Most of the time, if you have made an effort to communicate with parents when positive things happen, if you demonstrate that you care about each student, and if you share not only what your grades for a student are but WHY (including showing examples of student work), parents will accept your assessment and will work with you to help the student improve.
  • 105. Communicating with parents  Once in a while, you get a parent who is not going to be reasonable. This person may have mental health issues going on, may be overly involved in his/her student’s life, may be under a lot of psychological stress for some reason or other, may have had negative experiences with school him or herself, or just may be an unpleasant person in general.
  • 106. Communicating with parents  If you have a negative conference with a parent, immediately tell your supervisor (before the parent can call your supervisor). Ask your supervisor to help you plan future conferences— you may want to have someone from the school administration present for future conferences. Keep careful documentation on this student so you can let your supervisor know what you are doing with the student and why.
  • 107. Technology and assessment  Technology can be used in planning and constructing tests.  Test software can help you to analyze test results.  Software is helpful for maintaining student records.
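One small example of the kind of analysis such software can do is an item difficulty report: for each question, what proportion of the class answered it correctly. Here is a minimal sketch of that calculation with made-up response data; it is an illustration of the idea, not any particular product.

# A minimal sketch of item analysis: compute each item's difficulty index
# (the proportion of students who answered it correctly).
# Each row is one student's item-by-item results (True = correct).
responses = [
    [True,  True,  False, True],
    [True,  False, False, True],
    [True,  True,  True,  True],
    [False, True,  False, True],
]

num_students = len(responses)
for item in range(len(responses[0])):
    proportion_correct = sum(row[item] for row in responses) / num_students
    print(f"Item {item + 1}: {proportion_correct:.0%} answered correctly")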
  • 108. Technology and authentic assessment  Technology holds great promise for authentic assessment.  With scanning equipment, students can create and keep their materials on electronic media (CDs). They can also create materials using the vast possibilities on the computer (PowerPoint, Movie Maker, etc.).  Because of the ease of storage, student portfolios could potentially span several years of their schooling. Samples of student work across multiple years not only demonstrate student progress but also help to reveal the areas of interest a student has developed and maintained across the years.