OERC June 2014 Presentations (Combined)
1. OERC RESEARCH RELATED TO STUDENT GROWTH MEASURES AND EDUCATOR EFFECTIVENESS
Jill Lindsey, Ph.D., Wright State University
Marsha Lewis, Ph.D., Ohio University
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
2. PURPOSE OF RESEARCH
• OERC can examine statewide policy and practice questions
• Follow and document early implementation in order to inform policy and practice
• Research implementation for multiple years as start-up issues are resolved and implementation takes hold; look for what is working and what can be improved
3. FOUR STUDIES SPANNING THREE YEARS
• Time span of findings offers insight into the changing landscape around teacher evaluation and student growth measures
• 2012: Teachers and principals philosophically supportive of the new evaluation system and the need to measure student growth
• 2013: HB 555
• 2014: Teachers and principals far less supportive and troubled by concerns related to the use of different types of SGMs for evaluation
• Common themes
4. METHODOLOGY
• Structured interviews with superintendents and administration team members
• Focus groups with teachers
• Surveys of teachers in each pilot LEA
• eTPES data analysis
5. FUNDED PROJECTS RELATED TO STUDENT GROWTH MEASURES
• OTES/OPES Implementation Study (37 LEAs)
• Extended Testing for Value-Added Reporting (23 LEAs)
• Initial Use of Student Learning Objectives (30 LEAs)
• Student Growth Measures Policy & Practice (13 LEAs)
6. OTES/OPES IMPLEMENTATION STUDY
• Sequencing, planning, feedback, and student growth measures for teachers and principals
• Preparation for evaluation
• Experiences of teachers and principals evaluated using student growth measures
• Processes and measures of student growth that districts adopted for use in teacher and principal evaluation systems
7. EARLIEST FINDINGS
• Generally positive about the new evaluation systems
• Supported use of student growth measures in evaluation
• Lack of trust and misunderstandings about value-added, vendor assessments, and local measures of student growth
• Unfairness of using different kinds of measures and differing time cycles for different measures of student growth
• Conversations around the new evaluation system focus on instruction
• Time required to complete evaluations took time away from working with students
• Appreciative at being asked about their experiences and views
8. SGM EXTENDED TESTING MINI-GRANT
• Grant funds provided vendor testing for grades 1, 2, 3, and high school subjects
• Provided teacher-level value-added scores from vendor test results
• Processes and challenges related to extended testing implementation
• Role of roster verification
• Use of SGMs in educator evaluation
• Best practices / lessons learned
9. SGM MINI-GRANTS FOR EXTENDED TESTING
Findings:
• Want reliable student growth measures
• Lack assessment literacy
• Unclear how vendors will provide data
• Uncertain of roster verification timing and its impact on VAM
• LEAs opting to use the lowest percentages in weights
• Grateful for being asked about their experiences
Nine drop-outs' reasons:
• Requirement to use extended testing results came too soon, was unfair, and was not part of the grant
• Cost of extended testing was too high
• Too many changes, and too much on teachers' plates
10. INITIAL USE OF STUDENT LEARNING OBJECTIVES
Study examined fidelity of SLO use for:
• improving student performance
• measuring academic growth
• evaluating teachers
11. EARLY SLO THEMES (all data not yet analyzed)
Data sources: interviews, surveys, documents, eTPES data
• Training was not uniform across the state
• Assessments varied widely across grade levels, buildings, and districts
• Processes were excessively time-consuming
• More challenging for semester or quarter courses; limited time to complete the pre-test, teach, post-test cycle
• Implementation hampered by too many simultaneous changes: Common Core, piloting new state tests and PARCC, and implementing OTES
• Many emotional moments, and gratitude for being invited to talk about experiences
12. SGM POLICY AND PRACTICE STUDY
OERC study of early-adopter districts of Student Growth Measures
• Designed to provide timely data to inform state policy and district practice
• "What does this look like when implemented?"
• Teachers' perceptions of SGM components
• Do SGMs correlate with Performance on Standards? If not, why not?
• The distribution of teacher and principal ratings
13. SGM POLICY AND PRACTICE STUDY
Focus group themes:
• Fairness questions (e.g., Category A teachers do not know OAA items in advance, while Category C teachers develop their own assessments)
• Principals' time consumed with teacher observation activities
• Teachers have questions/misconceptions about value-added methodology
14. SGM POLICY AND PRACTICE SURVEY
• Deployed late February through mid-April 2014
• 22% response rate (603 teacher respondents of 2,709 full-time teachers); N = 469 classroom teachers, 97 intervention specialists
• Survey responses were similar to focus group findings
• Of the four SGMs (Value-Added, SLOs, Vendor-Approved Assessments, Shared Attribution), more surveyed teachers think Student Learning Objectives "most accurately assess a teacher's instructional impact."
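As a quick sanity check, the response-rate figure above can be reproduced in a couple of lines (an illustrative sketch; the counts are taken directly from the slide):

```python
# Counts reported on the slide
respondents = 603          # teacher respondents
full_time_teachers = 2709  # full-time teachers in the surveyed districts

# Response rate = respondents / population, shown as a whole percent
response_rate = respondents / full_time_teachers
print(f"{response_rate:.0%}")  # prints "22%"
```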
15. FAIRNESS CONCERNS WITH SGMS & EVALUATION
• Early stage of implementation
• Uncontrollable factors
• Unequal measures / accuracy of the measures
• SLOs are teacher-developed, raising validity/reliability questions
• Others see SLOs as most fair because they focus on the content taught and results during the evaluation year
• Approved vendor assessments may not match content standards
• The value-added model was not formulated to measure individual teacher effectiveness
16. SGM POLICY AND PRACTICE STUDY
Teachers who see value in SGMs:
• Feel it is important to measure student growth
• Recognize the need for accountability
• Find SGMs a useful source of feedback for planning and adjusting to outcomes
17. SGM POLICY AND PRACTICE STUDY
"I do think it is important to make sure a child makes adequate growth. However, there are factors that are out of my control (attendance, home support, etc.) that affect a child's learning and are not considered when calculating the yearly academic growth of a student."
"It shows the effectiveness of a teacher and useful data to adjust your teaching."
18. TEACHER ROSTER VERIFICATION RESEARCH
Teacher–Student Data Link / Roster Verification is necessary to ensure SGM data quality.
Research questions:
• Are teachers actively participating in the verification of their own rosters and percentage of instructional time with students, as specified by Ohio's roster verification process guidelines?
• Do principals and teachers have access to adequate training and technical assistance?
• Do principals and/or teachers perceive any issues with roster verification?
• What do Ohio educators view as the hallmarks of a good system?
19. TEACHER ROSTER VERIFICATION SURVEY
Sent an online survey to all teachers and principals who completed the link/roster verification process in spring 2013 and spring 2014.
• 2013 survey: 5,984 teacher responses from 695 LEAs
• 2014 survey to date: 6,778 teacher responses; survey still in the field
20. TEACHER ROSTER VERIFICATION SURVEY
Teachers: Do you think the linkage process accurately captured what was happening in your classroom (i.e., students you taught last year, their length of enrollment, and your percentage of instructional time with them)?

              2011   2013   2014 (prelim.)
Yes            46%    57%    59%
No             23%    25%    25%
Don't know     31%    17%    16%
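The year-over-year shift in these responses can be summarized programmatically; this short sketch simply re-enters the percentages from the table above (2014 figures are preliminary, and a year's shares may not sum to exactly 100% because of rounding):

```python
# Shares of teacher responses by survey year, copied from the table above
results = {
    2011: {"Yes": 46, "No": 23, "Don't know": 31},
    2013: {"Yes": 57, "No": 25, "Don't know": 17},
    2014: {"Yes": 59, "No": 25, "Don't know": 16},  # preliminary
}

# Each year's shares should total ~100% (rounding can shift the sum by a point)
for year, answers in results.items():
    assert 99 <= sum(answers.values()) <= 101

# "Don't know" responses roughly halved between 2011 and 2014
drop = results[2011]["Don't know"] - results[2014]["Don't know"]
print(f"'Don't know' fell by {drop} percentage points")  # drop == 15
```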
21. TEACHER ROSTER VERIFICATION SURVEY
For teachers who answered "No":
Teachers: Explain why you think the student–teacher linkage process did not accurately capture what was happening in your classroom. (open-ended)
Themes:
• Difficulty dividing time in various co-teaching situations
• Unable to account for student absences
• Teachers want to be able to report finer increments of shared instructional responsibility
• Students' schedules changed too often / environment too dynamic to accurately estimate time
22. TEACHER ROSTER VERIFICATION SURVEY
Teachers: Given your experience with the linkage process, how confident are you that the linkage process improves the accuracy of the teacher-level value-added data?

                       2011   2013   2014 (prelim.)
Not at all confident    39%    32%    35%
Somewhat confident      55%    61%    58%
Very confident           6%     8%     7%
23. COMMON THEMES ACROSS TIME
Concerns:
• Early in implementation: lack of trust and misunderstandings
• Perceived unfairness of different kinds of measures
• Time required to complete evaluations
• Too many changes at the same time
Kudos:
• Support for measuring student growth
• Training for assessment literacy desired
• Appreciate being consulted and heard
• Roster verification process is improving
24. RECOMMENDATIONS
• Build trust by continuing to include teachers and administrators in conversations and policies that impact them
• Acknowledge concerns as legitimate
• Provide professional development opportunities to correct misunderstandings and knowledge deficits
• Streamline paperwork where possible; use adaptations from the field
• Modify policy and roll-out timelines when possible
27. PLANNING FOR THE FUTURE, LEARNING FROM THE PAST: WHAT CAN SCHOOLS LEARN FROM COLLEGE AND CAREER PROFILES OF GRADUATES?
Joshua D. Hawley
Director, OERC and Associate Professor
John Glenn School of Public Affairs
The Ohio State University
Making Research Work for Education
29. LINKING SCHOOL TO WORK
"College for all" pathway: High School → College → Work
"Education and Career" pathway:
• High School: CTE, STEM
• College: AP/Dual Enrollment, School + Work
• Workforce Training: Apprenticeship, Military
30. CHANGING VIEWS OF WORK REQUIRE NEW INFORMATION
• As educators, we want to know about a variety of educational outcomes, not just college-going rates for students. At its broadest, we might consider the following domains:
  - College (traditional two- and four-year sectors)
  - Credentialed and non-credentialed workforce training
  - Apprenticeships
  - Military
31. QUALITY OF OUTCOMES MATTERS
• In this day and age, the quality of the outcomes matters a great deal, and, therefore, we are concerned with how well students are prepared to perform over time.
• Concerns we typically have in this case:
  - Student remediation
  - Does student knowledge match what is required in college classes?
  - Are students prepared to pick a career? (I distinguish this from a job)
  - What happens to kids who go directly from high school to work?
  - What happens to kids who drop out (both in terms of further education and work)?
32. PILOT HIGH SCHOOL REPORTS
• Using data from the Ohio Longitudinal Data Archive (OLDA), the OERC has been able to answer many of these questions, beginning with high schools, and present them in a format that schools can use.
• The report has four key question areas:
  - What are the employment outcomes of high school graduates?
  - What are the postsecondary education outcomes of high school graduates?
  - What is the quality of the postsecondary education high school graduates are carrying out?
  - What happens to individuals who do not graduate from high school (drop out)?
41. FUTURE PLANS
• Develop a formal high school college and career report for select districts (next up: Columbus City Schools; Battelle For Kids)
• Complete the Workforce Success Measures Project with the Office of Workforce Transformation (see the OWT website for an introduction: http://workforce.ohio.gov/)
• Work with the Ohio Department of Education and Board of Regents to answer questions about employment outcomes for K-12 and higher education
43. www.oerc.osu.edu | connect@oerc.osu.edu
45. THIRD GRADE READING GUARANTEE: A CASE STUDY
Suzanne Franco, Professor, Wright State University
Jarrod Brumbaugh, Principal, Milton-Union Schools
Making Research Work for Education
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
46. BACKGROUND
• Ohio Third Grade Reading Guarantee (TGRG), 2012
• In 2012–13, 81% of Ohio's 3rd graders were proficient or above
• ODE offered a competitive funding grant for developing TGRG 2013–2014 implementation
• OERC funded a case study of a funded TGRG three-LEA consortium for 2013–2014
47. CONSORTIUM TGRG PLAN
• Co-located in midwestern Ohio but had not collaborated on previous initiatives
• Orton-Gillingham Multi-Sensory training and instructional strategies
• Professional Learning Community (PLC)
• Parent Informational Opportunities
49. RESEARCH QUESTIONS
• Feedback and buy-in for the training, implementation, and PLC
• Progress and monitoring tools used
• Did reading skills improve for On-Target students? For Not-on-Target students?
• What percentage of K–3 students were Not on Target in 2012–14?
50. METHODOLOGY
For each LEA:
• Document analysis of historical data and end-of-year 2014 RIMPs
• Interviews and focus groups with administrators (6) and teachers (12)
• Observations of O/G training and classroom instruction
51. O/G TRAINING
Training details:
• Two 5-day sessions the week after the school year ended
• One 5-day session in November
• Refresher course available summer 2014
Training feedback:
• Teachers felt it was engaging but too long, or covered grade levels not of interest to them. They would like to repeat it after one year of implementation.
• Administrators from one LEA attended training. They felt that the common language helped with classroom observations.
52. IMPLEMENTATION
Implementation details:
• LEA 1: O/G not required, due to receipt of grant funds in mid-September 2013. Used in RTI, Title 1, and other interventions.
  - KRAL is the identifier for K; state assessment tool for grades 1–3
  - DIBELS is the progress monitor, along with STAR and Study Island
• LEA 2: O/G not required (see above). Used in RTI, intervention, and Title 1.
  - NWEA (2012) and DIBELS (2013) for K–3
  - DIBELS is the progress monitor
• LEA 3 decided not to participate
53. IMPLEMENTATION FEEDBACK
• Not all supplies were available at the beginning of the year for all teachers, due to the delay in receiving grant funds
• Not all training was completed at the beginning of the year (new teachers)
• Use of O/G not required; inconsistency a challenge for teams
• Merging O/G with the LEA-approved reading curriculum was difficult
• Parent Nights were not well attended; PLC not formed
54. PROGRESS AND MONITORING TOOLS
• LEA 1: DIBELS
• LEA 2: NWEA (2012); DIBELS (2013)
Feedback:
• O/G assessment tools are not Ohio-approved; therefore the LEAs use DIBELS and NWEA to assess student progress
• RIMPs not standardized among LEAs (an issue for students who move and for determining LEA or statewide impact)
• For highly mobile student populations, the 30-day requirement for a RIMP is very difficult to meet
• Too much testing for young students; test anxiety rising
55. CONSORTIUM SUMMARY
Successes:
• After-School Program
• Students respond well to Multi-Sensory instruction
• Teachers want more training
Challenges:
• Use of O/G not consistent
• Costs to sustain
• O/G assessments not state-approved
• RIMP forms could be improved; data should be collected for analyses
• No information about other LEAs' TGRG plans
56. 2013–2014 RESULTS AND 2014–2015 PLANS
• LEA 1
  - Grade 3 results to date
  - Changes in implementation for 2014–2015
• LEA 2
  - Grade 3 results to date
  - Changes in implementation for 2014–2015
57. TESTING RECOMMENDATIONS
• Assessment tools aligned with state-funded TGRG programs need approval
• Primary students exhibit high anxiety regarding TGRG, impacting performance and fear of school
• Required testing takes away from instruction time. Embrace testing that collects needed data for all accountability purposes, not just one initiative.
58. TGRG POLICY RECOMMENDATIONS
• Continue funding for TGRG development
• Continue monitoring LEA implementation of funded and non-funded TGRG implementation plans, and share "lessons learned"
• Revise the RIMP format and collect RIMP data for longitudinal analyses of common deficiencies across the state
61. READY OR NOT?
Ginny Rammel, Ph.D.
Superintendent, Milton-Union Exempted Village Schools
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
62. "EVERY STUDENT, EVERY DAY"
• Research and study
• Use multiple forms of data
• Create a culture of "calculated risk-takers"
• Embed professional development
63. CULTURE TAKES TIME TO CHANGE!
• Be a role model to all, at all times
• Establish high expectations
• Collaborate and share
• Trust; be truthful and supportive
• Know your staff
64. EXPLORE OPPORTUNITIES
• Through grants, pilot studies, and action research
• Connect with:
  - Personnel at colleges and universities, OERC
  - Educators from other districts
  - Members of professional organizations
  - Policymakers, legislators
65. EMBEDDED PD: BE AN ACTIVE PARTICIPANT
• Milton-Union was involved in a number of grants:
  - RttT Mini-grant Value-Added
  - Student Growth Measures
  - Early Literacy and Reading Readiness
  - OERC case study
The more you and your staff research, study, and share data, the better decisions you make. Collaborating and working together help to create a culture of "Every Student, Every Day."
66. RESULTS
Our culture, and the research and data from our grant involvement, led to the development of our OTES instrument and a successful year of implementation.
This trust and openness flowed throughout recent negotiations.
67. CHANGE WILL OCCUR WITH OR WITHOUT YOU!
All initiatives impact one another and YOU:
• OTES
• OPES
• Graduation requirements
• Third Grade Reading Guarantee
Keep the main thing the main thing: Is what I'm doing going to help students learn? Are we preparing students for "down-the-road" careers?
68. CALCULATED RISK-TAKERS
• Do the research upfront
• Study the data
• Reflect; revise if necessary
• Building project
• Food service program
69. DATA SUPPORTS INITIATIVES
• All-day, every-day kindergarten
• On-site Head Start programs
• Grouping students by quintiles
• H.S. ACT EOC exams
• Recognized as a U.S. Department of Education Green Ribbon School
• Food service program ended the year in the black!
70. NEXT STEPS…
• How do we better prepare students for their futures (colleges, universities, employers)?
• How can we convey to young parents the importance of their role as a teacher?
• How can we differentiate education so all students are better served?
• How can we better communicate the results of research and the sharing of data?