Subject Area Benchmark Tests: Indicators of Success on the Alabama High School Graduation Exam

CAT 689
Dr. Rice
May 3rd, 2008
Virginia Vilardi, Robert Vilardi

Introduction

In order to comply with Adequate Yearly Progress goals set forth by the No Child Left Behind mandate, several school districts in Alabama developed subject specific end-of-course exams. These "benchmark" exams were designed to determine whether students had mastered the skills outlined by the Alabama State Course of Study for each course.
The concepts that are outlined in the course of study are also the basis for the Alabama High School Graduation Exam. Students are expected to have mastered the skills necessary to pass this exam through their coursework in Algebra I, Geometry, English, US History, and Biology. Because the benchmark tests and the graduation exam measure similar competencies, one of two conclusions may be drawn. If the tests measure the same objectives, then testing students twice is redundant, time-consuming, and expensive; if the tests measure similar objectives, then one may be used as an indicator for the other, allowing for better and earlier remediation for students who do not succeed on the benchmark tests. This may also have implications for curricular development and data-driven instructional design.

Review of Literature

Browne-Dianis, Judith A. (2008). Graduation Tests Will Harm Students. Fair Test: The National Center for Fair and Open Testing, 1-2.

In the article titled "Graduation Tests Will Harm Students," author Judith Browne-Dianis concludes that the new graduation test requirements in Maryland will only exacerbate an already deteriorating system. Beginning in the 2008-2009 school year, Maryland students will be forced to pass four state tests in order to receive their high school diplomas. This requirement is in addition to any previous requirement of grades, Carnegie Units, or credits. The author then notes that research into high stakes testing has linked these tests to decreases in high school completion rates and increased dropout rates. Additionally, the author cites a report by the National Academy of Sciences indicating that high stakes testing does not improve overall educational levels, but rather can be a medium through which students are punished. Of the students negatively impacted by these tests, an inordinate number are minority students. The author points to Maryland's already low graduation rates of 70.4 percent overall and 57 percent for African-American or Latino students. If students in these schools follow the same pattern as students in other schools implementing high stakes testing, these already low graduation rates will continue to spiral out of control. In conclusion, the author demands that educators eliminate their reliance on high stakes testing and allow a student's twelve years of school to speak for themselves.

Fisher, M., & Elliott, S. (2004, May 25). Unsure Future: Changes planned, but will test problems be solved? Dayton Daily News, p. A-1.

In "Unsure Future" the authors detail some of the struggles that have plagued Ohio school districts and Ohio students. Ohio students are required to pass content specific graduation tests in order to graduate from high school. At the time of this article the graduation exam was the Ohio Ninth Grade Proficiency Test. The skills required by this exam are expected to be mastered by the end of the eighth grade, and approximately 98% of students complete these tests successfully. The authors then describe the test that will replace the current test in the following year. This exam is based on benchmarks and content standards that are, on average, two years above the current eighth grade proficiency requirement.
In the first pilot study of this test the success rate on the mathematics portion was only 23.1%, and just over 5% of African American students were successful on the exam. In later studies the success rate was greater, but still only 67.9% were successful, with 61% of African American students failing the exam. These abysmal rates improved significantly after the test was modified and the minimum scores required were lowered. Still, the failure of students to perform well on this examination runs contrary to the gains found on the current Ohio Ninth Grade Proficiency Test and the SAT college entrance exam. One explanation for the poor performance on these exams is the use of 55 mathematics benchmarks in the creation of the exam. These benchmarks reflect content that students should be able to accomplish, but the content standards are based on current revisions to policy and do not necessarily reflect the education received by those taking the exam. This is not the first time that Ohio students have been forced to deal with adversity in regard to testing. When the first Ninth Grade Proficiency Test was administered, only 33% of students passed the exam. Four years later the success rate was 97%. While this educational wake-up call is important for administrators, the authors indicate that it is unfortunate and borderline immoral to hold students accountable for standards and curricula to which they have not been exposed. As the state grapples with hard decisions concerning the graduation exams, it is forced to weigh issues of social consequence and fairness against issues of progress and adequacy. Unfortunately, neither has a straightforward solution.

Zabala, Dalia. (2008). A Move Toward End-of-Course Exams. Washington, D.C.: Center on Education Policy.

Before 2002, school systems regularly used minimum competency exams to determine a student's eligibility to graduate from high school. Since that time schools have been moving toward comprehensive exams, and more recently toward end-of-course exams. The basic skills exams traditionally tested students on skills developed prior to entering high school, most commonly in the eighth grade. Comprehensive exams are traditionally targeted at the tenth grade level and are coupled to state standards for content at that grade level. End-of-course exams measure mastery of the specific content of a subject that a student has just completed. The use of end-of-course examinations is still in its infancy, but several states are implementing graduation requirements that include competency exams and end-of-course exams.

Kober, Nancy. (2004). Test Talk: My School Didn't Make Adequate Yearly Progress-So What Does That Mean? Washington, D.C.: Center on Education Policy.

As part of the No Child Left Behind Act of 2002, schools were required to meet specific benchmarks developed by both federal and state educational administrators. In order to meet the standards set for Adequate Yearly Progress, a school must ensure that every major subgroup in the school's population tests at the proficient level as determined by the state board of education. Schools must have a minimum of 95% participation from each of these subgroups on the state mandated examination. In addition to these requirements, each school is required to meet at least one other benchmark as prescribed by the state.
2003 marked the first year schools were evaluated for Adequate Yearly Progress as mandated by the NCLB Act of 2002. Of the almost 88,000 schools evaluated, 32% failed to meet AYP goals. As schools make progress toward their AYP goals they must also have a plan for continual improvement of proficiency. According to NCLB guidelines, schools must continuously improve and have a plan in place, with benchmarks, to ensure that they reach 100% proficiency by the year 2014. While these goals are ambitious, they may be unattainable, as they require gains in achievement that are higher than the gains currently seen in the best achieving schools. In order to meet these goals, state boards need to analyze how the performance targets are set and allow for some flexibility in a school's accountability program. Additionally, state systems should move beyond a specific proficiency goal toward an improvement goal determined by where a school starts its improvement plan. Finally, more factors than a test score alone need to be analyzed, such as dropout rates or the percentage of students taking rigorous college preparatory courses. Although AYP guidelines must be set to meet federal requirements, the use of some of these alternatives may provide schools with much needed options when facing an AYP shortcoming.

Rubenstein, Grace. (2008). Reinventing the BIG TEST. Edutopia, 4(2), 32-37.

"Reinventing the Big Test" evaluates the merit of today's standardized tests. The author believes that, though accountability tests show numbers with authority, they have two vital flaws: they encourage practice-and-drill teaching, and they do not show the quality of student learning. Standardized tests are often the rubric that policy makers use to determine whether schools have met their adequate yearly progress goals and therefore have satisfied No Child Left Behind. The author states that these tests have some basic flaws. Some of these flaws lie in the difficulty level of the tests, scoring errors, or ambiguous questions. Other flaws include the psychological pressure of one test determining a student's whole future opportunities, or the pressure of taking one of these high stakes tests when the student is ill or having personal issues at home. The author researched alternatives to these critical assessments, found some alternative methods of evaluation, and advises more, including the revamping of the whole testing program. While this may sound radical, some of it is already taking place. Some schools use portfolios for evaluation in place of exams, others use presentations, and still others have developed interactive computer programs that test students' skills. Test writers are skeptical about the feasibility of developing new large-scale complex tests that would measure student learning in a more sophisticated manner. The large-scale test may need to be replaced by assessments tailored to a school or an area, but this would raise a whole new set of issues not discussed here.

Kober, Nancy. (2002). What Tests Can and Cannot Tell Us. Test Talk for Leaders, 2, 1-15.

The Center on Education Policy reviewed testing's strengths and limitations. This article was a qualitative study done for policy makers to help educate them on the viability of today's standardized tests. Today's high stakes standardized tests give a wealth of information that is hard to come by with the same amount of time, effort, and expense.
Testing still has a price, but compared to other evaluation methods it gives the most bang for the buck. These tests provide standard and consistent information from school to school and even state to state. Many feel that this information is more valuable than individual teachers' appraisals of student performance. For example, most colleges will not even look at a student's application without an SAT or ACT score attached to the package. This is because an "A" in Mr. Smith's class at one high school might only be a "C" in Ms. Jones's class at another. However, test scores are not an absolute; they are more of an estimate or a range of scores. Most test makers build these variations into the standard error for the test. This range needs to be taken into account when adequate yearly progress (AYP) reports for a school or district are examined. Great fluctuations in scores normally have a reason, such as higher than usual immigration into or emigration out of the student population. A higher than normal special education population can also vary results. This is why AYP results for No Child Left Behind can now be averaged over three years, to help correct for the absoluteness of some tests. Tests are indispensable for providing comparable information, but using them in conjunction with other methods of grading student work can provide a deeper understanding of a student's true capabilities. One test should not make or break a student's or a school's future.

Toch, Thomas. (2006). Turmoil in the Testing Industry. Educational Leadership, 64(3), 53-57.

This article was a summary of a qualitative study by testing industry executives, state testing officials, and other testing experts. These experts examined the results of the No Child Left Behind (NCLB) testing demands on all states. Adequate Yearly Progress on standardized tests is the meter-stick by which schools are held accountable. Some of the problems with these standardized tests are that they do not always align with the state's course of study or that they test only lower level thinking skills. Other problems include the stress on the test-making industry to generate multiple tests quickly. Making a credible test takes time. It involves not only writing the test but proofing it, standardizing the passing score by giving it to students, checking it for racial bias, and much more. These things are not done, or not done well, when there is a rush to produce a test in a limited amount of time. Cost is also an issue in making a test. A test that can be run through a machine costs pennies to score, whereas a test that must be scored by hand costs $0.50 to $5.00. When cost is an issue, multiple-choice wins hands down over essay. NCLB testing is being called a race to the bottom by the testing industry, which is of course the complete opposite of NCLB's goals. States must meet AYP to continue to receive funding; therefore, changing the standards at the state level allows schools to achieve results without actually showing improvement. Policy makers must acknowledge what is going on and address it, or the goals of NCLB will not be worth all of the effort that has been made by so many to meet them.

Terry, Brooke Dollens. (2007). End-of-course exams as a measuring stick. Beaumont Journal, 1-3.

This article compared the Texas graduation exam, the TAKS, with end-of-course tests. It was a qualitative observation stating that students were receiving inflated grades and that students taking honors courses were not truly doing honors work.
Students who received credit for courses such as Geometry and Algebra II could not pass the math section of the TAKS. This was especially apparent among lower income students. The author believes that an end-of-course exam would eliminate issues such as this and give timely feedback to students and parents. The TAKS is not given until the end of the 11th grade, giving students little time to remediate problems before facing graduation. Fifteen states and several countries in Europe and East Asia already use end-of-course exams instead of exit or graduation exams. The curriculum is normally only as tough as the test used to measure it; if you want to up the ante, toughen the test.

Toch, Thomas. (2008). Test Results and Drive-By Evaluations. Education Sector, 1-3.

This article was an observational study of what test scores actually measure. It was a qualitative study examining a proposal by New York City's Chancellor Joel I. Klein. Klein wants to rate teachers according to their students' success rates on standardized tests. The problem with this is that only a portion of instructors teach subjects tested by the exams. This would make team teaching or remediation more valuable in these schools. Klein also wants teacher reviews to be much more in-depth. Evaluations cover a checklist of items (like being presentably dressed) but do not normally focus on the quality of instruction. Tougher teacher evaluations would do a lot to take away seniority privileges and favoritism. These evaluations would also help to promote administrative support. More teachers would rather work in a school that had better administrative support than in one that paid more money. Teachers with administrative support also show less opposition to being judged by their students' test scores. It seems that if you are doing a quality job, the boss knows it and supports you regardless of the numbers.

Neill, Monty. (2008). Fair Test. National Center for Fair and Open Testing, 1-4.

The article was an evaluation of the progress achieved by mandated high school graduation exams and advice to the state of Rhode Island on constructing this type of exam. It was a quantitative and qualitative study comparing graduation exam intensity with dropout rates. The author claims that the evidence clearly shows that exit exams do more harm than good. He believes they stifle creativity, critical thinking, and teamwork. He says he understands the need for testing, but that at the present time this type of testing tends to harm disadvantaged or lower income students much more than it assists them. In states where the exit exam has become increasingly tough, the dropout rate has risen proportionally. Minority dropout rates in these states are two to three times higher than those of white students. The author advises Rhode Island to adopt a multi-tiered evaluation route similar to Wyoming's or Nebraska's rather than invest in an exit exam that is sure to harm more students than it helps.

Zabala, Dalia, Minnici, A., McMurrer, J., Hill, D., Bartley, A. P., & Jennings, J. (2007). State High School Exit Exams: Working to Raise Test Scores. Center on Education Policy, 1-21.

This article was a summary of a study by the Center on Education Policy reviewing the current year's issues with high school exit exams. It was both a quantitative and qualitative study reviewing what was working and what was not in the states that use exit exams to determine graduation.
The items that were working were intervention and remediation at the state and local levels. Most state exit exams are aligned to grade ten and are supposed to measure mastery of the state curriculum. All states administering exit exams have gaps in passing rates between various groups of students. White English-speaking students tend to do better than ESL or African American students. The article also noted that several states are moving toward end-of-course exams or using a dual process of both an exit exam and end-of-course exams. By 2015, twelve states will be using end-of-course exams, compared with eighteen that will require standards-based exit exams and four that will require a combination of both. All of these represent a move away from the minimum competency exams that were once commonly used as exit exams by many states.

Problem and Significance

High stakes testing is a contentious issue in the current climate of consumer-driven educational strategies, in which individual school systems endure significant scrutiny of their scores on various tests. These systems are required to explain, account for, and improve any deficiencies in the educational process, both real and perceived. Because so much is at stake, administrators look for opportunities to increase performance, such as remediation, intervention, and content specific review. Additionally, administering these tests takes an average of nine instructional days per semester, with benchmark testing taking four and the Alabama High School Graduation Exam (AHSGE) taking five. All of these activities take away from traditional course instruction and can affect both the fluidity and the pace of instruction. In order to meet the content requirements outlined by the course of study, teachers find themselves rushing through important concepts, failing to give the quality instruction they know is necessary to reach higher levels of learning. Students are often taught using comprehension- or knowledge-based instructional strategies to facilitate the pace and to "practice" test taking skills, while forfeiting a truly meaningful educational experience. Do the additional days of benchmark testing assist students, or do they harm them by taking away additional educational opportunities? Through this study we are examining the possibility that these subject area benchmark tests can be used as indicators of success on the AHSGE, increasing the benefits gained from the benchmark tests.

Purpose

The purpose of this study is to examine whether subject specific end-of-course exams can be used as an indicator of success on the Alabama High School Graduation Exam. If a correlation between the tests can be established, then remediation opportunities can be implemented before the graduation exam is administered.

Research Design

The use of end-of-course exams as an indicator of student achievement is a fairly new idea, with most states still opting to use comprehensive examinations. Many states are beginning to do research with end-of-course exams, and several plan to establish these tests as graduation requirements by 2015. In order to determine whether these examinations could be used as indicators of success on the Alabama High School Graduation Exam, a thorough review of the literature was performed to determine what research has been completed in this area. The findings of this review revealed many issues with current testing policies, as well as a shortage of viable alternatives to the traditional graduation/exit exams.
Because many of these end-of-course exams are still in developmental stages, there were no long-term results to compare to this study. However, the Center on Education Policy has stated that it will include research in this area in its August 2008 report.

Data Collection

The population for this study was the tenth grade student body at Wetumpka High School. This group of students had the greatest percentage of students who had taken both the AHSGE and subject specific end-of-course exams. The data was supplied by STI Assessment, the school's database. Student scores on each of the subject tests were collected in the areas of Algebra IB, Geometry, English, and Social Studies and then paired with student scores on the AHSGE.

Data Analysis

Student data was entered into a spreadsheet with values of zero or one. Students who had passed a portion of the AHSGE were given a score of one in the respective category, and students who had failed that portion were given a zero. Because there are no set parameters for passing subject area end-of-course tests, students were evaluated at the standard ten-point grading scale passing score of 60%. Students who met this score were given a one, and students who failed to reach this mark were given a zero. This was completed for each of the subjects in which an end-of-course exam could be paired with the graduation exam. AHSGE requirements for science are currently being evaluated, and no scores are available at this time. Additionally, some of the end-of-course evaluations pertained to grade levels lower than tenth; these were not evaluated because the graduation exam is first given in the tenth grade. The AHSGE tests that were evaluated were the language, reading, mathematics, and social studies tests. Because the English benchmark test contains both reading and language skills, it was compared to the reading and language portions of the AHSGE separately and then combined to determine the overall comparison. In analyzing the mathematics benchmark tests, both Algebra IB and Geometry scores were used; because these courses have significantly different content standards, the exams were only compared separately. Additionally, in the Algebra IB end-of-course exam results, only two students out of 31 earned a passing score. This, along with interviews with Algebra IB instructors, suggests that the end-of-course test may be invalid for measuring mastery of course content.

Results

AHSGE Test | End-of-Course Test | Sample Size (n) | % Passing AHSGE When EOC Exam Passed | % Failing AHSGE When EOC Exam Failed | Overall Prediction Accuracy (%)
Social Science | US History I | 80 | 92.5926 | 79.2453 | 83.75
Reading | English 10 | 111 | 81.4815 | 62.5 | 71.8182
Language | English 10 | 111 | 72.2222 | 62.5 | 67.2727
Combined Reading and Language | English 10 | 111 | 64.8148 | 62.5 | 63.6364
Mathematics | Algebra IB | 31 | 50.0* | 65.5172 | 64.5161
Mathematics | Geometry | 60 | 91.3043 | 32.4324 | 55.0

*Only two students of the 31 tested earned a passing score.

The results in social science indicated a strong connection between scores on the AHSGE and the end-of-course exam in US History I. Students who earned a passing score on the EOC exam were 92.5926% likely to also earn a passing score on the AHSGE. Students who failed the EOC exam were 79.2453% likely to also fail the AHSGE. When these scores are considered together, the US History I EOC exam successfully predicted the outcome on the AHSGE 83.75% of the time.
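To make the coding scheme and the prediction percentages above concrete, the short sketch below recodes raw end-of-course scores at the 60% cutoff and computes the three percentages from paired pass/fail records. This is our own illustration, not part of the original study (which worked in a spreadsheet); the student records shown are hypothetical.

```python
def to_binary(score, passing=60):
    """Code a raw end-of-course score as 1 (pass) or 0 (fail) at the 60% cutoff."""
    return 1 if score >= passing else 0


def prediction_percentages(pairs):
    """pairs: list of (eoc, ahsge) tuples coded 0/1, as in the spreadsheet described above.

    Returns the percentage of EOC passers who also passed the AHSGE,
    the percentage of EOC failers who also failed the AHSGE,
    and the overall prediction accuracy (share of students whose two outcomes match).
    """
    passed_eoc = [ahsge for eoc, ahsge in pairs if eoc == 1]
    failed_eoc = [ahsge for eoc, ahsge in pairs if eoc == 0]
    pass_given_pass = 100.0 * sum(passed_eoc) / len(passed_eoc) if passed_eoc else float("nan")
    fail_given_fail = 100.0 * failed_eoc.count(0) / len(failed_eoc) if failed_eoc else float("nan")
    overall = 100.0 * sum(1 for eoc, ahsge in pairs if eoc == ahsge) / len(pairs)
    return pass_given_pass, fail_given_fail, overall


# Hypothetical records: (raw EOC score out of 100, AHSGE result already coded 0/1).
records = [(72, 1), (55, 0), (88, 1), (61, 0), (45, 0),
           (93, 1), (58, 1), (67, 1), (74, 0), (50, 0)]
pairs = [(to_binary(eoc_score), ahsge) for eoc_score, ahsge in records]
print(prediction_percentages(pairs))  # roughly (66.7, 75.0, 70.0) for this made-up sample
```

Because the overall figure counts a prediction as correct whenever the two pass/fail outcomes match, each row's overall percentage in the table is simply its two conditional percentages weighted by how many students passed or failed the EOC exam.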
The results in reading also indicated a strong connection between scores on the AHSGE and the end-of-course exam in English 10. Students who earned a passing score on the EOC exam were 81.4815% likely to also earn a passing score on the AHSGE. Students who failed the EOC exam were 62.5% likely to also fail the AHSGE. When these scores are considered together, the English 10 EOC exam successfully predicted the outcome on the AHSGE 71.8182% of the time.

The results in language indicated a moderately strong connection between scores on the AHSGE and the end-of-course exam in English 10. Students who earned a passing score on the EOC exam were 72.2222% likely to also earn a passing score on the AHSGE. Students who failed the EOC exam were 62.5% likely to also fail the AHSGE. When these scores are considered together, the English 10 EOC exam successfully predicted the outcome on the AHSGE 67.2727% of the time.

When reading and language were combined, the results indicated a moderate connection between scores on the AHSGE and the end-of-course exam in English 10. Students who earned a passing score on the EOC exam were 64.8148% likely to also earn a passing score on the AHSGE. Students who failed the EOC exam were 62.5% likely to also fail the AHSGE. When these scores are considered together, the English 10 EOC exam successfully predicted the outcome on the AHSGE 63.6364% of the time.

The results in mathematics indicated a connection between scores on the AHSGE and the end-of-course exam in Algebra IB. Students who earned a passing score on the EOC exam were 50% likely to also earn a passing score on the AHSGE; however, the sample size was too small to assign any significance. Students who failed the EOC exam were 65.5172% likely to also fail the AHSGE. When these scores are considered together, the Algebra IB EOC exam successfully predicted the outcome on the AHSGE 64.5161% of the time.

The results in mathematics indicated a mixed connection between scores on the AHSGE and the end-of-course exam in Geometry. Students who earned a passing score on the EOC exam were 91.3043% likely to also earn a passing score on the AHSGE. Students who failed the EOC exam were 32.4324% likely to also fail the AHSGE. When these scores are considered together, the Geometry EOC exam successfully predicted the outcome on the AHSGE 55.0% of the time.

Conclusions

The results of this study indicate that subject specific end-of-course examinations can be used as a predictive tool for the Alabama High School Graduation Exam. Although the effectiveness of the indicator varied from subject to subject, the end-of-course exams successfully predicted the outcome of the AHSGE 67.6656% of the time. The connection between the tests is apparent, allowing schools to make remediation opportunities available before the graduation exam is administered. Additionally, if teachers and administrators further revise the EOC exams to rectify the inherent problems exposed by this study, they can expect the predictive properties of these exams to increase, making them more viable as tools for improved student achievement. While the administration of these exams is inconvenient for educators and puts further restrictions on already tight schedules, the value of these tests for measuring content standards is significant, and thus the use of subject area end-of-course examinations should continue.

References
Browne-Dianis, Judith A. (2008). Graduation Tests Will Harm Students. Fair Test: The National Center for Fair and Open Testing, 1-2.

Fisher, M., & Elliott, S. (2004, May 25). Unsure Future: Changes planned, but will test problems be solved? Dayton Daily News, p. A-1.

Kober, Nancy. (2004). Test Talk: My School Didn't Make Adequate Yearly Progress-So What Does That Mean? Washington, D.C.: Center on Education Policy.

Kober, Nancy. (2002). What Tests Can and Cannot Tell Us. Test Talk for Leaders, 2, 1-15.

Neill, Monty. (2008). Fair Test. National Center for Fair and Open Testing, 1-4.

Rubenstein, Grace. (2008). Reinventing the BIG TEST. Edutopia, 4(2), 32-37.

Terry, Brooke Dollens. (2007). End-of-course exams as a measuring stick. Beaumont Journal, 1-3.

Toch, Thomas. (2008). Test Results and Drive-By Evaluations. Education Sector, 1-3.

Toch, Thomas. (2006). Turmoil in the Testing Industry. Educational Leadership, 64(3), 53-57.

Zabala, Dalia, Minnici, A., McMurrer, J., Hill, D., Bartley, A. P., & Jennings, J. (2007). State High School Exit Exams: Working to Raise Test Scores. Center on Education Policy, 1-21.

Zabala, Dalia. (2008). A Move Toward End-of-Course Exams. Washington, D.C.: Center on Education Policy.
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...
Subject Area Benchmark Tests: Indicators of Success on the Alabama ...

More Related Content

What's hot

Truths myths-ga-milesstone-3-24-15
Truths myths-ga-milesstone-3-24-15Truths myths-ga-milesstone-3-24-15
Truths myths-ga-milesstone-3-24-15gatostopcommoncore
 
ECS Education Trends
ECS Education TrendsECS Education Trends
ECS Education TrendsDawn Follin
 
2013 ccrpi indicators 10.11.13
2013 ccrpi indicators 10.11.132013 ccrpi indicators 10.11.13
2013 ccrpi indicators 10.11.13jwalts
 
Research
ResearchResearch
Researchmxr028
 
Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...
Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...
Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...William Kritsonis
 
Readiness Matters: The Impact of College Readiness on College Persistence and...
Readiness Matters: The Impact of College Readiness on College Persistence and...Readiness Matters: The Impact of College Readiness on College Persistence and...
Readiness Matters: The Impact of College Readiness on College Persistence and...National Partnership for Educational Access
 
Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...
Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...
Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...William Kritsonis
 
Policy Paper with Edits (8.5.15)
Policy Paper with Edits (8.5.15)Policy Paper with Edits (8.5.15)
Policy Paper with Edits (8.5.15)Caroline Faux
 
Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010
Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010
Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010William Kritsonis
 
High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...
High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...
High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...sowat1ka
 
WA Special Ed. Comparisons. Dammeier Data Request.03.27.15
WA Special Ed. Comparisons. Dammeier Data Request.03.27.15WA Special Ed. Comparisons. Dammeier Data Request.03.27.15
WA Special Ed. Comparisons. Dammeier Data Request.03.27.15Lynne Tucker, MPA
 
EDU 710 Lit Rev Outline #2
EDU 710 Lit Rev Outline #2EDU 710 Lit Rev Outline #2
EDU 710 Lit Rev Outline #2Brandy Shelton
 

What's hot (16)

Truths myths-ga-milesstone-3-24-15
Truths myths-ga-milesstone-3-24-15Truths myths-ga-milesstone-3-24-15
Truths myths-ga-milesstone-3-24-15
 
ECS Education Trends
ECS Education TrendsECS Education Trends
ECS Education Trends
 
2013 ccrpi indicators 10.11.13
2013 ccrpi indicators 10.11.132013 ccrpi indicators 10.11.13
2013 ccrpi indicators 10.11.13
 
Research
ResearchResearch
Research
 
Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...
Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...
Michelle Annette Cloud, PhD Dissertation Defense, Dr. William Allan Kritsonis...
 
Readiness Matters: The Impact of College Readiness on College Persistence and...
Readiness Matters: The Impact of College Readiness on College Persistence and...Readiness Matters: The Impact of College Readiness on College Persistence and...
Readiness Matters: The Impact of College Readiness on College Persistence and...
 
THESIS
THESISTHESIS
THESIS
 
AP_report
AP_reportAP_report
AP_report
 
Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...
Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...
Clarence Johnson, PhD Dissertation Proposal Defense, Dr. William Allan Kritso...
 
Policy Paper with Edits (8.5.15)
Policy Paper with Edits (8.5.15)Policy Paper with Edits (8.5.15)
Policy Paper with Edits (8.5.15)
 
Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010
Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010
Lunenburg, fred c[1]. state mandated performance testing schooling v1 n1 2010
 
High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...
High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...
High-Stakes Standardized Testing: The Advantages and Disadvantages of Teachin...
 
WA Special Ed. Comparisons. Dammeier Data Request.03.27.15
WA Special Ed. Comparisons. Dammeier Data Request.03.27.15WA Special Ed. Comparisons. Dammeier Data Request.03.27.15
WA Special Ed. Comparisons. Dammeier Data Request.03.27.15
 
Follow
FollowFollow
Follow
 
EDU 710 Lit Rev Outline #2
EDU 710 Lit Rev Outline #2EDU 710 Lit Rev Outline #2
EDU 710 Lit Rev Outline #2
 
20871 prelim2
20871 prelim220871 prelim2
20871 prelim2
 

Similar to Subject Area Benchmark Tests: Indicators of Success on the Alabama ...

Standardized Tests, by Kathy and Mary
Standardized Tests, by Kathy and MaryStandardized Tests, by Kathy and Mary
Standardized Tests, by Kathy and Marymarz_bar_angel_9999
 
Michael, There are two major flaws here, the first being that your
Michael, There are two major flaws here, the first being that yourMichael, There are two major flaws here, the first being that your
Michael, There are two major flaws here, the first being that yourDioneWang844
 
Pro questdocuments 2015-03-16(2)
Pro questdocuments 2015-03-16(2)Pro questdocuments 2015-03-16(2)
Pro questdocuments 2015-03-16(2)Rose Jedin
 
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docxRunning head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docxtoltonkendal
 
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docxRunning head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docxagnesdcarey33086
 
Running head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docx
Running head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docxRunning head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docx
Running head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docxcharisellington63520
 
6Standardized TestingStandardized tests can be defi.docx
6Standardized TestingStandardized tests can be defi.docx6Standardized TestingStandardized tests can be defi.docx
6Standardized TestingStandardized tests can be defi.docxalinainglis
 
Fixing America’s Standardized Testing
Fixing America’s Standardized TestingFixing America’s Standardized Testing
Fixing America’s Standardized TestingAlex Cortez
 
AN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docx
AN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docxAN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docx
AN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docxnettletondevon
 
Robin garzaresearchpaper
Robin garzaresearchpaperRobin garzaresearchpaper
Robin garzaresearchpaperRobin Garza
 
Assessment and accountability during COVID-19
Assessment and accountability during COVID-19Assessment and accountability during COVID-19
Assessment and accountability during COVID-19Analisa Sorrells
 
Robin garzaresearch
Robin garzaresearchRobin garzaresearch
Robin garzaresearchRobin Garza
 
Research
ResearchResearch
Researchmxr028
 
Las Vegas Chamber of Commerce Student Testing Narrative
Las Vegas Chamber of Commerce Student Testing NarrativeLas Vegas Chamber of Commerce Student Testing Narrative
Las Vegas Chamber of Commerce Student Testing NarrativeLas Vegas Chamber of Commerce
 
Action Research Proposal: Literature Review
Action Research Proposal: Literature Review Action Research Proposal: Literature Review
Action Research Proposal: Literature Review J'Nai Whitehead, MSHRM
 
AP Collegeboard Research
AP Collegeboard ResearchAP Collegeboard Research
AP Collegeboard Researchmsstech10
 
Criterion-Referenced Competency Test
Criterion-Referenced Competency TestCriterion-Referenced Competency Test
Criterion-Referenced Competency TestTasha Holloway
 
Action Research Proposal: Problem, Purpose, and Research Questions
Action Research Proposal: Problem, Purpose, and Research Questions Action Research Proposal: Problem, Purpose, and Research Questions
Action Research Proposal: Problem, Purpose, and Research Questions J'Nai Whitehead, MSHRM
 

Similar to Subject Area Benchmark Tests: Indicators of Success on the Alabama ... (20)

Standardized Tests, by Kathy and Mary
Standardized Tests, by Kathy and MaryStandardized Tests, by Kathy and Mary
Standardized Tests, by Kathy and Mary
 
Michael, There are two major flaws here, the first being that your
Michael, There are two major flaws here, the first being that yourMichael, There are two major flaws here, the first being that your
Michael, There are two major flaws here, the first being that your
 
Pro questdocuments 2015-03-16(2)
Pro questdocuments 2015-03-16(2)Pro questdocuments 2015-03-16(2)
Pro questdocuments 2015-03-16(2)
 
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docxRunning head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
 
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docxRunning head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
Running head STANDARDIZED TESTS SECTIONS I AND II1STANDARDIZED.docx
 
Running head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docx
Running head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docxRunning head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docx
Running head MORE THAN STANDARDIZED TESTS1MORE THAN STANDARDIZ.docx
 
6Standardized TestingStandardized tests can be defi.docx
6Standardized TestingStandardized tests can be defi.docx6Standardized TestingStandardized tests can be defi.docx
6Standardized TestingStandardized tests can be defi.docx
 
Fixing America’s Standardized Testing
Fixing America’s Standardized TestingFixing America’s Standardized Testing
Fixing America’s Standardized Testing
 
AN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docx
AN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docxAN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docx
AN INTERIM REPORT ON A PILOT CREDITRECOVERY PROGRAM IN A LAR.docx
 
Robin garzaresearchpaper
Robin garzaresearchpaperRobin garzaresearchpaper
Robin garzaresearchpaper
 
Assessment and accountability during COVID-19
Assessment and accountability during COVID-19Assessment and accountability during COVID-19
Assessment and accountability during COVID-19
 
Robin garzaresearch
Robin garzaresearchRobin garzaresearch
Robin garzaresearch
 
Research
ResearchResearch
Research
 
Las Vegas Chamber of Commerce Student Testing Narrative
Las Vegas Chamber of Commerce Student Testing NarrativeLas Vegas Chamber of Commerce Student Testing Narrative
Las Vegas Chamber of Commerce Student Testing Narrative
 
Action Research Proposal: Literature Review
Action Research Proposal: Literature Review Action Research Proposal: Literature Review
Action Research Proposal: Literature Review
 
TEST PREPARATION.pdf
TEST PREPARATION.pdfTEST PREPARATION.pdf
TEST PREPARATION.pdf
 
AP Collegeboard Research
AP Collegeboard ResearchAP Collegeboard Research
AP Collegeboard Research
 
Criterion-Referenced Competency Test
Criterion-Referenced Competency TestCriterion-Referenced Competency Test
Criterion-Referenced Competency Test
 
Persuasive Essay
Persuasive EssayPersuasive Essay
Persuasive Essay
 
Action Research Proposal: Problem, Purpose, and Research Questions
Action Research Proposal: Problem, Purpose, and Research Questions Action Research Proposal: Problem, Purpose, and Research Questions
Action Research Proposal: Problem, Purpose, and Research Questions
 

More from butest

EL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEEL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEbutest
 
1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同butest
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALbutest
 
Timeline: The Life of Michael Jackson
Timeline: The Life of Michael JacksonTimeline: The Life of Michael Jackson
Timeline: The Life of Michael Jacksonbutest
 
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...butest
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALbutest
 
Com 380, Summer II
Com 380, Summer IICom 380, Summer II
Com 380, Summer IIbutest
 
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet JazzThe MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazzbutest
 
MICHAEL JACKSON.doc
MICHAEL JACKSON.docMICHAEL JACKSON.doc
MICHAEL JACKSON.docbutest
 
Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1butest
 
Facebook
Facebook Facebook
Facebook butest
 
Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...butest
 
Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...butest
 
NEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTNEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTbutest
 
C-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docC-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docbutest
 
MAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docMAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docbutest
 
Mac OS X Guide.doc
Mac OS X Guide.docMac OS X Guide.doc
Mac OS X Guide.docbutest
 
WEB DESIGN!
WEB DESIGN!WEB DESIGN!
WEB DESIGN!butest
 

More from butest (20)

EL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEEL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBE
 
1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIAL
 
Timeline: The Life of Michael Jackson
Timeline: The Life of Michael JacksonTimeline: The Life of Michael Jackson
Timeline: The Life of Michael Jackson
 
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIAL
 
Com 380, Summer II
Com 380, Summer IICom 380, Summer II
Com 380, Summer II
 
PPT
PPTPPT
PPT
 
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet JazzThe MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
 
MICHAEL JACKSON.doc
MICHAEL JACKSON.docMICHAEL JACKSON.doc
MICHAEL JACKSON.doc
 
Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1
 
Facebook
Facebook Facebook
Facebook
 
Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...
 
Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...
 
NEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTNEWS ANNOUNCEMENT
NEWS ANNOUNCEMENT
 
C-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docC-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.doc
 
MAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docMAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.doc
 
Mac OS X Guide.doc
Mac OS X Guide.docMac OS X Guide.doc
Mac OS X Guide.doc
 
hier
hierhier
hier
 
WEB DESIGN!
WEB DESIGN!WEB DESIGN!
WEB DESIGN!
 

Subject Area Benchmark Tests: Indicators of Success on the Alabama ...

  • 1. qwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmrtyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnm<br />Subject Area Benchmark Tests: Indicators of Success on the Alabama High School Graduation ExamCAT 689Dr. RiceMay 3rd, 2008Virginia Vilardi, Robert Vilardi<br />Introduction<br />In order to comply with Adequate Yearly Progress goals set forth by the No Child Left Behind mandate, several school districts in Alabama developed subject specific end-of-course exams. These “benchmark” exams were designed to determine if students had mastered the skills outlined by the Alabama State Course of Study for each course. 
The concepts that are outlined in the course of study are also the basis for the Alabama High School Graduation Exam. Students are expected to have mastered the skills necessary to pass this exam through their coursework in Algebra I, Geometry, English, US History, and Biology. Because the benchmark test and graduation exam measure for similar competencies, one of two conclusions may be drawn. If the tests both measure the same objectives then testing students twice is redundant, time-consuming, and expensive; or, if the tests measure similar objectives then one may be used as an indicator for the other allowing for better and earlier remediation for those students who do not have success on the benchmark tests. This may also have implication in the areas of curricular development and data driven instructional design. <br />Review of Literature<br />Browne-Dianis, Judith A. (2008). Graduation Tests Will Harm Students. Fair Test: The National Center for Fair and Open Testing , 1-2.<br />In the article titled “Graduation Tests Will Harm Students” author Judith Brown-Dianis concludes that the new requirements for graduation test completion in Maryland will only exacerbate an already deteriorating system. Beginning in the 2008-2009 school year Maryland students will be forced to pass four state tests in order to receive their high school diplomas. This requirement is in addition to any other previous requirement of grades, Carnegie Units, or credits. The author then notes that research into high stakes testing has shown that these tests have been linked to decreases in high school completion rates and increased dropout rates. Additionally, the author cites a report by the National Academy of Sciences that indicated the use of high stakes testing does not improve overall educational levels, but rather can be a medium through which students are punished. Of those students negatively impacted by these tests, an inordinate number of them are minority students. In this article the author points to Maryland’s already low graduation rates of 70.4 percent overall and 57 percent for African-American or Latino students. If the students in these schools follow the same pattern as other schools implementing high stakes testing these already low graduation rates will continue to spiral out of control. In conclusion the author demands that educators eliminate their reliance on high stakes testing and allow a student’s twelve years of school speak for themselves. <br />Fisher, M., & Elliott, S. (2004, May 25). Unsure Future: Changes planned, but will test problems be solved? Dayton Daily News , pp. A-1.<br />In “Unsure Future” the authors detail some of the struggles that have plagued Ohio School Districts and Ohio Students. Ohio students are required to pass content specific graduation exam tests in order to graduate from high school. At the time of this article the graduation exam was the Ohio Ninth Grade proficiency test. The skills required by this exam are expected to be mastered by the end of the eighth grade and these tests are completed successfully by approximately 98% of students. The author then describes the test that will replace the current test in the year following. This exam is based on benchmarks and content standards that are on average two years above the current eighth grade proficiency requirement. In the first pilot study of this test the success rate for the mathematics portion was only 23.1% and just over 5% of the African American population were successful on the exam. 
In later studies the success rate was greater, but still only 67.9% of students were successful, and 61% of African American students failed the exam. These abysmal rates improved significantly after the test was modified and the minimum required scores were lowered. Still, the failure of students to perform well on this examination runs contrary to gains seen on the current Ohio Ninth Grade Proficiency Test and the SAT college entrance exam. One explanation for the poor performance is the use of 55 mathematics benchmarks in creating the exam. These benchmarks reflect content the students should be able to accomplish, but the content standards are based on recent revisions to policy and do not necessarily reflect the education received by those taking the exam. This is not the first time Ohio students have had to deal with adversity in testing: when the first Ninth Grade Proficiency Test was administered, only 33% of students passed, and four years later the success rate was 97%. While this educational wake-up call is important for administrators, the authors indicate that it is unfortunate and borderline immoral to hold students accountable for standards and curricula to which they have not been exposed. As the state grapples with hard decisions concerning the graduation exams, it is forced to weigh issues of social consequence and fairness against issues of progress and adequacy. Unfortunately, neither has a straightforward solution.

Zabala, Dalia. (2008). A Move Toward End-of-Course Exams. Washington, D.C.: Center on Education Policy.

Before 2002, school systems regularly used minimum competency exams to determine a student's eligibility to graduate from high school. Since that time, schools have been moving toward comprehensive exams and, more recently, end-of-course exams. The basic skills exams traditionally tested students on skills developed prior to entering high school, most commonly at the eighth-grade level. Comprehensive exams are traditionally targeted at the tenth-grade level and are coupled to state standards for content at that grade level. End-of-course exams measure mastery of specific content in a subject the student has just completed. The use of end-of-course examinations is still in its infancy, but several states are implementing graduation requirements that include competency exams and end-of-course exams.

Kober, Nancy. (2004). Test Talk: My School Didn't Make Adequate Yearly Progress - So What Does That Mean? Washington, D.C.: Center on Education Policy.

As part of the No Child Left Behind Act of 2002, schools were required to meet specific benchmarks developed by both federal and state educational administrators. To meet the standards set for Adequate Yearly Progress, a school must ensure that every major subgroup in its population tests at the proficient level as determined by the state board of education. Schools must also have a minimum of 95% participation from each of these subgroups on the state-mandated examination. In addition, each school is required to meet at least one other benchmark prescribed by the state. 2003 marked the first year schools were evaluated for Adequate Yearly Progress under the NCLB Act of 2002. Of the almost 88,000 schools evaluated, 32% failed to meet AYP goals.
As schools make progress toward their AYP goals, they must also have a plan for continual improvement of proficiency. According to NCLB guidelines, schools must continuously improve and must have a plan in place, with benchmarks, to ensure that they reach 100% proficiency by 2014. While these goals are ambitious, they may be unattainable, since they require gains in achievement greater than those currently seen in the best-achieving schools. To meet these goals, state boards need to analyze how the performance targets are set and allow for some flexibility in a school's accountability program. Additionally, state systems should move beyond a single proficiency goal toward an improvement goal determined by where a school starts its improvement plan. Finally, more factors than simply a test score need to be analyzed, such as dropout rates or the percentage of students taking rigorous college preparatory courses. Although AYP targets must be set to meet federal guidelines, some of these alternatives may give schools much-needed options when facing an AYP shortcoming.

Rubenstein, Grace. (2008). Reinventing the BIG TEST. Edutopia, 4(2), 32-37.

"Reinventing the Big Test" evaluates the merit of today's standardized tests. The author believes that, though accountability tests report numbers with authority, they have two vital flaws: they encourage drill-and-practice teaching, and they do not show the quality of student learning. Standardized tests are often the measure policy makers use to determine whether schools have met their adequate yearly progress goals and therefore have satisfied No Child Left Behind. The author states that these tests have some basic flaws, including the difficulty level of the tests, scoring errors, and ambiguous questions. Other flaws include the psychological pressure of one test determining a student's future opportunities, or the pressure of taking one of these high-stakes tests while the student is ill or having personal issues at home. The author researched alternatives to these critical assessments, found some alternative methods of evaluation, and advises going further, including revamping the whole testing program. While this may sound radical, some of it is already taking place: some schools use portfolios for evaluation in place of exams, others use presentations, and still others have developed interactive computer programs that test students' skills. Test writers are skeptical about the feasibility of developing new large-scale, complex tests that would measure student learning in a more sophisticated manner. The large-scale test may need to be replaced with assessments tailored to a school or an area, but this would raise a whole new set of issues not discussed here.

Kober, Nancy. (2002). What Tests Can and Cannot Tell Us. Test Talk for Leaders, 2, 1-15.

The Center on Education Policy reviewed testing's strengths and limitations. This article was a qualitative study written for policy makers to help educate them on the viability of today's standardized tests. Today's high-stakes standardized tests give a wealth of information that is hard to come by with the same amount of time, effort, and expense. Testing still has a price, but compared to other evaluation methods it gives the most bang for the buck. These tests provide standard, consistent information from school to school and even state to state.
Many feel that this information is more valuable than individual teachers' appraisals of student performance. For example, most colleges will not even look at a student's application without an SAT or ACT score attached, because an "A" in Mr. Smith's class at one high school might only be a "C" in Ms. Jones's class at another. However, test scores are not absolute; they are more of an estimate, or a range of scores. Most test makers build this variation into the standard error for the test, and the range needs to be taken into account when adequate yearly progress (AYP) reports for a school or district are examined. Large fluctuations in scores normally have a reason, such as higher-than-usual immigration or emigration in the student population; a higher-than-normal special education population can also affect results. This is why AYP results for No Child Left Behind can now be averaged over three years, to help correct for the apparent absoluteness of some tests. Tests are indispensable for providing comparable information, but using them in conjunction with other methods of grading student work can provide a deeper understanding of a student's true capabilities. One test should not make or break a student's or a school's future.

Toch, Thomas. (2006). Turmoil in the Testing Industry. Educational Leadership, 64(3), 53-57.

This article was a summary of a qualitative study drawing on testing industry executives, state testing officials, and other testing experts. These experts examined the results of the No Child Left Behind (NCLB) testing demands on all states. Adequate Yearly Progress on standardized tests is the measuring stick by which schools are held accountable. Some of the problems with these standardized tests are that they do not always align with the state's course of study, or that they test only lower-level thinking skills. Other problems include the strain on the test-making industry to generate multiple tests quickly. Making a credible test takes time: it involves not only writing the test but also proofing it, standardizing the passing score by administering it to students, checking it for racial bias, and much more. These things are not done, or not done well, when a test is rushed out in a limited amount of time. Cost is also an issue. A test that can be run through a machine costs pennies to score, whereas a test that must be scored by hand costs $0.50 to $5.00; when cost is an issue, multiple choice wins hands down over essay. NCLB testing is being called a race to the bottom by the testing industry, which is of course the complete opposite of NCLB's goals. States must meet AYP to continue receiving funding; therefore, changing the standards at the state level allows schools to achieve results without actually showing improvement. Policy makers must acknowledge what is going on and address it, or the goals of NCLB will not be worth all of the effort that so many have made to meet them.

Terry, Brooke Dollens. (2007). End-of-course exams as a measuring stick. Beaumont Journal, 1-3.

This article compared the Texas graduation exam, the TAKS, with end-of-course tests. It was a qualitative observation stating that students were receiving inflated grades and that students taking honors courses were not truly doing honors work. Students who had received credit for courses such as Geometry and Algebra II could not pass the mathematics section of the TAKS, and this was especially apparent among lower-income students.
The author believes that an end-of-course exam would eliminate issues such as this and would give timely feedback to students and parents. The TAKS is not given until the end of the eleventh grade, leaving students little time to remediate problems before facing graduation. Fifteen states and several countries in Europe and East Asia already use end-of-course exams instead of exit or graduation exams. The curriculum is normally only as rigorous as the test used to measure it; if you want to up the ante, toughen the test.

Toch, Thomas. (2008). Test Results and Drive-By Evaluations. Education Sector, 1-3.

This article was an observational, qualitative study of what test scores actually measure, examining a proposal by New York City Chancellor Joel I. Klein. Klein wants to rate teachers according to their students' success rates on standardized tests. The problem with this is that only a portion of instructors teach subjects covered by the exam, which would make team teaching or remediation assignments more valuable in these schools. Klein also wants teacher reviews to be much more in-depth. Current evaluations run through a checklist of items (such as being presentably dressed) but do not normally focus on the quality of instruction. Tougher teacher evaluations would do a lot to take away seniority privileges and favoritism, and they would help promote administrative support. Many teachers would rather work in a school with better administrative support than in one that pays more money, and teachers with administrative support also show less opposition to being judged by their students' test scores. It seems that if you are doing a quality job, the boss knows it and supports you regardless of the numbers.

Neill, Monty. (2008). Fair Test. National Center for Fair and Open Testing, 1-4.

This article was an evaluation of the progress achieved by mandated high school graduation exams and advice to the state of Rhode Island on constructing this type of exam. It was a quantitative and qualitative study comparing the rigor of graduation exams with dropout rates. The author claims that the evidence clearly shows that exit exams do more harm than good; he believes they stifle creativity, critical thinking, and teamwork. He says he understands the need for testing, but that at present this type of testing tends to harm disadvantaged and lower-income students far more than it assists them. In states where the exit exam has become increasingly difficult, the dropout rate has risen proportionally, and minority dropout rates in these states are two to three times higher than those of white students. The author advises Rhode Island to adopt a multi-tiered evaluation route similar to Wyoming's or Nebraska's rather than invest in an exit exam that is sure to harm more students than it helps.

Zabala, D., Minnici, A., McMurrer, J., Hill, D., Bartley, A. P., & Jennings, J. (2007). State High School Exit Exams: Working to Raise Test Scores. Center on Education Policy, 1-21.

This article was a summary of a study by the Center on Education Policy reviewing the current year's issues with high school exit exams. It was both a quantitative and a qualitative study reviewing what was and was not working in the states that use exit exams to determine graduation. The practices that were working were intervention and remediation at the state and local levels. Most state exit exams are aligned to grade ten and are supposed to measure mastery of the state curriculum.
All states administering exit exams show gaps in passing rates between various groups of students; white, English-speaking students tend to do better than ESL or African American students. The article also noted that several states are moving toward end-of-course exams or toward a dual process using both an exit exam and end-of-course exams. By 2015, twelve states will be using end-of-course exams, compared with eighteen that will require standards-based exit exams and four that will require a combination of both. All of these represent a move away from the minimum competency exams once commonly used as exit exams by many states.

Problem and Significance

High-stakes testing is a contentious issue in the current climate of consumer-driven educational strategies, in which individual school systems endure significant scrutiny of their scores on various tests. These systems are required to explain, account for, and improve any deficiencies in the educational process, both real and perceived. Because so much is at stake, administrators look for opportunities to increase performance, such as remediation, intervention, and content-specific review. Additionally, administering these tests takes an average of nine instructional days per semester, with benchmark testing taking four and the Alabama High School Graduation Exam (AHSGE) taking five. All of these activities take away from traditional course instruction and can affect both the fluidity and the pace of instruction. In order to meet the content requirements outlined by the course of study, teachers find themselves rushing through important concepts and failing to give the quality instruction they know is necessary to reach higher levels of learning. Students are often taught using comprehension- or knowledge-based instructional strategies to keep pace and to "practice" test-taking skills, while forfeiting a truly meaningful educational experience. Do the additional days of benchmark testing assist students, or do they harm them by taking away additional educational opportunities? Through this study we examine the possibility that these subject area benchmark tests can be used as indicators of success on the AHSGE, increasing the benefits gained from the benchmark tests.

Purpose

The purpose of this study is to examine whether subject-specific end-of-course exams can be used as an indicator of success on the Alabama High School Graduation Exam. If a correlation between the tests can be established, then remediation opportunities can be implemented before the graduation exam is administered.

Research Design

The use of end-of-course exams as an indicator of student achievement is a fairly new idea, with most states still opting to use comprehensive examinations. Many states are beginning to do research with end-of-course exams, and several plan to establish these tests as graduation requirements by 2015. To determine whether these examinations could be used as indicators for the Alabama High School Graduation Exam, a thorough review of the literature was performed to determine what research has been completed in this area. The findings of this review revealed many issues with current testing policies, as well as a shortage of viable alternatives to traditional graduation or exit exams. Because many of these end-of-course exams are still in developmental stages, there were no long-term results to compare to this study.
However, the Center on Education Policy has stated that it will include research into this area in its August 2008 report.

Data Collection

The population for this study was the tenth-grade student body at Wetumpka High School. This group had the greatest percentage of students who had taken both the AHSGE and subject-specific end-of-course exams. The data were supplied by STI Assessment, the school's database. Student scores on each of the subject tests were collected in the areas of Algebra IB, Geometry, English, and Social Studies and then paired with the students' scores on the AHSGE.

Data Analysis

Student data were entered into a spreadsheet with values of zero or one. Students who had passed a portion of the AHSGE were given a score of one in the respective category, and students who had failed that portion were given a zero. Because there are no set parameters for passing the subject area end-of-course tests, students were evaluated against the standard ten-point grading scale passing score of 60%: students who met this score were given a one, and students who did not were given a zero. This was completed for each subject in which an end-of-course exam could be paired with the graduation exam. AHSGE requirements for science are currently being evaluated, and no scores were available at this time. Additionally, some of the end-of-course exams pertained to grade levels below the tenth grade; these were not evaluated because the graduation exam is first given in the tenth grade. The AHSGE tests that were evaluated were the language, reading, mathematics, and social studies tests. Because the English benchmark test contains both reading and language skills, it was compared to the reading and language portions of the AHSGE separately and then in combination to determine the overall comparison. In analyzing the mathematics benchmark tests, both Algebra IB and Geometry scores were used; because these courses have significantly different content standards, the exams were compared only separately. Additionally, in the Algebra IB end-of-course exam results, only two students out of 31 earned a passing score. This, along with interviews with Algebra IB instructors, suggests that the end-of-course test may be invalid for measuring mastery of course content.
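The coding and comparison procedure described above can be reproduced with a short script. The sketch below is illustrative only and was not part of the original analysis; the file name, column names, and the application of the 60% cutoff to raw end-of-course percentages are assumptions made for the example.

    # Minimal sketch of the pass/fail coding and prediction percentages.
    # Assumes a CSV with one row per student and two columns (hypothetical names):
    #   eoc_score  - raw end-of-course exam percentage (0-100)
    #   ahsge_pass - 1 if the student passed the AHSGE subtest, 0 otherwise
    import csv

    def prediction_percentages(rows, pass_cutoff=60):
        """Return (% passing the AHSGE among EOC passers,
                   % failing the AHSGE among EOC failers,
                   % of AHSGE outcomes matched by the EOC result)."""
        eoc = [1 if float(r["eoc_score"]) >= pass_cutoff else 0 for r in rows]
        ahsge = [int(r["ahsge_pass"]) for r in rows]
        passers = [a for e, a in zip(eoc, ahsge) if e == 1]
        failers = [a for e, a in zip(eoc, ahsge) if e == 0]
        pass_given_pass = 100.0 * sum(passers) / len(passers) if passers else 0.0
        fail_given_fail = 100.0 * failers.count(0) / len(failers) if failers else 0.0
        overall = 100.0 * sum(1 for e, a in zip(eoc, ahsge) if e == a) / len(eoc)
        return pass_given_pass, fail_given_fail, overall

    with open("us_history_eoc_vs_ahsge.csv") as f:   # hypothetical file name
        rows = list(csv.DictReader(f))
    print(prediction_percentages(rows))

Applied to each subject's paired scores, a function of this form yields the three percentage columns reported in the Results table below.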
Results

AHSGE Test | End-of-Course Test | Sample Size (n) | % Passing AHSGE When EOC Exam Passed | % Failing AHSGE When EOC Exam Failed | % of AHSGE Outcomes Correctly Predicted
Social Science | US History I | 80 | 92.5926 | 79.2453 | 83.75
Reading | English 10 | 111 | 81.4815 | 62.5 | 71.8182
Language | English 10 | 111 | 72.2222 | 62.5 | 67.2727
Combined Reading and Language | English 10 | 111 | 64.8148 | 62.5 | 63.6364
Mathematics | Algebra IB | 31 | 50.0* | 65.5172 | 64.5161
Mathematics | Geometry | 60 | 91.3043 | 32.4324 | 55.0

*Only two of the 31 students tested earned a passing score on the Algebra IB end-of-course exam.

The results in social science indicated a strong connection between scores on the AHSGE and the end-of-course exam in US History I. Students who earned a passing score on the EOC exam were 92.5926% likely to also earn a passing score on the AHSGE, and students who failed the EOC exam were 79.2453% likely to also fail the AHSGE. Considered together, the US History I EOC exam successfully predicted the outcome on the AHSGE 83.75% of the time.

The results in reading also indicated a strong connection between scores on the AHSGE and the end-of-course exam in English 10. Students who earned a passing score on the EOC exam were 81.4815% likely to also earn a passing score on the AHSGE, and students who failed the EOC exam were 62.5% likely to also fail the AHSGE. Considered together, the English 10 EOC exam successfully predicted the outcome on the reading portion of the AHSGE 71.8182% of the time.

The results in language indicated a moderately strong connection between scores on the AHSGE and the end-of-course exam in English 10. Students who earned a passing score on the EOC exam were 72.2222% likely to also earn a passing score on the AHSGE, and students who failed the EOC exam were 62.5% likely to also fail the AHSGE. Considered together, the English 10 EOC exam successfully predicted the outcome on the language portion of the AHSGE 67.2727% of the time.

When reading and language were combined, the results indicated a moderate connection between scores on the AHSGE and the end-of-course exam in English 10. Students who earned a passing score on the EOC exam were 64.8148% likely to also earn a passing score on the AHSGE, and students who failed the EOC exam were 62.5% likely to also fail the AHSGE. Considered together, the English 10 EOC exam successfully predicted the combined outcome on the AHSGE 63.6364% of the time.

The results in mathematics indicated a connection between scores on the AHSGE and the end-of-course exam in Algebra IB. Students who earned a passing score on the EOC exam were 50% likely to also earn a passing score on the AHSGE; however, only two students passed the EOC exam, a sample too small to assign any significance. Students who failed the EOC exam were 65.5172% likely to also fail the AHSGE. Considered together, the Algebra IB EOC exam successfully predicted the outcome on the AHSGE 64.5161% of the time.

The results in mathematics indicated a mixed connection between scores on the AHSGE and the end-of-course exam in Geometry. Students who earned a passing score on the EOC exam were 91.3043% likely to also earn a passing score on the AHSGE, but students who failed the EOC exam were only 32.4324% likely to also fail the AHSGE. Considered together, the Geometry EOC exam successfully predicted the outcome on the AHSGE 55.0% of the time.

Conclusions

The results of this study indicate that subject-specific end-of-course examinations can be used as a predictive tool for the Alabama High School Graduation Exam. Although the effectiveness of the indicator varied from subject to subject, the end-of-course exams successfully predicted the outcome of the AHSGE 67.6656% of the time. The connection between the tests is apparent, allowing schools to make remediation opportunities available before the graduation exam is administered. Additionally, if teachers and administrators further revise the EOC exams to rectify the problems exposed by this study, the predictive properties of these exams can be expected to increase, making them more viable as tools for improved student achievement. While the administration of these exams is inconvenient for educators and puts further restrictions on already tight schedules, the validity of these tests for measuring content standards is significant, and the use of subject area end-of-course examinations should therefore continue.
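As an arithmetic check, the overall figure cited in the Conclusions is consistent with a simple unweighted average of the six subject-level overall prediction percentages from the Results table; the snippet below is illustrative only and does not reflect how the figure was originally computed.

    # The overall prediction figure matches the unweighted mean of the six
    # subject-level overall percentages reported in the Results table.
    overall = [83.75, 71.8182, 67.2727, 63.6364, 64.5161, 55.0]
    print(sum(overall) / len(overall))   # about 67.6656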
References

Browne-Dianis, Judith A. (2008). Graduation Tests Will Harm Students. Fair Test: The National Center for Fair and Open Testing, 1-2.

Fisher, M., & Elliott, S. (2004, May 25). Unsure Future: Changes planned, but will test problems be solved? Dayton Daily News, p. A-1.

Kober, Nancy. (2002). What Tests Can and Cannot Tell Us. Test Talk for Leaders, 2, 1-15.

Kober, Nancy. (2004). Test Talk: My School Didn't Make Adequate Yearly Progress - So What Does That Mean? Washington, D.C.: Center on Education Policy.

Neill, Monty. (2008). Fair Test. National Center for Fair and Open Testing, 1-4.

Rubenstein, Grace. (2008). Reinventing the BIG TEST. Edutopia, 4(2), 32-37.

Terry, Brooke Dollens. (2007). End-of-course exams as a measuring stick. Beaumont Journal, 1-3.

Toch, Thomas. (2006). Turmoil in the Testing Industry. Educational Leadership, 64(3), 53-57.

Toch, Thomas. (2008). Test Results and Drive-By Evaluations. Education Sector, 1-3.

Zabala, Dalia. (2008). A Move Toward End-of-Course Exams. Washington, D.C.: Center on Education Policy.

Zabala, D., Minnici, A., McMurrer, J., Hill, D., Bartley, A. P., & Jennings, J. (2007). State High School Exit Exams: Working to Raise Test Scores. Center on Education Policy, 1-21.