Reading First Evaluation in Georgia: A Multidimensional Approach • Ken Proctor, Reading First Director, Georgia Department of Education • Michael C. McKenna, University of Virginia
Overview of External Evaluation • University of Georgia, College of Education • Reading Education Department • Educational Research Lab, Test Scoring and Reporting Services (TSARS) • Occupational Research Group • Implementation, Impact, and Progress • April 2004 – June 2007 (3 full years of program implementation)
Key UGA Personnel for External Evaluation • Reading Education Department • Michelle Commeyras, Professor, Reading Education • Donna Alvermann, Distinguished Research Professor, Reading Education • Doctoral-level reading education students/field researchers • TSARS-Educational Research Lab (number crunching) • Steve Cramer & Allan Cohen – Professors, Educational Psychology • Occupational Research Group • Dorothy Harnish, Assoc. Research Scientist, Project Director
Annual State Reporting • Has the state made progress in increasing the percentage of students reading at grade level or above? • Which RF schools have made the largest gains in student reading achievement?
Annual State Reporting • Disaggregated data for students reading at or above grade level: • Economically disadvantaged students • Students from major racial and ethnic groups • Students with disabilities • Students with limited English proficiency • Student performance data for percentage of students in RF schools in grades 1, 2, and 3 reading at or above grade level
Evaluation Questions • Impact • Implementation • Progress
Evaluation Questions: Impact • What is the impact of Reading First (RF) on student achievement in reading as measured by standardized test scores? Is reading achievement in RF schools higher than in non-RF schools?
Evaluation Questions: Implementation • Is the RF program being implemented by schools as intended in the Georgia RF plan? How does the level of implementation of RF relate to results being achieved in RF schools? Is the level of RF implementation positively correlated with higher reading achievement?
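A minimal sketch of how the correlation question could be checked, assuming a school-level implementation rating and a mean achievement score per school; the values below are invented placeholders, not Georgia data.

```python
# Illustrative only: correlate a school-level RF implementation rating with
# mean reading achievement. All values are hypothetical placeholders.
from scipy.stats import pearsonr

implementation = [2.5, 3.0, 3.8, 4.1, 4.5, 2.8, 3.6, 4.9]   # e.g., 1-5 rubric score
itbs_mean      = [182, 185, 191, 194, 199, 180, 190, 203]   # mean ITBS reading score

r, p = pearsonr(implementation, itbs_mean)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```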
Evaluation Questions: Progress • What progress is being made by RF schools in improving student reading achievement? Where progress is not apparent, what are the reasons for this? What interventions are required? • Are RF teachers more knowledgeable of scientifically based reading research after three years of professional learning experiences?
Impact of RF on Student Reading Achievement • Comparison of RF and non-RF schools on ITBS reading comprehension scores in grade three • Compare RF-funded schools with sample of non-RF schools matched by key demographic variables • Are there significant differences in reading test scores that can be explained by the RF program when other variables are controlled?
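One way the "controlled" comparison could look in practice, sketched here with hypothetical school-level columns (itbs_rc, rf, pct_frl, pct_lep) rather than the actual evaluation data set.

```python
# Illustrative sketch: does an RF indicator predict grade 3 ITBS reading
# comprehension once demographic covariates are controlled?
# Column names and values are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.DataFrame({
    "itbs_rc": [185, 192, 178, 201, 188, 195, 183, 199],  # mean scale score
    "rf":      [1, 1, 1, 1, 0, 0, 0, 0],                  # 1 = RF-funded school
    "pct_frl": [82, 75, 90, 68, 80, 74, 88, 70],           # % free/reduced lunch
    "pct_lep": [12, 8, 20, 5, 10, 9, 18, 6],               # % limited English proficient
})

model = smf.ols("itbs_rc ~ rf + pct_frl + pct_lep", data=schools).fit()
print(model.summary().tables[1])   # coefficient on `rf` is the adjusted RF difference
```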
Impact of RF on Student Reading Achievement • Year-to-year changes in RF schools • Grades 1, 2, and 3 ITBS scores in reading • Beginning in year 2, compare each grade to previous year mean scores and percent reading at grade level • Identify significance of changes for each grade compared to scores for previous year for that same grade
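A minimal sketch of the grade-by-grade, year-to-year comparison, using invented scores and an assumed grade-level cut score.

```python
# Illustrative sketch: same grade, two successive years, independent-samples
# t-test on ITBS reading scores. Scores and the cut score of 170 are invented.
from scipy.stats import ttest_ind

grade2_year1 = [168, 172, 175, 163, 170, 177, 165, 171]
grade2_year2 = [171, 176, 178, 169, 174, 180, 170, 175]

t, p = ttest_ind(grade2_year2, grade2_year1)
print(f"t = {t:.2f}, p = {p:.3f}")

for label, scores in [("Year 1", grade2_year1), ("Year 2", grade2_year2)]:
    pct = 100 * sum(s >= 170 for s in scores) / len(scores)
    print(f"{label}: {pct:.0f}% at or above grade level")
```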
Impact of RF on Student Reading Achievement • Cohort analysis of RF students • First grade cohort each year of RF program • Track cohort in subsequent grades each year to identify changes in percent reading at grade level using ITBS scores • Identify significance of changes for cohorts after one or two years of RF instruction
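The cohort analysis could be sketched as a two-proportion test on the percent reading at grade level, with the counts below as placeholders.

```python
# Illustrative sketch: the same cohort is tracked from grade 1 to grade 2, and
# the change in the proportion reading at grade level is tested. Counts invented.
from statsmodels.stats.proportion import proportions_ztest

at_level = [620, 710]     # students at/above grade level in grade 1, then grade 2
tested   = [1000, 1000]   # cohort size tested each year

pct = [100 * c / n for c, n in zip(at_level, tested)]
z, p = proportions_ztest(count=at_level, nobs=tested)
print(f"At/above grade level: {pct[0]:.0f}% (grade 1) -> {pct[1]:.0f}% (grade 2)")
print(f"z = {z:.2f}, p = {p:.3f}")
```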
Impact of RF on Student Reading Achievement • Comparison of CRCT (Georgia Test) results • Means of confirmatory evidence of RF impact • CRCT passing rates in reading for grades 1, 2, and 3 • Identify year-to-year changes in pass rates for RF schools • Compare RF student pass rates to those of entire state of GA each year
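A sketch of how the confirmatory CRCT comparison could be tabulated, using invented pass rates for RF schools and the state.

```python
# Illustrative sketch: year-to-year change in CRCT reading pass rates for RF
# schools, set beside the statewide rate. All rates are hypothetical.
import pandas as pd

rates = pd.DataFrame(
    {"rf_pass_pct": [78, 82, 85], "state_pass_pct": [84, 85, 86]},
    index=["Year 1", "Year 2", "Year 3"],
)
rates["rf_change"] = rates["rf_pass_pct"].diff()          # year-to-year gain
rates["gap_to_state"] = rates["rf_pass_pct"] - rates["state_pass_pct"]
print(rates)
```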
Fidelity of Implementing Reading First Instructional Strategies • Observations of Reading First Teachers • Interviews with Literacy Coaches • Survey Questionnaires
Observations of Reading First Teachers • Instructional Content Emphasis (I.C.E.) • Half of the schools are observed in the spring and the other half in the fall. Year-to-year (spring-to-spring and fall-to-fall) comparisons are made from observation data. • Observers are UGA professors and doctoral-level graduate students
Use of Observation Data • Assess what instructional strategies are being used in RF classrooms • Identify percent of teachers and schools using key RF strategies (status of implementation) and changes in use from year to year • Report on statewide trends for grade levels and for all RF schools
Literacy Coach Surveys – Monthly • On-line, web-based ongoing reporting system • Assess implementation processes and concerns of key participants • The UGA Reading Education team develops the surveys • State department RF staff review and approve surveys prior to administration • Data are used in a formative manner to inform decisions made by the Professional Development Architects and the GADOE regarding upcoming professional development sessions.
Interviews with Literacy Coaches • Telephone interviews once a year (end-of-year report)
Use of LCs' Time Related to DIBELS Results • DIBELS data compared to literacy coaches' self-reported time on various tasks.
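A hedged sketch of this comparison, assuming one self-reported time category (hours spent modeling lessons) and a school-level DIBELS gain; both series are invented.

```python
# Illustrative sketch: relate coaches' self-reported monthly hours spent
# modeling lessons to mean DIBELS oral reading fluency gain at their school.
# All figures are hypothetical placeholders.
from scipy.stats import spearmanr

hours_modeling = [4, 10, 6, 12, 3, 8, 15, 5]                 # per coach, per month
dibels_gain    = [2.1, 5.4, 3.0, 6.2, 1.8, 4.5, 7.0, 2.6]    # mean ORF gain per school

rho, p = spearmanr(hours_modeling, dibels_gain)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```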
Yearly Surveys • Reading First teachers • Reading First school administrators • Parents of Reading First students • Regional Reading First consultants
Progress in Reading Achievement • Within-school growth in reading achievement at RF schools • DIBELS subtest scores in grades K-3 • Pre-post analysis each year (beginning and end-of-year test results) • Identify areas of progress across RF schools • Teacher knowledge survey of SBRR (pre-post analysis)
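The pre-post analysis could be sketched as a paired comparison; the DIBELS scores below are placeholders, and the same pattern would apply to the SBRR teacher knowledge survey.

```python
# Illustrative sketch: beginning- and end-of-year DIBELS scores for the same
# students, compared with a paired t-test. Scores are invented.
from scipy.stats import ttest_rel

fall   = [22, 31, 18, 40, 27, 35, 20, 29]   # e.g., DIBELS ORF, beginning of year
spring = [48, 55, 37, 71, 50, 62, 41, 53]   # same students, end of year

t, p = ttest_rel(spring, fall)
mean_gain = sum(s - f for s, f in zip(spring, fall)) / len(fall)
print(f"Mean gain = {mean_gain:.1f} words/min")
print(f"paired t = {t:.2f}, p = {p:.3f}")
```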
“Conventional Wisdom” Old CW: RF evaluation should focus on the bottom line – achievement New CW: RF evaluation must focus on contributory factors as well
Why Fine-Grained Evaluation Is Needed • Degree of implementation is important. • RF is not monolithic across states, districts, schools, or classrooms. • Causal relationships cannot be identified by looking only at the effect and assuming a uniform, consistent cause.
So, did Reading First work? That’s not the right question.
Reading First has (perhaps unintentionally) created a variety of circumstances nationwide that … • can be viewed as quasi-experiments • are long-term and replicable • are numerous enough to address important questions across sites • are also rich in qualitative data
Predictions • Many studies of RF will appear over the next decade, in addition to the comprehensive national report. • These studies will be conducted by independent researchers and will appear in a variety of forums. • The “treatment” in these studies will not be RF in a uniform sense, but a set of factors that RF has made possible.
Cardiac Method of Evaluation
Hypotheses • Findings will converge across studies, substantiating the impact of certain factors. • These factors will include: • instructional practice aligned with SBRI • emergence of strong teacher support for the initiative as a result of PD • presence of a knowledgeable, tactful, and assertive coach
So, what will the right question be? What has Reading First taught us?