What Impacts Alternate Assessment Scores • Presented by Diane M. Browder, PhD, UNC Charlotte, and Ginevra Courtade-Little, Charlotte-Mecklenburg Schools
Authors: Diane M. Browder, Meagan Karvonen, and Stephanie Davis (University of North Carolina at Charlotte); Kathy Fallin and Ginevra Courtade-Little (Charlotte-Mecklenburg Schools)
Background • IDEA 1997, NCLB • The alternate assessment (AA) process may be used to improve educational programs (Browder, Spooner, Algozzine, et al., 2003) • Little research has examined the relationship between educational programs and alternate assessment outcomes • Teachers of students with significant disabilities may feel pressure to improve alternate assessment scores without knowing how to do so • Training is needed that relates directly to the requirements of AA and is evaluated in the context of real school programs
Purpose • To determine if training teachers would improve students’ outcome scores on alternate assessments
Hypothesis • Training teachers on three instructional variables that influence outcomes (curriculum access, data collection, and instructional effectiveness) would increase alternate assessment scores • A teacher-training manual was developed that summarized current research on: • how to select skills for students with significant disabilities • how to develop data collection systems • how to improve instruction if students do not make adequate progress
Research Questions • Does training in curriculum, data collection, and data-based decisions increase alternate assessment scores? • Is there evidence that teachers used the instructional skills taught in training? • Is there evidence that students learned new skills? • How did teachers and parents perceive the gains made?
Method-Setting • Charlotte-Mecklenburg Schools • Large urban consolidated city-county district • Served 112,500 students (grades preK-12) • 75% of those students were in grades that required testing • 294 students participated in the North Carolina Alternate Assessment Portfolio (NCAAP), about 0.3% of the assessed population
Method-Participants • Teachers: 27 initially nominated by a school system administrator; 93% female; 81% Caucasian, 19% African American; experience 0-24 years (M = 10 years); 5 lateral-entry teachers; pool comparable to the district as a whole • Students: 29 initially selected by teacher and parent nominations; 24% female; 60% Caucasian, 34% African American, 3% Hispanic, 3% Asian; 21% with severe/profound disabilities, 36% with autism, 39% with moderate mental disabilities, 4% with multiple disabilities
Method-Intervention • North Carolina Alternate Assessment Portfolio • Portfolio of evidence collected to demonstrate mastery and generalization of 5 teacher-selected IEP goals related to state standards • Instructional implications addressed in training • Selecting appropriate objectives consistent with state standards • Creating data sheets that met state requirements for baseline data and ongoing student performance data • Developing instruction that would promote generalization and mastery of the objectives by the turn-in deadline
Method-Intervention • Instructional Components Training • Curriculum: guidelines for reviewing the IEP to determine whether it provided access to the curricular domains required by the state • Data collection: how to develop data collection systems (models provided) • Data-based decisions: guidelines for making instructional decisions from the collected data
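To make the data-collection component more concrete, the sketch below shows one way a per-objective data sheet could be represented for baseline and ongoing performance data. It is a minimal illustration only; the field names, phase labels, and prompt levels are assumptions, not the state's required data-sheet format or the models provided in the training manual.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TrialRecord:
    """One data-sheet entry for a single session (field names are illustrative)."""
    session_date: date
    phase: str               # "baseline" or "instruction" (assumed labels)
    percent_correct: float   # percent of trials the student performed correctly
    prompt_level: str        # e.g., "independent", "verbal", "physical" (assumed)

@dataclass
class ObjectiveDataSheet:
    """Data sheet for one IEP objective, holding baseline and ongoing data."""
    objective: str
    mastery_criterion: float              # percent correct required for mastery
    records: list = field(default_factory=list)

    def baseline_mean(self) -> float:
        """Mean percent correct across baseline sessions (0.0 if none recorded)."""
        baseline = [r.percent_correct for r in self.records if r.phase == "baseline"]
        return sum(baseline) / len(baseline) if baseline else 0.0
```

In this reading, a teacher or project staff member could compare ongoing records against `mastery_criterion` and `baseline_mean()` when making the data-based decisions described above.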
Method-Intervention • Method of teacher training • Teacher-training manual • Five group inservice days: introduction of the project, one day devoted to each instructional component, and evaluation of the project • 3 site visits by project staff to answer questions and model the data-based decision-making process
Method-Instrumentation • Primary DV: score on the NCAAP assigned by NCDPI • Student scores obtained for the year prior to intervention and for the intervention year • Scores also obtained for comparison groups of students in CMS and across the state • To obtain a proficient score, performance had to be: • Below mastery during baseline • At the criterion for mastery by the end of the year • Maintained at criterion for most of the last days of the school year • Generalized across people and settings • Initiated (unprompted correct responses or use of the skill in response to natural cues)
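The proficiency requirements listed above read as a conjunction of conditions that the portfolio evidence must satisfy. The sketch below expresses that reading directly; the flag names are illustrative assumptions, not the NCAAP scoring rubric or NCDPI's actual scoring logic.

```python
from dataclasses import dataclass

@dataclass
class PortfolioEvidence:
    """Summary flags for one objective's evidence (illustrative names only)."""
    below_mastery_at_baseline: bool
    at_mastery_criterion_by_year_end: bool
    maintained_criterion_late_in_year: bool
    generalized_across_people_and_settings: bool
    initiated_without_prompts: bool

def meets_proficiency_criteria(e: PortfolioEvidence) -> bool:
    """True only if all five conditions described in the training are met."""
    return (e.below_mastery_at_baseline
            and e.at_mastery_criterion_by_year_end
            and e.maintained_criterion_late_in_year
            and e.generalized_across_people_and_settings
            and e.initiated_without_prompts)
```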
Method-Instrumentation • Additional variables measured • NCAAP portfolio quality score • Teacher use of instructional components • Data collected on curriculum, data sheets, and data-based decisions • Teacher survey to identify resources used • Behavioral data on IEP goals • Reliability of teacher data • Percentage of growth on skills, to determine whether criteria were set at an appropriate level • Stakeholder perception surveys • To determine whether teachers and parents considered gains made on portfolio objectives important
Method-Design • Quasi-experimental pretest-posttest design • Pretest: scores from the year prior to the intervention • Posttest: scores from the intervention year • Scores from students in the school system whose teachers did not participate in the project were used as a comparison group, approximating a pretest-posttest control-group design
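The presentation does not describe the statistical analysis, but the logic of a pretest-posttest comparison-group design can be illustrated as contrasting average score gains between the project group and the comparison group. The sketch below is only that illustration (it assumes pre and post scores are paired in order within each group); it is not the study's actual analysis.

```python
def mean_gain(pre_scores, post_scores):
    """Average posttest-minus-pretest change for one group (scores paired in order)."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

def gain_difference(project_pre, project_post, comparison_pre, comparison_post):
    """Positive values mean the project group's average gain exceeded the comparison group's."""
    return mean_gain(project_pre, project_post) - mean_gain(comparison_pre, comparison_post)
```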
Results • Did teacher training positively influence the state alternate assessment scores?
Results-Teachers' Use of Instructional Components • 92% of teachers mastered the curriculum component • 96% mastered the data sheets component • 84% mastered the data-based decisions component • All teachers reported using the training manual and data collection sheets
Results-Portfolio Quality Score • Students with teachers in the model • 2001: 67% adequate or superior • 2002: 96% adequate or superior • Comparison group • 2001: 42% adequate or superior • 2002: 71% adequate or superior
Results-Meaningful Growth on IEP Objectives • Median average growth: 68.7% • Range: 16% to 171%
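The formula behind these growth percentages is not stated in the presentation. One plausible reading, consistent with values above 100%, is relative growth from baseline to end-of-year performance; the sketch below computes growth under that assumption.

```python
def percent_growth(baseline_percent_correct: float, final_percent_correct: float) -> float:
    """Relative change from baseline to final performance, expressed as a percentage.

    Assumed formula: (final - baseline) / baseline * 100. Under this assumption, a
    student moving from 30% correct at baseline to 80% correct at year end would
    show about 167% growth.
    """
    if baseline_percent_correct <= 0:
        raise ValueError("Baseline must be positive to compute relative growth.")
    return (final_percent_correct - baseline_percent_correct) / baseline_percent_correct * 100
```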
Results-Teacher and Parent Perceptions of Gains • 84% of teacher respondents reported that students made more progress as a result of being in the project • 80% of teacher respondents thought that their students had better IEPs because of the project • Parents had overall strong, positive impressions of their students' participation in the project
Discussion • Provides evidence that AA scores can be improved by training teachers in instructional variables • Students in the project had significantly higher scores than in the previous year • High rates of proficiency were supported by behavioral data
Generalizability of Findings • Generalizable to a wide range of students with severe disabilities, because teacher training overrode the influence of student characteristics • Generalizable to any AA system with a primary focus on student acquisition of skills
Limitations of Findings • Use of a quasi-experimental design with all students in the Charlotte region not in the project serving as the control group • Possible selection bias • Instrumentation used to document student progress and performance (changes in the way portfolios were scored from year to year)
Recommendations for Future Research • Teacher training to meet changing curricular demands (assessment of students on specific language arts and math standards) • Teacher buy-in to the AA process • Validation that AA outcomes reflect educationally significant student learning
Recommendations for Practice • Use of collected data to make instructional decisions • Training in: • Data collection • How to enhance student gains • How to teach and assess language arts, math, and science