Agenda • South Carolina Alternate Assessment (SC-Alt) Overview • Advisory Committee Role and Purpose • American Institutes for Research (AIR), SC-Alt Contractor • A.I.R. Staff
South Carolina Alternate Assessment (SC-Alt) Advisory Committee, September 28, 2011
A.I.R. Staff • DeeAnn Wagner, Project Director • Jennifer Chou, Project Manager • Lynnett Wright, Alternate Assessment Specialist • Emma Hannon, Research Assistant
Discussion of New Procedures • Print Manipulatives • Packaging of Manipulatives and Test Booklets • Answer Folders and Security Affidavits
Videotaping of SC-Alt Administrations • Implemented to monitor test administration effectiveness and scoring consistency • Annual implementation allows consistency to be monitored across testing years as adjustments are made in training, as new tasks/items replace previous tasks, and as new content areas are added (i.e., high school biology).
Videotaping Sampling Procedures for 2011 One student per sampled teacher was videotaped: • ELA only for elementary and middle school forms • ELA and biology for high school forms
Videotaping Sampling Procedures • All districts were sampled. • Sampling implemented by teacher and student. • Teachers sampled according to proportions of students in their district. • Approximately 1/3 of teachers and 10% of students were sampled.
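A minimal sketch of how such a proportional teacher sample might be drawn, assuming a simple roster structure (district -> teacher -> students). The function name, weighting rule, and selection logic are illustrative assumptions, not the operational AIR sampling plan:

```python
import random
from collections import defaultdict

def sample_for_videotaping(roster, teacher_fraction=1/3, seed=2011):
    """Illustrative proportional sample: roster maps district -> {teacher: [student_ids]}.
    Teachers are sampled within every district in proportion to that district's
    student counts, and one student is then drawn per sampled teacher."""
    rng = random.Random(seed)
    sampled = defaultdict(dict)
    for district, teachers in roster.items():
        # Weight each teacher by the number of SC-Alt students they serve.
        names = list(teachers)
        weights = [len(teachers[t]) for t in names]
        k = max(1, round(teacher_fraction * len(names)))  # every district contributes
        chosen = set()
        while len(chosen) < k:
            chosen.add(rng.choices(names, weights=weights, k=1)[0])
        for t in chosen:
            sampled[district][t] = rng.choice(teachers[t])  # one student per teacher
    return dict(sampled)
```

Under these assumptions, sampling roughly one third of teachers per district while weighting by caseload yields approximately 10% of students when one student is taped per sampled teacher.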
Review and Analysis of Videotaped Administrations • All recordings were reviewed by trained AIR raters for: • Fidelity of administration • Accuracy of scoring • Teacher score is used for reporting purposes • 10% sample reviewed by AIR alternate assessment specialist
Videotaping Results • Previous results have indicated consistently high rates of scoring agreement at all three form levels (elementary, middle, and high school). • For 2011, the average item agreement statistics for the ELA videotaped samples were: Elementary form 96.0%, Middle School form 94.9%, and High School form 95.9%. The item agreement statistic for High School biology was 94.3%. • These results are consistent with results from previous years and confirm a high level of scoring consistency for the new High School biology assessment.
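A minimal sketch of one way an average item agreement statistic could be computed, assuming it is the percent of items scored identically by the teacher and the AIR rater, averaged across administrations; the operational SC-Alt statistic may be defined differently:

```python
def item_agreement(teacher_scores, rater_scores):
    """Percent of items on which the teacher's score and the independent
    rater's score match exactly for one administration (illustrative only)."""
    assert len(teacher_scores) == len(rater_scores)
    matches = sum(t == r for t, r in zip(teacher_scores, rater_scores))
    return 100.0 * matches / len(teacher_scores)

def average_agreement(administrations):
    """Mean item-agreement percentage across a set of (teacher, rater) score pairs."""
    values = [item_agreement(t, r) for t, r in administrations]
    return sum(values) / len(values)

# Toy example: two administrations, 10 items each.
forms = [([2, 1, 0, 3, 2, 1, 1, 0, 2, 3], [2, 1, 0, 3, 2, 1, 1, 0, 2, 2]),
         ([1, 1, 2, 0, 3, 2, 1, 0, 1, 2], [1, 1, 2, 0, 3, 2, 1, 0, 1, 2])]
print(round(average_agreement(forms), 1))  # 95.0
```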
Second Rater Pilot • A second rater procedure may also be used to obtain data on scorer fidelity. • A pilot of the second rater procedure was conducted for the 2011 administration. • Participation in the pilot was voluntary. • We are seeking feedback and suggestions from you today as we review the outcomes of the pilot study.
Second Rater Pilot Procedures • The DTC-Alt volunteered district participation. • The second rater pilot was limited to elementary ELA administrations. • The DTC-Alt selected one of the teachers (and the specified student) identified for videotaping to implement the pilot second rater session. • The second rater procedure was used in lieu of videotaping for that teacher. • The second rater could also serve as the test administration monitor.
Second Rater Pilot Procedures • The participating district was only required to implement the procedure with one teacher. • The second rater scored the student’s responses on a separate answer document marked Second Rater, which was submitted to AIR separately. • Second rater pilot participants (teachers, second raters, and DTCs-Alt) were asked to complete a brief questionnaire.
Second Rater Qualifications • Must meet the test administrator criteria: • certified teacher • administrator (e.g., school administrator, district level special education consultant, or other administrator) • related services personnel • Must participate in test administration training.
Second Rater Scoring Consistency Results • The second rater observer scores were compared to the teacher scores to calculate scoring agreement in the same manner as was used for the videotape data. • Since both the second rater and videotaping procedures were used for samples of Elementary ELA administrations, the results of the two methods could be compared. • Analyzable data were obtained for 48 second rater administrations and 70 videotaped administrations.
Second Rater Scoring Consistency Results • The average item agreement statistics were 96.9% for the second rater data and 96.0% for the videotape data. • The comparable results for the two procedures support the effectiveness of the second rater procedure.
Second Rater Pilot Questionnaire Results • The questionnaires completed by the pilot participants were used to obtain information on the experience of the teachers and observers, the staff positions of the observers, and the recommendations and preferences of the three respondent groups (teachers, observers, and DTCs-Alt) regarding the second rater procedure. • Survey responses were obtained for 41 teachers, 45 observers, and 8 DTCs-Alt. • Since 25 districts participated in the pilot, the participation rate for DTCs-Alt was low.
Second Rater Pilot Questionnaire Results • The survey respondents reported a very high level of preference for the second rater procedure over the videotaping procedure. • 93% of the teachers and 87% of the observers responded that they preferred the second rater procedure. • 5% and 9% of the two groups, respectively, indicated no preference, with only 2% and 4% indicating a preference for videotaping. • These results did not differ by teacher or observer experience or by observer staff position.
Second Rater Pilot Questionnaire Results • 75% of the DTCs-Alt reported a preference for the second rater procedure over the videotaping procedure (6 of the 8 respondents). • 25% (2 DTCs-Alt) indicated a preference for videotaping.
Questionnaire Results: Problems Encountered • Students using eye-gaze response were difficult to rate (observe). • A few districts reported some planning issues, e.g., determining who should serve as the second rater. • Because this was a pilot, all materials were sent to the DTC-Alt.
Questionnaire Results: Reported Benefits • The second rater was able to observe some administration problems related to teacher preparation. • Teachers reported the procedure to be less stressful than videotaping. • Teachers reported the procedure was less distracting to students than videotaping.
Questionnaire Results: Suggestions to Improve the Process • Provide a test booklet to the second rater. • Identify second rater teachers prior to TA training. • Include documentation of mode of response on the second rater answer folder. • Other: second rater materials, packaging, and return procedures
Alternate Assessment Participation 2006 – 2011* *PACT-Alt/HSAP-Alt 2006; SC-Alt 2007-2011
Changes in Rates of Participation • The overall number of students increased by 233, which was a 7.9% increase. Last year’s increase was 6.6%. • Autism students increased by 14.0%, compared to an 18.3% increase last year. • Mild MD students increased by 17.0%, compared to a 7.4% increase last year. • The percentages of increase from 2007 to 2011 have been 72% for Autism and 55% for Mild MD, compared to an increase since 2007 of 7% for all other students.
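For reference, the arithmetic behind the headline figure: a 233-student gain reported as a 7.9% increase implies a prior-year total of roughly 233 / 0.079, about 2,950 students. The totals below are back-calculated from the slide, not reported counts:

```python
def percent_increase(previous, current):
    """Year-over-year percent change in the number of assessed students."""
    return 100.0 * (current - previous) / previous

# Back-calculated illustration (approximate, derived from the reported figures).
prior = round(233 / 0.079)    # roughly 2,949 students in the prior year
current = prior + 233         # roughly 3,182 students in the current year
print(round(percent_increase(prior, current), 1))  # ~7.9
```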
Increases in SC-Alt Participation: A Continuing Concern • The SC-Alt was designed for students with the “most significant cognitive disabilities.” It was not intended for higher-level autism or even higher-level moderate MD students. Only a very small number of students classified as “mild MD” would be expected to be included. • There is evidence that some districts and schools are gaming the accountability system by identifying SC-Alt students inappropriately.
2011 SC-Alt Students with Previous PASS Scores • A search for previous PASS scores was conducted in the 2009 and 2010 PASS data files. • 2010 PASS scores were identified for 203 students. • 2009 PASS scores were identified for an additional 142 students. • These numbers approximate the student increases from the 2009 to 2010 and the 2010 to 2011 administrations.
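A minimal sketch of the kind of lookup involved, assuming each file carries a common student identifier; the column names and file layout here are placeholders, not the actual SCDE data structure:

```python
import pandas as pd

def find_prior_pass_scores(sc_alt, pass_2010, pass_2009, id_col="student_id"):
    """Illustrative lookup: flag 2011 SC-Alt students with a score record in the
    2010 PASS file, then search the 2009 file for those still unmatched.
    Column names are placeholders only."""
    matched_2010 = sc_alt[sc_alt[id_col].isin(pass_2010[id_col])]
    remaining = sc_alt[~sc_alt[id_col].isin(pass_2010[id_col])]
    matched_2009 = remaining[remaining[id_col].isin(pass_2009[id_col])]
    return matched_2010, matched_2009

# With the real identifiers, matched_2010 and matched_2009 would correspond to
# the 203 and 142 students reported on this slide.
```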
Discussion and Feedback • District Level Training • Scoring Worksheets • Other
http://www.ed.sc.gov • Suzanne Swaffield, sswaffie@ed.sc.gov, 803-734-8274 • Douglas Alexander, dgalexan@ed.sc.gov, 803-734-3923