Assessing the Instructional Level for Writing David Parker, Kristen McMaster, and Matthew Burns
Activity • Topic: • White: Describe why nuclear fission has been easier to do than nuclear fusion. • Peach: Describe why this conference will be useful for your practice. • Pink: Describe the events of your last family vacation. • Pencils down: Think for 30 seconds • Write!!!
Activity • Count # Words Written • Results:
Activity • The Findings: • Did the Peach and Pink writers write more? • Who was more on-task? • Why’d we do it? • Simulate the right amount of challenge • Think of the kids!!!
Overview • Introduction • Why writing? • Why instructional level? • Purpose of this study • Method • Who, what, how? • Results • What was found • Discussion • Why it matters, limitations, what next?
Introduction Why Writing?? National Report Cards on Writing, 2003; 2008
Introduction Why does writing proficiency matter? • Enhances learning in content area courses (Bangert-Drowns, Hurley, & Wilkinson, 2004). • College Entrance, Job Obtainment/Performance (National Commission on Writing, 2004; 2005).
Introduction Problem: Writing problems are often not detected until late elementary or middle school, which makes them more difficult to remediate (Baker, Gersten, & Graham, 2003) Solution: Start Intervening Early!!!!
Introduction Enter the Instructional Level!!! First, some background knowledge
Introduction What is the Instructional Level?
Introduction • Theoretical Foundation • Vygotsky (1978) • Betts (1946) • Gravois & Gickling (BP-V; 2008) • Measurement Tools • Curriculum-based Measurement (CBM; Deno, 1985; Marston, 1989) • For early writers (Coker & Ritchey, 2009; McMaster, Du, Yeo, Deno, Parker, & Ellis, 2009) • Assessment • Curriculum-based Assessment (Gickling & Havertape, 1981; Gickling, Shane, & Croskery, 1989)
Introduction Empirical Findings • Reading: • 93-97% correctly read words (Treptow, McComas, & Burns, 2007; Gickling & Armstrong, 1978) • Improved on-task behavior, task completion, and reading comprehension • 4x faster growth rates (Burns, 2007) • Math: • 14-31 Correct Digits (2nd/3rd graders); 24-49 Correct Digits (4th/5th graders) • Highest growth slopes (Burns, VanDerHeyden, & Jiban, 2006).
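The reading criteria above define three bands from the percentage of words a student reads correctly. A minimal sketch of that classification, assuming the common convention that the 93-97% band is inclusive at both ends (the slide does not state how the boundaries are handled):

```python
def reading_level(percent_known: float) -> str:
    """Classify material by the share of words a student reads correctly,
    using the Gickling-style bands cited on this slide: 93-97% known
    words is instructional; below is frustration, above is independent.
    Boundary inclusivity is an assumption, not stated in the source."""
    if percent_known < 93:
        return "frustration"
    if percent_known <= 97:
        return "instructional"
    return "independent"

print(reading_level(95.0))  # → instructional
```

The study's goal is to derive analogous bands for early writing measures, where no such criteria yet exist.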
Introduction There is NO established Instructional Level for writing! Purpose: To identify potential estimates of the instructional level for writing.
Method Participants • 5 classrooms from 2 urban schools • 85 1st grade students • 51% male • 41% White; 28% Black; 26% Hispanic • 57% Free/Reduced Lunch • 17% special education services Setting • Classrooms
Method Measures • Curriculum-based Measurements • Two Types • Picture-Word • Sentence Copy • Scoring Procedures • Words Written • Words Spelled Correctly • Correct Word Sequences • Test of Written Language
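The three fluency scoring procedures listed above can be sketched in code. This is an illustration only, not the study's scoring protocol: the `CORRECT_SPELLINGS` lexicon is hypothetical, and true Correct Word Sequences scoring also requires a rater's judgment of whether adjacent words fit grammatically, which the approximation below omits:

```python
# Hypothetical lexicon standing in for a rater's spelling judgments.
CORRECT_SPELLINGS = {"the", "dog", "ran", "fast"}

def words_written(sample: str) -> int:
    # Total words the student produced, regardless of spelling.
    return len(sample.split())

def words_spelled_correctly(sample: str) -> int:
    # Words judged correctly spelled.
    return sum(1 for w in sample.lower().split() if w in CORRECT_SPELLINGS)

def correct_word_sequences(sample: str) -> int:
    # Approximation: adjacent pairs in which both words are spelled
    # correctly. Real CWS scoring also checks grammatical acceptability,
    # which needs a human rater.
    ok = [w in CORRECT_SPELLINGS for w in sample.lower().split()]
    return sum(1 for a, b in zip(ok, ok[1:]) if a and b)

sample = "The dog ran fst"
print(words_written(sample))           # → 4
print(words_spelled_correctly(sample)) # → 3
print(correct_word_sequences(sample))  # → 2
```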
Method Picture-Word Prompt (McMaster, Du, & Petursdottir, 2009)
Method Sentence-Copy Prompt (McMaster, Du, & Petursdottir, 2009)
Method Procedure • Weekly progress monitoring data • 12 weeks • Teacher-administered • Students practiced then completed prompts for 3 minutes Fidelity and Agreement • Collected for teacher administration as well as prompt scoring • Teacher administration fidelity: 100% • Agreement: generally > 90%
Method Data Analysis (an 8-step plan) • Establish Reliability of Accuracy/Fluency Metrics • Establish Validity of Promising Metrics • Compute Growth Slopes • Identify top 1/3rd Slopes • Compute Mean Start for top 1/3rd Slopes • Create Categories • Establish Reliability of Categories • Establish Validity of Categories Part 1: Find promising measures and scoring procedures
Method Data Analysis (an 8-step plan) • Establish Reliability of Accuracy/Fluency Metrics • Establish Validity of Promising Metrics • Compute Growth Slopes • Identify top 1/3rd Slopes • Compute Mean Start for top 1/3rd Slopes • Create Categories • Establish Reliability of Categories • Establish Validity of Categories Part 2: Find Instructional Levels
Method Data Analysis (an 8-step plan) • Establish Reliability of Accuracy/Fluency Metrics • Establish Validity of Promising Metrics • Compute Growth Slopes • Identify top 1/3rd Slopes • Compute Mean Start for top 1/3rd Slopes • Create Categories • Establish Reliability of Categories • Establish Validity of Categories Part 3: Examine promise of instructional levels
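Steps 3-5 of the plan (compute growth slopes, take the top third, average their starting scores) can be sketched as follows. The data and function names here are invented; the study fit slopes to 12 weeks of scores per student:

```python
def ols_slope(ys):
    # Ordinary least-squares slope of weekly scores against week number.
    n = len(ys)
    xs = range(n)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def mean_start_of_top_third(students):
    # students: one list of weekly scores per student.
    # Rank by growth slope, keep the top third ("high-responders"),
    # and average their week-1 scores to estimate the instructional level.
    ranked = sorted(students, key=ols_slope, reverse=True)
    top = ranked[: max(1, len(ranked) // 3)]
    return sum(s[0] for s in top) / len(top)
```

The mean starting score of the fastest-growing third is the candidate instructional-level estimate reported in Table 3.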
Table 1. Means, Standard Deviations, and Correlation Coefficients for Fluency and Accuracy Scores for Sentence Copy and Picture-Word Prompts and Accompanying Scoring Procedures.
Table 2. Criterion-related Validity Coefficients between Scoring Procedures for Each Prompt and the Test of Written Language-3 (TOWL-3) Total Score.
Table 3. Derivation of and Estimates for Fluency Instructional Level Criteria for Scoring Procedures within Prompt Types.
Table 4. Number and Percentage of Fluency Scores Categorized as Frustration, Instructional, and Independent and Kappa Coefficients.
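Table 4's kappa coefficients quantify chance-corrected agreement between two categorizations of the same students (frustration / instructional / independent). A minimal from-scratch sketch of Cohen's kappa; the example labels are ours:

```python
from collections import Counter

def cohens_kappa(a, b):
    # a, b: parallel lists of category labels from two raters/measures.
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                  # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n ** 2   # chance agreement
    return (po - pe) / (1 - pe)
```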
Discussion Conclusion: • Consistent with previous research for reading (Burns, 2007; Gickling & Armstrong, 1978) and math (Burns, VanDerHeyden, & Jiban, 2006), plausible criteria were identified that indicate whether a student will make optimal growth in writing skill. Implications: • MORE research! • Instructional decision-making
Discussion Limitations • Conceptual issues • CBM (General Outcome Measure) vs. CBA (Specific Subskill Measure)? • Material difficulty? • Generalizability?? (only 1st graders?) • Criterion for “high-responders” • Ongoing research with early CBM-Ws Future Research • Investigate effects of instructional level prospectively (via intervention)? • Which measure is most informative? • Appropriate criteria?
Thank you! Email: parke384@umn.edu