Dr. Denise P. Gibbs, Director, Alabama Scottish Rite Foundation Learning Centers
gibbsdenise@aol.com
RTI Implementation: Assessment Components
STUDENT SUCCESS. PST. RtI.
Gibbs 2011
This presentation is provided at no cost to Alabama schools by the Alabama Scottish Rite Foundation. The philanthropy of the Alabama Scottish Rite Foundation began in Alabama in the 1950s and continues today. The mission of the Alabama Scottish Rite Foundation is to help Alabama schools as they work with students who struggle in reading, particularly those students with dyslexia.
Anticipation Guide, then…Turn and Talk Bell-ringer activity: As soon as you are seated, please complete the Anticipation Guide included in your handout. Then turn to your neighbor and talk about your initial answers.
Gains From High Impact Instructional Strategies: Research Findings (Marzano, 2001)
Today’s High Impact Strategies • 9. Questions, cues, and advance organizers – Anticipation Guide, session outcomes, RTI Vocabulary, Double Bubble • 6. Cooperative learning – Turn and Talks • 2. Summarizing and note taking – PowerPoint slides, your notes, 3-2-1 • 1. Similarities and differences – CBM vs CAT – Double Bubble • 5. Nonlinguistic representations – VVWAs • 4. Homework and practice – utilize at least one of these strategies in your work
Session outcomes… Today, you will: Practice high impact strategies. Identify uses of high stakes tests such as the ARMT or AHSGE in the RTI screening process. Analyze essential criteria to guide selection or development of RTI screening and progress monitoring tools. Use the National Center for Response to Intervention's (NCRTI) tools chart to compare some commercially available screening and progress monitoring tools.
RTI Essential Vocabulary: Assessment Screening process vs tool Progress monitoring process vs tool Classification accuracy False positives? False negatives? Reliability and predictive validity of the performance level Reliability and predictive validity of the slope of improvement Sensitivity to student improvement Efficiency!
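The classification-accuracy terms above can be illustrated with a small sketch. All data here are hypothetical and not drawn from any specific screening tool; "sensitivity" and "specificity" are the standard statistical names for catching true strugglers and clearing true non-strugglers.

```python
# Illustrative sketch: how classification accuracy, false positives,
# and false negatives relate. Each pair is (screened_at_risk,
# actually_struggled) for one hypothetical student.
results = [
    (True, True), (True, False),    # second pair is a false positive
    (False, True), (False, False),  # first pair is a false negative
    (True, True), (False, False),
]

true_pos  = sum(1 for flagged, struggled in results if flagged and struggled)
false_pos = sum(1 for flagged, struggled in results if flagged and not struggled)
false_neg = sum(1 for flagged, struggled in results if not flagged and struggled)
true_neg  = sum(1 for flagged, struggled in results if not flagged and not struggled)

accuracy    = (true_pos + true_neg) / len(results)
sensitivity = true_pos / (true_pos + false_neg)  # share of strugglers caught
specificity = true_neg / (true_neg + false_pos)  # share of non-strugglers cleared

print(accuracy, sensitivity, specificity)
```

A screener with many false negatives misses students who need intervention; one with many false positives wastes intervention resources, which is why the NCRTI charts report classification accuracy.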
Screening Process and Examples of Some Tools
Designing your screening process • Commercially available screening tools • Curriculum-Based Measures (CBM) • Computer Adapted Testing (CAT) • Standards-Connected Assessments • High stakes tests • ARMT • Alabama High School Graduation Exam • End of course tests • Other variables • Grades • Behavior • Attendance
Using High Stakes Tests in the Screening Process • A starting point in screening older students could be to use the results of the ARMT or the AHSGE. • Students who score a Level I or II on the ARMT in reading or in math could be considered at risk and could proceed to the next step in the district’s screening process. • AHSGE results could be analyzed to determine the relative severity of the student’s deficiencies. • Determine whether the student would benefit from remediation classes or is in need of a more basic skill-level intervention class.
Double Bubble Time! As an advance organizer, compare and contrast CBM and CAT
Three types of screening measures • Curriculum Based Measures • Brief probes of skills that predict whether a student will be successful in reading or math. • Computer Adapted Testing • Answers determine subsequent questions; can learn more about skill levels with fewer test items. • Standards Connected Assessments • Usually a fixed set of questions that mirror standards-based state assessments and provide practice and performance prediction.
Independent Review of Screening Tools • The National Center for Response to Intervention has charts summarizing its review of various screening tools. • Some of the tools reviewed as of Spring 2011 are: • AIMSweb – CBM • DIBELS – CBM • STEEP – CBM • STAR – CAT • Discovery Education (Think Link) – Standards-connected
Curriculum-Based Measures (CBM) • CBM was initially synonymous with RTI. • If concerned about reading, have the student complete various reading screening probes. • If concerned about math, have the student complete math computation and math reasoning/problem solving screening probes. • AIMSweb, DIBELS, and STEEP are all based on CBM.
A look at some reading CBM probes Early Literacy Measures Letter Naming Fluency Letter Sound Fluency Phonemic Segmentation Fluency Nonsense Word Fluency R-CBM (8th-grade example) Grades 1-12 Mazes (7th-grade example) Grades 2-12
A look at some math CBM probes Early Numeracy Measures Quantity Discrimination Missing Numbers Oral Counting Number Identification Computation (M-CBM) (1st-6th grade example) Grades 1-12 Concepts and Applications (M-CAP) (7th-grade example) Grades 2-8 Focal Points (STEEP)
Computer Adapted Testing (CAT) • The best computer adapted testing is based on Item Response Theory (IRT). • IRT is a statistical framework that describes each examinee by one or more ability scores and uses mathematical models to link actual performance on test items, item statistics, and examinee abilities. • Each test item is selected based on responses to previous questions. • The score is reported as a scaled score. • STAR is an example of CAT based on IRT.
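A minimal sketch of the simplest IRT model (the one-parameter, or Rasch, logistic model) shows how ability and item difficulty combine into a probability of a correct response; the specific ability and difficulty values below are illustrative only, not STAR's actual parameters.

```python
import math

# One-parameter (Rasch) IRT model: the probability of a correct answer
# depends only on the gap between examinee ability and item difficulty,
# both expressed on the same logit scale.
def p_correct(ability, difficulty):
    """Probability that an examinee answers an item correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# An adaptive test picks the next item near the current ability
# estimate, where a response is most informative (p near 0.5).
print(p_correct(0.0, 0.0))   # ability equals difficulty, so exactly 0.5
print(p_correct(1.5, 0.0))   # stronger examinee, higher probability
```

This is why CAT "can learn more about skill levels with fewer test items": every item is chosen to sit near the examinee's estimated ability, where each answer carries the most information.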
Computer Adapted Testing (CAT) • STAR Early Literacy • PreK-Grade 3 • 41 skills in 7 domains • 10 minutes • STAR Reading • Grades 1-12 • 10 minutes • STAR Math • Grades 1-12 • 10-15 minutes
Standards Connected Assessments • Test includes a fixed number of grade-specific questions reflecting common core or state standards. • Can serve as a screening tool for students who might not be likely to perform satisfactorily on state assessments. • Can serve as a practice test for state assessments. • Generally cannot show growth over time, so they are not well suited to progress monitoring.
Reports from Screening Tools Some of the most helpful screening tools provide reports which reflect: Student’s percentile scores Student’s expected performance on high stakes tests (ARMT!) Longitudinal data for a group of students Level of proficiency on key common core standards Instructional grouping/planning suggestions
Turn and Talk: Where are we in terms of the screening process? In the bags, quick fixes, long-term growth opportunities!
Progress Monitoring Process and Examples of Some Tools
What progress to monitor? Growth of skills in areas recognized as deficient Foundational academic skills in reading or math Growth of appropriate social and behavior skills Growth of content knowledge and information
Progress Monitoring Process Use of a formal, commercially available progress monitoring tool, or use of a formal teacher/school developed progress monitoring tool. Should also consider additional information which can reflect progress: Use of work samples reflecting baseline and current work Observation of classroom behavior and participation
NCRTI Progress Monitoring Tools Chart Provides administration information Compares tools according to key reliability, validity, slope, and improvement criteria. Many of the screening tools have also been favorably reviewed as progress monitoring tools. But some screening tools were NOT found to have favorably reviewed progress monitoring probes.
Key progress monitoring criteria Reliability of the performance level score: the extent to which the score (or average/median of 2-3 scores) is accurate and consistent. Validity of the performance level score: the extent to which the score (or average/median of 2-3 scores) represents the underlying construct.
Key progress monitoring criteria Reliability of the slope of improvement: an indicator of how well individual differences in growth trajectories can be detected using a particular measure. Predictive validity of the slope of improvement: the extent to which the slope of improvement corresponds to end-level performance on highly valued outcomes. Sensitivity to student improvement: the extent to which a measure reveals improvement over time, when improvement actually occurs.
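The "slope of improvement" is typically the least-squares trend line fitted through repeated probe scores. A short sketch, using hypothetical weekly words-correct-per-minute values:

```python
# Hypothetical weekly oral reading fluency scores (words correct per
# minute) for one student during an intervention.
weeks  = [1, 2, 3, 4, 5, 6]
scores = [42, 45, 44, 49, 51, 54]

# Ordinary least-squares slope: covariance of (week, score) divided by
# the variance of the weeks.
n = len(weeks)
mean_x = sum(weeks) / n
mean_y = sum(scores) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
         / sum((x - mean_x) ** 2 for x in weeks))

print(f"Improvement of about {slope:.1f} words correct per minute per week")
```

A measure with good slope reliability will produce similar slopes from equivalent probe sets; one with good sensitivity will show a clearly positive slope when the student is genuinely improving.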
Progress Monitoring Tools Need to be efficient. Need to be able to provide a reliable baseline. Need to be able to reflect growth over time. Need multiple, equivalent probes. Results need to be easily graphed.
Progress Monitoring Decisions Skills to be progress monitored? Do you have the probes you need? Frequency of progress monitoring: weekly? Bi-weekly? Monthly? Who will conduct progress monitoring? Who will score? Who will enter data in the computer? Print reports?
Some reading skills to be targeted… • Word-level needs • General benchmark should be 95% accuracy for grade-level texts • To reach desired accuracy, may need to target both regular (decodable) words and irregular (learned) words • Rate needs to be that designated for the student’s grade or 90-110 words correct per minute (whichever is less) • Prosody/inflection patterns!
Some reading skills to be targeted… • Comprehension needs • Vocabulary • Active reading strategies • Before, During, After • Magnificent Seven! • Metacognitive strategies • Fix-up strategies
Math skills to be targeted… • Computation automaticity • I am learning the facts • I can think of the answer • I cannot avoid seeing the answer when I see the problem. • Grade-specific problem solving and applications • Concrete-representational-abstract continuum • Subject-specific skills
Selection of Progress Monitoring Tools • Tools should be able to be administered weekly, should be efficient, and should be sensitive to growth • If improved reading comprehension is the goal, then you may elect to use mazes or computer adapted testing • If improved reading accuracy is the goal, then you may elect to use oral reading fluency passages and graph percentage of accuracy • If improved math problem solving skills are the goal, then you may elect to use timed problem solving probes or computer adapted testing • If improved math calculation skills are the goal, then you may elect to use timed calculation probes • If improved behavior is the goal, then you may elect to use behavior report card points earned.
Verbal and Visual Word Association: Sensitivity, Efficiency
Developing Progress Monitoring Probes • Interventioncentral.com equips you to generate: • Maze passages – from your own texts • R-CBM passages – from your own texts • Wordlist fluency probes • Math fact fluency probes • Behavior report cards! • Establish baseline and set goal
Developing Progress Monitoring Probes • To monitor progress in learning content, you can develop a set of vocabulary matching probes (Espin et al., 2005). • A comprehensive list of terms and definitions for the course/subject • Randomly generated probes with 25 or more terms, given as frequently as desired • Graph the number or percentage correct
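The random-generation step above can be sketched in a few lines. The term bank below is a made-up fragment (a real bank would hold the full course vocabulary, 25 or more terms); the `make_probe` helper is hypothetical, not from Espin et al.

```python
import random

# Hypothetical term bank: term -> definition. A real bank would cover
# the whole course so repeated probes stay equivalent but not identical.
term_bank = {
    "screening": "brief assessment given to all students to find those at risk",
    "progress monitoring": "frequent measurement of growth during intervention",
    "baseline": "starting performance level before intervention begins",
    "aimline": "line from baseline to goal showing expected growth",
}

def make_probe(bank, n_terms, seed=None):
    """Return n_terms randomly chosen (term, definition) pairs."""
    rng = random.Random(seed)       # seed only to make a probe reproducible
    return rng.sample(sorted(bank.items()), n_terms)

probe = make_probe(term_bank, 3, seed=1)
for term, definition in probe:
    print(f"____ {term}: {definition}")
```

Because each probe is a fresh random draw from the same bank, scores across weeks are comparable and can be graphed as number or percentage correct.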
About norms for locally developed PM measures • You could initially use norms included in The ABCs of CBM and other resources to get a “ballpark idea” of a student’s performance level • But remember: since you are measuring the student’s progress from baseline, you can determine growth without much attention to norms! • Goal setting will require some creativity!
Goal setting with locally developed PM measures • You could determine the performance levels on your probes of your most proficient students and use those levels in developing goals for students in interventions. • You could use “norms” included in The ABCs of CBM and other sources to set goals. • Over time, you could determine scores which correspond to successful performance on the ARMT or AHSGE and set those as goals.
Graphing progress monitoring data • Commercially available progress monitoring tools are equipped with graphing features. • Results of teacher or school developed progress monitoring probes can be graphed with a number of graphing tools, including the Data Management Tool available on the ALSDE website and at this Dropbox URL: http://dl.dropbox.com/u/24788238/GraphingtoolGibbs.xls
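For a quick check without a spreadsheet, the core of any progress monitoring graph (weekly scores compared against an aimline drawn from baseline to goal) can be sketched as a text chart. All numbers below are hypothetical:

```python
# Hypothetical intervention: baseline of 40 words correct per minute,
# goal of 58 after 9 weeks, one probe score recorded each week.
baseline, goal, n_weeks = 40, 58, 9
scores = [40, 43, 42, 46, 47, 50, 49, 53, 55]

# The aimline rises evenly from baseline to goal across the weeks.
growth_per_week = (goal - baseline) / (n_weeks - 1)

for week, score in enumerate(scores, start=1):
    aim = baseline + growth_per_week * (week - 1)
    marker = "above aim" if score >= aim else "below aim"
    bar = "*" * (score - 35)        # crude bar so trends are visible
    print(f"week {week}: {bar:<25} {score} ({marker})")
```

Comparing each point to the aimline is the usual decision rule: several consecutive points below the aimline suggest the intervention needs adjusting, while points on or above it suggest the student is on track.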
Some valuable resources
The ABCs of CBM: A Practical Guide to Curriculum-Based Measurement (Hosp et al., 2007) • Instructions regarding creating progress monitoring probes • R-CBM, Mazes, Math Computation, Math Problem Solving, Written Expression, Spelling, Early Literacy, Early Numeracy • Expected levels of performance for grades 1-8
Assessing Reading Multiple Measures for All Educators Working to Improve Reading Achievement (2nd ed., 2008) • Phonics Screener • Phoneme Deletion Test • Phonological Segmentation Test • Phoneme Segmentation Test • Graded High Frequency Word Survey • San Diego Quick Assessment of Reading Ability • Maze passages • Oral reading fluency passages
3-Minute Reading Assessments: Word Recognition, Fluency, and Comprehension (Rasinski and Padak, 2005) • Includes a 16-point multidimensional fluency scale! • Expression and volume • Phrasing and intonation • Smoothness • Pace
Mathematics RTI: A Problem-Solving Approach to Creating an Effective Model (Allsopp et al., 2010) • Includes specific recommendations for analyzing mathematics curriculum as it relates to struggling students. • Includes discussion of responsive mathematics teaching practices.
RTI in Mathematics (Riccomini and Witzel, 2010) • Excellent tools to facilitate assessment, instruction, and intervention for • Number sense • Whole numbers • Fractions and decimals • Problem solving • Mathematical vocabulary!
Anticipation Guide Revisited
THANK YOU! RTI for Early Readers: Implementing Common Core Standards in Your K-5 RTI Model (LRP, 2011) RTI for Middle and High Schools: Strategies and Structures for Literacy Success (LRP, 2008) Leading the Dyslexia Challenge: An Action Plan for Schoolwide Identification and Intervention (LRP, 2004) gibbsdenise@aol.com