Webinar 3 Core Instruction (Tier 1)
Tier 1: Core Instruction
• Assessments:
  • Screening
  • Evaluating effectiveness of core instruction
• Research-based/Evidence-based Instructional Practices
  • Matching student needs with research-based instructional practices
Problem Solving Applies to Programs and Systems (see handouts)
School-wide Evidence Can Be Used for Judging Effectiveness of Instruction
• Data suggest how well instructional practices and materials are meeting the needs of all students
• Data are used proactively to match core instruction to student needs
[Chart: performance disaggregated by subgroup — ALL, EL, SES]
We Need to Identify an Acceptable Range of Differences Across Classrooms
• Is the problem at a systemic level?
• Is it an effort or fidelity issue?
• Is it instructional, curricular, or environmental?
[Chart: example cross-classroom differences — instructional hours (-73.5 vs. -83 hrs), session length (90 vs. 60 min.), percentages (95%, 75%, 67%, 95%)]
Considerations when Selecting Screening Tools
1. How does the data inform instructional decisions?
2. How will the screening process work?
3. Is there training to ensure that staff administer it the same way and make the same judgments?
4. How will we be sure we are accurate in our judgments?
5. How will we make consistent judgments of the data?
Screening Requires Cut-off Scores
Sources: Aimsweb normative scores (2007) and Assisting Students Struggling with Reading: Response to Intervention (RtI) and Multi-Tier Intervention in the Primary Grades (2009), Institute of Education Sciences Practice Guide.
Screening Tools Must Correctly Target Students
• Measures are specific if they do not flag students who are already proficient; a false positive leaves us surprised but happy
• Measures are sensitive if they catch students who are truly at risk; a false negative leaves us surprised and unhappy
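The trade-off between sensitivity and specificity at a given cut-off score can be sketched in code. A minimal illustration (the scores, the cut-off of 40, and the risk labels are all hypothetical, not taken from the webinar or from Aimsweb norms):

```python
def screen(scores, cutoff):
    """Flag students scoring below the cut-off as at risk."""
    return [s < cutoff for s in scores]

def sensitivity_specificity(flagged, truly_at_risk):
    """Sensitivity: share of truly at-risk students the screen catches.
    Specificity: share of proficient students the screen leaves alone."""
    tp = sum(f and t for f, t in zip(flagged, truly_at_risk))          # caught
    fn = sum((not f) and t for f, t in zip(flagged, truly_at_risk))    # missed (surprised and unhappy)
    tn = sum((not f) and (not t) for f, t in zip(flagged, truly_at_risk))
    fp = sum(f and (not t) for f, t in zip(flagged, truly_at_risk))    # over-flagged (surprised but happy)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical CBM scores (words correct per minute) and true risk status.
scores        = [12, 25, 38, 41, 55, 60, 72, 80]
truly_at_risk = [True, True, False, True, False, False, False, False]

sens, spec = sensitivity_specificity(screen(scores, cutoff=40), truly_at_risk)
```

Raising the cut-off catches more of the truly at-risk students (higher sensitivity) at the cost of flagging more proficient ones (lower specificity), which is why selecting a cut score is a judgment call rather than a purely statistical one.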
Webinar 3 Tier I Core Instruction Part II: Karl
Effective Core Instruction: Aligned to Standards, Aligned Language, Differentiated
Core instruction is designed to address the needs of 80% of students:
• Students with limited prior knowledge need systematic, explicit instruction
• Students with a large body of prior knowledge can perform with implicit or guided-discovery instruction
Effective Core Instruction: Research-Based Instructional Practices
Addressing a range of academic and behavioral skills:
1. Differentiated instruction (content/process/product) so heterogeneous groups can make progress
  • Pre-skills identified and taught
  • Universal Design for Learning (UDL)
2. High rates of student response to teacher talk
  • Frequent, clear, and specific feedback
3. Coordination between services (core and intervention)
  • Vertically aligned curriculum and standards
4. Aligned instructional language
  • Common language and vocabulary
Evidence-Based Practices that Strengthen Core Instruction
These inform intensity of and access to instruction:
• Horizontal and vertical alignment
• Coaching/mentoring of evidence-based practices
• Interpretation of data
  • Quantitative data
  • Student work
  • Effectiveness of implementation
• Alternatives or options to strengthen student response
• Collaboration to improve instruction based on performance data
See SLD Manual Chapters 4 and 6.
For Example: Scientifically-Based Reading Instruction
• Instructional Content: phonemic awareness, phonics, fluency, vocabulary, comprehension
• Instructional Design: explicit instructional strategies, coordinated instructional sequences, ample practice opportunities, aligned instructional materials
Matching Needs: Use Multiple Sources of Data
An accurate match of student needs to intervention draws on:
• Engaged time or ODRs (office discipline referrals)
• Attendance data
• Screening scores: CBM, MCA, MAP
• Student work
  • Language
  • Reading: decoding/word work, comprehension strategies
  • Math: number sense, fact fluency
• Social-emotional regulation (e.g., attendance and anger management)
• Medical (e.g., glasses, blood-sugar monitoring)
Criteria for Matching Needs with Instruction
Intended use of data:
• Discriminate between high-risk, low-risk, and average performers
• Establish similar decisions across educators
• Make reliable decisions across time
• Cross-validate with informal measures and teacher judgments
• Tease out inconsistencies or nuances in performance
  • Open- vs. closed-ended response
  • Timed vs. untimed
  • Automatic vs. acquisition stage of learning