
Enhancing School Improvement with Differentiated Accountability

Learn how Differentiated Accountability can align supports for struggling schools, improve instruction, and boost student achievement. Develop effective problem-solving strategies for educators. Implement targeted and monitorable improvement plans.



Presentation Transcript


  1. Differentiated Accountability and PS/RtI: Leading the School Improvement Process. Presented by: Rebecca Sarlo, Ph.D., Region IV Response to Intervention Specialist, Florida Department of Education. Dr. Eric J. Smith, Commissioner

  2. Connections Differentiated Accountability… • Utilizes the PS/RtI framework to align and assign supports for Florida’s most struggling schools • Prescribes interventions and intervention supports which build capacity for effective problem solving for all educators • Utilizes an instructional review process and school improvement planning built on a PS/RtI framework to foster targeted, manageable, and monitorable school improvement.

  3. PS/RtI and School Improvement Key Differentiated Accountability Beliefs: • School improvement should focus on 1) improving instruction and 2) increasing student achievement. • All administrators and teachers need to know what effective instruction looks like, how to plan for it, how to deliver it, and how to assess it. • Improving instruction leads to increased student achievement. • School Leadership Teams should utilize the problem solving process to identify instructional, curricular, and environmental barriers to improving student achievement. • School Improvement Plans (SIPs) should focus on removing or lessening the impact of these barriers. • Professional development should be focused on building teams’ capacity to remove barriers to student achievement.

  4. 2010/2011 School Improvement Plan Changes • Student Achievement Trend Data • Highly Qualified (HQ) Administrators, Instructional Coaches, and Teachers • Non-Highly Qualified Instructors • Staff Demographics • Teacher Mentoring Program • Coordination and Integration (Title I Schools Only) • Response to Instruction/Intervention (RtI) • Literacy Leadership Team (LLT) • NCLB Public School Choice • Preschool Transition and/or Postsecondary Transition • Problem-Solving Process to Increase Student Achievement • Goals for Reading, Writing, Mathematics, and Science • Goals for Attendance, Suspension, and Parent Involvement • Additional Goals • Problem-Solving Process to Improve Instruction • Professional Development (PD) and/or Professional Learning Communities (PLCs) • Budget • Differentiated Accountability (DA) Strategies and Support • Requirements and Checklists • School Advisory Council (SAC)

  5. Content Changes in 2010-2011 SIP Template • The school demographic section, including school mission and vision, was eliminated; goals should still be connected to the mission and vision of the school. • Florida’s Continuous Improvement Model (FCIM) is no longer a separate section; FCIM should be embedded as an instructional strategy within the Expected Improvements section. • Problem Solving (PS) is embedded within the Goals chart template; language and processes have been more closely aligned with PS/RtI. • A Barriers section has been added to assist teams in analyzing why differences exist between current and expected levels of performance. • The Professional Development (PD) plan and Professional Learning Community (PLC) functions are integrated; the expectation is that most content-specific professional development will occur through PLCs, and PD is expanded to include coaching and collective inquiry (e.g., Lesson Study).

  6. Table Talk • How have schools in your district approached school improvement planning in the past? • How involved were teachers, parents, and students in the development of the SIP? • How effective was this approach? • What changes, if any, need to occur to improve the school improvement planning process?

  7. Expected Improvements: Problem Solving Process to Increase Student Achievement • Data Analysis: Use the data analysis section to guide your determination of the areas in need of improvement. Identify areas for which there is a difference between students’ current levels of performance and the expected levels of performance. • Goals: Align goals with the areas in need of improvement identified through data analysis. Goals should be specific, measurable, attainable, realistic, and time-bound (SMART). Safe Harbor targets constitute minimal improvement goals. • Anticipated Barriers: Identify what barriers have precluded or could preclude students from meeting the expected levels of performance and/or achievement goals. Review relevant instructional, curricular, and environmental variables. Focus on alterable barriers only. (Problem Identification • Problem Analysis)

  8. Expected Improvements: Problem Solving Process to Increase Student Achievement Instruction/Intervention Design and Implementation • Strategies • Given the identified barriers, what strategies will be implemented to address these barriers and ensure the goal is met? • Person or Position Responsible for Monitoring Strategies • Identify the person or position who will be responsible for ensuring the strategy is implemented with fidelity. • Describe how fidelity of action steps will be monitored. • Process Used to Determine Effectiveness of Strategies • Describe the process that will be used to determine if the strategy/action is effective. • Effectiveness of strategies is determined through the analysis of student data. • Effective strategies will result in a decrease in the difference between expected and current levels of performance. • Evaluation Tool • Describe/Name the assessment tool to measure student achievement linked to objectives. (Fidelity Check • Program Evaluation)

  9. Improving the School Improvement Process • Consensus is built through data analysis and problem analysis. • The more stakeholders involved in analyzing data, identifying barriers, and planning instructional strategies, the more buy-in to the school improvement plan. • The more buy-in to the school improvement plan, the more consistent the implementation of school improvement strategies. • The more consistent the implementation of school improvement strategies, the better the student outcomes.

  10. Identifying and Defining Problems

  11. SIP Goals Charts *When using percentages, include the number of students the percentage represents (e.g., 70% (35)).

  12. School Improvement Planning Student Achievement Data Trend data will assist schools in determining the difference between expected levels of performance and actual student performance over time. This analysis will help determine a need and focus for school improvement efforts. • School Grades Trend Data (Use this data to complete Sections 1-4 of the reading and mathematics goals and Sections 1 and 2 of the writing and science goals.) • Adequate Yearly Progress (AYP) Trend Data (Use this data to complete Sections 5A-5D of the reading and mathematics goals and Sections 3A-3D of the writing goals.) • Florida Comprehensive Assessment Test (FCAT) Trend Data (Use this data to inform the problem solving process when writing goals.)

  13. Problem Identification, Step 1: Current Level of Performance • Identify the Current Level of Performance. • Analyze school-wide and AYP subgroup data: • FCAT data • Percent of students with excessive absences • Percent of students with discipline referrals and/or suspensions • Graduation and dropout rates • Percent of students on-track/off-track for graduation • Percent of students enrolled in accelerated coursework • Performance rates of students within accelerated coursework • Percent of students scoring college ready on college readiness exams • Percent of FCAT level 3 or higher students taking college readiness exams

  14. Problem Identification, Step 2: Expected Level of Performance • Identify the Expected Level of Performance. • School-wide or AYP subgroup • Utilize Proficiency Standards or Safe Harbor goals to determine expected levels for academics. • Set ambitious yet realistic goals to improve attendance, suspensions, enrollment and performance in accelerated coursework, and graduation rates.

  15. Determining Expected Levels Utilizing Safe Harbor Example • Determine the percent of non-proficient students: 49% of African American students scored level 3 or higher on the Reading FCAT during the 2009-2010 school year; 100% - 49% = 51%, so 51% of African American students are non-proficient. • Plan to reduce non-proficient students by 10%: 10% of 51% is 5.1%. • Always round up: 5.1% is rounded to 6%. • Add the required improvement to the current level: 49% + 6% = 55% Safe Harbor Goal.
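The Safe Harbor arithmetic above is easy to script. A minimal sketch of that calculation; the function name and variable names are illustrative, and only the 10% reduction rule and round-up step come from the slide:

```python
import math

def safe_harbor_goal(current_proficiency_pct):
    """Safe Harbor goal from the current percent proficient, per the example above.

    Reduce the non-proficient share by 10%, round the required gain up to a
    whole percent, then add it to the current proficiency rate.
    """
    non_proficient = 100 - current_proficiency_pct     # 100% - 49% = 51%
    required_gain = math.ceil(non_proficient * 0.10)   # 10% of 51% = 5.1% -> rounds up to 6%
    return current_proficiency_pct + required_gain     # 49% + 6% = 55%

print(safe_harbor_goal(49))  # 55
```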

  16. Example Guiding Questions • What does the data tell us? • What is the difference between expected and current levels of performance? • Are there groups of students for whom the core is not effective? • Are our results consistent with our mission and vision? • What results should we expect next year if we do not make changes to our system (i.e., instruction, curriculum, environment)?

  17. Table Talk Discuss the professional development that school personnel in your district may need regarding: • Determining current and expected levels of performance • Data management and manipulation (e.g., graphing) • Data analysis

  18. Identifying Barriers to and Drivers of Student Achievement

  19. Problem Analysis • Allows us to identify barriers to reaching expected levels • Is key to strategic planning • Is often skipped because it takes time and effort to complete • Will lead to more effective instructional and intervention plans • Will allow for more efficient application of instructional and intervention resources

  20. Identifying Barriers to Student Achievement • Collect and compile relevant information from multiple domains. • I = Instruction • C = Curriculum • E = Environment • L = Learner • Consider multiple methods for data collection. • R = Review existing data • I = Interview or survey stakeholders • O = Observe instructional environment • T = Test to obtain diagnostic data
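One way to keep the ICEL domains crossed with the RIOT data-collection methods is a simple planning grid the team fills in as it decides how each domain will be examined. A sketch only; the example entries are hypothetical and not prescribed by the slide:

```python
# Build an empty ICEL-by-RIOT grid; each cell holds the data sources the
# team plans to use while searching for barriers.
domains = ["Instruction", "Curriculum", "Environment", "Learner"]
methods = ["Review", "Interview", "Observe", "Test"]

plan = {domain: {method: [] for method in methods} for domain in domains}

# Illustrative entries only (hypothetical planned data sources).
plan["Learner"]["Review"].append("FCAT history and progress monitoring data")
plan["Instruction"]["Observe"].append("classroom walkthroughs focused on explicit instruction")

for domain, row in plan.items():
    filled = {method: sources for method, sources in row.items() if sources}
    print(domain, filled)
```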

  21. Problem Analysis Learner Variables to review include: • Skills • Motivation • Health • Prior knowledge • Consider what learner variables tell us about the need to manipulate instructional, curricular, and environmental variables.

  22. Problem Analysis • Use information to identify instructional, curricular, and environmental barriers. • The problem is occurring because… • Determine if the barriers are alterable. • Discard unalterable barriers. • Develop prediction statements. • If we…the problem would be eliminated/reduced.

  23. DA Instructional Review Elements • Classroom Culture & Environment • Instructional Tools & Materials • Lesson Planning & Delivery • Higher Order Questioning & Discourse • Student Engagement • Rigorous Tasks & Assessments • Differentiated Instruction • Cross Content Reading & Writing Instruction • Florida Continuous Improvement Model • School & District Leadership

  24. Identified Barriers and School Improvement Planning • A thorough analysis will likely result in multiple identified barriers. • Some barriers will be identified across content areas. • e.g., grade level vocabulary is not directly taught • Some barriers may be unique to specific content areas. • e.g., core science instruction rarely includes the use of science labs • Some barriers have an impact on all aspects of the organization. • e.g., a significant amount of instructional time is lost due to student absenteeism

  25. Identified Barriers and School Improvement Planning • Prioritize! • Are there barriers that need to be addressed first before other barriers can be addressed? • Are there barriers that are cross-content and could be addressed as a school-wide focus? • Are there barriers that if addressed would have greater impact on student outcomes than others? • Are there barriers that we are ready to address right now (i.e., we have the expertise, support, and materials to address the barrier)? • Prioritize barriers which are foundational in nature, school-wide, high yield, and immediately actionable.

  26. SIP Goals Charts *When using percentages, include the number of students the percentage represents (e.g., 70% (35)).

  27. Problem Analysis: Challenges and Opportunities • School personnel, particularly at the secondary level, typically require significant support for problem analysis in order to move past student motivation and parent involvement barriers. • Secondary students must be involved in the identification of barriers and, as much as possible, in the selection of strategies to address barriers.

  28. Table Talk • What type of support is provided to schools in your district to assist them in identifying barriers to student achievement, attendance, pro-social behavior, and graduation? • How are students involved in this process? • What type of support is needed at the school level to get more stakeholders, including students, involved in the identification of barriers to student achievement, attendance, pro-social behavior, and graduation?

  29. Instruction/Intervention Design and Implementation

  30. SIP Goals Charts *When using percentages, include the number of students the percentage represents (e.g., 70% (35)).

  31. Intensifying Core Instruction • Increase time and response opportunities. • Improve core program efficacy. • Improve core program implementation. • Decrease group size. • Increase coordination of programming and instruction. Simmons, 2003

  32. Characteristics of Effective Interventions Focus on intervention rather than remediation. Intervention response is: • Systematic • Practical • Effective • Essential • Directive

  33. Instructional Design and Implementation • Instructional and intervention strategies are chosen to address identified barriers. • Common Mistake: Teams select strategies which are not directly linked to identified barriers. • Example of Common Mistake: • Indicator: Student absenteeism is chronic and pervasive in Algebra 1 and Geometry classes. • Strategy: School provides math tutorials during extended learning program (ELP).

  34. Instructional Design and Implementation • Select strategies to address specific barriers. • Break strategies down into manageable chunks (action steps). • Determine timelines and identify who is responsible for each action step. • Determine what data will be collected to monitor progress. • Instruction and intervention effectiveness is always determined by a review of student data.
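The strategy-to-action-step breakdown described above can be kept in a small record so the timeline, the person responsible, and the monitoring data stay attached to each step. A minimal sketch; the field names and example values are assumptions, not part of the SIP template:

```python
from dataclasses import dataclass, field

@dataclass
class ActionStep:
    """One manageable chunk of a SIP strategy (hypothetical structure)."""
    description: str
    responsible: str                                      # person or position responsible
    timeline: str                                         # e.g., "by end of first nine weeks"
    monitoring_data: list = field(default_factory=list)   # data collected to monitor progress

step = ActionStep(
    description="Provide weekly PLC time to plan explicit vocabulary instruction",
    responsible="Literacy coach",
    timeline="Weeks 2-18",
    monitoring_data=["PLC agendas", "walkthrough notes", "common assessment results"],
)
print(step)
```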

  35. Instructional Design and Implementation • Determine… • What resources are needed to implement the strategy? • What personnel support is needed to implement the strategy with fidelity? • What, if any, professional development is needed to implement the strategy with fidelity? • Plan professional development on SIP • How will fidelity of implementation be monitored?

  36. Professional Development (PD) and/or Professional Learning Communities (PLCs) • Determine the professional development plan to support the implementation of SIP strategies. • Professional development includes traditional training, coaching, and lesson study. • All professional development should be aligned with the strategies, should address anticipated barriers to implementing the strategies, and should be focused on improving instruction to increase student achievement.

  37. Table Talk • What supports are needed to assist schools in identifying and implementing instructional/intervention strategies to address barriers to student development? • How will school personnel’s working knowledge of research-based instructional practices and interventions impact their school improvement planning? • How could staff development improve this process?

  38. Determining Effectiveness of Instructional Strategies

  39. Team Talk • How do school teams in your district typically evaluate the effectiveness of their instruction/interventions? • Does this method allow for timely data-based decision making based on student response to instruction/intervention?

  40. Program Evaluation Measuring student response to instruction/intervention • Compare students’ rate of progress to the rate of progress required to close the current level/expected level gap. • Examine rate of progress for all disaggregated groups. • Modify interventions for groups for whom intervention effectiveness is questionable or poor.
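Comparing an observed rate of progress with the rate needed to close the current/expected gap can be written out directly. A sketch with illustrative numbers; the function names and the points-per-week unit are assumptions:

```python
def rate_needed(current_level, expected_level, weeks_remaining):
    """Rate of progress (points per week) required to close the gap by the goal date."""
    return (expected_level - current_level) / weeks_remaining

def on_track(observed_rate, needed_rate):
    """Response is positive when the observed rate meets or exceeds the needed rate."""
    return observed_rate >= needed_rate

# Illustrative only: a subgroup gaining 0.4 points/week when 0.6 points/week is needed.
needed = rate_needed(current_level=300, expected_level=318, weeks_remaining=30)
print(needed)                  # 0.6
print(on_track(0.4, needed))   # False -> modify the intervention for this group
```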

  41. Student Response & Instruction/Intervention Decisions Positive Response • Continue intervention with current goal • Continue intervention with goal increased • Fade intervention to determine if student(s) have acquired functional independence.

  42. Celebrate • Keep current interventions until goal is met • Once goal is met, determine if interventions can be faded • (Chart: trend line vs. aim line)

  43. Student Response & Instruction/Intervention Decisions Questionable Response • Was intervention implemented as intended? • If no, employ strategies to increase implementation fidelity. • If yes, increase intensity of current intervention for a short period of time and assess impact. If rate improves, continue. If rate does not improve, return to problem solving.

  44. Continue current interventions, extend the monitoring phase, and consider: • Increasing time of intervention • Increasing intensity of intervention • Having the student self-monitor or self-evaluate performance

  45. Student Response & Instruction/Intervention Decisions Poor Response • Was intervention implemented as intended? • If no, employ strategies to increase implementation fidelity. • If yes: Is the intervention aligned with the verified barriers? (Intervention Design) Are there other barriers to consider? (Problem Analysis) Was the problem identified correctly? (Problem Identification)

  46. Recycle through the PS process and address: • Do the students have the needed prerequisite skills? • Should alternative hypotheses be examined? • (Chart: aim line vs. trend line)

  47. Instruction/Intervention Fidelity Decisions • Plan research-based instruction/intervention linked to the verified hypothesis. • Implement the research-based instruction/intervention. • Assess Student Outcomes (SO) and Fidelity (F), then make data-based decisions: • +SO, +F: Continue instruction/intervention. • -SO, -F: Implement strategies to promote instruction/intervention fidelity. • -SO, +F: Modify/change instruction/intervention. • Adapted from Lisa Hagermoser Sanetti, 2008 NASP Convention
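The decision rules on this slide reduce to a two-question branch on assessed Student Outcomes (SO) and Fidelity (F). A sketch of that branch; the slide does not list a +SO/-F cell, so this version simply continues the intervention whenever outcomes are positive:

```python
def fidelity_decision(outcomes_positive: bool, fidelity_adequate: bool) -> str:
    """Map assessed Student Outcomes (SO) and Fidelity (F) to the decisions above."""
    if outcomes_positive:
        # +SO, +F -> continue; the +SO, -F case is not covered by the slide.
        return "Continue instruction/intervention"
    if not fidelity_adequate:
        # -SO, -F -> fidelity is the first suspect.
        return "Implement strategies to promote instruction/intervention fidelity"
    # -SO, +F -> the intervention itself needs to change.
    return "Modify/change instruction/intervention"

print(fidelity_decision(outcomes_positive=False, fidelity_adequate=True))
```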

  48. Program Evaluation • Data must be effectively managed to allow program evaluation decisions to be made (e.g., Excel, SWIS, PMRN). • Interventions are ALWAYS evaluated first. • Use data to determine for whom the intervention is working and for whom it is not. • Problem-solve around ineffective interventions.
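When the data system can export student-level records, disaggregating intervention response by group is a short script. A sketch assuming a hypothetical export with `subgroup` and `met_needed_rate` columns; the column names and data are illustrative, not from any particular system:

```python
import pandas as pd

# Hypothetical student-level export: subgroup membership and whether each
# student's rate of progress met the rate needed to close the gap.
df = pd.DataFrame({
    "subgroup": ["ED", "ED", "ED", "SWD", "SWD", "ELL", "ELL", "ELL"],
    "met_needed_rate": [True, True, False, False, False, True, False, False],
})

# Percent of students responding adequately, by subgroup; groups with low
# response rates become the focus of problem solving around the intervention.
response_rates = df.groupby("subgroup")["met_needed_rate"].mean().mul(100).round(1)
print(response_rates.sort_values())
```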

  49. Team Talk • What data would schools need to consistently collect and analyze to evaluate the effectiveness of their school improvement strategies? • Does your district’s data management system allow for efficient, ready access to student data? • What type of professional development and on-going support do school personnel need to access and manage student data and determine students’ response to instruction/intervention?
