
Decision Making for Results


Presentation Transcript


  1. Decision Making for Results

  2. Part One: Objectives
  • Develop a deeper understanding of the Decision Making for Results: Data-Driven Decision Making process
  • Increase awareness of the relevance of data and its impact on leadership, teaching, and learning
  • Reinforce the importance of collecting both cause and effect data

  3. Objectives
  • Apply the Decision Making for Results: Data-Driven Decision Making process to monitor leadership, teaching, and learning
  • Implement the Decision Making for Results: Data-Driven Decision Making process to monitor school improvement

  4. Principles of Decision Making for Results
  • Antecedents
  • Accountability
  • Collaboration

  5. Seminar Overview
  • Introduction
  • Building the foundation
  • Process and application
  • Action planning

  6. Becoming Data-Driven
  How are you currently embracing a data-driven decision making process that leads to results?

  7. Results-Driven Schools
  Where is the proof?
  • 90/90/90 Schools, Reeves, 2003
  • Education Trust, 2002
  • NCREL, 2000
  • Consortium for Policy Research in Education, 2000
  • EdSource, 2005
  • Northern Illinois University Center for Governmental Studies, 2004

  8. Reflection “The value of the data emerges only when analysis provides insights that direct decisions for students.” S. White, 2005

  9. Part Two: Building the Foundation
  • Cause data and effect data
  • Continuous improvement cycle
  • Principles and processes of Decision Making for Results: Data-Driven Decision Making

  10. “Only by evaluating both causes and effects in a comprehensive accountability system can leaders, teachers, and policymakers understand the complexities of student achievement and the efficacy of teaching and leadership practices.” Reeves, 2006

  11. Definitions and Examples
  • Effect data: outcomes or results
  • Cause data: professional practices that create specific effects or results

  12. The Leadership & Learning Matrix

  13. PIM (Planning, Implementation, and Monitoring)

  14. Part Three: Process and Application

  15. Ocean View Elementary School: A Look at Collaboration

  16. The Process for Results
  • Inquiry: develop questions
  • Treasure Hunt
  • Analyze to Prioritize
  • SMART Goals
  • Specific Strategies
  • Results Indicators
  • Monitor & Evaluate Results

  17. Inquiry
  “Data-driven decision making begins by asking fundamental questions.” Doug Reeves
  • What questions do you have about teaching and learning in your school?
  • What data sources are you using to gather the specific information?

  18. Step 1: Conduct a Treasure Hunt
  • Why? To gather and organize data in order to gain insights about teaching and learning practices
  • Considerations:
    – Measures of data
    – Disaggregation
    – Triangulation
    – Reflection

  19. Measures of Data
  • Student learning
  • Demographics
  • Perceptions
  • School processes – behaviors within our control: instructional and leadership strategies, programs and resources, and organization

  20. Disaggregation
  • To separate something into its component parts, or break apart
  • “Disaggregation is not a problem-solving strategy. It is a problem-finding strategy.” Victoria Bernhardt, Data Analysis, 1998
  • Think, pair, share: What data do you disaggregate, and how do you use the information?
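
The mechanics of disaggregation are easy to automate once assessment results are in tabular form. Below is a minimal, hypothetical sketch in Python/pandas; the column names, subgroups, and scores are invented for illustration and are not part of the seminar materials.

```python
import pandas as pd

# Hypothetical assessment results; in practice these would be loaded
# from a student information system or an assessment export.
scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "subgroup":   ["ELL", "ELL", "SWD", "SWD", "Gen Ed", "Gen Ed"],
    "reading_score": [412, 431, 398, 405, 467, 455],
})

# Disaggregate: break the aggregate mean apart by subgroup so gaps
# hidden in the overall average become visible (problem *finding*,
# not problem solving, per Bernhardt).
overall = scores["reading_score"].mean()
by_subgroup = scores.groupby("subgroup")["reading_score"].agg(["mean", "count"])

print(f"Overall mean: {overall:.1f}")
print(by_subgroup)
```

The overall mean can look healthy while a subgroup lags well behind it, which is exactly the problem-finding Bernhardt describes.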

  21. Triangulation: A Look at Learning
  • DRA
  • Running Records
  • Benchmark
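
Triangulation asks whether independent measures tell the same story about a learner. Here is a minimal sketch, assuming each measure has already been reduced to a proficient/not-proficient judgment; the data and field names are invented for illustration.

```python
import pandas as pd

# Hypothetical proficiency judgments from three independent reading
# measures for the same students (True = on/above grade level).
measures = pd.DataFrame({
    "student_id":      [1, 2, 3, 4],
    "dra":             [True, False, True, False],
    "running_records": [True, False, False, False],
    "benchmark":       [True, True, True, False],
})

# Triangulate: count how many of the three sources judge the student proficient.
cols = ["dra", "running_records", "benchmark"]
measures["sources_proficient"] = measures[cols].sum(axis=1)

# Full agreement (0 or 3) is a trustworthy signal; a 1-2 split flags
# students whose data conflict and deserve a closer look.
measures["conflicting"] = measures["sources_proficient"].isin([1, 2])
print(measures)
```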

  22. Case Study
  • Read case study
  • Part 1: How did they categorize the different data sets and record their observations?
  • Part 2: What did they discover?

  23. Conduct a Treasure Hunt – Application
  • Review inquiry questions
  • Conduct a “Treasure Hunt”
  • Organize data on templates
  • Use rubric to monitor and evaluate your work

  24. Can You Identify with This? “It is not so much a lack of data, but an absence of analysis, and an even greater absence of actions driven by the data.” White, 2005

  25. Step 2: Analyze Data to Prioritize Needs
  Data Analysis at Northside Middle School

  26. Analyze Data to Prioritize Needs
  • Why? To identify causes for celebration and to identify areas of concern
  • Considerations:
    – Strengths
    – Needs
    – Behavior
    – Rationale

  27. Quality Prioritization
  • Why? To take immediate action on the most urgent needs
  • Quality prioritization requires a thorough understanding of:
    – Student population
    – Curriculum and Power/Priority Standards (leverage, readiness)
    – Antecedents affecting student achievement
    – Quality of program implementation
  White, 2005
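
Teams sometimes make prioritization explicit by scoring each need against agreed criteria. The sketch below is one hypothetical way to do that; the criteria, weights, and needs are invented for illustration and are not prescribed by the Decision Making for Results process.

```python
# Hypothetical scoring: each need is rated 1-5 against prioritization
# criteria; the weights reflect a team's judgment of relative urgency.
criteria_weights = {"students_affected": 0.40, "leverage": 0.35, "readiness": 0.25}

needs = {
    "Grade 3 reading comprehension": {"students_affected": 5, "leverage": 4, "readiness": 3},
    "Grade 5 math problem solving":  {"students_affected": 3, "leverage": 4, "readiness": 4},
}

def priority_score(ratings: dict) -> float:
    """Weighted sum of a need's ratings across the criteria."""
    return sum(criteria_weights[c] * r for c, r in ratings.items())

# Rank needs from most to least urgent.
for need, ratings in sorted(needs.items(), key=lambda kv: -priority_score(kv[1])):
    print(f"{priority_score(ratings):.2f}  {need}")
```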

  28. Case Study
  • Review case study
  • What insights did you gain after reading the analysis of student performance?
  • Make a recommendation: What is the most urgent need?

  29. Review, Analyze, and Prioritize – Application
  • Review data from Step 1
  • Conduct analysis using the guiding questions
  • Prioritize urgent needs using the suggested criteria
  • Record your work on the templates
  • Use rubric to monitor and evaluate your work

  30. Step 3: Establish SMART Goals
  • Why? To identify our most critical goals for student achievement based on the challenges that were identified through the inquiry process
  • Specific, Measurable, Achievable, Relevant, Timely
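
Because a SMART goal is measurable, with a baseline and a target, progress toward it can be checked mechanically. A hypothetical sketch follows; the goal wording, numbers, and function are illustrative only.

```python
# Hypothetical SMART goal: "The percentage of grade 3 students scoring
# proficient in reading will increase from 62% to 75% by June, as
# measured by the district benchmark assessment."
BASELINE = 0.62   # from the Treasure Hunt baseline data
TARGET   = 0.75   # the Measurable and Timely pieces of the goal

def progress_report(current: float) -> str:
    """Report how far the current result has moved from baseline toward target."""
    gained = current - BASELINE
    needed = TARGET - BASELINE
    pct_of_goal = 100 * gained / needed
    return (f"Proficiency {current:.0%}: {pct_of_goal:.0f}% of the way "
            f"from baseline ({BASELINE:.0%}) to target ({TARGET:.0%}).")

print(progress_report(0.68))  # e.g., a mid-year benchmark result
```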

  31. Establish Your SMART Goals – Application
  • Review prioritized needs
  • Review Treasure Hunt baseline data
  • Apply SMART goal formula; use templates to record your work
  • Use rubric to monitor and evaluate your work

  33. Share Your Findings with Colleagues
  • Meet in the middle of the room
  • Be prepared to share your findings from Steps 1–3
  • Highlight one celebration from a colleague

  34. Step 4: Select Specific Strategies
  Let’s watch Lake Taylor High School as they discuss strategies.

  35. Select Specific Strategies
  • Why? Adult actions will impact student achievement
  • Strategies are:
    – Action-oriented
    – Measurable/accountable
    – Specific
    – Research-based
  • Considerations: instructional, organizational, leadership, programmatic

  36. Research-Based Strategies
  • Reeves, D.B. (2003). 90/90/90 schools. Retrieved from www.LeadandLearn.com
  • Reeves, D.B. (2006). Ten things high schools can do right now to improve student achievement.
  • Learning 24/7 Observation Study (2005). What’s happening in schools? Or not?

  37. Additional Evidence in Support of Research-Based Strategies
  • Zemelman, S., Daniels, H., & Hyde, A. (2005). Best practice. Portsmouth, NH: Heinemann.
  • Marzano, R. (2007). The art & science of teaching. Alexandria, VA: ASCD.
  • Barr, R., & Parrett, W.H. (2007). The kids left behind. Bloomington, IN: Solution Tree.
  • Marzano, R., Waters, T., & McNulty, B. (2005). School leadership that works. Alexandria, VA: ASCD.

  38. Let’s Do It! Guided Practice

  39. Case Study
  • Revisit case study analysis
  • What types of strategies (instructional, organizational, leadership, programmatic) did they select?
  • How will the strategies help students overcome the obstacles?

  40. Select Your Specific Strategies
  • Revisit your prioritized needs
  • Research the best possible strategies to meet the learner needs
  • Group by type of strategy: instructional, organizational, programmatic, and leadership
  • Use rubric to monitor and evaluate your work

  41. Step 5: Determine Results Indicators
  Why? To monitor the degree of implementation and evaluate the effectiveness of the strategies

  42. Results Indicators
  • Considerations:
    – Serve as an interim measurement
    – Used to determine effective implementation of a strategy
    – Used to determine if a strategy is having the desired impact
    – Help to determine midcourse corrections
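
Since a results indicator pairs an implementation measure with an impact measure, an interim check can be expressed as a simple decision rule. The sketch below is hypothetical; the indicator, measures, and thresholds are invented for illustration.

```python
# Hypothetical results indicator for a writing-across-the-curriculum
# strategy: implementation = share of classrooms where scored student
# writing was observed this cycle; impact = share of students meeting
# the rubric target on an interim assessment.
def check_indicator(implementation: float, impact: float) -> str:
    """Classify an interim data point to guide midcourse corrections."""
    if implementation < 0.80:
        # Weak implementation: fix fidelity before judging the strategy.
        return "Strengthen implementation before evaluating impact."
    if impact < 0.60:
        # Faithfully implemented but not working: adjust the strategy.
        return "Implemented with fidelity but low impact: make a midcourse correction."
    return "On track: continue the strategy and keep monitoring."

print(check_indicator(implementation=0.65, impact=0.55))
print(check_indicator(implementation=0.90, impact=0.72))
```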

  43. Case Study
  • Review case study
  • How will their results indicators serve as an interim measurement?
  • How clearly will the results indicators help to monitor implementation and impact?

  44. Results Indicators – Application
  • Revisit strategies (Step 4)
  • Develop results indicators
  • Use rubric to monitor and evaluate your work

  45. “Improvement cycles require leadership follow-up and relentless efforts to maintain the focus on data if decisions are truly going to be driven by informed data.” White, 2005

  46. Step 6: Monitor and Evaluate Results
  Why? To engage in a continuous improvement cycle that:
  • Identifies midcourse corrections where needed
  • Adjusts strategies to ensure fidelity of implementation

  47. Case Study
  • Review the case study
  • How did they monitor strategies?
  • Was there any evidence of midcourse corrections?

  48. Develop Your Monitoring Plan
  • Review your work from developing questions through determining results indicators, then determine how you will monitor the strategies.
  • When you create your monitoring plan, consider:
    – Teacher or administrator teams
    – Monitoring cycles
    – Goals
    – Strategies
    – Impact on student and adult behavior
    – Ability to make midcourse corrections
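
A monitoring plan can be captured as a simple schedule tying each strategy to a team, a monitoring cycle, and its results indicator. The sketch below is hypothetical; all names, dates, and fields are invented for illustration.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class MonitoringEntry:
    strategy: str          # the specific strategy from Step 4
    team: str              # teacher or administrator team responsible
    cycle_days: int        # length of one monitoring cycle
    results_indicator: str # interim measure from Step 5
    last_review: date

    def next_review(self) -> date:
        """When the team next reviews this strategy's indicator data."""
        return self.last_review + timedelta(days=self.cycle_days)

plan = [
    MonitoringEntry("Nonfiction writing with scoring rubric",
                    "Grade 3 team", 30,
                    "80% of classrooms show scored student writing",
                    date(2024, 10, 1)),
]

for entry in plan:
    print(f"{entry.strategy} ({entry.team}): next review {entry.next_review()}")
```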
