
FDRESA Design Team Training



Presentation Transcript


  1. FDRESA Design Team Training Candler County Schools Day One Presented by Dr. Sharonda W. Johnson

  2. Pre-assessment Results

  3. Data Analysis Strand

  4. Progress Monitoring Strand

  5. Planning and Organization Strand

  6. “Results are achieved when leaders implement—transferring their learning into practice; collaborating with others to find solutions; managing, monitoring and supporting adoption of new behaviors by those they lead; and measuring the impact on performance.” --Georgia’s Leadership Institute for School Improvement

  7. Essential Questions • How can we use the Design Team process to analyze our current practices and to design a systematic, systemic approach to school improvement? • How can we ensure that priority interventions are implemented and are effective?

  8. Design Team: Areas of Work • Leading staff in the analysis of data and identification of targets for improvement • Leading the staff in prioritizing interventions • Benchmarking improvement plan activities • Monitoring implementation • Leading the staff in modifying the plan at least annually

  9. Design Team Task • Read your assigned role/responsibility. • Discuss each one, using discussion guidelines to answer the following questions: --What does this item ask us to do? --Why would this task be assigned to the design team? --How might this action benefit students?

  10. Brainstorming and Discussion Guidelines for More Productive Interaction – Brainstorming • Appoint a recorder for the group. • Move in consecutive order around the group, with each person contributing an idea or saying, “I pass.” • Allow no discussion at this time. • Limit contributions to 20 seconds. • Piggyback on others’ ideas, extending or adding to an idea already offered. From Mike Schmoker's Results Now

  11. Brainstorming and Discussion Guidelines for More Productive Interaction – Discussion • Appoint a facilitator and recorder. • Restrict comments to information directly linked to questions under discussion. • No one speaks a second time until everyone who wishes to be heard has been heard. • Give facts versus opinions, or give facts to support opinions. • Listen to sort fact from opinion and ask clarifying questions when needed. • Pause periodically for the facilitator to summarize discussion points. • The facilitator must refocus the group when members stray from the question under discussion.

  12. Design Team Activity • Individuals read all roles and responsibilities once again. • Place a + beside items in which you feel you have skill and knowledge. • Place a – beside items in which you need more skill and information. • Share at your table. Be prepared to share with the large group.

  13. Data Collection: Student Data • Programs & Structures Data • Family & Community Data • Professional Practices Data. In successful schools, a thorough look at data guides decisions.

  14. Design teams will… • Review AYP reports in light of the higher requirements for meeting AMO targets in spring 2009. • Determine subjects and subgroups in need of intervention. • Conduct root-cause analysis to identify potential barriers to students’ learning. • Use findings to identify the actions in the various school improvement plans that most closely align with the new targets.

  15. State Proficiency Levels

  16. State Proficiency Levels

  17. AYP Guidelines • 10 or more students to be reported • 40 students or 10% of enrollment in AYP grades to be accountable (maximum 75) • Mathematics goal beginning spring 2009: 59.5% or 74.9% meeting or exceeding • Reading/ELA goal beginning spring 2009: 73.3% or 87.7% meeting or exceeding • Subgroups with current pass rates less than 5% above the new goals are in jeopardy

  18. As a design team… • Review AYP report for reading/ELA, highlighting any subgroups whose percent meeting or exceeding is 79% or less. • Review AYP report for mathematics, highlighting any subgroups whose percent meeting or exceeding is 65% or less. • Repeat procedure for second indicator selected for your school for FY 09 (upon return to school).
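
Where AYP results are available as an electronic export, the highlighting step described above can be automated. The sketch below is only an illustration of that check, using the cutoffs named on this slide (79% or less for Reading/ELA, 65% or less for mathematics); the file name ayp_results.csv and the column names subgroup, subject, and pct_meeting_exceeding are hypothetical, not part of any official AYP report format.

# flag_jeopardy.py – a minimal sketch (Python), assuming a hypothetical CSV
# export of AYP results with columns: subgroup, subject, pct_meeting_exceeding.
import csv

# Cutoffs taken from the slide: highlight Reading/ELA subgroups at 79% or less
# and Mathematics subgroups at 65% or less.
CUTOFFS = {"Reading/ELA": 79.0, "Mathematics": 65.0}

def subgroups_in_jeopardy(path="ayp_results.csv"):
    """Return (subgroup, subject, percent) tuples at or below the cutoff."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            subject = row["subject"]
            pct = float(row["pct_meeting_exceeding"])
            if subject in CUTOFFS and pct <= CUTOFFS[subject]:
                flagged.append((row["subgroup"], subject, pct))
    return flagged

if __name__ == "__main__":
    for subgroup, subject, pct in subgroups_in_jeopardy():
        print(f"{subgroup:25s} {subject:12s} {pct:5.1f}%  <- highlight")

The same list, printed or pasted into a spreadsheet, gives the team its starting set of subgroups and subjects in jeopardy.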

  19. Data Collection and Analysis (based on the work of Bernhardt, the Georgia Department of Education, Marzano, Reeves, Sargeant, and Schmoker) • Task 1: Organize Data/Create Table • Task 2: Graphic Representation • Task 3: Observe, Discuss, and Document • Task 4: Hypotheses • Task 5: Prioritize Primary Issues • Task 6: School/Classroom Connections
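
For teams that receive score files electronically, Task 1 (organize data/create a table) and Task 2 (graphic representation) can be roughed out in a few lines. This is a sketch under stated assumptions, not part of the training materials: the file crct_results.csv, the columns year, grade, subject, subgroup, and pct_meeting_exceeding, and the choice of pandas/matplotlib are all illustrative.

# organize_and_plot.py – rough sketch of Task 1 (organize data into a table)
# and Task 2 (graphic representation) for hypothetical CRCT results.
import pandas as pd
import matplotlib.pyplot as plt

scores = pd.read_csv("crct_results.csv")

# Task 1: organize the data into a year-by-subgroup table for one subject/grade.
math_3rd = scores[(scores["subject"] == "Mathematics") & (scores["grade"] == 3)]
table = math_3rd.pivot_table(index="year",
                             columns="subgroup",
                             values="pct_meeting_exceeding")
print(table)

# Task 2: graphic representation – one trend line per subgroup.
table.plot(marker="o")
plt.ylabel("% meeting or exceeding")
plt.title("3rd grade mathematics, CRCT, by subgroup")
plt.savefig("task2_trends.png")

Tables and charts produced this way feed directly into Task 3, where the team records factual observations about the patterns it sees.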

  20. While comparing student achievement, keep in mind that the assessments were developed from objectives linked to two distinct curricula (QCC/GPS). Results should therefore be used cautiously when considering trend data.

  21. Task 3: Observe, Discuss, & Document – note data patterns (Yellow Post-its). Guiding Question: What patterns do we observe in the data? Study the data and individually record observations on yellow post-it notes. (Be careful not to make judgments or to draw conclusions.) Observations must be written as factual statements. Example observations: Females have scored lower than males in 3rd grade math on the CRCT over a three-year period. SWDs in grades 6-8 have scored below all other subgroups on the Reading/ELA CRCT over a two-year period. Discuss the patterns that members see. Record the observations as “data findings” on the flip chart for all members to see. Be sure each statement indicates: What was the pattern, and over what period of time? What was the source? Which subjects or skills? Which students?

  22. Prioritize Concerns – Team Task • Look at all of the data findings that have been listed. • Use a group process to determine which of these concerns rise to the top as high priorities. • List the top 1-3 primary concerns as determined by group consensus. Observations → GOALS: As a team, write or rewrite an initial goal statement for each primary concern. Record on chart paper.

  23. Moving From Facts to Causes: Getting to the Root Cause

  24. Root Cause Analysis in 50 Words or Less (Rooney and Vanden Heuvel, 2004) • Root cause analysis helps identify what, how, and why something happened, thus preventing recurrence. • Root causes are underlying, are reasonably identifiable, can be controlled by management, and allow for the generation of recommendations. • The process involves data collection, cause charting, root cause identification, and recommendation generation and implementation.

  25. Task 4: Hypotheses – pose hypotheses for the data patterns observed (Green Post-its). What is a HYPOTHESIS? A theory; an assumption; an educated guess – the WHY! Hypotheses should: be explanations that come from school and classroom factors (example: Students of poverty are not gaining ample access to reading materials from our school); be explanations about practices that can be altered. Hypotheses should NOT: be about characteristics of individuals – students, parents, staff, or community members (example: These students are poor); be explanations about unalterable factors.

  26. What is it that we are doing or not doing that might contribute to these results? How can we explain our results in terms of our practices? The _____ grade _____ (subject) scores __________ (increased/decreased/stayed the same) because we…

  27. Hypotheses Examples • Female 3rd grade math scores were lower because we don’t utilize systematic questioning techniques such as collaborative partners. • Female math scores were lower because we need to consider the number of females/males taking the test. • SWD students’ scores decreased because the expectations and rigor are not the same for these students as for regular education students. • Eighth grade math scores increased 5% because we implemented Connected Math. Hypotheses → Actions/Strategies

  28. Analyzing Proficiency – Team Task. Why do we think these patterns occur? Pose hypotheses. Using the Georgia School Standards (GSS), pose three (no more than four) possible explanations for the data patterns you observe. Write your hypotheses on the fishbone diagram. Include the standard and component. What “curriculum” issues could contribute to your findings? What “assessment” issues could contribute to your findings? Repeat for each of the Georgia School Standards (GSS) strands, standards, and components.

  29. Task 5: Primary Issues. Student Achievement Data: • CRCT (proficiency levels, cut-off scale scores, domains, individual students) • Local assessments → Our Primary Issues in Our Practice: Data analysis is inconsistent and does not result in revised instruction → Basis for Improvement Actions/Strategies

  30. Identify “common threads” across strands • Use a group process to determine your top 3 to 4 common threads across strands that emerged from your data analysis. • Write each thread in a complete statement (primary issues). • Group hypotheses by strand/component under each column. • Prioritize primary issues.

  31. Primary Issues Team Task  Use a group process to determine your top 3 to 4 common threads across strands that emerged from your data analysis. Write the thread in a complete statement. Group GSS component statements under each column. Example: Thread: Collaborative work is inconsistent and not focused on student learning. Components: Curriculum 2.2 Collaborative planning was not consistently used for teachers to reach consensus on what all learners should know, do, and understand. 35

  32. Task 6: School/Classroom Connections – record ideas for school/classroom strategies to improve data patterns (Pink Post-its). Guiding Question: How can we connect our data patterns and our hypotheses to the classroom and to our school? Classroom strategies may include instructional methods as well as school-wide curricula and strategies.

  33. Identify “best bet” interventions • Divide SIP/BSC among design team members. • Scan actions, strategies, or interventions to find those that directly address subgroups and subjects in jeopardy. • Flag those items using sticky notes with the subgroup and subject written on the note. • Share findings as a group.

  34. Identify “best bet” interventions (Continued) • Use the Implementation Resource to identify possible interventions that directly address subgroups and subjects in jeopardy. • Flag those items using sticky notes with the subgroup and subject written on the note. • Share findings as a group.

  35. Performance/Action 1: The school has established a process to determine what all learners should know, do, and understand by the end of each grading period, at all grade levels, and within all subject areas. Artifacts: Curriculum units, curriculum maps, thematic/concept-based units, teacher meeting minutes, teacher meeting agendas, analyzed data, adjusted plans. Evidence: Teachers and other instructional leaders analyze their formative and summative assessment data and can show the areas of need for all students. Teachers can explain how their instructional plans are adjusted based upon student work. Expectations are consistent within and across grade levels.

  36. Prioritize interventions using the following criteria… • Which ones most directly address target subjects and subgroups? • Which ones will likely have greatest impact on student performance? • Which ones are within our control? • Which ones can be afforded given budget constraints?

  37. Next Steps

  38. Information to share with staffs • Roles and responsibilities of design team • New annual measurable objective (AMO) targets • Interpreting AYP reports • Subgroups and subjects in jeopardy • Root cause analysis chart • Prioritized school improvement plan actions with greatest potential impact on subgroups and subjects in jeopardy

  39. Design Team Day Two Progress Monitoring: Inspect What We Expect

  40. Driving Teaching and Learning from Good to Great. What are the district’s goals for the system of monitoring progress and supporting implementation? • Use the language of the standards • Align instruction to standards-based classrooms • Differentiate instruction • Use formative and summative assessments appropriately

  41. Driving Teaching and Learning from Good to Great. What is the current system of progress monitoring and supporting implementation? • Making the organization great requires a process for assessing implementation and for professional development that takes people from where they are to where they need to be.

  42. Driving Teaching and Learning from Good to Great. What do you see as the strengths and weaknesses of your system of monitoring teaching and learning and supporting the implementation of research-based best practices?

  43. Are we on target? Monitoring progress toward full implementation

  44. Benchmark (n): • A marked point of known or assumed elevation from which other elevations may be established • A standard by which something can be measured or judged • “His painting sets the benchmark of quality.”

  45. Implementation benchmark (n): • A description of the desired level of use against which the actual level of use can be judged • “Implementation benchmarks set concrete goals for teachers and administrators and help them determine their progress toward those goals.”
