PVAAS and Grade-Level Planning
PVAAS Statewide Core Team, Fall 2008
Context: 3-Phase Data-Informed Inquiry Cycle
Data → Analysis/Discovery → Solutions
Unit of Analysis: The Grade/Course
Purpose of Analysis:
• Plan to Improve Student Achievement and At Least Meet AYP Targets by Grade Level
• Analyze Grade-Level Results
  • Not the student level; that is a different meeting purpose
• Grade-Level Goal Setting
• Ensure Instructional Program Coherence/Alignment
  • Curriculum
  • Instruction
  • Assessment
  • Supports (infrastructure, including scheduling, staffing, etc.)
• Monitor Implementation and Effectiveness of the Grade-Level Program
School Structures for Data-Informed Decision Making
District-Level Support (Budgetary Support, Professional Development, Resources, and Time)
Data Types: Demographic | Perceptual | Process Data; Student Learning Data

Annual Building-Wide Planning Process
• Focus: All Students
• Who: School-Wide Team
• How: PDE Getting Results, Data Retreat, School/Continuous Planning Process
• Demographic/Perceptual/Process Data (Building Level): School Demographic Data, PennData, Discipline Data, Attendance Data, Mobility Rate Data, Parent Surveys
• Student Learning Data (Building Level): PSSA & PVAAS, Final 4Sight Benchmark Test, Standardized Assessments, District End-of-Year Tests, EAP/Tutoring Assessments

Periodic Grade-Level Planning Process
• Focus: Groups of Students
• Who: Teacher Teams
• How: Regular 1-2 Hour Meetings
• Demographic/Perceptual/Process Data (Grade/Course Level): Class Demographic Data, Class Engagement Data, Satisfaction Data, Attendance Data, Walk-Through Data
• Student Learning Data (Grade/Course Level): Initial: PSSA/PVAAS/Final Tests at Class/Subgroup Levels; Cyclical: 4Sight Benchmark Data (Grade Level), District Quarterly Assessments, Common Classroom Data, Classroom Summaries, EAP/Tutoring Assessments

Student-Planning Process
• Focus: Classroom of Students
• Who: Teacher
• Demographic/Perceptual/Process Data (Classroom Level): Qualitative Data, Student Historical Information, Student Medical Information, Student Learning Information
• Student Learning Data (Classroom Level): Initial: PSSA/PVAAS/Final Tests, Student-Level Achievement and Growth Data; Cyclical: 4Sight Benchmark Data (Student Level); Continuous: Individual Classroom Assessments, EAP/Tutoring Assessments, Progress Monitoring
Periodic Grade-Level Planning Meetings
Unit of Analysis: Grade Level/Course
Post the Data Deliveries
[Timeline graphic, June through the following June (June, Sept, Jan, Mar, June): deliveries of PSSA & PVAAS reports, baseline and final DIBELS data, baseline and final benchmark data, and local assessment results across the school year.]
Post the Data Deliveries
[Timeline graphic, June through the following June (June, Sept, Jan, Mar, June): deliveries of PSSA & PVAAS reports, baseline and final benchmark data, midterm and final exams, and local assessment results across the school year.]
Considerations
• Should you schedule an assessment without also scheduling the data meeting to interpret the results and make instructional decisions?
• Does the number of data meetings in your school depend on the frequency and clustering of assessment data report deliveries?
What Does Grade-Level Planning Look Like Now in Your School?
• Does it occur monthly? Weekly?
• Who is at the table?
  • Principals
  • Teachers (regular educators, special educators)
  • Others
• Is there an established protocol (e.g., Mike Schmoker's protocol)?
• Who leads the process?
• What data are used?
• Does the process result in an action plan?
• Is the action plan followed?
• Is the process monitored to ensure strategic use of time?
Small Group Discussion
PVAAS Performance Diagnostic Report: The Key Is the RED Whisker!
What the whiskers tell us…
• Green: Exceeded the Growth Standard; More than One Year's Growth
• Yellow: Met the Growth Standard; Made One Year's Growth
• Red whisker (line): The Growth Standard (One Year's Growth)
• Rose: Below the Growth Standard; Less than One Year's Growth
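To make that color rule concrete for discussion, here is a minimal illustrative sketch in Python. It assumes a simplified rule in which a group's gain is compared to the growth standard (one year's growth, treated as zero) using its standard error; the function name, the cut rule, and the example values are hypothetical and are not the official PVAAS business rules.

```python
# Minimal illustrative sketch, not the official PVAAS rules: classify a
# performance-level group by comparing its gain to the growth standard
# (one year's growth, treated here as 0.0) using its standard error.

def classify_growth(gain: float, std_error: float, growth_standard: float = 0.0) -> str:
    """Return an illustrative color category for a performance-level group."""
    if gain - std_error > growth_standard:
        return "Green: exceeded the growth standard (more than one year's growth)"
    if gain + std_error < growth_standard:
        return "Rose: below the growth standard (less than one year's growth)"
    return "Yellow: met the growth standard (about one year's growth)"

# Hypothetical example values, not real report data:
print(classify_growth(gain=3.2, std_error=1.1))   # Green
print(classify_growth(gain=0.4, std_error=1.5))   # Yellow
print(classify_growth(gain=-2.8, std_error=1.0))  # Rose
```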
Performance Diagnostic Report
Has every performance level (below basic, basic, proficient, advanced) in each grade met or exceeded a year's worth of growth? ("Getting Results!"™, p. 12)
Check for Understanding: Performance Diagnostic Report
• What do the Gain and Standard Error values mean?
• What do the colors mean on the pie chart version of this report?
• What does the red whisker, or red line, mean? How does it help in the interpretation of this report?
• What do the blue bars represent versus the gold bars?
Patterns: Performance Diagnostic Report
Has every performance level (below basic, basic, proficient, advanced) in each grade met or exceeded a year's worth of growth? ("Getting Results!"™, p. 12)
Patterns of Growth
[Charts A, B, and C: example growth patterns.]
Line of Inquiry: Questions for Grade-Level Teams
• Where are we doing well? Which groups are showing a significant positive gain?
• Where are we not doing as well? Which groups are showing a negative gain?
• What patterns are you seeing across the groups?
• Is this consistent with previous reports?
• What is the impact of this growth on achievement?
Hands-On with Performance Diagnostic Reports
• Review by Grade Level
• Review by Subject
• Be Sure You Know How to Locate Subgroup Reports as Well
• Discuss Patterns and the Meaning of Those Patterns
PVAAS HELP Menus: Great Resource!
• The HELP Menu for the Performance Diagnostic Report has been greatly enhanced
• All HELP Menus will be organized into the following areas:
  • Navigating
  • Understanding
  • Interpreting
  • Using
  • Assistance
• HELP menus for the other reports will be developed throughout the school year
Custom Diagnostic Report
• What is it?
  • A procedure to examine growth patterns based on user-defined educational criteria for groups of 15 or more students.
• How might a grade-level team use this report?
  • Explore the effects of intervention programs.
  • Explore the effects of varied curricular and/or instructional experiences.
  • Compare growth patterns, e.g., Group A (15+ students) receiving Intervention #1 versus Group B (15+ students) receiving Intervention #2 (see the sketch below).
• Cautions
  • This report does not establish any causal relationship between educational variables and student growth.
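As a rough illustration of that kind of comparison, the following Python sketch tallies average growth for two user-defined groups and enforces a 15-student minimum. The group names, gain values, and summary logic are hypothetical; this is not the PVAAS methodology, and no causal claim follows from the comparison.

```python
# Illustrative sketch only: compare average growth for two user-defined
# groups, in the spirit of a custom diagnostic comparison.

from statistics import mean

MIN_GROUP_SIZE = 15  # custom diagnostic reporting requires 15 or more students


def summarize_group(label, gains):
    """Print a simple growth summary for one group, if it is large enough."""
    if len(gains) < MIN_GROUP_SIZE:
        print(f"{label}: fewer than {MIN_GROUP_SIZE} students; no report generated")
        return
    print(f"{label}: n={len(gains)}, mean gain={mean(gains):.2f}")


# Hypothetical gain scores for two intervention groups (not real data):
group_a = [1.8, 2.4, -0.5, 3.1, 0.9, 1.2, 2.0, 1.7, 0.3, 2.6, 1.1, 0.8, 2.2, 1.9, 1.4]
group_b = [0.2, -1.1, 0.8, 1.0, -0.4, 0.6, 0.1, 1.3, -0.2, 0.9, 0.5, -0.7, 0.4, 0.0, 0.3]

summarize_group("Group A (Intervention #1)", group_a)
summarize_group("Group B (Intervention #2)", group_b)
```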
Projection Summary Reports
• What is it?
  • A report that summarizes the numbers and percentages of students in likelihood ranges of performing at the proficient level on a future PSSA exam.
• How might a grade-level team use this report?
  • Intervention planning
  • Resource allocation
  • Strategic planning
  • School improvement planning
• Cautions
  • This report is one indicator of the likelihood of future performance and should not be used in isolation.
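To show the kind of tally such a summary provides, here is a short Python sketch that groups students into likelihood ranges and reports counts and percentages. The cut points (40% and 70%), the band labels, and the projected probabilities are hypothetical, not PVAAS business rules or real report data.

```python
# Illustrative sketch only: tally students into likelihood ranges for
# reaching proficiency on a future PSSA exam, in the spirit of a
# Projection Summary Report.

from collections import Counter


def likelihood_band(prob_proficient):
    """Map a projected probability of proficiency to a hypothetical band."""
    if prob_proficient >= 0.70:
        return "High likelihood of proficiency"
    if prob_proficient >= 0.40:
        return "Moderate likelihood of proficiency"
    return "Low likelihood of proficiency"


# Hypothetical projected probabilities for one grade level (not real data):
projections = [0.85, 0.62, 0.33, 0.91, 0.48, 0.22, 0.74, 0.55, 0.18, 0.69]

counts = Counter(likelihood_band(p) for p in projections)
total = len(projections)
for band, n in counts.items():
    print(f"{band}: {n} students ({100 * n / total:.0f}%)")
```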
Grade Projection Summary Report
Is our grade level on a trajectory to meet the AYP targets?
Which Grade Projection Reports Can You See?
• Those of the cohorts last tested in your building, unless you have an account with district-wide access
• There may be a need to send grade projection summaries from the "feeder" building(s) to the "receiver" building
• Alternatively, give receiving buildings access to the feeder buildings' school-level data
Hands-On with Grade Projection Summaries
• Review by Grade Level
• Review by Subject
• Discuss Meaning as It Relates to AYP Targets
• Discuss Meaning as It Relates to Other Building/District Goals
• Discuss When These Would Be Used by Grade-Level Teams
Additional Professional Development
• Each IU has a PVAAS contact
• Fall 2008 Professional Development
  • 21 statewide webinars for districts (dates on the PDE website)
  • 42+ days of hands-on professional development offered at IUs (dates on the PDE website)
  • Contact your IU for registration/details
  • Follow-up support
• Professional Development Materials
  • PDE website
Statewide PVAAS Professional Development, Support & Materials
• Contact Information; Account Management Questions: pdepvaas@iu13.org, 717-606-1911
• PVAAS Website: https://pvaas.sas.com
www.pde.state.pa.us
Gerald L. Zahorchak, D.Ed., Secretary of Education, Commonwealth of Pennsylvania