Marshfield Public Schools: District Determined Measures Dr. Deborah A. Brady, Ribas Associates, Inc.
Do Now* • Please create a name tag or a “name tent” with your first name and school or department. • Read the Table of Contents on page 1. • Respond to the Do Now on page 2 of your handout. *The materials are online if you want to follow along and add notes: http://tinyurl.com/l7287z9
The SCOPE of the Work: PLANNING
Today’s objectives By the end of this session, participants will: • Understand the timeline, expectations, and implications of District Determined Measures for all Marshfield educators. • Leave with a year-long plan for developing their department’s, school’s, or team’s DDMs for this year’s pilot and next year’s full implementation. • Have the tools and resources to begin developing and implementing at least one DDM.
DESE is still rolling out the evaluation process and District Determined Measures.
NEW DESE Support for Teacher Evaluation and Alignment to the Common Core • Sample DDMs in the five required pilot areas (released last Friday). • Technical Assistance and Networking sessions on September 19th across the state. • Technical Guide B (in this PowerPoint) addresses the practical application of assessment concepts to piloting potential DDMs and measuring student growth. • Model collective bargaining language will be available. • An ongoing Assessment Literacy webinar series continues. • Guidance on constructing local growth scores and growth models will be released. • Guidance on determining the Student Impact Rating will be released. (A work in progress, with changes along the way)
Support from DESE • Additional Model Curriculum Units, which include curriculum-embedded performance assessments (CEPAs). • Guidance on the use of CEPAs as part of a DDM strategy. • Professional development for evaluators on how to focus on shifts embedded in the new ELA and math Curriculum Frameworks during classroom observations. • Professional development for evaluators on how to administer and score DDMs and use them to determine high, moderate, or low growth, focused on the five required DDM pilot areas. • A Curriculum Summit in November.
DDM Impact 2014 • Take advantage of a no-stakes pilot year to try out new measures and introduce educators to this new dimension of the evaluation framework. • Districts are strongly encouraged to expand their pilots beyond the five required pilot areas. • Fold assessment literacy into the district's professional development plan to stimulate dialogue among educators about the comparative benefits of different potential DDMs the district could pilot. • Consider how contributing to the development or piloting of potential DDMs can be folded into educators' professional practice goals.
DDM Impact 2014 From the Commissioner: “Finally, let common sense prevail when considering the scope of your pilots. I recommend that, to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught. There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”
2014 Pilot Year (SY2014)
SEPTEMBER: provide DESE a tentative plan for:
• Early grade literacy (K-3)
• Early grade math (K-3)
• Middle grade math (5-8)
• High school “writing to text” (PARCC multiple texts)
• PLUS one more non-tested course, for example: Fine Arts, Music, PE/Health, Technology, Media/Library, or other non-MCAS growth courses, including grade 10 Math and ELA, and Science
DECEMBER: Implementation Extension Request Form for specific courses in the JUNE plan
BY JUNE: the PLAN for all other DDMs must be ready for implementation in year 2 (SY2015): at least one “local” (non-MCAS) measure, and two measures per educator.
The scores will not count for those who pilot DDMs in 2014.
SY 2015 • All professional personnel will be assessed with 2 DDMs, at least one of which will be locally determined; the other will be MCAS growth scores, when available: • All teachers • Guidance • Principals, Assistant Principals • Speech Therapists • School Psychologists • Nurses • EXCEPT those waivered by DESE through a case-by-case decision process. The scores will count as the first half of the “impact score,” with the waivered courses as the only exception.
SY2016 • “Impact Ratings” will be given to all licensed educational personnel and sent to DESE • Two measures for each educator: • At least one locally determined measure for everyone • Some educators will have two locally determined measures • The locally determined measure can be a standardized test such as the DRA, MAP, Galileo, etc. • The MCAS can be only one measure • Ratings use the average of two years’ scores • And the two-year trend of those scores “Impact Ratings” are based upon two years’ growth scores on two different assessments, at least one of them a non-MCAS, locally determined score.
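The exact mechanics for converting growth scores into an Impact Rating were still to come from DESE at this point. As a purely illustrative sketch, assuming educator-level median growth scores on a 1-99 scale and invented 35/65 cut points (not DESE guidance), the two-year averaging might look like this:

```python
# Illustrative only: averaging two years of educator-level median growth
# and mapping the result to a Low/Moderate/High Impact Rating.
# The 35/65 cut points are hypothetical, not DESE-issued guidance.

def impact_rating(year1_median, year2_median, low_cut=35, high_cut=65):
    """Average two years of median growth scores, then bucket them."""
    avg = (year1_median + year2_median) / 2
    if avg < low_cut:
        return "Low"
    if avg > high_cut:
        return "High"
    return "Moderate"

print(impact_rating(48, 61))  # Moderate
```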
Every educator earns two ratings
• Summative Performance Rating: Exemplary, Proficient, Needs Improvement, Unsatisfactory
• Impact Rating on Student Performance: High, Moderate, Low
*Most districts will not begin issuing Impact Ratings before the 2015-16 school year.
Student Impact Rating Determines Plan Duration for PST (not future employment) [Chart: educator plan duration by Impact Rating on Student Performance]
Acceptable (Standardized, but Still Considered District Determined) Assessments • MCAS can serve as one score (ELA, Math, Science) • One or two locally developed assessments; some educators may have three • DESE exemplars for the required piloted areas will be available in August 2013 • The MA Model Units rubrics can be used • Galileo • BERS-2 (Behavioral Rating Scales) • DRA (Reading) • Fountas and Pinnell Benchmark • DIBELS (Fluency) ??? • MCAS-Alt • MAP • AP
A Variety of Assessment Types
• On Demand (timed and standardized)
• Mid-Year and End-of-Year exams
• Projects
• Portfolios
• Capstone Courses
• Unit tests
• Other
Formats can include:
• Multiple choice
• Constructed response
• Performance (oral, written, acted out)
What kinds of assessments will work for administrators, guidance, nurses, and school psychologists? • Use school-wide growth measures • Use MCAS growth measures and extend them to all educators in a school • Use “indirect measures,” such as dropout rates, attendance, etc. • Use Student Learning Objectives (SLOs), including team-based SLOs • Or create new measures. A pre-test and a post-test are generally required to measure growth, except with normed assessments.
GROWTH SCORES for Educators Will Need to Be Tabulated for All Locally Developed Assessments [Diagram: sample MCAS scaled scores paired with Student Growth Percentiles, e.g., 244 (25 SGP), 230 (35 SGP), 225 (92 SGP)]
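For local measures, districts will have to produce growth scores themselves. Below is a minimal sketch of one way to do this, loosely mimicking how MCAS SGPs rank a student’s growth against peers; the data, function names, and method are hypothetical, not DESE-prescribed:

```python
# Hypothetical sketch: rank each student's raw growth on a local
# pre/post assessment against peers, as a rough local analogue of an SGP.
from statistics import median

def growth_percentiles(scores):
    """scores: {student_id: (pre, post)}. Returns each student's raw
    growth expressed as a percentile (0-100) of the group."""
    raw = {sid: post - pre for sid, (pre, post) in scores.items()}
    n = len(raw)
    return {sid: 100 * sum(1 for g in raw.values() if g < raw[sid]) / n
            for sid in raw}

scores = {"s01": (42, 61), "s02": (55, 70), "s03": (38, 47), "s04": (60, 64)}
pct = growth_percentiles(scores)
print(pct)                   # {'s01': 75.0, 's02': 50.0, 's03': 25.0, 's04': 0.0}
print(median(pct.values()))  # 37.5 -> an educator-level summary, like median SGP
```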
According to Technical Guide B (summarized on page 3 of the handout), focus on the following: • Is the measure aligned to content? • Is the measure informative?
The first entry point, more specifically: • Is the measure aligned to content? • Does it assess what is most important for students to learn and be able to do? • Does it assess what the educators intend to teach?
The second entry point, more specifically: • Is the measure informative? • Do the results of the measure inform educators about curriculum, instruction, and practice? • Does it provide valuable information to educators about their students? • Does it provide valuable information to schools and districts about their educators?
Five Considerations • Measure growth • Employ a common administration procedure • Use a common scoring process • Translate these assessments to an Impact Rating • Assure comparability of assessments (rigor, validity).
More specifically, what is comparability? • Comparable within a grade, subject, or course across schools within a district • Identical measures are recommended • Comparable across grade or subject level district-wide • Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor
Approaches to Measuring Student Growth • Pre-Test/Post Test • Repeated Measures • Holistic Evaluation • Post-Test Only
Pre/Post Test • Description: • The same or similar assessments administered at the beginning and at the end of the course or year • Example: Grade 10 ELA writing assessment aligned to College and Career Readiness Standards at beginning and end of year • Measuring Growth: • Difference between pre- and post-test. • Considerations: • Do all students have an equal chance of demonstrating growth?
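To make “difference between pre- and post-test” concrete, here is a small sketch; the ceiling flag speaks to the consideration above, since a student who pre-tests near the maximum has little room to demonstrate growth. The scores and the 90% cut-off are illustrative assumptions:

```python
# Illustrative pre/post growth with a ceiling-effect flag.
MAX_SCORE = 100  # assumed maximum for this hypothetical assessment

def pre_post_growth(pre, post, max_score=MAX_SCORE):
    """Return raw growth and whether the pre-test was near the ceiling."""
    near_ceiling = pre >= 0.9 * max_score  # illustrative cut-off
    return post - pre, near_ceiling

growth, flagged = pre_post_growth(pre=92, post=97)
print(growth, flagged)  # 5 True -> little room to show growth
```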
Repeated Measures • Description: • Multiple assessments given throughout the year • Example: running records, attendance, the mile run • Measuring Growth: • Graphically • Ranging from sophisticated to simple (see the sketch after the example below) • Considerations: • Less pressure on each administration • Authentic tasks
Repeated Measures Example [Chart: running record, number of errors by date of administration]
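On the simple end of that range, a least-squares slope summarizes a repeated-measures series as an average change per administration. A sketch with invented running-record data (fewer errors indicate growth, so the slope should be negative):

```python
# Illustrative: summarize repeated measures with a least-squares slope.
def growth_slope(scores):
    """scores: results in administration order. Returns the least-squares
    slope, i.e., the average change per administration."""
    n = len(scores)
    mean_x = (n - 1) / 2  # mean of administration indices 0..n-1
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

errors = [14, 11, 9, 6]      # running-record errors, four administrations
print(growth_slope(errors))  # -2.6 errors per administration
```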
Holistic • Description: • Assess growth across student work collected throughout the year • Example: Tennessee Arts Growth Measure System • Measuring Growth: • Growth rubric (see example) • Considerations: • An option for multifaceted performance assessments • Rating can be challenging and time-consuming
Holistic Example Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts
Post-Test Only • Description: • A single assessment or data that is paired with other information • Example: AP exam • Measuring Growth, where possible: • Use a baseline • Assume equal beginning • Considerations: • May be only option for some indirect measures • What is the quality of the baseline information?
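With no pre-test, growth can only be inferred against baseline information, which is why the quality of that baseline matters. A hypothetical sketch, assuming some predicted score (from prior grades or earlier tests) exists for each student:

```python
# Hypothetical post-test-only growth proxy: residuals against a baseline
# prediction. The prediction method and data are invented for illustration.
def growth_vs_baseline(results):
    """results: list of (predicted, actual) post-test scores.
    Positive residuals suggest growth beyond the baseline expectation."""
    return [actual - predicted for predicted, actual in results]

ap_results = [(3, 4), (2, 2), (4, 5)]  # (predicted AP score, actual score)
print(growth_vs_baseline(ap_results))  # [1, 0, 1]
```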
Examples • Portfolios • Measuring achievement v. growth • Unit Assessments • Looking at growth across a series • Capstone Projects • May be a very strong measure of achievement
Piloting DDMs • Piloting: • Test • Analyze • Adjust • Repeat • Being strategic and deliberate: • Collaboration • Iteration • Information
Pilot Steps: • Prepare to pilot • Build your team • Identify content to assess • Identify the measure • Aligned to content • Informative • Decide how to administer & score • Test • Administer • Score • Analyze • Adjust
Analyzing Results: Example Focus Questions • Is the measure fair to special education students? • Are the variations in scores due to the rater? • Is growth equal across the scale?
Analyzing and adjusting: Each DDM should have: • Directions for administering • Student directions • Instrument (the assessment) • Scoring method • Scoring directions
Resources • Existing • ESE Staff • Part VII of the Model System • Technical Guide A • Assessment Quality Checklist and Tracking Tool • Assessment Literacy Webinar Series • Materials from Technical Assistance sessions • Commissioner's Memorandum • Technical Guide B • What’s Coming • Exemplar DDMs (August 30th) • Other Supporting Materials
Considerations See pages 5-6 of the handout for DESE recommendations. Table or Partner Talk
Time to Consider and Begin to Plan Pages 5-6 in Handout
Can you turn this into an opportunity? Can you capitalize on present strengths or initiatives? Can you strengthen horizontal and vertical alignment? • Some options: • Writing to text 9-12? K-12? (NEASC) • Research K-12? • Specialist coordination opportunities • Support for Art, Music, PE, Health • Math: one focus K-12? (fractions, e.g.) • Are there present assessments that might be modified slightly?
ONE PLAN Consider all of the options, concerns, initiatives, and possibilities as you look at what the next step for your school and district should be. Be ready to share this very basic “first think” on DDMs. After this, you will be given tools to help you assess the quality, rigor, and alignment of your tasks and curricula.
“The task predicts performance” (Elmore) http://edworkspartners.org/expect-success/2012/09/21st-century-aligned-assessments-identify-develop-and-practice-2/ Page 2 (The DO Now) Process with a partner. Why might Elmore’s idea be germane to your planning? What can educators learn from DDMs?
Tools to Facilitate the Work Tools to assess Alignment Tools to assess Rigor Tools to assess the quality of student work
2 DESE Tools to Facilitate the Tasks
Quality Tracking Tool
• Assess the quality of your inventory of assessments
• Also use the Lexicon of Quality Tracking Tool Terms (in packet)
• On the DESE website: http://www.doe.mass.edu/edeval/ddm/
Educator Alignment Tool
• An interactive database of all educators and the possible assessments that could be used for each
• Temporarily removed from the DOE website
Tracking Tool Sample (page 11) • Checklist • Tracker
Two Essential Quality Considerations: Alignment and Rigor
• Alignment to Common Core, PARCC, and the District Curriculum
• Shifts for Common Core have been made:
• Complex texts
• Multiple texts
• Argument, Info, Narrative
• Math Practices
• Depth over breadth
More Tools to Guide the Process For Assessing Rigor and Alignment • Daggett’s Rigor/Relevance Scale • DESE’s Model Curriculum (Understanding by Design) • DESE’s Model Curriculum Rubrics (a destination) • PARCC’s Task Description • PARCC’s Rubrics for writing • Protocols for Calibration (to use with teacher groups) • Writing to Text Wikispace: http://tinyurl.com/l7287z9
Daggett’s Rigor/“Complexity” Scale: “Task Complexity Continuum” (1 = simple, 5 = complex)
1. MCAS ELA ORQ
2. MCAS Composition; MCAS Math ORQ
3. PARCC multiple texts
4-5. CC-Aligned Classrooms: Authentic Tasks