
Getting the Most Out of Assessment


Presentation Transcript


  1. Getting the Most Out of Assessment
  “You haven’t taught until they’ve learned.” – UCLA Basketball Coach John Wooden
  “So how do you know they have learned?”

  2. 2012 Legislation
  • Requires Colorado to participate as a Governing Board member in a consortium of states that focuses on the readiness of students for college and careers.
  • Requires the Board to rely upon the assessments developed by the consortium, expected to be ready for spring 2015.
  • Encourages the Board to conduct a fiscal and student achievement benefit analysis of Colorado remaining a Governing Board member, starting on or before January 1, 2014.

  3. Proposed Summative Assessment Timeline

  4. Tensions: Multiple Measures
  • validity vs. reliability
  • performance/constructed-response tasks vs. selected-response items
  • all students vs. sampling
  • content vs. skills
  • local educators vs. professional testing contractors
  • local scoring vs. outside scoring
  • student work vs. numeric scores
  • summative vs. formative
  • holistic vs. analytic
  • stand-alone vs. embedded
  • one year’s growth vs. differences in resources (instructional time, etc.)
  • mandate by edict vs. preparation through professional development
  Moving Through Tensions: re-envisioning the purpose of assessment by asking key questions that invite innovative thinking about evidence of student learning.

  5. Evidence of Student Learning
  • A Private Universe / MIT and the Light Bulb
  • Honors Student and the Light Bulb
  • What Causes the Seasons?

  6. Assessments and protocols that uncover private thinking
  • You and the Moon
  • More examples (Traffic Light, etc.)
  • Honeycomb!

  7. Tensions: Assessment Literacy
  • validity vs. reliability
  • performance/constructed-response tasks vs. selected-response items
  • all students vs. sampling
  • content vs. skills
  • local educators vs. professional testing contractors
  • local scoring vs. outside scoring
  • student work vs. numeric scores
  • summative vs. formative
  • holistic vs. analytic
  • stand-alone vs. embedded
  • one year’s growth vs. differences in resources (instructional time, etc.)
  • mandate by edict vs. preparation through professional development
  Moving Through Tensions: defining purposes for assessment across all disciplines.

  8. Formative Assessment
  Typical uses:
  • An instructional process used to inform instruction and learning during the learning process
  • Aligned to standards and focused on learning progressions
  • Intended to motivate students toward learning targets
  • Used for instructional purposes; not punitive, and not used to compare students to students, teachers to teachers, schools to schools, or districts to districts
  • Uses informal and formal instructional strategies to gather, interpret, and use information to adjust and monitor teaching and learning
  Examples: exit tickets, formative performance tasks, think-pair-share, self-assessments, response journals, observations, anecdotal records
  “Process used by both teachers and students during instruction that provides ‘in the moment’ feedback for adjusting teaching and learning. It reveals points of confusion, misunderstanding or progress toward mastery of an idea.” (CDE, 2011)

  9. Interim Assessment
  Typical uses:
  • Provides a predictive measure of postsecondary and workforce readiness
  • Provides student demonstration of current knowledge and progress toward mastery of standards
  • Informs instructional and/or programmatic adjustments
  • Results can be used in educator effectiveness evaluation
  Examples: Acuity, Galileo, NWEA, quarterly district assessments
  “Assessments typically administered every few months to fulfill one or more of the following functions: instructional (e.g., to supply teachers with student diagnostic data); evaluative (e.g., to appraise ongoing educational programs); predictive (e.g., to identify student performance on a later high-stakes test).” (CDE, 2011)

  10. Summative Assessment
  Typical uses:
  • Accountability, including school, educator, and student (e.g., graduation)
  • Certifying mastery
  • Program/curricular evaluation
  • Monitoring trends and progress
  • Knowing students’ achievement levels
  • Grades
  Examples: TCAP, NAEP, end-of-unit summative assessments, final exams
  “End of unit or end of year, comprehensive and standardized measurement of student mastery in order to assess student learning, inform taxpayers and state policy makers, support identification of successful programs, and/or serve a variety of state and federal accountability needs.” (CDE, 2011)

  11. Tensions: Diverse Stakeholders
  • validity vs. reliability
  • performance/constructed-response tasks vs. selected-response items
  • all students vs. sampling
  • content vs. skills
  • local educators vs. professional testing contractors
  • local scoring vs. outside scoring
  • student work vs. numeric scores
  • summative vs. formative
  • holistic vs. analytic
  • stand-alone vs. embedded
  • one year’s growth vs. differences in resources (instructional time, etc.)
  • mandate by edict vs. preparation through professional development
  Moving Through Tensions: commitment to collaborative approaches and multiple layers of implementation.

  12. Content Collaboratives
  P-12 educators from around the state are gathering to identify and create high-quality assessments that are aligned to the new Colorado Academic Standards and may be used in the context of Educator Effectiveness evaluations. The Content Collaboratives and CDE, along with state and national experts, will establish examples of student learning measures within each K-12 content area.

  13. Cohort I & II: Flow Chart of Work
  • Researchers gather existing fair, valid, and reliable measures for consideration (Cohort I: Jan-Mar 2012; Cohort II: Jun-Aug 2012).
  • Collaboratives use a protocol to review the researchers’ measures for feasibility, utility, and gaps; prepare to fill gaps; and provide recommendations to the Technical Steering Committee (Cohort I: Feb-May 2012; Cohort II: Jul-Nov 2012).
  • The Technical Steering Committee creates frameworks and design principles for the collaboratives to use in reviewing and creating measures, and reviews the collaboratives’ recommendations (Cohorts I & II: Feb-Dec 2012).
  • Measures are piloted and peer reviewed (Cohort I: Aug 2012-Aug 2013; Cohort II: Jan 2013-Aug 2014).
  • Measures are placed in the online Educator Effectiveness Resource Bank for voluntary use (Cohort I: Aug 2013; Cohort II: Aug 2014).

  14. High-Quality Assessment Review Tool
  A high-quality assessment should:
  • Be aligned to the content you want students to master
  • Use clear and rigorous scoring criteria
  • Be fair and unbiased
  • Provide students with an opportunity to learn

  15. How are we doing this? Who is involved?
  • Researchers
  • Content Collaborative members
  • Technical Steering Committee
  • Center for Assessment (NCIEA)
  • Pilot districts
  • Peer reviewers
  • Other states and districts

  16. Inventory of Assessments

  17. Review Progress

  18. How Colorado Will Determine Student Learning
  • Quality criteria for one measure
  • Multiple-measure design principles for combinations of measures
  • Growth measure composite

  19. Assessment in DPS (Denver Public Schools)
  • Teachers in math, reading, and writing have a variety of standardized state and/or district assessments that can contribute measures of student growth to a teacher’s evaluation.
  • In all other subjects, there are neither state nor district assessments that can be used in a similar way.
  • Nearly 70 percent of the 4,500 teachers in DPS teach something other than math, reading, or writing.

  20. Non-Tested Subjects: Assessment Development
  • Beginning in November 2011, DPS embarked on developing assessments in traditionally non-tested subjects, working collaboratively with teachers.
  • The assessments will be used to develop measures of student growth and contribute to teacher evaluations.
  • Work will continue through the summer of 2014.

  21. Current and Future Work
  Cohort 1:
  • Music
  • Visual Arts
  • Physical Education
  Work during the 2012-13 school year will include:
  • Dance
  • Drama/Theater Arts
  • Social Studies
