
Tom Buffett Jeannene Hurley Melinda Mangin Marilyn Roberts Virginia Winters EPFP



Presentation Transcript


  1. Tom Buffett Jeannene Hurley Melinda Mangin Marilyn Roberts Virginia Winters EPFP May 5, 2008 Kellogg Center

  2. Agenda • Introduction & Hands-On Assessment • Historical Perspective on Assessment • Don Peurach, MSU • Assessment Survey Results • Multiple Users of Formative Assessment • Kerry Dallas, Forest Hills S.D. • Building Capacity to Use Formative Assessment • Anne Mull, WRESA • State & National Perspectives • Rubrics & Shared Understanding

  3. Big Ideas • A single data source can be used formatively and summatively • Multiple data sources are needed to understand human learning • Using assessment data to improve teaching and learning takes time, knowledge and skill

  4. Assessment: Working Definitions • Standards-based Assessment • Tells what students know and are able to do based on content standards • Characteristics • Is a process, a tool, and multi-faceted • Takes a variety of forms and serves a variety of purposes

  5. Assessment: Working Definitions • Standards-based Assessment • Characteristics (continued) • Can be used for formative or summative decision-making • Intended use should be determined and communicated beforehand • How results are used determines whether an assessment is formative or summative

  6. Assessment: Working Definitions • Formative Classroom Assessment • Formal and informal classroom assessment “for learning” • Characteristics • May take place almost anytime • May involve student-teacher dialogue, student self-assessment, observation • May be periodic (e.g., benchmark tests)

  7. Assessment: Working Definitions • Summative Classroom Assessment • Assessment “of learning” in classrooms; documents how much learning has occurred at a point in time • Characteristics • Results are used to make some kind of judgment (e.g., grades, AYP) • May be periodic (e.g., benchmark tests)

  8. Assessment: Working Definitions • Balanced Assessment System • A system using both summative and formative assessments as integral parts of information gathering. • Characteristics • Formative, interim, and summative elements • Does not depend heavily on one type of assessment

  9. Hands-On Assessment Volunteers

  10. International Clapping Institute Three Traits: • Volume • Appropriateness • Creativity

  11. Formative Assessment: In Context and in Perspective Donald J. Peurach Michigan State University

  12. Overview • Historical Context • Contemporary Context • A Metric for Comparison • Key “Take Aways”

  13. Historical Context: The “Missing Middle” • Problem: The absence of managerial capabilities in schools. • Interacting causes: • In environments. • In instruction. • In leadership. • The result: • Weaknesses in both summative and formative assessment; • Management via a “logic of confidence”.

  14. Contemporary Context: Filling the Missing Middle • Solution: Building managerial capabilities in schools. • The product of interdependent reform initiatives: • In environments. • In instruction. • In leadership. • The result: • Active management of instruction and instructional improvement; • Strengthening of both summative and formative assessment.

  15. A Metric for Comparison: Success for All • A comprehensive school reform program. • A multi-level assessment system: • Annual state assessments • Quarterly benchmark assessments • In-practice assessments • Curriculum-based assessments • Supplemental diagnostic assessments • Supports for implementation: • Technical resources • Professional development • Three-year implementation window.

  16. Formative Assessment: Key “Take Aways” • The challenge: Bucking centuries of interdependent expectations, norms, and practices. • The task: Building an entire, missing layer of organization. • The metric: Extensive designs, resources, and professional development structured over a long-term implementation window.

  17. Assessment Survey Results

  18. Assessment Survey Results • Please rate your level of knowledge related to each of the following assessment topics. • High Knowledge: Data-driven decision making 52.4% (11)*; MEAP 38.1% (8) • Moderate Knowledge: 7 topics • Limited or Non-Existent Knowledge: Balanced Assessment 66.7% (14); District Assessments 52.4% (11) • *21 Respondents
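The percentages on this slide follow directly from the raw counts: with 21 respondents, each count maps to count / 21 × 100, rounded to one decimal. A minimal sketch verifying the reported figures (counts taken from the slide):

```python
# Sanity-check the survey percentages against the reported counts.
# Counts come from the slide; 21 respondents total.
RESPONDENTS = 21

counts = {
    "Data-driven decision making (high knowledge)": 11,  # reported 52.4%
    "MEAP (high knowledge)": 8,                          # reported 38.1%
    "Balanced Assessment (limited/non-existent)": 14,    # reported 66.7%
    "District Assessments (limited/non-existent)": 11,   # reported 52.4%
}

for topic, n in counts.items():
    pct = round(100 * n / RESPONDENTS, 1)
    print(f"{topic}: {n}/{RESPONDENTS} = {pct}%")
```

Running this reproduces the slide's figures (e.g., 11/21 = 52.4%), confirming the counts and percentages are internally consistent.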

  19. Assessment Survey Results • How useful are state-level standardized assessments compared to classroom-level curriculum-based assessments? • Drawing inferences about what teachers & students do: Standardized 33.3% (7) not useful; Classroom-based 47.6% (10) useful • Drawing inferences about quality of teaching & learning: Standardized 38.1% (8) not useful; Classroom-based 38.1% (8) useful

  20. Assessment Survey Results • How useful are state-level standardized assessments compared to classroom-level curriculum-based assessments? • Informing teachers in planning their work with students: Standardized 33.3% (7) useful; Classroom-based 57.1% (12) very useful • Informing teachers & leaders in school-level planning: Standardized 38.1% (8) somewhat useful; Classroom-based 33.3% (7) very useful

  21. Assessment Survey Results • How useful are state-level standardized assessments compared to classroom-level curriculum-based assessments? • Making teachers & school leaders accountable for teaching & learning: Standardized 57.2% (12) useful/somewhat useful; Classroom-based 42.8% (9) useful/somewhat useful

  22. Assessment Survey Results Conclusions: 1. Wide range of knowledge across participants 2. Need for knowledge about: Balanced Assessment 66.7% (14) District Assessments 52.4% (11) 3. Classroom-based assessments are more helpful for understanding the kind & quality of teaching & learning. 4. Both kinds of assessments are useful for planning.

  23. Assessment Survey Results • Big Ideas: • Both kinds of assessments (state standardized & classroom-based) are necessary for assessing student learning. • How we use assessments determines whether they are formative or summative: state standardized ≠ summative; classroom-based ≠ formative

  24. Is That Your Final Answer? • Small-group activity objective: to discuss assessment purposes • Steps: 1. You will receive a card that identifies one kind of assessment tool. 2. In your small group, discuss how that tool might be used for formative or summative purposes. 3. Use chart paper to record key points of the discussion. 4. Report back your ideas to the whole group in a 90-second summary.

  25. Break ~ 10 Minutes… • When the music stops…we will begin again.

  26. Multiple Users of Formative Assessment Teachers Parents Students

  27. Teachers as users of formative assessment… • Kerry Dallas, Teacher • Forest Hills School District (Kerry’s slide presentation goes here).

  28. Emily’s Story… • When students are users of formative assessment, what impact does it have on their learning? “Assessment for learning turns day-to-day assessment into a teaching and learning process that enhances (instead of merely monitoring) student learning.” Rick Stiggins “Assessment Through the Student’s Eyes” Educational Leadership, May 2007

  29. Emily’s Writing Samples • At your tables, take a moment to discuss Emily’s writing. What observations do you have when you compare the two pieces? • Emily’s interview…Use the questions on your handout to focus your thinking.

  30. What does it look like in the classroom? Teacher’s part: • Share clear learning targets with students (presented in student-friendly language). • Provide examples of exemplary student work. • Provide students with the opportunity for frequent self-assessment. • Provide the student with descriptive feedback. • Allow students to chart their own path to mastery of learning targets.

  31. What’s the student’s role? Student’s Part • Understand what success looks like. • Use the feedback from each assessment to locate where they are in relation to where they want to be. • Determine how to do it better the next time.

  32. Parents as Users of Formative Assessment

  33. Building Teacher Capacity to Use Formative Assessment

  34. 2002-2003 • Formative and Summative Assessment (OF/FOR) • Districts formed assessment teams • District teams met for assessment literacy pre-work • Stiggins series, with others as well • ATI introduction with Keys to Quality, Target-Method-Match • Standards of Assessment from the Michigan Curriculum Framework • Data Analysis: MEAP, MI-Access

  35. 2003-2004 • District teams • Support in districts, trainer of trainer • Data Analysis and Collaborative Analysis of Student Work • Instructional Strategies that Work, including the instruction of vocabulary • Began leading principal professional development including Aspiring Principals and New Teachers • GLCEs! Target-Method-Match, Pacing, Assessment Development

  36. 2004-2005 • GLCE deconstruction and pacing in districts • Detroit Data Specialists • Professional Learning Communities • High Priority Schools Series • Formative Assessment • Quality Assessment • Collaborative Analysis of Student Work • Collaboration with Content Consultants • All Kids Achieving Collaboration with Special Education

  37. 2005-2006 • HS MASS I: HSCEs and Quality Assessment Practices • Math and science consultants • Data Representation Strand from Science ACT • Data Analysis • Common Assessment Development • ELA and Math Grades 2-7 • 24 days with teams from Wayne County districts • Collected assessments and examples of student work • Parallel work with HPS • Began MME Series on content and administration

  38. 2006-2007 • MME- content and administration • Data Analysis • Continued development and use of common assessments

  39. 2007-2008 • High Schools • Grading • Professional Learning Communities • Data Analysis: MEAP, MME • HS MASS II: Deconstructing Standards, Summative/Formative Assessment, Creation of Common Formative Assessments • Assessment of Chemistry For All • Item review, revision, and analysis of distractors • HSCEs • Trimesters • Common Assessment Development

  40. 2008-2009 • Data Analysis • Common Assessment Development as part of a Professional Learning Community • Grading • Secondary Credit Assessment System • CLASS A/DataWise Tool

  41. Formative Assessment: In Michigan and Other States Marilyn Roberts

  42. Formative Assessment: Michigan and Other States • Overview • Michigan’s Current Assessment System • Statewide Formative Assessment Initiative • Formative Assessment in Other States

  43. Formative Assessment: Michigan and Other States Michigan’s Current Assessment System

  44. Current Statewide K-12 Educational Assessments • MEAP: Michigan Educational Assessment Program • MME: Michigan Merit Examination • ACT • WorkKeys • Michigan-developed components • MI-Access • ELPA: English Language Proficiency Assessment • SCAS: Secondary Credit Assessment System

  45. MEAP & MME Overview • MME Fall retest has ceased

  46. MI-Access Overview

  47. ELPA Overview • Subjects tested (K-12) • Overall English Language Proficiency • Reading, Writing, Listening, Speaking, and Comprehension • Yearly Assessment (Full ELPA) • Once a year in the Spring • All students eligible for English Language Learner (ELL) services • ELPA Screener (Shortened ELPA) • Available any time • Online immediate scoring • Used to determine eligibility for ELL services • Use for assistance in placement decisions

  48. SCAS Proposed Development Schedule

  49. Number of Students Who Took OEAA Assessments in the 2007 Calendar Year Approximately 4,280,000 subject tests total

  50. Improved Reporting of Results • Preview of Reports on OEAA Secure Site • Provides Access to Electronic Data • Analyses of Multiple Assessments (ELPA and MEAP) • Use of UIC to analyze results for cross-test analyses • Historical Results for Individual Students • Available for District of Record • Shows Results for statewide tests taken by a student • Comparison of Results by Demographic Data • Available for MEAP • ELPA, MI-Access being added
