
Improving the Quality of Locally-Developed Assessments, MSTC, February 18, 2016


Presentation Transcript


  1. Improving the Quality of Locally-Developed Assessments, MSTC, February 18, 2016. Dr. Ellen Vorenkamp, Wayne RESA, MAC Board Member

  2. Who’s in the room? • Who am I? • Who are you?

  3. Learning Outcomes • Participants will… • recognize the need for ensuring quality classroom assessments • identify and distinguish the elements of quality assessment • understand a process for ensuring quality assessments • recognize appropriate resources to aid in the development/modification of locally-developed assessments

  4. Setting the stage… • Table activity • Protocol: Chalk Talk • In the center of the chart paper, write "Quality Assessments" • Without comment: What comes to mind when you think of quality assessments? • Debrief/Categorize

  5. Setting the Stage We need to do something different!

  6. Our Charge… We have reached a tipping point: We either change our assessment beliefs and act accordingly, or we doom struggling learners to inevitable failure. ---Rick Stiggins

  7. The How • 5 Keys to Quality Assessment Carousel

  8. What is the purpose of our classroom assessments? Do we share the results with the correct users? How can we improve in this area?

  9. Do students and teachers have clarity around what we need students to know and be able to do and what proficiency on it looks like? How can we make learning more transparent?

  10. How do we ensure our assessments are of high quality? What more can we do to ensure sound design?

  11. How is information from our assessments communicated to the correct users? How can we ensure better communication with our learning partners?

  12. In what ways are students involved in the assessment process? What more can be done to make them partners in the assessment process?

  13. Keys to Quality Assessments
  • Accurate Assessment
    • Clear Targets (Assess what?): What are the learning targets? Are they clear? Are they good?
    • Clear Purposes (Why assess?): What’s the purpose? Who will use results?
    • Good Design (Assess how?): What method? Sampled how? Avoid bias how?
  • Effectively Used
    • Sound Communication (Communicate how?): How manage information? How report?
  • Student Involvement: Students are users, too. Students need to understand learning targets, too. Students can participate in the assessment process, too. Students can track progress and communicate, too.

  14. Key 1: Clear Purpose • Evidence gathered from assessments informs instructional decisions • Some decisions occur frequently: as learning progresses, teachers and students need to know what comes next in student learning or determine what is blocking student learning • Some decisions are made periodically: when we certify learning for purposes of report card grades or identify students for special services • Some decisions are made less frequently: such as when districts need to make adjustments in instructional programs or resources

  15. Clear Purpose • As a result, it becomes apparent that different assessments serve a variety of users and uses, center on student achievement at varying levels, and require different assessment information gathered at different times

  16. Clear Purpose • Student Bill of Rights • By Rick Stiggins, excerpted from "Improve assessment literacy outside of school too," Phi Delta Kappan, October 2014 • Students are entitled to know the purpose of each assessment in which they participate; that is, they have a right to know specifically how the results will be used and by whom. • Students are entitled to know and understand the learning target(s) to be reflected in the exercises and scoring guides that make up any and all assessments. • Students are entitled to understand the differences between good and poor performance on a pending assessment and to learn to self-assess their progress toward mastery. • Students are entitled to dependable assessment of their achievement gathered using quality assessments.

  17. Clear Purpose • As you develop quality assessments, keep the following questions in mind… • What is the purpose of the assessment? • Who will use the information? • How will it be used? • Is the use formative or summative?

  18. A Formative View • As you develop quality assessments, keep the following questions in mind… • What is the purpose of the assessment? • To provide teachers immediate information on student learning • Who will use the information? • Teachers and students in the classroom • How will it be used? • To inform next steps in the learning process • Is the use formative or summative? • Formative

  19. A Summative View • As you develop quality assessments, keep the following questions in mind… • What is the purpose of the assessment? • Educator Evaluation / Accountability • Who will use the information? • Teachers and Administrators • How will it be used? • To certify the learning process • Is the use formative or summative? • Summative

  20. Assessment: Comparing Purposes
  Assessment of learning… (summative)
  • strives to document student achievement.
  • diagnoses a program’s strengths and weaknesses by providing comparable results.
  • provides summative results at the end of a unit or course of study.
  • informs others (teachers, parents, administrators, community members) about students and their achievement.
  • assumes the teacher’s role is to gauge student success.
  • reflects the standards themselves.
  Assessment for learning… (formative)
  • strives to increase student achievement.
  • diagnoses a student’s strengths and weaknesses by providing results that are unique to individual students.
  • provides data throughout a unit or course of study that allows tailoring instruction and motivation for improvement.
  • informs students about themselves and helps them learn how to take charge of their own progress.
  • assumes the teacher’s role is to promote student success.
  • reflects the knowledge, skills, and understandings that underpin standards.

  21. Assessment: Comparing Purposes

  22. Meta Moment • How confident are you that assessments given in classrooms in your school/district have a clear purpose? • How is that purpose communicated to the students and other interested parties? • Is the data from the assessments used for the intended purpose?

  23. Key 2: Clear Learning Targets • We must have a clear sense of the achievement expectations we need our students to master • Expectations must be worded in student-friendly language and be shared with students • We need to ensure the targets are tightly aligned with the standards and are taught/learned at the appropriate cognitive demand

  24. Clear Learning Targets • Research…John Hattie • High Leverage Strategies Activity • What did we find? • Impact of our findings?

  25. High Leverage Strategies (Hattie effect sizes)
  • 1.44 Student Expectations
  • .90 Formative Evaluation for Educators
  • .90 Teacher Credibility in the Eyes of the Students
  • .82 Classroom Discussion
  • .75 Feedback
  • .75 Teacher Clarity
  • .74 Reciprocal Teaching
  • .72 Teacher-Student Relationships
  • .69 Metacognitive Strategies
  • .67 Vocabulary Programs
  • .64 Self-Verbalization/Self-Questioning
  • .61 Problem-Solving Teaching

  26. Clear Learning Targets • “Making the learning intentions and success criteria transparent, having high, but appropriate, expectations, and providing feedback at the appropriate levels is critical to building confidence in taking on challenging tasks.” • John Hattie, Visible Learning for Teachers: Maximizing Impact on Learning, 2012

  27. Clear Learning Targets • Quality Formative and Summative Assessments • Learning Targets, Connie Moss and Susan Brookhart

  28. Learning Targets… • A target becomes a learning target only when students use it to aim for understanding throughout a particular lesson, and students aim for a target only when they know what it is. • Learning targets must be shared and actively used by both halves of the classroom learning team—the teacher and the students.

  29. Learning Targets • Clear Learning Targets shift us away from what we, as teachers, are covering towards what our students are learning.

  30. Success Criteria? Success criteria are statements that specifically outline what the students need to do in order to achieve their learning target. “…success criteria summarize the key steps or ingredients the student needs in order to fulfill the learning intention (target) – the main things to do, include or focus on.” -Shirley Clarke (Ireland School Board)

  31. Success Criteria • Setting criteria with students is a powerful tool that helps students talk through and think about what mastery of the learning target looks like. • Success criteria describe what it means to do quality work and be proficient on “today’s” learning target. • Criteria are often displayed in well-developed rubrics. These can be developed for or with students.

  32. Effective Success Criteria… • Are linked to the learning target • Are discussed and agreed upon with students prior to undertaking the activity • Are linked to what students learned yesterday and what they will learn tomorrow • Are used to determine learning progression • Are used to activate students’ prior knowledge • Provide a scaffold and focus for students while engaged in the activity • Are used as the basis for feedback and peer/self-assessment • Are used to determine mastery of the learning target through assessment, both formative and summative

  33. Meta Moment • Does the assessment author have a clear idea of what she or he is trying to measure? • Do all teachers agree on the evidence to be collected? • Are the learning targets aligned with and derived from the curriculum standards? • Are there clear, shared criteria for success? • Are the learning targets shared in student-friendly language and used throughout the lesson? • Do the learning targets create a clear, reasonable learning progression? • Do the learning targets represent the emphasis of what was taught at the appropriate cognitive demand?

  34. Key 3: Sound Design • Items must be well written and developed using sound item-development guidelines. • Items should be field-tested to determine their reliability. • Items should be arranged appropriately within the test construct and hold true to the intent of the blueprint. • The test should be field-tested • Data reviewed • Revised and modified as necessary

  35. Sound Design • Assessments must be accurate • Accuracy requires selecting the appropriate assessment method for the context of what is being assessed • Once the method has been selected, we must ensure the assessment items, tasks, and questions are of high quality • Finally, we must anticipate what could go wrong with the assessment and prevent those issues if possible

  36. Sound Design • Quality Assessments…are Reliable and yield Valid data. • In order for these two requirements to be met, assessment developers must pay special attention to the following: • Standard/Item Alignment • Balance of Representation • Target-Method Match • Quality Items • The best way to ensure your assessment is reliable and valid is to create a test blueprint and follow the blueprint while developing the assessment (a rough sketch follows).
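The presentation leaves the blueprint at the conceptual level. A minimal sketch of one way to think about it: a simple tally of planned items against the intended balance of representation, flagging drift before item writing goes further. The standards, DOK levels, and point values below are hypothetical placeholders, not part of the presentation.

```python
# Sketch of a test-blueprint check (hypothetical standards, DOK levels, points).
from collections import defaultdict

# Each planned item: (standard, DOK level, point value)
planned_items = [
    ("NBT.1", 1, 1), ("NBT.1", 2, 1),
    ("NBT.2", 2, 2), ("NBT.2", 3, 2),
    ("OA.3", 1, 1), ("OA.3", 2, 1), ("OA.3", 3, 3),
]

# Blueprint: how many points each standard should carry on the finished test
blueprint_points = {"NBT.1": 2, "NBT.2": 4, "OA.3": 6}

actual_points = defaultdict(int)   # points planned per standard
dok_counts = defaultdict(int)      # items planned per DOK level
for standard, dok, points in planned_items:
    actual_points[standard] += points
    dok_counts[dok] += 1

# Compare the plan to the blueprint and report any imbalance
for standard, target in blueprint_points.items():
    diff = actual_points[standard] - target
    status = "balanced" if diff == 0 else f"off by {diff:+d} points"
    print(f"{standard}: planned {actual_points[standard]} of {target} points ({status})")

print("Items per DOK level:", dict(sorted(dok_counts.items())))
```

Running the sketch reports each standard's planned points against the blueprint and the spread of items across DOK levels, which is one way to keep standard/item alignment, balance of representation, and cognitive demand visible while the test is being built.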

  37. What’s Your DOK?

  38. What’s Your DOK?

  39. Methods of Assessment
  • Selected Response: Multiple Choice • True-False • Matching • Fill-in-the-blank (words, phrases)
  • Constructed Response: Essay • Short answer (sentences, paragraphs) • Diagram • Web • Concept Map • Flowchart • Graph • Table • Matrix • Illustration
  • Performance Assessment: Presentation • Movement • Science lab • Athletic skill • Dramatization • Enactment • Project • Debate • Model • Exhibition • Recital • Performance Task
  • Observations/Conversations: Oral questioning • Observation • Interview • Conference • Process description • Checklist • Rating scale • Journal sharing • Thinking aloud a process • Student self-assessment • Peer review
  Adapted from the work of Dr. Robert Marzano

  40. Target-Method Match Activity How well does your method of assessment match your target?

  41. Target-Method Match How well does your method of assessment match your target?

  42. Sound Design • Methods of Assessment • Written Response • Selected Response • Short Written Response • Extended Response/Essay • Performance Tasks • Observation/Conversation • Collection of existing work in portfolios • Each has advantages and disadvantages

  43. Sound Design

  44. Performance Tasks • Give knowledge of methods, procedures, and analysis of skills related to student learning. • Performance assessments, in concert with more traditional forms of assessment, can give us a more complete picture of student achievement. • Provide better insight into a student’s level of conceptual and procedural knowledge.

  45. Performance Tasks • Demonstration of skills through process or product • Designed to judge students’ abilities to USE knowledge and skills • Performance matches the depth of knowledge of the skills and abilities being measured • Usually requires students to manipulate equipment, solve a problem, or make an analysis • Scoring is based on observation and judgment

  46. Meta Moment • Is the assessment method most appropriate for the learning target being assessed? • Are the assessment items and/or tasks assessing at an appropriate depth of knowledge? • Are the assessment questions and/or tasks well written? • In the case of a constructed-response item and/or a performance task, does the scoring guide or rubric clearly indicate the criteria on which the student performance is judged, a scale that identifies levels of performance, and a detailed description of student performance at varying levels of quality? • Are the items free from bias? • Have the items been field-tested?

  47. Key 4: Effective Communication • Assessment results need to be communicated effectively to the appropriate intended user • Everyone must understand the meaning of the achievement target • Information in the communication must be accurate • Everyone must understand the symbols being used to report information • Communication must be tailored to the intended audience (level of detail, timing, format)

  48. Israel Study • Three kinds of feedback… • Scores • Comments • Both Scores and Comments Results?

  49. Results
  • Scores: 0% gain
  • Comments: 30% gain
  • Both Scores and Comments: 0% gain

  50. Elements of Feedback • Timing • Amount • Mode • Audience • Individual • Group
