
Triangulating Evidence in the Assessment for Learning Context




  1. Triangulating Evidence in the Assessment for Learning Context 3rd Annual Canadian Assessment for Learning Conference Queen’s University, May 13, 2016 Damian Cooper (905) 823-6298 dcooper3@rogers.com

  2. Two Goals that Guide My Work • To do everything I can to ensure that assessment is good for ALL students • To help teachers become both more efficient and more effective in their assessment practices

  3. Time to Talk About Triangulation • What is the biggest question you have about triangulation? • Share your concerns at your table. • Which of these are shared by the majority at your table?

  4. Session Outcomes We shall: • Consider the rationale for triangulation in the assessment context • Examine what triangulation looks like according to different assessment purposes • Consider examples of assessment through observation and conversation, including the benefits and challenges of such approaches

  5. Triangulation: the what and why Definition • Triangulation involves using multiple data sources in an investigation to produce understanding. Reasons to triangulate • A single method can never adequately shed light on a phenomenon. Using multiple methods can facilitate deeper understanding. (Citation: Cohen D, Crabtree B. "Qualitative Research Guidelines Project." July 2006. http://www.qualres.org/HomeTria-3692.html)

  6. Triangulation • Navigation • Trigonometry • Social Sciences • Assessment

  7. What “Growing Success” Says about Triangulation • Teachers use a variety of assessment strategies to elicit information about student learning. • These strategies should be triangulated to include observation, student-teacher conversations, and student products. • Teachers can gather information about learning by: designing tasks that provide students with a variety of ways to demonstrate their learning; observing students as they perform tasks; posing questions to help students make their thinking explicit; engineering classroom and small-group conversations that encourage students to articulate what they are thinking and further develop their thinking. • Teachers then use the information gathered to adjust instruction and provide feedback. – Growing Success

  8. Triangulation of Evidence by Assessment Mode • Do (performance tasks) → observations • Write (tests, essays, projects, etc.) → products • Say (oral defense, conferencing, conversations, etc.) Together, these three modes yield a valid and reliable picture of student achievement.

  9. Triangulation of Assessment Evidence • Why is it necessary to triangulate assessment evidence? • To improve reliability – at least 3 “scores” • To improve validity – write, do, say • To differentiate assessment on the basis of student exceptionality – think Jack

  10. “Backward Design” Program Planning Stage 1: Identify essential learnings Stage 2: Determine critical evidence (summative) of those learnings Stage 3: Plan learning experiences and instruction that make such learnings possible Adapted from Wiggins and McTighe, Understanding by Design

  11. 3 Critical Questions to Ask Before Assessment Occurs • What is the purpose of this assessment? • Diagnostic? • Formative? • Summative? • Who is the primary user of the data? • Teacher? • Student? • Parent? Next grade teacher? Employer? • University/College entrance board? • What kind of data does the user need? • Feedback? • Score? • Percentage grade? • Anecdotal evidence? • Portfolio?

  12. Assessment Purposes Assessment for/as Learning • Diagnostic Assessment (tryouts) • Purpose: Helps teacher make instructional decisions • Formative Assessment (practice) • Purpose: Helps student improve learning as well as quality of work produced Assessment of Learning • Summative Assessment (games) • Purpose: Accreditation of learning for each unit • Summative Assessment (playoffs) • Purpose: Accreditation of learning for course

  13. Plan Backward from What’s Essential… Curricular priorities: • Worth being familiar with • Important to know and do • Enduring understandings / essential skills Assessment Types: • Traditional quizzes & tests (paper/pencil) • Performance tasks & projects (open-ended, complex, authentic) • Oral assessments (conferences, interviews, oral questioning) Adapted from Wiggins and McTighe, Understanding by Design

  14. Stage 1: Identify Targeted Understandings Ontario Ministry of Education, The Ontario Curriculum, Grades 1–8: Science and Technology (2007)

  15. Stage 2: Determine Appropriate Assessment of Those Understandings and Skills Jack would be all over most of these!!

  16. Assessment Reliability – simplified! • Reliability answers the question, “How confident am I in the results from this assessment?” • One way to increase reliability is to “standardize” the conditions in which the assessment occurs, e.g. instructions, time, and scoring when administering a test • Another way to increase reliability is to gather multiple pieces of data about a given learning target • This can be done by triangulating in terms of frequency

  17. Triangulation of Assessment Evidence • What does it look like? • Triangulation by frequency: • Lab Report #1: Level 4 • Lab Report #2: Level 2 • Overall level? To determine the “most consistent level” we need another piece of evidence • Lab Report #3: Level 2
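The decision rule sketched on this slide — take the level that recurs across at least three pieces of evidence, and gather more evidence if none recurs — can be illustrated in a few lines of code. This is my own sketch, not part of the presentation; the function name `most_consistent_level` is hypothetical.

```python
from collections import Counter

def most_consistent_level(levels):
    """Return the achievement level that recurs most often across the
    pieces of evidence, or None when no level recurs, signalling that
    another piece of evidence is needed before deciding."""
    level, freq = Counter(levels).most_common(1)[0]
    return level if freq > 1 else None

# The slide's example: three lab reports at Levels 4, 2, and 2
print(most_consistent_level([4, 2, 2]))  # prints 2, the most consistent level
```

With only the first two reports (`[4, 2]`), the sketch returns `None`, matching the slide's point that a third piece of evidence is required.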

  18. Reliability and Assessment for Learning • Assessment for learning must, by definition, be responsive to different students’ needs i.e. differentiated. • This will reduce the reliability of the assessment …and that’s just fine!

  19. Assessment Validity – simplified! • Assessment validity addresses the question, “Will this assessment task actually provide evidence of the intended learning outcome(s), i.e. the curriculum expectations?” • We may improve validity by matching the assessment to the learning outcome • This will often lead us to design authentic performance tasks. • But designing authentic tasks can cause problems with reliability.

  20. Authentic Tasks and Reliability • When students work in groups, the reliability of individual student assessment data will naturally be compromised • To counter this, you must plan for individual accountability: • Independent student report • Individual student conferencing

  21. Triangulation of Assessment Evidence • What does it look like? • Triangulation by mode: • Lab Report Mark: Level 4 • Test Mark: Level 2 • Overall level? To determine the “most consistent level” we need another piece of evidence • Observation of Lab Performance: Level 2

  22. Triangulation of Evidence by Assessment Mode • Do (performance tasks) → observations • Write (tests, essays, projects, etc.) → products • Say (oral defense, conferencing, conversations, etc.) Together, these three modes yield a valid and reliable picture of student achievement.

  23. Teach for Understanding and Assess for Transfer • When students are able to demonstrate a skill on cue, in a controlled situation, they have knowledge… BUT • When students independently make appropriate decisions by drawing on their knowledge and skills in response to new situations, they understand

  24. Authentic Performance Tasks • Are designed to provide evidence of learning of one or more overall expectations • Require transfer or application of knowledge & skills • Are relevant, meaningful, and therefore engaging for students • Are challenging • Often involve simulation or role-play • Are accompanied by clear assessment criteria: a rubric and/or checklist • Include realistic constraints: time, resources, etc. • May require collaboration, and therefore include individual accountability

  25. Task Matched to Overall Expectations: Danny’s Grade 9 Applied Math Class

  26. One of Danny’s Grade 9 Applied Math Performance Tasks The Outdoor Living Space The Problem: Mr. Ciarmoli has decided to build a pergola in his yard in order to have an outdoor living space for special events. He wants the area of the floor to be 20 m². Mr. Ciarmoli has decided to go with interlocking brick for the floor. Since interlocking bricks come in different styles and are expensive, he wants to minimize the cost of purchasing bricks. The Storage Shed The Problem: Mr. Ciarmoli has decided to build a storage shed. He has one section of his yard where he can build a storage shed with an area of 24 ft². Since lumber is expensive, he wants to minimize the cost of the lumber needed to build the walls.
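Both tasks reduce to the same idea: for a fixed floor area, minimizing material cost means minimizing the perimeter. For the shed, a brute-force check of the options (my own illustration, not from the presentation; it assumes a rectangular floor with whole-number dimensions in feet) shows how students might verify their answer:

```python
def best_whole_number_walls(area):
    """Among rectangles of the given whole-number area, return the
    (width, length, perimeter) triple with the smallest perimeter."""
    best = None
    for w in range(1, area + 1):
        if area % w == 0:           # only whole-number dimensions
            l = area // w
            perimeter = 2 * (w + l)
            if best is None or perimeter < best[2]:
                best = (w, l, perimeter)
    return best

print(best_whole_number_walls(24))  # prints (4, 6, 20): 4 ft x 6 ft needs the least wall
```

The candidates for 24 ft² are 1×24, 2×12, 3×8, and 4×6, with perimeters 50, 28, 22, and 20 ft; the dimensions closest to a square win, which is the insight the pergola task also targets.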

  27. Matching Assessment Tasks to Curriculum Expectations Grade 11 English, University Prep

  28. Matching Assessment Tasks to Curricular Competencies Macbeth: Essay Topics 1) The supernatural plays an important role in Macbeth. To what extent does it motivate Macbeth's actions? 2) Discuss King Duncan and examine what contribution he makes to the play. 3) In constructing Macbeth, Shakespeare dramatically altered historical characters to enhance certain themes. Examine Shakespeare's sources and discuss why he made these radical changes. 4) Is Lady Macbeth more responsible than Macbeth for the murder of King Duncan? Is Lady Macbeth a more evil character than her husband and, if so, why?

  29. Matching Assessment Tasks to Curricular Competencies • Could a TED Talk possibly provide evidence of those curriculum expectations?

  30. Triangulation of Evidence by Assessment Mode • Do (performance tasks) → observations • Write (tests, essays, projects, etc.) → products • Say (oral defense, conferencing, conversations, etc.) Together, these three modes yield a valid and reliable picture of student achievement.

  31. Listening to Conversation to Assess Mathematical Understanding

  32. Time to Talk About Feedback • Working independently, list the features of the feedback provided in this video clip • Share your list with a colleague

  33. Characteristics of Effective Feedback • Needs to cause thinking: questions are preferable to statements • Must not be evaluative • Must direct students towards improvement • Must make reference to specific quality indicators (a rubric or checklist) • Must include an expectation that it will be implemented • Must include strategies for checking that it has been implemented

  34. Planning for Conversations • Decide on purpose: diagnostic, formative, or summative • Create a schedule: triangulation by frequency • Ask questions rather than make statements • Focus conversation on specific work • Have a reliable tool to capture the information: checklist, tablet, camera

  35. Applying My Learning • Discuss with a colleague the viability of using Stephanie’s approach (i.e. summative assessment through conversation) in your own classroom context.

  36. Effective Assessment Planning: Full Year/Course or Per Unit • Design summative assessment tasks first. These must provide evidence of one or more of the essential learning targets • Review to determine whether this set of summative tasks includes an appropriate balance of write, do, and say • Design smaller formative tasks to build understanding and skills towards the summative tasks • Design initial/diagnostic assessment tasks to determine students’ levels of understanding and skill before you begin teaching

  37. Dare to Dream! It’s June, 2019 – the end of your 3-Year Assessment Improvement Plan. You are conducting a Walkabout on your campus. Use your creativity to communicate to your colleagues what you see and hear during your Walkabout.

  38. A New Reporting Paradigm

  39. Professional Judgment Decisions educators make, in light of their professional experience, with reference to public standards and guidelines (Cooper, Redefining Fair)
