
Formative Feedback for Teachers


Presentation Transcript


  1. Formative Feedback for Teachers

  2. Agenda • Increasing reliability; decreasing bias • Documenting objective evidence • Providing effective feedback • Formative decisions/feedback

  3. Increasing Reliability; Decreasing Bias

  4. Increasing Reliability; Decreasing Bias • What issues exist with reliability in teacher evaluation? • How do we increase reliability? • How do we decrease bias?

  5. What is reliability in terms of evaluation?

  6. Do you see what I see? Is this teacher… 1. Helping students understand a difficult problem through the research-based practice of direct instruction? OR 2. Using the outmoded technique of “stand and deliver” rather than engaging students in hands-on problem solving?

  7. What is a “Red Flag” in Teaching? The important thing to remember is that a “red flag” is an indication that something might not be working well… and a cue to collect more evidence.

  8. Why is reliability important?

  9. Issues with Reliability in Observation (The MET Project, 2013) • Variation from lesson to lesson for a given teacher • Ratings from the same evaluator of the same teacher vary significantly • A single observation by a single observer generates low reliability • Only 37% of the variance in ratings reflects consistent differences in a teacher’s performance
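
To make that last figure concrete, here is a minimal sketch, not taken from the MET Project, of how the share of rating variance attributable to stable teacher differences can be estimated: a one-way random-effects decomposition (ICC(1)) over several scored lessons per teacher. All scores below are invented.

    import numpy as np

    # Hypothetical observation scores: several rated lessons per teacher.
    scores = {
        "teacher_A": [2.8, 3.1, 2.6, 3.0],
        "teacher_B": [3.5, 2.9, 3.8, 3.2],
        "teacher_C": [2.2, 2.7, 2.4, 2.5],
    }

    groups = [np.array(v, dtype=float) for v in scores.values()]
    k = len(groups)                  # number of teachers
    n = len(groups[0])               # lessons per teacher (balanced design)
    grand_mean = np.mean(np.concatenate(groups))

    # One-way ANOVA mean squares
    ms_between = n * sum((g.mean() - grand_mean) ** 2 for g in groups) / (k - 1)
    ms_within = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (n - 1))

    # Variance components; ICC(1) = teacher share of total rating variance
    var_teacher = max((ms_between - ms_within) / n, 0.0)
    icc = var_teacher / (var_teacher + ms_within)
    print(f"Share of variance from stable teacher differences: {icc:.0%}")

With only one lesson per teacher, stable differences cannot be separated from lesson-to-lesson noise, which is why a single observation by a single observer is so unreliable.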

  10. Discuss... What could be some of the causes of variability in observations by one evaluator of one teacher…that are not related to the teacher’s performance?

  11. Issues with Reliability in Observation and Multiple Data Use (The MET Project, 2013) • Differing judgments between two evaluators observing the same lesson • Inconsistent inconsistency: less than 10% of variance is due to some raters rating everyone high or everyone low, so rater error is mostly not a fixed leniency or severity • A large residual variance is left unexplained

  12. Issues with the Collection and Use of Multiple Data Sources: How do we increase reliability?

  13. Reliability Problem 1: Lack of Criterion Reliability (The MET Project, 2013) How consistent are the evaluator’s ratings with those of an expert rater?

  14. Reliability Solution 1: Establish Criterion Reliability (The MET Project, 2013) • Train with an expert rater • Practice scoring using videos, documentation, etc. • Provide instruction on how to interpret evidence • Provide information about common sources of systematic rater error
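
One way such training is commonly checked, sketched below as an assumption rather than a MET Project procedure: compare a trainee’s scores against a master rater’s scores for the same practice videos and report exact and adjacent (within one rubric level) agreement. All scores are invented.

    # Trainee vs. master-rater scores for the same practice videos (invented).
    trainee = [3, 2, 4, 3, 1, 3, 2, 4]
    expert  = [3, 3, 4, 3, 2, 3, 2, 3]

    pairs = list(zip(trainee, expert))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    print(f"Exact agreement: {exact:.0%}; within one level: {adjacent:.0%}")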

  15. Reliability Problem 2: Lack of Intra-Rater Reliability (The MET Project, 2013) How consistent is the evaluator in his/her own ratings?

  16. Reliability Solution 2: Establish Intra-Rater Reliability (The MET Project, 2013) • Watch the same lesson multiple times, with time in between viewings • Give ratings based on multiple pieces of evidence rather than single observations
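
A minimal sketch of what “watch the same lesson multiple times, with time in between” can yield as a number: the correlation between an evaluator’s first and second pass over the same recorded lessons. The ratings below are invented.

    import numpy as np

    # The same evaluator scores the same eight recorded lessons twice,
    # with time in between passes (invented data).
    first_pass  = np.array([3, 2, 4, 3, 2, 3, 4, 1], dtype=float)
    second_pass = np.array([3, 2, 3, 3, 2, 4, 4, 2], dtype=float)

    r = np.corrcoef(first_pass, second_pass)[0, 1]
    print(f"Intra-rater (test-retest) correlation: {r:.2f}")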

  17. Reliability Problem 3: Lack of Inter-Rater Reliability (The MET Project, 2013) How consistent are ratings between two or more raters?
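
One common statistic for this question is Cohen’s kappa, which corrects raw two-rater agreement for the agreement expected by chance. The sketch below assumes a 1–4 rubric and invented ratings; it illustrates the idea rather than the MET Project’s exact method.

    from collections import Counter

    # Two raters score the same ten lessons on a 1-4 rubric (invented data).
    rater1 = [3, 4, 2, 3, 3, 1, 4, 2, 3, 3]
    rater2 = [3, 3, 2, 4, 3, 1, 4, 2, 2, 3]

    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n

    # Chance agreement from each rater's marginal category frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))

    kappa = (observed - expected) / (1 - expected)
    print(f"Observed agreement: {observed:.2f}; Cohen's kappa: {kappa:.2f}")

Percent agreement alone overstates consistency when most lessons cluster in the middle of the scale; kappa discounts that chance agreement.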

  18. Reliability Solution 3: Establish Inter-Rater Reliability (The MET Project, 2013) • Calibrate ratings during initial training • Conduct tandem observations and performance reviews with multiple evaluators

  19. Overall Ways to Increase Reliability • Have the same evaluator observe multiple lessons by the same teacher • Add more evaluators to the same observation or summative evaluation

  20. Overall Ways to Increase Reliability • Add observers or evaluators from outside the school • Raters’ performance increases when they know they are being monitored (Center for Educator Compensation Reform, 2009) • Can be a small sample of a teacher’s evaluation

  21. Reliability: Doing the Same Thing Over and Over

  22. How do we decrease bias?

  23. Who are you going to hire?

  24. We all have biases. Knowing our biases is half the battle in overcoming them.

  25. Bias Problem 1: Rater Personal Bias Example: checking the organization of a teacher’s cabinet storage during an observation.

  26. Bias Solution 1: Rater Personal Bias Train observers/raters on objective ways to collect evidence from multiple sources on uniform, research-based performance standards.

  27. Bias Problem 2: Halo and Pitchfork Effect “You were very professional during the interview so I’ll give you the benefit of the doubt if I see deficiencies in your classroom.”

  28. Bias Solution 2: Halo and Pitchfork Effect • Train observers/raters on objective ways to collect evidence on uniform, research-based criteria • Multiple raters provide various perspectives

  29. Did you know…? • Positive first impression: positive evidence is then sought • Negative first impression: negative evidence is NOT sought, but any negative information obtained is heavily weighted • Result: first impressions skew accuracy

  30. Bias Problem 3: Error of Central Tendency “We’re all the same… and we’re all pretty good!”

  31. Bias Solution 3: Error of Central Tendency Train observers/raters on: • using precision feedback based on data-generated evidence (formative) • distinguishing between the various ratings on the scale (summative)

  32. Bias Problem 4: Error of Leniency “Everyone is superior...or better!”

  33. Grade Inflation, Chicago: 2003-04 – 2007-08 (New Teacher Project, Widget Effect, 2009) • Superior: 25,332 • Excellent: 9,176 • Satisfactory: 2,232 • Unsatisfactory: 149 (0.4% of 36,889 ratings)

  34. Bias Solution 4: Error of Leniency Train observers/raters on distinguishing between the various options/ratings for feedback.

  35. Bias Problem 5: Rater Drift Original agreements on student engagement gradually drift as evaluators begin to define it differently.

  36. Bias Solution 5: Rater Drift • Provide refresher trainings for observers/evaluators • Conduct tandem reviews to ensure observers/evaluators define terms and ratings similarly

  37. Activity: Name that Bias

  38. Summary • Be aware of and look out for bias • Train, train, train • Reliability in numbers • Look at all the evidence

  39. BIG SUMMARY: How do we best address bias? (The MET Project, 2013) • Acknowledge the existence of bias • Understand the different types of bias and how to counteract them

  40. Bias = Defective Results

  41. Documenting and Using Objective Evidence

  42. Importance of Multiple Data Sources • Documentation Log • Student Learning Objectives • Observations • Surveys Data collection is messy!

  43. What is Evidence?

  44. What is Evidence? “The available body of facts or information indicating whether a belief or proposition is true or valid.” (Dictionary.com) The flow: evidence considered → rating determined.

  45. Two Purposes for Collecting and Using Evidence • Improve teacher practice • Determine and justify summative ratings

  46. Focus of Evidence: Standards & Indicators, spanning both teacher practice and student learning

  47. Steps to Collecting and Using Evidence: Evaluators and Teachers Working Together [Process diagram of steps carried out by evaluators and teachers together; only the final step is legible: 4. SUMMATIVE: Evaluators determine the rating based on the preponderance of evidence.]

  48. Evidence… • Is standards-based • Can include examples of both meeting and not meeting the expectation • May be: • Quantitative – e.g., 8 minutes of instruction during the first 10 minutes • Qualitative – e.g., narrative documentation of lesson activity • Focuses on data documentation, less on judgment (cumulative judgment comes during the summative rating)

  49. How do we document evidence objectively?

  50. Documenting Evidence Tips • Avoid terms that express judgment or soft evidence (“neat classroom,” “fun activity,” “caring attitude”) • Avoid words that imply quantity but don’t actually quantify (“most,” “few,” “several”) • Stick to the five senses • Remember Who, What, When, Where, How
