
Tipping Points: Tools and Routines To Support Expert-Like Practice in Early Career Teachers

TLT. Tipping Points: Tools and Routines to Support Expert-Like Practice in Early Career Teachers. Jessica Thompson, Mark Windschitl, Melissa Braaten. Teachers’ Learning Trajectories Initiative, University of Washington.




Presentation Transcript


  1. TLT Tipping Points: Tools and Routines to Support Expert-Like Practice in Early Career Teachers. Jessica Thompson, Mark Windschitl, Melissa Braaten. Teachers’ Learning Trajectories Initiative, University of Washington. • Funded by the Carnegie Corporation, the Annenberg Foundation, and the Rockefeller Foundation • National Science Foundation, Grant No. DRL-0822016 • http://Tools4TeachingScience.org/

  2. This is a story about tools and teacher learning. • We seek to develop tools and socio-professional routines that catalyze and sustain ambitious pedagogy. • We seek to develop a theory of teacher learning — of the transitions between novice and expert-like teaching performances. • Toward a science of performance improvement.

  3. Novices: we don’t know much about their development (Bransford & Stein, 1993; Goodlad, 1990; Grossman et al., 2000; Kennedy, 1999; Nolen, Ward, Horn, Childers, Campbell, & Mahna, in press; Simmons et al., 1999; Wilcox, Schram, Lappan, & Lanier, 1991). What do we know? • All pre-service instruction and experiences are filtered through tacit but durable theories about “good teaching” and what counts as learning (Thompson, Windschitl, & Braaten, 2009). • Novices don’t have a sense of what kids are capable of, especially with scaffolding. • They have difficulty putting reform practices into play in classrooms. • We have reason to believe novices face unique intellectual challenges in learning the craft; they’re not just “inexperienced.”

  4. What do we know about science teaching in American classrooms? Many teachers create dynamic, challenging lessons, but the broad trends indicate: • a focus on activity rather than sense-making discourse • pressing for explanations is rare • questioning is among the weakest elements of instruction • less than 1/3 of lessons take into account students’ prior knowledge (Baldi et al., 2007; Banilower et al., 2006; Roth & Garnier, 2007; Weiss et al., 2003; PISA studies)

  5. Ambitious pedagogy: tracking the fate of four key ideas making up Model-Based Inquiry, drawn from the literature on the learning sciences, authentic scientific practice, and epistemology: • Selecting “big ideas” in science • Teaching as “working on student ideas” • Pressing students for evidence-based explanations of science phenomena • Teaching for epistemic fluency

  6. Myth-busting: induction does not have to be a time of mere survival. • Collegial community and meaningful intellectual work connected to practice are important. • A big factor in “Why did I stay in teaching?” is professional community. • Our model sustains community between teacher education and induction, and focuses on refining practice by inquiring into student thinking with colleagues.

  7. Contexts of study: a focus on one high-leverage practice (HLP) — supporting evidence-based explanation. Settings: university coursework, analysis of student work (a “third space”), student teaching, and first-year teaching. We followed 15 secondary science teachers, instructed in reform teaching, for 3 years, some for 4. Data: interviews | observations | video/audio | pupil artifacts.

  8. The APEXST Model of Induction: Advancing High-Leverage Practices by Examining Student Thinking. What distinguishes this model from simply “getting together to look at students’ work”? • A focus on high-leverage teaching practices • Systematic analyses of student thinking • The goal of longitudinal learning by students and teacher • Attention to students of all achievement levels

  9. The rubric became a rudimentary “vision tool”: Pressing for Explanation

  10. What actually happened? How practice was represented to others revealed teacher thinking and shaped interactions with colleagues. Teaching-Learning Unproblematic (TL-U)

  11. Participants used the rubric in productive, unanticipated ways (TL-P): • Used in CFGs as a standard to evaluate the depth of one’s own current questioning and tasks. • Used in CFGs to calibrate expectations for students with different abilities. • Used to envision “where to go next” in pressing students to use evidence and construct explanations. • Modified, then given to and used by students to direct and evaluate their own performances.

  12. [Edited transcript]
Sarah: I think [a level 3] definitely needs to include something about there being a membrane. One of the big ideas we talked about was an internal and external environment. If we are shooting for the stars here, they’d be able to talk about the difference between the internal and external environment, and that there would be pores in the membrane that would allow some things to go across, and they’d mention that if the concentrations were different, that would allow diffusion to happen…
Emily: I think too you’d have something about corn syrup moving from inside the egg — something about the direction of the movement—
Adam: [refers to the rubric] Isn’t a level three a “why” question?
Emily: Oh, yeah, so, why do the molecules—?
Jena: So that’s the “how,” that’s good.
Adam: It gets more into stuff like entropy; that’s why diffusion works—
Emily: Why do molecules move in the first place?
Sarah: [looks at what she had written as her own description of a level 3 explanation] I had a hard time coming up with a level three full explanation. I had “Explains why concentration of eggs change” — so, ok, it’s not why is this happening, but why did the mass decrease. So it’s kind of a lower level of why; it’s not like [gestures with outstretched arms and looks at the ceiling] “Whyyy?” It’s not “Why diffusion?” but “Why did the mass change?” So I brought it down. For level 1 [refers to rubric] I said “States expected results based on what he/she saw,” so like “Did the mass change?” or “Did it look wrinkly?” I had one kid say that. Then for level 2, “Describes how mass changes,” like water went out or in, so they were able to say that the contents were changing and that’s how the mass changed. And then the “why” would be all the things we talked about earlier: pores, membranes, gradients, that stuff. So it’s a why, but a different why. It’s not like the big theoretical—
Emily: Now I’m trying to think in my mind — “why”? [laughter] I’m trying to go to the next level.

  13. CFG conversations of pedagogical value, later in the study, were supported by the language of the rubric and the demands of the protocol. Participants: • imagine the types of scaffolding needed to support high student expectations • co-develop a general theory about “what counts” as different levels of scientific explanation • use the rubric to imagine more sophisticated answers students could give • use the language of the rubric to make collaborative sense of student work and to identify partial understandings and trends. These moves reflect accountability to science and accountability to student work. Tool influences: the conversational demands of the protocol, and the structure and language of the rubric.

  14. Voices from participants. Non-productive participation: “Well, I guess the analysis hasn’t done much for me, probably because I’ve noticed that all their [his students’] answers just don’t go that far. So they’re always the same, basically. They could go further. I think the big help has been in the CFG’s where I’m getting feedback of how I could do this better next time.” Productive participation: “I think, you know, the analysis worksheets that we had to fill out on student work and rate them at a 2 or a 3 … it is definitely the first step in understanding, not particularly what one student is doing, but understanding what the majority of a class is doing and thinking. And picturing where it is that they are at and where they need to go. I don’t think you can reflect in a quality way if you don’t make some things concrete when you’re analyzing student data.”

  15. [Slide graphic: progression toward more sophisticated practice]

  16. High-leverage practices: instructional or planning forms that, if enacted with proficiency, lead to significant learning outcomes (Franke & Chan, 2007; Lampert & Graziani, 2009; the criteria below are paraphrased from Ball and colleagues, 2009). An HLP: • helps to improve the learning and achievement of all students • supports student work that is central to the discipline of the subject matter • applies to different approaches in teaching the subject matter and to different subject matter areas • can be articulated and taught • can be revisited in increasingly sophisticated and integrated acts of teaching. Our additions: • HLPs should be few in number, to reflect priorities of equitable and effective teaching and to allow significant time to develop beginning instantiations of each practice. • Each HLP should play a key role in a larger, coherent system of instruction. • HLPs should allow novices to learn from their own teaching: for example, instructional routines that make students’ thinking visible and that create a record of students’ developing ideas and language across units of instruction, in forms that allow teachers to reconcile these changes with the instructional decisions they made along the way.

  17. Tools to support a core set of high-leverage practices for teacher preparation: the Science Learning Framework

  18. The Big Idea Tool (puzzling phenomena ↔ explanatory model): “what counts” as a big idea. • A BIG IDEA is defined as the relationship between a puzzling, observable phenomenon and an underlying explanatory model. • A “why” explanation must be a causal story involving unobservable entities, processes, and events that helps us understand why a class of phenomena happens the way it does. • All classroom activities and discourse help kids reason about some piece of the explanatory model.

  19. Pressing for explanation. In this step you are explicitly asking for causal hypotheses. Label them publicly as hypotheses so kids feel more at ease offering them. • What might be going on here that we can’t see? • Why do you think this happens this way? (emphasize cause) • What do you think causes ____? Pre-thinking: What will you do if students are incapable of offering any causal hypotheses? What will you do if students are unwilling to offer any causal hypotheses? Responding to what students offer: • Students offer a simplistic cause-effect (example: “Why does water boil?” “Because you put it on the stove.”): “You are telling me the beginning and the end of the story; what happens in the middle to cause ___?” • Kids offer explanations that involve alternative conceptions: note this respectfully without elaborating on it. If you can readily think of an observation that immediately puts this alternative conception into question, then offer that: “But did you notice ____?” “How would your theory be possible if…?” • Students offer explanations congruent with the scientific explanation: subtly mark and amplify the response to bring it to the forefront of discussion. “So you think that ___ has something to do with it…” Post-reflection: What discourse moves did you make in response to students’ hypotheses? If you did not get a willing response, how might you change your questioning or prompting?

  20. Learning Progression: supporting the pedagogical imagination. • Lower anchors derived from empirical studies • Upper anchors from conceptual analysis of sophisticated instruction • Everything described in terms of performances, not declarative knowledge • Teachers are able to identify where they are and imagine where to go next

  21. Tool System for Supporting Ambitious Teaching Practice. 1. Start here: the Big Idea Tool. Teachers transform curriculum topics into “big ideas”: explanatory models that can deeply engage learners in content and in discourses around evidence in science. 2. MBI Learning Progression. Teachers make the big idea the focus of Model-Based Inquiry; they locate their current practice on a continuum of development and see what characterizes the “next level” of practice. 3. Discourse Tools. Teachers select discourse tools to enable complex MBI conversations. 4. Rapid Surveys of Student Thinking (RSST). RSSTs allow monitoring of student thinking and guide instructional moves from one kind of conversation to the next. 5. Routines for Systematic Assessment of Student Work and Video. Collegial rounds of analysis of student work and teaching video make visible long-term patterns of development for learners of all levels and help relate these to past pedagogical moves; conversations suggest if or how one has reached the next level of the MBI learning progression.
