Discussant Remarks
Jon R. Star, Michigan State University
AERA 2006, San Francisco
My role...
• provide constructive criticism
• push your work forward
• I've chosen to assume the role of journal reviewer
• allows for a conversation between reviewer and reviewee
• (I know that these are 'just' AERA papers!)
For starters...
• papers were great!
• quality was appropriate for discussant-as-reviewer comments
• 3 of 4 full and complete papers submitted on the website prior to the deadline!
• apologies to Moreno and colleagues - no comments on your paper in my remarks
Amy Ellis paper
• anticipated outlet: Journal for Research in Mathematics Education
• decision: reject, but strongly encourage revision and resubmission, taking into account my comments
1. Clarify phenomena
• the distinction between types of generalization is not sufficiently clear
  • presence of real-life context?
  • or nature of reasoning?
• particularly important because curriculum A looks a lot like many that we think are good!
• yet it did not necessarily support the kind of generalization that you think is productive
1. Clarify phenomena
• what is quantitative reasoning?
• "creating rules that had little quantitative meaning for them"
• what features of curriculum A do or do not support quantitative reasoning?
2. Clarify importance
• why does it matter that students reason quantitatively in this study? what does it enable students to do that other forms of generalization do not?
• the evidence presented is somewhat weak
• in the absence of good evidence, the distinction is interesting but not obviously educationally relevant
2. Clarify importance
• is the goal to have students reason quantitatively, or does reasoning quantitatively allow students to do other things?
• the study is not clear, either theoretically or empirically, on this issue
3. Clarify claims
• a comparison study? yes or no?
• method section says no
• but results do make comments about comparison
3. Clarify claims
• "This study's results suggest that reasoning with quantitative relationships can support more sophisticated mathematical activity, which is a claim suggested by researchers but as yet rarely backed by empirical work examining students' generalizations."
• do this study's empirical results support this suggestion, since the study wasn't comparative?
3. Clarify claims
• 'Ellis (2006) found that middle school students who were pushed to produce generalizations focused on quantitative reasoning (as opposed to number patterns and procedures) ultimately were more likely to develop understanding of linearity.'
• does your evidence support the claims made with this sentence and citation?
Overall...
• interesting, promising, and important paper
• you have a clear and well-articulated research agenda, and this paper would be another important contribution
Rozy Brar paper
• anticipated outlet: Mathematical Thinking and Learning
• decision: revise and resubmit
• good (or bad) news: the reviewer thinks a lot about conceptual and procedural knowledge!
• comments and suggested revisions revolve around this issue
1. Know vs. understand?
• What is the difference (for you) between knowledge and understanding?
• you use conceptual and procedural understanding, rather than knowledge
• not an esoteric point for this paper
• you plan future work using clinical interviews to assess conceptual and procedural knowledge - how you assess these competencies is critical to this and future studies, so you need to be clearer
1. Know vs. understand?
• What does conceptual knowledge (or understanding?) look like in this domain?
• What conceptually is missing from the students that you interviewed?
• Not enough to say that students don't understand what they are doing
• What exactly don't they understand?
• additional clarity on this would strengthen the paper
2. Issue to think more about
• possible to 'proceduralize' almost anything
• what is the potential value of things such as area models, manipulatives, algebra tiles, real-life contexts, various software packages?
• conceptual representational crutch: an alternative representation that aims to give students insight into the conceptual underpinnings (the 'why') of a procedure
2. Issue to think more about
• is a conceptual representational crutch sufficient?
• your study says NO
• some teachers don't take full advantage of the connections to underlying concepts that the conceptual representational crutch affords
2. Issue to think more about
• is a conceptual representational crutch necessary?
• can a teacher provide accessible conceptual explanations for procedural actions to students without such a crutch?
• yes - we see it a lot in TIMSS videos
2. Issue to think more about
• what is the potential value of the conceptual representational crutch?
• we want students to understand why they do what they do - are some representations better at achieving this goal than others?
• the conceptual representational crutch often becomes just another way to do the problem
3. Situate work in context
• excellent job of connecting the work to the broader rational number literature
• seems related to investigations of children's fraction schemes (Les Steffe, Amy Hackenberg, Eric Tillema)
4. Policy issues of work
• problem: students fail to develop conceptual knowledge
• worse problem: students fail to see the value of conceptual knowledge
• worst problem: we are not sufficiently clear in our research to document the benefits of having conceptual knowledge
4. Policy issues of work
• what can students do (or say) when they understand that they cannot do (or say) when they don't understand?
• do you have empirical evidence that this is the case in your study?
Overall...
• particularly interesting (to me) for the way you push on (my words) the proceduralization of a conceptual representational crutch
Van Dooren paper
• anticipated outlet: Learning and Instruction
• decision: accept, pending revisions
• very tight study - a nice contribution to a productive line of research on this topic by the authors
1. Explanation for posttest?
• analogical reasoning literature (Gentner)
• difficulties that learners face in determining the similarities between a new problem and a previously solved one
• transfer literature (Gick & Holyoak)
• how might meaningful, performance-based tasks be used AND impact posttest performance on traditional tasks?
2. Time on task
• potential confound between conditions?
• the P condition spent the most time on task
• just reading the P problem takes longer?
• time differences may be small enough, but you need to explain away this possibility
3. Clarify analysis
• what is a "contrast analysis"?
• provide more detail and clarity on the statistical tests that you used
Overall...
• good work!
• generally impressed with the work that has come out of this Center at the University of Leuven
Closing...
• view my remarks as suggestions
• feel free to disagree or challenge me
• if you do pursue publication, I am happy to read additional drafts of your work, if you would find this helpful
Thanks! (Questions?)
Jon Star, Michigan State University
jonstar@msu.edu
www.msu.edu/~jonstar