
Data Modeling: From Design Experiments to Scale Implementation

Data Modeling: From Design Experiments to Scale Implementation. Rich Lehrer, Vanderbilt University; Anthony Petrosino, University of Texas. The Waterbury Summit: Quantitative Reasoning, August 8, 2013.


Presentation Transcript


  1. Data Modeling: From Design Experiments to Scale Implementation. Rich Lehrer, Vanderbilt University; Anthony Petrosino, University of Texas. The Waterbury Summit: Quantitative Reasoning, August 8, 2013

  2. Quantifying Chance “If you don’t understand statistics, you don’t know what is going on” (Thompson, 2010, Wired).

  3. School Instruction • Variability is given little attention in U.S. K-12 instruction • When it is addressed, the treatment is fairly simple: • Brief exposure • Some statistics (mean and standard deviation) • No focus on data modeling

  4. Trouble with Current Practice • Much of current science instruction fails to push beyond simple comparison of outcomes • If the numbers are different, the treatments are presumed to differ • Since tools for understanding ideas about sampling, distribution, or variability are not immediately accessible to students, teachers stick closely to “investigations”

  5. Data Modeling • Participation in contexts that allow students to • Develop questions • Consider qualities of measures and attributes relevant to a question • Structure data • Make inferences about their questions

  6. Contextualizing Chance: Measure of an Attribute by Multiple Measurers [Diagram labels: a single measurement, true measure, measurement error]

  7. Context of the Investigation • A series of tasks and tools aimed at helping students consider error as distributed and as potentially arising from multiple sources. • Introduced distribution as a means of displaying and structuring variation among student observations of the “same” event • GOAL: to view variability as distribution and not simply as a collection of differences among measurements.
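The view of a measurement as a true value plus distributed error can be sketched in a short simulation. This is a minimal illustration, not part of the original materials; the true height and error spread are hypothetical values chosen for the example.

```python
import random
import statistics

random.seed(42)

TRUE_HEIGHT = 10.0  # hypothetical "true measure" of the attribute (meters)

# Each measurer's observation = true measure + random measurement error.
# Collected together, the observations of the "same" event form a distribution.
measurements = [TRUE_HEIGHT + random.gauss(0, 0.5) for _ in range(50)]

# The center of the distribution estimates the true measure.
center = statistics.median(measurements)
print(f"median of {len(measurements)} measurements: {center:.2f}")
```

Plotting such simulated measurements as a frequency display shows the symmetric "clump" around the true value that the tasks aim to have students notice.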

  8. Attribution and Mechanisms • How might the process of measuring produce sources of variability? • Students measured a number of things, calling attention to variability and structure, including: • The school's flagpole • A pencil • Model rockets at the apex of their flight

  9. Random-Systematic Error • Multiple sources of random and systematic error were identified and investigated • The goal was to determine whether the differences between the resulting distributions of rocket heights were consistent with random variation or could be attributed to a design feature (shape of nose cone)

  10. Contributions to Measurement Variability? • Different procedures for measuring • Precision of different measuring tools • Trial by trial variability

  11. Original Flagpole Data [Graph annotations: center estimates true height; "clumpiness" indicates precision]

  12. Quantifying Precision of Measure

  13. Experiment: Revisit the Rounded vs. Pointed Activity

  14. Comparing Distributions: Rounded vs. Pointed [Graph legend: pointed = blue, rounded = red]

  15. Student Reasoning • Pencil and paper (1997 NAEP items) • Clinical interviews: 1) Reasoning about and critiquing the experiment; 2) Influences of process on distribution

  16. Results On the NAEP items… • Far better than the national sample of 4th graders on "Data and Measurement" • Same level as the national sample of 8th graders From the clinical interviews, we learned that students generated distributions of measures consistent with the process of measurement. Their distributions varied with the precision of the instrumentation yet retained the symmetry expected when considering error of measure.

  17. NAEP Item (#MO61905)

  18. NAEP Item (#MO61905) Results

  19. NAEP Item (#M04900)

  20. NAEP Item (#M04900) Results

  21. Summary • Lessons learned about variability were used to reason about experiment • Trial to trial variation • Measurement variation • Effect variation

  22. Inventing Statistics of Signal and Noise [Example invented statistic: sum of distances from each measurement to the median]
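The invented statistic named on this slide is simple enough to compute directly. The sketch below (an illustration, with made-up example data) shows how it behaves as a measure of precision: tightly clumped measurements yield a small value, scattered ones a larger value.

```python
import statistics

def precision_statistic(measurements):
    """Invented 'noise' statistic: sum of absolute distances to the median.
    Smaller values mean the measurements clump tightly (greater precision)."""
    med = statistics.median(measurements)
    return sum(abs(x - med) for x in measurements)

# Hypothetical measurements of the same object by two groups of measurers
tight = [10.0, 10.1, 9.9, 10.0, 10.2]   # high agreement
loose = [8.0, 12.0, 9.5, 11.0, 10.0]    # low agreement

print(precision_statistic(tight))  # small: measurements agree
print(precision_statistic(loose))  # larger: measurements disagree
```

This connects to the later "measure review" questions: the symbolic structure (a sum of distances) attends to spread around the center, and imagining changes to the sample distribution predicts changes in the statistic's value.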

  23. Measure Review: Developing an Aesthetic of Measure • What about the distribution is attended to by the invented measure (the statistic)? • What is the relation between the symbolic structure of the invented measure and the characteristics of the distribution? • What is the meaning of different values of the statistic? • What happens to the statistic when changes to the sample distribution are imagined?

  24. Precision: Clumps and Regions “What I did was I circled the median and mode, Our numbers that close to it. … All the numbers inside here (pointing to outlined region in the graph). See how small they are? If we agree on them, they will get bigger. But if we disagree on them, they will get smaller (pointing again to the interior of the outlined region in the graph).”

  25. Negotiating Meaning

  26. S: So um my my method was to first find the range of your data and um to find out how much um how ho:::w much the graph or the people agree is uh the range is the key. Um. If the range would be like one like stack= ((moving his right hand up and down)) Teacher: That S: =it would be the range would be zero and like zero is the best? Sort of like grading of it? And then. << >> and then infinity? ((laughing)) Infinity is like the worst.

  27. Evan: Why can’t they have two way outliers (on the less variable of the two data sets). (3.0) John (Author): It doesn't matter. T transforms data to follow Evan’s suggestion ((Alex looking at distributions, decides that one is still clearly less variable than the other)) Ethan: OH. Not now. Not by his way. John suggests repair by trimming outliers, then computing the range statistic. Others disagree. When to stop “chopping”?

  28. Modeling Variability

  29. Model-based Inference • Empirical sampling distribution of a measure (statistic) of the model • Compare sample statistic to sampling distribution
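The two bullets above describe a randomization-style inference: build an empirical sampling distribution of a statistic under a "just chance" model, then compare the observed sample statistic to it. A minimal sketch, with hypothetical rocket-height data standing in for the classroom data:

```python
import random
import statistics

random.seed(1)

# Hypothetical apex heights (meters) for the two nose-cone designs
pointed = [92, 95, 97, 99, 101, 103, 104, 106]
rounded = [85, 88, 90, 91, 93, 95, 96, 98]

observed = statistics.median(pointed) - statistics.median(rounded)

# Chance model: if nose-cone shape made no difference, group labels are
# arbitrary. Pool the data, reshuffle into two groups many times, and
# record the difference in medians each time.
pooled = pointed + rounded
diffs = []
for _ in range(5000):
    random.shuffle(pooled)
    diffs.append(statistics.median(pooled[:8]) - statistics.median(pooled[8:]))

# How often does chance alone produce a difference at least as large
# as the one observed?
p = sum(d >= observed for d in diffs) / len(diffs)
print(f"observed difference: {observed}, approximate p-value: {p:.3f}")
```

A small proportion supports attributing the difference to the design feature; a large one means the difference is consistent with random variation.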

  30. Data Display: Case -> Aggregate

  31. Data Display as a Scientific Practice • Grade 3 investigations of organism growth • Context: social studies, role of silkworms • Students' investigations of • Conditions for hatching eggs • Growth • Structure-function (mouthparts) • Indicators of growth (measured length) • Challenge: For a day of growth, create a display of the data that shows something that you noticed about the measurements—some pattern or trend—so that someone else looking at your display can see what you notice. (n = 241 larvae) • Display review: the class compares displays with respect to what each shows and hides.

  32. Knowledge-Practice • Shape of Data as bridge between organism and population levels of thinking, a key to reasoning about evolution. • Shape of data may provoke new forms of inquiry.

  33. Case Value Displays (DaD 2, DaD 4)

  34. Invented Display: Groups of Similar Values (DaD 3)

  35. Scale (DaD 4)

  36. Display-based Reasoning [Student annotations: "What about the tail?" "They all want the food"]

  37. Sampling Natural Systems • Samples and Material Means • Sample to Sample Variability (6th grade) “It’s like sampling, like the armspan. So we were taking samples of Mr. R’s armspan. Other people would put in their measurement and make a bigger graph. Put it together, so it would be more exact. Like at the Silent Street Pond, taking samples, it will get better.”
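The student's intuition quoted above, that pooling more measurements makes the estimate "more exact," can be checked with a small simulation (an illustration with a made-up armspan population, not the classroom data): medians of larger samples vary less from sample to sample.

```python
import random
import statistics

random.seed(7)

# Hypothetical population of armspan measurements (cm)
population = [random.gauss(170, 8) for _ in range(10_000)]

def spread_of_sample_medians(sample_size, trials=500):
    """Sample-to-sample variability: how much the median of repeated
    samples of a given size varies across trials."""
    medians = [statistics.median(random.sample(population, sample_size))
               for _ in range(trials)]
    return statistics.stdev(medians)

small_n = spread_of_sample_medians(5)
large_n = spread_of_sample_medians(50)
print(f"spread of medians, n=5: {small_n:.2f}; n=50: {large_n:.2f}")
```

"Putting it together" to make a bigger graph shrinks sample-to-sample variability, which is the sense in which the pooled estimate "will get better."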

  38. Natural Variation: Heights of People [Diagram labels: naturally occurring variability, height of a single person]
