Brief History of MEAs • Mathematics education researchers created MEAs to observe student problem-solving competencies and the growth of mathematical cognition. • Documented as a methodology that helped students become better problem solvers. • Became a tool that helped both instructors and researchers become more observant of, and sensitive to, the design of situations that engage learners in productive mathematical thinking. • Introduced to college freshmen at Purdue. • Extended by a seven-university consortium.
Typical MEA • Elicits from a team a mathematical or conceptual system as part of its procedural requirements • Students need to make new connections, combinations, manipulations or predictions • Emphasis on testing, revising, refining and formally documenting solutions
A well-designed MEA: • Presents a realistic problem with an identifiable client. • Requires the development of a problem-solving procedure involving unspecified mathematical, scientific, and engineering concepts. • Motivates students to integrate existing knowledge to develop a generalizable mathematical model. • Leads to heightened conceptual understanding. • Creates an environment where communication, verbalization, and collaboration must be combined with mathematical and engineering thought. • Requires students to acquire new knowledge on a just-in-time basis while reinforcing previously obtained knowledge.
Recent CCLI: Extending the MEA Construct • NSF CCLI Type 3 project • Cal Poly SLO, Colorado School of Mines, Minnesota, Purdue, Pepperdine, Pitt, USAFA • Extensions: junior–senior level, ethical situations, misconceptions, laboratory experiments, integration of concepts, global and societal settings.
Focus on model-building • Interpretive systems that help make sense of situations • Representations of ideas and the connections between them • Ascendant in recent years in NSF funding and in the learning sciences
Models vary • unclear to … • patchy or hazy to … • intuitive to … • fact-deficient to … • disconnected or uncoordinated to … • weak to … • specialized to …
Models evolve • along these and other spectra • A worthy lens for the engineering curriculum is progression along these spectra
Models and modeling perspective • The models and modeling perspective (MMP) focuses on nurturing growth along these dimensions • The MMP interpretation of the learning-science axiom "build on prior knowledge": understand the models that students already possess
MMP • Better problem solvers don't just do things differently; they see things differently • Good solutions to complex problems rarely rely exclusively on topics that can be extracted from a single topical area of a textbook; solutions invoke intuition, ethics, and multiple disciplines • Small groups working on realistic problems provide far more productive problem-solving venues than individuals working in isolation
Understanding models • Comes by seeing them • It is not trivial to see models consistently • Research tools designed to reveal or disclose models were called thought-revealing activities or model-eliciting activities
An interesting scientific observation • Early MEA research focused on how school- and college-age students expressed, tested, and revised their models to solve realistic problems in small-group settings • The research setting changed the thing being studied – not a Hawthorne effect, but an artifact of expression and testing • MEAs became promising tools for curriculum
Six design principles • Reality principle (the "personally meaningful" principle): could this happen in "real life"? • Model construction: does the task create the need for a model to be constructed (or modified, extended, or refined)? • Model documentation: will the response require students to explicitly reveal how they are thinking about the situation?
Six design principles • Self-evaluation: does the statement of the problem strongly suggest the criteria that are appropriate for assessing the usefulness of alternative responses? Will students be able to judge for themselves when their responses are good enough? • Model generalization: is the model not only powerful (for the specific situation and client at hand) but also sharable (with others) and reusable (in other situations)? • Simple prototype: is the situation as simple as possible while still creating the need for a significant model? Will the solution provide a useful prototype (or metaphor) for interpreting other structurally similar situations?
What changes? • Models • Students • Professors
Interactive Exercise • John Christ, Brian Self, Tamara Moore
Model-Eliciting Activities: An Interactive Experience
Engineering solutions for contaminant spill response
Learning Objectives:
1. Apply conservation of mass principles
2. Employ chemical kinetics to predict contaminant degradation
3. Employ technical knowledge in a decision model
4. Incorporate regulatory policy and ethical concerns in proposed solution evaluation
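A minimal sketch of how learning objectives 1 and 2 combine, assuming (purely for illustration; the MEA statement does not prescribe a kinetic model) a well-mixed lake of volume V whose only outflow is the creek of flow rate Q, and a contaminant that degrades with first-order rate constant k. Conservation of mass on the contaminant then gives

\[
V\,\frac{dC}{dt} = -Q\,C - k\,V\,C
\qquad\Longrightarrow\qquad
C(t) = C_0\,e^{-(Q/V + k)\,t},
\qquad C_0 = \frac{M_{\text{spill}}}{V},
\]

where C is the contaminant concentration in the lake, M_spill is the mass of contaminant spilled, and C_0 is the concentration immediately after the spill mixes in.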
Pre-MEA: In-Class Example • Toxic spill: 23,500 gal • Lake: 20 acres, avg. depth = 14 ft • Creek: Q = 10 cfs • Recommendation: don't divert the stream
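One back-of-the-envelope check consistent with the numbers on this slide (assuming a well-mixed lake whose only outflow is the creek; the "don't divert" recommendation rests on the fuller in-class analysis, not on this estimate alone):

\[
V_{\text{lake}} = 20\ \text{ac}\times 43{,}560\ \tfrac{\text{ft}^2}{\text{ac}}\times 14\ \text{ft} \approx 1.22\times10^{7}\ \text{ft}^3,
\qquad
V_{\text{spill}} = \frac{23{,}500\ \text{gal}}{7.48\ \text{gal/ft}^3} \approx 3.1\times10^{3}\ \text{ft}^3,
\]
\[
\frac{V_{\text{spill}}}{V_{\text{lake}}} \approx 2.6\times10^{-4},
\qquad
\tau = \frac{V_{\text{lake}}}{Q} = \frac{1.22\times10^{7}\ \text{ft}^3}{10\ \text{ft}^3/\text{s}} \approx 1.2\times10^{6}\ \text{s} \approx 14\ \text{days}.
\]

The spill is strongly diluted, but without degradation it would take on the order of weeks to flush through the creek, which is the kind of trade-off the in-class example asks students to weigh.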
ABET outcomes • MEAs are ideally suited to improving ABET outcomes. • We saw significant improvement on several ABET outcomes in Engineering Economy, including (f) and (h). • We also saw significant improvement on multiple ABET outcomes in Probability and Statistics, including (d) and (e).
Step 1 – Problem introduction and Process ID (Reality principle, model construction & documentation) • Individually: Read the memo and answer the following questions: 1. What information will be required to enable the assessment of a wide range of spill scenarios and the development of a recommendation on courses of action? 2. Where might you locate this information? What are valuable sources, and why would you rely on these sources over other potential sources?
Step 1 – Problem introduction and Process ID (Reality principle, model construction & documentation) • Teams: 3. Discuss your answers. Develop a list of required information, including recommended sources. 4. Develop a single-page process diagram outlining the engineering response. The process diagram should make clear the decision points and the alternative engineering solutions considered by your decision model. The attached memo from the recent regional directors' meeting outlines currently available rapid engineering capabilities that should be considered when developing your response.
Step 2 – Model development (Documentation, self-assessment & model generalization) • Teams: • Implement the process flow diagram in an on-site tool (e.g., a spreadsheet) • Evaluate alternatives • Incorporate constraints • Generalize the model for different cleanup scenarios
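A minimal sketch of what the "on-site tool" might look like in code rather than a spreadsheet, assuming the same well-mixed, first-order-decay model sketched earlier. Every function name, the decay rate, and the regulatory limit below are hypothetical placeholders for illustration, not values from the MEA handout:

import math

def predict_concentration(spill_gal, lake_acres, depth_ft, creek_cfs,
                          decay_per_day, days):
    """Well-mixed lake: spill dilution, first-order decay, and flushing
    by the creek outflow. Returns concentration as a volume fraction."""
    lake_ft3 = lake_acres * 43_560 * depth_ft       # lake volume, ft^3
    spill_ft3 = spill_gal / 7.48                    # spill volume, ft^3
    c0 = spill_ft3 / lake_ft3                       # initial volume fraction
    flush_per_day = creek_cfs * 86_400 / lake_ft3   # Q/V in 1/day
    return c0 * math.exp(-(flush_per_day + decay_per_day) * days)

def recommend(c_predicted, regulatory_limit, can_divert_creek):
    """Toy decision point from the process diagram: compare the predicted
    concentration with a (hypothetical) regulatory limit."""
    if c_predicted <= regulatory_limit:
        return "monitor only"
    return "divert creek and treat" if can_divert_creek else "treat in place"

# Numbers from the pre-MEA slide; the decay rate and limit are assumed
# placeholders for values student teams would have to research.
c30 = predict_concentration(23_500, 20, 14, 10, decay_per_day=0.05, days=30)
print(f"predicted concentration after 30 days: {c30:.2e}")
print("recommendation:", recommend(c30, regulatory_limit=1e-6, can_divert_creek=True))

Keeping the scenario numbers as arguments rather than hard-coding them is what the "generalize the model" bullet amounts to: the same two functions can handle any cleanup scenario the client poses.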
Step 3 – Model implementation (Effective prototype) • Scenario: toxic spill of 23,500 gal; lake: 20 acres, avg. depth = 14 ft; creek: Q = 10 cfs • Student teams apply their model and methodology to an assigned realistic spill scenario • Ask questions about the human dimension
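Continuing the hypothetical sketch from Step 2, applying the model to a different assigned scenario is just another call with new inputs; the slower decay rate below is an invented placeholder, not an actual assigned contaminant:

# Same lake and creek, but a contaminant assumed to degrade much more slowly.
c90 = predict_concentration(23_500, 20, 14, 10, decay_per_day=0.005, days=90)
print("recommendation:", recommend(c90, regulatory_limit=1e-6, can_divert_creek=False))

The "human dimension" questions on the slide sit outside this quantitative piece; the sketch only illustrates that the model itself is reusable across scenarios.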
Transitioning the Classroom Experience • MEAs: • Transition in-class example problems to team-based learning exercises founded in real practice • USAFA focus on civil and environmental engineering • Excellent framework for solving problems with "big picture" implications (e.g., human responsibility, international relations) • Integrate many disciplines; problems can be tailored to help students see common links throughout engineering and science • Reinforce topics learned during foundational courses • Develop hands-on modeling activities at the lab or field scale