Northwestern University Computational Thinking in STEM
http://ct-stem.northwestern.edu
Building interest and proficiency in computational thinking in STEM
Meet the Team
Principal Investigators: Kemi Jona, Michael Horn, Vicky Kalogera, Laura Trouille, Uri Wilensky
Research Faculty & Graduate Students: Kai Orton, David Weintrop, Elham Beheshti
Workshop & Lesson Lead: Meagan Morscher
High School Lead Teachers: Ami Lefevre, Tim Miller, Mark Vondracek
11 2012–2013 High School Pilot Teachers
& You, the 2013–2014 Pilot Teachers!
This work is supported in part by the National Science Foundation under NSF grant CNS-1138461 and is covered by IRB study STU00058570. However, any opinions, findings, conclusions, and/or recommendations are those of the investigators and do not necessarily reflect the views of the Foundation.
CT-STEM Workshop: Goals & the Future
Workshop Goals:
• Build your knowledge, interest, and confidence in:
  • developing your students’ CT-STEM skills
  • using CT tools to improve your students’ learning of STEM concepts
• Connect CT-STEM to what you already do & to Illinois standards
• Train in 4 discipline-specific CT-STEM lesson plans
• Create a draft of a new CT-STEM lesson
As Pilot Teachers:
• 5 Saturday continuing education workshops
• Teaching and assessing 5 CT-STEM lessons in your classroom
CT-STEM: Key Concepts
• Algorithmic Thinking:
  • Create a series of ordered steps to solve a problem
  • Allows for automation of a procedure
• Examples:
  • Efficiency at a buffet table
  • Long division (see the sketch below)
  • An experimental procedure
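To make the "series of ordered steps" idea concrete, here is a minimal Python sketch (our illustration, not part of the workshop materials) that carries out long division digit by digit, mirroring the by-hand procedure; the function name long_division is ours.

```python
# A minimal sketch of algorithmic thinking: long division expressed as an
# explicit series of ordered steps (the same digit-by-digit procedure taught
# by hand), so the procedure can be automated.

def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Return (quotient, remainder) by processing the dividend digit by digit."""
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):        # step 1: bring down the next digit
        remainder = remainder * 10 + int(digit)
        quotient_digits.append(str(remainder // divisor))  # step 2: how many times does the divisor fit?
        remainder = remainder % divisor                     # step 3: keep what is left over
    return int("".join(quotient_digits)), remainder

print(long_division(7321, 7))  # (1045, 6), matching 7 * 1045 + 6 = 7321
```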
CT-STEM: Key Concepts
• Abstraction:
  • Pulling out the important details
  • Identifying principles that apply to other situations
• Examples:
  • Holiday dinners
  • Construct a model of an atom (see the sketch below)
  • Use the term ‘titration’ in an experimental design
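As one hedged illustration of the atom example (ours, not from the workshop), the sketch below keeps only the details that matter for the question at hand — particle counts — and ignores everything else about a real atom; the Atom class and its fields are assumptions made for the illustration.

```python
# A minimal sketch of abstraction in code: an Atom model keeps only the
# important details (particle counts) and drops everything else.

from dataclasses import dataclass

@dataclass
class Atom:
    protons: int
    neutrons: int
    electrons: int

    @property
    def mass_number(self) -> int:
        # the abstraction lets one rule cover every element
        return self.protons + self.neutrons

    @property
    def charge(self) -> int:
        return self.protons - self.electrons

carbon_12 = Atom(protons=6, neutrons=6, electrons=6)
print(carbon_12.mass_number, carbon_12.charge)  # 12 0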
CT-STEM: Key Concepts
• Computational Modeling:
  • Use a computational tool to develop a representation of a system (i.e., visualize an abstraction of a system)
  • Use a computational tool to analyze, visualize, and gain understanding of a STEM concept
• Examples:
  • CAD (in engineering)
  • NetLogo and other computational environments (see the sketch below)
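The sketch below is a Python stand-in for a NetLogo-style model (our assumption about what a minimal example might look like, not a workshop lesson): many particles take random steps and we inspect how far they spread — a crude computational model of diffusion.

```python
# A minimal sketch of a computational model: 1000 particles random-walk on a
# line and we measure how far they spread (a toy diffusion model).

import random

def diffuse(n_particles: int = 1000, n_steps: int = 100) -> float:
    """Random-walk each particle and return the mean squared displacement."""
    positions = [0] * n_particles
    for _ in range(n_steps):
        positions = [x + random.choice((-1, 1)) for x in positions]
    return sum(x * x for x in positions) / n_particles

# For an unbiased random walk, the mean squared displacement grows with the
# number of steps -- a pattern the model lets students discover by running it.
print(diffuse(n_steps=100))   # roughly 100
print(diffuse(n_steps=400))   # roughly 400
```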
CT-STEM: Key Concepts
• Decomposition:
  • Reformulating a seemingly difficult problem into one we know how to solve
• Examples:
  • Road networks in a major city → the Muddy City activity (see the sketch below)
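As a hedged illustration of the Muddy City idea (ours, not the workshop lesson itself): the messy road map is reformulated as a graph problem we already know how to solve — a minimum spanning tree, here built with Kruskal's algorithm on a made-up set of roads.

```python
# A minimal sketch: connect every house with the cheapest set of paved roads
# by reformulating the map as a minimum spanning tree (Kruskal's algorithm).

def minimum_spanning_tree(nodes, edges):
    """edges: list of (cost, a, b). Returns the cheapest roads connecting all nodes."""
    parent = {n: n for n in nodes}

    def find(n):                      # union-find to detect cycles
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n

    tree = []
    for cost, a, b in sorted(edges):  # consider the cheapest roads first
        root_a, root_b = find(a), find(b)
        if root_a != root_b:          # keep a road only if it links two separate groups
            parent[root_a] = root_b
            tree.append((cost, a, b))
    return tree

roads = [(3, "A", "B"), (1, "A", "C"), (4, "B", "C"), (2, "C", "D"), (5, "B", "D")]
print(minimum_spanning_tree(["A", "B", "C", "D"], roads))
# [(1, 'A', 'C'), (2, 'C', 'D'), (3, 'A', 'B')]
```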
CT-STEM: Key Concepts
• Generalization:
  • How is this problem similar to others?
  • Can we transfer the problem-solving process from a solved problem to this new one?
• Examples:
  • Can I apply the same strategies that I learned playing soccer to playing basketball?
  • Gravity and flux (see the sketch below)
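A minimal sketch of the gravity-and-flux example, under our own framing (not taken from the workshop): the same inverse-square pattern, solved once as a generic function, covers both gravitational attraction and electrostatic force, so a known solution transfers to a new problem.

```python
# Generalization: one inverse-square function reused for two different laws.

def inverse_square(constant: float, q1: float, q2: float, r: float) -> float:
    """Generic inverse-square law: F = k * q1 * q2 / r^2."""
    return constant * q1 * q2 / r ** 2

G = 6.674e-11   # gravitational constant (N m^2 / kg^2)
K = 8.988e9     # Coulomb constant (N m^2 / C^2)

def gravity(m1, m2, r):
    return inverse_square(G, m1, m2, r)

def coulomb(c1, c2, r):
    return inverse_square(K, c1, c2, r)

print(gravity(5.97e24, 7.35e22, 3.84e8))  # Earth-Moon attraction, ~2e20 N
print(coulomb(1e-6, 1e-6, 0.01))          # two 1-microcoulomb charges 1 cm apart, ~90 N
```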
CT-STEM: Key Concepts
• Big Data:
  • Big Data refers to collections of data sets so large and complex that they are impractical to process with conventional databases and tools
  • Because of its sheer size, Big Data is hard to capture, store, search, share, analyze, and visualize
• Examples:
  • Sequencing the human genome
  • The Galaxy Zoo project, with over 1 million galaxies (see the sketch below)
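One small, hedged sketch of why scale changes the approach (our illustration; the file name and keyword are hypothetical): rather than loading an enormous data set into memory at once, it is processed as a stream, one record at a time.

```python
# A minimal sketch of stream processing: scan an arbitrarily large text file
# without ever holding it all in memory.

def count_matching_lines(path: str, keyword: str) -> int:
    """Count lines containing a keyword, reading the file lazily."""
    count = 0
    with open(path) as handle:
        for line in handle:        # lines are read one at a time
            if keyword in line:
                count += 1
    return count

# Hypothetical usage: how many galaxy records were classified as "spiral"?
# print(count_matching_lines("galaxy_zoo_classifications.txt", "spiral"))
```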
“Big Data” is Everywhere
• ~40 × 10⁹ web pages at ~300 kilobytes each ≈ 10 petabytes
• YouTube: 48 hours of video uploaded per minute; in 2 months of 2010, more was uploaded than the combined total of NBC, ABC, and CBS; ~2.5 petabytes uploaded per year?
• LSST: 30 terabytes per night
• LHC: 15 petabytes per year
• Radiology: 69 petabytes per year
• Square Kilometer Array Telescope: will be 100 terabits per second
• Earth observation: approaching ~4 petabytes per year
• Earthquake science: a few terabytes total today
• PolarGrid: hundreds of terabytes per year
• Exascale simulation data dumps: terabytes per second
CT in Astronomy • Mass Determination of our Milky Way’s Black Hole • Comparing observed data to simulations
CT in Biology
• The shotgun algorithm expedites sequencing of the human genome (see the sketch below)
• DNA sequences are strings in a language
• Protein structures can be modeled as knots
• Protein kinetics can be modeled as computational processes
• Cells, as self-regulating systems, are like electronic circuits
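As a hedged, toy-scale illustration of the shotgun idea (ours, not the real assembler): many short overlapping reads are stitched back together by repeatedly merging the pair with the largest overlap.

```python
# A minimal sketch of shotgun-style assembly: greedily merge reads at their
# longest overlap until one contig remains.

def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is a prefix of b."""
    for size in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:size]):
            return size
    return 0

def assemble(reads: list[str]) -> str:
    """Toy greedy assembler: merge the best-overlapping pair until one string is left."""
    reads = reads[:]
    while len(reads) > 1:
        size, a, b = max(((overlap(a, b), a, b)
                          for a in reads for b in reads if a is not b),
                         key=lambda t: t[0])
        reads.remove(a)
        reads.remove(b)
        reads.append(a + b[size:])   # glue the two reads at their overlap
    return reads[0]

print(assemble(["GGCTA", "CTAAC", "AACTT"]))  # GGCTAACTT
```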
CT in Chemistry
• Atomistic calculations explore chemical phenomena
• Optimization and search algorithms identify the reagents and reaction conditions that improve yields (see the sketch below)
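A hedged, hypothetical sketch of the search idea (ours): grid-search two reaction conditions against a made-up yield model to find the combination that maximizes yield; a real workflow would query a simulation or experimental data set instead of toy_yield.

```python
# A minimal sketch of optimization over reaction conditions via grid search.

import itertools

def toy_yield(temperature_c: float, catalyst_mM: float) -> float:
    """Made-up yield (%) that peaks near 70 C and 4 mM catalyst."""
    return 90 - 0.02 * (temperature_c - 70) ** 2 - 1.5 * (catalyst_mM - 4) ** 2

temperatures = range(20, 101, 10)       # 20 C to 100 C
concentrations = [1, 2, 3, 4, 5, 6]     # mM

best = max(itertools.product(temperatures, concentrations),
           key=lambda cond: toy_yield(*cond))
print(best, round(toy_yield(*best), 1))  # (70, 4) 90.0
```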
CT in Engineering
• The Boeing 777 was never tested in a wind tunnel, only in computer simulations
• The ability to calculate higher-order terms gives more precision, which in turn reduces weight, waste, fabrication costs, etc.
CT in Geology
• Modeling the Earth’s inner layers using seismic waves
• Modeling the Earth and its atmosphere to track and predict climate change
CT in Math
• Discovering the E8 Lie group
  • Took 18 mathematicians, 4 years, and 77 hours of supercomputer time (a table of 200 billion numbers)
  • Profound implications for physics (string theory)
CT in Medicine • Robotic surgery • Electronic health records require privacy technologies • Scientific visualization enables virtual colonoscopy
CT in Social Sciences
• Social network analysis helps explain phenomena like MySpace and YouTube
• Statistical machine learning is used for recommendation and reputation services, e.g., Netflix and affinity cards
CT in the Humanities
• What do you do with a million books?
• National Endowment for the Humanities and the Institute of Museum and Library Services
• Arts, drama, music, photography (image credit: Christian Mueller)
CT in Entertainment
• Games
• Music: MP3 sorting and searching
• Movies:
  • DreamWorks uses an HP data center to render Shrek and Madagascar
  • Lucasfilm uses a 2,000-node data center to make Pirates of the Caribbean
The Human Genome… By the Numbers
• 46 … chromosomes in each cell
• ~23,000 … genes in the human genome
• 2.4 million … base pairs in the largest human gene
• 3.1 billion … base pairs in each cell
• 75–100 trillion … cells in the human body
The demand for computation in biology…
Analytical technology → high-throughput data → biological knowledge → medicine & bioengineering
Cost per Megabase of DNA Sequence
We need cost-effective computing!
Full personal genomics: 3 petabytes per day
McKinsey Global Institute on Big Data Jobs
• There will be a shortage of the talent organizations need to take advantage of big data. By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills, as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.