2012 National Survey of Science and Mathematics Education Briefing Book Horizon Research, Inc. Chapel Hill, NC January 2014
Acknowledgement These materials are based upon work conducted by Horizon Research, Inc. and supported by the National Science Foundation under Grant No. DRL-1008228. Any opinions, findings, and conclusions or recommendations expressed are those of the author and do not necessarily reflect the views of Horizon Research, Inc. or the National Science Foundation.
About the 2012 National Survey of Science and Mathematics Education • The 2012 NSSME is the fifth in a series of surveys dating back to 1977. • It is the only survey specific to science and mathematics education that provides nationally representative results.
About the Study • Two-stage sample that targeted: • 2,000 schools (public and private) • Over 10,000 teachers • Purposefully oversampled teachers of advanced mathematics, chemistry, and physics to allow for disaggregated results for each group • Four main instruments: • Science program questionnaire • Mathematics program questionnaire • Science teacher questionnaire • Mathematics teacher questionnaire
Topics Addressed • Characteristics of the science/mathematics teaching force: • demographics • content background • beliefs about teaching and learning • perceptions of preparedness • Instructional practices • Factors that shape teachers’ decisions about content and pedagogy • Use of instructional materials • Opportunities teachers have for professional growth • How instructional resources are distributed
Excellent response rate: • 1,504 schools agreed to participate • Over 80 percent of program representatives • Over 75 percent of sampled teachers • Sampling and analysis techniques used provide nationally representative estimates.
For More Information on the 2012 NSSME http://www.horizon-research.com/2012nssme/
Organization • The Briefing Book follows the structure of the Report of the 2012 National Survey of Science and Mathematics Education: Banilower, E. R., Smith, P. S., Weiss, I. R., Malzahn, K. A., Campbell, K. M., & Weis, A. M. (2013). Report of the 2012 national survey of science and mathematics education. Chapel Hill, NC: Horizon Research, Inc.
Two sets of slides, one for science and one for mathematics, were created for each chapter of the report that included results from the study. • When the results would fit, all three grade ranges (elementary, middle, and high) are represented on a single slide. • When the grade ranges are on separate slides, note that the slides may show the data for each grade range in different orders.
For each set of slides presenting findings, the Briefing Book first contains an overview slide that is not intended for use in presentations. • The overview slide shows the original data table from the report. Numbers in parentheses in the tables are standard errors. • Note that several tables in the technical report contain both science and mathematics data. The science slides show only the science portion of those tables, and the mathematics slides show only the mathematics portion. • In some cases, select data from a table were chosen to be shown in a graph.
The overview slide notes include: • Where in the report the table appears; • An indication of whether the table shows data from individual survey items, composite variables, or other variables derived from the survey data; • The original questionnaire items the data are based on; and • The text from the report that was written about the data table.
Conventions used in displaying original items: • Instructions that were included with questionnaire items are represented by a darkened circle: ● • Sub-items for questionnaire items are represented by bulleted letters: a., b., c., etc. • Response options for items with sub-items are shown in brackets: [ ] • "Select one" response options are represented by an open circle: ○ • "Select all that apply" response options are represented by a check box: ❑ • Free-response items are represented by a blank line: _____ • In some cases, only some sub-items were used to create a table; sub-items that were not used in the table are included but crossed out (strikethrough).
Composite Variables • Sets of related questionnaire items were combined to create several composite variables related to key constructs. • Composite variables, which are more reliable than individual survey items, were computed to have a minimum possible value of 0 and a maximum possible value of 100. • Higher composite scores indicate more of the construct being measured by the composite. • Overview slides that contain composite variable data include a more in-depth description of how composites were calculated.
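For illustration only (the overview slides describe the actual calculation for each composite), a 0–100 composite of this kind can be thought of as a rescaled sum of item responses. The function name and item responses below are hypothetical:

```python
def composite_score(item_responses, item_min, item_max):
    """Rescale a set of related item responses to a 0-100 composite.

    Assumes all items in the composite share the same response scale,
    with each item ranging from item_min to item_max.
    """
    n_items = len(item_responses)
    raw = sum(item_responses)            # raw composite score
    lowest = n_items * item_min          # minimum possible raw score
    highest = n_items * item_max         # maximum possible raw score
    return 100 * (raw - lowest) / (highest - lowest)

# Example: five items on a 1-4 agreement scale
print(composite_score([3, 4, 2, 4, 3], item_min=1, item_max=4))  # ~73.3
```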
Equity Factors • Chapters 2–7 in the technical report provide data on several key indicators, disaggregated by one or more equity factors: the prior achievement level of students in the class, the percentage of students in the class from race/ethnic groups historically underrepresented in STEM (HUS), the percentage of students in the school eligible for free/reduced-price lunch (FRL), school size, community type, and region. • For school size and FRL, each school was classified into one of four quartiles. Similarly, each randomly selected class was classified into one of the four categories based on the proportion of students identified as HUS.
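A minimal sketch of how the quartile classifications for school size and FRL might be computed, assuming a table with one row per school; the column names and values below are hypothetical:

```python
import pandas as pd

# Hypothetical school-level data: enrollment and percent of students
# eligible for free/reduced-price lunch (FRL).
schools = pd.DataFrame({
    "school_id": range(1, 9),
    "enrollment": [120, 340, 560, 780, 910, 1500, 230, 480],
    "pct_frl": [12, 35, 48, 60, 72, 90, 25, 55],
})

# Classify each school into one of four quartiles for each equity factor.
schools["size_quartile"] = pd.qcut(schools["enrollment"], 4,
                                   labels=["Q1", "Q2", "Q3", "Q4"])
schools["frl_quartile"] = pd.qcut(schools["pct_frl"], 4,
                                  labels=["Q1", "Q2", "Q3", "Q4"])
print(schools[["school_id", "size_quartile", "frl_quartile"]])
```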
Accurately Communicating Findings • Results from the NSSME are nationally representative—thus, it is appropriate to talk about schools, teachers, and classes in the nation. • It is not appropriate to talk about schools, teachers, and classes “in the sample.”
Accurately Communicating Findings • It is important to pay attention to figure/axis titles: • Schools vs. Classes vs. Teachers • Because of the sample design, you cannot interchange “classes” and “teachers.”
Accurately Communicating Findings • The scale on a figure axis affects the interpretation of the figure: • Magnifying small differences • Obscuring important differences • In the briefing book, almost all figures are on a 0–100 scale. Exceptions are when the bars are supposed to add to 100.
Accurately Communicating Findings • Be wary of making mountains out of molehills. • Although some results point to differences among groups (e.g., by grade level), it will be important to consider whether a finding is educationally significant as well as statistically significant.
Accurately Communicating Findings • Not all apparent “differences” are statistically significant. • If the “difference” isn’t statistically significant, it’s not a difference – the result could be due to sampling error.
Statistical Significance • Unfortunately, there’s not a simple way for you to test statistical significance.
Option 1 • Look in the 2012 National Survey report • If we wrote about a difference in a report, that difference was tested and is statistically significant.
Option 2 • A rough check of whether two estimates differ statistically is to add their standard errors and compare the sum to the difference between the two estimates. If the difference is greater than the sum of the standard errors, it is likely statistically significant.
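A minimal sketch of that rule of thumb; the percentages and standard errors below are made up for illustration:

```python
def roughly_different(est1, se1, est2, se2):
    """Rule-of-thumb check: is the gap between two estimates larger
    than the sum of their standard errors?"""
    return abs(est1 - est2) > (se1 + se2)

# Example: 62% (SE 2.1) vs. 55% (SE 1.8) -> gap of 7 exceeds 3.9
print(roughly_different(62, 2.1, 55, 1.8))  # True: likely significant
```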
Option 3 • If you want to be somewhat more precise, and put in a little more effort… • We have an Excel spreadsheet calculator you can use for comparing percentages.
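The spreadsheet's exact formula isn't described here; a common approach for this kind of comparison, sketched below under that assumption, is a two-sample z-style statistic built from the reported standard errors (treating the two estimates as independent):

```python
import math

def z_statistic(est1, se1, est2, se2):
    """Two-sample z-style statistic using the reported standard errors."""
    return (est1 - est2) / math.sqrt(se1**2 + se2**2)

# Example: 62% (SE 2.1) vs. 55% (SE 1.8)
z = z_statistic(62, 2.1, 55, 1.8)
print(abs(z) > 1.96)  # True if significant at roughly the 0.05 level
```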