
Using a Dose-Response Analysis Strategy to Measure Outcomes of Place-based Education

Prepared by: Michael Duffin, PEER Associates, Inc. With support from: the Place-based Education Evaluation Collaborative (PEEC). For presentation at: the NAAEE Pre-conference Research Symposium.



Presentation Transcript


  1. Using a Dose-Response Analysis Strategy to Measure Outcomes of Place-based Education Prepared by: Michael Duffin, PEER Associates, Inc. With support from: the Place-based Education Evaluation Collaborative (PEEC) For presentation at: the NAAEE Pre-conference Research Symposium October 10, 2006

  2. For more details and related information, see www.PEECworks.org or www.PromiseOfPlace.org (as of Dec. 2006)

  3. The Place-based Education Evaluation Collaborative (PEEC) • CO-SEED • Community Mapping Program • Litzsinger Road Ecology Center • Wellborn Ecology Fund • Trail To Every Classroom • Sustainable Schools Project • Forest For Every Classroom

  4. PEEC’s Dose-Response Measurement Strategy [Chart: Program “Dose” (Exposure + Implementation), from Less to More, plotted against Program “Response” (Measures of Intended Outcome), from Lower to Higher] • If participants with less dose report lower outcomes, and those with more dose report higher outcomes, then the program is likely to be an active ingredient • Coleman (1966) claimed that schooling accounted for only 10% of the variance in student achievement (R² = .10) • Marzano (2003) claims that number is actually closer to 20%, with 13% deriving from teacher-level factors and 7% attributable to school-level factors • Wang et al. (2002) found that weight status predicts 17-19% of the cost of treating cardiovascular disease
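
As a rough illustration of how these percent-of-variance figures are read, here is a minimal sketch in Python that regresses a hypothetical outcome composite on a hypothetical dose composite and reports R²; the data, scales, and variable names are assumptions, not PEEC's actual analysis code.

    # Minimal sketch of the dose-response idea, using hypothetical survey data
    # (the actual PEEC dose and outcome composites are described on later slides).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical per-educator composites on a 1-4 scale
    dose = rng.uniform(1, 4, size=100)                      # program "dose"
    response = 2.0 + 0.3 * dose + rng.normal(0, 0.4, 100)   # program "response"

    fit = stats.linregress(dose, response)
    r_squared = fit.rvalue ** 2

    # R-squared is read as "percent of variance in the outcome accounted for by dose"
    print(f"R^2 = {r_squared:.2f}  ({r_squared:.0%} of variance)")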

  5. Constructing Program “Dose” • Survey items estimating the number of times a participant took part in each component of program delivery • Each item was multiplied by a staff estimate of the average duration of a typical event, yielding a rough estimate of total hours • Program exposure was converted to a 1-4 scale and averaged with responses to implementation-level items to form the dose composite • A very blunt instrument, but it does seem to capture variability in dose (for the regression calculation, preserving ordinal sequence mattered more than the gross accuracy of the construct)
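
A minimal sketch of the dose construction described above, assuming hypothetical component names, participation counts, and staff duration estimates:

    # Sketch of the dose-composite construction: counts x estimated hours per event,
    # rescaled to 1-4 and averaged with implementation items. All values are assumed.
    import numpy as np
    import pandas as pd

    # Hypothetical survey responses: counts of participation in each program component
    surveys = pd.DataFrame({
        "n_workshops": [0, 2, 5],
        "n_site_visits": [1, 3, 6],
        "implementation_level": [1.5, 2.5, 3.5],   # already on a 1-4 scale
    })

    # Staff estimates of average hours per event for each component (assumed values)
    hours_per_event = {"n_workshops": 3.0, "n_site_visits": 1.5}

    # Rough total exposure in hours
    exposure_hours = sum(surveys[col] * h for col, h in hours_per_event.items())

    # Rescale exposure to a 1-4 scale, then average with the implementation items
    lo, hi = exposure_hours.min(), exposure_hours.max()
    exposure_1to4 = 1 + 3 * (exposure_hours - lo) / (hi - lo)
    surveys["dose_composite"] = (exposure_1to4 + surveys["implementation_level"]) / 2

    print(surveys)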

  6. Measuring Program “Response” or Outcomes • Survey items reflecting the behavior changes represented in the program logic • Modules reflect major ideas • Indices represent a range of sub-dimensions of the major ideas • Two to six items for each index • Used a Bonferroni correction to be conservative about statistical significance • Indices, modules, and overall module scores were rationally constructed (factor analysis and systematic internal reliability tests still to be done)
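
For reference, a short sketch of how an index score and a Bonferroni-adjusted significance threshold might be computed; the item values and the number of tests are assumptions for illustration.

    # Sketch of index scoring and a Bonferroni-adjusted significance threshold.
    import numpy as np

    # An index is the mean of its 2-6 constituent survey items (one respondent shown)
    index_items = np.array([3, 4, 3, 2])          # hypothetical items on a 1-4 scale
    index_score = index_items.mean()

    # Bonferroni correction: with k dose-response tests run across indices/modules,
    # each test is judged against alpha / k instead of alpha.
    alpha, k = 0.05, 12                            # 12 tests is an assumed count
    adjusted_alpha = alpha / k

    print(f"index score = {index_score:.2f}, per-test alpha = {adjusted_alpha:.4f}")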

  7. PEEC Cross-Program Survey Results 2003-2004: Changes in Educator Practice • 342 educator surveys • Very diverse sample (4 programs in 55 schools; whole-school change & professional development models; urban, rural, suburban; grades K-12) • Averages from an aggregate of 12 survey items show PEEC dose accounts for 19% of variance in Overall Educator Practice • Also at ΔR² ≥ .10: Educator Engagement/Growth; Use of Local Places for Teaching; Student Engagement in Learning; Student Civic Engagement; Student Time Spent Outdoors; Student Stewardship Behavior; Community Civic Engagement; Community Planning/Decision-Making Process • Approx. 200 more educator survey responses to be added in 2004-06 (2 more programs, approx. 20 more schools)
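
One common reading of ΔR² is the gain in R² when the dose composite is added to a baseline model; the sketch below works under that assumption, with simulated data and an assumed covariate rather than the actual PEEC model specification.

    # Sketch of a delta-R^2 (increment in explained variance) calculation: the gain
    # in R^2 when the dose composite is added to a baseline model. Data, covariate,
    # and model form are hypothetical.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 342                                            # matches the sample size above
    years_teaching = rng.uniform(1, 30, n)             # assumed background covariate
    dose = rng.uniform(1, 4, n)
    outcome = 2.0 + 0.01 * years_teaching + 0.3 * dose + rng.normal(0, 0.4, n)

    X_base = sm.add_constant(np.column_stack([years_teaching]))
    X_full = sm.add_constant(np.column_stack([years_teaching, dose]))

    r2_base = sm.OLS(outcome, X_base).fit().rsquared
    r2_full = sm.OLS(outcome, X_full).fit().rsquared

    print(f"delta R^2 from adding dose = {r2_full - r2_base:.2f}")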

  8. Whole School “Tipping Point” Hypothesis

  9. Survey Analysis Provides Finer-grained View of Cultural “Tipping” at Haley Elementary, 2006 • Pre-post measures spanning 3 years show large, significant effects • Dose-response calculation for the aggregate of pre- and post-responses shows CO-SEED dose accounts for 45% of variance in Overall Educator Practice Change • Dose-response calculation for post-only responses shows CO-SEED dose accounts for 12% of variance in the same outcome • Comparing zero-dose respondents only: in 2003, Mean = 1.9, n = 20, SD = .51; by 2006, Mean = 2.8, n = 4, SD = .60 • “Tipped but not completely”: some mechanism is still needed to sustain the change
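
The zero-dose comparison above can be checked directly from the reported summary statistics; the sketch below uses Welch's t-test, which is an assumed choice, and the very small 2006 group (n = 4) limits what the result can show.

    # Sketch: comparing the zero-dose groups from the reported summary statistics
    # (2003: M = 1.9, SD = .51, n = 20 vs. 2006: M = 2.8, SD = .60, n = 4).
    from scipy import stats

    result = stats.ttest_ind_from_stats(
        mean1=1.9, std1=0.51, nobs1=20,
        mean2=2.8, std2=0.60, nobs2=4,
        equal_var=False,                 # Welch's t-test (assumed choice)
    )
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")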

  10. Reflections on Using a Dose-Response Measurement Strategy • Single measurement event (without giving up the possibility of pre-post comparisons, either of independent groups or matched pairs) • Conducive to both aggregation and disaggregation (across time and across programs) • Effect size is readily apparent (ΔR² or percent variance) • Requires another level of statistical understanding from users of the evaluation data • Be open to site-specific construction of dose

  11. First Grade Academic Achievement as a Function of CO-SEED/Community-Based Units (Young Achievers School, 2005) Design: • Principal says, “One thing we know is that kids’ writing is much more interesting, complex, and detailed if they’ve had rich experience… The current first grade has about a third of the kids who didn’t have Kindergarten here, and in general it is breathtaking the difference in the academic achievement. Our Kindergarten has the strongest place-based education in the school, especially with language development.” First grade is also strong. • 3 measures (Direct Reading Assessment, TERC Math, YA Writing Assessment) tracked in YA’s assessment database • Compared 1st graders with one vs. two years of exposure to strong PBE teachers Young Achievers School, Jamaica Plain, MA

  12. First Grade Academic Achievement as a Function of CO-SEED/Community-Based Units (Young Achievers School, 2005) Findings: • 1st graders with more place-based education outperformed peers on all measures Young Achievers School, Jamaica Plain, MA

  13. Effects of CO-SEED on Standardized Test Scores (MCAS) at the Beebe Health & Environmental Magnet School (Beebe School, 2005, Massachusetts) • CO-SEED worked with Beebe from 1999-2003 and helped secure CSR funding to continue the work from 2002-2005 • Several lines of evidence suggest that the environmental theme has become embedded in the school culture • Before analyzing MCAS scores, we predicted that Beebe would deviate from the typical pattern and increase performance relative to the district and/or state in the following content areas: • Math (mostly near 3rd and 4th grade) • English Language Arts – Writing • Life Science • Earth Science Design: Beebe School, Malden, MA

  14. Effects of CO-SEED on Standardized Test Scores (MCAS) at the Beebe Health & Environmental Magnet School (Beebe School, 2005, Massachusetts) • Typical pattern: the state performs highest, then Beebe, then the district Findings: Beebe School, Malden, MA

  15. Effects of CO-SEED on Standardized Test Scores (MCAS) at the Beebe Health & Environmental Magnet School (Beebe School, 2005, Massachusetts) • Only a few deviations from the typical pattern (6th & 8th grade Math, 8th grade Life & Earth Science) Findings: Beebe School, Malden, MA

  16. Effects of CO-SEED on Standardized Test Scores (MCAS) at the Beebe Health & Environmental Magnet School (Beebe School, 2005, Massachusetts) • Analysis mildly supported the prediction in two areas (Math & Earth Science) • Analysis strongly supported the prediction in one area (Life Science) • Analysis did not support the prediction in one area (Writing; the typical pattern persisted in both grades 4 and 7) • Future prediction: the strongest results will continue to show up in the upper grades (i.e., where students have the highest cumulative dose of the environmental/place-based theme integration) Findings: Beebe School, Malden, MA

  17. References • Coleman, J. S., Campbell, E. Q., Hobson, C. J., McPartland, J., Mood, A. M., Weinfield, F. D., & York, R. L. (1966). Equality of educational opportunity. Washington, DC: U.S. Government Printing Office. • Duffin, M., Powers, A. L., Tremblay, G., & PEER Associates. (2004). Place-based Education Evaluation Collaborative: Report on cross-program research and other program evaluation activities, 2003-2004. Retrieved October 6, 2004 from http://www.peecworks.org/PEEC/PEEC_Reports/S0019440A. • Duffin, M., & PEER Associates. (2006). Portrait of an urban elementary school: Place-based education, school culture, and leadership; An evaluation of Project CO-SEED at the Dennis C. Haley Elementary School, 2003-2006. Retrieved October 6, 2006 from http://www.peecworks.org/PEEC/PEEC_Reports/S00FE7771-0100A9B6. • Marzano, R. J. (2003). What works in schools: Translating research into action. Alexandria, VA: Association for Supervision and Curriculum Development. • Wang, G., Zheng, Z., Heath, G., Macera, C., Pratt, M., & Buchner, D. (2002). Economic burden of cardiovascular disease associated with excess body weight in U.S. adults. American Journal of Preventive Medicine, 23(1), 1-6.

  18. Using a Dose-Response Analysis Strategy to Measure Outcomes of Place-based Education Prepared by: Michael Duffin, PEER Associates, Inc. With support from: the Place-based Education Evaluation Collaborative (PEEC) For presentation at: the NAAEE Pre-conference Research Symposium, October 10, 2006 For more details, see www.PEECworks.org or www.PromiseOfPlace.org

  19. Using a Dose-Response Analysis Strategy to Measure Outcomes of Place-based Education [Chart: Program “Dose” (Exposure + Implementation), from Less to More, plotted against Program “Response” (Measures of Intended Outcome), from Lower to Higher] Summary notes from the October 10, 2006 presentation at the NAAEE pre-conference symposium by Michael Duffin, PEER Associates. Duffin, M., Powers, A. L., Tremblay, G., & PEER Associates. (2004). Place-based Education Evaluation Collaborative: Report on cross-program research and other program evaluation activities, 2003-2004. Retrieved October 6, 2004 from http://www.peecworks.org/PEEC/PEEC_Reports/S0019440A. Duffin, M., & PEER Associates. (2006). Portrait of an urban elementary school: Place-based education, school culture, and leadership; An evaluation of Project CO-SEED at the Dennis C. Haley Elementary School, 2003-2006. Retrieved October 6, 2006 from http://www.peecworks.org/PEEC/PEEC_Reports/S00FE7771-0100A9B6. For more details and related information, see www.PEECworks.org or www.PromiseOfPlace.org
