NASA Postdoctoral Program Evaluation of the NASA Innovations in Climate Education Portfolio Ann M. Martin ann.m.martin@nasa.gov NASA Postdoctoral Program NASA Innovations in Climate Education (NICE) Minority University Research & Education Program (MUREP)
Acknowledgments • Ann was supported by an appointment to the NASA Postdoctoral Program at NASA Langley Research Center, administered by Oak Ridge Associated Universities through a contract with NASA. • The NICE Management Team (Advisors: Margaret Pippin & Lin Chambers; Project Managers: Kate Spruill & Monica Barnes; Susan Givens, Dan Oostra, Andrea Geyer, Mary Jo Leber, Denice Dublin, Cassandra Small) • NICE Awardees & Evaluators • The Tri-Agency Evaluation Working Group • Ruth Anderson, John Baek, Elisabeth Barnett, Rachel Becker-Klein, Bob Bleicher, John Burgette, Beth Cady, Hilarie Davis, John Fraser, Ellen Guisti, Carol Haden, Jim Hammerman, Kathy Haynie, Sue Henderson, Kimberle Kelly, Joan LaFrance, Shirley Lee, Teresa Lloro-Bidart, Carole Mandryk, Eugene Maurakis, Gerry Meisels, Jim Minstrell, Laura Munski, Mike Odell, Frank Rack, Texas Gail Raymond, Christa Smith, Martin Storksdieck, Sarah Yue, Dan Zalles, & others
Goals of Evaluation: NASA & NICE • To describe and understand the program • To drive program improvement • To promote and disseminate findings from funded projects • To capture, through qualitative analysis, program stories that simple quantitative metrics cannot • To make evaluation a more firmly embedded part of NASA-funded education and public outreach
NICE Evaluation Model • Evaluation Support & Capacity Building • Rolling project findings up to program level • Tri-agency logic model • Webinars • Telecons • Conference meet-ups • Online evaluator workspace
Outline: Evaluation Questions • What and how are projects evaluating? • To what extent are NICE program-level integration activities successful? • To what extent do NICE-funded projects partner with each other and with other institutions? • What challenges are encountered, and what lessons are learned, during project implementation? • What overall outcomes for the project portfolio can be assessed using available data? What promising practices are emerging from the NICE portfolio?
Outline: Major Findings • Meta-evaluation suggests opportunities for more varied evaluation practices across the portfolio. Building evaluation literacy and capacity within NASA would better support awardees and programs in conducting their evaluations.
Outline: Major Findings • Analysis of PI experience surveys demonstrates that the unique program model of NICE, including the tri-agency collaboration, is highly valued and successful. Awardees turn to the NICE team as both an “insider friend” and a “customer service representative.”
Outline: Major Findings • Social network analysis of the NICE partnership network demonstrates that NICE extends far beyond the 75 projects it has funded, reaching 377 unique institutions. NICE is driving a complex national network of institutions working on climate change education, spanning a range of institution types and a range of minority-serving institutions.
Outline: Major Findings • Projects encounter challenges with project timelines, participant recruitment, participants’ baseline quantitative skills, and the use of technology in classroom settings. The lessons they learned point to the importance of flexibility, communication, organization, and strategic planning.
Outline: Major Findings • NICE’s funded projects have contributed to the overall goals of the program, and to the goals articulated by the tri-agency collaboration engaged in climate change education, as demonstrated by outcomes data and evaluative evidence.
Question 1: What and how are projects evaluating? • Meta-Evaluation of the NICE Portfolio • Sample language from the 2010 GCCE solicitation
Instruments & Protocols • A lack of standardized climate change concept inventories, together with each project’s specific content needs, led most projects to develop their own custom instruments.
Summary • Strong Focus on: • Summative Evaluation • Nonexperimental Designs • Self-report data • Match between project goals and evaluation questions • Weaker Focus on: • Formative or Front-End Evaluation • Comparison Groups • Cause-and-effect relationships • Direct measurement • Details of analysis
Meta-evaluation: Recommendations • As program officers • Basic evaluation “literacy” or evaluation capacity on teams • Know what you are looking for (objectives, outcomes, audiences, etc.) • As writers of solicitations • Provide some guidance for PIs and for evaluators • Model what you are looking for – e.g., logic models; clear, concise statements of outcomes and impacts • Recognize evaluation expertise as critical expertise
Question 2: To what extent are NICE program-level integration activities successful? • Evaluation of the novel NICE program model
NICE as a Community of Practice • Awardees highly value their relationships with NICE, with tri-agency projects, and with each other.
Awardee Reflections on Relationships with NICE • "I found communication from the management team to be exceptional. From day 1 . . . it was made clear that my colleagues and I are members of a learning community facilitated by an active management team focused on providing as many opportunities as possible for learning from each other." • "I had far more involvement with the program director on this grant than from any other grant . . . Keep that up!” • "I . . . never felt 'lost in the crowd,' as can happen with some large programs.” • "This [tri-agency PI] meeting is a showcase of best practices . . . and should be observed and replicated."
Program Model Evaluation: Recommendations • As service providers for awardees • Continue pushing for personal, one-on-one relationships • Focus on stability & usability • NASA reporting is the biggest burden, and requires the most energy to flow smoothly! • As facilitators of a community • Remain engaged in broader (national) conversations and efforts related to the climate change education community • Use this expertise to proactively “match” awardees with desired resources
Question 3: To what extent do NICE-funded projects partner with each other and with other institutions? • Social Network Analysis of the NICE Partnership Network
NICE partnership network diagram legend: projects in bright orange; non-project partners in charcoal
MSIs in the NICE Network • Legend: HBCU (royal blue), Tribal Serving Institution (pale blue), Minority Serving School District (bright orange), HSI (pale orange), “Other” MSI (black)
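As a rough illustration of how a partnership graph like the one in the diagrams above can be assembled and colour-coded, here is a minimal Python/networkx sketch. The edge list, institution names, node categories, and colour choices are hypothetical placeholders for illustration, not NICE data.

```python
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical input: one row per partnership reported by a funded project.
edges = [
    ("Project A", "University X"),
    ("Project A", "School District Y"),
    ("Project B", "University X"),      # a shared partner links two projects
    ("Project B", "Tribal College Z"),
]

# Hypothetical node categories, mirroring the legend above.
node_type = {
    "Project A": "project",
    "Project B": "project",
    "University X": "HSI",
    "School District Y": "minority-serving district",
    "Tribal College Z": "tribal-serving",
}

G = nx.Graph()
G.add_edges_from(edges)

# Map each category to an approximate colour; anything unlabeled gets charcoal.
palette = {
    "project": "darkorange",
    "HSI": "navajowhite",
    "minority-serving district": "orange",
    "tribal-serving": "lightblue",
}
colors = [palette.get(node_type.get(n), "dimgray") for n in G.nodes]

nx.draw_networkx(G, node_color=colors, with_labels=True)
plt.axis("off")
plt.show()
```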
MSIs in the NICE Network • In terms of number of relationships and embeddedness in the overall network, no statistically significant difference is observed between the projects hosted at MSIs and the full population of projects, nor between the MSI nodes and the network as a whole.
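The slides do not specify the statistical test behind this finding; the sketch below shows one way such a comparison could be run, using node degree for "number of relationships," eigenvector centrality as a stand-in for "embeddedness," and a Mann-Whitney U test from scipy. The graph `G`, the `is_msi` node attribute, and the choice of metrics are assumptions for illustration.

```python
import networkx as nx
from scipy.stats import mannwhitneyu

def compare_msi_to_rest(G: nx.Graph) -> dict:
    """Compare MSI nodes to all other nodes on degree and a centrality proxy."""
    degree = dict(G.degree())
    # Eigenvector centrality as a proxy for embeddedness (an assumption, not the NICE method).
    embeddedness = nx.eigenvector_centrality(G, max_iter=1000)

    msi = [n for n, data in G.nodes(data=True) if data.get("is_msi")]
    rest = [n for n in G.nodes if n not in set(msi)]

    results = {}
    for label, metric in [("degree", degree), ("embeddedness", embeddedness)]:
        stat, p = mannwhitneyu([metric[n] for n in msi],
                               [metric[n] for n in rest],
                               alternative="two-sided")
        results[label] = {"U": stat, "p_value": p}
    return results
```

A large p-value from a test like this would be consistent with the "no significant difference" result reported above.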
Network Analysis: Recommendations • Shaping NICE through Activities & Selection • Efforts to provide integration activities and networking opportunities have shaped a true national network; continue to focus on partnership (Ask NICE, PI meetings, etc.) • Identify “touchpoints” in the network, around which partnerships cluster. • Facilitating Partnership & Collaboration • Strive to keep the 4 new NICE-T projects tightly connected to colleagues as previous projects leave (37 projects have sunsetted)
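One hedged way to operationalize the "touchpoints" idea in the recommendation above is to rank institutions by betweenness centrality (how often a node sits on shortest paths between others) alongside raw degree; high scorers are candidate hubs around which partnerships cluster. The function below is an illustrative sketch, not the method used in the NICE analysis, and the top-ten cutoff is arbitrary.

```python
import networkx as nx

def candidate_touchpoints(G: nx.Graph, top_n: int = 10) -> list:
    """Rank institutions that many partnership paths run through."""
    betweenness = nx.betweenness_centrality(G)
    degree = dict(G.degree())
    ranked = sorted(G.nodes, key=lambda n: (betweenness[n], degree[n]), reverse=True)
    return [(n, round(betweenness[n], 3), degree[n]) for n in ranked[:top_n]]
```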
Question 4: Challenges & Lessons Learned • Or: “What challenges are encountered, and what lessons are learned, during project implementation?”
(Some!) Lessons Learned • Flexibility • Assembling a team • Tradeoff between more content and deeper content • Resources • Other climate change education resources • Teachers! • New standards • Evaluation
Challenges & Lessons Learned: Recommendations • Award timelines – lengthen or change? • Current idea: develop, pilot, revise, implement, evaluate, disseminate • What about breaking that down? “Develop/Pilot/Revise” vs. “Implement and Evaluate” vs. “Disseminate & Scale Up.” • Recognize technology as both an opportunity and a barrier • Recognize the demands on time, expertise, and budget • Allocate specific time and resources to contend with these aspects • Keep the level of innovation manageable
Question 5: Outcomes & Promising Practices • Or: “What overall outcomes for the project portfolio can be assessed using available data? What promising practices are emerging from the NICE portfolio?” • Synthesis of research & evaluation findings
(Some!) Promising Practices • Professional development for higher education faculty • Incorporation of implementation planning time into professional development • Connecting climate change content to factors that motivate students (social awareness, career interest, multicultural or family-oriented approaches to learning and knowing) • Targeting development of the quantitative skills that underpin scientific research skills
Outcomes & Promising Practices: Recommendations • NASA Project Monitoring: • Better output tracking, to capitalize on all of the resources & educational tools produced. • Making better use of evaluations (first step: getting our hands on them!) • Further research into key practices: • Opportunities for longer-term evaluation in classrooms (a realistic timeline is longer than a NICE award) • Higher education students desire pipelines and course sequences that make climate change relevant to their degree programs and careers, but faculty engagement projects were most successful when they “plugged” climate change into existing offerings.
NICE Evaluation Summary • NICE has developed an innovative model for NASA-funded grant/cooperative agreement management • Awardee experiences are significantly strengthened by constant contact with NASA (expert colleague/customer service) • Awardees and their projects benefit from involvement in a national network of climate change educators • NICE’s partnership focus has extended its reach well beyond the 75 funded projects • Evaluation continues to be a frontier for NASA education
Evaluation Within NASA • Coordinated strategy • e.g., Paperwork Reduction Act: many small projects duplicate effort as each tries to climb over the same large barrier • Evaluation Capacity & Support • e.g., Logic Models: a useful tool for communicating the program, keeping on task, and enabling evaluation • Use of project evaluations • How can we ensure that projects use this critical piece of what they fund?
Questions? • Contact: • Ann Martin, ann.m.martin@nasa.gov or amartin10@gmail.com • 757-864-5191