
The Logic Model as Compass: Guiding Program Staff Through the Consequences of Evaluation


Presentation Transcript


  1. The Logic Model as Compass: Guiding Program Staff Through the Consequences of Evaluation Ellen Roscoe Iverson, Carleton College, eiverson@carleton.edu; John A. McLaughlin, Managing for Results, macgroupx@aol.com; Cathryn Manduca, Carleton College, cmanduca@carleton.edu. This project is supported by the National Science Foundation (NSF) Division of Undergraduate Education under Grants No. 0127310, 0127141, 0127257, 0127018, and 0618482. Opinions, findings, and conclusions or recommendations expressed herein are those of the authors and do not necessarily reflect the views of the NSF. On the Cutting Edge is sponsored by the National Association of Geoscience Teachers (NAGT) and is part of the Digital Library for Earth System Education (DLESE).

  2. Overview • On the Cutting Edge program • Goals of evaluation • Logic models • Evaluation methods • Results

  3. But first…have you used: • Computer-based interventions? • Pedagogically-based professional development for faculty? • Logic models as part of iterative design?

  4. On the Cutting Edge • Delivered at national level to geoscience faculty • Combines residential workshops and websites for faculty professional development • Supported by grants from the National Science Foundation • Began workshops in 2002 • Funded for 3 more years

  5. On the Cutting Edge Workshops (3 to 5 days): • Emerging themes (2/year) • Teaching X (1/year) • Course design (online and face-to-face) • Career preparation and management (2/year) Website: • Instructional materials • Datasets and tools • Pedagogical resources • Tutorials • Course resources • Assessment tools • Bibliographies • Visualizations

  6. http://serc.carleton.edu/NAGTWorkshops

  7. Developing Evaluation Purpose According to Guskey (2000)*, evaluators of professional development make three common mistakes: • Collect and report only descriptive information (who was involved). • Focus on participants' attitudes (did they think their time was well spent?) rather than on actual changes in participant knowledge or skill. • Keep evaluations brief and limit opportunities for application. *Guskey, T.R. (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press.

  8. Initial Evaluation Purpose • Continuously improve the workshops and website • Create information for our community about what does and does not work in professional development for its members

  9. Goals – 5 basic questions • Was the program implemented as planned? • What was the quality of the implementation? • What was the effect of the program on the participants? • What was the impact of the program? • What caused the observed effects and impacts?

  10. Logic Models

  11.–14. Workshop Logic Model (the model diagram, built up across four slides; figures are not reproduced in the transcript)

  15.–17. Website Logic Model (the model diagram, built up across three slides; figures are not reproduced in the transcript)
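
Since the logic model diagrams survive only as slide titles here, a rough illustration of the structure such diagrams conventionally follow may help: a chain from inputs to activities, outputs, outcomes, and impacts. The minimal Python sketch below shows that generic structure; every entry in it is a hypothetical placeholder, not the project's actual workshop or website model.

```python
# A minimal sketch of the conventional logic-model chain
# (inputs -> activities -> outputs -> outcomes -> impacts).
# All entries are hypothetical placeholders, not the actual
# On the Cutting Edge models, which appeared only as diagrams.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products
    outcomes: list[str] = field(default_factory=list)    # short-term changes
    impacts: list[str] = field(default_factory=list)     # long-term changes

workshop_model = LogicModel(
    inputs=["NSF funding", "workshop leaders", "participant time"],
    activities=["run 3-5 day residential workshops"],
    outputs=["workshops delivered", "materials posted to website"],
    outcomes=["participants adopt new teaching practices"],
    impacts=["improved undergraduate geoscience learning"],
)

for stage in ("inputs", "activities", "outputs", "outcomes", "impacts"):
    print(f"{stage}: {getattr(workshop_model, stage)}")
```

Laying the model out this way makes the evaluation logic explicit: each of the five basic questions on slide 9 probes a different link in the chain.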

  18. Methodologies Workshops: • Road checks • End-of-workshop surveys • Observations and interviews • Online surveys • Baseline survey • Telephone interviews Website: • Web statistics reports • Pop-up survey • Awareness poll • External heuristic review of the website • Focus groups • Telephone interviews • Pilot • Embedded assessment • Online discussion artifacts
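
To illustrate just one of these instruments, the sketch below tabulates end-of-workshop survey ratings. The data format (participant ID plus a 1–5 rating) is an assumption made for illustration, not the project's actual survey instrument.

```python
# A minimal sketch of tabulating end-of-workshop survey ratings.
# The response format (participant id, 1-5 rating) is hypothetical.
from statistics import mean

responses = [
    ("p01", 5), ("p02", 4), ("p03", 5), ("p04", 3), ("p05", 4),
]

ratings = [r for _, r in responses]
print(f"n = {len(ratings)}")
print(f"mean rating = {mean(ratings):.2f}")
print(f"% rating 4 or 5 = {100 * sum(r >= 4 for r in ratings) / len(ratings):.0f}%")
```

Note Guskey's caution above: such satisfaction tallies are only a starting point, which is why the design pairs them with observations, interviews, and embedded assessment of actual practice.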

  19.–24. Results, Effects, and Impacts (chart slides; figures are not reproduced in the transcript)

  25. Summary

  26. Implications for the future • Snowball sampling to evaluate website-only participants (see the sketch below) • Embedded assessment • Repeat the baseline survey • Formal leadership program • 100% of participants contributing to the website
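
Snowball sampling reaches a hard-to-enumerate population (here, faculty who use the website but never attended a workshop) by asking each respondent to nominate others. The sketch below shows the idea as a wave-limited breadth-first expansion over a referral graph; the referral data and the `max_waves` cutoff are illustrative assumptions, not the project's actual sampling design.

```python
# A minimal sketch of snowball sampling over a referral graph.
# Referral data are hypothetical; in practice, nominations would
# come from asking website users to name other website-only users.
from collections import deque

referrals = {
    "seed_a": ["u1", "u2"],
    "u1": ["u3"],
    "u2": [],
    "u3": ["u4", "u1"],  # repeat nominations are ignored
}

def snowball(seeds, referrals, max_waves=3):
    """Breadth-first expansion from seed respondents, wave by wave."""
    sampled = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        person, wave = frontier.popleft()
        if wave >= max_waves:
            continue  # stop expanding past the final wave
        for nominee in referrals.get(person, []):
            if nominee not in sampled:
                sampled.add(nominee)
                frontier.append((nominee, wave + 1))
    return sampled

print(sorted(snowball(["seed_a"], referrals)))
# -> ['seed_a', 'u1', 'u2', 'u3', 'u4']
```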

  27. For more information: http://serc.carleton.edu/NAGTWorkshops/evaluation.html
