Using Performance Measurements to Evaluate an Education Leadership Master's Program
Presented at: Northern Nevada Higher Education Assessment Conference, 2/6/04
By: Bill Thornton, UNR Department of Education Leadership; Gus Hill, UNR Department of Education Leadership; Tara Shepperson, UNR Department of Education Leadership
Why Do We Evaluate? "…The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, that there was no punishment more severe than eternally futile labor…" – Albert Camus, The Myth of Sisyphus
Defining Evaluation • Evaluation is… the systematic investigation of the merit, worth, or significance of an "object." – Michael Scriven
Framework for Program Evaluation
1. Engage stakeholders
2. Describe the program
3. Focus on the evaluation design
4. Collect credible data from multiple sources
5. Develop data-based conclusions
6. Feedback and improvement
Standards applied throughout: utility, feasibility, validity, reliability
1: Engage Stakeholders
• Pre-evaluation: early identification of disagreements in…
  • Definition of the problem
  • Priority activities
  • Priority outcomes
  • What constitutes "proof" of success
• Post-evaluation: get their help with…
  • Credibility of findings
  • Access to key players
  • Follow-up
  • Dissemination of results
1: Ed Leadership Stakeholders • Faculty, College of Ed., UNR • Students • School districts, students, teachers, administrators • The public: business, parents, community
2: Describe the Program
• How do logic models structure evaluations and promote continuous improvement?
• Clarity for department & stakeholders on:
  • What activities are delivered
  • Intended effects, outcomes, and relationships
  • Long-term effects vs. short-term effects
• Helps focus decision making:
  • Outcome evaluation (effects)
  • Process evaluation (activities)
  • Program improvements
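To illustrate how a logic model can structure both process and outcome questions, here is a minimal Python sketch. It is a hypothetical fragment for illustration only; the activities and outcomes named are assumptions, not the department's actual model.

```python
# Hypothetical sketch of a program logic model: activities -> intended effects.
# Activity and outcome names are illustrative assumptions, not the actual EL program model.
from dataclasses import dataclass, field

@dataclass
class LogicModelEntry:
    activity: str                                             # what the program does (process evaluation)
    short_term_outcomes: list = field(default_factory=list)   # near-term intended effects
    long_term_outcomes: list = field(default_factory=list)    # downstream intended effects

logic_model = [
    LogicModelEntry(
        activity="Coursework aligned to ISLLC standards",
        short_term_outcomes=["Students master leadership knowledge"],
        long_term_outcomes=["Graduates lead schools effectively"],
    ),
    LogicModelEntry(
        activity="Supervised internship with portfolio",
        short_term_outcomes=["Students exhibit specific performances"],
        long_term_outcomes=["Stronger administrator pipeline for districts"],
    ),
]

# Process evaluation asks: were the activities delivered as intended?
# Outcome evaluation asks: were the short- and long-term outcomes achieved?
for entry in logic_model:
    print(entry.activity, "->", entry.short_term_outcomes, "->", entry.long_term_outcomes)
```

Reading evaluation questions directly off such a model is one way a logic model "helps focus decision making."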
2: EL Master’s Program • Approximately 60 students • State requirements for certification • Program of studies • Exit exam
3: Focus on the Evaluation Design What questions are being asked? • What intervention was actually delivered? • Were impacts and outcomes achieved? • Was the intervention responsible for the impacts and outcomes? • What does the research indicate?
3: Evaluation of EL Program • Expectations: students master knowledge and exhibit specific performances • Assessment aligned with standards: validity, reliability • Basis for data-based decisions: program improvement, NCATE
3: Education Leadership Program Assessment • Student performance by course • Praxis exam results • Internship portfolio • Program completers survey
You Get What You Measure… "In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. … Today Poles have the world's heaviest furniture." (New York Times, 3/4/99)
4: Collect Credible Data from Multiple Sources
Choosing data collection methods. Typical factors might be:
• Time
• Cost
• Sensitivity of the issue
• "Hawthorne effect"
• Ethics
• Validity
• Reliability
• Utility, accuracy, & feasibility
Usually involves trade-offs.
4: For Example, ISLLC Standards (Validity) The Department of Education Leadership has adopted the standards of the Interstate School Leaders Licensure Consortium (ISLLC). • Vision of learning. • School culture & instructional program. • Management and operations. • Collaboration with families and community. • Integrity, fairness, and ethics. • Response to larger political, social, legal context.
5: Develop data-based conclusions Evaluators must justify conclusions & recommendations. • Can we consider the project a success? Examples: • Changes can be attributed to the project • The project reduced disparities • The project leaves a "legacy" • The project can be sustained long-term • Conclusions are data-based
5: Develop data-based conclusions Actual Results • Performance vs. a comparison/control group • Time sequence • Plausible mechanisms (or pathways toward change) • Accounting for alternative explanations • Similar effects observed in similar contexts
5: Initial results for evaluation • Student performance by course • Praxis exam results • Internship portfolio • Program completers survey
5A: Student performance assessment (progress assessment) • Students meet course expectations • Student field experience • Courses mapped to ISLLC standards (see the curriculum-map sketch below)
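One way to make "courses mapped to ISLLC standards" auditable is a simple curriculum map that flags any standard no course addresses. The sketch below is a hypothetical Python example; the course numbers and mappings are invented for illustration, not the department's actual map.

```python
# Hypothetical curriculum map: which ISLLC standards each course addresses.
# Course identifiers and mappings are invented placeholders, not actual program data.
ISLLC_STANDARDS = {
    1: "Vision of learning",
    2: "School culture & instructional program",
    3: "Management and operations",
    4: "Collaboration with families and community",
    5: "Integrity, fairness and ethics",
    6: "Larger political, social, legal context",
}

curriculum_map = {
    "EDL 700 (example course)": {1, 2},
    "EDL 710 (example course)": {3},
    "EDL 720 (example course)": {4, 5},
}

# Flag standards that no course in the map addresses.
covered = set().union(*curriculum_map.values())
missing = set(ISLLC_STANDARDS) - covered
print("Standards with no course coverage:",
      [ISLLC_STANDARDS[s] for s in sorted(missing)])
```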
5B: Praxis exam Compares to a national standard • Over 16,000 people took the Praxis II exam over the last year; 50% of them scored in the 640 to 740 range, so that band marks typical national performance. • How have our program completers fared? (Through summer, 2003; a comparison sketch follows below.)
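A minimal sketch of how completers' scores could be compared against the national middle-50% band (640 to 740). The scores in the list are placeholders, not actual program results.

```python
# Hypothetical comparison of program completers' Praxis II scores against the
# national middle-50% band (640-740). Scores below are placeholder data only.
NATIONAL_Q1, NATIONAL_Q3 = 640, 740

completer_scores = [655, 700, 720, 745, 690]   # placeholder data, not actual results

above_band = sum(s > NATIONAL_Q3 for s in completer_scores)
within_band = sum(NATIONAL_Q1 <= s <= NATIONAL_Q3 for s in completer_scores)
below_band = len(completer_scores) - above_band - within_band

print(f"Above the national middle band:  {above_band}")
print(f"Within the national middle band: {within_band}")
print(f"Below the national middle band:  {below_band}")
```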
What does the Praxis Cover? • Determining pupil & community needs (9%) • Curriculum design & instructional leadership (13%) • Development of staff & program evaluation (15%) • School management (34%) • Individual & group leadership skills (29%)
5C: Internship portfolio • Demonstrates skill and knowledge • Aligned with curriculum map • Produces a student portfolio
5D: Program Completers Survey • Mapping Standards • Critical Incidents • Interviews
6: Feedback and Improvement
Maximizing use of results:
• Provide feedback on major outcomes (data-based)
• Explain results clearly
• Implement a continuous improvement plan
  • Strategic planning process
  • Vision, mission, goals
• Remember the audience
  • How will they use the information provided?
  • How much time will they be willing to spend reading and assimilating the material?
6: Improvements to EL Master's Program • Refined curriculum map • Required field experience for each course • Added courses: Data-Based Decision Making and SPED Law
Ending Quote The establishment of an orthodox evaluation methodology is no different from the establishment of a state religion. Officially telling you what method to use is only one step removed from officially telling you what results to find. At that point, utilization of findings will cease to be an issue - there will be nothing to use, only orders to follow. - Patton (1990)