Chapter 11 - Policy Evaluation: Determining If the Policy Works. Dr. Dan Bertrand, LEEA 554
Definitions • Evaluation - systematic investigation of the worth or merit of an object. • Projects - educational activities that are provided for a defined period of time. • Stakeholders - individuals or groups that may be involved in or affected by a program evaluation.
History of Educational Evaluation • Early evaluation can be traced to pre-Civil War Boston. • The first large-scale evaluation was conducted by Joseph Rice, who studied spelling instruction. • In the 1930s, Ralph Tyler (OSU) directed the Eight-Year Study, which set the guidelines for program evaluations in terms of objectives.
History of Evaluation • The War on Poverty • Passage of the Elementary and Secondary Education Act of 1965 mandated the evaluation of Titles I and II. • The Professionalization of Evaluation • Evaluation emerged as a field and became an integral part of the education profession. • Professional journals were established. • Government and universities established centers to conduct research and development. • In the mid-1990s, ASCD and PDK published evaluation handbooks.
Characteristics of Policy Evaluation • The Evaluation Process - 7 steps • Determine the goals of the policy • Select indicators • Select or develop data collection instruments • Collect data • Analyze and summarize data • Write the evaluation report • Respond to the evaluators' recommendations
Criteria for Judging Evaluation • From the Program Evaluation Standards of 1994: 4 categories with 30 standards 1) Usefulness - the evaluation must be done by a qualified team. 2) Feasibility - the evaluation must be doable without imposing unreasonable strains on the school. 3) Propriety - the evaluation must conform to accepted norms for research. 4) Accuracy - the evaluation must convey technically adequate information about the program being evaluated.
Purposes of Evaluation • Summative Evaluation - to hold the implementers of the policy accountable. • May assess the quality of the policy over time. • Formative Evaluation - enables the implementers to make changes as needed to improve the policy. • On-going and recurrent. • Pseudo-Evaluations - unethical in nature. • Politically controlled or a public relations evaluation.
Methodologies Used • Quantitative - involves the collection and statistical analysis of numeric data • Qualitative - collection of verbal or pictorial data • Triangulation - collection of several types of data for comparison • Holistic - combination of quantitative and qualitative
Facilitating Meaningful Policy Evaluations • Evaluation is Political • Programs and projects are products of the political process. • Reports influence what happens in the political arena. • Careers and reputations of individuals depend on the outcome of the evaluation. • Political Players in the Evaluation Arena • Policymakers, policy implementers, clients, and evaluators
Maneuvers to Prevent a Good Evaluation • Blocking the evaluation and preventing it from occurring. • Shaping the criteria so the desired outcome is assured. • Mobilizing clients against the evaluators. • Implementers making data gathering difficult or impossible. • Attacking the quality of the evaluation upon its completion.
5 Key Steps to a Good Evaluation • Building evaluation in early • Communicating with stakeholders • Selecting indicators at the start • Building in on-going data collection • Choosing evaluators - inside or outside the organization
Acting on the Evaluation Report • Inaction - do nothing and maintain the policy • Minor Modifications • Major Modifications • Replacement • Consolidation • Splitting • Decrementing • Termination
Activity • Case Study - page 329 • News Story for Analysis - page 330