Research and Evaluation Skills
R. Martin Reardon's summary of Chapter 15: Glickman, C. D., Gordon, S. P., & Ross-Gordon, J. M. (2009), pp. 200-224
The rise in school-based research
• Deliberate collection of information, and
• Use of this information for making instructional decisions in:
  • Curriculum change
  • Lesson planning
  • Professional development
  • Assistance to teachers
• Westinghouse effect?
Quantitative vs. Qualitative
• Epistemology
  • Quant: single, objective reality, external to the investigative process
  • Qual: many perspectives, all valid
• Goals
  • Quant: explain causes, prediction, generalization
  • Qual: describe, honor multiple perspectives
• Role of researcher
  • Quant: detached, objective
  • Qual: immersed, aware of bias
• Importance of context
  • Quant: abstract from context
  • Qual: context and observation concurrent
• Role of values
  • Quant: ideally value-free
  • Qual: acknowledge effect of values framework
• Cause and effect
  • Quant: identify links between cause and effect
  • Qual: intertwined; separate identification not key to understanding
• Research design
  • Quant: statistics validate or disconfirm hypotheses; randomization; “manufacturing” (see the sketch after this list)
  • Qual: grounded theory; triangulation; “hunting and gathering”
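To make the "research design" contrast concrete, here is a minimal sketch of the kind of quantitative analysis referred to above: statistics used to confirm or disconfirm a hypothesis about two groups. The scores, group labels, and the 0.05 threshold are illustrative assumptions, not data from the chapter; scipy is assumed to be available.

```python
# A minimal sketch of the quantitative "research design" entry above:
# a two-sample t-test comparing hypothetical post-test scores for a
# program group and a comparison group. Scores and the 0.05 threshold
# are illustrative assumptions, not data from the chapter.
from scipy import stats

program_scores = [78, 85, 82, 90, 74, 88, 81, 79]     # hypothetical
comparison_scores = [72, 80, 75, 83, 70, 78, 76, 74]  # hypothetical

t_stat, p_value = stats.ttest_ind(program_scores, comparison_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# In the quantitative framing, a small p-value is read as evidence against
# the null hypothesis that the two groups do not differ.
if p_value < 0.05:
    print("Difference unlikely to be due to chance alone (at the 0.05 level).")
else:
    print("No statistically significant difference detected.")
```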
Mixed Design
• Incorporates the best of both the quantitative and qualitative approaches
• The issues you are seeking to address determine the best approach:
  • The context of the study (school history, demographics, culture, climate…)
  • The goals of the study
  • The values & skills of the researcher(s)
  • The resources available for the study
  • The values of participants & the time required of them
  • The values & needs of the audience for the study
1: Specific Instructional Program Evaluation
Six steps:
• 1. Evaluate the original needs assessment
  • Data sources: student school records, teachers' responses to a survey
• 2. Evaluate the fit of the written plan with the needs assessment
  • Data sources: new program goals, objectives, etc. (compared to the needs assessment)
• 3. Evaluate readiness of stakeholders (in-service training of them)
  • Data sources: students, teachers, records of PD activities; interviews, rating scales, document reviews
• 4. Evaluate fit between implementation and plan
  • Data sources: teachers, lesson observations, resource utilization; self-reports, student products, interviews
• 5. Evaluate intended and unintended outcomes
  • Data sources: test scores, awards; student school records, lesson observations, surveys
• 6. Compare costs to benefits (see the sketch after this list)
  • Data sources: personnel & resource costs, compared to the outcomes in Step 5
• Typical evaluations include only Step 5?
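As referenced in Step 6 above, the cost-to-benefit comparison can be simple arithmetic once Step 5 has produced an outcome measure. A minimal sketch follows, assuming the outcome is summarized as a mean test-score gain; every figure is a hypothetical placeholder, not a value from the chapter.

```python
# A minimal sketch of Step 6 (compare costs to benefits), assuming the
# Step 5 outcome is summarized as a mean test-score gain. All figures
# below are hypothetical placeholders, not values from the chapter.

personnel_cost = 24_000.0   # hypothetical: stipends, substitute coverage
resource_cost = 6_000.0     # hypothetical: materials, consultant fees
total_cost = personnel_cost + resource_cost

students_served = 150       # hypothetical
mean_score_gain = 4.2       # hypothetical mean gain on the Step 5 outcome measure

cost_per_student = total_cost / students_served
cost_per_point_of_gain = total_cost / (mean_score_gain * students_served)

print(f"Total cost:              ${total_cost:,.0f}")
print(f"Cost per student served: ${cost_per_student:,.2f}")
print(f"Cost per point of gain:  ${cost_per_point_of_gain:,.2f}")
```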
Key Decisions for Specific Instructional Program Evaluation
• Purpose? Formative or summative
• Who will evaluate? Include teachers
• Questions? Specific to each of the six steps above
• Data? Sources & methodology
• Analysis? Determined by questions & data
• Report? Determined by audience
  • Purpose, description, evaluation questions, methodology, results & conclusions, recommendations
2: Overall Instructional Program Evaluation
• Phase 1: Select the areas
  • Steering committee of key stakeholders
  • 14 suggested broad areas for evaluation (pp. 221-222); delete or add, but the suggestion is that all 14 be covered
  • Outline the rest of the process & the involvement of consultants
• Phase 2: Identify the specific questions
  • Large-group session: teachers, staff, parents, community members
  • Aim: form small planning teams, one per area for evaluation
  • Each small team develops specific evaluation questions
  • Each team's questions are vetted by the general session, then re-worked by each team & re-presented
  • The entire body votes on the inclusion of each question
  • Questions subsequently re-worked by the steering committee & consultant for consistent format, etc.
• Phase 3: Design the evaluation
  • Small teams propose data sources & methods; the steering committee makes the final decision
  • Design data-gathering instruments (suggested by small teams; coordinated by the steering committee)
• Phase 4: Gather & analyze data (see the sketch after this list)
  • "Best match": who will collect what & which tools will be used
  • Teachers & staff contribute to the analysis; not necessarily separate from collection
• Phase 5: Prepare & present the report
  • Report not only each area, but the relationships among them
  • Prioritized recommendations directly related to the study's results & conclusions
  • Involve key stakeholders in presentations of the report
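As noted under Phase 4, much of the analysis can be simple descriptive summary that teachers and staff can share. A minimal sketch, assuming 1-5 rating-scale survey data grouped by evaluation area; the area names and ratings are hypothetical, not taken from the chapter's 14 suggested areas.

```python
# A minimal sketch of one Phase 4 analysis task: summarizing 1-5
# rating-scale survey responses by evaluation area so small teams and
# the steering committee can compare areas. Area names and ratings are
# hypothetical.
from statistics import mean

responses = {
    "Curriculum": [4, 5, 3, 4, 4, 5],
    "Professional development": [3, 3, 4, 2, 3, 4],
    "School climate": [5, 4, 4, 5, 3, 4],
}

# Print areas from highest to lowest mean rating.
for area, ratings in sorted(responses.items(),
                            key=lambda item: mean(item[1]),
                            reverse=True):
    print(f"{area:<26} mean = {mean(ratings):.2f}  (n = {len(ratings)})")
```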
High-Stakes Achievement Tests
• Criticisms (Corbett & Wilson, 1991)
  • Student test scores misused to modify complex systems
  • Uniform measures ignore differences among schools
  • Tests create local conditions that work against the intended outcomes of policy-makers
  • Demean the professional judgment of teachers (Madaus, 1988)
• Research support for critics of high-stakes tests
  • Narrowed curriculum, teaching to the test, reduced regular testing
  • Disruption of teachers' work lives, increased concern with legal liability
  • State control of curriculum, cheating, unhealthy competition, loss of time
  • Achievement gap: cultural bias in the test? Differential preparation?
    • Latino/a students passed the Spanish version of a test that they failed one year later in English
  • Focus on "drill": high-ability students bored; at-risk students demoralized; high-level thinking skills de-emphasized
• Bigger question: What should be assessed? Taught?
  • What is the purpose of public education in a democratic society?
• Alternatives are available
  • Student level: PBA; school level: portfolios
Supervisor's Role in Program Evaluation
• Seeing that evaluation of special programs & the overall program is ongoing
• Stufflebeam's standards (1981)
  • Utility: practical information of use to the audience
  • Feasibility: realistic, prudent, diplomatic, frugal
  • Propriety: legal, ethical, respectful
  • Accuracy: technically adequate information
• Two basic questions:
  • Is what we are doing working?
  • How does it work?
• Suggestion: Establish an evaluation cycle
  • Associate it with focused funding of initiatives emerging from the evaluation
Teacher Evaluation
• Summative
  • Accountability
  • Minimum expectations (& award eligibility?)
  • Mandated (checklists/ratings/narratives: standardized & global)
  • Validity & reliability (low inference: training for administrators?)
• Formative
  • Professional growth
  • Continuous improvement
  • Sought (questioning/artifacts: focused on context)
  • Building trust & rapport (e.g., invite student/peer/parent feedback)
• Both necessary, but best kept separate
  • Use different evaluators?
  • Summative in fall; formative in spring?
  • Summative in Year 1 of 3; formative in Years 2 & 3?
Formative: Self-Evaluation & Team Evaluation
• Some possible formats for self-evaluation
  • Visits to classes of experts
  • Videotaping
  • Surveys
  • Critical reflection journal
  • Holistic review of student performance
  • Teaching portfolio
• For evaluation of teams of teachers
  • FLC model
  • Supervisor as facilitator with periodic meetings
  • Consistent with the notion of collegiality & collaboration