
Increasing Evaluation Transparency: A Dialogue Strategy

This dialogue strategy paper, presented by Sheila A. Arens in June 2003, examines the need for increased transparency in educational evaluation. It explores how to communicate findings effectively and how to make evaluation work more transparent, participatory, and engaging through dialogue-based approaches. It also considers the ethical and moral questions surrounding transparency and describes strategies for public engagement in evaluation, including promoting accountability, understanding participants' perspectives, and fostering informed discussion through dialogue initiatives.


Presentation Transcript


  1. Increasing Evaluation Transparency: A Dialogue Strategy Sheila A. Arens June 2003

  2. Context: Educational Rhetoric • Movement toward “evidence-based” • Accountability for public monies • US: reauthorization of ESEA (Jan ’02) • Canada, Europe • But how transparent are evaluation findings? Or how well do we communicate findings to (or with) broader constituencies?

  3. Evaluative Findings On Stage • In research, positive results are more likely to be published in academic journals – a positive publication bias • In evaluation, findings are sometimes shelved or disregarded • Why? Propriety; dismissal of findings (they fail use or validity tests, or do not resonate with beliefs) • Other instances where evaluation is prematurely given “top fold” status

  4. Paradox: Too much information? • While public attention to educational research and evaluation findings has grown over the past several decades, public faith in education and in its research has decreased (Heath, 1999) • What is our responsibility for making our work transparent? What role ought evaluation serve? • Ethical / moral questions • Should transparency be contingent and relative? • Should transparency underlie all work?

  5. Participatory Evaluation • Capacity to increase transparency; openness / democracy • But evaluators hold differing understandings of “participatory” • Emancipatory; transformative; utilitarian • Depth vs. breadth of inclusion • Considerations are moral / ethical

  6. Strategy for Public Engagement • Are strategies pursued as one-time data collection efforts or throughout the evaluation? • Strategies for gathering community input: • Surveys, polls • Interviews • Focus groups • Engaged / deliberative dialogue

  7. Deliberative Approaches: Rationale • Wanted an approach that constructively engaged participants & was inclusive (reflecting democratic commitments) • Reasoned conversations / deliberative dialogue: an approach that enables positions & contrasting ideologies to surface and be co-explored • Encourages participants to consider collective & personal roles w/ respect to social problems & solutions

  8. Example: Educational Standards Through engaged dialogue, we sought to uncover: • The public’s perception of standards-based education, and • The extent to which the public would support low-performing schools

  9. Methods • Convened conversations with groups of citizens • Participants represented a wide array of stakeholder groups • Used a conversation framework to initiate discussion • Collected field notes, videotapes, and observations

  10. Data Analyses • Analysis of notes & videos • Analytic inductive process: reviewed separately, formulated tentative assertions based on emerging themes • Compared assertions & sought disconfirming evidence

  11. Findings • Initially, participants reflected common perceptions (echoing large-scale survey findings) • However, conversations led participants to more carefully examine their own assumptions about what is meant by: • Standards • Assessment • Accountability

  12. (cont’d) • Participants support standards, assessment, accountability • Accountability measures: Reliable? Valid? • Current accountability fails to address concerns • Acknowledged myriad (external) factors • Saliency of non-academic issues • Support for low-performing schools [in principle]

  13. Engagement Toward Understanding • Large-scale evaluations grounded in surveys: • Fail to reveal nuances of perceptions • But also fail to engage participants toward increased understanding of their own, others’, and commonly held ideological positions / stances • Here, public awareness increased along with participants’ ability to understand and engage in conversations

  14. Next Steps?? • Initial dialogues can help shape evaluative outcomes (in the present case, relative to educational standards & accountability) • Add’l research: the extent to which participants… • Are more informed & engaged • Are better positioned for policy discussions & decisions • Better understand evaluation processes, influence evaluation questions, & understand evaluation outcomes
