
Voter Education – Impact Measurement





Presentation Transcript


  1. Voter Education – Impact Measurement. Case Studies – Australia. Ross Attrill – International IDEA

  2. Voter Education Evaluation. The electoral educators all seemed very passionate about the importance of what they were delivering. However, they were less passionate about the evaluation aspect of their role. There did not appear to be consistent support for evaluation among all electoral educators, nor any consistently applied approach to evaluation.

  3. The Case Studies • Evaluation Methodology for Electoral Education Programs - 2008 • Youth Electoral Study (YES) 2004

  4. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • Electoral Education Context - Three Main Streams • Electoral Education Centres (EECs) • School and Community Visits Program (SCVP) • Teacher Training - “Your Vote Counts” (YVC)

  5. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • Choose Clear Objectives for the Program • Participants should: • understand the role of the AEC; • understand the concept of representation in a democracy; • be aware of compulsory enrolment; • be aware of compulsory voting; • understand preferential voting (the alternative vote); and • understand the concept of formality (validity)

  6. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • How the Results Will be Used: • Review and update the content of the education sessions. • Measure the degree of participant/customer satisfaction with the AEC program. • Assess the appropriateness of the delivery and content of the AEC education sessions for all audiences. • Provide data and information for inclusion in various executive reports and the AEC’s Annual report. • Inform the development of business plans.

  7. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • Performance Indicators 1. Participant feedback that indicates improved knowledge and increased understanding of electoral matters. Target 95% for SCVP and YVC sessions, maintaining or exceeding previous years' results for EEC sessions. 2. Audience satisfaction with the education program. Target 95% for SCVP and YVC, high level of audience satisfaction for EEC sessions. 3. Percentage of 17- and 18-year-old participants in EEC sessions who are more likely to vote at the next election. Target 75%.
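Checking a percentage target of this kind against collected feedback is a simple share-of-positive-responses calculation. The sketch below is purely illustrative (the function name, field encoding, and sample data are assumptions, not part of the AEC methodology): each response is recorded as True if the participant reported improved knowledge, and the share is compared against the target.

```python
# Illustrative check of a percentage-based performance indicator.
# Data and names are hypothetical, not from the AEC methodology.
def target_met(responses, target):
    """Return True if the share of positive responses meets the target."""
    return sum(responses) / len(responses) >= target

# Example: 96 of 100 participants reported improved knowledge.
scvp_feedback = [True] * 96 + [False] * 4
print(target_met(scvp_feedback, 0.95))  # 96% meets the 95% target -> True
```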

  8. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • Pre-Session Evaluations • Baseline Data • The ideal time to measure pre-program knowledge against the key program objectives is immediately prior to participation in the program.

  9. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • Post-Session Evaluation - Timing • What is the best time to conduct the evaluation? • immediately after the session; • as a post-session follow-up; or • as a review questionnaire at a much later date (used with teachers)
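The point of pairing the baseline (pre-session) and post-session evaluations is to measure knowledge gain rather than absolute scores. A minimal sketch, with entirely illustrative data (scores out of 6, one mark per key program objective):

```python
# Hypothetical pre/post comparison; scores and scale are illustrative only.
def mean(xs):
    return sum(xs) / len(xs)

pre_scores  = [2, 3, 1, 4, 2]   # objectives answered correctly before the session
post_scores = [5, 6, 4, 6, 5]   # objectives answered correctly after the session

gain = mean(post_scores) - mean(pre_scores)
print(f"average gain: {gain:.1f} of 6 objectives")
```

In practice the pre- and post-session responses would need to come from the same participants (or matched groups) for the difference to be attributable to the program.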

  10. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • How much is enough? • The number of evaluation responses should ensure the results obtained are robust. • A minimum of 400 evaluations per year is suggested for each EEC and for the SCVP.

  11. Case Study 1 - Evaluation Methodology for Electoral Education Programs - 2008 • Questionnaire Design • Questionnaires should be: • short, ideally completed in 5 minutes or less; • designed to record essential demographic information; • designed to measure attendee knowledge against the key program objectives; • built to test attendee knowledge rather than ask attendees to self-assess; and • tailored for each target audience

  12. Case Study 2 - Youth Electoral Study (YES) 2004 • Rationale • To investigate reasons for youth disengagement from political processes and institutions • To provide data on which to base a revised Youth Voter Education program

  13. Case Study 2 - Youth Electoral Study (YES) 2004 • Areas Investigated • The influence of family on engagement • The influence of school on engagement • The influence of political parties on engagement • The influence of political knowledge on engagement

  14. Case Study 2 - Youth Electoral Study (YES) 2004 • Methodologies • Review of existing literature • Case studies - based on in-depth group interviews • National School Survey - 154 schools, 4,600 students

  15. Case Study 2 - Youth Electoral Study (YES) 2004 • Focus Questions • What sorts of political actions do you take part in? • Rank voting against other events in terms of excitement. • Rank the effect of your family on your political participation. • Do you feel you have enough knowledge to participate in political processes?

  16. Case Study 2 - Youth Electoral Study (YES) 2004 • Outcome: Australian Electoral Commission Youth Communication Strategy 2007–10

  17. Conclusions • Each evaluation methodology must be appropriate to what is being evaluated • Each will use different approaches in order to capture the data needed • In general, there are some rules that should be followed: • Choose clear program objectives • Decide how the results will be used • Choose challenging but achievable performance indicators • Get the timing right • Collect baseline data • Design your survey tools carefully • Ensure that the process of data collection is as painless as possible for everyone

  18. Thank you and good luck!
