
Presentation Transcript


  1. How to Evaluate your Communications Programmes
  Strategic Communications Workshop for UNICs in sub-Saharan Africa
  Nairobi, 8-10 June 2005
  Department of Public Information (ECRU-OUSG)

  2. Programme
  • Results-based management
  • Impact of communications activities
  • Data collection methods
  • Surveys

  3. Results-based Management
  • Programme planning based on objectives, identification of target audiences, etc.
  • Implementing activities
  • Collecting information on programme outcomes
  • Feeding this information back into programme planning
  [Diagram: a cycle linking Communications Strategy, Activities, Assessment and Lessons learned back into the strategy]

  4. Impact of public information
  [Chart: level of impact by type of information activity, product or programme; from lowest to highest: raise awareness, increase support, change behavior]

  5. Data collection methods
  • Interviews
  • Focus groups
  • Content analysis
  • Media monitoring and analysis
  • Surveys

  6. Media monitoring and analysis
  DPI relies mainly on media monitoring:
  • Number of pick-ups, broken down by issue, country, region, etc.
  Systematic media analysis:
  • Currently being developed within DPI
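
A pick-up count of this kind can be produced with any spreadsheet or a few lines of code. The Python sketch below is only an illustration, assuming pick-up records are kept as a simple list; the column names and sample records are hypothetical.

    # Minimal sketch: tallying media pick-ups by issue and by region.
    # Column names and records are hypothetical examples.
    import pandas as pd

    pickups = pd.DataFrame([
        {"issue": "MDGs", "country": "Kenya", "region": "East Africa"},
        {"issue": "MDGs", "country": "Ghana", "region": "West Africa"},
        {"issue": "Peacekeeping", "country": "Kenya", "region": "East Africa"},
    ])

    # Number of pick-ups broken down by issue and by region
    print(pickups.groupby("issue").size())
    print(pickups.groupby("region").size())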

  7. Surveys
  • Telephone surveys
  • Surveys by mail
  • Hard-copy surveys
  • Online surveys; companies providing free online surveys:
    • www.surveymonkey.com
    • www.zoomerang.com

  8. Questionnaire development
  • What do we want to ask whom?
  • Think about possible topics and questions.
  • Prepare the individual questions.
  • Give the questions a logical order.
  • Get colleagues to fill out the questionnaire.
  • Make corrections and finalize the layout.

  9. Basic rules
  • Keep it simple, keep it short!
  • Provide clear instructions to the respondent
  • Use everyday language
  • Only one topic per question; avoid double-barreled questions
  • Whenever possible, use closed-ended questions that provide answer codes
  • Always provide the answer code ‘not sure’ or ‘don’t know’
  • Use balanced answer codes
  • Prefer concrete over abstract questions
  • Do not use biased questions
  • Avoid questions based on exact recall
  • Do not ask hypothetical, future-oriented questions

  10. Questions
  • Closed-ended question
  • Open-ended question
  • Grid question
  • Scale question
  • Filter question

  11. Closed-ended question
  Example: “Do you prefer an annual meeting or do you prefer quarterly meetings?”
  • Advantages: easy to understand; not complicated; useful for statistical analysis; makes the questionnaire short and convenient
  • Disadvantages: you get only the answers you listed; tendency to simplify responses
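
Because a closed-ended question returns a fixed set of answer codes, the responses can be tallied directly for statistical analysis. A minimal Python sketch, assuming the answers have already been entered as a list of codes; the data is made up.

    # Minimal sketch: tallying closed-ended answer codes.
    # The responses below are hypothetical examples.
    from collections import Counter

    responses = ["annual", "quarterly", "quarterly", "not sure", "annual", "quarterly"]

    counts = Counter(responses)
    total = len(responses)
    for answer, n in counts.most_common():
        print(f"{answer}: {n} ({100 * n / total:.0f}%)")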

  12. Open-ended question
  Example: “What do you think about … ?”
  • Advantages: can capture motivation, values and goals; you do not miss answers you did not think to list
  • Disadvantages: requires coding responses; labor intensive; tiring for the respondent; lowers the response rate

  13. Scales
  Example: “What do you think of the last annual meeting: Was it very good, good, poor or very poor?”
  • Advantages: everyone knows how to use scales; clear for the respondent; easy to answer; shortens the questionnaire
  • Disadvantages: scales can be culturally specific (e.g. 1 to 5)
  • Suggestion: make the scale visual
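
A common way to summarize a scale question is to map the answer codes onto numbers and report an average, leaving ‘not sure’ answers out of the calculation. A minimal Python sketch; the ratings are made up.

    # Minimal sketch: averaging a 4-point scale, ignoring "not sure".
    # The ratings below are hypothetical examples.
    scale = {"very good": 4, "good": 3, "poor": 2, "very poor": 1}

    ratings = ["good", "very good", "poor", "not sure", "good"]

    scored = [scale[r] for r in ratings if r in scale]
    print(f"Average rating: {sum(scored) / len(scored):.2f} (n={len(scored)})")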

  14. Grid or matrix
  Example: “Please rate the quality of … with respect to:”
                     Good   Neither   Poor   Not sure
      Duration        [ ]     [ ]      [ ]     [ ]
      Topics          [ ]     [ ]      [ ]     [ ]
      Presentations   [ ]     [ ]      [ ]     [ ]
  • Advantages: clear for the respondent; easy to answer; shortens the questionnaire
  • Disadvantages: bores and tires respondents if too many variables are offered

  15. Filter question
  Example: “Do you have access to a computer in your office?” If yes, go to question …..
  • Advantages: makes the questionnaire short; the respondent answers only relevant questions
  • Disadvantage: complicated; the respondent needs clear instructions
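
In practice the routing of a filter question is set up inside the survey tool (skip logic), but the branching idea can be illustrated in a few lines of Python. The question wording in this sketch is hypothetical.

    # Minimal sketch: a filter question that routes respondents past irrelevant questions.
    # Question wording is hypothetical.
    def ask(question, options):
        answer = ""
        while answer not in options:
            answer = input(f"{question} {options} ").strip().lower()
        return answer

    if ask("Do you have access to a computer in your office?", ["yes", "no"]) == "yes":
        # Only respondents with computer access see the follow-up question.
        ask("Have you ever filled in an online survey?", ["yes", "no", "not sure"])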

  16. Evaluation and Communications Research Unit (ECRU)
  Support for surveys:
  • Help with survey design
  • Questionnaire development
  • Statistical analysis of survey results (including cross-tabulation)
  Contact:
  • Wieser@un.org
  • Tel: +1 917 367-2304
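
Cross-tabulation simply counts how the answers to one question break down by the answers to another, which helps compare sub-groups of respondents. A minimal Python sketch using pandas; the survey data and column names are hypothetical.

    # Minimal sketch: cross-tabulating two survey questions.
    # The data and column names below are hypothetical examples.
    import pandas as pd

    answers = pd.DataFrame({
        "region": ["East Africa", "West Africa", "East Africa", "West Africa"],
        "preferred_meeting": ["annual", "quarterly", "quarterly", "quarterly"],
    })

    # Rows: region; columns: preferred meeting frequency; cells: number of respondents
    print(pd.crosstab(answers["region"], answers["preferred_meeting"]))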
