Presentation Transcript


  1. Getting to know your user: Summary report from the joint Eldis / id21 user feedback exercise
  Alan Stanley, EADI-IMWG, September 2003

  2. The evaluation team
  www.eldis.org
  www.id21.org

  3. Purpose
  The main objectives of the evaluation were to find out more about:
  • the information environment amongst leading development organisations in the US
  • the working roles of people who use our services in the US
  • how our users search for and use development information
  • how users use our services - what they like and don't like
  A further objective was to meet potential new collaborators and partners in the US.

  4. Method: Interviews
  • Requested interviews with Eldis/id21 users from target organisations
  • Mostly one-to-one interviews, but some groups
  • Loosely scripted in advance
  • Tailored to the project, with key themes taken from earlier evaluations
  • Background information gathered on interviewees

  5. Results
  • 43 interviews/meetings
  • 26 different organisations, e.g. World Bank, UNDP, IMF, USAID, Rockefeller, IFPRI, Human Rights Watch, UNFPA

  6. Advantages
  • Greater level of user engagement (than other forms of evaluation)
  • Opportunity to observe ‘hands on’ use of services
  • Identify ‘buzz’ topics among organisations
    - e.g. local content; decentralisation of points of distribution
  • ‘Foot in the door’ opportunity
  • Other
    - Team building
    - Improved personal understanding
    - Contact with ‘real people’

  7. Limits
  • Selection of interviewees
    - Sample size
    - Selection (volunteers disproportionately sympathetic, interested in information / knowledge management, geographically confined)
  • Interview process
    - Loose scripting makes for subjective results
    - Observed tendency for interviewees to ‘interpret’ questions depending on their own interests
    - The ‘politeness factor’

  8. Observations
  • Context: the value of ‘snapshots’ increases when you are able to place findings in a wider context
  • Balance: need to weigh the pros and cons of ‘networking’ against ‘evaluation’
  • Targeting: the small sample size means careful thought must be given to which organisations, and which people within those organisations, to target
  • Level of detail: don’t overestimate the level of user understanding of the service being evaluated

  9. Findings
  The main objectives of the evaluation were to find out more about:
  • the information environment amongst leading development organisations in the US
  • the working roles of people who use our services in the US
  • how our users search for and use development information
  • how users use our services - what they like and don't like

  10. The US information environment
  • Enormous volume
  • Information overload
    - insufficient time to find and read
    - too busy to distinguish between websites
    - overload of emails problematic
  • Information overload getting worse

  11. The US information environment
  IMPLICATIONS:
  • Users want a trustworthy quality filter
  • Importance of a strong brand

  12. US organisations
  • Investments in KM on the up
  • Inward vs. outward looking
  • Knowledge networks play a central role
    - network co-ordinators important to us
  • Generalisations unhelpful

  13. User search strategies
  5 broad strategies:
  • Google (or other generic search)
  • Development gateways or portals
  • Specialist sources (country or sector)
  • Internal KM networks
  • Employ intermediary staff/services
  ALL RELY HEAVILY ON INTERNET SOURCES

  14. User search strategies
  IMPLICATIONS:
  • Specialist sites must ensure their products are found on general development gateways
  • Make sure services are prominently featured on search engines, e.g. Google

  15. The working roles of users
  • Information brokers
  • Information disseminators
  • Researchers

  16. Information brokers
  • Find, re-package and disseminate
  • Examples of work titles:
    - network facilitators
    - editors
    - web content managers
  • Key multipliers

  17. Information brokers
  KEY FEATURES:
  • Good understanding
  • Time pressured
  • Not interested in in-depth analysis
  • Want access to new sources
  • Emphasis on latest/what’s new
  • Ascertaining credibility essential
  • Feeding into decision-making process

  18. Information disseminators
  Important to engage with to improve sourcing of information
  LOOKING FOR:
  • Neutral ground
  • Easy way to submit materials
  • Large and diverse audience

  19. Researchers
  KEY FEATURES:
  • Narrow subject interest
  • Interested in older materials
  • Very selective
  • Credibility of materials essential
  • Not interested in understanding our services better
