Learn how to effectively audit and overhaul your intranet, extranet/public website, or e-newsletter to meet evolving client needs. Discover the methods and strategies to gather client input and create a comprehensive plan for improvement.
How’s Our Driving?: Simple Auditing Techniques for Overhauling “Anything” Ulla de Stricker and Barbie Keiser Internet Librarian International London, October 2006
Because the world changes so quickly, sooner or later we all need to rebuild or refine …
• An intranet, extranet/public website, or e-newsletter
• The Library-related portions of an organization’s intranet or public website
• Intranet- or extranet-delivered services and content
• Communications programs (internal or external)
• Web-based client relationship management
AKA CLIENT-BASED CONTINUOUS IMPROVEMENT
What worked well last year may no longer be ideal
• Input may “happen our way,” indicating that improvements are needed
• But it would be risky to wait for – and rely on – such input!
• We need to engage in planned and well-executed audits
“Audit” is not a scary word
• But if it seems to be, just choose another name!
  • Strategic Planning Review
  • Resource Assessment
  • Business Process Planning
  • Communications Checkup
• Translation: a regularly repeated, systematic examination of current practices and future needs, with the focus on the client
The Audit is supplemented by ongoing monitoring:
• Mechanisms to catch evidence of:
  • Shifts in client needs due to changes in internal operations, changes in their target markets, and developments in the industries in which they operate
  • Opportunities created through the application of advanced technology
• The objective is to put in place a “360°” system so as never to be caught off guard
The methods we choose to employ in conducting the audit depend on:
• Existing client relationships
• Corporate culture
• Time and money tradeoffs
• Benefits of a comprehensive effort
• Options for a phased approach
• Skill sets of staff involved
Tell-it-to-me … or Easter Eggs?
• Common techniques involve a mix:
  • Interviews
  • Discussion (focus) groups
  • Surveys
  • Spot checks
• But what people say isn’t always “the goods”
• We may need more “evidentiary” methods:
  • Logs (behavior trails)
  • Stats (behavior trends)
  • Observed behavior as-it-happens
All are designed to help us
• Appreciate where we need / want / are able to go in order to meet evolving client needs
• Create a plan for how we get there from here
Focus for today
• Designing a process that will yield maximum insight into client perceptions and priorities
• Involving the least effort – for us and for our clients
• Resulting in the greatest degree of confidence in the findings
Case example
• The Library’s Intranet presence is not cutting it!
• It may once have been the cat’s meow, but time has gone by and “grafting” has resulted in a difficult-to-use site
For example, we know about these issues:
• Information overload – difficult to present the volume in a consistent fashion
• Users are confused
• Navigation is not intuitive
• Users “miss” important announcements and promotions
• Users go elsewhere for information
What are the elements to be examined?
• What specific “defects” must we address?
• What content and services would be priority offerings in the eyes of our clientele?
• What functionality do clients consider essential vs. “nice to have”?
• Any “sacred cows” we can drop?
• What are “good” library intranets doing that we aren’t?
• How are those intranets maintained, and by whom?
• What else?
Input harvesting options
• Factual evidence (logs, stats)
• Face-to-face interviews (“tell me”)
• Facilitated discussion groups – participants talk among themselves
• Easter Egg Hunts
• At-the-elbow “SENSE MAKING”
• Surveys
• “Usability-lite” testing
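The first option – factual evidence from web logs – can be harvested mechanically. A minimal sketch, not the presenters' actual tooling: the Common Log Format, the sample entries, and the `top_pages` helper are all assumptions for illustration, showing how raw log lines might be tallied into "behavior trends":

```python
import re
from collections import Counter

# Matches the request portion of a Common Log Format entry,
# e.g. "GET /library/home HTTP/1.1" (format is an assumption).
CLF_REQUEST = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

def top_pages(log_lines, n=5):
    """Count requests per path and return the n most-visited pages."""
    hits = Counter()
    for line in log_lines:
        match = CLF_REQUEST.search(line)
        if match:
            hits[match.group("path")] += 1
    return hits.most_common(n)

# Illustrative sample entries, not real data.
sample = [
    '10.0.0.1 - - [01/Oct/2006:09:00:00 +0100] "GET /library/home HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Oct/2006:09:01:00 +0100] "GET /library/databases HTTP/1.1" 200 2048',
    '10.0.0.1 - - [01/Oct/2006:09:02:00 +0100] "GET /library/home HTTP/1.1" 200 512',
]
print(top_pages(sample))  # -> [('/library/home', 2), ('/library/databases', 1)]
```

Counts like these do not replace interviews – they tell you *what* clients do, while the "tell me" methods below explain *why*.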
Audit Design Preamble
• What do we already know?
• What do we want to know?
• Who can tell us?
• Who on our staff should be involved in the process?
• What outside assistance will we need?
• How can we ask informants in a non-burdensome way?
  • For our clients
  • For ourselves
• How do we choose the right method for each target user group?
Typical process
• Review documented (web logs) and anecdotal evidence, plus any marketing collateral distributed
• Identify “regular user” informants
• Identify “non-user who should be a user” informants
• Identify individuals others view as role models
• Devise structures:
  • Two-on-one interviews
  • Focus groups
• Devise Egg Hunts
• Test survey instruments
• Set up logistics
A note on logistics
• Must “ace” the invitation:
  • What’s in it for them
  • Make it fun for them
  • Accommodate their schedules
• Demonstrate later that input got results
Easter Eggs
• A contest – challenges answerable through the library website
  • What is the name of the professional association for chiropractors in Norway?
• Forcing respondents not just to say “looks nice” but to actually dive in and answer:
  • When did …
  • Who said …
  • Why was …
• What path was used to solve the question?
• Where was the answer found – or the search abandoned?
At-the-elbow SENSE MAKING
• Show me a typical task you need to accomplish
  • What do you do first? Why?
  • Stop – why did you look there?
  • Stop – then what made you decide to look here?
  • Why did you not check here first?
• Scenarios and personae
Case example
• A Library Consortium’s Communications Program is not as effective as it could be
• Too many vehicles are employed
• Many services are under-utilized
Project design
• Initial orientation meeting
• Survey of members
• Focus groups
• Needs assessment and usability-lite test for the Consortium’s website
• Audit and benchmarking
• Synthesis of data
Sample Announcement

Dear Colleague,

The XXX Library is surveying our members to evaluate the effectiveness of our communication efforts, including our e-Newsletter and Web site. This XXX Communications Audit is being conducted by an independent firm. Please visit <insert XXX survey URL> to take the survey.

This survey is one in a series of efforts geared toward improving and facilitating communication between XXX and our member libraries, and among members themselves. The resulting analysis will enable us to ensure that members receive the information, products, and services they need in a timely fashion and an easy-to-use format. Your input will help us reach you in more effective ways, so that you receive the information you need in the way most convenient for you.

Please take 15-20 minutes to respond to our survey at <insert XXX survey URL>. The deadline for completing the survey is <insert day and date>.

As a way of saying thank you for your valued participation, XXX will hold several drawings for valuable gift cards redeemable at local retail chain establishments. <Insert link to Drawing Rules>

Know that your responses to this survey instrument will remain confidential; the information provided will be reported only in aggregate form. If you have any questions about the survey, please contact: <Insert full contact information for survey manager>

Sincerely yours,
<Insert Library manager’s name, title, and full contact information>
Survey
• Identify survey content, design the survey, and coordinate the Consortium’s review of the draft survey
• Identify survey pretest participants, complete the survey pretest, and revise the questionnaire
• Develop and implement the Web-based survey
  • Address security & privacy issues
• Attain buy-in and announce the survey
• Develop professional protocols for information collection
• Host and monitor Web-based collection tools and systems
• Monitor survey completion and follow up
• Analyze data:
  • Overall
  • By type of library
  • Portraits of _____ Library
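The "Analyze data" step – overall and by type of library – amounts to simple aggregation, which also honors the announcement's promise that responses are reported only in aggregate form. A minimal sketch under stated assumptions: the field names (`library_type`, `newsletter_rating`) and the sample responses are hypothetical, not drawn from the actual survey instrument:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses; real data would come from the
# Web-based collection tool.
responses = [
    {"library_type": "public",   "newsletter_rating": 4},
    {"library_type": "public",   "newsletter_rating": 2},
    {"library_type": "academic", "newsletter_rating": 5},
]

def aggregate(rows, field="newsletter_rating"):
    """Report a rating field overall and broken down by library type."""
    overall = mean(row[field] for row in rows)
    by_type = defaultdict(list)
    for row in rows:
        by_type[row["library_type"]].append(row[field])
    return overall, {t: mean(values) for t, values in by_type.items()}

overall, by_type = aggregate(responses)
print(round(overall, 2), by_type)
```

Because only means per group are emitted, no individual respondent's answer is exposed – the aggregation itself enforces the confidentiality commitment.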
Focus groups
• Obtain information and clarification on:
  • Issues that are a priority
  • Features that are important
  • Extent to which members will have influence
  • Ways to measure success
  • Role of interactivity
• Conduct exercises to discern:
  • Awareness of the services offered to and valued by the participants
  • Adequacy of communication about those services
  • Availability and accessibility of services
  • Opportunities for improvement
• Present results
“Usability-lite” testing
• For when you don’t want to completely overhaul the site, but do want to make changes that are warranted
• Usability-lite tests will help you determine:
  • Actuals / Optimals
  • Drivers / Incentives
  • Barriers / Potential solutions
• A combination of telephone interviews (for pre-screening candidates) and in-person interviews in the participants’ normal work environment
• Ask participants to “think aloud” as they explore the website
• Ask some follow-up questions
Audit and benchmarking: Objective website review
• Reviewed the site in terms of stated goals and from a member’s perspective
• Analyzed extant data:
  • Web logfile data
• Evaluated navigation used
• Examined usability/human factors
• Assessed the calls to action and flow of copy
• Identified interactive techniques
• Provided recommendations for:
  • Navigation, technical, and usability functions
  • Marketing copy
  • Interactive techniques
  • Access to other information systems and services
Tips for conducting successful interviews and focus groups (1)
• Let them talk, but facilitate the discussion
• You may have a set of questions … but do not force a slavish go-through
• Assure complete confidentiality – notes are aggregated, no names ever given out
• If you need to “prime the pump”, refer to observations (“we noticed …”) and ask for comments
Tips for conducting successful interviews and focus groups (2)
• Be aware of interpersonal dynamics and politics
• Recognize that participants may not want to “look bad” and may tailor comments to what they think is “correct”
• Validate: “Interesting – you are not the first to say so”
• Use the “others-find” technique (“you too?”)
Tips for conducting effective surveys
• Short – fast – easy – did we mention short?
• Clear, unambiguous
• Ask for rankings of personal priorities (“What means more to you?”)
  • “How much do you love us on a scale of 1-10?” yields less valuable insight (no one wants to offend)
• Minimize the number of open-ended questions
• “Do you agree with these statements made by your peers?”
Tips for conducting the on-site portion of a web usability test
• Explain that the findings from the evaluation will be used to redesign and improve the website
• Explain that you will be collecting data by taking written notes
• Stress that the website is being tested – not them as users
• Remind interviewees to articulate their thoughts
• Stay neutral
• Help users in distress
• Ask if they have any questions before the interview begins
Audit Report sets out (in one place) …
• Drivers for the Audit (why we did it)
• Goals (what we looked for)
• Methodology / Informants (how we did it)
• Findings (broken down by major topic area) – factual, dispassionate
• Conclusions that take the findings and group them into themes
• Recommendations flowing from the findings (not just a crazy idea)
• Business Case that includes how we will implement change, and the risks of not implementing
Thank You … feel free to be in touch!
• Barbie Keiser – barbieelene@att.net
• Ulla de Stricker – ulla@destricker.com