1. Measuring the impact of information literacy programmes using evidence-based practice Andrew Booth, Director of Information Resources & Reader in Evidence Based Information Practice, School of Health and Related Research (ScHARR), University of Sheffield
2. Plan
An overview of Evidence Based Practice
Evidence Based Library and Information Practice (EBLIP)
The contribution of EBLIP to Measuring Impact of Information Literacy
Some Challenges
3. An overview of Evidence Based Practice
4. The Context for Evidence Based Practice
5. The Policy Imperative
The UK government’s vision of modernised policy making was set out in Professional Policy Making (SPMT, 1999). Its nine core features included:
Evidence-based: uses best available evidence from a wide range of sources
Evaluates: builds evaluation into the policy process
Learns lessons: learns from experience of what works and what does not
The Cabinet Office described evidence as:
“Expert knowledge; published research; existing statistics; stakeholder consultations; previous policy evaluations; the Internet; outcomes from consultations; costings of policy options; output from economic and statistical modelling.” (SPMT 1999)
6. Evidence-based practice (EBP)
“an approach to health care wherein health professionals use the best evidence possible, i.e. the most appropriate information available, to make clinical decisions for individual patients… It involves complex and conscientious decision-making based not only on the available evidence but also on patient characteristics, situations, and preferences. It recognizes that health care is individualized and ever changing and involves uncertainties and probabilities”.
McKibbon, 1998
7. The evidence-based practice process
Define the problem
Find evidence
Appraise the evidence
Apply results of appraisal
Evaluate change
Redefine problem
8. Similarities between EBP (cyclical) and Information Literacy (modular)
formulating a question [Pillars 1 & 2]
searching the literature [Pillars 3 & 4]
critically appraising the evidence [Pillar 5]
making changes [Pillar 6]
evaluating and reflecting [Pillar 7]
9. But that is another story!…
Today we are looking at information literacy as the OBJECT of Evidence Based Practice
NOT Evidence Based Practice as an exemplar of the Information Literacy PROCESS
10. Evidence Based Library and Information Practice (EBLIP)
11. Evidence Based Library and Information Practice (EBLIP)
Formerly known as “Evidence Based Librarianship” (Eldredge, 1997)
Or “Evidence Based Information Practice” (Haines, 1995)
Attempt at consensual all-embracing term that includes related fields and activities
Need to remain allied with “Evidence Based Practice”
12. Evidence Based Library and Information Practice
Includes:
Librarianship
Information Systems
Informatics
Information Literacy
etcetera
13. Evidence Based Library and Information Practice…
“…seeks to improve library and information services and practice by bringing together the best available evidence and insights derived from working experience, moderated by user needs and preferences. EBLIP involves asking answerable questions, finding, critically appraising and then utilising research evidence from relevant disciplines in daily practice. It thus attempts to integrate user-reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making.”
Source: Booth, 2006a
14. Why should EBP resonate with SCONUL members?
EBP is about improving the quality of day-to-day decision-making by consciously and explicitly integrating:
professional expertise
informed consumer choice
the best available research evidence
15. Not just research-derived evidence!
user-reported: what do users consider the most important aspects of performance (e.g. SERVQUAL, LIBQUAL); a simple gap-score sketch follows this list
librarian-observed: building a body of evidence through use of common measures/measurement tools
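As an aside not in the original slides, here is a minimal Python sketch of the calculation behind SERVQUAL-style user-reported evidence: a gap score of perception minus expectation for each service dimension. The dimension names and 1-7 ratings are invented for illustration; real SERVQUAL or LIBQUAL surveys use validated multi-item scales.

# Hypothetical sketch: SERVQUAL-style gap scores (perception minus expectation).
# Dimension names and ratings are invented for illustration only.
from statistics import mean

# Each dimension maps to (expectation ratings, perception ratings) from user surveys (1-7 scale).
responses = {
    "reliability":    ([6, 7, 6, 7], [5, 6, 5, 6]),
    "responsiveness": ([6, 6, 7, 5], [6, 6, 6, 5]),
    "assurance":      ([5, 6, 6, 6], [6, 6, 7, 6]),
}

def gap_scores(data):
    # Perception minus expectation per dimension; negative means the service falls short.
    return {dim: mean(perc) - mean(exp) for dim, (exp, perc) in data.items()}

for dimension, gap in gap_scores(responses).items():
    print(f"{dimension:>15}: gap = {gap:+.2f}")

A negative gap flags a dimension where users' expectations are not being met, which is one way librarian-observed priorities can be grounded in user-reported data.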
16. The value of EBLIP
“Research that can provide rigorous evidence of outcomes is needed for managers to make decisions that will maximise the impact of library and information services… The Evidence Based Librarianship movement proposes new standards for research that can be applied to outcomes research and also to the extensive work being done on service quality and satisfaction”
Source: Cullen, 2001
17. The contribution of EBLIP to Measuring Impact of Information Literacy
18. SPICE up your life! (Booth, 2006b)
SETTING – in which context are you addressing the question?
PERSPECTIVE – who are the users/potential users of the service?
INTERVENTION – what is being done to them/for them?
COMPARISON – what are your alternatives?
EVALUATION – how will you measure whether the intervention has succeeded?
19. Worked Example of SPICE
What is the impact of information literacy skills training on Undergraduate Engineering students, in the view of Faculty staff, in terms of improving students’ confidence and competence in information and IT skills?
20. SPICE up your life! (worked example; structured in the sketch after this list)
SETTING – Undergraduate Engineering students at a Northern England University
PERSPECTIVE – Faculty of Engineering staff
INTERVENTION – Information literacy skills training
COMPARISON – No specific training
EVALUATION – Improving students’ confidence and competence in information and IT skills
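As a small illustration (again not part of the original slides), the worked SPICE question above can be recorded as structured data so that a team's answerable questions are logged consistently and can be compared over time. The Python below is a hypothetical sketch; the field names simply mirror the five SPICE elements.

# Hypothetical sketch: recording a SPICE-formulated question as structured data.
from dataclasses import dataclass, asdict

@dataclass
class SpiceQuestion:
    setting: str        # in which context is the question being addressed?
    perspective: str    # who are the users / potential users of the service?
    intervention: str   # what is being done to them / for them?
    comparison: str     # what are the alternatives?
    evaluation: str     # how will success be measured?

worked_example = SpiceQuestion(
    setting="Undergraduate Engineering students at a Northern England University",
    perspective="Faculty of Engineering staff",
    intervention="Information literacy skills training",
    comparison="No specific training",
    evaluation="Improved student confidence and competence in information and IT skills",
)

for element, value in asdict(worked_example).items():
    print(f"{element.upper():>12}: {value}")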
21. The best available evidence will be:
Comparative
Prospective/Longitudinal
Clearly Described Intervention
Specific and Measured Outcome
In a relevant/comparable Study Population
22. The K-A-B-O Framework
23. From output to outcome to impact
24. From what is done to what works to what works & how
25. Critical Appraisal – a key tool for EBLIP
Checklists for reading research articles (e.g. ReLIANT: Reader’s guide to the Literature on Interventions Addressing the Need for education and Training)
CRISTAL - Information Needs Analysis and Use Studies
www.newcastle.edu.au/service/library/gosford/ebl/toolkit/appraise.html
26. Systematic Reviews – a key tool for EBLIP
“a review of a clearly formulated question that uses systematic and explicit methods to identify, select and critically appraise relevant research, and to collect and analyse data from the studies that are included in the review”.
27. Example #1 - Effective Methods for Teaching Information Literacy Skills to Undergraduate Students (Koufogiannakis & Wiebe, 2006)
METHODS
To assess which library instruction methods are most effective at an undergraduate level
4,356 citations were retrieved from 15 databases. From 257 full articles, 122 unique studies underwent data extraction and critical appraisal; 55 met pre-defined quality criteria and 16 provided sufficient information for meta-analysis. 79 studies (65%) used experimental or quasi-experimental methods. Most focused on traditional teaching, followed by computer assisted instruction (CAI) and self-directed independent learning. Outcomes correlated with Bloom’s lower levels of learning (Remember, Understand, Apply).
28. Example #1 - Effective Methods for Teaching Information Literacy Skills to Undergraduate Students (Koufogiannakis & Wiebe, 2006)
RESULTS
16 studies compared traditional instruction with no instruction, and 12 found a positive outcome; meta-analysis of 4 studies favoured traditional instruction. 14 studies compared CAI with traditional instruction, with a neutral result confirmed by meta-analysis. 6 compared self-directed independent learning with no instruction, with a positive result confirmed by meta-analysis.
CAI is as effective as traditional instruction. Traditional instruction and self-directed independent learning are more effective than no instruction. Future research needs to compare active learning, computer assisted instruction and self-directed independent learning. Further studies utilizing appropriate methodologies and validated research tools would enrich the evidence base.
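To give a feel for what “confirmed by meta-analysis” involves, the following Python sketch pools a handful of invented study effect sizes using a simple fixed-effect, inverse-variance model. The numbers are not taken from Koufogiannakis & Wiebe (2006); this is only an illustration of the pooling idea, and the review applied its own methodology.

# Hypothetical sketch of fixed-effect, inverse-variance meta-analysis pooling.
# Effect sizes and variances are invented; they are NOT the review's data.
import math

# (effect size, variance) for imaginary studies comparing traditional instruction
# against no instruction.
studies = [(0.45, 0.04), (0.30, 0.09), (0.60, 0.05), (0.20, 0.12)]

weights = [1 / var for _, var in studies]                     # inverse-variance weights
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))                       # standard error of pooled effect
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled effect: {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")

If the pooled confidence interval excludes zero, the combined studies favour the intervention even where individual studies were too small to show an effect on their own.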
29. Example #2 - Information skills training (IST): a systematic review of the literature (Brettle, 2003)
METHODS
To determine the effectiveness of IST, to identify effective methods of training, and to establish whether IST affects patient care.
From 1,357 references, 41 potentially relevant studies were identified. Further reading left 24 studies for critical appraisal. Tables were used to summarise and synthesise results.
30. Example #2 - Information skills training (IST): a systematic review of the literature (Brettle, 2003)
RESULTS
Study designs included randomised controlled trials, cohort designs and qualitative studies. Most took place in US medical schools. There were wide variations in course content and training methods. 8 studies used objective testing, 2 compared training methods and 2 examined effects on patient care.
There is limited evidence to show that training improves skills, insufficient evidence to determine the most effective methods of training, and limited evidence to show that training improves patient care. Further research is needed.
31. So what use are such systematic reviews?
For practice – indicate strengths of specific methods
For research – enable prioritisation of investigation, identification of outcomes, choice of instruments
“It’s not re-invention of the wheel we want to prevent – it is re-invention of the flat tyre!” – Muir Gray
32. Some Challenges
33. The Challenges
Building a body of evidence
Identifying and measuring meaningful outcomes
Developing valid measurement tools
Tackling “core” not marginal business
Making EBLIP a part of routine practice
34. Building a body of evidence from systematic review
35. Conclusions - 1
EBP recognises important drivers for University Librarians
EBLIP acknowledges a wide variety of evidence sources
EBLIP requires asking focused, relevant answerable questions (SPICE)
The more significant an item is, the harder it will probably be to measure (K-A-B-O-Impact)
36. Conclusions - 2
Nothing should be sacrosanct (cf. nursing “ritual”) – challenge the “core” not just the “margins”
Systematic reviews of the literature can handle both what works and how it works
Evidence Based Library and Information Practice helps unpack the black box, identify candidate interventions and to equip practitioners
However, it is only one tool in a wider toolbox of reflective practice…
37. Reflective practice
“Evidence based practice is about best practice and reflective practice, where the process of planning, action, feedback and reflection contributes to the cyclic process of purposeful decision making and action, and renewal and development”. (Todd, 2003)
Stimulus for reflective practice can be research, user views, practitioner observation, benchmarking, performance measurement etcetera
But must make a difference
38. …Above All
We should always ask the “Why” question!
Why are we doing it?
Why are we measuring it?
39. Ones to Watch!
Evidence based practice in information literacy: ANZIIL Research Working Group (Australian and New Zealand Institute for Information Literacy). [Forthcoming, 2006]
Evidence Based Library and Information Practice (Open Access Journal) ejournals.library.ualberta.ca/index.php/EBLIP
4th International EBLIP Conference, May 6-11 2007 www.eblip4.unc.edu
40. References - 1
Booth, A. (2006a). Counting what counts: performance measurement and evidence-based practice. Performance Measurement and Metrics, 7(2): 63-74.
Booth, A. (2006b). Clear and present questions: formulating questions for evidence based practice. Library Hi Tech, 24(3): 355-3
Brettle, A. (2003). Information skills training: a systematic review of the literature. Health Information and Libraries Journal, 20(Suppl 1): 3-9.
41. References - 2
Cullen, R. (2001). Setting standards for library and information service outcomes, and service quality. 4th International Conference on Performance Measurement.
Eldredge, J.D. (1997). Evidence based librarianship: a commentary for Hypothesis. Hypothesis, 11(3): 4-7.
Haines, M. (1995). Librarians and evidence-based purchasing. Evidence-Based Purchasing, Vol. 8, No. 1.
Koufogiannakis, D. and Wiebe, N. (2006). Effective Methods for Teaching Information Literacy Skills to Undergraduate Students: A Systematic Review and Meta-Analysis. Evidence Based Library and Information Practice, 1(3): 3-43.
42. References - 3
McKibbon, K.A. (1998). Evidence based practice. Bulletin of the Medical Library Association, 86(3): 396-401.
Strategic Policy Making Team (SPMT) (1999). Professional Policy Making for the Twenty First Century. London: Cabinet Office.
Todd, R. (2003). Learning in the Information Age School: Opportunities, Outcomes and Options. International Association of School Librarianship (IASL) Annual Conference, Durban, South Africa, 7-11 July 2003.