This study examines the use of research evidence among policy analysts working in health and non-health ministries. It identifies significant correlates of research use and provides empirical evidence on the association between direct interactions with researchers and research use. The study has some limitations, including its reliance on self-reported data and its observational design.
Studying the use of research knowledge in public bureaucracies Mathieu Ouimet, Ph.D. Department of Political Science Faculty of Social Sciences CHUQ Research Center KT National Seminar Series December 9, 2010 12:00–13:00 ET
Learning objectives • To define research knowledge and use • To report the results of a cross-sectional study of the use of research evidence in health and non-health ministries • To invite researchers to use a variety of methodological approaches
Defining scientific knowledge • Seldom conceptually defined in RU studies • Unresolved demarcation problem • KKV* definition of scientific research • The goal is inference – causal or descriptive • The procedures are public – to allow assessment • The conclusions are uncertain – this should be made explicit • The content is the method – rather than the subject matter • Object of contention – nomothetic vs. idiographic • * King G, Keohane RO, Verba S. (1994). Designing Social Inquiry. Princeton (New Jersey): Princeton University Press.
Defining research utilization (RU) • Instrumental / conceptual / symbolic • RU standards (Knott & Wildavsky, 1980) – outcomes to measure • Reception – when studies reach users • Cognition – when studies are read, digested and understood • Reference – when studies change users’ frame of reference • Effort – when users fight for the adoption of studies’ recommendations • Adoption – when studies influence policy adoption • Implementation – when studies influence policy implementation • Impact – when policy stimulated by studies yields tangible benefits (outcomes)
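The Knott & Wildavsky standards form an ordered ladder, from mere reception of a study up to tangible impact. A minimal sketch of how such a ladder could be operationalized as an ordinal measure follows (Python; the scoring function and example values are illustrative assumptions, not taken from the study):

```python
# Hypothetical sketch: the Knott & Wildavsky (1980) standards treated as an
# ordered scale, so a respondent can be scored by the highest standard reached.
# The coding is illustrative only; the study itself measures consultation frequency.
from enum import IntEnum

class RUStandard(IntEnum):
    RECEPTION = 1       # studies reach users
    COGNITION = 2       # studies are read, digested and understood
    REFERENCE = 3       # studies change users' frame of reference
    EFFORT = 4          # users fight for adoption of the recommendations
    ADOPTION = 5        # studies influence policy adoption
    IMPLEMENTATION = 6  # studies influence policy implementation
    IMPACT = 7          # policy stimulated by studies yields tangible benefits

def highest_standard(reached):
    """Score a respondent by the highest RU standard they report reaching."""
    return max(reached)

print(highest_standard([RUStandard.RECEPTION, RUStandard.COGNITION]))
# -> RUStandard.COGNITION
```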
2. Cross-sectional study of policy analysts in health and non-health ministries
STUDY AIM • OVERALL OBJECTIVE: to identify significant correlates of research use among policy analysts working at the ministerial level. • SPECIFIC OBJECTIVE: to provide empirical evidence on the magnitude of the association between direct interactions with researchers and research use, while adjusting for other correlates
STUDY LIMITATIONS • Cross-sectional nature of the data • Self-reported data (social desirability bias, recall bias) • The study does not document what determines which research articles, reports or books the policy analysts read • Observational rather than experimental (the study misses the required step of demonstrating experimentally that changes in the correlates will have the desired effects and are not simply manifestations of some deeper cause)
METHODS /1 • DESIGN: A random-digit-dialing telephone cross-sectional survey. • PARTICIPANTS: Policy analysts, defined as civil servants belonging to 14 professional groups. • SETTING: 17 ministries, including the Ministry of Health & Social Services. • DATA COLLECTION: Questionnaire administered by a small survey firm between 26 September and 25 November 2008 using CATI (computer-assisted telephone interviewing) technology, which allows simultaneous data entry and data coding. • N = 1614 (response rate = 62.48%)
METHODS /2 • SURVEY QUESTIONS: Only closed-ended questions • MAIN OUTCOMES: • Consultation of scientific articles • Consultation of academic research reports • Consultation of academic books (chapters)
METHODS /3 • 3 ordinal regression models (3 outcomes) • Modifiable correlates (e.g. direct interactions with researchers, perceived relevance of academic research, etc.) • Unmodifiable correlates (e.g. type of training, disciplinary field of training, gender, age, policy sector, policy stage, etc.) • Post-estimation simulations
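As an illustration of this kind of model, here is a minimal sketch in Python using statsmodels' OrderedModel. The file name, column names and the four-level frequency scale are assumptions for illustration; the actual variables and response categories are those described in the study.

```python
# Minimal sketch of one of the three ordinal regression models (hypothetical
# column names; the outcome is assumed to be an ordered consultation-frequency
# scale such as never < rarely < monthly < weekly).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("survey_2008.csv")  # hypothetical file of survey responses

# Ordered outcome: frequency of consultation of scientific articles
outcome = df["articles_freq"].astype(
    pd.CategoricalDtype(categories=["never", "rarely", "monthly", "weekly"],
                        ordered=True)
)

# Dummy-coded correlates; no constant column is added because OrderedModel
# estimates its own threshold (cut-point) parameters.
predictors = df[[
    "interactions_with_researchers",  # modifiable
    "cpd_with_scientific_content",    # modifiable
    "db_access_from_workstation",     # modifiable
    "perceived_relevance",            # modifiable
    "research_masters_or_phd",        # unmodifiable
    "reads_english",                  # unmodifiable
    "age_40_49",                      # unmodifiable
    "male",                           # unmodifiable
]]

model = OrderedModel(outcome, predictors, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```

The logit link is one common choice for ordinal regression; the slides do not specify the link function actually used, so treat it as an assumption.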
Percentage distribution for the correlates considered in the study /1
Percentage distribution for some correlates considered in the study /2
Percentage distribution for the types of documents consulted monthly or weekly (n= 1614)
Consultation of different types of documents – Health and Social Services (n= 100) (Monthly & Weekly consultation combined)
Consultation of scientific articles across policy sectors (monthly & weekly consultation combined)
Percentage distribution for the three outcome variables across policy sectors (monthly and weekly consultation combined)
Correlates positively and significantly associated with the three outcome variables • Only two correlates were not significantly associated with any outcome variable: gender, and being solely involved in policy evaluation rather than in policy formulation
Statistical simulations /1 • Each unmanipulable correlate was fixed to a specific value using descriptive statistics as a guideline. • Fixed unmanipulable correlates: • disciplinary field of training (fixed at = human and social sciences); • English reading (fixed at = yes); • type of studies preferred (fixed at = quantitative studies); • age (fixed at = 40–49 years); • gender (fixed at = men); • production of written advice (fixed at = yes); • proportion of working time spent in meetings (fixed at = second quartile, 6.68%–14.28%); • policy stages (fixed at = two policy stages).
Statistical simulations /2 • Four manipulable correlates shifted simultaneously from their minimum (0) to their maximum value (1): • interactions with researchers in human and social sciences; • continuing professional development involving scientific content; • reported access to electronic bibliographic databases from own workstation; • perceived relevance of academic research evidence. • Combined marginal effect computed 16 times, for each policy sector (then the average was calculated). • Same procedure repeated by changing the training type value from ‘undergraduate’ to ‘research Master’s/PhD’ • Also reported: lowest and highest combined marginal effects observed in specific policy sectors.
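A hedged sketch of how such a post-estimation simulation can be computed, continuing from the hypothetical fitted model sketched earlier (`result`): the unmanipulable correlates are held at fixed reference values, the four manipulable correlates are shifted together from 0 to 1, and the change in the predicted probability of weekly consultation is taken as the combined marginal effect. Variable names and fixed values are illustrative and do not reproduce the study's full covariate set.

```python
# Illustrative post-estimation simulation (assumes `result` is the fitted
# OrderedModel from the earlier sketch). The real study fixed more
# unmanipulable correlates and repeated the computation for each policy sector.
import numpy as np
import pandas as pd

FIXED = {                          # unmanipulable correlates held constant
    "research_masters_or_phd": 0,  # first run: undergraduate training
    "reads_english": 1,
    "age_40_49": 1,
    "male": 1,
}
MANIPULABLE = [                    # shifted simultaneously from 0 to 1
    "interactions_with_researchers",
    "cpd_with_scientific_content",
    "db_access_from_workstation",
    "perceived_relevance",
]
COLUMNS = MANIPULABLE + list(FIXED)  # same column order as the fitted model

def weekly_probability(value, fitted):
    """Predicted probability of the highest ('weekly') category for one profile."""
    profile = {**FIXED, **{name: value for name in MANIPULABLE}}
    exog = pd.DataFrame([profile])[COLUMNS].to_numpy(dtype=float)
    probs = fitted.model.predict(np.asarray(fitted.params), exog=exog)
    return float(probs[0, -1])       # columns are the ordered outcome categories

low = weekly_probability(0, result)   # manipulable correlates at their minimum
high = weekly_probability(1, result)  # manipulable correlates at their maximum
print(f"Combined marginal effect: {100 * (high - low):.1f} percentage points")
```

Repeating the calculation with research_masters_or_phd set to 1 mirrors the second run described above, in which the training type is changed from 'undergraduate' to 'research Master's/PhD'.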
Percentage-point increase or decrease in the probability of weekly consultation of scientific articles* *Simulated marginal effect of a simultaneous change in the modifiable correlates on weekly consultation of scientific articles
Research challenges • Measuring research use objectively in public bureaucracies • Measuring research use according to other standards (e.g. benefits, health outcomes, etc.) • Documenting the research knowledge infrastructure found in ministries and studying its effect on policy analysts’ utilization behaviour • Opening up the black box of research knowledge • Opening up the black box of direct interactions • Conducting experimental research in ministries and agencies