This paper discusses the evaluation of large scientific research initiatives at the National Institutes of Health (NIH). It explores the changing nature of science, interdisciplinary collaboration, and the use of large center grants as a funding mechanism. The paper also presents a case study on the evaluation of the Transdisciplinary Tobacco Use Research Centers (TTURCs) and identifies key evaluation questions for these initiatives.
Evaluation of Large Initiatives of Scientific Research at the National Institutes of Health
Mary Kane, Concept Systems Inc.
William M. Trochim, Cornell University
American Evaluation Association, November 4, 2006
The Context
• Changing nature of science
  • Interdisciplinary, collaborative
  • Large initiatives for complex problems
• Expansion of the use of large center grants as a research funding mechanism
• Similar issues reported in the European Union (EU) in connection with the evaluation of Science, Technology, and Innovation (STI) policies
• Government-wide accountability expectations
  • GPRA
  • PART
  • ExpectMore.gov
• Good science requires good management
Evaluation of Large Initiatives
• National Cancer Institute
  • Transdisciplinary Tobacco Use Research Centers (TTURCs), 2001-2003
• Centers for Disease Control and Prevention
  • Prevention Research Centers Network, 2003-2005
• National Institute of Allergy and Infectious Diseases
  • AIDS Clinical Trials Network, Division of AIDS, National Institutes of Health (2005-present)
Evaluation Approach
• Culture change
• Collaboration and involvement of researchers, funders, and consultants
• Understand the initiative life cycle
• Develop an initiative logic model
• Link comprehensive measures and tools to the model
• Keep costs and respondent burden low
• Assure scientific objectivity and credibility
• Address multiple purposes and audiences
• Design for re-use where possible
• Report and utilize results
• Provide an opportunity for reflection and learning
Initiative Life Cycle Model
The initiative life cycle moves through four stages: Planning, Development, Implementation, and Dissemination. At each stage, evaluation is shaped by five elements: questions, context, a conceptual model, stakeholders, and measures.
• Questions: At each stage there is a variety of evaluation questions, with more prospective questions earlier in the life cycle and more retrospective ones later. Processes are needed for prioritizing which questions will be addressed at each stage.
• Context: The context includes the organizational structures and organizational constraints that delimit evaluation activities. Issues include motivation, capacity, structure, expertise, and support.
• Stakeholders: At each stage a wide variety of stakeholders need to be involved, both in helping determine what questions should be addressed in the evaluation and in providing their assessments of initiative performance and outcomes.
• Measures: Evaluation is an empirical activity. Consequently, measures related to the constructs in the conceptual model are needed at every stage.
Evaluation Methods
Evaluation methods are arrayed from formative/ex ante approaches early in the life cycle (Plan, Develop) to summative/ex post approaches later (Implement, Disseminate), tied to the conceptual model and set within the policy context that links strategic goals to strategic impact, new initiatives, and policy implications. Methods include:
• Needs Assessment
• Structured Conceptualization
• Evaluability Assessment
• Implementation Evaluation
• Process Evaluation
• Outcome Evaluation
• Impact Evaluation
• Cost-Effectiveness & Cost-Benefit Evaluation
• Secondary Analysis
• Meta-Evaluation
The TTURC Case Study
• Transdisciplinary Tobacco Use Research Centers
• History
  • RFA released 12/98
  • Grants reviewed 7/99
  • First award 9/99
  • Reissuance 9/04
• Approximately $75 million in the first phase
• TTURC Life Cycle Model
Model Development
Figure: Development of the TTURC concept map and logic model. The logic model organizes center elements as inputs, activities, outputs, and outcomes; the concept map clusters include Training, Research Methods, Core Expertise & Resources, Research Agenda, Technical Assistance, Active Dissemination, Engage the Community, Diversity & Sensitivity, Relationships & Recognition, Evaluation System Plan, and Community Health Change.
Measures & Analyses
Measures and their analyses are organized around the conceptual map and logic model.
• Researcher Form: Survey Analysis
• Progress Report (PHS2590): Content Analysis, Progress Report Summary, Peer Evaluation, Evaluation Analysis
• Publications: Bibliometrics, Peer Evaluation
• Personnel Report: Personnel Analysis
• Budget & Justification and Financial Report (SF-269a): Financial Analysis, Expenditures & Carryover
Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers (including training) accomplished?
2. Does the collaborative transdisciplinary research of the centers lead to the development of new or improved research methods?
3. Does the collaborative transdisciplinary research of the centers lead to the development of new or improved scientific models and theories?
4. Does TTURC research result in scientific publications that are recognized as high-quality?
5. Does TTURC research get communicated effectively?
6. Are models and methods translated into improved interventions?
7. Does TTURC research influence health practice?
8. Does TTURC research influence health policy?
9. Does TTURC research influence health outcomes?
Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers accomplished?
Subquestions:
• What are TTURC researcher attitudes about collaboration and transdisciplinary research?
• How do researchers assess the performance of their centers on collaboration, transdisciplinary research, training, institutional support, and center management?
• What are examples of the collaboration, transdisciplinary research, and training activities of the centers?
• What is the quality and impact of the collaboration, transdisciplinary research, and training activities of the centers?
• Do TTURC research publications provide evidence of collaboration and transdisciplinary research, and how do they compare with “traditional” research?
• How effective and efficient is the management of the TTURCs?
Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers accomplished?
Data Sources:
• Researcher Form
  • Attitudes about Transdisciplinary Research Scale (15 items)
  • Center Collaboration Scale (15 items)
  • Attitudes about Collaboration in Research Scale (8 items)
  • Institutional Support Index (12 items)
  • Overall ratings of collaboration, transdisciplinary integration, training, and institutional support
• Content analysis of annual progress reports for activities, results, and barriers (coded on collaboration, transdisciplinary integration, training, and institutional support)
• Peer evaluation
  • Annual progress reports
  • Publications
• Bibliometric analysis of publications
  • Collaboration within and across institutions and centers
  • Numbers of fields represented by publications, cited and citing articles, weighted by impact of journals
• Management analysis
  • Personnel
  • Budget and financial
Researcher Form
• Each center was responsible for generating measures for 3-4 clusters on the map (at least two centers reviewed each cluster)
• 244 specific measurement items were proposed across the 13 content clusters
• Items were compiled into a measure development database, and a draft measure was produced
• 25 closed-ended questions, each with multiple subquestions
• Overall performance ratings by outcome area
• Open-ended comments
Scales and Indexes
• Attitudes about Transdisciplinary Research Scale (15 items)
• Center Collaboration Scale (15 items)
• Attitudes about Collaboration in Research Scale (8 items)
• Institutional Support Index (12 items)
• Methods Progress Scale (7 items)
• Science and Models Scale (17 items)
• Barriers to Communications Scale (8 items)
• Center-to-Researcher Communications (5 items)
• Center External Communications (2 items)
• Progress on Development of Interventions Index (12 items)
• Policy Impact Index (4 items)
• Translation to Practice Index (9 items)
• Health Outcome Impact Scale (6 items)
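The deck does not show how these composites were scored. As a minimal sketch, assuming each scale is the mean of its item responses and internal consistency is checked with Cronbach's alpha, the computation might look like this (column names are hypothetical):

```python
import pandas as pd

def scale_score(df: pd.DataFrame, items: list[str]) -> pd.Series:
    """Composite scale score: mean of the item responses for each respondent."""
    return df[items].mean(axis=1)

def cronbach_alpha(df: pd.DataFrame, items: list[str]) -> float:
    """Internal consistency (Cronbach's alpha) for a set of scale items."""
    x = df[items].dropna()
    k = len(items)
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage: 8-item Attitudes about Collaboration in Research Scale.
# responses = pd.read_csv("researcher_form.csv")
# collab_items = [f"collab_q{i}" for i in range(1, 9)]
# responses["collab_scale"] = scale_score(responses, collab_items)
# print(cronbach_alpha(responses, collab_items))
```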
Researcher Survey
Figure: Item means with 95% confidence intervals (roughly 3.4-4.6 on the response scale) for item 8, "Collaboration within the center":
a. Support staffing for the collaboration.
b. Physical environment support (e.g., meeting space) for collaboration.
c. Acceptance of new ideas.
d. Communication among collaborators.
e. Ability to capitalize on the strengths of different researchers.
f. Organization or structure of collaborative teams.
g. Resolution of conflicts among collaborators.
h. Ability to accommodate different working styles of collaborators.
i. Integration of research methods from different fields.
j. Integration of theories and models from different fields.
k. Involvement of collaborators from outside the center.
l. Involvement of collaborators from diverse disciplines.
m. Productivity of collaboration meetings.
n. Productivity in developing new products (e.g., papers, proposals, courses).
o. Overall productivity of collaboration.
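A sketch of the computation behind such a chart (per-item means with 95% confidence intervals), assuming a respondent-by-item data frame with hypothetical column names:

```python
import pandas as pd
from scipy import stats

def item_mean_ci(responses: pd.Series, confidence: float = 0.95):
    """Mean and t-based confidence interval for one survey item."""
    x = responses.dropna()
    mean = x.mean()
    sem = stats.sem(x)                                        # standard error of the mean
    half = sem * stats.t.ppf((1 + confidence) / 2, len(x) - 1)  # half-width of the CI
    return mean, mean - half, mean + half

# Hypothetical usage with items a-o of question 8:
# survey = pd.read_csv("researcher_survey.csv")
# for col in [f"q8_{letter}" for letter in "abcdefghijklmno"]:
#     m, lo, hi = item_mean_ci(survey[col])
#     print(f"{col}: {m:.2f} [{lo:.2f}, {hi:.2f}]")
```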
Content Analysis
• Coded approximately 80-90 project reports per year on the 13 outcome clusters
• Three rounds of reliability testing and refinement of coding definitions
• Final reliability > .9
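The deck does not name the reliability statistic used. As one plausible illustration, here is a sketch of Cohen's kappa for two coders assigning outcome-cluster codes to report segments (the codes and data are hypothetical):

```python
from collections import Counter

def cohens_kappa(codes_a: list[str], codes_b: list[str]) -> float:
    """Cohen's kappa: agreement between two coders, corrected for chance agreement."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(codes_a) | set(codes_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical check on a handful of coded report segments:
coder1 = ["Collaboration", "Methods", "Training", "Methods", "Publications"]
coder2 = ["Collaboration", "Methods", "Training", "Collaboration", "Publications"]
print(round(cohens_kappa(coder1, coder2), 2))
```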
Progress Report Content Analysis – Years 1-3
Figure: Progress over years 1-3 on the 13 outcome clusters: Collaboration, Transdisciplinary Integration, Training, Methods, Science & Models, Publications, Communication, Interventions, Translation to Practice, Policy Implications, Health Outcomes, Internal Recognition and Support, and External Recognition and Support. (data from Content Analysis of Annual Progress Report Form PHS2590)
Peer Evaluation – Years 1-3
Figure: Peer evaluation ratings over years 1-3 on the same 13 outcome clusters: Collaboration, Transdisciplinary Integration, Training, Methods, Science & Models, Publications, Communication, Interventions, Translation to Practice, Policy Implications, Health Outcomes, Internal Recognition and Support, and External Recognition and Support.
Bibliometric Analysis
• What is a TTURC publication?
  • Results from TTURC research
  • Cites the TTURC grant number
  • Independent peer evaluation would identify the influence
• Components of bibliometric analysis
  • Publications, citations, cited works (references)
  • Journals of publication, citing journals, and cited journals
  • Field (Current Contents)
  • Year
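As a rough illustration of the grant-number criterion, a sketch that flags publication records whose acknowledgments cite a center grant number; the pattern, field names, and records are hypothetical, not the project's actual screening rule:

```python
import re

# Hypothetical grant-number pattern; real TTURC award numbers would come from NIH records.
GRANT_PATTERN = re.compile(r"P50[\s-]?(CA|DA)\d{6}", re.IGNORECASE)

def cites_center_grant(record: dict) -> bool:
    """Flag a publication whose acknowledgments text cites a center grant number."""
    return bool(GRANT_PATTERN.search(record.get("acknowledgments", "")))

papers = [
    {"title": "Nicotine dependence phenotypes", "acknowledgments": "Supported by P50 CA000000."},
    {"title": "Unrelated imaging study", "acknowledgments": "Funded by R01 HL000000."},
]
print([p["title"] for p in papers if cites_center_grant(p)])
```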
Bibliometric Analysis Indicators
• Journal Impact Factor (JIF) – average number of citations in a given year to a journal's articles published in the previous two years
• Journal Performance Indicator (JPI) – average number of citations to date for all publications in a journal in a particular year
• Field Journal Performance Indicator – the JPI computed across all journals in a field
• Adjusted Journal Performance Indicator (Expected Citations) – the JPI for a specific type of publication
• 5-Year Impact – average number of citations to publications over a five-year period
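A minimal sketch, with a hypothetical publication table, of how the journal-level and field-level baselines described above might be computed:

```python
import pandas as pd

# Hypothetical publication-level data: one row per article.
pubs = pd.DataFrame({
    "journal":   ["Nicotine Tob Res", "Nicotine Tob Res", "Addiction", "Addiction"],
    "field":     ["Substance Abuse", "Substance Abuse", "Substance Abuse", "Substance Abuse"],
    "year":      [2001, 2001, 2002, 2002],
    "citations": [12, 5, 9, 20],
})

# Journal Performance Indicator: mean citations to date per publication,
# for each journal and publication year.
jpi = pubs.groupby(["journal", "year"])["citations"].mean().rename("jpi")

# Field Journal Performance Indicator: the same average taken over all journals in a field.
field_jpi = pubs.groupby(["field", "year"])["citations"].mean().rename("field_jpi")

print(jpi)
print(field_jpi)
```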
Bibliometrics
Citation of TTURC publications is significantly higher than for the journal and field comparison groups.
• On average, there were .64 more citations of TTURC publications than for other publications in the same journal.
• On average, there were .6 more citations of TTURC publications than for other publications in the same field.
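A sketch of the comparison such a result implies: observed citations per TTURC publication minus the expected citations for its journal and its field, averaged across publications (all numbers below are hypothetical):

```python
import pandas as pd

# Hypothetical toy data: observed citations for each TTURC publication alongside
# the expected citations for its journal and its field (the adjusted JPI baselines).
pubs = pd.DataFrame({
    "citations":        [12, 9, 15, 4],
    "expected_journal": [11.2, 8.5, 14.0, 3.8],
    "expected_field":   [11.5, 8.2, 14.3, 3.5],
})

excess_vs_journal = (pubs["citations"] - pubs["expected_journal"]).mean()
excess_vs_field = (pubs["citations"] - pubs["expected_field"]).mean()
print(f"mean excess citations vs. journal: {excess_vs_journal:.2f}")
print(f"mean excess citations vs. field:   {excess_vs_field:.2f}")
```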
Bibliometrics
Citation of TTURC research publications is significantly increasing over time relative to expectation: citations were lower than expected in year 1 and higher than expected in year 2. Only the two complete years were used in this analysis.
Financial Analysis
Figure: Cumulative percent of federal funds spent by grantee (Centers 1-7 and the total) over two reporting periods, ranging from roughly 46% to 99% across centers and periods. (data from Financial Status Reports of grantees)
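A minimal sketch, with hypothetical figures and column names, of deriving cumulative percent of federal funds spent by grantee from period-level Financial Status Report data:

```python
import pandas as pd

# Hypothetical period-level figures: funds awarded and spent per center per period.
fsr = pd.DataFrame({
    "center":  ["Center 1", "Center 1", "Center 2", "Center 2"],
    "period":  [1, 2, 1, 2],
    "awarded": [1_000_000, 1_000_000, 1_200_000, 1_200_000],
    "spent":   [620_000, 900_000, 550_000, 1_150_000],
})

# Cumulative percent spent = cumulative expenditures / cumulative awards, per center.
fsr = fsr.sort_values(["center", "period"])
fsr["pct_spent"] = (
    100 * fsr.groupby("center")["spent"].cumsum() / fsr.groupby("center")["awarded"].cumsum()
)
print(fsr[["center", "period", "pct_spent"]])
```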
Carryover
Figure: Percent of subprojects, by center and year, that reported a carryover. (data from Budget Justification, Annual Progress Report Form PHS2590)
Reasons for Carryover
Figure: Two charts showing the proportion of subprojects reporting each category. Reasons for carryover: delay of project start, unanticipated obstacles, changes in process, other (specify), and not stated. Causes of delay or unanticipated obstacles: staffing issue, infrastructure issue, granting agency issue, research/methods issue, implementation or logistical (practical) issue, other (specify), and not stated. (data from Budget Justification, Annual Progress Report Form PHS2590)
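A minimal sketch, with hypothetical codes, of tallying coded carryover reasons into the proportions that such charts display:

```python
from collections import Counter

# Hypothetical coded reasons extracted from subproject budget justifications.
reasons = [
    "Delay of project start", "Unanticipated obstacles", "Delay of project start",
    "Changes in process", "Not stated", "Delay of project start", "Other",
]

counts = Counter(reasons)
total = len(reasons)
for reason, n in counts.most_common():
    print(f"{reason}: {n / total:.2f}")
```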
What Worked
• Less Promising
  • Researcher Survey – one wave
  • Content Analysis – costly, time-consuming
  • Peer evaluation of publications
• More Promising
  • Researcher Survey scales
  • Peer evaluation of progress reports
  • Financial Analysis
  • Bibliometrics
Conclusions
• Sustainability Challenges
  • Funding challenges
  • Researcher motivation
• Methodological Challenges
  • Peer review
  • Bibliometrics
  • Integrating results
• Organizational Challenges
  • Agency resources
  • Grantee resources
  • External contractors
• Utilization Challenges
  • Building over multiple time points
  • Building over multiple initiatives