Contextualizing the Evaluand: Considering Context in Data Collection and Analysis Jamie Weinstein, MPH The MayaTech Corporation, with CDC's National Center for Injury Prevention and Control American Evaluation Association November 13, 2009 The findings of this presentation are the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention.
Qualitative Evaluation Design • Document Review • Site Visits • ICRC Interviews • CDC Interviews • Data Analysis • Success Story Development
Qualitative Evaluation Design • The context of the program led the evaluation team to use a qualitative approach: • Gain an understanding of the evolving process of multi-disciplinary research and non-research activities in a variety of contexts • Broad range of activities conducted by ICRCs • Lengthy history of the ICRCs and volume of research
Data Collection: Site Visits • Acknowledged that site visits promote understanding of how center contexts affect production of outputs and outcomes relevant to CDC • Obtained information and insights related to center experiences, operations, and collaboration with communities and partners • Used information collected during site visits in final data analysis
Data Collection: ICRC Interviews • In-depth, two-hour telephone interviews with ICRC directors and up to two other key center staff • The interviews yielded rich, detailed qualitative data about: • Research priorities and activities • Funding • Factors that promote injury research • Center evaluation • Future directions
Data Collection: ICRC Interviews • Center contexts influence the type of work produced: • Geographic location • Personal relationships • Themes of research
Data Collection: ICRC Interviews • Each center submitted a list of its 15 most influential publications • Centers defined "influential" based on their own contexts
Data Collection: CDC Interviews • Nine former and current CDC staff participated in 30- to 60-minute interviews • To build an understanding of the “insider” perspective on the role and functioning of the ICRC program • Provided an historical perspective of the program, as well as recommendations for the future
Data Collection: Success Stories • Identified five examples of significant ICRC contributions to the injury field • Success stories developed based on data collected through site visits, interviews, and publicly available materials • In-depth, two-page narratives
Data Analysis • Data analysis focused on extracting common and divergent themes and correlating themes with center activities, outputs, and outcomes • Used a combination of QSR NUD*IST software and Microsoft Excel to manage and analyze data • Limited statistical analyses conducted
Bibliometric Analyses • Conducted with the 15 most influential publications of each center, as identified by the centers themselves • Enabled the team to assess the reach of ICRC research into the injury field • Used impact factors to measure how frequently a typical article in a particular journal is cited within a given year • Limitations of impact factors noted in findings
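The journal impact factor used in the bibliometric analysis is a simple ratio, and can be sketched as below. All figures here are hypothetical illustrations, not data from the evaluation, and the two-year window is the standard Journal Citation Reports definition, not necessarily the exact variant the team used.

```python
def impact_factor(citations, citable_articles):
    """Two-year journal impact factor: citations received in year Y
    to items the journal published in years Y-1 and Y-2, divided by
    the number of citable articles it published in those two years."""
    return citations / citable_articles

# Hypothetical journal: 420 citations in 2009 to its 2007-2008 articles,
# of which there were 150 citable items.
print(impact_factor(420, 150))  # 2.8
```

A higher value means a typical article in that journal is cited more often, which is why the team paired it with a note on its limitations (it describes the journal, not any individual ICRC publication).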
Challenges • Clarifying research questions early in the process • Limiting data collection to questions that answer the research questions • Highlighting centers' work without providing an inventory of all activities • Avoiding comparisons while providing comprehensive data
Contact Information Jamie Weinstein, MPH The MayaTech Corporation jweinstein@mayatech.com