
Contextualizing the Evaluand: Considering Context in Data Collection and Analysis


Presentation Transcript


  1. Contextualizing the Evaluand: Considering Context in Data Collection and Analysis. Jamie Weinstein, MPH, The MayaTech Corporation, with CDC’s National Center for Injury Prevention and Control. American Evaluation Association, November 13, 2009. The findings of this presentation are the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention.

  2. Qualitative Evaluation Design • Document Review • Site Visits • ICRC Interviews • CDC Interviews • Data Analysis • Success Story Development

  3. Qualitative Evaluation Design • Context of the program led the evaluation team to utilize a qualitative approach: • Gain an understanding of the evolving process of multi-disciplinary research and non-research in a variety of contexts • Broad range of activities conducted by ICRCs • Lengthy history of the ICRCs and volume of research

  4. Data Collection: Site Visits • Acknowledged that site visits promote understanding of how center contexts affect production of outputs and outcomes relevant to CDC • Obtained information and insights related to center experiences, operations, and collaboration with communities and partners • Used information collected during site visits in final data analysis

  5. Data Collection: ICRC Interviews • In-depth, 2-hour telephone interviews with ICRC directors and up to two other key center staff • The interviews yielded highly qualitative and detailed data about: • Research priorities and activities • Funding • Factors that promote injury research • Center evaluation • Future directions

  6. Data Collection: ICRC Interviews • Center contexts influence the type of work produced: • Geographic location • Personal relationships • Themes of research

  7. Data Collection: ICRC Interviews • List of 15 most influential publications submitted by each center • Centers defined “influential” based on their context

  8. Data Collection: CDC Interviews • Nine former and current CDC staff participated in 30- to 60-minute interviews • To build an understanding of the “insider” perspective on the role and functioning of the ICRC program • Provided an historical perspective of the program, as well as recommendations for the future

  9. Data Collection: Success Stories • Identified five examples of significant ICRC contributions to the injury field • Success stories developed based on data collected through site visits, interviews, and publicly available materials • In-depth, 2-page narratives

  10. Data Analysis • Data analysis focused on extracting common and divergent themes and correlating themes with center activities, outputs, and outcomes • Used a combination of QSR NUD*IST software and Microsoft Excel to manage and analyze data • Limited statistical analyses conducted
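As a rough illustration of this kind of theme extraction (not the evaluation team’s actual procedure), the sketch below tallies coded excerpts by theme and flags themes shared across centers versus those appearing in only one center. The center names and theme codes are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical coded excerpts: (center, theme) pairs, similar to what a
# qualitative coding package might export. Values are illustrative only.
coded_excerpts = [
    ("Center A", "community partnerships"),
    ("Center A", "training of researchers"),
    ("Center B", "community partnerships"),
    ("Center B", "policy translation"),
    ("Center C", "policy translation"),
    ("Center C", "community partnerships"),
]

# Count how often each theme appears, and which centers it appears in.
theme_counts = Counter(theme for _, theme in coded_excerpts)
centers_per_theme = defaultdict(set)
for center, theme in coded_excerpts:
    centers_per_theme[theme].add(center)

# Themes coded in more than one center are "common"; the rest are divergent.
for theme, count in theme_counts.most_common():
    centers = centers_per_theme[theme]
    label = "common" if len(centers) > 1 else "divergent"
    print(f"{theme}: {count} excerpts across {len(centers)} center(s) [{label}]")
```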

  11. Bibliometric Analyses • Conducted with the top 15 most influential publications of each center, as identified by the centers • Enabled the team to assess the reach of the ICRC research into the injury field • Used impact factors to measure the frequency with which a typical article in a particular journal is cited within a given year • Limitations of impact factors noted in findings
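For reference, the description above matches the standard two-year journal impact factor: citations received in a given year to a journal’s articles from the two preceding years, divided by the number of citable items the journal published in those two years. The sketch below computes it from made-up counts for illustration only.

```python
def two_year_impact_factor(citations_this_year: int, items_prev_two_years: int) -> float:
    """Two-year journal impact factor: citations in year Y to items published
    in years Y-1 and Y-2, divided by citable items published in Y-1 and Y-2."""
    return citations_this_year / items_prev_two_years

# Made-up numbers: 600 citations in 2009 to a journal's 2007-2008 articles,
# with 250 citable items published in 2007-2008.
print(two_year_impact_factor(600, 250))  # -> 2.4
```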

  12. Challenges • Clarifying research questions early in the process • Limiting data collection to what is needed to answer the research questions • Highlighting centers’ work without providing an inventory of all activities • Avoiding comparisons, while providing comprehensive data

  13. Contact Information Jamie Weinstein, MPH The MayaTech Corporation jweinstein@mayatech.com
