
Presentation Transcript


1. A Collaborative Community-Engaged Approach to Evaluation for the Alliance for Research in Chicagoland Communities

M. Mason, PhD,1,2 B. Rucker, MPH,3,4 D. Morhardt, PhD, LCSW,3,5 J. Brown, MPH,3 W. Healey, PT, EdD, GCS,6 V. Rivkina, MPH,7 K. Watson, MS, MPH,8 G. Greene, PhD9

1Northwestern University Feinberg School of Medicine (Feinberg), Depts. of Pediatrics and Preventive Medicine; 2Alliance for Research in Chicagoland Communities, Center for Community Health, Northwestern University Clinical and Translational Sciences Institute; 4Chinese Mutual Aid Association; 5Feinberg, Cognitive Neurology and Alzheimer's Disease; 6Feinberg, Department of Physical Therapy and Human Movement Sciences; 7Feinberg, Center for Healthcare Studies; 8Apostolic Faith Church; 9Feinberg, Department of Medical Social Sciences

Background
• The Alliance for Research in Chicagoland Communities (ARCC, www.ARCConline.net) is part of Northwestern University's (NU) Clinical and Translational Sciences Institute and the Institute for Public Health and Medicine.
• ARCC supports community-engaged research (CEnR) through seed grants, capacity building, technical assistance, partnership facilitation, and advocacy for supportive institutional policies.
• ARCC is guided by a steering committee (SC) of community- and faith-based organizations, NU faculty, and public agencies.
• ARCC's Evaluation Working Group (EWG) is charged with developing and implementing ARCC's evaluation strategy. Members include SC members (2 community-based organizations [CBOs], 3 faculty, 1 staff) and ARCC staff, including a program evaluation specialist.

Methods
• The EWG leads the evaluation process through meetings, teleconference calls, and emails.
• The EWG created, and periodically reviews and updates, an "evaluation matrix" of ARCC evaluation data needs, including funder reports, quality improvement and program development, and potential data sources.
• The EWG consults regularly with the full SC to ensure it is addressing critical issues and to share evaluation findings.
• This method of collaborative evaluation differs from traditional models in that multiple perspectives are engaged equitably from the start and throughout the process. The impetus for each measure is tied directly to immediate program purposes, but the measures are also intended to serve as standardized tools beyond these uses.
• In January 2014, the EWG fielded its first annual "modular" survey, comprising the four survey tools in Table 1. The survey modules include screening questions so participants complete only the components relevant to them (a minimal sketch of this screening logic appears after the poster text). The survey is expected to close in May 2014.

[Figure: Collaborative process for development of evaluation tools, with feedback loops labeled "Lessons learned inform this process."]

Results
• Evaluation results have been used to:
  • Report outcomes for grants and to institutional leaders.
  • Guide discussions and decision making of the SC and staff.
  • Shape reporting to ARCC partners/clients.
  • Publish a peer-reviewed article/podcast: Progress in Community Health Partnerships, volume 7, issue 3.
  • Inform a funding application to the National Institutes of Health.

Lessons Learned
• EWG members gained experience working together using CEnR principles, including co-learning, negotiation, and collaboration.
• The collaborative process strengthened the development of multiple tools as well as the analysis of results; e.g., CBOs pushed the group to include community perceptions of NU, something the academic members were less comfortable with.
• Using the logic model kept us focused and on track.
• The EWG's internal capacity for analysis did not match the richness of the data, and the EWG struggles with timely analysis and communication of results. Changes to address this include:
  • Anticipating reporting needs when creating tools.
  • Simplifying/limiting the data collected.
• Response rate is an issue. Changes to improve this include:
  • Greater reliance on networks/colleagues to promote the survey.
  • Avoiding survey fatigue by using the modular approach.
  • Focusing on the richness of the data rather than the number of completed surveys.
• There is no single best approach for all evaluation needs and stakeholders involved.

Next Steps
• Explore how the modular survey fits with alternative evaluation tools/methods.
• Evaluate our strategies to increase response rates.
• Increase reporting and application of findings.
• Consider a broader mix of evaluation tools for future efforts.

Acknowledgement: This work is supported by National Institutes of Health Clinical and Translational Award UL1RR025741 and Northwestern University.
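
The poster does not say how the modular survey was implemented (most likely in a standard survey platform rather than in code). Purely as an illustration of the screening idea described under Methods, the sketch below shows in Python how up-front screening answers could route a respondent to only the relevant modules. All module names and screening fields are hypothetical; the actual four tools are those listed in Table 1, which is not reproduced here.

```python
# Minimal, hypothetical sketch of modular-survey screening logic.
# Module names and screening questions are invented for illustration;
# they are not ARCC's actual instrument.

def modules_for(respondent: dict) -> list[str]:
    """Return only the survey modules relevant to this respondent,
    based on yes/no screening answers collected up front."""
    selected = []
    if respondent.get("received_seed_grant"):
        selected.append("seed_grant_module")
    if respondent.get("attended_capacity_building"):
        selected.append("capacity_building_module")
    if respondent.get("used_technical_assistance"):
        selected.append("technical_assistance_module")
    # A general-engagement module is shown to every respondent.
    selected.append("general_engagement_module")
    return selected


# Example: a workshop attendee skips the seed-grant and technical
# assistance modules and answers only two of the four modules.
print(modules_for({"attended_capacity_building": True}))
# -> ['capacity_building_module', 'general_engagement_module']
```

Keeping the routing rules explicit like this also supports the "avoiding survey fatigue" goal noted under Lessons Learned: each respondent sees only the modules their screening answers make relevant.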
