
Building Evaluation Capacity (BEC)

Beatriz Chu Clewell, Urban Institute; Patricia B. Campbell, Campbell-Kibler Associates, Inc.


Presentation Transcript


  1. Building Evaluation Capacity (BEC) Beatriz Chu Clewell, Urban Institute Patricia B. Campbell, Campbell-Kibler Associates, Inc.

  2. BEC: The Background Evaluation Capacity Building (ECB) is a system for enabling organizations and agencies to develop the mechanisms and structures needed to carry out evaluation and meet accountability requirements. ECB differs from mainstream evaluation in being continuous and sustained rather than episodic. It is context-dependent; it operates on multiple levels; it is flexible in responding to multiple purposes, requiring continuous adjustments and refinements; and it requires a wide variety of evaluation approaches and methodologies (Stockdill et al., 2002).

  3. BEC: The Project The goal of BEC was to develop a model to build evaluation capacity in three different organizations: the National Science Foundation (NSF), the National Institutes of Health (NIH), and the GE Foundation. More specifically, the project’s intent was to test the feasibility of developing models to facilitate the collection of cross-project evaluation data for programs within these organizations that focus on increasing the diversity of the STEM workforce.

  4. BEC Guide I: Designing a Cross-Project Evaluation
  • Evaluation design & identification of program goals
  • Construction of logic models
  • The evaluation approach, including:
    • Generation of evaluation questions
    • Setting of indicators
    • Integration of evaluation questions and indicators
  • Measurement strategies, including:
    • Selection of appropriate measures
    • The role of demographic variables

  5. Highlights from Guide I: Constructing a Logic Model
  The basic components of a simplified logic model are:
  • Inputs (resources invested)
  • Outputs (activities implemented using the resources)
  • Outcomes/impact (results)
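As a concrete illustration of the three components, the sketch below fills in a simplified logic model for a hypothetical STEM-diversity project. Only the inputs/outputs/outcomes structure comes from Guide I; the specific project entries are invented for illustration.

```python
# A minimal sketch of the simplified logic model described in Guide I.
# The project details below are hypothetical, for illustration only.
logic_model = {
    "inputs": [   # resources invested
        "grant funds",
        "faculty mentors",
        "lab space",
    ],
    "outputs": [  # activities implemented using the resources
        "summer research internships",
        "mentoring workshops",
    ],
    "outcomes": [ # results / impact
        "more underrepresented students complete STEM degrees",
    ],
}

# A usable logic model has all three components filled in, so that
# each activity can be traced back to resources and forward to results.
assert all(logic_model[part] for part in ("inputs", "outputs", "outcomes"))
```

In practice, the evaluation questions and indicators on the following slides are derived from a model like this: questions probe whether the outputs occurred and whether the outcomes followed, and indicators make each one measurable.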

  6. Highlights from Guide I: Constructing a Logic Model (continued…)

  7. Highlights from Guide I: Questions & Indicators

  8. BEC Guide II: Collecting and Using Cross-Project Evaluation Data
  • The strengths and weaknesses of various types of formats that can be used in data collection
  • Data collection scheduling
  • Data quality and methods of ensuring it
  • Data unique to individual projects
  • Confidentiality and the protection of human subjects in data collection
  • Ways of building data collection capacity among projects
  • Rationales, sources, and measures of comparison data
  • Issues inherent in the reporting and displaying of data
  • The uses to which data might be put

  9. Highlights From Guide II: Data Collection Formats

  10. Highlights From Guide II: Available Information on Comparison Databases
  • URL
  • Availability: Public Access / Restricted Use (fees/permission needed)
  • Data Format: Web Download / Other Electronic
  • Student Demographic Variables: Race/Ethnicity, Sex, Disability, Citizenship
  • Data Level: National, State, Institution, Student
  • Student Population: Pre-College, College, Graduate School, Employment
  • Survey Population: First Year; Most Recent Year Available
  • Other Variables: Attitudes, Course-taking, Degrees, Employment, etc.
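One way to apply this checklist is to record each candidate database as a structured catalog entry, so projects can screen sources consistently. The sketch below is not from the BEC guides; the class and its field names are an assumption modeled on the checklist above, and the WebCASPAR entry is filled in loosely from slide 12's description.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: this class simply encodes the screening
# fields listed on this slide; it is not part of the BEC guides.
@dataclass
class ComparisonDatabase:
    name: str
    url: str
    availability: str   # "Public Access" or "Restricted Use (fees/permission needed)"
    data_format: str    # "Web Download" or "Other Electronic"
    student_demographics: list = field(default_factory=list)  # Race/Ethnicity, Sex, ...
    data_level: list = field(default_factory=list)            # National, State, Institution, Student
    student_population: list = field(default_factory=list)    # Pre-College, College, ...
    other_variables: list = field(default_factory=list)       # Attitudes, Course-taking, ...

# Entry based on slide 12's description of WebCASPAR; the demographic
# and population details are assumptions, not taken from the guides.
webcaspar = ComparisonDatabase(
    name="WebCASPAR",
    url="http://caspar.nsf.gov",
    availability="Public Access",
    data_format="Web Download",
    student_demographics=["Race/Ethnicity", "Sex"],
    data_level=["Institution"],
    student_population=["College", "Graduate School"],
    other_variables=["Degrees"],
)
```

A catalog of such entries lets a program office filter quickly, e.g. for publicly accessible, institution-level sources that report by race/ethnicity.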

  11. Highlights From Guide II: Making Comparisons

  12. Other Sources of Comparison Data The WebCASPAR database (http://caspar.nsf.gov) provides free access to institution-level data on students from surveys such as the Integrated Postsecondary Education Data System (IPEDS) and the Survey of Earned Doctorates. The Engineering Workforce Commission (http://www.ewc-online.org/) provides institution-level data (for members) on bachelor's, master's, and doctorate enrollees and recipients, by sex and race/ethnicity for US students and by sex for foreign students. Comparison institutions can be selected from the Carnegie Foundation for the Advancement of Teaching's website (http://www.carnegiefoundation.org/classifications/) based on Carnegie Classification, location, public/private designation, size, and profit/nonprofit status.

  13. Some Web-Based Sources of Resources
  • OERL, the Online Evaluation Resource Library: http://oerl.sri.com/home.html
  • User-Friendly Guide to Program Evaluation: http://www.nsf.gov/pubs/2002/nsf02057/start.htm
  • AGEP Collecting, Analyzing and Displaying Data: http://www.nsfagep.org/CollectingAnalyzingDisplayingData.pdf
  • American Evaluation Association: http://www.eval.org/resources.asp

  14. Download the Guides: http://www.urban.org/publications/411651.html, or search Google for "Building Evaluation Capacity Campbell".
