
Accelerating Opportunity Evaluation

This webinar discusses the goals, design, and plans for the evaluation of the Accelerating Opportunity (AO) program, focusing on the implementation analysis, impact analysis, and cost analysis. The evaluation team also explains the site visits, community college surveys, and quarterly check-in calls that will be conducted throughout the grant period.


Presentation Transcript


  1. Accelerating Opportunity Evaluation Planning the Evaluation with the Accelerating Opportunity States February 10, 2012 11:30 a.m. – 1:00 p.m.

  2. Webinar Agenda • Introduction of the Evaluation Team • Goals of the Evaluation • Evaluation Design • Plans for the Evaluation • Q&A with the Evaluation Team

  3. AO Evaluation Team • Randall Wilson, JFF, Evaluation Director • Robert Lerman, Urban Institute, Principal Investigator and Cost Analysis Lead • Lauren Eyster, Urban Institute, Project Director and Implementation Analysis Co-Lead • Maureen Conway, Aspen Institute, Implementation Analysis Co-Lead • Burt Barnow, George Washington University, Impact Analysis Lead

  4. Goals of the AO Evaluation • To generate evidence for state and federal policymakers, college administrators, funders, and other stakeholders about: • The process of implementing integrated college and career pathway designs and taking these designs to scale • Their impacts on adult basic education (ABE) and English as a Second Language (ESL) students in college and in the labor market • Their cost effectiveness and financial sustainability

  5. AO Evaluation Design

  6. Implementation Analysis Questions • How did states and community colleges establish integrated pathway designs? • What shifts in the culture (attitudes and behavior) occurred within states and colleges concerning ABE/ESL students? • How did states and community colleges move toward scale and the sustainability of pathways through policy changes, innovative financing, the use of data to support continuous improvement, and other means?

  7. Site Visits to States and Community Colleges • Twice during the grant period (Fall 2012 and Spring 2014) • Three-day visits with two evaluation team members • Two community colleges will be visited during each site visit • Selection of the colleges will be determined with the state team

  8. Community College Surveys • Used to collect systematic data on implementation, financing, and sustainability across ALL community colleges • Fielded once after the end of the first year of the grant (December 2012) and once at the end of the grant period (October 2014) • Web-based survey but will be available in paper if requested • Will work with the states in fielding the survey – reaching the right contacts at each college and following up with non-responders

  9. Quarterly Check-in Calls • To keep updated on changes to AO program design and implementation • To provide states with an opportunity to ask questions about the evaluation activities • Will occur March, June, September, and December each year during the grant period

  10. Impact Analysis Questions • What impacts do integrated career pathway designs have on student progress and outcomes in college and in the labor market? • How do outcomes for participating students compare with those of comparison groups of similar students who do not participate in the program?

  11. Requirements for the Impact Analysis • Ensure that AO education and support services differ significantly from services received by those who do not participate • Have a well-articulated selection process, which is consistently applied by colleges • Identify an appropriate comparison group whose outcomes will be compared to those of AO participants • Have enough participants and comparison group members to be able to detect statistically significant differences in outcomes
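The last requirement is, at bottom, a statistical power question. As a rough illustration, the sketch below uses statsmodels with conventional defaults to show how a per-group sample size is typically derived; the effect size, significance level, and power target are assumptions made for the example, not figures from the AO evaluation plan.

```python
# A minimal power calculation for a two-group comparison of outcomes.
# effect_size, alpha, and power are conventional defaults assumed for
# illustration only, not targets from the AO evaluation plan.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.2,   # assumed small standardized effect (Cohen's d)
    alpha=0.05,        # two-sided significance level
    power=0.8,         # chance of detecting the effect if it is real
)
print(f"Participants needed per group: {n_per_group:.0f}")  # ~393
```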

  12. Planned Quasi-Experimental Designs • Regression discontinuity design • Compares the treatment group to non-participating applicants to the AO program who are close to a test cutoff score used to determine program eligibility • Propensity score matching • Uses a large pool of similarly prepared and motivated ABE/ESL students in the state to compare to the treatment group on outcomes of interest
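To make the two designs concrete, here is a minimal sketch on synthetic data. The variable names, cutoff score, bandwidth, covariates, and effect size are all assumptions made for the example, not the evaluation's actual specification.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 4000

# --- Regression discontinuity: eligibility is set by a test cutoff, so
# compare applicants just on either side of it with a local linear
# regression that allows separate slopes on each side of the cutoff.
score = rng.normal(50, 10, n)              # placement-test score
cutoff = 45.0                              # assumed eligibility cutoff
treated = (score < cutoff).astype(int)     # AO serves lower scorers here
outcome = 10 + 0.2 * score + 3.0 * treated + rng.normal(0, 2, n)

bw = 5.0                                   # bandwidth around the cutoff
near = np.abs(score - cutoff) <= bw
centered = score[near] - cutoff
X = sm.add_constant(np.column_stack(
    [treated[near], centered, treated[near] * centered]))
rdd = sm.OLS(outcome[near], X).fit()
print(f"RDD effect at the cutoff: {rdd.params[1]:.2f}")   # ~3

# --- Propensity score matching: when there is no single cutoff, model
# the probability of participating from observed characteristics, then
# match each participant to the most similar non-participant.
age = rng.normal(30, 8, n)
pretest = rng.normal(50, 10, n)
p = 1 / (1 + np.exp(-(-2 + 0.03 * age + 0.02 * pretest)))
d = rng.binomial(1, p)                     # participation is probabilistic
y = 5 + 0.1 * age + 0.2 * pretest + 3.0 * d + rng.normal(0, 2, n)

covars = np.column_stack([age, pretest])
pscore = LogisticRegression().fit(covars, d).predict_proba(covars)[:, 1]
nn = NearestNeighbors(n_neighbors=1).fit(pscore[d == 0].reshape(-1, 1))
_, idx = nn.kneighbors(pscore[d == 1].reshape(-1, 1))
att = (y[d == 1] - y[d == 0][idx.ravel()]).mean()
print(f"PSM effect on the treated: {att:.2f}")            # ~3
```

In either design, the credibility of the estimate rests on the comparison group: applicants just above the cutoff for regression discontinuity, and well-matched non-participants with overlapping propensity scores for matching.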

  13. Data for Impact Analysis • Data need to be of high quality and collected for both treatment and comparison group members • Data will be submitted twice per year to the Urban Institute through secure file transfer protocols • Test and retest scores on ABE/ESL tests are necessary for developing comparison groups • Data collected by states will include student characteristics, services received, educational progress, and labor market outcomes

  14. Cost Analysis Questions • What are the benefits and costs of implementing and scaling up integrated pathway designs to states, community colleges, and students? • What methods were used to finance the pathways, and how financially sustainable is the initiative in the state? • The key issue is how many more resources the AO model requires than standard ABE/ESL programs do, and who bears any extra costs
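As a stylized illustration of that incremental-cost comparison, the sketch below uses entirely hypothetical dollar figures and an assumed cost-sharing split; none of these numbers come from the AO evaluation.

```python
# A stylized version of the incremental-cost comparison described above.
# Every figure here is a hypothetical placeholder, not an AO estimate.
ao_cost = 4500        # assumed cost per student, AO integrated pathway
standard_cost = 3000  # assumed cost per student, standard ABE/ESL program
extra = ao_cost - standard_cost

# Who bears the extra cost is a separate question; assume an example split.
shares = {"state": 0.5, "college": 0.3, "other funders": 0.2}
for payer, share in shares.items():
    print(f"{payer}: ${extra * share:,.0f} per student")
print(f"Total incremental cost: ${extra:,} per student")
```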

  15. Data for Cost Analysis • A template will be provided for documenting state financing plans/budgets and college costs. • College cost data will also be collected through the implementation surveys. • Student cost and foregone earnings data will be collected through semi-annual data reporting. • Data on financial sustainability issues will be collected in surveys and during site visits.

  16. One-Day Planning Visits to States (now scheduled for March) • To learn more about states’ program design, including selection procedures, test scores, and career pathways • To gain an understanding of data availability to help the evaluators develop analysis plans for the evaluation • To work closely with the states in identifying comparison groups • To continue developing the states’ understanding of the evaluation and its requirements

  17. Who Should Participate in Planning Visits? • From states: • State team leads • State staff responsible for data submission • From one or two community colleges: • College representatives who are knowledgeable about pathway design, recruitment and selection processes, and student data • Burt and Bob will each visit two states and will be accompanied by JFF or other evaluation team staff

  18. Main Topics for Visits • Continued discussion of the evaluation design and help for states and community colleges in meeting evaluation requirements • AO program design, specifically: • Recruitment and intake • Eligibility requirements • Educational milestones for students • Contrast with services received by other adult education students • Availability of data • Student characteristics at intake, participation in AO pathways, educational progress, and labor market outcomes • Data confidentiality and security and FERPA • Costs and benefits to students, community colleges, and states

  19. Tools for Collecting Student Data • Intake form (draft provided for review and comment) • Data template for semi-annual reporting (draft provided for review and comment) • Template for collecting state and college cost data will be provided by spring 2012

  20. Overall Evaluation Activity Timeline

  21. Q&A: Accelerating Opportunity Evaluation Plans
