Path to Accelerated Completion and Employment Evaluation Meeting July 31, 2012
New Growth Group • New Growth is a full-service evaluation firm specializing in postsecondary education and workforce development. • Christopher Spence, Evaluation Project Manager • Joel Elvery, PhD, Data Analysis • Partnering with Corporation for a Skilled Workforce for the implementation assessment • Holly Parker, Ed Strong, Leise Rosman
Goals • Measure the impact of strategies on student outcomes • Capture the variety of approaches implemented for each strategy in the state • Contribute to continuous improvement • Comply with USDOL evaluation requirements
Approach • Two parts: • Impact assessment • Implementation assessment • Two audiences • USDOL – Reporting and compliance • PACE colleges – More detail to identify best practices • Approach tailored for each strategy in the original proposal
Impact Assessment • Before-and-after research design: key measures collected before and after implementation • Key Measures • Academic Progress Measures • Program Completion Rates • Employment Outcomes
Implementation Assessment • Documentation of approaches at each college and how implemented • Interviews, questionnaires, etc. • Informs initiative continuous improvement efforts in later stages • Sets the stage for future student success agenda
First Year Timeline • Program Launch (Fall 2012) • 8/3 Surveys: Employer Engagement, Developmental Education, Streamlining • 9/1 Comparison cohorts defined • 11/14 Quarterly report • 11/14 Annual report • 1/21 College data reports • 2/14 Quarterly report • 3/15 First semester roll-up • 5/15 Quarterly report • 6/21 College data reports • 8/14 Quarterly report • 8/16 Second semester roll-up • 9/21 College data reports • Notes: • Colleges still provide monthly progress reports to NWACC • Expect implementation assessment activities closer to the end of the first semester
Contact Information • Project Manager: Chris Spence, 216.767.6262, cspence@newgrowthplanners.com • Impact Assessment: Joel Elvery, PhD: 216.375.6777, jelvery@newgrowthplanners.com • Implementation Assessment: Holly Parker, 734.769.2900, hparker@skilledwork.org
Data plan • Quantitative evaluation design • Comparison cohort plan • Data requested • Data submission
Quantitative evaluation design • Two purposes • Meet DOL requirements • Inform stakeholders whether new approaches are increasing student success • Using a before-and-after comparison (sketched below) • Focusing on cohorts engaged in targeted programs in Fall 2012 vs. those in Fall 2010 • Goes beyond what is needed for DOL requirements
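As a minimal sketch of the before-and-after comparison, assuming a hypothetical student-level extract (the file and column names here are illustrative, not part of the evaluation plan):

```python
import pandas as pd

# Hypothetical student-level extract; "completed" is assumed to be
# 1 if the student finished the program, 0 otherwise.
students = pd.read_csv("students.csv")

# Before-and-after design: Fall 2010 (pre-PACE) vs. Fall 2012 (PACE) cohorts.
cohorts = students[students["cohort_term"].isin(["Fall 2010", "Fall 2012"])]

# Outcome rate by program and cohort, then the before/after change.
rates = (cohorts
         .groupby(["program", "cohort_term"])["completed"]
         .mean()
         .unstack("cohort_term"))
rates["change"] = rates["Fall 2012"] - rates["Fall 2010"]
print(rates.sort_values("change", ascending=False))
```

The same grouping can be repeated for each key measure (academic progress, completion, employment).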
Comparison cohort plan • Where possible, will use past cohorts from targeted programs as comparison group • Gathering comparison data from ADHE • Except for some developmental education metrics not in ADHE data • New programs or dramatically shortened programs will have to be matched to other similar programs • DOL convening in early August
Comparison cohort plan • Next steps on comparison cohort plan • Learn about targeted programs & their duration • Develop groupings of programs • Write up cohort strategy for DOL • Get DOL approval • Inform colleges of any additional data they need to provide
Data required from colleges • Test scores & placement of students involved in assessment test preparation • Prior Learning Assessments • Demographics of students in targeted programs of study • Completion of developmental education requirements for students in targeted programs • Historic data on developmental education progress for past cohorts
Data on PREP Participants • Who should be included • Every student who uses assessment test preparation provided in conjunction with the PACE grant, regardless of whether in a targeted program • What we need to know • Identifying variables • Type of assessment test, placement before & after the readiness course • For math, reading, & English assessments
Data on PLA Participants • Who should be included • Everyone who attempts to earn credit through a prior learning assessment • What we need to know • Identifying variables • Total credit hours earned through PLA • Credit hours earned through each of the following • Portfolio • Standardized test • Local test • Training
One-time student data • Who should be included • All students enrolled in a targeted program of study • Includes students who began prior to Fall 2012 who are still enrolled • What we need to know • Identifying variables • Student demographics from intake form • Developmental ed. placements • Whether they have completed developmental ed.
Term data • Data to be reported each term for each student • Who should be included • All students enrolled in a targeted program of study • Includes students who began prior to Fall 2012 who are still enrolled • What we need to know • Identifying variables • Whether they are taking Technical Math & the number of modules they have to take • Whether they changed program of study & what the new program is • Whether they completed developmental ed. requirements
Program-level data • What programs should be included • Each targeted program included in PACE • A separate row for each program • What we need to know • Identifying variables • Credit hours before & after redesign • 2-year dev. ed. math, reading, & English completion rates for cohorts from Fall 2008, Fall 2009, & Fall 2010 • 2-year college-level math, reading, & English completion rates for cohorts from Fall 2008, Fall 2009, & Fall 2010 (see the sketch below)
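A hedged sketch of how the 2-year completion rates could be computed from student records; the file and column names are hypothetical placeholders:

```python
import pandas as pd

# Hypothetical course-level records; names are illustrative only.
records = pd.read_csv("dev_ed_records.csv",
                      parse_dates=["cohort_start", "completion_date"])

# Count a student as completing within two years if the requirement was
# finished no later than two years after the cohort's start term.
records["completed_2yr"] = (
    records["completion_date"].notna()
    & (records["completion_date"]
       <= records["cohort_start"] + pd.DateOffset(years=2))
)

# 2-year completion rate per program, subject, and entering cohort
# (Fall 2008, Fall 2009, Fall 2010).
rates = (records
         .groupby(["program", "subject", "cohort_term"])["completed_2yr"]
         .mean())
print(rates)
```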
Developmental education worksheet • One for each targeted program of study • Need to know course numbers for • Redesigned dev. ed. classes • Technical math • Past courses that students would have taken in place of these courses • Will be used to gather data on student progress through developmental courses
How is your college using technical math? • Will you have a modular technical math course this Fall? • Is it replacing only developmental math? • Is it replacing only college-level math? • Is it replacing both? • If it is replacing both, will some students have to do remediation prior to Technical Math? • Do your programs have additional math requirements on top of Technical Math?
Spreadsheets • 1st sheet has list of variables, their definitions, & required format • Other sheets are data table shells to be completed by colleges
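A minimal sketch of how a completed workbook could be checked against the variable-definition sheet, assuming a hypothetical layout that mirrors the description above (sheet and column names are illustrative):

```python
import pandas as pd

# Hypothetical template layout: the first sheet ("Variables") lists each
# variable, the sheet it belongs to, and its required format; the remaining
# sheets are the data shells completed by colleges.
book = pd.read_excel("pace_template.xlsx", sheet_name=None)
spec = book["Variables"]

for sheet_name, data in book.items():
    if sheet_name == "Variables":
        continue
    expected = set(spec.loc[spec["sheet"] == sheet_name, "variable"])
    missing = expected - set(data.columns)
    if missing:
        print(f"{sheet_name}: missing columns {sorted(missing)}")
```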
Submission • Submissions will contain confidential data • Each college will be given a password & will use the password protection built into Excel • Submission via secure Dropbox • Timing of submissions • Fall semester data – January 21 • Spring semester data – June 21 • Summer semester data – September 21
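On the receiving end, one possible way to open a password-protected workbook programmatically, assuming the open-source msoffcrypto-tool package (an assumption; the plan does not specify tooling):

```python
import io

import msoffcrypto  # pip install msoffcrypto-tool
import pandas as pd

# Decrypt a password-protected submission in memory, then load a data
# sheet. File name, password, and sheet index are placeholders.
with open("college_submission.xlsx", "rb") as f:
    office_file = msoffcrypto.OfficeFile(f)
    office_file.load_key(password="college-specific-password")
    decrypted = io.BytesIO()
    office_file.decrypt(decrypted)

term_data = pd.read_excel(decrypted, sheet_name=0)
print(term_data.head())
```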
Wrap up data plan • Only asking colleges for information we cannot get from other sources • Your help is crucial because changes to dev. ed. are a large part of the PACE initiative • Especially true of historic dev. ed. completion data & PLA data
Why do an Implementation Evaluation? • Tell the story behind the data • Contribute to Continuous Improvement • Share learning across locations • Stay on track with goals and funding requirements • USDOL requirement
Overall Objectives • Ultimate objective: capture lessons and best practices from your experiences that contribute to your ongoing efforts and the field in general • Understand how you plan to implement the strategies • Track early outcomes (findings and challenges) from initial implementation • Describe and share adaptations made in response to these early outcomes • Document lessons learned from modifications and final outcomes
Our Approach to Evaluating Implementation • Greater focus on qualitative information • Evaluation plan must be fluid and responsive • Each phase builds on the prior phase • Start-up and the end of the grant period usually see the heaviest information-gathering push • Timing is frequently subject to course corrections
Key Topics of Inquiry • For each of the three strategies outlined to USDOL: how has the strategy been implemented and how have students utilized/experienced it? • Describe key redesign features and approaches used in implementing them, for example: • Personnel changes/additions • Professional development and peer learning activities • Specific models employed (e.g., CAEL, El Paso PREP) • Curricula and/or delivery innovations • New uses of technology • Involvement of external partners (employers, WIBs, etc.) • New roles for staff or faculty
Methods of Evaluating Implementation • Document review • Relevant institutional policies • Curricula materials • Scheduling information • Informational/outreach materials • Surveys • Interviews • Phone and/or in person • On-site observation • Focus Groups • On site
Implementation Evaluation Information Gathering Timeline • Fall 2012 semester • Analyze information from initial surveys (due Aug. 3) • Document review • Winter 2013 semester • Second round of surveys on planning progress (first half of semester) and early lessons/challenges • Site visits (end of semester) • Academic year 2013-14 • Surveys to track implementation progress, adaptations • Phone interviews or other follow up if needed • Fall 2014 semester • Final document review • Final surveys and close-out site visit
Before we go to lunch… • Any questions about the implementation evaluation approach? • Lunch discussion topics: • Reflect on data plan • What are the key student success priorities at your institution? • What would be most useful (for your institution) to learn during and after PACE implementation?
Next Steps • Updates based on today’s discussion • Questions and clarifications • Cohort definitions • Rolling out analyses during the semester
Contact Information • Project Manager: Chris Spence, 216.767.6262, cspence@newgrowthplanners.com • Impact Assessment: Joel Elvery, PhD: 216.375.6777, jelvery@newgrowthplanners.com • Implementation Assessment: Holly Parker, 734.769.2900, hparker@skilledwork.org