A Better Way to Measure Community College Performance: An Achieving the Dream Cross-State Data Initiative
2009 SHEEO/NCES Network Conference and IPEDS Workshop
May 21, 2009
J. Keith Brown, North Carolina Community College System
Presentation Summary:
• Overview of Achieving the Dream and the Cross-State Data Workgroup
• Recommendations for an Alternative Set of Measures for Community College Performance
• Introduction of Intermediate Milestones and Final Measures of Student Performance
• Implications for Policy and Practice
Achieving the Dream Overview
• National initiative to help more community college students succeed (earn degrees, earn certificates, or transfer)
• Particularly concerned about student groups that have faced the most significant barriers to success, including low-income students and students of color
82 Institutions in 15 States: AR, CT, FL, HI, MA, MI, NC, NM, OH, OK, PA, SC, TX, VA, WA
Achieving the Dream Values
• Student-centered
• Equity and excellence
• Culture of evidence, inquiry, accountability, and shared responsibility
Cross-State Data Workgroup
• Initial States: Connecticut, Florida, North Carolina, Ohio, Texas, Virginia
• States Joining: Arkansas, Massachusetts, New Mexico, Oklahoma, South Carolina, Washington
Data Workgroup Goals
• Develop a set of indicators to:
  • More effectively track student performance
  • Evaluate the effectiveness of interventions
  • Learn from the strengths of other community college systems
Test Drive: Six States Pilot Better Ways to Measure and Compare Community College Performance
Recommended Changes:
• Prior Enrollment
• Enrollment Status
• Intent at time of Enrollment
Recommended Changes:
• Timeframe
• Success Outcomes
Recommended Changes: Tracking transfer students within the 2-year sector
Recommended Changes: Controlling for factors associated with success
Next Steps: Developing Intermediate Benchmarks to Measure Student Progress
Intermediate milestones to track students
• First-Year Milestones
  • Persisted fall to spring
  • Passed 80% or more of attempted hours
  • Earned 24 or more hours
• Second-Year Milestones
  • Persisted fall to fall
  • Completed developmental math by year 2
  • Earned 48 or more hours
• Third-Year Milestones
  • Passed gatekeeper English or higher by year 3
  • Passed gatekeeper math or higher by year 3
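To make the milestone definitions concrete, here is one way the first-year flags could be derived from term-level enrollment records. This is a minimal sketch; the table layout and column names (student_id, term, hours_attempted, hours_earned) are illustrative assumptions, not the workgroup's specification.

```python
# Sketch: deriving first-year milestone flags from term-level records.
# Column names are hypothetical placeholders, not Achieving the Dream definitions.
import pandas as pd

records = pd.DataFrame({
    "student_id":      [1, 1, 2, 3, 3],
    "term":            ["fall", "spring", "fall", "fall", "spring"],
    "hours_attempted": [15, 15, 12, 9, 9],
    "hours_earned":    [15, 12, 9, 9, 6],
})

by_student = records.groupby("student_id").agg(
    attempted=("hours_attempted", "sum"),
    earned=("hours_earned", "sum"),
    terms=("term", lambda t: set(t)),
)

milestones = pd.DataFrame({
    # Persisted fall to spring: enrolled in both terms of the first year
    "persisted_fall_to_spring": by_student["terms"].apply(lambda t: {"fall", "spring"} <= t),
    # Passed 80% or more of attempted hours
    "passed_80_pct": by_student["earned"] / by_student["attempted"] >= 0.80,
    # Earned 24 or more hours in the first year
    "earned_24_hours": by_student["earned"] >= 24,
})
print(milestones)
```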
First-Year Milestones
• Returned spring semester
  • 16% increase in final success outcomes for full-time students; 28% for part-time students
• Earned 24 credits (full-time) or 18 credits (part-time) by the end of the first year
  • 25% increase in final success outcomes for full-time students; 66% for part-time students
• Passed 80% of credits attempted
  • 12% increase in final success outcomes for full-time students; 46% for part-time students
Second-Year Milestones
• Returned in fall of second year
  • 23% increase in final success outcomes for full-time students; 53% for part-time students
• Returned and earned 42 credits (full-time) or 24 credits (part-time) by the end of the second year
  • 32% increase in final success outcomes for full-time students; 49% increase for part-time students
Second-Year Milestones
• Passed developmental mathematics course by the end of the second year
  • 84% increase in final success outcomes for full-time students; 110% increase for part-time students
• Passed developmental English course by the end of the third year
  • 17% increase in final success outcomes for full-time students; 39% increase for part-time students
Third-Year Milestones
• Passed “gatekeeper” mathematics course by the end of the third year
  • 45% increase in final success outcomes for full-time students; 147% increase for part-time students
• Passed “gatekeeper” English course by the end of the third year
  • 17% increase in final success outcomes for full-time students; 59% increase for part-time students
Tracking toward final success measures
• Fourth- and Sixth-Year Measures
  • Award of less than associate’s degree w/o transfer
  • Award of associate’s degree or higher w/o transfer
  • Award of less than associate’s degree and transferred
  • Award of associate’s degree or higher and transferred
  • Transferred w/o an award
  • Still enrolled with 30 or more college hours
  • Total success rate
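These categories are mutually exclusive, and the total success rate is the share of a cohort landing in any of them. The sketch below shows one possible classification; the field names and the precedence of the checks are assumptions for illustration, not the workgroup's coding rules.

```python
# Sketch: assign each student one final success category and compute a total
# success rate. Field names and check order are illustrative assumptions.
def final_outcome(award, transferred, still_enrolled, hours_earned):
    """award: None, 'certificate' (less than associate's), or 'associate'."""
    if award == "associate":
        return "associate+ and transferred" if transferred else "associate+ w/o transfer"
    if award == "certificate":
        return "< associate and transferred" if transferred else "< associate w/o transfer"
    if transferred:
        return "transferred w/o award"
    if still_enrolled and hours_earned >= 30:
        return "still enrolled, 30+ college hours"
    return "no success outcome"

cohort = [
    {"award": "associate", "transferred": True,  "still_enrolled": False, "hours_earned": 64},
    {"award": None,        "transferred": False, "still_enrolled": True,  "hours_earned": 36},
    {"award": None,        "transferred": False, "still_enrolled": False, "hours_earned": 12},
]
outcomes = [final_outcome(**s) for s in cohort]
# Total success rate = share of the cohort reaching any success category
success_rate = sum(o != "no success outcome" for o in outcomes) / len(outcomes)
print(outcomes, f"total success rate = {success_rate:.0%}")
```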
Next Steps… finishing data runs
• Run benchmarks at state and institutional levels
• Disaggregate and analyze performance by:
  • academic readiness
  • income
  • ethnicity
  • gender
• Identify and document promising interventions
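One plausible reading of the percentage increases on the milestone slides is a comparison of final success rates between students who reached a milestone and those who did not, within each enrollment-status group; the planned disaggregation works the same way for any subgroup. A minimal sketch under that reading, with assumed column names (enrollment_status, hit_milestone, succeeded):

```python
# Sketch: compare final success rates for students who did and did not reach a
# milestone, within each subgroup. Column names are assumed for illustration.
import pandas as pd

students = pd.DataFrame({
    "enrollment_status": ["full-time"] * 4 + ["part-time"] * 4,
    "hit_milestone":     [True, True, False, False, True, True, False, False],
    "succeeded":         [True, True, True, False, True, False, True, False],
})

rates = (students
         .groupby(["enrollment_status", "hit_milestone"])["succeeded"]
         .mean()
         .unstack("hit_milestone")
         .rename(columns={True: "with_milestone", False: "without_milestone"}))

# Relative increase in the final success rate associated with the milestone,
# reported separately for each enrollment-status group.
rates["pct_increase"] = ((rates["with_milestone"] - rates["without_milestone"])
                         / rates["without_milestone"] * 100)
print(rates)
```

With real data, the same groupby would run over academic readiness, income, ethnicity, or gender instead of enrollment status.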
What does all this mean and what are the policy implications?
Context of the Data
• Reflects the mission of colleges/state systems
  • Examples: age distribution, award distribution
• Reflects differences in state priorities/policies
  • Example: college transfer policy
• Not all measures pertain to all students
  • Example: gatekeeper math course
Impact of state policy on outcomes
• Differences in transfer patterns reflect policy differences
  • Encouraging transfer after earning a degree
    • e.g. FL – 69% transfer after degree & 7% before
  • Encouraging transfer without a degree
    • e.g. TX – 25% transfer without degree
  • Absence of strong transfer policies
    • e.g. OH – 22% transfer after degree & 6% before
  • Balanced approach to transfer
    • e.g. NC – 16% transfer after degree & 14% before
Implications of the Data: College Perspective
• Identification of at-risk students
• Student advising
• Review of policies/practices
• Examples:
  • Course taking sequence/timing
  • Drop/add policies
Implications of the Data: State/System Perspective
• Policy development to improve student success
• Review of regulations
• Performance indicators
• Development/refinement of student database
• Benchmarking
Implications of the Data: National Perspective
• More appropriate measure of student success: accountability
• Financial aid policy
• Expand the body of knowledge on successful community colleges/practices
Acknowledgements
The following individuals are gratefully acknowledged for their contributions to this presentation:
• Chris Baldwin: Jobs for the Future
• Pat Windham: Florida
• Donna Jovanovich: Virginia
• Corby Coperthwaite: Connecticut
Achieving the Dream: Success is what counts.
www.achievingthedream.org