The Utility of High School Achievement Data for Placement in Foundational Skills Courses
John Hetts, Mark Taylor, Andrew Fuenmayor, Karen Rothstein (Long Beach City College); Daniel Lamoree (Mt. San Antonio College)
RP Group Annual Conference, April 1, 2013
Promise Pathways: 2012-2013 Cohort • Board of Trustees Retreat, July 24, 2012
Five goals
• Initiate reflection on student transition to college, assessment, and placement
• Describe our research
• Introduce its implementation: the Fall 2012 LBCC Promise Pathways
• Briefly showcase tools Andrew, Dan, and Karen have built to help you
• Help collectively unlock the potential to better understand our students and improve our colleges
Transition to College: Assessment and Placement
• CCCs are open enrollment, which requires assessing and planning for students' educational needs
• Most colleges rely almost entirely on standardized assessment, valued for:
• Ease of administration
• Transparency of administration
• Impartiality of administration
• Pre-approval
Consequences for student transition to college
• The majority of students are placed into basic skills
• Placement is a significant barrier to completion
• Colleges' first interaction with students is "You are not ready for college"
• Courses take the first few weeks to convince students
• Substantial long-term systemic changes:
• More, longer, slower remediation sequences
• Reduction of access, especially at the bottom of sequences
Conventional Wisdom Explaining Assessment Results
• It is a problem with today's students
• Students are simply, vastly unprepared for college
• "Kids these days…"
• It is a problem with public education
• Public education is failing to prepare high school students
• "Teachers these days…"
What If the Conventional Wisdom Is Wrong?
• Flynn effect: a substantial, long-term increase in IQ
• ~90% of 18-24 year olds hold a HS degree
• NAEP: scores are at or near highs in virtually every age category for every demographic group
• A growing body of research questions the effectiveness of standardized assessment for placement
• Little relation to college course outcomes (e.g., Belfield & Crosta, 2012; Edgecombe, 2011; Scott-Clayton, 2012; Scott-Clayton & Rodriguez, 2012; Xu, forthcoming)
• NAGB, 2012: highly variable cut scores, with 2-year colleges often using HIGHER cut scores than 4-year colleges
We sought local answers to three questions:
• What predicts how students assess and place into developmental courses?
• What predicts how students perform in those courses?
• How well are placement and performance aligned?
Our Research
• Five longitudinal cohorts tracking more than 30,000 students, built with the help of Cal-PASS
• Focusing on 7,000 HS grads who attended LBCC directly after high school
• Examined the predictive utility of a wide range of high school achievement data
• Most notably 11th-grade California Standards Test (CST) scores and high school grades
• Assessment and placement into developmental skills courses
• Performance in those classes
Alignment in English
[Table of placement vs. course-performance alignment; significance key: * p < .05, ** p < .01, *** p < .001, x = p < 1 × 10^-10]
Alignment in Math
[Table of placement vs. course-performance alignment; significance key: * p < .05, ** p < .01, *** p < .001, x = p < 1 × 10^-10]
The Key Takeaways
• We want to use assessments that predict how students will perform at our college
• Standardized tests predict standardized tests
• Classroom performance predicts classroom performance
• More information tells us more about our students than less information
• Significant opportunities exist to improve placement, student achievement, and students' college experience
Reimagining the Transition to College: Fall 2012 Promise Pathways
• Predictive placement model in English & Math using multi-method, evidence-based assessment
• English: A or B in 12th-grade English (as proxy)
• Math: predicted rate of success using all variables ≥ average success rate in the course
• Prescriptive, full-time course load via a pre-populated first-semester success plan
• Emphasis on English, Reading, Math
• Required group counseling workshop
• Student success course
• Additional experimental pilots (contextualized reading, academic achievement coaches)
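The two placement rules above can be sketched as simple predicates. This is an illustrative sketch only; the function names, the "A"/"B" proxy check, and the threshold comparison follow the slide's description, not LBCC's actual implementation:

```python
# Illustrative sketch of the Promise Pathways placement rules described above.
# Function and field names are hypothetical; the rules come from the slide text.

def english_placement(grade_12_english: str) -> str:
    """Proxy rule: an A or B in 12th-grade English places at transfer level."""
    return "transfer-level" if grade_12_english in ("A", "B") else "developmental"

def math_placement(predicted_success: float, course_avg_success: float) -> str:
    """Place into the course when the student's predicted success rate
    meets or exceeds the course's average success rate."""
    if predicted_success >= course_avg_success:
        return "transfer-level"
    return "developmental"
```

The point of the sketch is that neither rule consults a standardized test score: both run entirely on high school achievement data.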
Success rates in transfer-level courses, Fall 2012
Neither of these differences approaches significance, p > .30
How can it affect time to college-level work?
1,270 years of student time saved, or approximately 2.6 years per student
Can it be done?
• Can your college do this research?
• Could this model apply at your college too?
• Can your college afford to implement something like this?
If you're interested in doing this research too:
Step 1: Acquiring Cal-PASS Data
• Log in at mycalpass.org and click on the data menu.
Raw Cal-PASS Data Step 1
How is STEPS 2.0 different?
• Interface: redesigned the initial STEPS database using VBA, SQL, and shell scripts to fully automate data management
• Forms to facilitate data validation
• High school course ID (CBEDS code)
• Prior-to-college level (MIS: CB21)
• In addition to the original spring-to-fall direct matriculants, the dataset adds students who:
• enrolled in the summer term immediately after graduation
• enrolled for the first time in subsequent terms
• did not enroll in MATH or ENGL in their first term at your institution
*Note: You may restrict the dataset to fully mirror the original methodology!
Step 2
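The cohort-expansion logic above can be sketched as a filter over student records. The field names here are hypothetical stand-ins for the Cal-PASS/MIS variables STEPS actually uses:

```python
# Sketch of the STEPS 2.0 cohort selection; field names are hypothetical.
# mirror_original=True reproduces the original spring-to-fall-only methodology.

def build_cohort(students, mirror_original=False):
    """Return the student records kept in the analysis dataset.

    STEPS 2.0 additionally keeps summer matriculants, later first-time
    enrollees, and students with no MATH/ENGL in their first term.
    """
    if mirror_original:
        return [s for s in students
                if s["first_term"] == "fall_after_grad"
                and s["took_math_or_engl"]]
    return [s for s in students
            if s["first_term"] in ("fall_after_grad",
                                   "summer_after_grad",
                                   "later_term")]
```

Passing `mirror_original=True` corresponds to the note on the slide about restricting the dataset to the original methodology.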
Who is in the dataset? Step 2
Final Analysis: Placement Step 3
Predicting Placements Step 3
Using the Model to Develop Alternative Placements
• To compute this measure, we capture all the coefficients from our Predicting Success model.
Step 3
Using the Model to Develop Alternative Placements (continued)
Step 3
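In a standard logistic-regression setup, turning captured coefficients into a predicted success rate is the inverse-logit of the linear predictor. The coefficient names and example values below are invented for illustration and are not the model's actual estimates:

```python
import math

# Hypothetical sketch: predicted success rate from captured logistic-regression
# coefficients. Feature names and values are illustrative only.

def predicted_success(features, coefficients, intercept):
    """Inverse-logit: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    logit = intercept + sum(coefficients[name] * value
                            for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-logit))

# A student is then offered the alternative (higher) placement when this
# predicted rate meets or exceeds the course's average success rate.
```

With a positive hypothetical coefficient on HS GPA, a higher GPA yields a strictly higher predicted success rate, which is what drives the alternative-placement comparison.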
So, can it be done? • Can your college do this research? • YES! (Still skeptical? Come see Dan demo his even newer and more improved tool at 1:45 tomorrow in Mountain Vista) • Could this model apply at your college too? • YES! (Not sure? Join us for a panel discussing similar research and two statewide pilots, 10:15 AM Campus Vista) • Can your college afford to implement something like this? • Can your college afford not to? • But wait, there’s more… • Tools to help understand effects of other interventions • Broad new understandings of our students and education
Improved tools to understand student success interventions (come find out more at 9 AM, Valley Vista tomorrow)
A powerful start, but self-selection questions remain. However, if supplemented by data generated by these tools…
Outcome: success in courses in the following term
What did LBCC gain through a prescriptive, evidence-based approach to the transition to college?
• Dramatic increases in students attaining early educational outcomes, and shorter times to do so
• New discussion of research and instructional pedagogy, kick-starting experimentation and innovation
• A strong challenge to conventional wisdom and to perceptions of our students by administration, staff, faculty, and students themselves
• Most importantly, concrete, achievable steps that any college can take to dramatically improve all of our students' futures
Can you get in touch with us?
• General questions: Mark Taylor, mtaylor@lbcc.edu or (562) 938-4206
• Research questions: John Hetts, jhetts@lbcc.edu or (562) 938-4052
• STEPS participation: Terrence Willett, twillett@rpgroup.org or (510) 527-8500, ext. 109
• STEPS syntax/output questions: Andrew Fuenmayor, afuenmayor@lbcc.edu or (562) 938-4133
• STEPS 2.0/Illuminous questions: Dan Lamoree, DLamoree@MtSAC.edu or (909) 274-4708
Resources
• More information about our research: http://bit.ly/PathwaysResearch
• Achieving the Dream/Jobs for the Future summary of alternative assessment: http://bit.ly/AlternativeAssessment
• CCRC research on assessment, placement, and progression in developmental education: http://bit.ly/CCRCAssess
• More information about the RP Group's Student Transcript-Enhanced Placement (STEPS) Project: http://bit.ly/RPSTEPS
• Step-by-step process for replication: http://bit.ly/RPSTEPS2