Developmental Education: Assessment, Placement, and Progression
Thomas Bailey
Based on research by Katherine Hughes, Shanna Jaggars, and Judith Scott-Clayton
National Context
• For many (most?) entering CC students, the assessment center is one of the first places they will visit
• For the majority of students sitting for these exams, the result is placement into developmental education
• Yet research has not consistently found that this process actually improves student outcomes
CCRC Literature Review (Hughes & Scott-Clayton)
• Examined three questions:
  • Is there consensus regarding the proper purpose and role of assessment in CCs?
  • Are the most commonly used assessments valid for their intended purpose?
  • Are there alternative models of assessment that may improve outcomes for underprepared students?
• The CUNY study brings new data to bear on a similar set of questions
No Consensus on the Meaning of “College Ready”
• Many assessments
• Many cutoff scores
• Many policies with respect to:
  • Mandatory Testing
  • Mandatory Placement
[Figure 3: Educational Outcome by Math CPT Score and Estimated Discontinuity]
Are Dev Ed Assessments Valid?
• CUNY uses the COMPASS math and reading tests (published by ACT, Inc.; one of the two most common assessments)
• There are many different ways to think about validity:
  • Construct validity: does the test measure what you think it does?
  • Predictive validity: does the test predict some measure of later success?
  • Argument-based approach to validity: “It is the interpretation of test scores required by proposed uses that are evaluated, not the test itself” (Standards for Educational and Psychological Testing)
• The focus here is on predictive validity
  • This is a necessary, but not sufficient, component of the overall validity of the test
  • “[U]ltimately, it is the responsibility of the users of a test to evaluate this evidence to ensure the test is appropriate for the purpose(s) for which it is being used” (College Board, 2003, p. A-62)
• The broadest analysis of validity eventually requires a program evaluation: when students are assigned to some treatment on the basis of a score, do better outcomes result?
Predictive Validity Analysis
• Research questions:
  • How well do placement test scores predict “success” in the relevant gatekeeper course?
  • How well do other measures (such as high school performance) predict success, either instead of or in addition to placement test scores?
  • How many students are “correctly placed” using current placement test cutoffs to divide students, versus assigning all students to the same level?
What is “Gatekeeper Success”?
• “Gatekeeper” course: the first college-level course
• We look at three measures:
  • Completed the course with a B or higher
  • Completed the course with a C or higher
  • Passed the course (D- or higher)
• All three measures of success are conditional on actually enrolling in a gatekeeper course (see the sketch below)
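To make the conditionality concrete, here is a minimal sketch in Python; the DataFrame, its columns (grade_points on a 4.0 scale, took_gatekeeper), and the example records are hypothetical stand-ins, not the study's actual data.

```python
import pandas as pd

# Hypothetical student records (illustrative values, not CCRC data):
df = pd.DataFrame({
    "took_gatekeeper": [1, 1, 1, 0],
    "grade_points":    [3.3, 2.0, 0.0, float("nan")],  # B+, C, F, no course
})

# Success is conditional on actually enrolling in a gatekeeper course.
gk = df[df["took_gatekeeper"] == 1]

success = pd.DataFrame({
    "b_or_higher":    gk["grade_points"] >= 3.0,
    "c_or_higher":    gk["grade_points"] >= 2.0,
    "passed_d_minus": gk["grade_points"] >= 0.7,  # D- or higher
})
print(success.mean())  # share of enrollees meeting each success bar
```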
Research Method Overview
• Focus on first-time 2004-2007 entrants at two-year colleges only, who have both CAS and placement test data
• First, estimate the statistical relationship between placement test scores (and/or other predictors) and gatekeeper success:
  • Restrict the sample to students who took the gatekeeper course without taking developmental coursework (the “estimation sample”)
  • Regress gatekeeper success on placement test scores (and/or other predictors)
  • Examine two summary measures: R-squareds and correlation coefficients
• Second, use logistic regression to predict which students are likely to be “correctly placed” under different placement criteria (a sketch follows)
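A minimal sketch of this two-step procedure using statsmodels on simulated data; every variable name and parameter value here is an assumption for illustration, not CCRC's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Simulated stand-ins for the real data: a placement score and a binary
# gatekeeper-success outcome loosely related to it.
score = rng.normal(50, 10, n)
success = (score / 100 + rng.normal(0, 0.4, n) > 0.45).astype(int)
df = pd.DataFrame({
    "score": score,
    "success": success,
    "took_dev_ed": (score < 45).astype(int),  # hypothetical cutoff of 45
})

# Step 1: estimate on the "estimation sample": students who reached the
# gatekeeper course without developmental coursework.
est = df[df["took_dev_ed"] == 0]
X = sm.add_constant(est["score"])
logit = sm.Logit(est["success"], X).fit(disp=0)
print("correlation:", round(est["score"].corr(est["success"]), 3))

# Step 2: extrapolate predicted Pr(success) to *all* tested students,
# including those the test would send to developmental education.
df["pr_success"] = logit.predict(sm.add_constant(df["score"]))
```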
Methodological Concerns
• Restriction of range:
  • R-squareds and correlations are measured only for those who were placed directly into the gatekeeper course
  • In general this tends to depress R-squareds and correlations (see the simulation below)
• Extrapolation:
  • For the placement accuracy analysis, we must use relationships estimated on about 25% of the data to predict the likelihood of “success” for the other 75%
  • So we must hope that the other 75% aren't that different (not totally implausible)
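A quick simulation of the restriction-of-range point (all parameter values are illustrative assumptions): truncating a sample at a score cutoff shrinks the observed correlation even when the underlying relationship is unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

score = rng.normal(0, 1, n)
outcome = 0.5 * score + rng.normal(0, 1, n)  # true correlation ~ 0.45

full_r = np.corrcoef(score, outcome)[0, 1]

# Keep only the top ~25% of scorers, mimicking a sample of students placed
# directly into the gatekeeper course.
kept = score > np.quantile(score, 0.75)
restricted_r = np.corrcoef(score[kept], outcome[kept])[0, 1]

print(f"full sample r = {full_r:.2f}, restricted sample r = {restricted_r:.2f}")
```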
Placement Accuracy Rates
• We know who will be placed in dev ed or gatekeeper based on test scores
• We can estimate whether or not a given individual is predicted to succeed based on test scores
• Can then assign each person to one of four cells (Figure 1)
• The placement accuracy rate is the sum of the bottom-left and upper-right cells
• Can also compare this to the accuracy rate without using the test at all (see the sketch after Figure 1)
Figure 1

                        Placed in gatekeeper    Placed in dev ed
Predicted to fail       False positives         Accurately placed
Predicted to succeed    Accurately placed       False negatives
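A sketch of how the four cells and the accuracy rate could be computed, continuing the simulated example above; the cutoff value and the 50% prediction threshold are assumptions for illustration only.

```python
# Continuing from the simulated df with a `pr_success` column above.
cutoff = 45  # hypothetical placement cutoff

placed_in_gk = df["score"] >= cutoff
predicted_pass = df["pr_success"] >= 0.5  # assumed prediction threshold

# Figure 1's four cells:
false_pos = placed_in_gk & ~predicted_pass   # in gatekeeper, predicted to fail
false_neg = ~placed_in_gk & predicted_pass   # in dev ed, predicted to pass
accurate = (placed_in_gk & predicted_pass) | (~placed_in_gk & ~predicted_pass)

print("placement accuracy rate:", round(accurate.mean(), 3))

# Comparison without the test: place everyone at the same level and take the
# better of "all gatekeeper" vs. "all dev ed".
no_test = max(predicted_pass.mean(), (~predicted_pass).mean())
print("best no-test accuracy:", round(no_test, 3))
```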
Caveats
• Maximizing placement accuracy rates may not be the goal
• Our computation treats false positives and false negatives equally, but we may care more about one than the other
• Values about which type of error is worse can be inferred from where the cutoff is placed:
  • Example: Pr(passing GK) for math at the cutoff is 67%
  • This means those just below the cutoff are wrongly placed (false negatives) 67% of the time
  • We could increase placement accuracy by lowering the cutoff
  • But if we think failing someone in GK is 2x worse than making someone take developmental coursework unnecessarily, then the cutoff is in the right spot (see the worked sketch below)
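To make the weighting arithmetic explicit, here is a tiny worked sketch; the 67% pass rate comes from the slide above, while the 2:1 cost ratio is the slide's hypothetical, not an estimated quantity.

```python
p_pass = 0.67         # Pr(passing GK) for a student right at the cutoff

cost_false_neg = 1.0  # holding back a student who would have passed
cost_false_pos = 2.0  # letting in a student who then fails (assumed 2x worse)

# Expected cost of each placement decision for a student at the cutoff:
expected_cost_dev_ed = p_pass * cost_false_neg            # 0.67
expected_cost_gatekeeper = (1 - p_pass) * cost_false_pos  # 0.66

# Roughly equal expected costs: under this 2:1 weighting, the cutoff is
# about where it should be.
print(expected_cost_dev_ed, expected_cost_gatekeeper)
```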
Predictive Validity: Take-Away Messages
• Placement tests are much better at predicting who is likely to do well in the gatekeeper course than at predicting who is likely to fail
• Placement tests are more predictive of gatekeeper success in math than in English
• High school academic measures are almost as predictive as math test scores, and more predictive than English test scores
• Placement accuracy rates are only modestly higher in some cases, and substantially worse in others, than what would result if no tests were used
  • But weighting false positives and false negatives differently may change this conclusion
• Analysis of the effectiveness of remediation is still to come
For more information:
Please visit us on the web at http://ccrc.tc.columbia.edu, where you can download presentations, reports, and CCRC Briefs, and sign up for news announcements.

Community College Research Center
Institute on Education and the Economy, Teachers College, Columbia University
525 West 120th Street, Box 174, New York, NY 10027
E-mail: ccrc@columbia.edu
Telephone: 212.678.3091

CCRC is funded in part by: Alfred P. Sloan Foundation, Bill & Melinda Gates Foundation, Lumina Foundation for Education, The Ford Foundation, National Science Foundation (NSF), and the Institute of Education Sciences of the U.S. Department of Education