
Time on Test, Student Motivation, and Performance on the Collegiate Learning Assessment: Implications for Institutional Accountability. Session 353 Braden J. Hosch, Ph.D. Director of Institutional Research & Assessment Central Connecticut State University AIR Annual Forum, Chicago, IL


Presentation Transcript


  1. Time on Test, Student Motivation, and Performance on the Collegiate Learning Assessment: Implications for Institutional Accountability Session 353 Braden J. Hosch, Ph.D. Director of Institutional Research & Assessment Central Connecticut State University AIR Annual Forum, Chicago, IL June 1, 2010

  2. Overview • Voluntary System of Accountability and the Collegiate Learning Assessment • Institutional Profile • Methodology and Limitations • Findings • Implications for Accountability

  3. Voluntary System of Accountability • Initiative among public colleges, led by APLU and AASCU, to provide the public with comparable information in a common format • Pre-emptive response to findings issued by the Spellings Commission (2006) • Learning outcomes must be posted by Spring 2011

  4. VSA Learning Outcomes • Results from cross-sectional administration to first-year students and seniors of one of three tests: • Collegiate Learning Assessment (CLA) • Collegiate Assessment of Academic Proficiency (CAAP) • Measure of Academic Proficiency and Progress (MAPP) • The template reports scaled assessment scores, SAT/ACT scores of tested students, and an institutional relative-to-expected (RTE) score

  5. Collegiate Learning Assessment • Constructed-response test that measures: • Critical Thinking • Problem Solving • Analytical Reasoning • Writing • Two tasks: • Performance task (90-minute time limit) • Analytic writing task (60-minute time limit) • Scored holistically and converted to a scaled score, percentile score, and RTE score.
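The "relative-to-expected" (RTE) score mentioned above compares observed performance with the performance predicted from students' entering academic ability. As an illustrative sketch only, with made-up data and a plain least-squares fit rather than CAE's actual scoring model, an RTE-style value can be derived as the residual from a regression of institutional mean CLA scores on mean SAT scores:

```python
import numpy as np

# Hypothetical institution-level data (NOT real CLA results): mean SAT
# and mean CLA scaled score for 50 imaginary institutions.
rng = np.random.default_rng(0)
mean_sat = rng.uniform(900, 1400, size=50)
mean_cla = 0.8 * mean_sat + 100 + rng.normal(0, 40, size=50)

# Fit the "expected" CLA score from SAT across institutions
# (ordinary least squares; np.polyfit returns [slope, intercept]).
slope, intercept = np.polyfit(mean_sat, mean_cla, deg=1)
expected = slope * mean_sat + intercept

# RTE-style residual: positive means scoring above expectation.
rte = mean_cla - expected
print(rte.round(1)[:5])
```

Because the residuals come from an OLS fit, they average to zero across institutions by construction; an individual institution's RTE only says how it sits relative to that fitted line.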

  6. Central Connecticut State University • Public – part of Connecticut State Univ. System • Carnegie 2005 Master’s-Larger Programs • New Britain, CT (Hartford MSA: ~ 1.2 million pop.) • Fall 2009 Enrollment: • 12,461 headcount (9,989 undergraduate, 22% residential); 9,619 full-time equivalent enrollment • 52% female; 17% minority • Full-time, first-time students: 1,281 (56% residential) • Mean SAT score: 1025 (Fall 2009) • Six-year graduation rate of full-time, first-time students entering in Fall 2003: 49%

  7. CCSU CLA Percentile Scores

  8. Detailed Results by Semester

  9. Test Administration • Test administration procedures evolved over time because of difficulty in recruitment. • First-year students recruited through FYE courses • Incentives varied by instructor • Seniors recruited primarily through email • Incentive = graduation regalia (~$25 - $40)

  10. First-Year Students

  11. Seniors

  12. Methodological Issues & Limitations • Methodological Issues • Cross-sectional design • Different proctors in 2007-08 than in 2008-09 and 2009-10 • Different incentives for FY students and seniors • Within administrations • Across administrations • Hand-timing • Time on test as an imperfect proxy for motivation

  13. Minutes Spent on Test by Scaled CLA Score (All Available)

  14. Minutes Spent on Test by Scaled CLA Score (2007-08)

  15. Minutes Spent on Test by Scaled CLA Score (2008-09)

  16. Minutes Spent on Test by Scaled CLA Score (2009-10)

  17. Minutes Spent on Test by Relative-to-Expected CLA Score (All Available)

  18. Minutes Spent on Test by Relative-to-Expected CLA Score (2007-08)

  19. Minutes Spent on Test by Relative-to-Expected CLA Score (2008-09)

  20. Minutes Spent on Test by Relative-to-Expected CLA Score (2009-10)

  21. Correlations (First-Year Students)

  22. Correlations (Seniors)

  23. Regressions (First-Year Students)

  24. Regressions (Seniors)
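The correlation and regression analyses named in slides 21–24 can be sketched as follows. The data here are synthetic and purely illustrative (the coefficients and sample size are assumptions, not the study's results); the point is the shape of the analysis: does time on test predict CLA score after controlling for entering ability?

```python
import numpy as np

# Synthetic student-level data, NOT the CCSU sample.
rng = np.random.default_rng(1)
n = 200
sat = rng.normal(1025, 120, n)       # entering ability (cf. CCSU mean SAT)
minutes = rng.uniform(20, 90, n)     # hand-timed minutes spent on test
score = 0.7 * sat + 3.0 * minutes + rng.normal(0, 60, n)

# Bivariate Pearson correlation between minutes and score (slides 21-22).
r = np.corrcoef(minutes, score)[0, 1]

# Multiple regression, score ~ intercept + sat + minutes (slides 23-24),
# via ordinary least squares on the design matrix.
X = np.column_stack([np.ones(n), sat, minutes])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"r(minutes, score) = {r:.2f}; minutes coefficient = {beta[2]:.2f}")
```

In a setup like this, the coefficient on minutes estimates how many scaled-score points an extra minute of effort buys once SAT is held constant, which is the sense in which "time spent on test matters" in the findings.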

  25. Survey Results • Differences in test scores and time spent on test by self-reported motivation were suggestive but not statistically significant. • Scores and time use aside, the percentage of students who agreed or strongly agreed that they were highly motivated to participate in the CLA was: • 34% First-Year Students • 70% Seniors

  26. Self-Reported Motivation “I feel highly motivated to participate in this activity today.” Differences are suggestive but NOT statistically significant.

  27. Test Modality Preference “I perform better on essay tests than on multiple choice tests.” Differences are suggestive but NOT statistically significant.

  28. Assessment Modality Preference “I prefer to take a test rather than write a paper.”

  29. Test Anxiety “I get so nervous when I take tests that I don't usually perform my best work.”

  30. Student Responsibility for Learning “Students are responsible for learning material assigned by their professors.” * Significant at p < 0.05 (ANOVA).

  31. Institution Responsibility for Learning “Colleges and universities are responsible if students don't learn what they need to be successful after they graduate.”

  32. Mandatory College Exit Tests “All college students should be required to pass a standardized exit test in order to graduate.” Differences are suggestive but NOT statistically significant.

  33. Publications of College Rankings “Students should use published college rankings (like US News and World Report) when deciding which school to attend.” Differences are suggestive but NOT statistically significant.

  34. Overall Findings • Time spent on test MATTERS • What students say about their motivation may not matter (much) • The relationship between time and test scores is generally missing from discussions about accountability

  35. Implications (1) • Acknowledge that test scores may be influenced by motivation/time spent on test; support further research into these effects. • Longitudinal testing may help control for some of the effects of motivation / time spent on test.

  36. Implications (2) • Multi-year moving averages might improve meaningfulness of test-score information. • Statistical adjustments based on time spent on test should be explored but may not be technically or politically feasible.
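The multi-year moving average suggested above can be sketched in a few lines; the yearly RTE values below are hypothetical, and a trailing three-year window is an assumed choice, not one from the presentation:

```python
import numpy as np

# Hypothetical institutional RTE scores by year, e.g. 2005-06 .. 2009-10.
rte_by_year = np.array([-0.4, 0.1, -0.2, 0.3, 0.0])

# Trailing 3-year moving average: each output value averages one
# 3-year window, smoothing year-to-year noise in the reported score.
window = 3
moving_avg = np.convolve(rte_by_year, np.ones(window) / window, mode="valid")
print(moving_avg.round(2))  # one value per 3-year window
```

Smoothing trades timeliness for stability: a single strong or weak cohort (or a change in proctoring or incentives, as in slide 12) moves the reported figure by only a third of its raw effect.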

  37. Implications (3) • Explore portfolio or other contextual assessment strategies for accountability, especially among consortia of institutions (not unlike athletics conferences) • Recognize that motivation and time effects are also likely present in elementary and secondary education; consider the extent to which performance reflects cognitive vs. behavioral/motivational outcomes.

  38. Questions Contact Information Session 353 Dr. Braden J. Hosch Director of Institutional Research & Assessment Central Connecticut State University hoschbrj@ccsu.edu Paper, handout, and slides online at: http://www.ccsu.edu/page.cfm?p=1973 (see “Research and Presentations”)
