
George D. Kuh Council of Graduate Schools Washington, DC November 22, 2010




  1. Learning Outcomes Assessment: A National Perspective • George D. Kuh • Council of Graduate Schools • Washington, DC • November 22, 2010

  2. Advance Organizers • What kind of information about student learning is compelling and useful for: (a) guiding improvement efforts? (b) responding to accountability demands? • What can be done to prepare the next generation of faculty and motivate the current generation to collect and use assessment results to enhance student learning? • And what about assessing learning in graduate school?!?

  3. Context • Global Competitiveness in Degree Attainment • The New Majority and Demographic Gaps • Questionable Levels of Student Performance

  4. NILOA “Colleges… do so little to measure what students learn between freshman and senior years. So doubt lurks: how much does a college education – the actual teaching and learning that happens on campus – really matter?” David Leonhardt, NYTimes, Sept 27, 2009

  5. Context • Global Competitiveness in Degree Attainment • The New Majority and Demographic Gaps • Questionable Levels of Student Performance • In a Most Challenging Fiscal Environment… We Need Higher Levels of Student Achievement

  6. Assessment 2010 • Greater emphasis on student learning outcomes and evidence that student performance measures up

  7. “It’s the Learning, Stupid”

  8. Working Definition • Assess (v.): to examine carefully • Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development (Palomba & Banta, 1999, p. 4)

  9. Assessment Purposes • Improvement • Accountability

  10. Indicators • learning outcomes • educational attainment (persistence, graduation) • course retention • transfer • student success • success in subsequent courses • degree/certificate completion • graduate school • employment/employer evaluations • capacity for lifelong learning

  11. Assessment Tools • Direct (outcomes) measures -- Evidence of what students have learned or can do • Indirect (process) measures -- Evidence of effective educational activity by students and institutions

  12. Occasional Paper #1 Assessment, Accountability, and Improvement Peter T. Ewell Assessments of what students learn during college are typically used for either improvement or accountability, and occasionally both. Yet, since the early days of the “assessment movement” in the US, these two purposes of outcomes assessment have not rested comfortably together.  www.learningoutcomeassessment.org/OccasionalPapers.htm

  13. Two Paradigms of Assessment Ewell, Peter T. (2007). Assessment and Accountability in America Today: Background and Context. In Assessing and Accounting for Student Learning: Beyond the Spellings Commission. Victor M. H. Borden and Gary R. Pike, Eds. Jossey-Bass: San Francisco.

  14. Assessment 2010 • Greater emphasis on student learning outcomes and evidence that student performance measures up • Demands for comparative measures • Increased calls for transparency ---public disclosure of student and institutional performance

  15. Templates • APLU/AASCU Voluntary System of Accountability • NAICU’s U-CAN • College Navigator (NCES) • Transparency by Design/College Choices for Adults (WCET) • AACC (yet to be named) • Degree Qualifications Inventory • Alliance Guidelines • NILOA Transparency Framework

  16. Assessment 2010 • Greater emphasis on student learning outcomes and evidence that student performance measures up • Demands for comparative measures • Increased calls for transparency ---public disclosure of student and institutional performance • Assessment “technology” has improved markedly, but still is insufficient to document learning outcomes most institutions claim

  17. Sample Data Sources • Locally-developed measures • National instruments • National Survey of Student Engagement (NSSE) • Beginning College Survey of Student Engagement (BCSSE) • Faculty Survey of Student Engagement (FSSE) • Cooperative Institutional Research Program (CIRP) • Your First College Year (YFCY) • College Student Experiences Questionnaire (CSEQ) • Noel Levitz Student Satisfaction Inventory • ETS MAPP and Major Field Tests • ACT Collegiate Assessment of Academic Proficiency • Collegiate Learning Assessment (CLA) • Institutional data -- GPA, financial aid, transcripts, retention, certification tests, alumni surveys, satisfaction surveys… • Electronic portfolios

  18. Valid Assessment of Learning in Undergraduate Education (VALUE) Rubrics • Inquiry and analysis • Critical thinking • Creative thinking • Written communication • Oral communication • Reading • Quantitative literacy • Information literacy • Teamwork • Problem solving • Civic knowledge and engagement • Intercultural knowledge and competence • Ethical reasoning and action • Foundations and skills for lifelong learning • Integrative learning

  19. AAC&U VALUE Project – 15 Rubrics

  20. Measuring Quality in Higher Education (Vic Borden & Brandi Kernel, 2010) • A web-based inventory of assessment resources hosted by AIR. Key words can be used to search the four categories: • instruments (examinations, surveys, questionnaires, etc.); • software tools and platforms; • benchmarking systems and data resources; • projects, initiatives, and services. http://applications.airweb.org/surveys/Default.aspx

  21. Do we measure what we value? Or do we value what we measure? Wise decisions are needed about what to measure in the context of campus mission, values, and desired outcomes.

  22. Summary • Perhaps more assessment underway than some acknowledge or wish to believe • More attention needed to using and reporting assessment results • Involving faculty is a major challenge • More investment likely needed to move from data to improvement

  23. According to provosts, what is the driving force for assessment? a. Institutional Commitment to Improvement b. Accreditation c. Faculty & Staff Interest d. Governing Board Mandate • Answer: b. Accreditation -- rated of “high importance” by 85% (regional) and 80% (specialized)

  24. Summary • Perhaps more assessment underway than some acknowledge or wish to believe • More attention needed to using and reporting assessment results • Involving faculty is a major challenge • More investment likely needed to move from data to improvement • Accreditation is a major force shaping assessment

  25. Regional accreditors cite deficiencies in student learning outcomes assessment with greater frequency • Middle States: two-thirds of institutions have follow-up requirements, with assessment the number one reason • NEASC: 80% of institutions asked for follow-up on student learning outcomes assessment • HLC: 7 out of 10 institutions are being monitored, the vast majority for student learning outcomes assessment

  26. Looking Back: What’s Been Accomplished? • Assessment Seen as Legitimate • Goals for Learning Established • A “Semi-Profession” for Assessment • Much Better Instruments and Methods

  27. Looking Back: What Remains to be Done? • Authentic Faculty Ownership • Assessment Still an “Add-On” • Use of Information for Improvement is Underdeveloped • Sincere Institutional Engagement with Accreditors in Assessment

  28. Advance Organizers • What kind of information about student learning is compelling and useful for: (a) guiding improvement efforts? (b) responding to accountability demands? • What can be done to prepare the next generation of faculty and motivate the current generation to collect and use assessment results to enhance student learning? • Do we care about assessing learning in graduate school?!?

  29. Questions & Discussion
