Different Perspectives on the Assessment Mandate: The Results of a Survey
Neil Pagano, Associate Dean, Columbia College Chicago
npagano@colum.edu
2007 NASPA Assessment & Retention Conference
139th Belmont Stakes
Question Posted to Assess Listserv (2004)
“Is there any evidence that a higher education assessment/evaluation of student learning program has indeed produced positive (or negative) change in the quantity or quality of what students actually learn. There seems to be a lot of anecdotal information about how assessment/evaluation programs were created and implemented, but not any actual empirical support. Considering the logistical and personnel-related ramifications of such an undertaking, any success in getting a program off the ground and moving is certainly noteworthy and to be commended. However, I am trying to prepare a report on the actual effectiveness of assessment/evaluation programs. Is there a program that is doing what it is purported to do: improving student learning. If so, is there any (weak, so-so, or solid) empirical evidence to this effect? Any good studies?”
Two Responses
• “Assessment done well is effective and assessment done poorly is not effective.”
• Three possible explanations:
  • Assessment is still relatively new
  • Assessment is “decentralized or course-embedded”
  • “Faculty already do a darned good job teaching, and their assessment results simply document that.”
Prior Research
• Peterson, Einarson, Augustine, & Vaughan (1999), Institutional Support for Student Assessment: Methodology and Results of a National Survey
• Survey – ISSA: Purposes, Methods, Structures, & Impact
  • Preparing for self-study or accreditation (1st in importance)
  • Improving the achievement of undergraduate students (2nd in importance)
Conclusions…
“Institutions do not routinely use student assessment data in internal decision-making or monitor its impact on important areas of institutional and student performance. Given the extensive claims made for the value of students’ assessment and the substantial human and financial resources invested in student assessment activities, institutions need to give greater priority to examining how student assessment data is used, and how it impacts the performance of individual students and the institution itself.”
Follow-Up Study
• Peterson, Vaughan, & Perorazio (2002), Student Assessment in Higher Education: A Comparative Study of Seven Institutions
• “Exemplary” institutions for “benchmarking”
• Ten domains, including Initiating Conditions, Institutional Approach, Culture, and Utilization
• Only one institution (Wake Forest University) used assessment results “extensively”
Research Questions
1. What are the reasons for undertaking assessment?
2. What assessment methods are used and which are valued?
3. How effective have these assessment efforts been?
4. What variables (institution type, control, respondent position) affect responses to Questions 1, 2, & 3?
Survey: Four Sections
• Purpose
• Methods Used
• Methods Valued
• Effect of Assessment Efforts
Survey Distribution and Responses
• Two listservs: Assess (University of Kentucky) and Communities of Practice
• Snowball sampling for further coverage
• 331 total completed responses
Limitations
• Sampling method not random
  • Purposive method likely to recruit the “choir”
• Mixture of respondents
  • Some from the same institution
• Basic statistical analysis – ANOVA, t-tests, and chi-square (a minimal sketch of these tests appears after this list)
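For readers less familiar with these three tests, the sketch below shows what each analysis might look like in Python with scipy.stats. Everything here is a hypothetical placeholder, not the survey's actual data: the group names, ratings, and counts are invented purely for illustration.

```python
# Hedged sketch: ANOVA, t-test, and chi-square on made-up survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical 1-4 "methods used" ratings from three respondent groups.
faculty = rng.integers(1, 5, size=40)
assessment_leaders = rng.integers(1, 5, size=40)
administrators = rng.integers(1, 5, size=40)

# One-way ANOVA: do mean ratings differ across the three positions?
f_stat, p_anova = stats.f_oneway(faculty, assessment_leaders, administrators)

# Independent-samples t-test: compare two groups directly.
t_stat, p_ttest = stats.ttest_ind(faculty, administrators)

# Chi-square test of independence: position vs. whether a method is valued.
# Rows = positions; columns = (valued, not valued) counts (invented).
valued_counts = np.array([[25, 15],
                          [30, 10],
                          [28, 12]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(valued_counts)

print(f"ANOVA:      F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"t-test:     t = {t_stat:.2f}, p = {p_ttest:.3f}")
print(f"chi-square: chi2 = {chi2:.2f}, p = {p_chi2:.3f}")
```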
Survey Respondents by Institutional Type and Position
Purposes of Assessment by Institution Type – ANOVA
Purposes of Assessment by Position – ANOVA
Purposes: A Comparison to the ISSA
Assessment Methods Used
Scale: 1 = not used; 2 = used in some areas; 3 = used in most areas; 4 = used in all areas
Assessment Methods Used by Institution – ANOVA
Assessment Methods Used by Institution – ANOVA (cont.)
Assessment Methods Valued
Methods Used vs. Methods Valued
Methods Valued by Institution Type – Chi-Square
• Associate institutions placed less value on:
  • Alumni Surveys (p = .008)
  • Capstone Courses (p = .002)
• Baccalaureate institutions placed less value on Employer Surveys (p = .008)
Methods Valued by Position – Chi-Square
• Faculty placed relatively less value on 9 of the 12 methods. Statistically significant differences in:
  • Departmental Exams (p = .006)
  • Student Papers (p = .001)
  • Student Portfolios (p = .016)
  • Capstone Courses (p < .001)
  • Commercial Exams (p = .046)
  • Student Interviews/Focus Groups (p = .023)
• Faculty placed more value on Anecdotal Evidence, though this difference was not statistically significant (p = .132)
Perspectives on the Effects of the Assessment Mandate: 4 Survey Items
1. Our institutional assessment efforts have been effective.
2. Our institutional assessment efforts have identified areas where we need to make curricular/programmatic changes.
3. We have made curricular/programmatic changes as a result of our assessment.
4. It is important that every institution have an assessment plan.
• No differences by institutional type or control
Perspectives on the Effect of Assessment by Position – ANOVA
Closing Comments
• Institution type matters
  • Different institutions have different priorities and purposes
• Position matters
  • Faculty, assessment leaders, and administrators differ on purposes, methods valued, and the ultimate effect of the mandate
Closing Comments (cont.)
• Accreditation is an important lever
  • Effects of revised expectations
• Need to know more
  • US higher ed post-Spellings Commission
  • What is “assessment”?