Implementing Sustainable, Useful Assessment: So What’s the Problem?
The Perception • “Seasoned observers have pointed out the irony of the academy, as an institution dedicated to discerning the truth through evidence, being so seemingly resistant to measuring quality through evidence. It is an irony that puzzles—and frustrates—a widening circle of stakeholders.” Perspectives, American Association of State Colleges and Universities – Spring 2006
The Perception • “Too many decisions about higher education—from those made by policymakers to those made by students and families—rely heavily on reputation and rankings derived to a large extent from inputs such as financial resources rather than outcomes.” Report from the Dept. of Education’s National Commission on Education, 2006
The Perception Accountability Agenda: • "Expand and refine evidence of institutional performance and student achievement, assuring that this vital evidence is used as the key factor in determination of the quality of higher education;" Council for Higher Education Accreditation
Question to Ponder • Why, after more than a decade of intensive effort by accrediting agencies and leaders in higher education, are we still struggling to implement outcomes assessment? • How we answer this question determines what solutions we seek.
Technical Deficit Hypothesis • We haven’t yet found the necessary technical tools – software, measuring instruments, procedures – that will help us satisfy our accrediting agencies without disrupting our teaching and research practices.
Motivational Hypothesis • Although there is a need for technical improvements, we already have the necessary tools to proceed successfully. • The primary problem is that past and current incentives in higher education favor the status quo – input-based assessment.
Interaction Hypothesis • Technical and motivational factors interact. • To succeed, faculty must: • Have the necessary tools • Have the knowledge to use them • Be motivated to do so. • Deficiency in any area can impede progress.
Tools, Knowledge, Motivation • Necessary to • Define & structure the process • Minimize clerical requirements • Provide institution-wide coordination • Provide common recording and reporting formats • TracDat, iWebfolio, Measures …
Tools, Knowledge, Motivation • Training • Startup Consultation • Technical & Conceptual – all levels • Resident Trainers/Advocates • On-going requirement • Must be formalized • Tied to the job role, not the person
Tools, Knowledge, Motivation • Observation suggests that motivation presents the greatest challenge and has received the least systematic attention. • Note to Audience: Pay attention to what people do, not what they say they do.
The Fundamental Issue • Inputs (what we provide – the independent variables): Teaching practices, Teacher qualities, Courses, Programs, Facilities, Equipment, Credentials • Outputs (results we produce – the dependent variables): Student learning, Alumni achievements, Community impact
Consider … • Classroom visits • Student evaluations • Accreditation reviews • Program reviews • Where’s the focus – inputs or outputs?
How It Is • Diagram: Faculty/Staff focus is directed at Inputs – $ equipment $, $ credentials $, $ facilities $, $ teaching $ – not at Outputs
What’s Expected • Diagram: Faculty/Staff focus directed at Outputs – Student learning, Alumni achievements, Community impact
Consequences of Input Focus • Program effectiveness remains unknown • Drives up program costs • Impedes search for more effective inputs.
So … • I’ve suggested that progress toward effective outcome-focused assessment requires: • Appropriate tools • Knowledge to use them effectively • Motivating conditions • Of these, the motivational factors present the greatest challenge and have received the least systematic attention.
Factor 1 • Relative economic advantage • Public impressed by inputs • The package sells, not the content • Economic advantage favors inputs, but events signal impending change.
Factor 2 • Social Prestige • Impressive inputs = high prestige • More PhDs • Bigger labs • Famous faculty
Factor 3 • Vested Interests • Faculty want their degrees to be valued. • Wealthy schools want their lavish resources to be valued. • Faculty feel safer with input assessment • Input focus allows grandiose outcome claims • Input focus enriches our environments but strains our budgets
Factor 4 • Ease of observing advantages • Inputs – advantages are obvious – easily measured • Outcomes – advantages are nebulous or unknown – difficult to measure
Score: 4 to 0 • Economic advantage still favors inputs • Prestige still based largely on inputs • Vested interests support inputs • Advantage-recognition favors inputs
Transition • We are currently in a transition phase. • Measurement of outcomes is being demanded, but inputs are still used as the primary indicators of quality.
Transition • Assessment based on learning outcomes holds the promise of clear advantages for students and society. • Hence the growing external demand • But tangible advantages for educational institutions and educators are still lacking. • Hence the grudgingly slow response
Transition • Most of us do assessment to avoid aversive consequences, not to achieve positive outcomes. • We must come to recognize that pursuing outcome improvement is ultimately in our best interest.
The Challenge • Convince boards, administrators and faculty that the rules of survival are changing. • Identify and put in place in-house incentives for outcome-focused assessment.
Possibilities • Promulgation of positive outcomes and improvements • Provides recognition • Provides exemplars • Outcome-based resource allocation • Outcome-based personnel evaluation
The Choices • Wait for public consequences to be put in place and risk not being prepared. • Proactively implement in-house incentives favoring outcome-focused assessment. • What affects faculty & staff is what matters.
Heads Up! The ball’s in your court … Don’t Blow It! Thank You