Assessing a First-Year Seminar FYS Leadership Institute Columbia, SC April 2012
FAITH-BASED? • “Estimates of college quality are essentially ‘faith-based,’ insofar as we have little direct evidence of how any given school contributes to students’ learning.” • Richard Hersh (2005), Atlantic Monthly
Assessment Defined • Any effort to gather, analyze, and interpret evidence that describes program effectiveness. • Upcraft and Schuh, 1996 • An ongoing process aimed at understanding and improving _______. • Thomas Angelo
Assessment Cycle • 1) Identify Outcomes • 2) Gather Evidence • 3) Interpret Evidence • 4) Implement Change • Maki, P. (2004)
Two Types of Assessment • 1) Summative – used to make a judgment about the efficacy of a program. • 2) Formative – used to provide feedback in order to foster improvement.
Word of Caution Assessment only allows us to make inferences about our programs, not to draw absolute truths.
Multiple Lenses of Assessment • Criterion referenced • Peer referenced • Longitudinal • Value added • Suskie, L. (2004)
Hypothetical Scenario • The average score on a final writing assessment was 80. • How’d we do? • NEED MORE INFORMATION! • We need a lens to help us make a judgment.
Lens 1: Criterion Referenced (e.g., placement tests, exit exams, externally mandated assessments) Key Question: How did students do against a pre-determined standard? Example: • 80% of students scored above 80, the minimum threshold for proficiency.
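To make the criterion lens concrete, here is a minimal sketch in Python. All scores and the 80-point threshold are invented for illustration; the point is simply that the judgment is made against a fixed standard, not against other students.

```python
# Hypothetical criterion-referenced check: what share of students met
# a pre-determined proficiency threshold? (Scores and the threshold
# of 80 are invented; they are not real FYS data.)
scores = [85, 90, 78, 82, 95, 70, 88, 91, 84, 86]
THRESHOLD = 80

proficient = sum(1 for s in scores if s >= THRESHOLD)  # count at/above the standard
rate = proficient / len(scores)
print(f"{rate:.0%} of students met the criterion")  # -> 80% of students met the criterion
```

Note that the same scores could look strong or weak under a different lens; the criterion lens answers only “did students clear the bar we set in advance?”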
Lens 2: Peer Referenced (benchmarking) Key Question: How do we compare with our peers? • Gives a sense of relative standing. Example: • 80% of our students were proficient. • But 90% were proficient in our peer group.
Lens 3: Longitudinal Key Question: Are we getting better? Example: • 80% of students were proficient • But 3 years ago, only 60% were proficient • Showed great improvement. • Is that due to our efforts? • Maybe we just admitted better students!
Lens 4: Value Added Key Question: Are our students improving? Example: • The proficiency level of the freshman class during the first week was 60%. At the end of the semester, 80% of the same cohort were proficient.
Astin’s Value-Added I–E–O Model • I = Inputs • E = Environments • O = Outcomes • “Outputs must always be evaluated in terms of inputs.” • Astin, A. (1991)
Common Mistakes • Just looking at inputs (I only) • Just looking at environments (E only) • Just looking at outcomes (O only) • E–O only (no control for inputs)
Summary of Value Added • Outputs must always be evaluated in terms of inputs. • It is the only way to “know” the impact an environment (treatment) had on an outcome.
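The value-added logic can be sketched numerically. In this minimal example (all scores are hypothetical), the outcome is evaluated in terms of inputs by comparing pre/post gains for a seminar cohort against a comparison group with similar starting scores, so only the gain beyond the comparison group is attributed to the environment:

```python
# A minimal sketch of Astin's I-E-O logic with invented scores:
# evaluate the outcome (O) in terms of inputs (I), so the environment (E)
# is credited only with the gain, not with the raw final score.

seminar = {"pre": [55, 60, 65, 60], "post": [75, 80, 85, 80]}     # hypothetical FYS cohort
comparison = {"pre": [55, 60, 65, 60], "post": [65, 70, 75, 70]}  # hypothetical non-participants

def mean(xs):
    return sum(xs) / len(xs)

def gain(group):
    """Average post-test minus average pre-test (value added)."""
    return mean(group["post"]) - mean(group["pre"])

# Looking at outcomes alone (O only) would credit the seminar with all 20
# points of growth; comparing gains isolates the environment's share.
seminar_gain = gain(seminar)                       # 20 points
comparison_gain = gain(comparison)                 # 10 points
estimated_effect = seminar_gain - comparison_gain  # 10 points attributable to E
print(seminar_gain, comparison_gain, estimated_effect)
```

This is only the logic of the model, not a real design: in practice, a defensible value-added estimate also requires comparable groups and controls for the many other variables that influence outcomes.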
Methods to Assess Outcomes • Indirect • Proxy measures that stand for the construct of interest; Self-reported data • Direct • Demonstration of abilities, information, knowledge, etc. as the result of participation in a program or utilization of a service
Indirect Measure • An indirect measure is something a student might tell you he or she has gained, learned, experienced, etc. • Aka: self-reported data • Ex: surveys, interviews, focus groups, etc.
Indirect Assessment Methods • Examples • Satisfaction measures • Program evaluations • Self-ratings of skills • Self-assessment of change • Agreement with statements • Inventories • Informal peer-to-peer conversations • Use existing data wherever possible
Survey Examples for Indirect Measures • End-of-course evaluation (local) • College Student Experiences Questionnaire (CSEQ) • National Survey of Student Engagement (NSSE) • Community College Survey of Student Engagement (CCSSE) • Your First College Year (YFCY) • First-Year Initiative Survey (FYI) http://nrc.fye.sc.edu/resources/survey/search/index.php
Qualitative Examples for Indirect Measures • Interviews • Focus groups • Secret shopper • Advisory council
Direct Measures • A direct measure is tangible evidence about a student’s ability, performance, experience, etc. • Ex: performances (papers), common assignments, tests, etc.
Ways to assess direct measures • Course-embedded (essays, assignments) • Portfolios (electronic or hard copy) • Writing sample at the beginning of the course vs. the end of the course • Pre- and post-testing on locally developed tests (of knowledge or skills) • National tests • http://www.sc.edu/fye/resources/assessment/typology.html
Caution! • Be careful not to overextend by directly measuring multiple outcomes each year. • Indirectly assess all outcomes each year, but limit direct measures to 1-3 per year.
Factors to consider when deciding which outcome to directly measure • What matters most to overall effectiveness? • Which outcomes would position us well politically? • What’s doable? • What’s usable? • Which outcomes will set us up for a successful first effort?
Challenges with the Value-Added Approach • Motivation (for direct measures): How do we ensure students take assessment seriously? Is there a hook? • Attribution: Is growth due to our interventions? How do you control for all the variables that could influence the outcomes?