
Assessing First-Year Seminars

Presentation Transcript


  1. Assessing First-Year Seminars First-Year Assessment Conference San Antonio, TX October 13, 2008 Dan Friedman, Ph.D. Director, University 101 University of South Carolina

  2. Agenda • What is Assessment? • Assessment Lenses • Who & What to Assess? • I-E-O model applied • Goals v. outcomes • Formulating learning outcomes

  3. Agenda (continued) 4. How to Assess • Direct v. Indirect Measures • Elective v. Required Courses • Assessing Pedagogies 5. Sharing & Utilizing the Results 6. Final Advice

  4. FAITH-BASED? • “Estimates of college quality are essentially ‘faith-based,’ insofar as we have little direct evidence of how any given school contributes to students’ learning.” • Richard Hersh (2005), The Atlantic Monthly

  5. What is Assessment?

  6. Assessment Defined • Any effort to gather, analyze, and interpret evidence which describes program effectiveness. • Upcraft and Schuh, 1996 • An ongoing process aimed at understanding and improving _______. • Thomas Angelo

  7. Assessment Cycle: Identify Outcomes → Gather Evidence → Interpret Evidence → Implement Change (and back to Identify Outcomes). Maki, P. (2004).

  8. Two Types of Assessment 1) Summative – used to make a judgment about the efficacy of a program 2) Formative – used to provide feedback in order to foster improvement.

  9. Word of Caution Assessment only allows us to make inferences about our programs, not to draw absolute truths.

  10. The Prescription (Rx)

  11. Rx for Assessing a 1st-Year Seminar • Relevance – Content (doing the right things) • Excellence – Effectiveness (doing things right)

  12. Assessment Lenses

  13. Multiple Lenses of Assessment • Standards based • Peer referenced • Longitudinal • Value added Suskie, L. (2004)

  14. Hypothetical Scenario • If Dan made a 55 on some sort of exam, how did he do? • NEED MORE INFORMATION! • Need a lens to help us make a judgment.

  15. Lens 1: Standards Based (local or external) Key Question: How do results compare to some internal or external standard? Example: • Dan made a 55 • A score of 45 is considered proficient • 80% of students at our institution scored above a 45 • Is that good?

  16. Lens 2: Peer Referenced (benchmarking) Key Question: How do we compare with our peers? • Gives a sense of relative standing. Example: • 80% of students at our institution scored above a 45. • For our Peer Group, 90% scored above 45.

  17. Lens 3: Longitudinal Key Question: Are we getting better? Example: • 80% of students at our institution scored above a 45. • But 3 years ago, only 60% scored above a 45. • Showed great improvement. • Is that due to our efforts? • Maybe we just admitted better students!

  18. Lens 4: Value Added Key Question: Are our students improving? Example: • Dan scored a 35 when he first took the test as a freshman. After three years of college, Dan scored a 55. • Proficiency level of freshman class was 40%. Three years later, 70% of same cohort were proficient.

  19. Astin’s Value-Added I-E-O Model: Inputs (I) → Environments (E) → Outcomes (O). “Outputs must always be evaluated in terms of inputs.” Astin, A. (1991)

  20. Common Mistakes • Just looking at inputs (I only)

  21. Common Mistakes • Just looking at environment (E only)

  22. Common Mistakes • Just looking at outcomes (O only)

  23. Common Mistakes • E-O only (no control for inputs)

  24. Summary of Value Added • Outputs must always be evaluated in terms of inputs • Only way to “know” the impact an environment (treatment) had on an outcome.
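
The “common mistakes” slides above all amount to dropping one leg of the I-E-O model. As a rough illustration only (not part of the presenter’s or Astin’s materials), the sketch below simulates hypothetical data and compares an E-O-only estimate of a seminar’s effect with one that controls for an input; the variable names, numbers, and use of ordinary least squares are all assumptions.

```python
# Rough illustration of the I-E-O point; all names and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Inputs (I): a pre-enrollment characteristic, e.g., high-school GPA.
hs_gpa = rng.normal(3.0, 0.5, n)

# Environment (E): seminar enrollment, with mild self-selection
# (better-prepared students enroll slightly more often).
seminar = (hs_gpa + rng.normal(0.0, 0.5, n) > 3.0).astype(float)

# Outcome (O): first-year GPA driven mostly by the input,
# plus a true seminar effect of +0.10.
fy_gpa = 0.8 * hs_gpa + 0.10 * seminar + rng.normal(0.0, 0.3, n)

def ols(y, *predictors):
    """Least-squares coefficients for y ~ intercept + predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Mistake: E-O only (no control for inputs) overstates the seminar effect.
print("E-O only, seminar effect:", round(ols(fy_gpa, seminar)[1], 3))

# I-E-O: controlling for the input recovers something near the true +0.10.
print("I-E-O,    seminar effect:", round(ols(fy_gpa, seminar, hs_gpa)[1], 3))
```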

  25. Who & What to Assess

  26. Inputs • An input would be any pre-enrollment variable regarding our students that could conceivably impact the outcome. • What are our inputs? • Academic preparedness (high school performance; SAT scores, etc) • Demographics (gender, race, parental education, etc) • Attitudes & Behaviors • Motivation • Expectations regarding level of engagement in college • Study habits

  27. Sources of Input Data • Admissions • Registrar • Institutional Research • Surveys • College Student Inventory (CSI) • College Student Expectations Questionnaire (CSXQ) • Beginning College Survey of Student Engagement (BCSSE) • Freshman Survey (Higher Education Research Institute – UCLA) • Survey of Entering Student Engagement (SENSE) – for community colleges http://www.sc.edu/fye/resources/assessment/typology.html

  28. Environment • The environment = intervention or treatment.

  29. Environment • Is the FYS really just one treatment? • What are the individual variables in a FYS that could contribute to our outcomes?

  30. Environment Individual factors comprising a first-year seminar that could contribute to outcomes: • Small class size • Out-of-class engagement • Faculty-student interaction • Peer connections • Use of peer leader • Specific content • Time management • Academic skill development

  31. Outcomes • Academic Outcomes • Grades, Persistence, Graduation • Writing • Personal Development Outcomes • Social, emotional, ethical, physical • Attitudinal & behavioral • Satisfaction • Engagement in learning experience • Time management • Cognitive • Knowledge of specific content • Wellness, campus policies, school history, etc.

  32. We should measure our specific program goals & outcomes!

  33. Goal vs. Outcome • Goal – general, broad, and abstract. Ex: Help students achieve academic success. • Outcome – specific and concrete. Ex: Students will strengthen their note-taking skills.

  34. Learning Outcomes (a.k.a. objectives) • A statement that “identifies what students should be able to demonstrate or represent or produce as a result of what and how they have learned at the institution or in a program” (p. 61). Maki, P. L. (2004). Assessing for Learning.

  35. A good learning outcome is… • Observable – uses action words; what should students be able to DO? • Focused on outcomes – what students should be able to do after the course: “as a result of this course, students should….” • Clear – no fuzzy terms (e.g., “appreciate”) • Uses active verbs (create, develop, evaluate, apply, identify, formulate, etc.) • Maki, P. L. (2004). Assessing for Learning.

  36. Examples • As a result of this course, students should be able to: • Locate and evaluate electronic information in the university’s library. • Identify appropriate campus resources • Articulate the purpose of general education

  37. Evidence of Learning • What evidence is necessary to sufficiently infer that a student has met or achieved a specific outcome? Students will strengthen their note-taking skills. • What does this look like? • Need to develop standards, criteria, metrics, etc

  38. 4. How to Assess? • Direct v. Indirect Measures

  39. Indirect Measure • An indirect measure is something a student might tell you he or she has gained, learned, experienced, etc. • A.k.a. self-reported data • Ex: surveys, interviews, focus groups, etc. • Use existing data wherever possible

  40. Survey Examples for Indirect Measures • College Student Experiences Questionnaire (CSEQ) • National Survey of Student Engagement (NSSE) • Community College Survey of Student Engagement (CCSSE) • Your First College Year (YFCY) • First-Year Initiative Survey (FYI) http://nrc.fye.sc.edu/resources/survey/search/index.php

  41. Qualitative Examples for Indirect Measures • Interviews • Focus groups • Advisory council

  42. Direct Measures • A direct measure is tangible evidence about a student’s ability, performance, experience, etc. • Ex: performances (papers), common assignments, tests, etc.

  43. Ways to assess direct measures • Course embedded (essays, assignments, etc.) • Portfolios (electronic or hard copy) • Writing sample at beginning of course v. end of course • Pre- and post-testing on locally developed tests (of knowledge or skills) • National tests • http://www.sc.edu/fye/resources/assessment/typology.html

  44. Challenges with the Value-Added Approach • Motivation (for direct measures): How do we ensure students take assessment seriously? Is there a hook? • Attribution: Is growth due to our interventions? How do you control for all the variables that could influence the outcomes?

  45. Making Comparisons • For elective courses – compare with students who did not enroll (control group). • For REQUIRED courses – can only compare with Peer Institutions (benchmarking) or with prior years (longitudinal).
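
As a rough sketch of the elective-course case (hypothetical column names and made-up numbers, not data from the presentation), a comparison like the one below contrasts retention for enrollees and non-enrollees and also checks whether the two groups look similar on an input, so the comparison is not purely E-O.

```python
# Minimal sketch of an elective-course comparison; columns and data are
# hypothetical placeholders, not real institutional data.
import pandas as pd

students = pd.DataFrame({
    "took_fys": [1, 1, 0, 0, 1, 0, 1, 0],
    "sat":      [1100, 1050, 1200, 980, 1010, 1150, 990, 1020],
    "retained": [1, 1, 1, 0, 1, 1, 1, 0],
})

summary = students.groupby("took_fys").agg(
    n=("retained", "size"),
    retention_rate=("retained", "mean"),
    mean_sat=("sat", "mean"),  # are the two groups comparable on this input?
)
print(summary)
```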

  46. Other Considerations • Do all types of students and sub-populations experience or benefit from the course in the same way? • Disaggregate data by sub-populations • Ex: • Minority • First-generation • Gender • Ability level • When looking at GPAs, it might be wise to factor out FYS grade.
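
A minimal sketch of the last two points, assuming hypothetical column names, a 4-point grading scale, and a 3-credit-hour seminar: it recomputes GPA with the FYS grade factored out and disaggregates both figures by a sub-population flag.

```python
# Minimal sketch: disaggregate an outcome by sub-population and factor the
# FYS grade out of GPA. All columns and values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "first_gen":    [1, 0, 1, 0, 1, 0],                      # sub-population flag
    "total_points": [45.0, 52.0, 38.0, 60.0, 41.0, 50.0],    # grade points earned
    "total_hours":  [15, 16, 14, 18, 15, 16],                 # credit hours attempted
    "fys_points":   [12.0, 12.0, 9.0, 12.0, 9.0, 12.0],       # grade points from FYS
    "fys_hours":    [3, 3, 3, 3, 3, 3],
})

# GPA with and without the first-year seminar grade.
df["gpa"] = df["total_points"] / df["total_hours"]
df["gpa_excl_fys"] = (df["total_points"] - df["fys_points"]) / (
    df["total_hours"] - df["fys_hours"]
)

# Disaggregate the outcome by sub-population (here, first-generation status).
print(df.groupby("first_gen")[["gpa", "gpa_excl_fys"]].mean())
```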

  47. Sharing & Utilizing the Results

  48. “You Can’t Fatten a Pig by Weighing It” – T. Angelo

  49. Ways to Share Results • Host forum to process what the data mean • Standing assessment committee for FYS • Newsletters • Website • Chain of command

  50. Final Advice
