Using Assessment Data for Continuous Improvement in General Education AAC&U Conference Atlanta, Georgia February 19, 2005
Presented by… Linda Pomerantz-Zhang, Ph.D. lpomerantz@csudh.edu C. Edward Zoerner, Jr., Ph.D. ezoerner@csudh.edu Sue Fellwock-Schaar, Ed.D. sschaar@csudh.edu California State University, Dominguez Hills
Is your institution collecting data for your General Education Program?
What are you hoping to gain from this session on assessment?
CSUDH Profile • One of 24 CSU campuses, 4 in the LA metro area • 13,000 students, 61% undergraduate • Small but growing number of first-time freshmen (5,900 students are upper-division transfers) • Approx. 90% of incoming freshmen need remediation in math, English, or both • High degree of cultural and racial diversity • ECLP grant to improve critical student skills in writing, reading, and critical reasoning
Background to GE Assessment Activities… • Administrators unable to encourage faculty to initiate & sustain program review for GE • System-wide pressure to reduce size of GE package & streamline curriculum • Fear of “turf wars” • Inconsistent oversight from departmental level on up • General distrust of administration motives • 15 years of unsuccessful administrative efforts to reduce size of GE & BA/BS
And then came… • Growing faculty concern about student achievement levels • Creation of Student Learning Outcomes Assessment Committee • Creation of half-time Assessment Coordinator (faculty) • WASC focus on student learning outcomes and “culture of evidence” • ECLP grant aimed at improving student writing • GE Syllabus Analysis sponsored by ECLP • Embedded Assessment initiatives • Willingness to review GE in 5-year cycle
The tensions… Who controls the process? (Administration vs. Faculty) How are the data used? (Course improvement vs. faculty/program faultfinding)
Departments were notified in Spring 2003 that materials should be gathered in Fall 2003 and submitted in February 2004
Questions we asked in planning our GE Assessment… • What types of assessment data should/could be collected? • What types of data analyses should be performed, and who should conduct them? • Who should see the collected data and analyses? • How should assessment data be utilized, and by whom? • What types of obstacles may be encountered in utilizing data? • How can the tension between assessment and faculty fears be managed? • How can an operative feedback loop be ensured?
Initial procedures in Fall 2003… Forms for the department review coordinators (chair or other as appointed) were distributed at the initial meeting • Overview of the process • Course Assessment Record form for each Area A Objective • Coordinator Report forms • Instructor Forms to be attached to graded student work
Data collected from faculty… Types: • Syllabi • Student work with grades • Sample exams with grades Analyzed for: • Alignment w/ GE objectives • Student-centered, measurable course objectives • Appropriate materials • Academic rigor • Commensurability across sections
Data collected from departments… • Description of ways assessment methods were selected & aligned with GE objectives • Description of how the department ensured commensurability of course requirements & student learning across sections • Description of how the results of the review will be used to improve the instructional program • Recommendations for changes in the Area A learning outcome objectives • Feedback about the GE review process
Data collected from Registrar… • Grades for all courses under review, by section but without faculty names attached
• Review Team (4-5 members) met in March 2004 to analyze documentation and data • Rubric was developed and review procedures were designed • Team reviewed findings with the Administrator who was working with us (LPZ) • Administrator wrote a letter regarding the Team's findings to the GE Committee
• GE Committee reviewed the findings of the Review Team • GE Chair wrote a letter to the Department and attached the letter from the Team (only the letter from the GE Committee went to the Deans and Provost) • Team requested a preliminary update in the Fall and a more complete update in Spring 2005
Department’s Original Response • Fear that some would employ results to argue for only one semester of Freshman Comp • General mild resentment over process; some duplication of labor • Mild hope that some good could come from the review
Morals… • Make the process as non-threatening as possible; some will perceive it as threatening despite efforts to the contrary • Ask about local assessment practices and consider using (parts of) them before implementing a system-wide practice
Original Department Actions and Results… • Chair attended meeting to learn about the process and forms • Chair attempted to gather the required data and provide analysis • Chair underestimated the time and energy required and initially turned in an inadequate report
Moral… • If possible, provide a model response to guide those writing reports
Departmental next steps… • Director of Composition assumed responsibility for data accumulation and analysis • Report conformed much more closely to the desired parameters
Moral… • Get the right person for the job!
Principal responses from Review Team… • Concern over course objectives in Freshman Comp II • Concern about syllabus uniformity and match to catalog description, especially in Freshman Comp II • Concern about grading practices—higher-than-expected percentages of grades in the A and B range • Concern about the way individual instructors mark papers
Departmental response… • Director of Comp and Chair review syllabi more carefully; return to faculty for revision as needed • Memo of “Review of Objectives” and “Learning Outcomes” given to all instructors • Director of Comp held a Grading Standards Workshop • Memos to/conferencing with instructors regarding grading standards • Rebuttal to Review Team’s concern regarding instructors’ marking of papers
Morals… • Seemingly large problems have (potentially) simple solutions • Reviewers need to be careful about commenting on things outside their professional expertise • Dialog can be fruitful
Short-term results… • Generally improved syllabi, with greater clarity and more explicit focus on analytic writing • Little change in grade distribution
Morals… • Things can improve as a result of the assessment process • Attempts at improvement must be sustained
Longer-range results… • Heightened awareness of need to simplify Basic Skills Objectives in Catalogue • General departmental satisfaction with Freshman Composition sequence • Understanding of need to move students to analytical writing more quickly • Changes in way pre-Freshman course is run
Longer-range results… (cont.) • Understanding that department needs to remain vigilant against grade inflation • Department needs to consider more systematic mechanism of review to ensure that course offerings uniformly meet desired objectives
Things we learned about the process… • Although not perfect, the process was workable • Rubrics were not a perfect fit but a reasonable guide • Provide rubrics BEFORE departments begin the review • Inter-rater training is important • Consider what kind of report you will give to departments—quantitative or narrative
Things we learned about the process… (cont.) • Five years is too long to wait for recommended changes to be reviewed • Departments that used coordinators generally turned in stronger portfolios • Need to increase awareness that departments are expected to conduct their own internal review and evaluation as part of the process
Things we learned about the process… (cont.) • Start early enough to catch the exceptions to typical scheduling • Plan for contingencies regarding team members • Should consider ways to have direct contact with students or information from them • Mixed results are likely, but waiting until everyone is ready is not in the best interest of the students