An Institutional Writing Assessment Project Dr. Loraine Phillips Texas A&M University Dr. Yan Zhang University of Maryland University College October 2010
Agenda • Why a Writing Assessment Project? • Data Source of the Project • College Participation • Assessment Rubric • Scoring of the Papers • Interrater Agreement • Results • Discussion and Future of the Project
Why a Writing Assessment Project? • Undergraduate core competencies were established for Texas A&M graduates. • Effective communication skills are crucial to student success. • The project provides evidence of the quality of student writing and allows participating faculty to understand their students' performance more comprehensively. • The project is conducted by the Office of Institutional Assessment in conjunction with the University Writing Center.
Data Source • The data in this writing assessment project were student papers from: • Upper-Division • Capstone • or Upper-Division “W” (writing intensive) courses. • Assignments ranged from approximately 1 to 20 pages in length. • Assignments that appealed to a general academic audience were preferred. • Examples given included: persuasive or argument papers, summary papers, analysis papers, letters or correspondence, lab or other reports, and case studies.
Assessment Rubric • The writing assessment rubric was developed in conjunction with the University Writing Center, the assessment liaisons, and the Core Curriculum Council. • The rubric was designed to promote validity, uniformity, and consistency in the grading process. • The rubric was organized into four specific criteria to help manage grading. • After feedback from the project pilot, the rubric was revised into the version shown in the following figure.
Rater Calibration and Scoring • All identifiable information for students and faculty was redacted from the papers. • All-day grading sessions were conducted with Dr. Valerie Balester, Executive Director of the University Writing Center, and Dr. Candace Schaefer, Associate Director of the University Writing Center, serving as facilitators. • Faculty members were calibrated by paper genre for the scoring session. • The scoring sheet is provided in the following slide.
Writing Assessment Project Scoring Sheet • Grading was done by faculty members from across the institution's disciplines. • Grader participation is included in the following slide.
Interrater Agreement • Each writing assessment assignment was scored by two independent graders, with a third grader brought in when the first two scores disagreed substantially. • Interrater agreement, measured by the intraclass correlation coefficient (.624), was judged to be statistically substantial. • As the intraclass correlation coefficient approaches 1.0, there is less variance within item ratings.
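The deck does not specify which form of the intraclass correlation was used. As an illustration only, the sketch below computes a one-way ICC(1,1) in Python for papers each scored by two independent raters; the ratings and the choice of ICC form are assumptions, not the project's actual analysis.

```python
# A minimal sketch (not the project's actual analysis) of a one-way
# intraclass correlation, ICC(1,1), for papers each scored by two raters.
import numpy as np

def icc_oneway(scores: np.ndarray) -> float:
    """scores: (n_papers, n_raters) matrix of rubric scores on the 1-3 scale."""
    n, k = scores.shape
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)
    # Between-paper and within-paper mean squares (one-way ANOVA).
    ms_between = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return float((ms_between - ms_within) / (ms_between + (k - 1) * ms_within))

# Hypothetical ratings: 6 papers, 2 independent raters, scale 1-3.
ratings = np.array([[2, 2], [3, 3], [1, 2], [2, 3], [3, 3], [1, 1]])
print(f"ICC(1,1) = {icc_oneway(ratings):.3f}")
```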
Interrater Agreement (Continued) • The rate at which two graders agreed on an assessed item by giving that item the same score was also reviewed. • Simple agreement between raters on the scores of the items assessed had a mean of .676. • Thus, approximately 68% of the time, two independent graders assessed an item and scored it the same value.
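For illustration, here is a minimal sketch of the simple agreement rate, i.e., the proportion of items on which two raters assigned identical scores; the item-level scores below are hypothetical.

```python
# A minimal sketch (illustrative only) of the simple agreement rate:
# the proportion of rubric items on which two independent raters
# assigned exactly the same score.
import numpy as np

def simple_agreement(rater_a: np.ndarray, rater_b: np.ndarray) -> float:
    """Fraction of items where both raters gave the same score."""
    return float(np.mean(rater_a == rater_b))

# Hypothetical item-level scores (scale 1-3) from two raters.
rater_a = np.array([3, 2, 2, 1, 3, 2, 1, 3])
rater_b = np.array([3, 2, 1, 1, 3, 3, 1, 3])
print(f"Simple agreement = {simple_agreement(rater_a, rater_b):.3f}")  # 0.750 here
```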
Results • As previously noted, the scoring of each category was on a scale of 1 to 3 (3 being highest quality). • The following table displays the university averages based on the departments that participated. • Each category scored has a mean and standard deviation.
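A minimal sketch of how per-category means and standard deviations on the 1-3 scale could be tabulated; the category names and scores below are made up, since the deck does not list the rubric's four criteria.

```python
# A minimal sketch (with made-up category names and scores) of tabulating
# per-category means and standard deviations on the 1-3 rubric scale.
import pandas as pd

# Hypothetical scores; the real project used its own four rubric criteria.
scores = pd.DataFrame({
    "category": ["Content", "Organization", "Style", "Mechanics"] * 3,
    "score":    [3, 2, 2, 3,  2, 2, 3, 3,  3, 3, 2, 2],
})
summary = scores.groupby("category")["score"].agg(["mean", "std"]).round(2)
print(summary)
```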
Texas A&M University Overall Writing Scores (Scale of 1-3) 2009-2010
Texas A&M University Overall Writing Scores (Scale of 1-3) 2008-2009
Strengths of the Project • Faculty engagement and participation • Cross-disciplinary approach • Helps faculty define student writing quality • Helps faculty calibrate expectations for the quality of student writing
Challenges for the Project • Getting the papers! • Representative sample • Calibrating faculty • Long day of scoring—stay nourished!
Discussion and Future of the Project • As a component of Vision 2020, the Academic Master Plan highlights effective communication as a necessary student ability. • Participating departments can use the information from this project to better understand their students' performance. • Steps have also been taken to identify areas of potential improvement and enhancement for this project. • Consider VALUE Rubrics from AAC&U.
February 20-22, 2011 • College Station, Texas • http://assessment.tamu.edu/conference • Plenary Speakers: Dr. Carol Geary Schneider and Dr. Peter Ewell • Call for Proposals now open!
One Minute Evaluation • What was the most valuable thing you learned? • What is one question that you still have? • What do you think is the next step that your program needs to take?