Assessment Strategies, Assessment Networks: Session II • Preliminary Findings from Virginia Tech • Using Assessment Data to Improve Teaching & Learning
Introductions • Office of Academic Assessment, 101 Hillcrest Hall (0157) • Ray Van Dyke, 231-6003, rvandyke@vt.edu • Steve Culver, 231-4581, sculver@vt.edu • Kate Drezek, 231-7534, kmdrezek@vt.edu • Yolanda Avent, yavent@vt.edu • Others here today
Today’s agenda • Review of results from the Office of Academic Assessment (OAA) SACS 3.3.1.1 Departmental Assessment Report • Suggestions for using the identified assessment tools/strategies to incorporate direct assessment of student learning outcomes more explicitly into program-level changes and improvements • Open discussion
Overview: What is Assessment of Learning Outcomes? • “Assessment of student learning is the systematic gathering of information about student learning, using the time, resources, and expertise available, in order to improve the learning.” – Walvoord • A student learning outcome states a specific skill/ability, knowledge, or belief/attitude students are expected to achieve through a course, program, or college experience. • Example: Upon completion of a B.A. degree in English, a student will be able to read critically and compose an effective analysis of a literary text.
What is the Process for Assessing Student Learning Outcomes?
Big Question: • How do we turn this… into a concrete plan?
Departmental Assessment Report • Part of SACS reaccreditation process: • Standard 3.3.1. The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)
Departmental Assessment Report • Part of SACS reaccreditation process: • 3.3.1.1 educational programs, to include student learning outcomes • 3.3.1.2 administrative support services • 3.3.1.3 educational support services • 3.3.1.4 research within its educational mission, if appropriate • 3.3.1.5 community/public service within its educational mission, if appropriate
Departmental Assessment Report • Process: • Interviews with all department heads in late fall/early spring of the 2008-2009 academic year • Approximately 1 hour in length • Conducted by OAA graduate research assistant Yolanda Avent (Educational Psychology)
Departmental Assessment Report • Process: • Focus on concrete changes implemented by departments related to overall programmatic improvements, improvements in advising, and improvements in specific courses • Participants were asked to identify the specific assessment tools/strategies/approaches they used that provided the justification for implementing the identified changes
Departmental Assessment Report • Process: • Yolanda Avent used detailed notes from the interviews to synthesize the information provided by participating departments • Preliminary results reported here today are based on interviews with 50+ departments representing all colleges at the University • Common trends/themes were identified by Ms. Avent and Kate Drezek for purposes of this preliminary report
Departmental Assessment Report • Caveat: Current Context • Assessment is not done in a vacuum • Departments face pressures that also contributed to decisions to make certain program, advising, and course changes • OAA Argument – Systematic assessment is equally if not more key to departments’ ability to innovate/improve in tough times, as it can highlight strategic areas and help prioritize efforts
Departmental Assessment Report Preliminary Results: Program Changes
Changes: Programs • Curricular Mapping • Reconfiguration of majors • Elimination of duplication • Re-sequencing of courses
Changes: Programs • Programmatic Learning Outcome Identification • Explicitly embedding essential learning outcomes/core competencies (e.g., critical thinking, information literacy) in multiple classes • Incorporation of VIEWS requirements in program • Creative incorporation of “non-traditional” learning outcomes within curriculum (e.g., global awareness)
Changes: Programs • Development of Standardized Measures of Student Performance Across Program • Common Rubrics for Project Evaluation (Undergraduate & Graduate) • Common Measurable Outcomes for Thesis/Senior Capstone Students
Changes: Programs • Program Innovations/Incorporation of Current Pedagogical “Best Practices” – Undergraduate: • Undergraduate Research Opportunities, including field experiences • Service Learning Opportunities
Changes: Programs • Program Innovations/Incorporation of Current Pedagogical “Best Practices” – Graduate: • Teaching Mentoring Programs for Grad Students • Incorporation of “high demand” skills – presentation skills, the peer-review writing process, ethics, grant-writing – into existing seminars • Creation of new courses and programs around similar topics
Departmental Assessment Report Preliminary Results: Advising Changes
Changes: Advising • Change in Advising Structure • From a professional advisor to distribution among faculty • From distribution among faculty to a professional advisor • Single-faculty model
Changes: Advising • Change in Advising Structure • Hybrid • Use of introductory courses as opportunities to advise
Changes: Advising • Change in Advising Philosophy/Culture • Informal advising opportunities, e.g., Brown Bag Lunches • Creation of advising centers to make advising more visible, holistic, student-friendly • Plans of Study submitted to Advisor and Chair of Department
Changes: Advising • Leveraging of Technology to Enhance Advising • On-line “self-help” • On-line “tracking” of students for “force-adding” into courses • Carrot/stick approach – blocking course registration until students see their advisor
Departmental Assessment Report Preliminary Results: Course Changes
Changes: Courses • Revision/Reinvention of Instructional Design in Specific Courses • “Special Topics” courses • Move to online instruction • Use of the best available technology as a PEDAGOGICAL tool (e.g., tablets)
Changes: Courses • Revision of Course Objectives to Ensure Alignment with Larger Learning Goals
Departmental Assessment Report Preliminary Results: Assessment Tools/Strategies that Provided Justification for Change
Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: PROGRAM-LEVEL DATA • Enrollment numbers • Retention rates • Course-taking patterns
Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: STUDENTS • Informal feedback • Course evaluations • Focus groups
Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: STUDENTS • Senior Survey • In-class surveys • Exit surveys (students leaving major as well as students graduating)
Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: FACULTY • Informal Feedback – Observation, Reflection • Faculty study group feedback • Feedback via Assessment Committee, Curriculum Committee members • Guided Faculty Reflection Pieces
Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: EXTERNAL CONSTITUENCIES • Alumni: surveys, Alumni Advisory boards • Professionals in Industry: informal feedback from employers, graduate schools; Advisory Boards
Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: DIRECT, SYSTEMATIC ASSESSMENT OF STUDENT LEARNING • Infrequently cited as a tool/strategy justifying programmatic changes • Significance? • Mentioned more often as part of assessment-based changes to specific courses • Not explicitly acknowledged or utilized to its fullest potential for program review
OAA Preliminary Conclusion: Draw better connections between existing practices and tools
Connecting the dots • Importance of networking across departments and colleges to share proven best practices • Office of Academic Assessment resources – tools like national survey data and the VALUE metarubrics • How can we best facilitate this sharing of workable strategies?
Final Thought: “We are being pummeled by a deluge of data and unless we create time and spaces in which to reflect, we will be left with only our reactions.” – Rebecca Blood