
Assessment Strategies, Assessment Networks Session II


Presentation Transcript


  1. Assessment Strategies, Assessment Networks Session II Preliminary Findings from Virginia Tech Using Assessment Data to Improve Teaching & Learning

  2. Introductions • Office of Academic Assessment 101 Hillcrest Hall (0157) • Ray Van Dyke, 231-6003, rvandyke@vt.edu • Steve Culver, 231-4581, sculver@vt.edu • Kate Drezek, 231-7534, kmdrezek@vt.edu • Yolanda Avent, yavent@vt.edu • Others here today

  3. Today’s agenda • Review of Results from Office of Academic Assessment (OAA) SACS 3.3.1.1 Departmental Assessment Report • Suggestions for using identified tools/strategies for assessment to more explicitly incorporate direct assessment of student learning outcomes into program-level changes/improvements • Open discussion

  4. Overview: What is Assessment of Learning Outcomes? • “Assessment of student learning is the systematic gathering of information about student learning, using the time, resources, and expertise available, in order to improve the learning.” – Walvoord • A student learning outcome states a specific skill/ability, knowledge, or belief/attitude students are expected to achieve through a course, program, or college experience. • Example: Upon completion of a B.A. degree in English, a student will be able to read critically and compose an effective analysis of a literary text.

  5. What is the Process for Assessing Student Learning Outcomes?

  6. Big Question: • How do we turn this… into a concrete plan?

  7. Departmental Assessment Report • Part of SACS reaccreditation process: • Standard 3.3.1. The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)

  8. Departmental Assessment Report • Part of SACS reaccreditation process: • 3.3.1.1 educational programs, to include student learning outcomes • 3.3.1.2 administrative support services • 3.3.1.3 educational support services • 3.3.1.4 research within its educational mission, if appropriate • 3.3.1.5 community/public service within its educational mission, if appropriate

  9. Departmental Assessment Report • Process: • Interviews with all department heads in late fall/early spring of the 2008-2009 academic year • Each interview approximately 1 hour in length • Conducted by OAA graduate research assistant Yolanda Avent (Educational Psychology)

  10. Departmental Assessment Report • Process: • Focus on concrete changes implemented by departments related to overall programmatic improvements, improvements in advising, & improvements in specific courses • Participants asked to identify specific assessment tools/strategies/approaches they used that provided the justification for implementing identified changes

  11. Departmental Assessment Report • Process: • Yolanda Avent used detailed notes from interviews to synthesize information provided by participating departments • Preliminary results reported here today based on interviews with 50+ departments representing all colleges at the University • Common trends/themes identified by Ms. Avent/Kate Drezek for purposes of preliminary report

  12. Departmental Assessment Report • Caveat: Current Context • Assessment is not done in a vacuum • Departments face other pressures that also contributed to decisions to make certain program, advising, & course changes • OAA Argument – Systematic assessment is equally if not more key to departments’ ability to innovate/improve in tough times, as it can highlight strategic areas & help prioritize efforts

  13. Departmental Assessment Report Preliminary Results: Program Changes

  14. Changes: Programs • Curricular Mapping • Reconfiguration of majors • Elimination of duplication • Re-sequencing of courses
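
A curriculum map is, at bottom, a matrix of courses against programmatic outcomes. As a minimal sketch of how a department might query such a map for the duplication and sequencing problems named above (the course numbers and outcome labels are invented for illustration, not drawn from any actual Virginia Tech program):

```python
from collections import Counter

# Hypothetical curriculum map: each course -> the outcomes it addresses.
# Course numbers and outcome labels are illustrative only.
curriculum_map = {
    "ENGL 1100": {"critical reading"},
    "ENGL 2200": {"critical reading", "literary analysis"},
    "ENGL 3300": {"literary analysis", "research writing"},
    "ENGL 4400": {"literary analysis", "research writing"},
}

program_outcomes = {
    "critical reading", "literary analysis",
    "research writing", "information literacy",
}

# Coverage gaps: outcomes no course addresses (an argument for
# re-sequencing or new courses).
covered = set().union(*curriculum_map.values())
print("gaps:", program_outcomes - covered)

# Possible duplication: outcomes addressed by more than two courses.
counts = Counter(o for outcomes in curriculum_map.values() for o in outcomes)
print("possible duplication:", [o for o, n in counts.items() if n > 2])
```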

  15. Changes: Programs • Programmatic Learning Outcome Identification • Explicitly embedding essential learning outcomes/core competencies (e.g., critical thinking, information literacy) in multiple classes • Incorporation of VIEWS requirements in program • Creative incorporation of “non-traditional” learning outcomes within curriculum (e.g., global awareness)

  16. Changes: Programs • Development of Standardized Measures of Student Performance Across Program • Common Rubrics for Project Evaluation (Undergraduate & Graduate) • Common Measurable Outcomes for Thesis/Senior Capstone Students
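
Where departments adopt a common rubric, the “measurable” part usually means recording scores on shared dimensions and aggregating them across students. A minimal sketch using an invented three-dimension, 4-point rubric (not an actual OAA instrument):

```python
from statistics import mean

# Hypothetical capstone/thesis scores on a shared 4-point rubric.
# Dimension names and score values are illustrative only.
scores = [
    {"argument": 3, "evidence": 4, "organization": 3},
    {"argument": 2, "evidence": 3, "organization": 4},
    {"argument": 4, "evidence": 4, "organization": 3},
]

# Program-level result: mean score on each rubric dimension.
for dimension in scores[0]:
    avg = mean(student[dimension] for student in scores)
    print(f"{dimension}: {avg:.2f}")
```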

  17. Changes: Programs • Program Innovations/Incorporation of Current Pedagogical “Best Practices” – Undergraduate: • Undergraduate Research Opportunities, including field experiences • Service Learning Opportunities

  18. Changes: Programs • Program Innovations/Incorporation of Current Pedagogical “Best Practices” – Graduate: • Teaching Mentoring Programs for Grad Students • Incorporation of “high demand” skills – presentation skills, peer review writing process, ethics, grant-writing – into existing seminars • Creation of new courses and programs around similar topics

  19. Departmental Assessment Report Preliminary Results: Advising Changes

  20. Changes: Advising • Change in Advising Structure • From advising professional to distribution among faculty • From distribution among faculty to advising professional • Single faculty model

  21. Changes: Advising • Change in Advising Structure • Hybrid • Use of introductory courses as opportunities to advise

  22. Changes: Advising • Change in Advising Philosophy/Culture • Informal advising opportunities, e.g., Brown Bag Lunches • Creation of advising centers to make advising more visible, holistic, student-friendly • Plans of Study submitted to Advisor and Chair of Department

  23. Changes: Advising • Leveraging of Technology to Enhance Advising • On-line “self-help” • On-line “tracking” of students for “force-adding” into courses • Carrot/stick approach – blocking course registration unless the student sees their advisor
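
The carrot/stick item above amounts to a registration hold that clears once an advising visit is on record. A minimal sketch of that rule, using invented record fields and dates rather than any actual registration-system API:

```python
from datetime import date

# Start of the current advising window; date is illustrative only.
ADVISING_WINDOW_OPENS = date(2009, 1, 15)

def registration_blocked(last_advising_visit):
    """Hold registration unless the student has seen an advisor
    since the current advising window opened."""
    return (last_advising_visit is None
            or last_advising_visit < ADVISING_WINDOW_OPENS)

# Hypothetical students: one advised this term, one not.
print(registration_blocked(date(2009, 2, 1)))  # False: hold cleared
print(registration_blocked(None))              # True: must see advisor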

  24. Departmental Assessment Report Preliminary Results: Course Changes

  25. Changes: Courses • Revision/Reinvention of Instructional Design in Specific Courses • “Special Topics” courses • Move to online instruction • Use of best available technology as PEDAGOGICAL tool (e.g., Tablets)

  26. Changes: Courses • Revision of Course Objectives to Ensure Alignment with Larger Learning Goals

  27. Departmental Assessment Report Preliminary Results: Assessment Tools/Strategies that Provided Justification for Change

  28. Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: PROGRAM-LEVEL DATA • Enrollment numbers • Retention rates • Course-taking patterns
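
Retention rate here is simply the share of an entering cohort still enrolled in the program some period later, and course-taking patterns fall out of the same records. A minimal sketch over made-up cohort data:

```python
from collections import Counter

# Hypothetical cohort records: (student id, still enrolled?, courses taken).
cohort = [
    ("s1", True,  ["ENGL 1100", "ENGL 2200"]),
    ("s2", False, ["ENGL 1100"]),
    ("s3", True,  ["ENGL 1100", "ENGL 3300"]),
    ("s4", True,  ["ENGL 2200", "ENGL 3300"]),
]

# Retention rate: fraction of the entering cohort still enrolled.
retention = sum(1 for _, enrolled, _ in cohort if enrolled) / len(cohort)
print(f"retention: {retention:.0%}")  # 75%

# Course-taking pattern: how often each course appears in transcripts.
print(Counter(course for _, _, courses in cohort for course in courses))
```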

  29. Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: STUDENTS • Informal feedback • Course evaluations • Focus groups

  30. Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: STUDENTS • Senior Survey • In-class surveys • Exit surveys (students leaving major as well as students graduating)

  31. Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: FACULTY • Informal Feedback – Observation, Reflection • Faculty study group feedback • Feedback via Assessment Committee, Curriculum Committee members • Guided Faculty Reflection Pieces

  32. Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: EXTERNAL CONSTITUENCIES • Alumni: surveys, Alumni Advisory boards • Professionals in Industry: informal feedback from employers, graduate schools; Advisory Boards

  33. Question: Tools/Strategies for Justification • Tools/Strategies Explicitly Mentioned by Participants: DIRECT, SYSTEMATIC ASSESSMENT OF STUDENT LEARNING • Infrequently cited as the tool/strategy justifying programmatic changes • Significance? • Mentioned more often as part of changes to courses based on assessment • Not explicitly acknowledged or utilized to its fullest potential for program review

  34. OAA Preliminary Conclusion: Draw better connections between existing practices and tools

  35. Connecting the dots • Importance of networking across departments & colleges – proven best practices • Office of Academic Assessment – tools like national survey data, VALUE metarubrics • How can we best facilitate this sharing of workable strategies?

  36. Discussion

  37. Final Thought: “We are being pummeled by a deluge of data and unless we create time and spaces in which to reflect, we will be left with only our reactions.” – Rebecca Blood
