
UWM Assessment Council


Presentation Transcript


  1. UWM Assessment Council Assessment Workshop May 30, 2007 Tony Ciccone Connie Schroeder

  2. Defining Assessment Assessment is the systematic collection of information about student learning, conducted within the time, resources, and knowledge that are available, and used to improve learning by the most effective possible use of available resources -Barbara Walvoord

  3. Our Purposes • 1. To obtain reliable information about student learning that can be used to identify student strengths and challenges, and thus to make informed decisions about curricula and pedagogy (program and course level) • 2. To develop reliable and efficient ways to collect and present this information for ourselves, for granting agencies, for parents and future students, for accrediting bodies (program and institutional level)

  4. WEAVEonline Tool Assessment Cycle: 1. Write expected outcomes/objectives 2. Establish criteria for success 3. Assess performance against criteria 4. View assessment results 5. Effect improvements through actions

  5. Our Model… 1. What are your Program Goals? 2. In which courses are these program goals addressed, and at what level is each goal addressed? 3. What is looked at? What type of direct/indirect measures will you use? 4. What types of information are examined? How is this information evaluated for learning? 5. How are the results examined and used? How will you create a feedback loop to the department?

  6. Content Goals vs. Program Outcomes Most existing “program outcomes” are actually program objectives rather than learning outcomes, and are exclusively focused on content. Example of a content goal or program objective: Students will know, understand, be aware of, learn: [insert topic]. A content goal only tells you what students know, not what they’ll be able to do with that knowledge or content, and it is difficult to assess because the level of learning is not specified.

  7. Example: Anthropology • Program Learning Outcomes: • Identify trends or patterns in anthropological data • Formulate a testable explanation or reasonable interpretation • Identify data that constitute credible evidence for an explanation or interpretation • Analyze and interpret data in a systematic manner …Now you have outcomes around which to design assignments and performances that let you assess the learning…

  8. Example: Social Sciences • Program Learning Outcomes: • Students can identify the role that cultural diversity plays in defining what it means to be a social being • Students can identify the origins, workings, and ramifications of social and cultural change in their own identity • Students can compare the distinctive methods and perspectives of two or more social science disciplines.

  9. Example: Natural Sciences • Program Learning Outcomes: • Students can apply scientific methodology • Students can evaluate the validity and limitations of theories and scientific claims in experimental results • Students can identify the relevance and application of science in everyday life. You then ask, “Where in our program could students learn this, or demonstrate that they’ve learned it?” • Next step: Curriculum Mapping

  10. 2. Identify Courses where one or more program goals are addressed • Curriculum Mapping • (handouts- Business Administration)

  11. Curriculum Map Example (handouts) Align Program Learning Goals to Courses (list the program experiences). Key: I = Introduced, E = Emphasized, R = Reinforced. -OAPA Handbook, Program-based Review and Assessment, UMass Amherst, p. 21

  12. Curriculum Mapping Benefits • The mapping exercise creates a conversation among the program instructors that identifies the gaps and strengths of the program and curriculum, and it enables everyone to look at how the pieces fit together. Does our “learning map” make sense? Is it working? Where is it weak?

  13. 3. Identify Type of Measures (handouts) • Direct Measures (of student learning) • A student performance such as an exam or project • A set of criteria by which to evaluate the performance • Indirect Measures (of student learning) • Produce information (quantitative and qualitative) about factors that are associated with student learning, e.g., student satisfaction surveys, instructor/course evaluations. -Barbara Walvoord; http://www.umass.edu/oapa/oapa/publications/online_handbooks/program_based.pdf

  14. Worksheet: Help identifying existing measures of Program Learning Outcomes

  15. Direct Measures (*indicates UWM) • Capstone course products, projects, thesis, etc. • Capstone faculty assessment of student achievement of learning goals • A culminating assignment, Term papers, theses • Lab Reports • Homework papers • Portfolios of student work, pre-major portfolio review • Presentations, performances • Publications • Class grades • Research reports • Videotapes • Oral defense • Analysis of a text or field experience, student teaching, practicum • Final Exams • Proficiency/majors exams, accrediting/licensing exams • Results from student competitions • Faculty evaluation of ongoing majors’ performance

  16. Direct Measures include qualitative as well as quantitative measures • Not all assessment measures have to involve quantitative measurement. • A combination of qualitative and quantitative methods can offer the most effective way to assess program learning outcomes. • Use an assessment method that matches your culture. • For example, in a department where qualitative inquiry is particularly valued, these types of methods should be incorporated into the plan. The data you collect must have meaning and value to those who will be asked to make changes based on the findings.

  17. Direct Measure: Using Exams To use results of exams as diagnostic tools, questions are grouped according to which of the course and Program Learning [Outcomes] they address. Data can then be analyzed to determine which [Program Learning Outcomes] are being achieved and for which ones we need to modify pedagogy and/or curriculum to enhance student learning of those [Program Learning Outcomes]. -UWM Dept. of Biological Sciences

  18. Grading and Program Assessment • Percentages are not sufficient: “73% of our students get grades of A or better” • Grades are not sufficient: A letter grade by itself does not give enough information about the learning that was tested or the criteria that were used. -Walvoord, p. 13

  19. Using Course-based Assessment of Embedded Exam Questions • Completion of embedded exam questions designed to evaluate selected knowledge and skills. • Test questions developed by a committee of faculty and embedded in the mid-term and final exams of three upper-level classes: Calculus 3, Linear Algebra, and Advanced Calculus. -Department of Mathematics, University of Colorado at Boulder

  20. Indirect Measures • Retention data • Student evaluations • Placement: exiting student plans/placement • Alumni survey • Teaching strategies • Career development: career progress of alumni • Interviews • Example: How many of the students who take a program’s introductory course go on to declare and/or complete the program or major?

  21. Using Indirect Measures to improve a Core Literature Program • Survey administered to students at the end of each core lit. class, asking whether, during that semester, they have read literature not required in class • Student survey administered by Institutional Research to all seniors, asking whether or not they have read books not required in class • Results reported annually to the department for discussion and action

  22. 4. Evaluate Evidence - Direct Link to Learning • Course Assignments and Performances • Measure achievement of learning with Criteria and Rubrics

  23. Two Types of Rubrics • A. Primary Trait Analysis • B. Matrix

  24. Examples of Rubrics Primary Trait Analysis Trait: Synthesis of Ideas • 4 Presents a perspective that synthesizes the main ideas of several readings in a way that gives more meaning to the readings as a whole than if the main ideas were presented individually. • 3 Presents a perspective that synthesizes the main ideas of several readings. This perspective may be very general. • 2 The main idea of one reading is presented as the dominant perspective of the paper. • 1 There is no main idea of the paper.

  25. What is the value of a rubric? • Helps clarify to students what you expect • Saves time in the grading process • Allows for greater consistency and fairness • Helps students evaluate their own work • Helps students give constructive feedback to peers • Helps team teachers or teaching assistants grade consistently • Helps teachers of sequenced courses communicate with each other about standards and criteria • Assists with departmental/institutional assessment

  26. 5. Examine Results • Are students learning what we want them to learn? • What are our students’ strengths and weaknesses? • What are the strengths and weaknesses of our program? • What needs to change or improve? • What should we keep on doing? • What do our program learning outcomes look like? • How do we make learning visible?

  27. “Closing the Loop” • Use data to improve processes • Communicate results

  28. Direct Assessment of Student Academic Work • Where in the curriculum do multiple faculty/instructors examine student work, such as senior projects in the major, Ph.D. qualifying exams, or dissertations? Are there written criteria? Could there be? Do students’ strengths and weaknesses get fed back to the program? Could they be? • Are aggregate results of senior projects reviewed by external evaluators? Are criteria written down? Could they be? Do evaluators give feedback to the program as a whole? If not, could they? -Walvoord, p. 56

  29. Use results to serve the program • Curricular changes [to improve learning]: • Adjust course content • Increase time spent on topics • Split courses • Increase effective strategies to integrate lecture and laboratory content • Allocations to improve resources [to improve learning]

  30. Example: Using Course-embedded Measures for Program Improvement • Political Science Dept.-Senior Thesis • Annual meeting for faculty thesis advisors to report to department in a systematic way the strengths and weaknesses in student work and recommendations of what needed to be done. • Students often did not know how to formulate an appropriate question for inquiry in the field. The department revised courses earlier in the curriculum to place more emphasis on building that skill. Walvoord, B. p. 60

  31. Example: Use of Direct Measure Data in Political Science Dept. • Each semester, the Capstone instructor will give the Chair (or Undergraduate Director) summaries of the evaluations of the students’ performances as indicated by the evaluation form filled out by faculty who supervise capstone papers. -UWM Political Science

  32. Assessment Makes a Difference… • “The capstone instructor(s) last year reported their impression of low graphing skills in seniors; we arranged with the mathematics department for greater emphasis on graphing in the required math course and for assessment of graphing skills during that course, working closely with the capstone instructor(s). The capstone instructor(s) will report next year whether graphing skills are stronger.” • A rubric is currently being developed to assess graphing skills more systematically. Walvoord, B. p. 8

  33. Example: Use of Direct Measure for Communication Dept. Improvement At the end of each year the UG committee analyzes the data, compares it with student data, constructs a report with recommendations for improvements, and presents this to faculty at the beginning of the next academic year. -UWM Department of Communication ‘04

  34. Example: Psych. Dept. Using Laboratory Research Reports • All faculty teaching advanced laboratory courses submit component scores to the Undergraduate Program Committee (UPC). • The UPC reviews the data annually and makes recommendations to the department for action.

  35. Example: Capstone Biology Course Program Learning Outcome: Graduating seniors will be able to conduct original scientific research and present it in writing and orally to a scientific audience. • Course Embedded: Senior Capstone • Explicit Criteria: Rubric • Feedback Loop: • end-of-year dept. annual report • grade results in aggregate • shares assignment, criteria, and aggregate results at department meeting, discussing strengths and weaknesses • recommends departmental action, members offer support • record kept of departmental action and necessary reports to outsiders (administrators, univ. assessment committee, etc.)

  36. Example: Assignment in Core Lit Courses Using Direct Measures In all core lit courses: • Instructors assign an essay requiring students to apply literary critical methods to literature and to acknowledge alternative interpretations. They evaluate students’ essays by explicit written criteria. Using the results to close the loop: • Annual meeting: core lit. instructors report student scores to colleagues; colleagues support instructors’ plans for improvement; the department takes appropriate action if needed; results are reported to the dean or other body.
