
Assessing for Transformation

Assessing for Transformation. Gary Brown The Center for Teaching, Learning & Technology.


Presentation Transcript


  1. Assessing for Transformation Gary Brown The Center for Teaching, Learning & Technology

  2. We should not expect the guidance for change of this magnitude—in institutional culture and values—to come from the faculty ranks. After all, faculty are deeply rooted in the traditional values of higher education. Fundamentally, this is a leadership issue.                           --Carol Barone, Vice President Educause

  3. Agenda • Assumptions—the need for transformation • To arms, to arms… • About those jobs • What Transformation is NOT • What Transformative Assessment is NOT • Stories • Goals, Activities, & Processes (GAPs) • Inquisitions and Assessments • Critical Thinking—the rubric paradigm • The Sleeping Bear & Principles of Transformative Assessment • TAPS

  4. To Arms, To Arms… • PHOENIX • Phoenix Online is growing faster than any of the university's 40 physical campuses • 1,700 online support specialists • 7,000 mostly part-time faculty • 49,400 students • 70% increase • $64.3 million in net income

  5. Phoenix —from the Chronicle • “Some critics note that Phoenix strips faculty members of their central role in higher education.” • “One of the greatest risks we face as a nation in the growth of new educational providers is the unbundling of teaching, research, and service functions.” —Art Levine • Phoenix counters, however: “Professors at traditional universities who attempt online education are learning as they go, and often give students a bad experience as a result.”

  6. Cost • The USA currently has 3,500 colleges and universities enrolling 14 million students. • We spend $175 billion on higher education. • That is $12,500 per student. • But now, eleven international mega-universities serve 3 million students.
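The per-student figure follows directly from the two totals on this slide; a quick sanity check, using only the numbers quoted above:

```python
# Sanity check of the per-student spending figure quoted on the slide.
total_spending = 175e9   # $175 billion in annual US higher-ed spending
enrollment = 14e6        # 14 million students

per_student = total_spending / enrollment
print(f"${per_student:,.0f} per student")  # prints "$12,500 per student"
```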

  7. Corporate Tax Support ?

  8. Job Growth: 1992-2005* • Predicted Percentage Increase… *US Department of Labor Statistics (1993)

  9. Job Growth: 1992-2005* • Or in Total Numbers *US Department of Labor Statistics (1993)

  10. Job Market Projections: Need for Higher Ed* *US Department of Labor Statistics (1993)

  11. The Real Story We need: • 9 cashiers for every technical worker. • 1.5 janitors for all the lawyers, accountants, investment bankers, stock brokers, & computer programmers combined. • “The projected shift … in educational requirements … can be accomplished if those entering the labor force have … one fourth-grade level more education than those retiring from the labor force.” —Economic Policy Institute

  12. College Graduates @ Work *US Department of Labor Statistics (1993)

  13. Already, In Fact… % College Graduates working in Blue Collar Jobs *US Department of Labor Statistics (1993)

  14. What Transformation Is NOT

  15. Attrition Rates in WSU Distance Courses (%)

  16. Student Enrollments in WSU Online Learning Spaces

  17. Transformation?

  18. What Transformative Assessment is NOT

  19. Evaluations as Barriers to Improvement • Many evaluation instruments are subtly biased in favor of traditional instruction. • They often penalize innovators, measuring what faculty aren’t doing but failing to measure what they are doing.

  20. Purpose of Evaluation is Not Clear • Most evaluation focuses on personnel decisions. • Interpretation and use of evaluation results are inconsistent. • Little focus on the areas that might benefit most from feedback. • Evaluation usually occurs at the very end of the semester —too late to make changes.

  21. Students Are in the Dark, Too • The traditional evaluation process is not designed to help students become more involved with what will really help them learn. • Students don’t take evaluations seriously.

  22. Teaching-Learning Practices are NOT Linked to Learning • Course evaluations rarely focus on practices that have been shown to produce better learning outcomes. • Assessment rarely improves learning.

  23. No Link Between Evaluation & Faculty Development • Inadequate support for addressing results. “Literature from the past 10-15 years emphasizes staff development as key to effective evaluation practices that promote teacher growth and improvement” (Annunziata, in Stronge, 1997, p. 289). • Administrators frequently calculate an average score across all evaluation items and then rank faculty. • That method does not foster dialogue among faculty about good practices.

  24. The Goals, Activities, and Practices (GAPs) TRIAD • Invite faculty to join in the process of formulating the assessment. • A series of three short, online surveys designed to provide formative assessment: one instructor survey and two student surveys.

  25. The 3 Survey Goals • To help faculty gather useful data about their teaching goals, values and strategies… • To help faculty learn how those strategies address and influence their students’ goals, values, and learning behaviors… • And to examine the interaction between goals and practices that faculty might share with each other —and so establish a culture of evidence.

  26. GAPs: Encouraging Participation • Brown bags to initiate the conversation on assessing teaching and learning. • We emailed each faculty member approximately 10 times. • Student surveys were linked to the course sites of instructors who answered the instructor survey. • We analyzed and returned results promptly... • Regular updates were sent to instructors with their class results and overall results. • We offered to co-author, and have co-authored, papers with faculty... • We talked with chairs, deans, and assessment coordinators about including GAPs in teaching portfolios. • We wore out our shoes…

  27. Sample Findings • Significant mismatch between faculty and student goals • Online courses are significantly more effective than video-based courses • Formally designed courses are significantly more likely than undesigned courses to evidence principles of good practice • Faculty motivation predicts perceptions of efficacy • Faculty who want to keep abreast of the scholarship of T & L report that online learning is positive • Faculty who teach online primarily for the money … • Student age and gender predict perceptions of testing efficacy

  28. Time & Cost: Estimated Hours

  29. The Assessment Gold Standard: Participants Who Used GAPs Data to Transform…

  30. However…. • One of the two programs endured a budget-related, faculty-initiated inquisition… • Demonstrated, by virtue of GAPs, systematic formative data gathering and responsiveness to that data… • Demonstrated evidence of good practice. • Demonstrated the “creative use of technology” and gains in critical information literacy. • Demonstrated the program was cost-effective. • Demonstrated the program improved freshman retention. • Demonstrated the program improved student grades, including for “special admits” expected to struggle.

  31. The Learning Context: Student Performance @ WSU: Cumulative GPA by Admissions Quartiles

  32. Which has resulted in…. • The faculty committee that initiated the inquiry was commended by the administration for its attentiveness to issues of quality… • They have subsequently been invited to examine other programs… • Which has transformed the CTLT… • We are seeing the inklings of a culture of evidence sprouting in rough, research-dominated terrain…

  33. small steps….

  34. The Rubric Paradigm • Guide faculty grading • Guide student learning • Provide measures of growth

  35. Dimensions of Critical Thinking • Identifies and summarizes the problem/question at issue (and/or the source's position). • Identifies and presents the STUDENT’S OWN perspective and position as it is important to the analysis of the issue. • Identifies and considers OTHER salient perspectives and positions that are important to the analysis of the issue. • Identifies and assesses the key assumptions. • Identifies and assesses the quality of supporting data/evidence and provides additional data/evidence related to the issue. • Identifies and considers the influence of the context on the issue. • Identifies and assesses conclusions, implications, and consequences.

  36. Critical Thinking and Measures of Growth • Identifies and summarizes the problem/question at issue (and/or the source's position). • Rated on a continuum from Emerging to Mastering.
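The rubric above can be read as a small data structure: seven dimensions, each rated on the Emerging-to-Mastering continuum (a 6-point scale, per the results reported later in the deck). A hypothetical sketch of scoring one paper; the dimension keys paraphrase the slide and are illustrative, not WSU's official field names:

```python
# Hypothetical sketch of scoring with a critical-thinking rubric of this
# shape: seven dimensions, each rated 1 (Emerging) to 6 (Mastering).
# Dimension names are illustrative paraphrases of the slide, not the
# actual WSU instrument.
DIMENSIONS = [
    "problem_identification",   # identifies/summarizes the problem at issue
    "own_perspective",          # presents the student's own position
    "other_perspectives",       # considers other salient positions
    "assumptions",              # identifies/assesses key assumptions
    "evidence",                 # assesses quality of supporting data
    "context",                  # considers the influence of context
    "conclusions",              # assesses conclusions and implications
]

def mean_score(ratings: dict) -> float:
    """Average the 1-6 ratings across all rubric dimensions."""
    for dim in DIMENSIONS:
        if not 1 <= ratings[dim] <= 6:
            raise ValueError(f"{dim}: rating must be between 1 and 6")
    return sum(ratings[dim] for dim in DIMENSIONS) / len(DIMENSIONS)

# One rater's scores for one paper: all 3s except one 4.
paper = dict.fromkeys(DIMENSIONS, 3)
paper["other_perspectives"] = 4
print(round(mean_score(paper), 2))  # prints 3.14
```

Averaging across dimensions like this is what makes cross-course comparisons (such as the 3.1 junior mean reported in the results) possible.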

  37. Critical Thinking: 8 Courses (4 with CT, 4 without CT)

  38. Critical Thinking: 8 Courses (4 with CT, 4 without CT)

  39. Critical Thinking: One Course, Two Semesters

  40. Faculty Development • The critical thinking rubric was valuable for helping faculty articulate their goals and communicate expectations to students. • Faculty who used the rubric were enthusiastic and expressed plans to integrate the rubric more intensively in future courses.

  41. Critical Thinking Study—Results • Gains in courses when the rubric is used—when the faculty in this project integrated the WSU Critical Thinking Rubric into their instruction and assessment, evidence of student gains in critical thinking increased dramatically. • Gains from freshman to junior year—critical thinking was significantly higher among juniors than among freshmen. • But even the writing of juniors had a mean of only 3.1 on a 6-point scale.

  42. Additional Findings & Implications • The dimension of least gain was students’ ability to articulate their own viewpoints. • The greatest gains by juniors reflect improved abilities to analyze issues from multiple perspectives. • Comparisons to WSU’s writing assessment—as critical thinking scores rise, writing placement scores and portfolio exam scores sink... • The faculty questionnaire revealed a focus on grading over fostering critical thinking for broader lifelong learning.

  43. Transformation?

  44. The Sleeping Bear • Those most oblivious of the teeth are often the first to become fodder… • The growling bear will elicit benchmarks & comparisons • The bear will feast on standardization • But bears are omnivorous…

  45. Emerging Principles of Transformative Assessment • Institutional leadership is imperative • Assessment focuses on institutional efforts to provide students with rich learning experiences • Assessment includes student reports of their own increasingly unique experiences • Qualitative measures are valued and may be supported by quantitative measures • Emphasizes student learning defined by development and change over time • Dissemination engages the responsibility for shaping, as well as reflecting, society’s needs • “High standards” is not the same as standardization

  46. Dimensions of Transformation • Purpose • Data Acquisition • Application • Dissemination

  47. The Scoring Form

  48. Pilot Findings

  49. Pilot Findings
