Assessing a Quality Education Michael Fried: Director of Continuous Assessment Office (CAO) MSU-B
Talk Outline
• Part I: Laws Connecting Dots: Examples of why this presentation has something to offer many instructors
• Part II: Context and value of WebWorks: Crisis at Univ. of Rochester (an example of what "Continuous Assessment" means)
• Part III: I(nteractive) Q(uestionnaire)s: Step thinking, analysis of reading, and writing
• Part IV: The project-oriented university: Managing complicated interactions
• Part V: CAO Objectives: For students, faculty, and the University's community presence
Need for Continuous Assessment • You can’t learn something just once • Classrooms that march through a curriculum leave students in their wake
Part I: Laws Connecting Dots: Comparing Math/Science and Humanities/Social Science
Math/Science:
• A. Dots: Definition boxes
• B. Connecting: Step-thinking and drawing conclusions
• C. Laws: Unsolved problems/finding unknown principles
Humanities/Social Science:
• A. Dots: Thematic puddles
• B. Connecting: Finding rules behind the outcomes you see
• C. Laws: Learning to live with the problems of life
A. Dots: The objects of study • Math example: Locate the center of mass of a convex polygon • Humanities example: Locate habitats where communities of elk, wolves and songbirds could overlap
B. Connecting dots
• Math: When will a ball bouncing off the inner walls of a convex polygon approach the center of mass?
• Natural Science: What habitat would sustain a population of elk, wolves and songbirds?
C. Dot-defining laws
• Math: When would a body moving continually under constraints have stable motion?
• Natural Science: What is the cost to society to sustain habitats with elk, wolves and songbirds?
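The first "dot" above, the center of mass of a convex polygon, has a standard computation. As a minimal illustration (a sketch using the shoelace centroid formula, not drawn from any WebWorks problem set):

```python
def polygon_centroid(pts):
    """Centroid (center of mass) of a simple polygon via the shoelace formula.

    pts: list of (x, y) vertices in order; convex polygons qualify.
    """
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        cross = x0 * y1 - x1 * y0   # twice the signed area of triangle (origin, p_i, p_{i+1})
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5                        # signed area of the polygon
    return (cx / (6 * a), cy / (6 * a))

# The centroid of the unit square is its center:
print(polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)]))  # (0.5, 0.5)
```

A student locating this "dot" numerically can then experiment with the bouncing-ball question rather than only read about it.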
Dots Dot connecting concepts: Step thinking Connecting concepts to problems WebWorks: Mastery learning I(nteractive) Q(uestionnaire)s: Aided analyzing and writing Interactive Portfolio Management: Organizing student projects; organizing your data and projects Tools for Assessment
Part II: Context and value of WebWorks: Crisis at University of Rochester • Instructors then knew little about their students; WW's statistical data (time to solve, number of attempts, etc.) fills that gap • Impact on UR's decreasing budget: Before WW, TAs only graded; after WW, TAs went back to instruction • Under the hood of WWs: UNIX shell and Perl programs; the basic UNIX programs were written in the mid-'70s (DOS is a baby UNIX)
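Statistics like "time to solve" and "number of attempts" reduce to simple aggregation over an attempt log. A hedged sketch in Python; the record format here is invented for illustration and is not WebWorks' actual data layout:

```python
from collections import defaultdict

# Hypothetical attempt log: (student, problem, minutes_spent, correct)
log = [
    ("s1", "p1", 4.0, False), ("s1", "p1", 3.0, True),
    ("s2", "p1", 6.0, True),
    ("s1", "p2", 10.0, False),
]

# Aggregate per-problem: total attempts, total minutes, correct submissions
stats = defaultdict(lambda: {"attempts": 0, "minutes": 0.0, "solved": 0})
for student, problem, minutes, correct in log:
    s = stats[problem]
    s["attempts"] += 1
    s["minutes"] += minutes
    if correct:
        s["solved"] += 1

for problem, s in sorted(stats.items()):
    avg = s["minutes"] / s["attempts"]
    print(problem, "attempts:", s["attempts"], "avg min:", round(avg, 1), "solved:", s["solved"])
```

Even this much, per problem and per class, is more than instructors typically had before WW.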
Why Students Like WebWorks • Access to assessments: anytime, anywhere • Feedback: Immediate, including the chance to correct mistakes when understanding is fresh • Assessment History: WW remembers problems each student got (s/he can download problems, seek help, then return to submit answers)
Why Faculty Like WebWorks • Individualization: Unique assessments for each student (supporting collaborative effort while minimizing cheating) • Flexible Windowing of Assignments: Assignments can be arranged in n-week sets rather than weekly ones (UCI uses n = 3 in Calculus), so students work with a collection of related material rather than one topic or concept at a time • External History: Assessments and their answers remain available after the assignment due date (students can use them for exam preparation; faculty can use the electronic record for classroom analysis)
Growth in use of WWs
Major goals: Make sense of the relations between our classes, starting with change in a given class. Even assessing in three-week windows improves what we now have. Reliable, automatic ways to measure and report on student change and growth.
• Grades: Significant, but a poor clue as to what happened with material in a course.
• Year: course, section counts (enrollments in parentheses where recorded):
• 2001-2002: Math 2A 10+6+4, Math 2B 12+8+7
• 2002-2003: Math 2A 10+7+5, Math 2B 8+18+7, Physics 3A 1, Physics 52B 18
• 2003-2004: Math 2A 11+6+6 (912+645+544), Math 2B 9+13+7 (1013+1245+770), Phys 3LC 7+19 (177+467), Chem H2LA 4 (89), Chem 1LA 15 (338), Chem 1LB 44+18 (1054+426), Chem 1LC 8+40 (168+1000), Phys 52B 18 (273), Phys 52C 9 (185)
Part III: I(nteractive) Q(uestionnaire)s: Step Thinking http://math.uci.edu/~mfried/#edInteractive E-Mail Assessment, in B. Gold, S.Z. Keith, and W.A. Marion, eds., Assessment Practices in Undergraduate Mathematics, MAA Notes #49, Washington, DC, 1999, 80--84. • Student Assessment: IQs simplify teaching students to follow the use of the "Dots" of the course. Instead of facts alone, you are teaching them to connect facts. • Faculty Time: IQs allow student writing and analysis that is serious yet simple to grade; like WWs there are many grading rubrics (ripping-the-blue-book-apart, RBBA, is the most significant). • University Awareness: A whole course can consist of many topics, though some are more significant than others. IQs track these and produce reports on class performance. If something works, you can locate it.
IQs as Tools: Designed to leverage HTML; a system of tags allows polling the portfolio • Grading: RBBA means having a list of responses to one piece of the IQ placed in front of you for batch grading. • Reporting: Leveraging HTML means you can create reports that transparently layer the IQ. • Educational Research: "I thought they were lying to me": the story behind the "Dynamic Learning Curve" graphic. • It's almost free: Everything runs in a UNIX shell and related programs.
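A sketch of what tag-based polling for RBBA grading could look like: gather every response carrying a given question tag, so one question's answers from the whole class sit in front of the grader at once. The tag names and record layout here are invented for illustration, and the actual system runs as UNIX shell programs rather than Python:

```python
# Hypothetical portfolio: (student, question_tag, response) triples
portfolio = [
    ("ada", "IQ2.step3", "The ball's path repeats because of symmetry ..."),
    ("ben", "IQ2.step3", "The polygon's convexity forces the bounce to ..."),
    ("ada", "IQ2.step4", "Therefore the motion is stable when ..."),
]

def poll(portfolio, tag):
    """RBBA-style batch view: every response to one tagged question."""
    return [(student, resp) for student, t, resp in portfolio if t == tag]

# All of step 3's answers, class-wide, in one list for batch grading:
for student, resp in poll(portfolio, "IQ2.step3"):
    print(student, "->", resp)
```

The same tag pass that drives grading can drive reporting: any query over tags is a "poll" of the portfolio.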
Part IV: The project-oriented university: Program-managing complicated interactions • Mailmerge suite: Send personalized e-mails to large groups or small • IPMS location suite: Locate your personal and class data better than any PC or Mac • IQ creation suite based on tagged data: Assessment data for grading and polling • Tag data report suite: Create reports from your personal and class data (talk demonstration of crossword puzzle data)
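As an illustration of the mail-merge idea, a minimal Python sketch using only the standard library; the template text and roster fields are made up, and actual delivery (e.g., handing each body to smtplib or sendmail) is omitted:

```python
from string import Template

# Hypothetical message template with per-student fields
template = Template(
    "Dear $name,\n\nYour current total on $course assessments is $score.\n"
)

# Hypothetical class roster; a real suite would read this from class data
roster = [
    {"name": "Ada", "course": "Math 2A", "score": 92},
    {"name": "Ben", "course": "Math 2A", "score": 78},
]

for person in roster:
    body = template.substitute(person)
    print(body)  # a real suite would pass body to a mailer instead
```

The point of the suite is exactly this pattern at scale: one template, many personalized messages, driven by the same tagged class data the other suites use.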
Part V: CAO Objectives
• Students: A continuous picture of what courses are about
• Faculty: A continuous picture of where students are in their classes
• University: A continuous picture of how students with various profiles progress
Progress for a student • Faculty look for students who know what their courses are about: students who realize it will take some development before they can use those skills and that knowledge to put together a career. • Some may pick up the detailed working commands of a course and yet not get what the course is about. Others may not have the plug-and-chug down, and yet they do see what we are after. • We would like to know some earmarks for differentiating, and helping, those two different kinds of OK students.
Progress for Faculty and Staff • Justify the significance of courses. • Ease the frustrations of difficult courses. • Help move students into the upper-division classes they and their colleagues teach. • Have reliable assessment supporting deserved credit for handling well the courses that are part of many university curricula. • Track complicated data and easily, personally and personably interact with colleagues, contacts, conference groups, students, seminar attendees, donors, …
Progress for the University • Produce automatic reports showing the University understands what it can do with a student of a given profile. • Produce students who can feed the growth of Montana business and development. • Convince the legislature and accrediting agencies that the two goals above are tied to what goes on in its classrooms. • Hand the faculty the best present you can: lower teaching loads augmented by accomplishment in scholarship. It worked at the University of Rochester.