WIP – Using Information Technology to Author, Administer, and Evaluate Performance-Based Assessments
Mark Urban-Lurain, Guy Albertelli, Gerd Kortemeyer
Division of Science and Mathematics Education, Michigan State University
FIE 2005, Session T2E, October 20, 2005
Acknowledgements
• Based upon work supported in part by:
  • U.S. Department of Education, under special project number P342A030070
  • National Science Foundation, through the Information Technology Research (ITR 0085921) and Assessment of Student Achievement (ASA 0243126) programs
  • Michigan State University
• Any opinions, findings, conclusions, or recommendations expressed in this presentation are those of the author(s) and do not necessarily reflect the views of any of the supporting institutions.
Performance-Based Assessments (PBA)
• Students construct responses in “authentic” problem-solving contexts
• Can support student learning by:
  • Focusing on the fundamentals of a discipline
  • Requiring deeper understanding
  • Exposing student thinking
  • Providing formative evaluation
Performance-Based Assessments
• More expensive and time-consuming to score than “objective” tests
• Scoring entails human judgment
• Scoring requires detailed rubrics:
  • Training for raters
  • Inter-rater reliability
Previous Work
• CSE 101
  • Non-major CS0 course
  • 2,000 students / semester
  • Goal: Fluency with Information Technology (FITness)
• Modified-mastery PBA: Bridge Tasks (BTs)
• Database-driven system for creating, delivering, and evaluating BTs
• 14,000 BTs / semester
Creating Assessments
[Diagram: Bridge Task (BT) database organized by dimensions (Dim 1 … Dim M), each containing instances (Instance i … Instance i+n)]
• Each Bridge Task (BT) has dimensions (M) that define the skills and concepts being evaluated.
• Within each dimension are some number of instances (n) of text describing tasks for that dimension.
• A bridge task consists of one randomly selected instance from each dimension for that bridge task, as sketched below.
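As a rough illustration of this assembly step, the sketch below (Python, with an invented in-memory "database" and placeholder dimension names; not the actual CSE 101 or LON-CAPA schema) picks one random instance per dimension:

import random

# Illustrative stand-in for the BT database: each dimension maps to a list
# of interchangeable task-instance texts. Names and texts are invented for
# this sketch, not taken from the actual system.
bridge_task_db = {
    "Dim 1": ["Instance text A for Dim 1", "Instance text B for Dim 1"],
    "Dim 2": ["Instance text A for Dim 2", "Instance text B for Dim 2"],
}

def assemble_bridge_task(db, rng=random):
    # One randomly selected instance from each dimension -> one individualized BT.
    return {dim: rng.choice(instances) for dim, instances in db.items()}

print(assemble_bridge_task(bridge_task_db))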
Evaluation Criteria
[Diagram: in the BT database, every instance of every dimension (Instance i … Instance i+n) is paired with its own evaluation criteria (Criteria i … Criteria i+n); the criteria attached to the selected instances (e.g., Dim 1 Instance 1, Dim 2 Instance i+2, …, Dim M Instance i+n) determine the student evaluation: PASS or FAIL]
Scoring Rubrics
• Software:
  • Retrieves each student’s files
  • Presents the grader with the scoring rubrics for the particular BT
  • Grader evaluates each criterion pass/fail
  • Grader adds open-ended formative feedback for the student
  • Backend computes the outcome from the pass/fail scores, as defined by the instructor for each BT (see the sketch after this list)
• Students view their scored BT, with the detailed rubrics and comments, on the Web
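A minimal sketch of this outcome computation, assuming (purely for illustration) that per-criterion marks are collected in simple Python objects and that the instructor-defined rule is "all criteria must pass"; the real system's data structures and rules are not shown here:

from dataclasses import dataclass, field

@dataclass
class CriterionResult:
    # One rubric criterion, as marked by the grader (illustrative structure).
    description: str
    passed: bool
    feedback: str = ""   # open-ended formative feedback for the student

@dataclass
class BridgeTaskScore:
    student: str
    results: list = field(default_factory=list)

    def outcome(self, rule=all):
        # The instructor defines the pass rule per BT; "all criteria must
        # pass" is shown here only as one plausible rule.
        return "PASS" if rule(r.passed for r in self.results) else "FAIL"

score = BridgeTaskScore("A123", [
    CriterionResult("Formula references the correct cells", True),
    CriterionResult("Chart labels axes and units", False,
                    "Add a y-axis label with units."),
])
print(score.outcome())   # -> FAIL under the all-criteria-must-pass rule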
Porting to Open Source CMS
• LON-CAPA
  • Open source course management system developed at MSU
  • Content sharing and reuse network
  • Randomized homework, quizzes, and numeric problems
Development Effort Completed
• Mastery-model BT assessments
  • Multiple pass/fail criteria determine whether the student passes or fails the BT
  • Students repeat a given level of BT until they pass (see the sketch after this list)
• BT grading rubrics
  • Scoring rubrics for each part of each BT
  • Grader evaluates each criterion pass/fail
  • Grader provides formative feedback to the student
• Support for proctored authentication
  • Online resources are typically completed without proctors
  • BTs are administered in scheduled labs with a proctor
  • BTs can be scheduled in particular labs at particular dates and times
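The retake behavior of the mastery model can be sketched as follows (illustrative only: the progress dictionary, function name, and level numbering are assumptions, not the LON-CAPA implementation):

def record_attempt(progress, student, level, passed):
    # Update a student's progress after one proctored BT attempt:
    # the student stays at the same BT level until they pass, then advances.
    state = progress.setdefault(student, {"level": level, "attempts": 0})
    if level != state["level"]:
        raise ValueError("Students attempt only their current BT level.")
    state["attempts"] += 1
    if passed:
        state["level"] += 1     # advance to the next BT level
        state["attempts"] = 0   # reset the attempt counter for the new level
    return state

progress = {}
record_attempt(progress, "A123", level=1, passed=False)  # retake required
record_attempt(progress, "A123", level=1, passed=True)   # advances to level 2
print(progress["A123"]["level"])   # -> 2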
Current Status
• Available in the current LON-CAPA 2.0.2 release
• Testing with one section of CSE 101
• Authoring and scheduling still require editing and uploading files
Spring 2006
• Improve the scheduling interface
• Convert the CSE 101 course
• Teacher Education
  • Incorporate BTs as part of the technology requirement for pre-service teachers
Future Work
• Have other CSE courses adopt LON-CAPA and implement PBAs
• Work with faculty to implement and evaluate the impact of PBAs on student problem-solving in other disciplines
Questions
• Mark Urban-Lurain: urban@msu.edu
• LON-CAPA: http://www.lon-capa.org