
Leveraging Information Technology for Performance-Based Assessments

Explore how technology enhances student learning through authentic assessment in problem-solving contexts, focusing on discipline fundamentals and deeper understanding.

Presentation Transcript


  1. WIP – Using Information Technology to Author, Administer, and Evaluate Performance-Based Assessments Mark Urban-Lurain Guy Albertelli Gerd Kortemeyer Division of Science and Mathematics Education Michigan State University FIE 2005 Session T2E October 20, 2005

  2. Acknowledgements • Based upon work supported in part by: • U.S. Department of Education, under special project number P342A030070 • National Science Foundation through the Information Technology Research (ITR 0085921) and the Assessment of Student Achievement (ASA 0243126) programs • Michigan State University • Any opinions, findings, conclusions, or recommendations expressed in this presentation are those of the author(s) and do not necessarily reflect the views of any of the supporting institutions

  3. Performance-Based Assessments (PBA) • Students construct responses in “authentic” problem-solving contexts • Can support student learning by • Focusing on fundamentals of a discipline • Requiring deeper understanding • Exposing student thinking • Providing formative evaluation

  4. Performance-Based Assessments • More expensive and time-consuming to score than “objective” tests • Scoring entails human judgment • Scoring requires detailed rubrics • Training for raters • Inter-rater reliability

  5. Previous Work • CSE 101 • Non-major CS0 course • 2,000 students per semester • Goal: Fluency with Information Technology (FITness) • Modified-mastery PBA: Bridge Tasks (BT) • Database-driven system for creating, delivering, and evaluating BTs • 14,000 BTs / semester

  6. Creating Assessments [Diagram: Bridge Task (BT) database with dimensions Dim 1 … Dim M, each holding instances i … i+n] • Each Bridge Task (BT) has dimensions (M) that define the skills and concepts being evaluated • Within each dimension are some number of instances (n) of text describing tasks for that dimension • A bridge task consists of one randomly selected instance from each dimension
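To make the assembly step concrete, here is a minimal Python sketch of the random selection the slide describes. The dimension names, task texts, and function name are invented for illustration; the actual database-driven system is not shown in the paper.

```python
import random

# Invented example data: each dimension maps to its pool of task-text instances.
# (The real system stores these in the Bridge Task database.)
DIMENSIONS = {
    "word processing": [
        "Format the supplied memo using styles.",
        "Create a two-column newsletter from the draft text.",
    ],
    "spreadsheets": [
        "Build a budget worksheet with totals.",
        "Chart the quarterly sales figures.",
    ],
}

def assemble_bridge_task(dimensions):
    """Assemble one BT: pick one instance at random from every dimension."""
    return {dim: random.choice(pool) for dim, pool in dimensions.items()}

print(assemble_bridge_task(DIMENSIONS))
```

Because each student's BT is drawn independently, two students taking the same assessment will generally see different, but equivalent, task text.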

  7. Evaluation Criteria [Diagram: for each dimension's randomly selected instance, a parallel set of criteria (i … i+n) is drawn from the BT database; scoring the student against those criteria yields a PASS or FAIL evaluation]
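A hedged sketch of the parallel structure in the diagram: selecting a task instance also selects the criteria a grader will apply to it, because instances and criteria are stored in parallel. The data and names below are illustrative, not the paper's schema.

```python
# Illustrative only: each task instance is paired with the criteria used to
# evaluate it, mirroring the instance-to-criteria correspondence in the diagram.
CRITERIA = {
    "Format the supplied memo using styles.": [
        "Heading uses the required style",
        "Margins set to one inch",
    ],
    "Build a budget worksheet with totals.": [
        "SUM formulas used for totals",
        "Cells formatted as currency",
    ],
}

def criteria_for_bt(bridge_task):
    """Collect the criteria implied by the instances selected for this BT."""
    return {dim: CRITERIA.get(task, []) for dim, task in bridge_task.items()}

bt = {"word processing": "Format the supplied memo using styles."}
print(criteria_for_bt(bt))  # the rubric the grader will see for this BT
```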

  8. Scoring Rubrics • Software • Retrieves each student's files • Presents the grader with the scoring rubrics for the particular BT • Grader evaluates each criterion pass/fail • Adds open-ended formative feedback for the student • Backend computes the outcome from the pass/fail scores, as defined by the instructor for each BT • Students view their scored BT, with the detailed rubrics and comments, on the Web
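As a rough sketch of the backend step, assuming (as the slide states) that the instructor defines how criterion-level pass/fail scores roll up into a BT outcome. The record type and the default all-criteria-must-pass rule are assumptions made for illustration, not the paper's actual policy.

```python
from dataclasses import dataclass

@dataclass
class CriterionScore:
    criterion: str
    passed: bool
    feedback: str = ""  # open-ended formative comment entered by the grader

def bt_outcome(scores, rule=all):
    """Compute the BT outcome from criterion-level scores.

    `rule` stands in for the instructor-defined rollup policy; the default
    (every criterion must pass) is an assumption made for this sketch.
    """
    return "PASS" if rule(s.passed for s in scores) else "FAIL"

scores = [
    CriterionScore("Heading uses the required style", True),
    CriterionScore("Margins set to one inch", False,
                   "Margins are 1.25 inches; reset them to 1 inch."),
]
print(bt_outcome(scores))  # -> FAIL, and the feedback goes back to the student
```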

  9. Porting to Open Source CMS • LON-CAPA • Open source course management system developed at MSU • Content sharing and reuse network • Randomized homework, quizzes, numeric problems

  10. Development Effort Completed • Mastery-model BT assessments: multiple pass/fail criteria determine whether the BT is passed, and students repeat a given level of BT until they pass (see the sketch below) • BT grading rubrics: scoring rubrics for each part of each BT; the grader evaluates each criterion pass/fail and provides formative feedback to the student • Support for proctored authentication: online resources are typically completed without proctors, but BTs are administered in scheduled labs with a proctor, and BTs can be scheduled in particular labs at particular dates and times
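The modified-mastery cycle in the first bullet can be sketched as a loop. Everything here is illustrative: the pass probability simulates a grader's verdict, and the attempt cap is an assumption, since the slide only says students repeat until they pass.

```python
import random

def attempt_passes():
    """Stand-in for one proctored, rubric-scored attempt; the real system
    records grader scores, while this stub just simulates pass or fail."""
    return random.random() < 0.6

def run_mastery_level(max_attempts=10):
    """Repeat freshly assembled BTs at one level until the student passes."""
    for attempt in range(1, max_attempts + 1):
        # In the real system a new BT instance is assembled for each attempt,
        # so repeated tries never see identical task text.
        if attempt_passes():
            return attempt  # attempts needed to pass this level
    return None  # did not pass within the (assumed) cap

print(run_mastery_level())
```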

  11. Current Status • Available in current 2.0.2 release • Testing with one section in CSE 101 • Authoring and scheduling require editing/uploading files

  12. Spring 2006 • Improve scheduling interface • Convert CSE 101 course • Teacher Education • Incorporate BT as part of the technology requirement for pre-service teachers

  13. Future Work • Other CSE courses adopt LON-CAPA and implement PBAs • Work with faculty to implement and evaluate the impact of PBAs on student problem-solving in other disciplines

  14. Questions • Mark Urban-Lurain, urban@msu.edu • LON-CAPA: http://www.lon-capa.org
