
Maximizing Online Assessment Success: Tools, Techniques, Time Management

Explore the 3 T's of online assessment (tools, techniques, and time management): strategies for effective evaluation, improved learner outcomes, and greater instructor efficiency.




Presentation Transcript


  1. Part II: The 3 T’s of Online Assessment: Tools, Techniques, and (Saving) Time. Curtis J. Bonk, Professor, Indiana University; President, CourseShare; http://php.indiana.edu/~cjbonk; cjbonk@indiana.edu. Vanessa Dennen, Assistant Professor, Florida State University; vdennen@fsu.edu

  2. Do you have a strategic plan for evaluation and assessment?

  3. Bonk et al.’s (in press) Online Learning Assessment and Evaluation Model. Bonk, C. J., Wisher, R. A., & Champagne, M. V. (in press). Toward a comprehensive model of e-learning evaluation: The components.

  4. What to Evaluate? • Learner—attitudes, learning, jobs. • Instructor—popularity, course enrollments. • Training—internal and external. • Task--relevance, interactivity, collaborative. • Tool--usable, learner-centered, friendly, supportive. • Course—interactivity, completion rates. • Program—growth, long-range plans. • Organization or University—cost-benefit, policies, vision.

  5. Measures of Student Success (Focus groups, interviews, observations, surveys, exams, records) • Increased Comprehension & Achievement • High Student Attitudes • High Retention, Completion Rates in Program • Jobs Obtained, Internships • Enrollment Trends for Next Semester • Grades, Achievement, Certifications • Computer Log Activity; e.g., Number of Posts, Participation, Messages/day, Time in System
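The computer-log measures on the slide above (number of posts, messages per day, time in system) can be computed directly from exported activity records. A minimal sketch, assuming a hypothetical record format with user, event, and timestamp fields; no particular system's export is implied:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical LMS log export; field names are assumptions for illustration.
log = [
    {"user": "s1", "event": "post", "time": "2003-09-01T10:05"},
    {"user": "s1", "event": "post", "time": "2003-09-03T14:20"},
    {"user": "s2", "event": "post", "time": "2003-09-02T09:00"},
]

def messages_per_day(records):
    """Average number of posts per active day for each user."""
    days = defaultdict(set)
    posts = defaultdict(int)
    for r in records:
        if r["event"] == "post":
            days[r["user"]].add(datetime.fromisoformat(r["time"]).date())
            posts[r["user"]] += 1
    return {u: posts[u] / len(days[u]) for u in posts}

print(messages_per_day(log))  # e.g. {'s1': 1.0, 's2': 1.0}
```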

  6. 1. Student High-End Success • Message complexity, depth, interactivity, questioning • Collaboration skills • Problem finding/solving and critical thinking • Challenging and debating others • Case-based reasoning, critical thinking measures • Portfolios, performances, PBL activities

  7. Assessments Possible • Quizzes and Tests • Peer Feedback, Mentoring, Responsiveness • Tasks Attempted or Completed, Usage, etc. • Discussion/Forum Participation • Writing, Blogs, Weekly Reflections • Cases and Problems • Group Work • Web Resource Explorations & Evaluations • Performances, Portfolios, etc.

  8. Issues to Consider… • Bonus pts for participation? • Pts for peer evaluation of work? • Assess improvement? • Are tests timed? Allow retakes if the connection is lost? How many retakes? • Give unlimited time to complete? • Cheating? Is it really that student? • How to measure competency and demonstrate learning online?

  9. 2. Instructor Success • High student evals, more signing up, student recommendations • High student completion rates • Utilize Web to share teaching • Course recognized with awards

  10. 3. Training: Outside Support • Training (FacultyTraining.net) • Courses & Certificates (JIU, e-education) • Reports, Newsletters, & Pubs • Aggregators of Info (CourseShare, Merlot) • Global Forums (FacultyOnline.com; GEN) • Resources, Guides/Tips, Link Collections, Online Journals, Library Resources

  11. 3. Training: Inside Support… • Instructional Consulting • Mentoring (strategic planning $) • Small Pots of Funding • Laptops • Summer and Year Round Workshops • Colloquiums, Tech Showcases, Guest Speakers, Awards, Recognitions • Newsletters, guides, active learning grants, annual reports, faculty development, brown bags

  12. RIDIC5-ULO3US Model of Technology Use 4. Tasks (RIDIC): • Relevance • Individualization • Depth of Discussion • Interactivity • Collaboration-Control-Choice-Constructivistic-Community

  13. RIDIC5-ULO3US Model of Technology Use 5. Tech Tools (ULO3US): • Utility/Usable • Learner-Centeredness • Opportunities with Outsiders Online • Ultra Friendly • Supportive

  14. 6. Course Success • Few technological glitches/bugs • Adequate online support • Increasing enrollment trends • Course quality (interactivity rating) • Monies paid • Accepted by other programs

  15. 7. Online Program or Course Budget (i.e., how it is paid for, how large the course is, tech fees charged, # of courses, tuition rate, etc.) • Indirect Costs: learner disk space, phone, accreditation, integration with existing technology, library resources, on-site orientation & tech training, faculty training, office space • Direct Costs: courseware, instructor, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage

  16. 7. Program: Online Content Considerations • Self-Paced or Live mentors? • Interactive or content dumping? • Individual or Collaborative? • Lecture or problem-based learning? • Factual or performance assessment?

  17. 8. Institutional Success • E-Enrollments from new students, alumni, existing students • Additional grants, new State monies • Press, publication, partners, attention • Cost-Benefit model, ROI • Faculty attitudes, training, support • Acceptable policies, ADL compliant

  18. Let’s Focus on the Individual Level: What Online Testing Tools Do You Use?

  19. What Can Online Tests Do? • Assess student progress • Allow for self-assessment • Provide standards for success • Timed testing and retesting • Opportunity for instructor commenting

  20. Test Selection Criteria (Hezel, 1999) • Easy to Configure Items and Test • Handle Symbols • Scheduling of Feedback (immediate?) • Provides Clear Input of Dates for Exam • Easy to Pick Items for Randomizing • Randomize Answers Within a Question • Weighting of Answer Options
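Two of the criteria above, randomizing answers within a question and weighting answer options, are easy to picture in code. A minimal sketch, assuming a made-up question format with a partial-credit weight on each option; this is not any listed tool's API:

```python
import random

# Hypothetical question record: each option carries an assumed partial-credit weight.
question = {
    "stem": "Which measure best captures online participation?",
    "options": [
        ("Messages per day", 1.0),      # full credit
        ("Total logins", 0.5),          # partial credit
        ("Time in system", 0.5),
        ("Page color preference", 0.0),
    ],
}

def present(question, rng=random):
    """Randomize answer order within a question so each student sees a different layout."""
    opts = list(question["options"])
    rng.shuffle(opts)
    return opts

def score(question, chosen_text):
    """Weighted scoring: credit equals the weight attached to the chosen option."""
    weights = dict(question["options"])
    return weights.get(chosen_text, 0.0)

print([text for text, _ in present(question)])
print(score(question, "Total logins"))  # 0.5
```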

  21. More Test Selection Criteria • Recording of Multiple Submissions • Timed Tests • Comprehensive Statistics • Summarize in Portfolio and/or Gradebook • Confirmation of Test Submission

  22. More Test Selection Criteria (Perry & Colon, 2001) • Supports multiple item types—multiple choice, true-false, essay, keyword • Can easily modify or delete items • Incorporate graphic or audio elements? • Control over number of times students can submit an activity or test • Provides feedback for each response

  23. More Test Selection Criteria (Perry & Colon, 2001) • Flexible scoring—score first, last, or average submission • Flexible reporting—by individual or by item and cross tabulations. • Outputs data for further analysis • Provides item analysis statistics (e.g., Test Item Frequency Distributions).
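Flexible scoring (first, last, or average submission) and item frequency distributions can be prototyped outside any particular platform. A sketch under the assumption that a student's attempts are stored as an ordered list of scores and each item keeps the raw responses it received:

```python
from collections import Counter

def flexible_score(submissions, policy="last"):
    """Score a student's attempts by the first, last, or average submission."""
    if policy == "first":
        return submissions[0]
    if policy == "last":
        return submissions[-1]
    if policy == "average":
        return sum(submissions) / len(submissions)
    raise ValueError(f"unknown policy: {policy}")

def item_frequency(responses):
    """Item analysis: how often each answer option was chosen (a frequency distribution)."""
    return Counter(responses)

attempts = [60, 75, 90]                              # assumed example data
print(flexible_score(attempts, "first"))             # 60
print(flexible_score(attempts, "average"))           # 75.0
print(item_frequency(["A", "C", "A", "B", "A"]))     # Counter({'A': 3, 'C': 1, 'B': 1})
```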

  24. How to Use Technology to Assess?

  25. What to Assess Online? • Traditional Online Exams • Self-Test Exams • Learner-Content Interactions • Guided Explorations: Virtual Tours and Timelines • Cases and Vignettes • Blogs • Chats with Visual Representations • Soft Skill Simulations • Virtual Reality and Role Play Simulations • Cyber Fashion Shows & Music Performances

  26. 1. Traditional Online Testing

  27. Using WebCT Quizzes in a High-Demand Environment (Brothen & Wambach, Technology Source, May/June 2003) “Several reviews and meta-analyses…have found superior student learning in PSI [the Personalized System of Instruction] compared to traditional lecture/discussion methods.” Here, students read a textbook and, when they are ready, take chapter quizzes; after they master one chapter, they move on to the next.

  28. 2. Online Self-Testing

  29. 3. Learner-Content Interactions (Option 6 Designers)

  30. 4. Guided Explorations: Virtual Tours and Timelines

  31. 4. Guided Explorations: Interactive Adventure Content (Andrew Revkin, New York Times, May 25, 2003)

  32. 5. Case-Based Learning: My Patient.com and SimTeacher

  33. 6. Blogs (diaries, writing)

  34. 7. Visual with Chat: Learningbydoing.net Participants: a facilitator of online therapy, students at all levels, a doctoral candidate in DE, administrators, teachers, lecturers, researchers, a physicist, a professor of Psychology, a professor of Mathematics, a consultant in training, an HR trainer, and a psychotherapist. We were located in Herzelia, a beach town north of Tel Aviv; Stanford, California; Baltimore; Montreal; and Izmir, Turkey.

  35. Games and Simulations “There’s something new on the horizon, though: computer-based soft skills simulations, which let learners practice skills such as negotiation and team building.” Clark Aldrich, The State of Simulations, Sept. 2001, Online Learning

  36. 8. Simulations: Virtual University Administrator & Virtual Leader

  37. 9. Virtual Worlds/Virtual Reality • Avatars--representations of people • Objects--representations of objects • Maps--the landscape which can be explored • Bots--artificial intelligence
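The four building blocks on the slide above map naturally onto simple data structures. A minimal sketch with illustrative names only; it does not follow any specific virtual-world engine:

```python
from dataclasses import dataclass, field

@dataclass
class Avatar:            # representation of a person
    name: str
    position: tuple = (0, 0)

@dataclass
class WorldObject:       # representation of an object in the scene
    label: str
    position: tuple

@dataclass
class Bot:               # scripted "artificial intelligence" participant
    name: str
    def respond(self, message: str) -> str:
        return f"{self.name} heard: {message}"

@dataclass
class WorldMap:          # the explorable landscape holding avatars, objects, and bots
    avatars: list = field(default_factory=list)
    objects: list = field(default_factory=list)
    bots: list = field(default_factory=list)

world = WorldMap(avatars=[Avatar("student1")],
                 objects=[WorldObject("kiosk", (3, 4))],
                 bots=[Bot("guide")])
print(world.bots[0].respond("Where is the exhibit?"))
```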

  38. 10. Online Performances (e.g., Cyber Fashion Shows)

  39. Which might you use? How would you use them?

  40. Online Survey Tools for Assessment

  41. Sample Survey Tools • Zoomerang (http://www.zoomerang.com) • SurveyMonkey (http://surveymonkey.com) • QuestionMark(http://www.questionmark.com/home.html) • Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm) • Infopoll (http://www.infopoll.com)

  42. Sample Survey Tools • Active Feedback (http://www.activefeedback.com/af) • SurveyKey (http://www.surveykey.com) • EZSurvey from Raosoft (http://www.raosoft.com/) • SurveyShare (http://SurveyShare.com; from Courseshare.com)

  43. Survey Student Opinions (e.g., InfoPoll, SurveySolutions, Zoomerang, SurveyShare.com)

  44. Online Survey in Blackboard

  45. Web-Based Survey Advantages • Faster collection of data • Standardized collection format • Computer graphics may reduce fatigue • Computer-controlled branching and skip sections • Easy to answer by clicking • Wider distribution of respondents
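Computer-controlled branching and skip sections, one of the advantages just listed, amount to letting each recorded answer determine the next question. A minimal sketch with a hypothetical survey structure; it is not the format of any tool named earlier:

```python
# Hypothetical branching survey: each answer maps to the id of the next question.
survey = {
    "q1": {"text": "Have you taken an online course?",
           "next": {"yes": "q2", "no": "end"}},
    "q2": {"text": "How satisfied were you?",
           "next": {"satisfied": "end", "unsatisfied": "q3"}},
    "q3": {"text": "What would you improve?",
           "next": {}},   # free text; the survey ends afterward
}

def run(survey, answers):
    """Walk the survey, skipping sections according to each recorded answer."""
    qid, asked = "q1", []
    while qid != "end" and qid in survey:
        asked.append(survey[qid]["text"])
        qid = survey[qid]["next"].get(answers.get(qid), "end")
    return asked

print(run(survey, {"q1": "no"}))                        # only the first question is asked
print(run(survey, {"q1": "yes", "q2": "unsatisfied"}))  # branches into the follow-up
```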
