
Online Peer Evaluation System Team Green Apple

Presentation Transcript


  1. Online Peer Evaluation System Team Green Apple Team Members: Ada Tse, Amber Bahl, Tom Nichols, Matt Anderson Faculty Mentor: Prof. M. Lutz Project Sponsor: Richard Fasse, RIT Online Learning

  2. Agenda • Project Overview • Current System • Our Product Features • Requirements Process • Product Architecture and Design • Software Process • Risk Analysis and Mitigation • Metrics • Status & Future Goals • Demo

  3. Problem Statement • The RIT Online Learning department needs an online peer evaluation system that can enhance the collaborative learning experience. Existing tools: • Paper alternative • Clipboard Survey System

  4. Importance • Group work is an important aspect of today's education system • The average SE graduate completes about 16 group projects

  5. Current System: Clipboard • Create, Deploy and Analyze • Provides different views for analysis, but is more effective for analyzing surveys than peer evaluations • Very hard to identify problem groups • Not integrated with myCourses • Survey System • Can't deploy evaluations per group • Hard to set up • Reporting does not show group weaknesses • No control over who takes the survey

  6. Current System: Reporting View • View: Percentage/ Graph

  7. Current System: Reporting View

  8. Solution: Custom Application

  9. Peer Evaluation System • Integrated with existing system • Login pass-through • Course and group data imported directly from myCourses • Setup Workflow • Tailored for peer evaluations • Question Templates • Reusable • Shared between instructors
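
A rough, hypothetical sketch of the "course and group data imported directly from myCourses" idea, in C# (the project's stated .NET platform). The feed URL and XML element names below are assumptions, since the actual myCourses feed schema is not described in these slides.

```csharp
// Hypothetical sketch: pull group names for a course from an XML feed.
// The URL and the <group> element are placeholders, not the real
// myCourses endpoint or schema.
using System;
using System.Collections.Generic;
using System.Xml;

class CourseImporter
{
    public static List<string> ImportGroupNames(string feedUrl)
    {
        List<string> groups = new List<string>();

        XmlDocument doc = new XmlDocument();
        doc.Load(feedUrl); // fetches and parses the feed

        // Assumed structure: <course><group name="..."/>...</course>
        foreach (XmlNode node in doc.SelectNodes("//group"))
        {
            groups.Add(node.Attributes["name"].Value);
        }
        return groups;
    }
}
```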

  10. Application Workflow • 1. Create Evaluation (Instructor Main: Create Eval) • 2. Take Evaluation (Student Main: Take Eval) • 3. Analyze Results (Instructor Main: Reporting)

  11. Instructor Main • Evaluations listed per course • Evaluation status • List of global and personal question templates

  12. Solution: Create Evaluation Eval Setup Info Select Template

  13. Solution: Create Templates Global/ Personal

  14. Solution: Student View • Instructions • All students of a group

  15. Solution: Reporting • Reporting provided through multiple views • Multiple levels of detail • By group • By student • Sorted by groups or individuals • Quickly identify problem groups (see the sketch after this slide)
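
As referenced above, a minimal sketch of how "quickly identify problem groups" might work, assuming numeric peer ratings and a simple average-below-threshold rule. The slides do not specify the actual reporting logic, so the types and threshold here are illustrative.

```csharp
// Illustrative sketch only: flag groups whose average peer rating falls
// below a threshold so the instructor can follow up.
using System;
using System.Collections.Generic;

class ReportBuilder
{
    // ratings: group name -> all numeric peer ratings given within that group
    public static List<string> FindProblemGroups(Dictionary<string, List<int>> ratings,
                                                 double threshold)
    {
        List<string> flagged = new List<string>();
        foreach (KeyValuePair<string, List<int>> pair in ratings)
        {
            int sum = 0;
            foreach (int score in pair.Value) sum += score;
            double average = pair.Value.Count > 0 ? (double)sum / pair.Value.Count : 0.0;

            if (average < threshold)
            {
                flagged.Add(pair.Key); // this group stands out in the report
            }
        }
        return flagged;
    }
}
```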

  16. Solution: Reporting View

  17. Solution: Reporting View

  18. Requirements Process • Mainly elicited by: • In-person Interviews • Project Sponsors • Subject Matter Experts • Online Learning Technical Staff • UI Mockups • Evaluating • RIT Clipboard • Peer Evaluation Templates

  19. Requirements Analysis • Use Case Analysis • Workflow Diagrams • Workflow Steps • Constant user feedback at the end of each Sprint

  20. Product Architecture and Design

  21. Entity Relationships

  22. Data Model Architecture

  23. Package Diagram

  24. Deployment Diagram

  25. Software Process

  26. Process: Scrum • What is Scrum? • Scrum is an iterative, incremental process for developing any product or managing any work. It produces a potentially shippable set of functionality at the end of every iteration (Sprint).

  27. Scrum: Sprint • Typical team size 2 to 4 members • Delivers working software • Typically between 1-4 week iterations • Cross-functional tasks per team member • New work may be uncovered by the team during development

  28. Our Methodology • Flavor of Scrum • Differences: • Upfront requirements • Postponed the Sprint 1 delivery date by 2 weeks • Similarities: • The whole project was implemented in chunks (Sprints) based on requirements prioritization (Sprint Backlogs) • Team meetings

  29. Risk Analysis and Mitigation

  30. Risk • New Technologies • .NET • Integration with myCourses • XML Feeds • Testing • LDAP Authentication • Complexity of business requirements

  31. Risk Mitigation: Task Planning • New Technologies • Allocated tasks according to skill set • Team members started off with small/simple programs • Experienced team members educated the team

  32. Risk Mitigation: Development • LDAP & myCourses integration • Great help from the Online Learning staff (an illustrative LDAP sketch follows) • Complex business requirements • Incremental development & comprehensive requirements gathering
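
A minimal sketch of the kind of LDAP bind behind the login pass-through, assuming a .NET 2.0 DirectoryEntry bind; the server path is a placeholder, and the real integration details came from the Online Learning staff.

```csharp
// Illustrative only: authenticate by binding to the directory with the
// user's own credentials. "LDAP://ldap.example.edu" is a placeholder path.
using System;
using System.DirectoryServices; // requires a reference to System.DirectoryServices.dll

class LdapAuthenticator
{
    public static bool Authenticate(string username, string password)
    {
        try
        {
            DirectoryEntry entry = new DirectoryEntry("LDAP://ldap.example.edu",
                                                      username, password);
            object native = entry.NativeObject; // forces the bind to happen now
            return true;
        }
        catch (System.Runtime.InteropServices.COMException)
        {
            return false; // bind failed: bad credentials or unreachable server
        }
    }
}
```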

  33. Risk Mitigation Plan: Software Process • Use of Scrum • User Feedback (allows for midcourse corrections) • Increased Product Visibility • Increased Progress Visibility • Sprint Planning • Requirements were revised across many sprints to ensure clarity was achieved • Throughout every sprint, each decision was evaluated to make sure it aligned with the overall goals of the project

  34. Risk Mitigation: Tooling • Subversion for revision control • Google Groups • Trac provided web-based project management • View files and changesets • Automated synchronization of project documents to the web site • Integrated bug tracking system

  35. Data Collection

  36. Metrics • Backlogs • Product • Sprint • Number of tasks completed for a particular sprint (work effort distributed for each sprint) • Number of bugs • By Feature • By Severity • Per Sprint • Total effort (man hours) for all phases

  37. Effort Metrics

  38. Bugs Per Feature • Total # of bugs: 53 • Major: 22 • Minor: 11 • Trivial: 20

  39. Current Status • Progress

  40. Future Enhancements • More views for reporting • Currently our application supports 2 views • High-level: groups + students • Team member + responders + questions • Better support for answer types (a data-model sketch follows) • Currently our application supports: • Text type • Radio button
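
As noted in the slide above, answer types are currently limited to text and radio buttons. Below is a hypothetical sketch of how answer types could be modeled so new types can be added later; the enum values and the Question class are illustrative, not the project's actual data model.

```csharp
// Hypothetical data-model sketch for extensible answer types.
using System;

enum AnswerType
{
    Text,        // free-form comment box
    RadioButton  // single choice on a rating scale
    // Future enhancement: checkboxes, Likert matrices, numeric sliders, ...
}

class Question
{
    public string Prompt;
    public AnswerType Type;

    public Question(string prompt, AnswerType type)
    {
        Prompt = prompt;
        Type = type;
    }
}
```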

  41. Reflections • Great Team!!! • All team members were new to the group • Appropriate Software Process Model • Delays in Sprint 1 • Unknown Technologies • .NET 2.0

  42. Demo • Peer Evaluation System

  43. Questions • Thank you!

  44. Supporting Data

  45. Supporting Data

  46. Challenges • Uniformity • Rating System • Question System • Faculty View • Different User Types • Synchronization with myCourses
