
Researching the Rubrics Site: www.learnrubrics.ca






Presentation Transcript


  1. Researching the Rubrics Site www.learnrubrics.ca Roberta F. Hammett hammett@mun.ca

  2. ‘Research at Work’ Context • Teaching and Research go hand in hand • Research improves teaching and/or solves teaching problems • But… Ethics! • Negotiate the dual roles so as not to exploit students, who must benefit (more) from the activities, receive their full course requirements/syllabus, and understand the nature and purpose of the research

  3. Goals of the Website • Familiarize teachers and potential teachers/education students with ‘official’ assessments in Newfoundland and Labrador • Share ‘privileged’ information with a broad audience: parents, students themselves (openness of dept processes) • Research: inter-rater reliability, learning tool, new literacies • Encourage understanding and use of multimodal literacies

  4. Beta Site – Pilot Study • The initial site www.rubrics.ca became the experimental site for research, design, and teaching activities • The current site www.learnrubrics.ca resulted from the re-design and from the renewed and continuing interest of the partners

  5. More Advice • Build partnerships with other educational institutions and ‘stakeholders’ – DoE, NLTA, schools, universities and colleagues • Use resources, especially human resources – research technical support (SPSS, Ethnograph), librarians and their workshops • 3 for 1 rule

  6. Inter-rater Reliability • US states use performance-based assessments; investigations of this issue are common (Mabry, 1999; Penny, Johnson, & Gordon, 2000; Coffman, 1971; Cooper, 1984; Cronbach, Linn, Brennan & Haertel, 1995). • Analytic scoring is associated with higher levels of inter-rater reliability than holistic scoring (Breland, 1983; Glaser, 1994). • Specifying limited criteria in analytic writing scores (i.e., the number of descriptors in each criterion at each level) assures reliability by limiting the scope of variability of scores (Moss, 1994; Mabry, 1999). • Scales with more than four levels of proficiency were found to be more effective than those with fewer (Godshalk, Swineford, and Coffman, 1966; McColly & Remstad, 1965; Longford, 1994).
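The inter-rater reliability the slide discusses is typically quantified with an agreement statistic such as Cohen's kappa, which corrects raw agreement for agreement expected by chance. The sketch below is illustrative only: the rater data are hypothetical, not from the study, and the slide does not specify which statistic the cited work used.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical (rubric-level) scores."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of exact agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal score distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric levels (1-5) assigned by two raters to ten writing samples
a = [3, 4, 2, 5, 3, 3, 4, 2, 5, 4]
b = [3, 4, 3, 5, 3, 2, 4, 2, 5, 4]
print(round(cohens_kappa(a, b), 3))  # → 0.73
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance, which is one way the analytic-versus-holistic comparison above can be made concrete.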

  7. Scoring Patterns Research (Beta Site) • DoENL interested in “learning to score” • Analysis of the pilot stage data, including the comparative scores of different user groups (teachers, students, etc.) • The data-gathering capabilities of the beta site have been investigated • Re-design of the Internet site and the design of the next phase of the research
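The comparative analysis of user-group scores mentioned above amounts to grouping each evaluation by the rater's category and averaging, which is what the "Average Scores" graphs on the following slides display. A minimal sketch, using hypothetical evaluation records (the real analysis was done in SPSS, per slide 5):

```python
from statistics import mean

# Hypothetical evaluation records from the site: (user_group, rubric score 1-5)
evaluations = [
    ("teacher", 4), ("teacher", 3), ("teacher", 4),
    ("education student", 3), ("education student", 2),
    ("parent", 5), ("parent", 4),
]

def average_by_group(records):
    """Mean rubric score per user group, as plotted in the comparison graphs."""
    groups = {}
    for group, score in records:
        groups.setdefault(group, []).append(score)
    return {g: mean(scores) for g, scores in groups.items()}

print(average_by_group(evaluations))
```

Comparing these per-group means across writing selections is what lets the study ask whether particular groups (e.g., prospective teachers) converge toward the official scores over time.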

  8. Average Scores by Place: The boy with the dinosaur (Primary #1), 40 Evaluations Completed

  9. User Average Scores: The boy with the dinosaur (Primary #1), 40 Evaluations Completed

  10. Comparison Graph

  11. Comparison Graph (2)

  12. Data Summary

  13. Conclusions • Research questions: • Is the site a good learning tool? • Is the site a good research tool? • Some evidence of improved scoring • Demonstrated the advisability of closely observing a group of users—probably the prospective teachers in our own undergraduate education programs scoring all selections, in sequence. • With sufficient numbers of writing selections at each ability level and sufficient numbers of scorers of each selection, more revealing and accurate patterns of learning could be discerned.

  14. Site Re-design • Reword login categories • More explicit descriptions of the categories • Count numbers of users scoring each selection by user category (anonymous) • Set up ‘closed’ sites for individual user groups (e.g., education students, teachers for panel participation or professional development) – research participation with consent

  15. Teaching and Other Research Directions • Site provides examples of multimodal texts and curricular links • Education students familiarize themselves with the sorts of texts children create • Education students learn to score children’s texts using ‘official’ rubrics • This is the research contention/hypothesis • As a researcher I have access to all kinds of data/information

  16. Project Partners • Faculty of Education, Memorial University • Department of Education Newfoundland & Labrador • Centre for Distance Learning and Innovation (CDLI) • Newfoundland & Labrador Teachers' Association www.learnrubrics.ca

  17. Thank you • www.learnrubrics.ca • hammett@mun.ca
