Assessment and Communication Journey of One Institution: Lessons from SPA Reviews, Rubrics, Data Collection and Reporting
Assessment Issues • Specialty Professional Associations (SPAs) reported weaknesses in our rubrics. • Need to provide better feedback to candidates. Download the presentation at: http://education.indiana.edu/aacte2011
The Issue: • “…the absence of tabled scores by category or a description of the range of obtained scores did not allow reviewers to determine whether ALL candidates had achieved at least a score of proficient on ALL standards.” • The Solution: • iRubric’s reporting format, which tables candidate performance by indicator and performance level. • The Journey: • Ongoing collaboration with iRubric in the development of meaningful reporting that reflects candidate performance by indicator/SPA standard.
The Issue: • “Data tables need to be submitted that reflect the number/percentage of candidates scoring at each element of the assessment rubric rather than average scores.” • The Solution: • iRubric’s reports and analysis tools allow administrators to generate detailed and aggregate reports on individual standards and competencies in real time. • The Journey: • Faculty introduction to and use of iRubric’s reports and analysis tools; recognition of the value of these reports and tools.
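To make the data-table requirement concrete, here is a minimal sketch in Python of the kind of aggregation described above: counts and percentages of candidates at each performance level for each indicator, rather than an average score. The performance levels, indicator names, and scores are made up for illustration and are not drawn from iRubric or from actual candidate data.

```python
from collections import Counter

# Hypothetical performance levels and raw rubric scores -- illustrative only,
# not actual Indiana University candidate data.
LEVELS = ["Unsatisfactory", "Basic", "Proficient", "Distinguished"]

scores_by_indicator = {
    "Indicator 1.1": ["Proficient", "Distinguished", "Proficient", "Basic"],
    "Indicator 1.2": ["Distinguished", "Proficient", "Proficient", "Proficient"],
}

def level_table(scores):
    """For each indicator, report the number and percentage of candidates
    at each performance level, rather than a single averaged score."""
    table = {}
    for indicator, earned in scores.items():
        counts = Counter(earned)
        total = len(earned)
        table[indicator] = {
            level: (counts[level], 100 * counts[level] / total)
            for level in LEVELS
        }
    return table

for indicator, row in level_table(scores_by_indicator).items():
    cells = ", ".join(f"{lvl}: {n} ({pct:.0f}%)" for lvl, (n, pct) in row.items())
    print(f"{indicator} -> {cells}")
```

Each printed row mirrors what a reviewer would look for when checking whether all candidates reached at least the proficient level on a given indicator.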
Include detailed descriptors in the cells to give students an idea of what is expected. You can also add new sections with different grade scales.
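As a rough illustration of the rubric structure described above, the sketch below represents a rubric as a list of criteria whose cells pair a performance level and point value with a detailed descriptor. The criterion names, descriptors, and point values are hypothetical and do not reflect iRubric's internal format.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    level: str        # performance level, e.g. "Proficient"
    points: float     # points awarded when this cell is selected
    descriptor: str   # detailed description of work at this level

@dataclass
class Criterion:
    name: str
    cells: list       # one Cell per performance level

# A hypothetical two-criterion rubric: each cell carries a detailed descriptor,
# and point values can differ between sections to reflect different grade scales.
lesson_plan_rubric = [
    Criterion("Learning objectives", [
        Cell("Basic", 2, "Objectives are listed but not measurable."),
        Cell("Proficient", 4, "Objectives are measurable and aligned to standards."),
    ]),
    Criterion("Reflection", [
        Cell("Basic", 1, "Reflection restates what happened in the lesson."),
        Cell("Proficient", 2, "Reflection analyzes evidence of student learning."),
    ]),
]

# Total possible points if the highest cell is selected for every criterion.
total_possible = sum(max(c.points for c in crit.cells) for crit in lesson_plan_rubric)
print(total_possible)
```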
Click on the iRubric icon to grade this assignment using the attached rubric.
Click on the cells and the rubric is automatically tallied and scored. Add comments to give the candidate tips for improvement. The candidate gets clear feedback on areas for improvement and also, by seeing the rubric, knows what is expected. It was this level of detail that the SPAs cited as needing improvement at Bloomington.
Instructor/Candidate Survey • Purpose • To aid understanding of trends in instructor/candidate communication about course assignments, expectations, and earned grades.
Instructor/Candidate Survey • Survey Process • Pre- and post-iRubric-use surveys administered to instructors and their candidates. • Surveys distributed electronically.
Candidate Responses: What, if anything, did you like about the use of iRubric? • “I like that we are not wasting paper, and all of the iRubrics are found in the same place. They are also easy to follow.” • “I like the way iRubric popped up in its own window and clearly showed the grade I earned in highlights.” • “I like how you can read the expectations and then read your teacher’s comments on how you lost points.” • “It was clear and easy to see the expectations for the assignments. When they were graded, I knew exactly why the professor gave me a particular score.”
Candidate Responses: Would you like to see iRubric utilized for grading in your future courses? • “Yes I would – having access to a rubric before completing the assignment is a great tool and advantage, and this particular form of rubric is very handy for the student and the instructor because of its simplicity and availability.” • “Yes, I would like to see iRubric utilized for grading in my future courses because it was easy to obtain and understand how it worked. It was always in the same place and just a simple, but efficient program.” • “Yes, I would, because I appreciate the clarity of feedback and the ease of access online.”
Instructor Responses: How, if at all, did iRubric(s) help save you time and effort in grading, communication, meeting course objectives, etc.? • “Using the iRubric helped in grading extensively since I did not have to worry about being unfair and ensuring that each component matched the assignments. I did spend some time highlighting the rubrics in class so that students were aware of the requirements.” • “It made my grades very transparent and I did not get as many complaints with regards to grading assignments.”
Instructor Responses: What, if anything, have been the top few contributions that iRubric has made to your classroom? • “They provide a nice format for discussing (and impetus to review) requirements for projects.” • “Made me be clearer about my expectations and grading rationale to students.” • “More efficient grading. Clear feedback.”
Closing the Communication Loop • Through iRubric’s Analysis and Reporting engine, administrators have access to detailed aggregated and disaggregated performance data in real time. • Faculty save time: rubrics are scored with a few mouse clicks, and iRubric calculates the score and saves the data. • Candidates stay informed: they have access to the rubric before submitting their work and to the scored rubric after submission.
Lessons Learned • Use the accreditation assessment process as a catalyst for making improvements. • Build easy-to-use solutions within the faculty's normal toolsets. • Get the support of teaching committees to encourage use and create incentives; word of mouth was an important factor in expanding use.
Contact Us • Jill Shedd • jshedd@indiana.edu • Larry Riss • lriss@indiana.edu • Ramesh Sabetiashraf • rs@reazon.com
Indiana University School of Education
Assessment and Communication Journey of One Institution: Lessons from SPA Reviews, Rubrics, Data Collection and Reporting

In learning to navigate new report formats and performance-based data requirements, as well as the requirements of each specialty professional association (SPA), the teacher education unit has learned a variety of lessons, including ways to:
• refine key assessments;
• promote more robust communication among faculty and teaching candidates; and
• document for outside evaluators the performance of candidates and their collective ability to positively impact student learning among a diverse student population.

This presentation, given at the 2011 AACTE National Conference Roundtables Session, describes how the IU School of Education effected a significant change that came as a result of SPA reviews: the design and implementation of an integrated process using the iRubric Assessment & Outcomes System to refine the rubrics used for key assessments within programs and to integrate the technology for developing and using rubrics within the Indiana University course management system. Read about the journey the unit has taken through the SPA program review process, its implications for our assessment practices, and how it demonstrates even greater potential to improve the quality and clarity of communication of standards and expectations between faculty and teaching candidates, and, when needed, to external reviewers.

Download the presentation at: http://education.indiana.edu/aacte2011