Assessment & Technology UH-M COLLEGE OF EDUCATION COE Outreach & Technology
Electronic Exhibit Room • Systematic assessment of evidence of student learning at multiple points in the program • Program assessment data compiled to an internal website • NCATE reviewers can access the site at their convenience • Data remain available between reviews • Easy for program faculty to maintain
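The slides do not say how the internal exhibit-room site is produced; as a minimal sketch, assuming assessment summaries are already available as plain Python data, a short script could compile them into a static HTML page that reviewers browse between visits. The data shape, program names, and file name below are all hypothetical.

```python
# Minimal sketch: compile program assessment summaries into one static
# HTML page for an internal exhibit room. Data and names are hypothetical;
# real data would come from the programs' own records.
from pathlib import Path

programs = {
    "ABC Program": {"cohort": "2003-2004", "candidates_reviewed": 42},
    "XYZ Program": {"cohort": "2003-2004", "candidates_reviewed": 18},
}

rows = "\n".join(
    f"<tr><td>{name}</td><td>{d['cohort']}</td><td>{d['candidates_reviewed']}</td></tr>"
    for name, d in programs.items()
)
page = (
    "<h1>Electronic Exhibit Room</h1>"
    "<table><tr><th>Program</th><th>Cohort</th><th>Candidates reviewed</th></tr>"
    f"{rows}</table>"
)
Path("exhibit_room.html").write_text(page)
```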
Program Assessment: Assessing Programs by Assessing Candidates • Assessed by: • Candidates (exit surveys, course evaluations) • Alumni (surveys, focus groups) • Employers • Mentor Teachers • COPR Process • Learned Societies • Candidate Learning Outcomes Review
Candidate Assessment • Evidence Collection = Portfolio • Grade Reports • Exam Scores • Faculty Observation Summaries • Students’ Work Samples • Candidate Portfolio Tools • PowerPoint (hyperlinks, external files, branching) • TaskStream (online) • CD-R
Assessing Learning Outcomes • Define Program Objectives • Define Points of Measurement • Define Evidence for Objectives • Define Rubric for Assessing Evidence • Delineate Who Evaluates and When
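Taken together, the five steps above amount to a small data model. Here is one hedged sketch in Python; all class and field names are illustrative only, not the COE's actual schema.

```python
# Sketch of the five planning steps as a data model; each field maps to
# one step. Names are illustrative, not part of the COE process.
from dataclasses import dataclass

@dataclass
class Rubric:
    levels: dict[int, str]        # step 4: e.g. {3: "Target", 2: "Acceptable", 1: "Unacceptable"}

@dataclass
class Objective:
    description: str              # step 1: the program objective itself
    measurement_points: list[str] # step 2: when candidates are assessed
    evidence: list[str]           # step 3: artifacts tied to this objective
    rubric: Rubric                # step 4: how the evidence is scored
    evaluators: list[str]         # step 5: who evaluates, and when
```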
Example (part 1: Program defines objectives) • ABC Program Objective 1: (Knowledge) Candidates know … (Skill) Candidates are able to … (Disposition) Candidates exhibit … Objective 2: Objective 3: Objective 4:
Example (part 2: Program chooses points of measurement) Program will assess candidates at: • Beginning (defined: immediately upon admission) • Middle (defined: conclusion of EDUC XXXX course and/or prior to student teaching) • End (defined: conclusion of field experience XXXX)
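Expressed as code, the three measurement points could be a simple enumeration. The definitions below quote the slide (course numbers left elided as on the slide); the class name is hypothetical.

```python
# The three measurement points as an enumeration; definitions quote the
# slide, with course numbers left elided. The class name is hypothetical.
from enum import Enum

class MeasurementPoint(Enum):
    BEGINNING = "immediately upon admission"
    MIDDLE = "conclusion of EDUC XXXX course and/or prior to student teaching"
    END = "conclusion of field experience XXXX"
```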
Example (part 3: Program assigns evidence to objectives) • Objective 1: Professional Legal Responsibilities The teacher candidate demonstrates an understanding of (knowledge), the ability to apply (skill), and the disposition to model (disposition) the legal responsibilities expected of professional educators.
Example (part 4: Program defines rubric scale for evidence) • 3 Target: Evidence reflects in-depth knowledge and understanding of the standard; outstanding data and evidence of application • 2 Acceptable: Evidence indicates knowledge and understanding of the standard; satisfactory data and evidence of application • 1 Unacceptable: Evidence shows little or inadequate knowledge of the standard; limited data and evidence of application
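As one illustration, the three-level scale translates directly into a lookup table; the constant and function names below are made up for this sketch.

```python
# The three-level rubric scale as a lookup table; constant and function
# names are hypothetical.
RUBRIC_SCALE = {
    3: "Target",
    2: "Acceptable",
    1: "Unacceptable",
}

def label(score: int) -> str:
    """Map a numeric rubric score to its label."""
    return RUBRIC_SCALE.get(score, "unknown")
```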
Example (part 5: Program states who will measure and when) Candidate Outcomes Review • Faculty assigned to review candidate outcomes • Review Committee determines program completion for each candidate • Candidate outcomes aggregated • Summary data on the cohort provided to the Associate Dean
Example (part 6: Composite candidate scores defined and measured) • Summarize each candidate’s mid-point assessment, e.g.: • Overall Unacceptable: 1 or more unacceptable scores • Overall Acceptable: 0 unacceptables, fewer than 5 superior scores • Overall Superior: 0 unacceptables, 5 or more superior scores
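Under the reading that "superior" here is the top rubric score (the level labeled Target in part 4), the composite rule can be sketched as a short function, together with the cohort aggregation mentioned in part 5. Function names and the sample cohort are hypothetical.

```python
# Sketch of the example composite rule, reading "superior" as the top
# rubric score (labeled Target in part 4). Sample data is hypothetical.
from collections import Counter

def overall_rating(scores: list[int]) -> str:
    """Classify one candidate's mid-point assessment from per-objective scores."""
    counts = Counter(scores)
    if counts[1] >= 1:      # any unacceptable score -> overall unacceptable
        return "Unacceptable"
    if counts[3] >= 5:      # no unacceptables and 5 or more superior scores
        return "Superior"
    return "Acceptable"     # no unacceptables, fewer than 5 superior scores

# Cohort aggregation for the summary report mentioned in part 5:
cohort = {"candidate_a": [3, 3, 2, 2, 3], "candidate_b": [3, 1, 2, 3, 3]}
summary = Counter(overall_rating(s) for s in cohort.values())
print(summary)  # Counter({'Acceptable': 1, 'Unacceptable': 1})
```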
Use of Technology • Candidate use: collect and present evidence • Program use: assess candidate learning • Program use: aggregated data for program review • College use: maintain data over time for accreditation purposes
Challenges • Requires a shift in thinking: from grades to authentic assessment of learning outcomes • Program objectives must be made explicit • Faculty must agree on rubrics and scales • Ways to manage the process must be identified • Technology must be helpful, not burdensome