Sharon M. Livingston, Ph.D.
Assistant Professor and Director of Assessment
Department of Education, LaGrange College, LaGrange, GA
GaPSC Regional Assessment Workshops
May 17, 2012, 1:30 – 3:00 p.m., Paine College, Augusta, Georgia
Session 3: Deconstructing Standard Two
• Element 2a: Assessment System
• Element 2b: Data Collection, Analysis, & Evaluation
• Element 2c: Use of Data for Program Improvement
Standard Two: Assessment System and Unit Evaluation
The professional education unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and professional education unit operations to evaluate and improve the unit and its preparation programs.
Element 2a. Assessment System
Focus on Three Aspects
Design & Development of System • Involve Professional Community • Reflect Conceptual Framework • Reflect State and Professional Standards • Target Candidate Performance with Key Assessments • Target Unit Operations • Include Multiple & Comprehensive Measures • Include Multiple Transition Points
Implementation of System
• Involve Professional Community
• Have an Assessment Calendar
• Use Protocols, Instruments, & Rubrics
• Have Consistent Collection of Data
• Have Consistent Analysis of Data
Evaluation of System
• Involve Professional Community
• Have Regular Evaluation of Assessment Procedures
• Have Regular Evaluation of Unit Operations
• Check for Fairness, Accuracy, Consistency, and Freedom from Bias
• Strive for Continuous Improvement
EVIDENCE
Show Evidence for Each Aspect
Assessment System Evaluation Rubric: Three Levels
Unacceptable
• System does not reflect standards
• Has not involved professional community
• System is limited in its capacity to monitor candidate performance, unit operations, and programs
• Assessments are limited
• Bias, fairness, accuracy, and consistency not addressed
Acceptable
• System aligns with conceptual framework, and state and professional standards
• Regularly involves professional community
• Uses comprehensive & integrated measures to assess/evaluate candidate performance, unit operations, and programs
• Makes decisions based on multiple assessments made at multiple points throughout programs
• Takes steps to address bias, fairness, accuracy, and consistency
Target
• Meets all “Acceptable” criteria PLUS
• Regularly evaluates effectiveness of system
• Regularly examines validity and utility of data
• Makes modifications to keep current
• Determines relationship between program success and employment success
• Makes data-driven changes for improvement
Sub-element of Standard 2a
• How does the unit monitor candidate performance from admissions to program completion?
Example: The unit uses key assessment measures at 5 key transition points from admissions to completion to assess candidate knowledge, skills, and professional dispositions in its initial B.A. and M.A.T. programs.
Sub-element of Standard 2a
• How does the unit ensure its assessments are free of bias and its assessment procedures and unit operations are fair, accurate, and consistent?
• 5 AFIs (Areas for Improvement) cited from 2007-2011
Examples
Fair:
• Multiple decision points in time
• Multiple indicators
• Multiple assessments in multiple forms
• Multiple assessors
Accurate:
• Crosswalks & tables used to align assessments with all standards
• Electronic database used to collect and hold data
Consistent:
• Assessment calendar is followed
• Protocols and explicit rubrics
• Rater training
• Surveys checked for reliability
Bias Free:
• Rater training
• Inter-rater reliability checked on field instruments
• Review instruments for unfairness & offensiveness
• Check for disparate impact (see the sketch below)
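To make the reliability and fairness checks above concrete, here is a minimal Python sketch. Cohen's kappa is one common statistic for inter-rater agreement, and the four-fifths rule is one common screen for disparate impact; the scores, group labels, and function names below are invented for illustration and are not the unit's actual instruments or procedures.

```python
# Hypothetical sketch: two checks of the kind listed above.
# All scores and pass rates are invented illustration data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Inter-rater agreement corrected for chance (Cohen's kappa)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n**2
    return (observed - expected) / (1 - expected)

def disparate_impact_ratio(pass_rates_by_group):
    """Ratio of lowest to highest group pass rate; the four-fifths rule flags values below 0.8."""
    rates = list(pass_rates_by_group.values())
    return min(rates) / max(rates)

# Two supervisors scoring the same ten field observations on a 1-4 rubric.
rater_a = [3, 4, 2, 3, 4, 3, 2, 4, 3, 3]
rater_b = [3, 4, 2, 3, 3, 3, 2, 4, 4, 3]
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")

# Pass rates on a key assessment, broken out by candidate group.
rates_by_group = {"group_1": 0.91, "group_2": 0.85}
print(f"Disparate impact ratio: {disparate_impact_ratio(rates_by_group):.2f}")
```

A kappa well below 1.0 or an impact ratio below 0.8 would prompt rater retraining or an instrument review of the kind the slide describes.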
Sub-element of Standard 2a
• What assessments and evaluations are used to monitor and improve candidate performance, unit programs, and unit operations?
Examples:
• GACE content pass rates
• At the end of each semester, candidates complete a faculty evaluation.
• Each year, faculty complete a survey about the unit.
• Program completers complete an exit survey.
• The Department Chair completes a program report annually.
Additional Evidence
• Admission process
• Advising system
• Unit governance structure
• Candidate evaluation of supervising faculty, cooperating teachers, and field director
• Supervising faculty and cooperating teachers evaluate the field director
• School principals of program completers complete a first-year survey
Sub-element of Standard 2a
• How is the unit assessment system evaluated and continuously improved?
Example: Assessment issues are discussed at monthly department meetings. Every semester, unit faculty and the arts and sciences faculty who teach in the initial programs discuss candidate results from the previous semester. Every fall semester, we hold Advisory Council meetings for each program, involving supervising faculty, cooperating teachers, school principals, and program completers, to discuss data and the assessment system.
Sub-element of Standard 2a
• How does the unit ensure that the assessment system collects information on candidate proficiencies outlined in the unit’s conceptual framework, state standards, and professional standards?
Example: Crosswalks were developed that show the alignment between the conceptual framework (CF), state standards, professional standards, and key assessments. All course syllabi reflect the alignment between assessments and the CF and standards.
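As an illustration only, a crosswalk of the kind described above can be kept as a simple mapping from each key assessment to the standards it addresses, which makes alignment gaps easy to spot. The assessment names and standard codes in this minimal sketch are invented placeholders, not the unit's actual crosswalk.

```python
# Hypothetical crosswalk sketch: assessment names and standard codes are
# invented placeholders, not the unit's actual alignment.
crosswalk = {
    "Key Assessment 1: Content Exam": {
        "conceptual_framework": ["CF Outcome 1"],
        "state_standards": ["State Standard 1"],
        "professional_standards": ["SPA Standard 1"],
    },
    "Key Assessment 2: Candidate Work Sample": {
        "conceptual_framework": ["CF Outcome 2", "CF Outcome 3"],
        "state_standards": ["State Standard 2"],
        "professional_standards": [],  # an alignment gap to resolve
    },
}

# Flag any key assessment not yet aligned to all three sources of standards.
for assessment, alignment in crosswalk.items():
    gaps = [source for source, codes in alignment.items() if not codes]
    if gaps:
        print(f"{assessment}: missing alignment for {', '.join(gaps)}")
```

Running the check before each reporting cycle would surface assessments that still need to be tied to the CF, a state standard, or a professional standard.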