A Dynamic Process for Developing Performance-Based Assessments
• Dr. Louise Tanney, Maryland State Department of Education
• Elizabeth (Liz) Neal, Maryland State Department of Education
• Kate G. Rohrbaugh, ORC Macro
1995 Maryland Redesign of Teacher Education
• Component I: Strong academic background
• Component II: Extensive internship experience
• Component III: Performance assessment
• Component IV: Linkage with K-12 priorities
  • Alignment with Maryland content standards
  • Diversity
  • Maryland Teacher Technology Standards
Maryland Teacher Technology Standards (1999)
• MTTS I. Information Access, Evaluation, Processing and Application
• MTTS II. Communication
• MTTS III. Legal, Social and Ethical Issues
• MTTS IV. Assessment for Administration and Instruction
• MTTS V. Integrating Technology into the Curriculum and Instruction
• MTTS VI. Assistive Technology
• MTTS VII. Professional Growth
MTTS Online Website: www.mttsonline.org
Development of Performance Task
• Identify Standard and Indicators
• Identify Knowledge, Skills, and Abilities (KSAs)
• Implement the "Promise of Technology"
• Develop Performance Task and Scoring Tool
• Write Detailed Instructor's Notes
MTTS IV: Assessment for Administration and Instruction
Indicators:
• Research and analyze data related to student and school performance.
• Apply findings and solutions to establish instructional and school improvement goals.
• Use appropriate technology to share results and solutions with others, such as parents and the larger community.
Pilots of MTTS IV (2001, 2005)
• Goal: Check alignment of Task, Indicators, KSAs, and Scoring Tool
• Evaluation instruments completed by:
  • Pilot Instructors
  • Teacher Candidates
• Questions on student instrument:
  • Self-ratings of KSAs
  • Self-ratings of Task Indicators
• Collected performance data from Pilot Instructors
• Held feedback session for Pilot Instructors
Findings from Teacher Candidates
• 46% were "Proficient"
• Self-ratings of KSAs were low at the start*
  • 23%: Knowledge of types of data elements related to student and school performance
  • 40%: Ability to reflect on student and teacher performance data to ensure continuous improvement
*Percentage rating themselves "Advanced" or "Higher Intermediate"
More Findings
• Changes in self-ratings of Indicators from start to end*
  • 37% to 84%: Able to research and analyze data related to student performance
  • 28% to 75%: Able to apply findings and solutions from data to establish school improvement goals
*Percentage rating themselves "At an Advanced Level" or "At an Intermediate Level"
Lessons Learned
• Task revisions:
  • Create more scaffolding within the task
  • Align the Scoring Tool more closely with the Standards and Indicators
• Recommendations to the Consortium:
  • Integrate KSAs more fully into the teacher education program
  • Pilot the task with instructors outside the consortium
Lessons Learned (continued)
• Collect sample Teacher Candidate products
• Simplify tasks
• Provide answer keys to Pilot Instructors
• Recruit external instructors