Developing Expertise with Classroom Assessment in K-12 Science
• Maryl Gearhart, Sam Nagashima, Jennifer Pfotenhauer, Cheryl Schwab, Craig Strang
• Center for the Assessment and Evaluation of Student Learning (CAESL)
• University of California, Berkeley
• University of California, Los Angeles
• Lawrence Hall of Science
Developing Expertise with Classroom Assessment in K-12 Science
• UCLA: Joan Herman, Sam Nagashima, Ellen Osmundson
• Lawrence Hall of Science: Lynn Barakos, Craig Strang
• WestEd: Diane Carnahan, Karen Cerwin, Kathy DiRanna, Jo Topps
• U.C. Berkeley: Maryl Gearhart, Jennifer Pfotenhauer, Cheryl Schwab
Overview of Session
• Presentations
  • Project purpose and context
  • Framework for quality classroom assessment
  • PD model - the assessment portfolio
  • Research design - focus on one research tool
• Small groups
  • Sample assessment portfolios
  • Research tool: Practice Profiles
• Reconvene
  • Revisions of assessment portfolio
  • Preliminary research findings
  • PD model - assessment leadership
• Q & A
Purposes of R & D • Classroom assessment expertise • PD strategy - assessment portfolio • Research tools • Framework to guide PD and research • New forms of collaboration
Background • Psychometric standards • Student Evaluation Standards (2002) • Standards for Educational and Psychological Testing (1999) • Buros (1990)
Standards for the teaching profession • National Board for Professional Teaching Standards (NBPTS) • California Standards for the Teaching Profession (CSTP) • Performance Assessment for California Teachers (PACT)
Conceptions of Classroom Practice • General models: Stiggins … • Assessment that supports learning: Black & Wiliam, Shepard, Wiggins … • Cognitive models: Knowing What Students Know … • Multi-leveled assessment systems: Shepard, Wilson
Science Education Resources • AAAS, Benchmarks for Science Literacy • National Science Education Standards • NRC, Classroom Assessment
CAESL Mission • Center for Assessment and Evaluation of Student Learning (CAESL) • strengthening K-12 science assessment at all levels of the system
CAESL Quality Classroom Assessment • Quality Goals for Student Learning and Progress • Quality Tools • Quality Use
CAESL Quality Classroom Assessment (framework diagram)
• Quality Goals for Student Learning and Progress
• Quality Tools: Valid for Purpose; Developmentally Sound Content; Clear Expectations; Reliable & Accurate; Fair & Unbiased
• Quality Use: Support Student Learning; Guide for Instruction; Instructional Materials Refinements; Assessment Refinements; Student Involvement and Accountability; Timely Feedback; Equitable
• Sound Interpretation: Quality Criteria; Quality Application; Quality Analysis
• Alignment among Goals, Tools, and Use
CAESL Quality Classroom Assessment CLASSROOM ASSESSMENT SYSTEM • Longitudinal to track student progress • Multileveled • Coordinated and strategic • Feasible
Assessment Portfolio
• Purpose: design, implement, and evaluate an assessment plan for a curriculum unit
• Support for: individual learning and reflection; coaching; collaboration; leadership
• Team organization: cross-site unit teams; K-12 district teams; local coaching
Portfolio sections • Section I: The plan • Section II: Implementation • Section III: Evaluation
I: The plan • Conceptual Flow (CF): establish learning goals; context for assessment planning; deepen knowledge
I: The plan • Record of Assessments in Instructional Materials (RAIM): concepts assessed; Expected Student Responses (ESRs); any need for assessment refinement; key assessments to track progress
II: Implementation • Selected assessments • Documentation of: concepts assessed; Expected Student Responses (ESRs); analysis of student work; instructional follow-up and feedback
III: Evaluation • Prompts to guide review and revision • Revised tasks • Revised criteria
Research Questions • How does assessment expertise develop? • How does the portfolio support the development of assessment expertise?
Design and Data Sources • Longitudinal • Nested design • All participants: surveys; assessment portfolios • Cases: observations; interviews
Research tool: Practice Profiles • Purpose: capture the development of assessment expertise over time • Unit of analysis: teacher practice regarding one assessment
Designing the Profiles • Audience • Comprehensiveness • Detail • Assumptions
Draft Profile Levels • Draft levels of emerging assessment expertise: Inexperienced → Developing → Maturing → Expert
CAESL Quality Classroom Assessment (framework diagram, revisited)
• Quality Goals for Student Learning and Progress
• Quality Tools: Valid for Purpose; Developmentally Sound Content; Clear Expectations; Reliable & Accurate; Fair & Unbiased
• Quality Use: Support Student Learning; Guide for Instruction; Instructional Materials Refinements; Assessment Refinements; Student Involvement and Accountability; Timely Feedback; Equitable
• Sound Interpretation: Quality Criteria; Quality Application; Quality Analysis
• Alignment among Goals, Tools, and Use
Sound Interpretation • Draft components: quality criteria; quality application of criteria; quality analysis
Small groups • Review sample portfolio • Review draft profile for Sound Interpretation
Portfolio Revisions I: Greater focus and integration • RAIM with fewer assessments • Integrated CF & RAIM • Tradeoff: Comprehensive ↔ Focused
Portfolio Revisions II: Scaffolding the analysis • Preliminary criteria for H/M/L (high/medium/low) • Criteria revision using student work • Models of whole-class analysis • Models of target-student analysis • Tradeoff: Resources ↔ Prescriptions
Research Questions • How does assessment expertise develop? • How does the portfolio support the development of assessment expertise?
Findings from ’03-’04 survey • Quality of assessment tools • Assessment use
Teachers feel most comfortable with:
• Using assessment results as evidence of student progress to design developmentally appropriate instruction
• Evaluating the alignment of assessments with learning goals
Teachers feel less comfortable with:
• Evaluating the scientific soundness of assessment content
• Judging whether assessments capture the full range of alternative conceptions
Leadership Context • Districts committed to K-12 science • Districts committed to reform/inquiry-based science
Challenges of Leadership • Varied, still-developing levels of assessment expertise • Varied, still-developing levels of leadership expertise • Science is not the priority; reform is eroding • Quality assessment of reform science is even less of a priority
Goals of Leadership • Growth & improvement of individual practice in instruction & assessment • Leading/guiding individual growth in others • Influencing systemic change in districts: developing/implementing quality assessment systems
Dimensions of Leadership (Program Elements Matrix) • Professional Development • District Policy • Parent/Community Involvement/Support • Develop/Implement Quality Assessment System • Leadership Development & Capacity Building
Leadership Activities • Develop a CF & RAIM for a reform unit with a buddy (or a few) • Participate in the FOSS ASK Project • Serve on and influence district committees • Present to the school board, district administrators, and parent groups • Develop benchmark assessments (!?)