Assessment for learning in science: Issues in learning to interpret student work
• Center for the Assessment and Evaluation of Student Learning (CAESL)
• University of California, Berkeley
• University of California, Los Angeles
• Lawrence Hall of Science

UCLA: Shaunna Clark, Joan Herman, Sam Nagashima, Ellen Osmundson, Terry Vendlinski
WestEd: Diane Carnahan, Karen Cerwin, Kathy DiRanna, Jo Topps
Lawrence Hall of Science: Lynn Barakos, Craig Strang
U.C. Berkeley: Maryl Gearhart, Jennifer Pfotenhauer, Cheryl Schwab
Presentation
• Program and participants
• Research framework
• Design and methods
• Selected findings
• Implications
Impetus for program
Situation
• assessments in curriculum materials are of variable quality
• teachers lack the expertise to revise them
• professional assessment practices are not well established
Argument
• science education reform (NRC/NSES)
• known benefits of classroom assessment (e.g., Black & Wiliam, 1998; Sloane & Wilson, 2000)
• value of reflective practice and long-term collaboration (Garet et al., 2001)
CAESL Leadership Academy, 7/03–12/04
Principles
• integrated with practice
• long term
• collaborative
• reflective practice
Core strategy
• assessment portfolio
Program organization
Interwoven structures
• district vertical teams with administrators
• cross-district grade-level teams
• independent classroom implementation
Series of portfolios
• repeated opportunities to build expertise
Portfolio: I. Plan
Establish learning goals
• analyze the ‘conceptual flow’ of materials
• align with standards
Select assessments
• choose key assessments to track progress: pre -> junctures -> post
• identify the concepts assessed
• anticipate ‘expected student responses’
Portfolio: II. Implementation
Interpret student work
• refine assessment criteria
• score
• chart and identify trends
Use evidence
• document instructional follow-up and feedback
Portfolio: III. Evaluate & revise
Evaluate using student work
• alignment with goals and instruction
• quality of tasks and criteria
• methods of analysis
Revise and strengthen
Portfolio Completion
Rated for completeness:
• Complete: I, II (some student work), III
• Partial: I or III, II (some)
• Minimal: I or III only
• None (but participating)
Study Focus
• Growth in understanding and practice
• Supports and barriers
Longitudinal, nested design
• 18 months = 3 portfolios
• Cohort: surveys, focus groups, portfolios
• Cases: interviews and observations
Framework for classroom assessment expertise
• Understanding of assessment concepts
• Facility with assessment practices
Understanding assessment concepts (framework diagram): quality goals for student learning and progress, quality tools, quality use, and sound interpretation.
Classroom assessment practices (cycle diagram): establish goals & design assessment plan, provide instruction, assess, interpret student work, use information to guide teaching & learning, revise assessments.
Using concepts to guide practice (practices cycle diagram, highlighting “interpret student work”)
Soundness of interpretation:
• do criteria capture student understanding?
• is scoring consistent?
• is interpretation appropriate to purpose?
Selected findings
Portfolios: Patterns of change
• assessment criteria
• analysis of whole-class patterns
• alignment
Exit survey and focus groups
• perceived growth
• supports, barriers, needs
Patterns in portfolios
Source
• series of 2 or 3 portfolios (n ≈ 10)
Issues & constraints
• burden of documentation
• paper & pencil assessments
• professional choice
Assessment criteria
• from global/holistic criteria to more specific, differentiated, and assessable criteria
• from a focus on surface features to efforts to capture student understanding
• from dichotomous (right/wrong) scoring to attention to qualitative levels of understanding
• but … quality remained variable despite teacher interest (example: reliance on content analysis or notes)
Whole-class analysis
• from few analyses to efforts at systematic analysis using charting or content analysis
• from global patterns toward more differentiated analysis (item analysis, item clustering) and efforts to coordinate group & individual patterns
• efforts to analyze progress (especially pre-post)
• but … information often unintegrated, inferences unsystematic, comparisons inappropriate
Alignment
• efforts to align interpretations with learning goals, tasks, and criteria
• efforts to revise criteria to strengthen alignment
• fewer inferences about ancillary skills that were not assessed
• but … alignment of assessments and inferences to track progress remained problematic
Exit survey
Understanding of CAESL
• rated 1 (none) to 5 (full)
Implementation of CAESL
• rated 1 (none) to 4 (full) to 5 (beyond)
Exit survey items, mapped onto the practices cycle diagram: Which practices were strengthened most? Which were strengthened least? Which were most important?
Supports
Portfolio
• establishing goals
• revision and re-application
• resource for collaboration
Resources
• CAESL framework
• articles and books
• grade-level teams & facilitators
• long-term program
Barriers
Portfolio
• assessment development
• only paper and pencil
• focus on performance assessments
• time
Resources
• weak assessments
• limited frameworks
• no clear models for progress
• gaps in teacher knowledge
Context
• standards
• testing
• school & district
Unnamed challenges
• inquiry
• ancillary skills
Requests
Resources
• embedded assessments
• a handbook
• conceptual development
• grade-level collaboration
• coaching and facilitation
Portfolio … if:
• streamline
• focus on goals, interpretation, and use
• refinement, not development
• expand assessment types
Context
• align with district and state assessments
Implications
• Strengthen materials & resources
• Expand to unit assessment systems
• Align assessment content and quality
• Modify program organization