Georgia Alternate Assessment
Eighth Annual Maryland Conference, October 2007
Melissa Fincher, Georgia Department of Education
Claudia Flowers, UNCC
GAA Purpose • To ensure all students, including students with significant cognitive disabilities, are provided access to the state curriculum • To ensure all students, including students with significant cognitive disabilities, are given the opportunity to demonstrate their progress in learning and achieving high academic standards
Overview of the GAA • The GAA is a portfolio of student work provided as evidence that a student is making progress toward grade-level academic standards. • Evidence provided must show instructional activities and student work that are aligned to specific grade-level standards. • The portfolio system is flexible, allowing for the diversity of the students participating in the GAA.
GAA Core Belief and Guiding Philosophy • All students can learn when provided access to instruction predicated on the state curriculum • Educators are key – significant training and support surrounding curriculum access are critical • Test development and technical documentation are ongoing • and include documentation of decisions surrounding development and implementation • Technical expertise is important • Georgia’s Technical Advisory Committee • Augmented with an AA-AAS expert
Additional Resources • Georgia took advantage of • learning from other states • the growing understanding in the field of alternate assessments and of what students with significant cognitive disabilities can do • US ED’s offer of technical assistance – we elected to focus on technical documentation • invitation to have the National Alternate Assessment Center (NAAC) Expert Review Panel review documentation • National Center on Educational Outcomes (NCEO) Peer Review
Description of GAA • Structured Portfolio • a compilation of student work that documents, measures, and reflects student performance and progress in standards-based knowledge and skills over time
Overview of the GAA
English/Language Arts (Grades K – 8 and 11)
Entry #1: Reading Comprehension
Entry #2: Communication – Writing or Listening/Speaking/Viewing
Mathematics (Grades K – 5)
Entry #1: Numbers and Operations
Entry #2: Choice from Measurement and Geometry, Data Analysis and Probability, or Algebra (grades 3 – 5)
Overview of GAA
Mathematics (Grades 6 – 8 and 11)
Entry #1: Numbers and Operations or Algebra*
Entry #2: Choice from Measurement and Geometry, Data Analysis and Probability, or Algebra
Science (Grades 3 – 8 and 11)
Entry #1: Choice from blueprint, paired with Characteristics of Science standard
Social Studies (Grades 3 – 8 and 11)
Entry #1: Choice from blueprint
*Algebra strand is mandated for grade 11.
Overview of the GAA • There are two collection periods for each entry over the course of the school year; the minimum time between collection periods is 3 weeks. • Teachers collect evidence of student performance within tasks aligned to a specific grade-level content standard. • This evidence shows the student’s progress toward those standards. • Each entry comprises 4 pieces of evidence • Primary and Secondary for Collection Period 1 • Primary and Secondary for Collection Period 2
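To make the entry structure above concrete, here is a minimal sketch of how one entry’s evidence could be represented and the three-week minimum between collection periods checked; the class and field names are illustrative assumptions, not taken from the actual GAA forms.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Evidence:
    collected_on: date
    kind: str          # "primary" or "secondary"
    description: str

@dataclass
class Entry:
    standard_code: str        # the grade-level content standard assessed
    period_1: list            # primary + secondary evidence, Collection Period 1
    period_2: list            # primary + secondary evidence, Collection Period 2

    def meets_minimum_gap(self, min_gap_days: int = 21) -> bool:
        """Check that at least ~3 weeks separate the two collection periods."""
        end_p1 = max(e.collected_on for e in self.period_1)
        start_p2 = min(e.collected_on for e in self.period_2)
        return (start_p2 - end_p1) >= timedelta(days=min_gap_days)
```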
Types of Evidence • Primary Evidence • demonstrates knowledge and/or skills either through work produced by the student or by any means that shows the student’s engagement in instructional tasks. • Secondary Evidence • documents, relates, charts, or interprets the student’s performance on similar instructional tasks.
Sample Evidence: Captioned photos clearly show the student in the process of the task as well as his completed product. The captions describe each step of the task and annotate the student’s success.
Rubric Dimensions • Fidelity to Standard • the degree to which the student’s work addresses the grade-level standard • Context • the degree to which the student work is purposeful and uses grade-appropriate materials in a natural/real-world application • Achievement/Progress • the degree of demonstrated improvement in the student’s performance over time • Generalization • the degree of opportunity given to the student to apply the learned skill in other settings and with various individuals across all content areas assessed
Rangefinding and Scoring • Rangefinding took place in Georgia • committee of general and special educators • scored a representative sample of portfolios • provided guidance and a rationale for each score point assigned, which were used to create scoring guides and training/qualifying sets • Scoring took place in Minnesota • GaDOE staff on site • 15% read behind and other typical quality control checks
Standard Setting • Three performance/achievement standards called ‘Stages of Progress’ • Emerging Progress • Established Progress • Extending Progress • Descriptions written by development committee
Standard Setting Method • Portfolio Pattern Methodology • combined critical aspects of the Body of Work (Kingston, Kahl, Sweeney, & Bay, 2001) and the Massachusetts (Wiener, 2002) models • Holistic view of student work and direct tie to the analytic rubric as applied to performance levels • Standards set by grade bands – K – 2; 3 – 5; 6 – 8; and 11 • Articulation committee reviewed recommendations across grade bands and content areas
Individual Student Report (sample report images: Side 1 and Side 2)
First Year Look at Data • Reliability • The largest potential source of error is the test administrator • Inter-rater agreement (% exact agreement, % adjacent agreement, & Kappa) • Correlation between scores for Entry 1 and Entry 2 • G-study (persons, items, raters) • Comparability of scores across years (planned) • stability over time
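As a rough illustration of the agreement statistics named above (not the GAA’s actual scoring code), the sketch below computes exact agreement, adjacent agreement, and Cohen’s Kappa for two raters; the score arrays are invented for the example.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Invented rubric scores from two raters on the same entries
# (e.g., the 15% read-behind sample).
rater_1 = np.array([3, 2, 4, 1, 3, 2, 2, 4, 3, 1])
rater_2 = np.array([3, 2, 3, 1, 3, 2, 1, 4, 3, 2])

exact = np.mean(rater_1 == rater_2)                 # proportion of exact agreement
adjacent = np.mean(np.abs(rater_1 - rater_2) <= 1)  # within one score point
kappa = cohen_kappa_score(rater_1, rater_2)         # chance-corrected agreement

print(f"Exact agreement:    {exact:.0%}")
print(f"Adjacent agreement: {adjacent:.0%}")
print(f"Cohen's kappa:      {kappa:.2f}")
```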
Inter-rater Agreement Based on 15% read behind; % represents exact agreement.
Kappa Based on 15% read behind.
Sources of Evidence for Validity • Inter-correlation among the dimensions • Content and Alignment—Fidelity to Standard • Consequential Validity Study • Curriculum access (baseline completed) • Alternate assessment (planned) • Comparability of scores across years (planned) • Alignment study (Links for Academic Learning) • Content coverage (depth and breadth) • Differentiation across grades (vertical relationship) • Barriers to learning (bias review) • Alignment of instruction
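For the inter-correlation among the rubric dimensions listed above, a minimal sketch with invented dimension scores might look like the following; a real analysis would use the scored portfolios, broken out by content area and grade band.

```python
import pandas as pd

# Invented dimension scores for a handful of entries; column names follow
# the rubric dimensions described earlier in the presentation.
scores = pd.DataFrame({
    "Fidelity to Standard": [3, 2, 4, 1, 3, 2, 4, 3],
    "Context":              [3, 3, 4, 2, 3, 2, 4, 2],
    "Achievement/Progress": [2, 2, 3, 1, 3, 1, 4, 3],
    "Generalization":       [3, 2, 4, 1, 2, 2, 3, 3],
})

# Pearson inter-correlations among the four rubric dimensions
print(scores.corr().round(2))
```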
The Challenge of Documenting Technical Quality of Alternate Assessments • Diversity of the group of students being assessed and how they demonstrate knowledge and skills • Often “flexible” assessment experiences • Relatively small numbers of students/tests • Evolving view/acceptance of academic curriculum (i.e., learning experiences) • The high degree of involvement of the teacher/assessor in administering the assessment (Gong & Marion, 2006) • A lack of measurement tools for evaluating non-traditional assessment approaches (e.g., portfolios, performance tasks/events), demonstrating a need to expand the conceptualization of technical quality (Linn, Baker, & Dunbar, 1991) NHEAI / NAAC
Georgia Technical Considerations • Heavy investment in training special educators • Curriculum access training for teachers started two years before implementation of the assessment • Four-step process for aligning instruction to grade-level standards • Teacher Training (Regional and Online) • Regional workshops and online presentations • Electronic message/resource board • Sample activities vetted with curriculum staff
GAA Technical Documentation • Our documentation includes • Rationale for our AA approach (portfolio) • Consideration of the purpose of the assessment and its role in our assessment system • Consideration of who the students are and how they build and demonstrate their achievement • Consideration of the content assessed and establishment of alignment, including monitoring plans
GAA Technical Documentation • Our documentation includes • Development of the assessment, including rationale for key decisions • Training and support for educators responsible for compiling portfolios, from both a curriculum access and assessment perspective • Consideration and analysis of potential test bias • Scoring and reporting procedures, including steps taken to minimize error
GAA Technical Documentation • Our documentation includes • Our plans for specific validity studies • Traditional statistics you would expect to find in technical documents: inter-rater reliabilities, score point distributions with standard deviations, and correlations of rubric dimensions by content area • Our goal is to collect validity evidence over time and systematically document the GAA “story”
Next Steps • Continue refining the program • Continue training and support of teachers • Expanding the resource board with adapted lessons and materials • Conduct a series of validity studies • In collaboration with NAAC and other states (GSEG)