2. GAA Purpose To ensure all students, including students with significant cognitive disabilities, are provided access to the state curriculum
To ensure all students, including students with significant cognitive disabilities, are given the opportunity to demonstrate their progress in learning and achieving high academic standards
3. Overview of the GAA The GAA is a portfolio of student work provided as evidence that a student is making progress toward grade-level academic standards.
Evidence provided must show instructional activities and student work that are aligned to specific grade-level standards.
The portfolio system is flexible, allowing for the diversity of the students participating in the GAA.
4. GAA Core Belief and Guiding Philosophy All students can learn when provided access to instruction predicated on the state curriculum
Educators are key – significant training and support surrounding curriculum access is critical
Test development and technical documentation are ongoing
and include documentation of decisions surrounding development and implementation
Technical expertise is important
Georgia’s Technical Advisory Committee
Augmented with an AA-AAS expert
5. Additional Resources Georgia took advantage of
Learning from other states
The growing understanding in the field of alternate assessments and what students with significant cognitive disabilities can do
US ED’s offer of technical assistance
We elected to focus on technical documentation
Invitation to have the National Alternate Assessment Center (NAAC) Expert Review Panel review documentation
National Center on Educational Outcomes (NCEO)
Peer Review
6. Description of GAA Structured Portfolio
a compilation of student work that documents, measures, and reflects student performance and progress in standards-based knowledge and skills over time
7. Overview of the GAA English/Language Arts (Grades K – 8 and 11)
Entry #1: Reading Comprehension
Entry #2: Communication – Writing or Listening/Speaking/Viewing
Mathematics (Grades K – 5)
Entry #1: Numbers and Operations
Entry #2: Choice from
Measurement and Geometry
Data Analysis and Probability or
Algebra (grades 3 – 5)
The window between Collection Period 1 and Collection Period 2 is from a minimum of 3 weeks to a maximum of 5 months.
Teachers will collect evidence of student performance on tasks aligned to a specific content standard.
The evidence will show the student’s progress toward these standards.
8. Overview of GAA Mathematics (Grades 6 – 8 and 11)
Entry #1: Numbers and Operations or Algebra*
Entry #2: Choice from
Measurement and Geometry
Data Analysis and Probability
Algebra
Science (Grades 3 – 8 and 11)
Entry #1: Choice from blueprint, paired with Characteristics of Science standard
Social Studies (Grades 3 – 8 and 11)
Entry #1: Choice from blueprint
*Algebra strand is mandated for grade 11.
9. Overview of the GAA There are two collection periods for each entry over the course of the school year—minimum time between collection periods is 3 weeks.
Teachers collect evidence of student performance within tasks aligned to a specific grade-level content standard.
This evidence shows the student’s progress toward those standards.
Each entry comprises 4 pieces of evidence:
Primary and Secondary for Collection Period 1
Primary and Secondary for Collection Period 2
10. Types of Evidence Primary Evidence
demonstrates knowledge and/or skills either through work produced by the student or by any means that shows the student’s engagement in instructional tasks.
Secondary Evidence
documents, relates, charts, or interprets the student’s performance on similar instructional tasks.
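As a rough model of the entry structure described above (two collection periods, each holding one piece of primary and one piece of secondary evidence, separated by at least 3 weeks), the Python sketch below is illustrative only; the class and field names are assumptions, not part of the GAA specification.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative model of one GAA entry: two collection periods, each holding
# one piece of primary and one piece of secondary evidence. The names and
# fields below are assumptions for illustration, not the GAA specification.

@dataclass
class Evidence:
    kind: str          # "primary" or "secondary"
    description: str   # e.g., student work sample, data chart, captioned photos
    collected_on: date

@dataclass
class CollectionPeriod:
    primary: Evidence
    secondary: Evidence

@dataclass
class Entry:
    content_area: str   # e.g., "Mathematics"
    standard: str       # grade-level content standard the tasks align to
    period_1: CollectionPeriod
    period_2: CollectionPeriod

    def window_ok(self) -> bool:
        """Check the minimum 3-week window between the two collection periods."""
        gap = self.period_2.primary.collected_on - self.period_1.primary.collected_on
        return gap >= timedelta(weeks=3)
```

A check like window_ok() mirrors the 3-week minimum noted above; the 5-month maximum for grades K – 5 could be validated the same way.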
11. Entry
12. Data Analysis and Probability
13. You can provide a synopsis of all the photos rather than captioning each one, if you choose.
14. Rubric Dimensions Fidelity to Standard
the degree to which the student’s work addresses the grade-level standard
Context
the degree to which the student work is purposeful and uses grade-appropriate materials in a natural/real-world application
Achievement/Progress
the degree of demonstrated improvement in the student’s performance over time
Generalization
the degree of opportunity given to the student to apply the learned skill in other settings and with various individuals across all content areas assessed
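A minimal sketch of how the four rubric dimensions might be recorded for one entry; the 0 to 4 score range shown is an assumption for illustration, not the published GAA scale.

```python
from dataclasses import dataclass

# Hypothetical record of the four rubric dimensions for one entry.
# The 0-4 range is assumed for illustration only.

@dataclass
class RubricScore:
    fidelity_to_standard: int    # alignment of the work to the grade-level standard
    context: int                 # purposeful, grade-appropriate, real-world application
    achievement_progress: int    # improvement demonstrated over time
    generalization: int          # skill applied across settings and with various individuals

    def as_tuple(self):
        return (self.fidelity_to_standard, self.context,
                self.achievement_progress, self.generalization)
```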
15. Rangefinding and Scoring Rangefinding took place in Georgia
committee of general and special educators
scored a representative sample of portfolios
provided guidance and a rationale for each score point assigned, which were used to create scoring guides and training/qualifying sets
Scoring took place in Minnesota
GaDOE staff on site
15% read-behind and other typical quality control checks
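A simple way to draw a 15% read-behind sample like the one mentioned above; the function is a sketch under assumed names, not the vendor’s actual procedure.

```python
import random

# Sketch only: randomly flag a share of portfolios for a second, independent
# read (a "read-behind"). The actual vendor procedure is not documented here.
def select_read_behind(portfolio_ids, rate=0.15, seed=None):
    rng = random.Random(seed)
    k = max(1, round(rate * len(portfolio_ids)))
    return rng.sample(list(portfolio_ids), k)

# Example: flag 15% of 200 portfolios for read-behind checks.
sample = select_read_behind(range(200), rate=0.15, seed=42)
print(len(sample))   # 30
```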
16. 2006 – 2007 Entries by Grade
17. Standard Setting Three performance/achievement standards called ‘Stages of Progress’
Emerging Progress
Established Progress
Extending Progress
Descriptions written by development committee
18. Definitions of Stages of Progress
19. Standard Setting Method Portfolio Pattern Methodology
combined critical aspects of the Body of Work (Kingston, Kahl, Sweeney, & Bay, 2001) and the Massachusetts (Wiener, 2002) models
Holistic view of student work and direct tie to the analytic rubric as applied to performance levels
Standards set by grade bands – K – 2; 3 – 5; 6 – 8; and 11
Articulation committee reviewed recommendations across grade bands and content areas
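A hypothetical sketch of the idea behind mapping an entry’s rubric pattern to a Stage of Progress; the cut values below are invented for illustration and are not the cuts set by Georgia’s committees, which used holistic judgments of whole bodies of work.

```python
# Hypothetical illustration of a pattern-to-stage mapping.
# The real Portfolio Pattern Methodology relied on committee review of
# whole portfolios; these thresholds are invented for illustration only.

STAGES = ("Emerging Progress", "Established Progress", "Extending Progress")

def stage_for_entry(scores, established_cut=8, extending_cut=12):
    """Map the sum of the four dimension scores to a Stage of Progress."""
    total = sum(scores)   # e.g., (fidelity, context, progress, generalization)
    if total >= extending_cut:
        return STAGES[2]
    if total >= established_cut:
        return STAGES[1]
    return STAGES[0]

print(stage_for_entry((3, 2, 3, 2)))   # "Established Progress" under these example cuts
```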
20. Individual Student Report
21. First Year Look at Data Reliability
The largest potential source of error is the test administrator
Inter-rater agreement (% exact agreement, % adjacent agreement, & Kappa)
Correlation between scores for Entry 1 and Entry 2
G-study (persons, items, raters)
Comparability of scores across years (planned)
stability over time
22. Inter-rater Agreement
23. Kappa
24. Correlation Between Entry 1 and Entry 2
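As a rough illustration of the reliability statistics named above, the sketch below computes percent exact agreement, percent adjacent agreement, Cohen’s kappa, and the Entry 1/Entry 2 correlation from made-up rater scores; none of these values come from the GAA data.

```python
from collections import Counter

def exact_agreement(r1, r2):
    """Proportion of cases where two raters assign the identical score."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def adjacent_agreement(r1, r2):
    """Proportion of cases where the two scores differ by at most one point."""
    return sum(abs(a - b) <= 1 for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = exact_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def pearson_r(x, y):
    """Correlation, e.g., between Entry 1 and Entry 2 scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Made-up scores from two raters on ten entries.
rater_1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_2 = [3, 2, 3, 3, 1, 2, 4, 4, 2, 2]
print(exact_agreement(rater_1, rater_2))     # 0.7
print(adjacent_agreement(rater_1, rater_2))  # 1.0
print(round(cohens_kappa(rater_1, rater_2), 2))

# Made-up Entry 1 and Entry 2 totals for the same students.
entry_1 = [10, 7, 12, 9, 6, 11, 8, 13, 9, 10]
entry_2 = [9, 8, 12, 10, 6, 12, 7, 13, 10, 9]
print(round(pearson_r(entry_1, entry_2), 2))
```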
25. Sources of Evidence for Validity Inter-correlation among the dimensions
Content and Alignment—Fidelity to Standard
Consequential Validity Study
Curriculum access (baseline completed)
Alternate assessment (planned)
Comparability of scores across years (planned)
Alignment study (Links for Academic Learning)
Content coverage (depth and breadth)
Differentiation across grades (vertical relationship)
Barriers to learning (bias review)
Alignment of instruction
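The inter-correlation among the four rubric dimensions listed above can be summarized as a correlation matrix; a minimal sketch with numpy and made-up dimension scores (not GAA results):

```python
import numpy as np

# Made-up dimension scores for eight entries (rows) across the four rubric
# dimensions (columns): fidelity, context, progress, generalization.
scores = np.array([
    [3, 2, 3, 2],
    [4, 3, 4, 3],
    [2, 2, 1, 2],
    [3, 3, 3, 3],
    [1, 2, 1, 1],
    [4, 4, 3, 4],
    [2, 1, 2, 2],
    [3, 3, 4, 3],
])

# rowvar=False: columns are variables, rows are observations.
dimension_corr = np.corrcoef(scores, rowvar=False)
print(np.round(dimension_corr, 2))
```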
26. ELA Entry 1 Correlations for Grade 3
27. The Challenge of Documenting Technical Quality of Alternate Assessments Diversity of the group of students being assessed and how they demonstrate knowledge and skills
Often “flexible” assessment experiences
Relatively small numbers of students/tests
Evolving view/acceptance of academic curriculum (i.e., learning experiences)
The high degree of involvement of the teacher/assessor in administering the assessment (Gong & Marion, 2006)
A lack of measurement tools for evaluating non-traditional assessment approaches (e.g., portfolios, performance tasks/events), demonstrating a need to expand the conceptualization of technical quality (Linn, Baker, & Dunbar, 1991)
NHEAI / NAAC
28. Georgia Technical Considerations Heavy investment in training special educators
Curriculum access training for teachers started two years before implementation of the assessment
Four-step process for aligning instruction to grade-level standards
Teacher Training (Regional and Online)
Regional workshops and online presentations
Electronic message/resource board
Sample activities vetted with curriculum staff
29. GAA Technical Documentation Our documentation includes
Rationale for our AA approach (portfolio)
Consideration of the purpose of the assessment and its role in our assessment system
Consideration of who the students are and how they build and demonstrate their achievement
Consideration of the content assessed and establishment of alignment, including monitoring plans
30. GAA Technical Documentation Our documentation includes
Development of the assessment, including rationale for key decisions
Training and support for educators responsible for compiling portfolios from both a curriculum access and assessment perspective
Consideration and analysis of potential test bias
Scoring and reporting procedures, including steps taken to minimize error
31. GAA Technical Documentation Our documentation includes
Our plans for specific validity studies
Traditional statistics you would expect to find in technical documents
Inter-rater reliabilities
Score point distributions with standard deviations
Correlations of rubric dimensions by content area
Our goal is to collect validity evidence over time and systematically document the GAA “story”
32. Next Steps Continue refining the program
Continue training and support of teachers
Expand the resource board with adapted lessons and materials
Conduct a series of validity studies
In collaboration with NAAC and other states (GSEG)