Formative Assessment Year 3 and Beyond TAP Schools December 11, 2009
Welcome • Welcome and Overview • Sheila Talamo, TAP Director • The World According to Mr. Rogers by Fred Rogers
Objectives • Deepen understanding of the role of the Leadership Team in supporting and monitoring the analysis of student work in the field tests and in cluster. • Deepen understanding of the process of establishing criteria, developing measurement tools, and creating formative assessments that are based on the criteria and aligned to the testing format. • Develop a working knowledge of using formative assessments to track cluster progress and pinpoint specific student difficulties.
Agenda • Welcome • Formative Assessment Review • Backwards Design Model (I DO) • Refine Criteria (We Do) • Schools Establish Criteria (You Do) • When Students Track Their Progress • Tracking Data • Development • Book Talks • Evaluation/Closing
Questions for my EMT Due to time constraints, please: • Jot down any questions you have as we go through the training • You may discover the answer to your question later in the day • EMTs will provide individualized support/consultation for your questions Handout
Let’s Review • Effective Use of Formative Assessments: Classroom vs. Cluster • Planning for Assessment of Student Learning • Leadership Team’s Role Regarding Formative Assessments
What do you consider when planning your assessment of student learning? • AllWrite RoundRobin • Stand-N-Share
When planning assessment of student learning I consider… • Criteria • Skills • Alignment • Formative Measures • Summative Measures • Student Characteristics • Tracking & Utilizing Student Data
What is the Leadership Team’s Role in Establishing Criteria and Tracking Formative Assessment? RoundTable
Leadership Team’s Role: Establishing Criteria and Scoring Guides • Field testing information is continuously brought to the Leadership Team before it is presented at cluster. • The Leadership Team monitors and supports the development of criteria and formative assessments.
Leadership Team’s Role: Tracking Formative Assessments • The Leadership Team monitors and supports the analysis and tracking of student work • After the pre-test is administered and analyzed, the Leadership Team supports the development of the Student IGP Goal
Leadership Team’s Role: Tracking Formative Assessments • Based on the progress shown in the data, the Leadership Team may need to assist the master and mentor teachers in refining the criteria and/or detecting sub-skill problems. • The Leadership Team is the Master & Mentor teachers’ cluster.
What is the purpose in setting a Student IGP Goal? RoundRobin
Setting the IGP Student Goal • The cluster cycle goal mirrors the Master/Mentor’s IGP goal. • The Master’s/Mentor’s IGP guides the cluster long-range plan.
Setting the IGP Student Goal • The origin of the Student IGP Goal is the field test pre-test data • Progress toward the IGP goal helps determine when to post-test • Leadership team decides when to administer the field test post-test that ends the cycle Handout
Backwards Design Model (I Do) • Identify Need/Skill… -What is the defined need? -Use various data sources (LEAP, iLEAP, GEE, Benchmark Tests, District Assessments, DIBELS, etc.) -What GLE(s) will we target? -GLE(s) appropriate for the content area, based on the biggest needs -What sub-skills/prerequisite skills are critical in order to address the targeted GLE(s)? Handout
Backwards Design Model (I Do) • Select Strategy -What strategy will assist students in mastering the targeted GLE(s)/identified need? -Research & bring possible strategies to the Leadership Team for consideration. • Establish Criteria/Develop Scoring Guide -What does mastery of the targeted GLE(s)/identified need look like? -Test the students’ knowledge and application of the skill, not the strategy. -The language within the criteria will change based on content.
Mix-Pair-Share / RallyRobin What are some things you should consider when creating a scoring guide?
Questions to Consider When Creating/Evaluating a Scoring Guide • Does the scoring guide relate to the outcome being measured? • Is the scoring guide free from irrelevant skills/sub-skills and information? • Are the descriptors and scoring levels well defined? • Do the criteria align to high-stakes testing? • Can teachers and students understand and consistently apply the scoring guide? • Is the scoring guide developmentally appropriate? • Does it reflect teachable skills or the strategy? • Will it provide the kind of information you need and can use effectively? Handout
Backwards Design Model (I Do) • Develop Pre-Test -How will the pre-test be aligned to the criteria and the testing format? -Set parameters for administering the pre-test • Determine Field Test Group • Conduct Pre-Test
Backwards Design Model (I Do) • Use Pre-Test Data to Refine Criteria -After examining the pre-test results, are there any grey areas within the scoring guide that must be clarified and resolved? -It is okay to change/adjust if needed • Set a Student IGP Goal -Based on the field test pre-test results, what will the Student IGP Goal be? -Set a realistic goal where all students must show growth -This goal tells you when to end the cycle
Backwards Design Model (I Do) • Chunk/Segment Strategy -As you field test, this may need to be adjusted or modified • Develop Formative Assessments for each chunk, aligned to the criteria and sometimes aligned to the testing format *see handout • Field Test Each Chunk Handout
VIDEO Alice Harte: Master teachers creating the criteria
RoundRobin • Why is it important to use student work samples when refining the criteria/scoring guide? • Why is it important to include the whole leadership team in this process?
Establishing Criteria (We Do) • 3rd Grade Work Samples • Tessie Bell, Alice Harte Master Teacher • 3CR Strategy
Establishing Criteria (We Do) • Number Off • One Stray
Establishing Criteria (We Do) Using Tessie Bell’s 3rd Grade student work samples from Alice Harte: • We will refine the Progressing to Criteria section under the Restatement portion of the generic scoring guide together. • Our mission is to define the scoring levels to eliminate any grey areas teachers may encounter when scoring student work. Handout
Establishing Criteria (You Do) Using Tessie Bell’s 3rd Grade student work samples from Alice Harte: • Table groups will refine the Progressing to Criteria section under the Details portion of the criteria within the scoring guide. • Refer to the GLEs and guiding questions. • Sort the samples into HML (high, medium, low) groups according to characteristics, and use them to refine the scoring guide. Handout
When Students Track Their Progress by Robert Marzano • Providing teachers with graphic displays of students’ scores on formative assessments was associated with a 26 percentile point gain. • When students track their own progress using graphic displays, the gains are even higher. Handout
Fourteen Studies… Two different settings, same content taught for the same length of time: • Students tracked their progress • Students did not track their progress On average, students tracking their own progress was associated with a 32 percentile point gain in achievement.
What information is provided for students? • First, the rubric provides a description of the levels of performance • Second, tracking progress provides a representation of each student’s progression of learning
Best results are obtained when you: • Address a single goal in all the assessments • Use rubrics instead of points • Use different types of assessments
Tracking Data • Formative Assessments ARE the student data in a cluster cycle • Before data is tracked, the criteria, scoring guide, and pre-test are presented to cluster • Data must be analyzed both quantitatively and qualitatively in order to thoroughly track students’ progress
Cluster Skill-Cycle Tracking Chart Handout Growth Formula: Percentage Growth = (Post-Test Score – Pre-Test Score) / Pre-Test Score *Exception: with a Pre-Test score of 0, the % growth IS the Post-Test score. To calculate a DECREASING score, use the formula above; the % will be negative.* Qualitative Data
Cycle-Skill Student Tracking Chart Handout Growth Formula: Percentage Growth = (Post-Test Score – Pre-Test Score) / Pre-Test Score *Exception: with a Pre-Test score of 0, the % growth IS the Post-Test score. To calculate a DECREASING score, use the formula above; the % will be negative.*
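The growth formula on the tracking charts, including the zero pre-test exception and the negative-growth case, can be sketched as a small helper function. This is a hypothetical illustration, not part of the TAP materials; it assumes scores are recorded as percentages and that the quotient is scaled by 100 to express growth as a percent:

```python
def percentage_growth(pre_test, post_test):
    """Percentage growth from pre-test to post-test score.

    Chart exception: when the pre-test score is 0, the percentage
    growth is simply the post-test score. A decreasing score yields
    a negative percentage.
    """
    if pre_test == 0:
        return post_test
    return (post_test - pre_test) / pre_test * 100

print(percentage_growth(40, 70))  # 75.0 (increase)
print(percentage_growth(0, 60))   # 60 (zero pre-test exception)
print(percentage_growth(50, 40))  # -20.0 (decrease)
```

For example, a student moving from 40 to 70 shows (70 − 40) / 40 = 75% growth, while a drop from 50 to 40 is reported as −20%.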
Grade/Content _________ Skill ____________ Strategy _________________ Date & Color of Modifications: ___________________________________ Handout Qualitative Data
Tracking Data Video • Present, chart, and analyze data (Quantitative) • Defining and categorizing characteristics of student work samples using the HML chart (Qualitative) Handout Sandra Walker-Parker Trenise Duvernay
Tracking Reminders • Present, chart, and analyze student work samples (Formative Assessments) for each chunk. • Revisit the HML chart (add & scratch through). • Continuously connect back to the Student IGP Goal to monitor progression toward the goal. • Use the data to pinpoint specific problems with sub-skills and individual students, and determine next steps.
Checking for Understanding • Formative assessments are ALWAYS aligned with and measured against the criteria and scoring guide. • Formative assessments need to be a mixture of test items formatted and aligned to the high-stakes test and test items where students must demonstrate their knowledge in other ways. • Quantitative and qualitative data are extracted from formative assessments.
Checking for Understanding • The data from the formative assessments guides the progress of the cluster cycle. • The data should be measured frequently against the Student IGP Goal in order to determine when to post-test. • The data is also used to guide all decisions, determine the success of a strategy, and pinpoint specific student needs.
Rally Coach Handout
Evaluate Your Scoring Guide • Does the scoring guide relate to the outcome being measured? • Is the scoring guide free from irrelevant skills/sub-skills and information? • Are the descriptors and scoring levels well defined? • Do the criteria align to high-stakes testing? • Can teachers and students understand and consistently apply the scoring guide? • Is the rubric developmentally appropriate? • Does it reflect teachable skills or the strategy? • Will it provide the kind of information you need and can use effectively? Handout
Book Talks • Developing Minds by Costa (Nicole) • The Power of Retelling by Benson (Vicky)