Monitoring and Pre-Scoring Activities Virginia Department of Education 2007-2008 Alternative Assessments Administrator’s Update Workshop August 2007
Monitoring • Periodic and systematic review of evidence-based assessments in the development process with opportunities for feedback and intervention.
What Monitoring is Not • Checking with the teacher • Flipping through an evidence-based assessment • Review without feedback
Pre-Scoring • A detailed review of evidence-based assessments against scoring rules prior to scoring, with an opportunity for feedback and intervention
What Pre-Scoring is Not • A “mad dash” to create a collection of evidence
Why Monitor and Pre-Score? To create better Collections of Evidence (COE) and Coursework Compilations (CWC).
Why COEs and CWCs Fail… • Missing evidence to support the SOL • Use of unacceptable evidence • Use of evidence not aligned with the SOL • Ungraded or incorrectly graded evidence • Missing or incomplete SEI tags
Components of Monitoring • Trained Reviewers • Feedback Loops • Intervention Options
Trained Reviewers • Persons with Content Knowledge and V-Program Knowledge • School Coordinator • Assigned or Hired Staff • Building Level Teams • Central Office Level Teams
Feedback Loops • Teachers preparing collections • Principals • V-Program Leaders • Trainers/Technical Assistance Providers
Intervention Options • Individualized Consultation and Support • Peer or small group sessions • Attendance at Local Training • Access to state and local SOL or assessment resources
Monitoring Decisions • Who will monitor? • How will you train them? • When will monitoring occur? • Who will receive feedback? • How will feedback be given? • What are your intervention options?
A Sample Monitoring Plan • School-based Review Teams created and trained • Teachers turn in evidence-based assessments every nine weeks • All evidence-based assessments reviewed by team according to division pacing chart • Team report shared with principal and central office • Central office dispatches Rapid Response Teams
A Sample Monitoring Plan (flow) • Select and train school-based oversight teams → school-based teams review COEs every nine weeks according to the pacing chart and report to the principal and division V-Program leaders → instructional teams dispatched based on the nine-week report
A Sample Teacher Review Form
VGLA Nine-Week Review Sheet (School-Based Teams)
Teacher: ____________________ Student: ___________________________
Grade Level: ______ Content Area: _________________________________________
Review for: ____ 1st Nine Weeks ____ 2nd Nine Weeks ____ 3rd Nine Weeks ____ 4th Nine Weeks
Reviewed By: ____________________________ Date: ________
Collection Status: Address the following questions: (1) Is there evidence for all of the standards for the nine weeks according to the Pacing Chart? (2) Does the work submitted align with the standards satisfactorily? (3) Does the evidence demonstrate student mastery? (4) Has the student work been graded accurately? (5) Other?
Recommendation(s): Be as specific as possible.
Follow-up review needed for implementation of recommendations: ___ Yes ___ No
Follow-up Review Date: ________________
Pre-Scoring Activities • Permits a review of completed (or nearly completed) collections of evidence and coursework compilations against scoring rules, with the opportunity to make corrections
Rule #1- Evidence Must Show Individual Student Achievement • Does the evidence show any level of achievement according to the SOL being defended? • Is the evidence on grade level? • Are all the SOL addressed (stems and bullets)?
Rule #2- Evidence Must Be Student-Generated • Has the evidence been copied from the blackboard, textbook, or computer? • If the evidence is from a group project, is the achievement of the V-Program participant clearly identified? • If the evidence is a worksheet, are there examples or directions that provide answers?
Rule #3- Evidence Must Be Labeled with SEI Tags • Does each piece of evidence have a completed SEI tag with: • Content area • SOL • Bullet(s) • “Demonstrated” or “Inferred” checked
Rule #4- Evidence Must Address Accommodations, Captions, and Grading • If accommodations have been used, are they documented in the evidence? • Does each picture have a caption that clearly addresses the student’s level of achievement? • Has the work been graded accurately with correct and incorrect answers clearly identified?
Media Considerations • Do videotapes and audiotapes have transcripts? • Are videotapes and audiotapes clearly labeled with SEI tags?
Organizational Considerations • Is the evidence organized according to the scoring worksheet? • Are all required state and local forms completed?
Pre-Scoring Decisions • Who will pre-score? • How will you train them? • When will pre-scoring occur in relation to your submission date? • Who will receive feedback? • How will feedback be given? • What are your intervention options?
A Sample Pre-Scoring Plan (flow) • Phase 1, Building-Level Review: submitting teacher review → school-based oversight team review → building administrator review and sign-off • Phase 2, Central Office Review: collections submitted from schools → reviewed by central office teams → returned to schools to correct identified errors or omissions
Monitoring & Pre-Scoring Results • Reduces or Eliminates COEs and CWCs with: • Missing Evidence • Unacceptable Evidence (textbooks, homework) • Inaccurate or Ungraded Evidence
Monitoring & Pre-Scoring Results • Eliminates Surprises: • Collections not done for students educated outside of the division • Incomplete collections in the division • Collections not driven by IEP, 504 or LEP Student Assessment Participation Plans
Questions? Division of Student Assessment and School Improvement, (804) 225-2107