Agenda
• NYC Scanning Solution
  • Process Overview
  • Scanners
  • Answer Documents
  • Reports for Schools
• June/August 2011 Debrief
  • Summary
  • Lessons Learned
  • Process Improvements
• Q & A
DOE Data Capture Solution
• Leverage existing technology and processes already in use in schools to capture item-level information and provide real-time results. This includes:
  • Answer Documents: Generated and printed at the school level
  • Scoring: Teachers score constructed-response items only
  • Scanning: Schools scan documents on-site using their existing attendance scanners
  • Reporting Results: Automated transfers between internal systems
• Benefits:
  • Increased automation of time-consuming manual processes
  • Less overall time required for teachers to complete scoring
  • No further data-entry requirement for aggregate scores
  • Reduction in clerical scoring errors by eliminating (see scoring sketch below):
    • Manual scoring of MC items
    • Hand-adding totals from different sections
    • Manual application of conversion tables
• Districts, schools, and teachers will be better able to use item-level results to inform instruction, similar to the Grade 3-8 exams
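To make the automation concrete, here is a minimal Python sketch of the scoring path these bullets describe: machine-score the MC items against the SED answer key, add the teacher-entered constructed-response points, then map the raw total through the conversion table. The key, table values, and function names are illustrative assumptions, not the actual ATS implementation.

```python
# Illustrative only: a tiny MC answer key and raw-to-scale conversion table.
MC_ANSWER_KEY = {1: "C", 2: "A", 3: "D"}                  # item -> correct choice
CONVERSION_TABLE = {0: 0, 1: 12, 2: 24, 3: 35, 4: 45, 5: 55, 6: 65}  # raw -> scale

def raw_score(mc_responses: dict, cr_points: list) -> int:
    """Sum auto-scored MC items and teacher-entered CR points."""
    mc_correct = sum(1 for item, answer in MC_ANSWER_KEY.items()
                     if mc_responses.get(item) == answer)
    return mc_correct + sum(cr_points)

def scale_score(raw: int) -> int:
    """Look up the scale score from the conversion table."""
    return CONVERSION_TABLE[raw]

student = {"mc": {1: "C", 2: "B", 3: "D"}, "cr": [2, 1]}  # 2 MC correct + 3 CR
raw = raw_score(student["mc"], student["cr"])
print(raw, scale_score(raw))                              # -> 5 55
```

Once the key and table are loaded centrally, every scanned document is scored the same way, which is where the clerical-error reduction comes from.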
NYC Scanning Process Overview
• Schools schedule students to sit for a Regents exam
• Schools locally print student-specific answer documents
  • Pre-slugged with a barcode (student name and ID visible)
  • Answer documents are 1 or 2 pages, depending on the exam
  • The NYC process does not allow blank (non-pre-slugged) answer documents
  • Walk-ins: the school must individually enter the student ID into ATS (the local Student Information System) to generate and print that student's Regents answer document
• Administer the test
• Scoring committee scores the open-ended items
• DOE inputs the SED-released answer key and conversion table into ATS and "opens" scanning
• School scanning team scans all answer documents; data uploads to ATS via overnight transfer, with real-time scoring of raw and scale scores
• Confirm accuracy of data capture by manually scoring 5% of exams (see audit sketch below)
  • Re-scan if needed (audit trail of any changes)
• Transfer final score data: report cards, transcripts, load to SED
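A hedged sketch of the 5% accuracy check in the confirmation step above: draw a random audit sample of scanned exams, compare manually verified scores against the scanned scores, and flag mismatches for re-scanning. The 5% rate comes from the slide; the IDs, field shapes, and function names are invented for illustration.

```python
import random

def audit_sample(exam_ids: list, rate: float = 0.05, seed: int = 0) -> list:
    """Pick ~5% of scanned exams for manual verification."""
    rng = random.Random(seed)
    k = max(1, round(len(exam_ids) * rate))
    return rng.sample(exam_ids, k)

def find_mismatches(scanned: dict, manual: dict) -> list:
    """Exams whose manually verified score differs from the scanned score."""
    return [eid for eid, score in manual.items() if scanned.get(eid) != score]

# Toy data: 200 scanned exams with made-up scores.
scanned_scores = {f"EX{i:04d}": 70 + i % 20 for i in range(200)}
sample = audit_sample(list(scanned_scores), rate=0.05)
manual_scores = {eid: scanned_scores[eid] for eid in sample}  # pretend all match
print(f"{len(sample)} audited, "
      f"{len(find_mismatches(scanned_scores, manual_scores))} mismatches")
```

Any mismatch would route the document back through the re-scan step, which is what the audit trail in ATS records.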
Image Scanners Currently Utilized by Schools
• Lexmark X656DE
  • Capacity: 75 pages
  • Standalone scanner (direct network connection)
  • Most commonly used scanner
• Fujitsu fi-6670
  • Capacity: 200 pages
  • Connected via a desktop computer
NYCDOE Answer Docs
• Note:
  • NYCDOE will require the use of pencil (not pen) for students and raters
  • Double-sided answer documents posed image-capture challenges
NYCDOE Reports for Schools
• All schools receive immediate reports detailing performance (available on screen, with options to print or export to MS Excel):
  • Summary / Status Reports
  • Item Level Reports
  • Item Distribution Reports (see sketch below)
  • Change Reports
  • Omission/Multiple Reports
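As an illustration of what an Item Distribution Report might compute, the sketch below tallies, for each MC item, how many students chose each option, including omits and multiple marks. The data shapes and sentinel values are assumptions, not actual ATS report output.

```python
from collections import Counter

# Each dict is one scanned answer document: item -> captured response.
responses = [
    {1: "A", 2: "C"},
    {1: "B", 2: "C"},
    {1: "A", 2: "*"},   # "*" = multiple marks detected (assumed sentinel)
    {1: None, 2: "C"},  # None = omitted item
]

def item_distribution(docs: list, item: int) -> Counter:
    """Tally option counts for one item across all scanned documents."""
    return Counter(d.get(item) for d in docs)

for item in (1, 2):
    print(item, dict(item_distribution(responses, item)))
# 1 {'A': 2, 'B': 1, None: 1}
# 2 {'C': 3, '*': 1}
```

The same tallies, aggregated by school or network, are what make item-level results usable for instruction.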
Utilization Metrics – June 2011
• Nearly 96% of the answer documents generated (> 507K exams) were successfully scanned
• 814 unique DBNs administered at least one scanned Regents
Utilization Metrics – August 2011
• Approximately 145K exams (≈ 197K pages) were successfully scanned, each resulting in a scaled score, ABS, or INV record
• 338 unique DBNs administered at least one scanned Regents
Implementation Highlights
• More than 674K exams were successfully processed end-to-end, with a high (96%) scanned-to-printed ratio across 800+ schools
• The new technology solution met SED requirements
• Data output: general excitement at both the school level and central office
  • Item-level data available for instructional and planning purposes
  • Image capture and data string available for audit purposes
• Time savings: teachers were pleased not to have to score everything themselves
  • 2.5 minutes per exam × 674,000 exams scanned ≈ 28,083 hours of manual data-entry work saved (verified below)
• Schools' and networks' familiarity with Regents scanning from the June cycle resulted in fewer process errors and support needs in August
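The time-savings figure checks out; a few lines reproduce the arithmetic from the slide:

```python
# Reproducing the slide's estimate: minutes saved per exam times exams scanned.
exams = 674_000
minutes_per_exam = 2.5
hours_saved = exams * minutes_per_exam / 60
print(f"{hours_saved:,.0f} hours saved")   # -> 28,083 hours saved
```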
General Challenges
• High scanning volumes (2x the daily average) put strain on the IT mainframe and downstream systems; systems bent but did not break
  • Process and IT improvements are planned for future testing cycles (e.g., additional mainframe space, re-prioritization of DBA activity during scanning week)
• Time lags in applying data from the scanner to the Student Information System were at times confusing to schools and problematic for downstream systems
• SED errata corrections posed new challenges for correcting previously scanned scores; different answer keys/conversion tables had to be created per language (see sketch below)
• Students using the wrong answer documents / mismatched exams
• NYC students testing outside of NYC needed to use different answer documents
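The errata problem above amounts to keeping answer keys keyed by (exam, language) and knowing which already-scanned exams must be rescored when SED issues a correction. A hypothetical sketch, with invented exam codes and data structures:

```python
# Answer keys keyed by (exam code, language); contents are illustrative.
ANSWER_KEYS = {
    ("ALG1-JUN11", "en"): {7: "B"},
    ("ALG1-JUN11", "es"): {7: "B"},
}

def apply_erratum(exam: str, language: str, item: int, new_answer: str) -> None:
    """Patch one item in the language-specific key for an exam."""
    ANSWER_KEYS[(exam, language)][item] = new_answer

def needs_rescore(scored_with: dict, current_key: dict) -> bool:
    """True if an exam was scored against a key that has since changed."""
    return scored_with != current_key

old_key = dict(ANSWER_KEYS[("ALG1-JUN11", "en")])
apply_erratum("ALG1-JUN11", "en", 7, "D")   # SED erratum: item 7 is now D
print(needs_rescore(old_key, ANSWER_KEYS[("ALG1-JUN11", "en")]))  # True
```

Because corrections can differ by language edition, each (exam, language) key has to be patched and rescored independently, which is what made errata handling new work for the scanning pipeline.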
General Challenges
• Rescanning: many schools had to re-scan answer documents and therefore requested a time extension on scanning. Key reasons included:
  • Light bubble marks
  • Multiple marks picked up / stray lines (see bubble-classification sketch below)
  • Clerical errors on teacher-scored sections (all exams)
  • Light toner
  • Incorrect printer settings
  • NYC created an online tool to help address these rescanning issues
• Policy issues: a majority of schools requested clarification on various administrative and policy topics, including:
  • Pre-populating accommodations onto answer documents
  • What is allowed for transcription (e.g., paper rips, paper won't scan, student gets sick)
  • Policy on scanning "absent" students or students who have not finished science lab requirements
    • "Did not meet lab requirement" for science exams was a common question from schools
    • Earth Science: there is a performance score but no written test, so what is the score?
  • Scanning before the constructed-response section is scored (different for 1- vs. 2-page exams)
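To illustrate why light and stray marks force rescans, here is a speculative sketch of bubble classification: the scanner compares each bubble's measured darkness to a threshold, so a faint pencil mark reads as an omission and a stray line as a multiple mark. The threshold and values are invented for illustration and are not documented Lexmark or Fujitsu behavior.

```python
MARK_THRESHOLD = 0.40   # assumed: fraction of bubble pixels darkened

def classify_bubbles(darkness_by_option: dict) -> str:
    """Return the single marked option, 'OMIT', or 'MULTI'."""
    marked = [opt for opt, d in darkness_by_option.items()
              if d >= MARK_THRESHOLD]
    if not marked:
        return "OMIT"    # e.g., a light pencil mark below the threshold
    if len(marked) > 1:
        return "MULTI"   # e.g., a stray line or incomplete erasure
    return marked[0]

print(classify_bubbles({"A": 0.15, "B": 0.05}))   # OMIT  (light mark)
print(classify_bubbles({"A": 0.80, "B": 0.55}))   # MULTI (stray mark)
print(classify_bubbles({"A": 0.85, "B": 0.10}))   # A
```

Either misclassification (OMIT or MULTI) shows up on the Omission/Multiple Reports and can trigger a rescan, which is the gap the online correction tool was built to close.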
Process Improvements
• For upcoming Regents administrations, we are implementing the following process improvements:
  • Creating a scannable, generic answer document
  • Incorporating "Did not meet lab requirement" on science exams
  • Changing the scoring algorithm so that any response will override the absent bubble (sketched below)
  • For scanning reports, the system will not display the raw/scale score until both pages are scanned
  • NYC created an online tool that allows principals to correct an answer when the scanner did not pick up the intended response (e.g., a light or stray mark)
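A minimal sketch of the revised absent-bubble rule, under assumed field names: an exam is recorded as ABS only when the absent bubble is marked and no item on the document was answered; any captured response wins otherwise. This is an illustration of the stated rule, not the actual ATS code.

```python
def effective_status(absent_bubbled: bool, item_responses: dict) -> str:
    """Return 'ABS' only when the absent bubble is marked AND no item
    was answered; otherwise the document is scored normally."""
    answered = any(v is not None for v in item_responses.values())
    if absent_bubbled and not answered:
        return "ABS"
    return "SCORE"

print(effective_status(True, {1: None, 2: None}))   # -> ABS
print(effective_status(True, {1: "C", 2: None}))    # -> SCORE (response wins)
```

The design choice is conservative: a stray mark in the absent bubble can no longer wipe out a student's real responses, which removes one of the rescan causes listed above.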