Are Your Schools Ready for the Next Generation Assessments? What You Need to Know from All Six Multi-State Consortia

• Wes Bruce – PARCC
• Brandt Redd – Smarter Balanced
• Scott Elliot – ELPA21
• Carsten Wilmes – ASSETS
• Neal Kingston – DLM
• Chris Domaleski – NCSC
• Philip Olsen – Wisconsin Department of Public Instruction
• Andy Middlestead – Michigan Department of Education
Assessment Consortia

• Comprehensive
  • PARCC
  • Smarter Balanced
• English Language Proficiency
  • ELPA21
  • ASSETS
• Alternate
  • Dynamic Learning Maps (DLM)
  • NCSC
“Get ready ’cause here we come!”*

NCSA – New Orleans
Wes Bruce
June 27, 2014

* With apologies to The Temptations
Readiness “Has Only Just Begun” with the Technical Specifications

• Counting devices and checking bandwidth is only the tip of the iceberg…
• Can schools actually deliver the tests?
• Are test administrators ready for the logistics?
• Can you provide the data needed to take advantage of the new opportunities?
• Do teachers and students know what is expected of them?
• Are parents and key publics aware of the tests and the possible results?
• Have you addressed Opportunity to Learn?
Schools, Districts, and Systems

• Determine if schools/districts are ready
  • What is your “due diligence” to validate that schools can deliver?
• Test administrators
  • Even if you have been online, these systems are different and place different expectations on those administering the test
• Data systems
  • Can you produce the kind of data needed to take advantage of these new systems?
  • Personal Needs Profile (PNP) data? (See the sketch below.)
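The PNP mentioned above is a Personal Needs Profile: the per-student accessibility record that next-generation delivery systems expect alongside registration data. As a hypothetical illustration of what a district data system needs to be able to produce (the field names below are invented for this sketch, not any consortium's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class PersonalNeedsProfile:
    """Illustrative accessibility settings a delivery system might read.

    Field names are hypothetical; consult your consortium's registration
    specification for the real schema.
    """
    text_to_speech: bool = False
    magnification: bool = False
    high_contrast: bool = False
    extended_time: bool = False

@dataclass
class StudentRegistration:
    """A minimal student record a district SIS might export for testing."""
    student_id: str
    school_id: str
    grade: int
    pnp: PersonalNeedsProfile = field(default_factory=PersonalNeedsProfile)

# Example: a grade 5 student who needs text-to-speech and extended time.
record = StudentRegistration(
    student_id="0042-1187",
    school_id="D12-ELEM-03",
    grade=5,
    pnp=PersonalNeedsProfile(text_to_speech=True, extended_time=True),
)
```

The readiness question is whether your student data system can populate records like these accurately for every student; if it cannot, students may not receive the supports they are entitled to on test day.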
Teachers, Students, and Parents

• Teacher readiness
  • Have they taught the content, and are they aware of the tasks that will be used?
• Student readiness
  • Item types
  • Test interface
• Parent readiness
  • The assessment itself
  • Results – both item-level and college and career readiness (CCR)
  • Citing evidence vs. plagiarism
PARCC Readiness Resources

• Model Content Frameworks: www.parcconline.org/parcc-model-content-frameworks
• Test Specifications and Blueprints: http://www.parcconline.org/assessment-blueprints-test-specs
• Sample items, tutorials, and practice tests (all grades and subjects): http://www.parcconline.org/practice-tests
• Technology Specifications: http://parcconline.org/technology
• Technology Resources: http://parcc.pearson.com/support
Smarter Balanced

Brandt Redd, CTO
National Conference on Student Assessment
27 June 2014
A National Consortium of States

• 22 member states and territories representing 39% of K-12 students
• 20 governing states, 1 advisory state, 1 affiliate member
• Washington state is the fiscal agent
• The UCLA Graduate School of Education will be the permanent home
A Balanced Assessment System

• Common Core State Standards specify K-12 expectations for college and career readiness
• Summative assessments benchmarked to college and career readiness
• Interim assessments that are flexible, open, and used for actionable feedback
• Teacher resources for formative assessment practices to improve instruction
• Goals: teachers and schools have the information and tools they need to improve teaching and learning, and all students leave high school college and career ready
SmarterApp.org

• Smarter Balanced: a consortium of states developing common assessments for ELA and mathematics, aligned to the Common Core State Standards
• SmarterApp: a community of organizations devoted to collaboration on an openly licensed software suite for the support of educational assessment
Smarter Balanced Assessment Delivery Architecture

[Architecture diagram] Consortium-hosted components: Item Authoring (author, approve, versions), the Test Item Bank (test items, test authoring, test packager), and Data Warehouse and Reporting (student responses, item scores, and test scores; aggregate and student-level reports for parents and educators). The Assessment Delivery System, operated by a state or a Smarter Balanced–certified vendor, comprises Test Administration and Registration (student registration, test scheduling), Test Delivery (test administrator/proctor and students), the Adaptive Engine (determines the next set of items), Item Scoring (deterministic, AI, hand), and Test Integration/Test Scoring, which returns an extract file with student-level results to the warehouse. Student data (ID, school, grade, ethnicity, etc.) flows in from the state student data system or district SIS; districts will need to register students if no state system is available, and test registration is checked against previous years’ data in the Data Warehouse. Individual student reports will be generated by the Consortium for states that allow student identification data to be stored by the Consortium; other states will host their own instances of the Data Warehouse and Reporting components.
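The Adaptive Engine's "determine next set of items" step is the heart of the delivery loop. Below is a minimal sketch of the general technique, maximum-information item selection under a one-parameter (Rasch) model, with a deliberately crude ability update. It illustrates how adaptive engines work in principle; it is not Smarter Balanced's actual algorithm, and every name and number in it is invented for illustration.

```python
import math

def rasch_prob(theta: float, difficulty: float) -> float:
    """Probability of a correct response under a one-parameter (Rasch) model."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def item_information(theta: float, difficulty: float) -> float:
    """Fisher information of an item at the current ability estimate."""
    p = rasch_prob(theta, difficulty)
    return p * (1.0 - p)

def next_item(theta: float, pool: dict[str, float], administered: set[str]) -> str:
    """Pick the unadministered item that is most informative at theta.

    pool maps item IDs to difficulty parameters.
    """
    candidates = {iid: b for iid, b in pool.items() if iid not in administered}
    return max(candidates, key=lambda iid: item_information(theta, candidates[iid]))

def update_theta(theta: float, difficulty: float, correct: bool, step: float = 0.4) -> float:
    """Crude ability update: nudge theta toward or away from the item difficulty."""
    residual = (1.0 if correct else 0.0) - rasch_prob(theta, difficulty)
    return theta + step * residual

# Toy run: administer three items from a five-item pool.
pool = {"A": -1.0, "B": -0.3, "C": 0.0, "D": 0.7, "E": 1.5}
theta, administered = 0.0, set()
for response in (True, True, False):  # pretend scored responses
    iid = next_item(theta, pool, administered)
    administered.add(iid)
    theta = update_theta(theta, pool[iid], response)
print(f"estimated theta after 3 items: {theta:.2f}")
```

Production engines layer on content constraints (blueprint coverage), item-exposure control, and a proper maximum-likelihood or Bayesian ability estimate; the selection loop above is only the skeleton.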
Spring 2014 Field Test

• 4.2 million students
• 16.5 thousand schools
• 12.2 million tests completed, 4.5 million of them with accessibility features
• Up to 4 tests per student (average: 2.8)
  • ELA
  • ELA performance task
  • Math
  • Math performance task
A New Generation of Standards and Assessments for ELLs

• 11 states funded in September 2012 by the U.S. Department of Education
• Partners
  • Lead state: Oregon Department of Education
  • Project management: Council of Chief State School Officers (CCSSO)
  • Understanding Language Initiative (Stanford University), CRESST at the University of California, Los Angeles, and NCEO at the University of Minnesota
• Timeline
  • Item bank development (ongoing)
  • Field test: SY 2014-15
  • Platform, technical requirements, and reporting: SY 2014-15
  • First operational summative: SY 2015-16
  • Operational screener: SY 2016-17
ELPA21 Consortium States: Arkansas, Florida, Iowa, Kansas, Louisiana, Nebraska, Ohio, Oregon, South Carolina, Washington, and West Virginia
ELPA21 Structure

• New English Language Proficiency Standards
  • With guidance from states, WestEd, the Understanding Language Initiative of Stanford University, and CCSSO developed completely new standards
  • College and career readiness focus
• A screener for each of six grade bands
• A summative assessment
Assessment System Features

• We are in the unique position of integrating new standards, assessments, technology, and teacher and administrator supports, all leading toward better systems of support and learning for ELLs
• Comprehensive web-based delivery
• Innovative technology-enhanced items, including teacher-developed items
• A cohesive system
• High-quality communications and outreach within states
• Sustainability
Features (cont.)

• Reports
  • Screener and summative
  • Individual student
  • Parent/guardian
  • Aggregate (e.g., classes, schools, districts, and states)
  • Administrative and technical (e.g., registration, Q/A, analyses)
• Potential reporting information
  • Scores for listening, reading, writing, and speaking, plus comprehension
  • Student proficiency level
  • Performance level descriptors
• Interpretive guide
• Professional development
Technical Challenges

• Demands of speaking, listening, and writing items
• Hardware requirements for listening and speaking
• Assessment of the earliest learners (K-2)
WIDA Consortium with ASSETS

[Maps] WIDA: 35 member states; ASSETS: 35 member states
Development Timeline

[Timeline graphic] 2015-16: fully operational
Updates

• 2014 field test wrapping up this month
• Analyzing results of student/LEA/SEA surveys to better deliver the 2015 field test
• Selecting a technology vendor via an RFP process (to be completed by August)
• Assisting SEAs/LEAs with technology readiness preparation
For more information

• ASSETS project website: http://www.assetsproject.org/
• ACCESS for ELLs 2.0 field test resources: http://assetsproject.org/implementation/fieldtest.aspx#overview
• ACCESS for ELLs 2.0 operational test resources: http://assetsproject.org/implementation/operational.aspx
School Readiness: Lessons Learned during DLM Field Testing

Neal Kingston, Meagan Karvonen, Nicholas Studt
June 27, 2014
Overview of the Dynamic Learning Maps Alternate Assessment

• Fine-grained learning maps (see the sketch below)
• A subset of particularly important nodes that serve as content standards – Essential Elements
• Instructionally embedded and year-end assessments
• Instructionally relevant testlets
• Accessibility and alternate pathways
• Dynamic assessment
• Status and growth reporting that is readily actionable
• Professional development
• A technology platform to tie it all together
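A learning map is essentially a directed graph of fine-grained skills, with Essential Elements as designated target nodes and alternate pathways as multiple routes through the graph. The sketch below shows, with an invented four-node map and a simple all-prerequisites-mastered rule, how a dynamic assessment might choose where to test next; DLM's actual maps and routing logic are far richer, and all node names here are made up.

```python
# Toy learning map: edges point from prerequisite skill to dependent skill.
# Node names are invented; DLM's real maps contain thousands of nodes.
edges = {
    "recognize-symbols": ["match-word-to-object"],
    "match-word-to-object": ["identify-key-detail"],
    "identify-key-detail": ["summarize-text"],  # Essential Element target
}
essential_elements = {"summarize-text"}

def prerequisites(node: str) -> set[str]:
    """All direct prerequisites of a node in the map."""
    return {src for src, dsts in edges.items() if node in dsts}

def next_testlet_target(mastered: set[str]) -> str | None:
    """Pick an unmastered node whose prerequisites are all mastered.

    This is the 'alternate pathways' idea in miniature: the assessment
    meets the student where they are rather than at a fixed grade-level item.
    """
    all_nodes = set(edges) | {d for dsts in edges.values() for d in dsts}
    frontier = sorted(n for n in all_nodes
                      if n not in mastered and prerequisites(n) <= mastered)
    return frontier[0] if frontier else None

print(next_testlet_target({"recognize-symbols"}))  # -> match-word-to-object
```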
Lessons Learned about Providing Resources

• How you organize information makes a difference
  • Quick checklists
  • Comprehensive documents
• District staff need role-specific information
• State capacity is critically important to district staff
Lessons Learned about Training

• District staff need a training structure, not just good self-directed training materials
• Different teachers learn best with different approaches
• Confusion between required and optional resources is common
• Educators need time and experience before a new system becomes routine
Lessons Learned about Help Desk Support

• Educators are an immensely flexible group
• Educators initiate contact via email more often than by phone
• Minor changes to the resources and training provided are visibly amplified at the help desk
  • A single sentence can cause a noticeable increase in calls and emails
• Smaller testing populations served by more educators require a larger-than-expected support staff
  • The economies of scale work against DLM educators
THANK YOU!

For more information, please contact dlm@ku.edu or go to www.dynamiclearningmaps.org. For professional development, contact dlm@unc.edu.

The present publication was developed under grant 84.373X100001 from the U.S. Department of Education, Office of Special Education Programs. The views expressed herein are solely those of the author(s), and no official endorsement by the U.S. Department of Education should be inferred.
National Center and State Collaborative

National Conference on Student Assessment, June 2014
Chris Domaleski
Overview

• Five partner organizations
  • National Center on Educational Outcomes
  • edCount, LLC
  • National Center for the Improvement of Educational Assessment
  • University of North Carolina at Charlotte
  • The University of Kentucky
• 13 partner states and 11 Tier 2 states
• Long-term goal: ensure that students with significant cognitive disabilities achieve increasingly higher academic outcomes and leave high school ready for post-secondary options
• Theory of action: a well-designed summative assessment alone is insufficient; to achieve this goal, an AA-AAS system also requires:
  • Curricular and instructional frameworks
  • Teacher resources and professional development
NCSC Technology: Customized Open Source

• Compliant with commonly used AT/AAC devices
• Paper-and-pencil alternative delivery
• Verifies student profile LCI/PNP data (see the sketch below)
• Hand scoring/interaction for teachers
• Keyboard-only navigation
• Adaptive testing features
• Accessibility features (e.g., text-to-speech, magnification, high contrast)
• Upload evidence for an item
• PD training, surveys, practice tests
• Federally funded, open-source system and content available to all schools and states without licensing fees
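Several items in the list above amount to a profile-driven delivery pipeline: the system verifies a student's LCI/PNP data, then configures item presentation to match. A minimal sketch of that idea, with profile field names invented for illustration (they are not NCSC's actual LCI/PNP schema):

```python
# Hypothetical profile flags; NCSC's actual LCI/PNP schema differs.
profile = {
    "text_to_speech": True,
    "magnification": 2.0,   # zoom factor
    "high_contrast": False,
    "keyboard_only": True,
}

def render_settings(profile: dict) -> dict:
    """Translate a verified student profile into delivery render settings."""
    settings = {"zoom": profile.get("magnification", 1.0)}
    if profile.get("high_contrast"):
        settings["theme"] = "high-contrast"
    if profile.get("text_to_speech"):
        settings["audio"] = "item-and-options"
    if profile.get("keyboard_only"):
        settings["navigation"] = "keyboard"
    return settings

print(render_settings(profile))
```

The design point is that accessibility is driven by data verified before test day, not by ad hoc settings applied at test time.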
Additional Resources

• www.ncscpartners.org
  • Curriculum and instruction resources
  • Technology architecture and specifications
  • Presentations, papers, handouts, and more for various audiences…
Q&A

Consortia
• PARCC: http://www.parcconline.org
• Smarter Balanced: http://www.smarterbalanced.org
• ELPA21: http://www.elpa21.org
• ASSETS: http://www.wida.us
• DLM: http://dynamiclearningmaps.org
• NCSC: http://www.ncscpartners.org

Guides
• SETDA – Guide to Technology Readiness: http://gtr.setda.org
• CoSN – Becoming Assessment Ready: http://www.cosn.org/focus-areas/it-management/becoming-assessment-ready
• ETS – Coming Together to Raise Achievement: http://www.k12center.org/publications/raise_achievement.html