Getting the Most Value for Your Assessment Dollar – Designing, Adapting, and Maintaining Quality Assessment Programs During Tough Economic Times
To Consortia, or not to Consortia
CCSSO National Conference on Student Assessment, June 23, 2010
Joining a Consortium
Implementing a new, innovative assessment program in a consortium as a way to save costs. Or, maintaining a current program without having to make drastic cuts.
• Is it doable?
• Can a consortium of states implement a new assessment at a significantly lower cost than a single state acting alone?
• How large does the consortium need to be?
• Where are the cost-savings opportunities?
Joining a State Assessment Consortium
• Joining a state assessment consortium can have its advantages, but . . .
• It requires a lot of planning, coordination, and desire
• Several successful examples:
• NECAP
• WIDA
• Achieve Algebra 2
• PARCC & SBAC (responses due today)
Stanford/Nellie Mae Study
The purpose of the study was to see whether it is possible to create an affordable, high-quality assessment.
• Step one – Model a current, typical assessment in ELA and math – cost: $19-$20 per student
• Step two – Model a high-quality assessment for the same state – cost: $55-$56 per student
• Step three – Implement several cost-savings strategies (see the sketch below)
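A minimal sketch of the study's starting point, in hypothetical Python (the per-student figures are from the steps above; the midpoints are our simplification):

    # Per-student cost figures from the Stanford/Nellie Mae study (range midpoints).
    TYPICAL_COST = 19.50       # current typical ELA/math assessment, ~$19-$20/student
    HIGH_QUALITY_COST = 55.50  # high-quality assessment for the same state, ~$55-$56/student

    premium = HIGH_QUALITY_COST - TYPICAL_COST
    print(f"High-quality premium before cost-saving strategies: ${premium:.2f}/student")

In other words, the cost-savings strategies have roughly a $36/student gap to close.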
Cost Reduction Strategies
• Participation in a consortium
• Looked at 10-, 20-, and 30-state consortium sizes
• Cost reduction: $15 per student
• Use of technology for online test delivery, distributed human scoring of some of the open-ended items, and automated scoring for certain constructed-response items
• Together, these innovations account for cost savings of about $3 to $4 per student
• Likely to account for more as efficiencies are developed in programming and in using technology for these purposes
• Teacher-moderated scoring, which can net both substantial cost reductions and potential professional development benefits; we used two different models (next slide)
Cost Reduction Strategies
• Two different models for teacher-moderated scoring:
• Professional development model – no additional teacher compensation beyond that supported by the state or district for normal professional development days (as with the NY Regents)
• Stipend model – assumes a $125/day stipend for teachers to score the performance items
• Note: teachers were assumed to score all performance items in a distributed scoring model
• These strategies for using teachers as scorers reduce costs by an additional $10 to $20 per pupil (depending on whether teachers are engaged as part of professional development or are paid)
• Adopting all cost-reduction strategies while paying teachers a $125/day stipend to score all performance tasks results in an assessment cost of $21 per student (see the arithmetic below)
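A back-of-the-envelope stacking of the three strategies (hypothetical Python; the per-student figures come from these slides, and the particular split shown is one allocation consistent with the $21 end result, not the study's exact breakdown):

    # Stacking the cost-reduction strategies against the high-quality baseline.
    baseline = 56.00                 # high-quality assessment, ~$55-$56/student
    consortium_savings = 15.00       # participation in a 10- to 30-state consortium
    technology_savings = 4.00        # online delivery + distributed/AI scoring, ~$3-$4
    teacher_scoring_savings = 16.00  # teacher scoring, ~$10-$20 depending on model

    final_cost = baseline - consortium_savings - technology_savings - teacher_scoring_savings
    print(f"Cost with all strategies and a $125/day stipend: ~${final_cost:.0f}/student")  # ~$21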
Consortia Size
How big do you have to be?
• The Stanford/Nellie Mae study found that 80% of the cost benefits of joining a consortium are realized at the 10-state size
• A rough estimate is that a 5-state consortium could achieve 75%+ of the cost savings of a 10-state consortium
• Perhaps $3-$6 per student
• About $2.7M/year for an average-sized state (600K students; see the quick check below)
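A quick check of the $2.7M figure (hypothetical Python; the student count and per-student range are from the bullet above, and the midpoint is our assumption):

    # Annual savings for an average-sized state, using the $3-$6/student range.
    students = 600_000
    per_student_savings = 4.50  # midpoint of the $3-$6 estimate
    print(f"Annual savings: ${students * per_student_savings / 1e6:.1f}M")  # $2.7M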
Where are the Cost Savings?
• The big cost-savings opportunity is in development
• Development is largely a fixed-cost function
• An increase in forms cost partially offsets the savings
• Other fixed-cost functions, such as IT, quality assurance, and psychometrics, also provide savings
• Even functions that are largely variable in nature have a fixed-cost component
• Some functions, like program management, allow for economies of scale
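A hypothetical sketch of why fixed-cost functions drive the savings: fixed development, IT, QA, and psychometrics costs spread over every consortium student, while variable costs (forms, scoring, shipping) do not. The dollar figures below are illustrative assumptions, not study numbers:

    def per_student_cost(fixed_cost, variable_cost_per_student, total_students):
        """Fixed costs amortize across all students; variable costs do not."""
        return fixed_cost / total_students + variable_cost_per_student

    # Illustrative: $6M of fixed development/IT/QA/psychometrics work,
    # plus $8/student of variable costs.
    for n_states in (1, 5, 10):
        students = n_states * 600_000
        cost = per_student_cost(6_000_000, 8.00, students)
        print(f"{n_states:>2} state(s): ${cost:.2f}/student")

Note how the savings flatten as the consortium grows, which is consistent with most of the benefit being realized by the 10-state size.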
Where are the Cost Savings?
• Consortium size can make assessment technology more affordable
• Online test delivery (CBT and CAT)
• Artificial intelligence scoring of constructed responses
• More states/students mean more bargaining power
• A common assessment with common standards and operational methods should be more efficient
• This needs to be weighed against potential additional collaboration costs and risks
PARCC & SBAC Support
• We recently assisted both consortia in preparing their cost estimates for the NIA (Notice Inviting Applications) responses
• Both consortia had innovative ideas for new assessments and a wide variety of design and operational decisions to make
• Each idea/design choice came with unique cost implications
PARCC & SBAC Support
• Initially, each consortium's design was deemed too expensive in both the operational and ongoing periods, and each needed adjustments
• The number of design choices can be daunting, with many variables and moving parts
• Ultimately, each consortium created an innovative assessment system with the design it wanted
Assessment Design Decision Tree
• Delivery Method
• Paper-based
• Computer (linear or CAT)
• Mixed (both CBT and PPT)
• Assessment Types
• Summative, through-course summative
• Interim/benchmark, end-of-course, formative
• Domains, special populations
(On the original slide, highlighting indicated the major cost elements for either PARCC or SBAC.)
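One way to make the top of the decision tree concrete (a hypothetical Python sketch; the keys and options simply mirror the bullets above and the slides that follow):

    # Hypothetical representation of the top of the design decision tree;
    # each choice carries its own cost implications.
    DECISION_TREE = {
        "delivery_method": ["paper", "computer-linear", "computer-CAT", "mixed CBT/PPT"],
        "assessment_types": ["summative", "through-course summative",
                             "interim/benchmark", "end-of-course", "formative"],
        "scope": ["domains tested", "special populations"],
    }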
Decisions and Cost Variables
• Development
• Types of items (SR, CR, computer-enhanced, PE, PT)
• Mix of item types
• Number of forms; CAT algorithm (750-1,000 items per grade); number of attempts
• Release rates (by item type)
• Breach form (develop? print?)
• Grades/domains tested
• Item bank development
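The item-bank requirement alone illustrates why development is the dominant fixed cost. A hypothetical sketch (the grade count and per-item cost are illustrative assumptions, not study figures; only the 750-1,000 items-per-grade range comes from the slide above):

    # Hypothetical item-bank development cost for a CAT design.
    items_per_grade = 875   # midpoint of the 750-1,000 range
    grades = 7              # assumed: e.g., grades 3-8 plus high school
    cost_per_item = 1_500   # assumed blended cost across item types
    total = items_per_grade * grades * cost_per_item
    print(f"Item bank: ${total / 1e6:.1f}M of fixed development cost")

A fixed cost of that size is exactly the kind of expense a consortium can split many ways.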
Decisions and Cost Variables (cont.)
• Paper-based testing / cutover to CBT
• How long to cut over? (Operating in both modes is very expensive.)
• Different production strategies
• Minimize print page "signatures"
• Use of color (B/W, grayscale, 4-color)
• Breach form (print?)
• Security measures (number of forms, labels, seals, student ID)
Decisions and Cost Variables (cont.)
• Logistics
• Transportation mode (ground, air)
• Carrier selection
• Ship-from/ship-to locations (consolidated shipping)
• Meetings and travel (online vs. live)
• Scoring
• Computer vs. human (incl. scanning and editing)
Design Decisions & Costs
• Scoring (cont.)
• Human method (teacher or third party)
• Holistic vs. analytic scoring
• Developing innovative items that can be scored in a timely manner requires a lot of work
• Alternatively, a test design where these items are scored during a classroom period may make sense (PEs)
• AI scoring for open-ended items
• Math vs. ELA
• Items requiring inference can't easily be scored using AI
• System training fees (fixed cost); per-score costs (see the sketch below)
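AI-scoring economics split into a one-time system-training fee plus a per-score charge, so the fixed portion amortizes well at consortium scale. A hypothetical sketch (all dollar figures below are illustrative assumptions):

    # Hypothetical AI-scoring cost: fixed engine-training fee plus per-score cost.
    def ai_scoring_cost_per_student(training_fee, per_score, items_per_student, n_students):
        return training_fee / n_students + per_score * items_per_student

    # Assumed: a $500K training fee and $0.25/score across 6 open-ended items.
    for n in (600_000, 6_000_000):
        cost = ai_scoring_cost_per_student(500_000, 0.25, 6, n)
        print(f"{n:,} students: ${cost:.2f}/student")

At single-state volume the training fee is a meaningful share of the per-student cost; at 10-state volume it nearly disappears.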
Design Decisions & Costs
• Open-Ended Scoring (cont.)
• Double-scoring/read-behind rates (by grade)
• Distributed vs. on-site
• Reporting
• Paper vs. online reporting
• Number and complexity of reports
Conclusion
• Even a small consortium of states can achieve significant reductions in assessment cost
• Such a strategy can be useful in developing a new, high-quality assessment or in maintaining a current one during times of budgetary stress
• Participating in a consortium also allows for the implementation of innovative technologies that can improve assessment quality and reduce costs
• Teacher scoring of open-ended items is critical for implementing a high-quality assessment
• Myriad design and operational decisions have significant cost impacts
Conclusion
"You can't always get what you want; but if you try sometimes, you just might find you get what you need." – Mick Jagger
Questions?
• Barry Topol – btopol@assessmentgroup.org
• John Olson – jolson@assessmentgroup.org
• Ed Roeber – eroeber@assessmentgroup.org
www.assessmentgroup.org