Assessment for Articulation: Ohio's New Assessment System • May 14, 2009 • Bob Mahlman, Director, Center on Education and Training for Employment, The Ohio State University • mahlman.1@osu.edu • 614-292-9072
Ohio’s Technical Skills Assessment System • Online assessment through www.webxam.org • Online testing began in 2001 • About 50 Ohio CTE assessments for Secondary • Several national certification assessments • Secure, hierarchical system that tracks results, links to statewide data system • Assessment types • (1) Modular, completed over the 2-year program • (2) End of program
Assessment Maintenance and Revision • Tests are revised annually to • Enhance psychometric quality (reliability & validity) • Keep test items current • Maintain linkages to current content standards • The vast majority of tests receive some level of revision every year.
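Reliability of the kind mentioned above is typically reported with an internal-consistency index such as Cronbach's alpha; the deck does not name the specific statistic Ohio uses, so the following Python sketch is purely illustrative:

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a score matrix (rows = students, columns = items).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores);
    values near 1.0 indicate high internal consistency."""
    k = len(responses[0])
    item_vars = [pvariance([row[i] for row in responses]) for i in range(k)]
    total_var = pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: four students, three items that rise and fall together
scores = [[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]]
alpha = cronbach_alpha(scores)  # perfectly consistent items give alpha = 1.0
```

In practice an annual revision cycle would flag items whose removal raises alpha, alongside the content-currency reviews the slide describes.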
Test to Content Standards Linkages • Major issue: maintaining linkages to ever-changing content standards • Ohio's career-technical standards over time: 1991 OCAPs, ITACs, 1995 OCAPs, TCPs, Career Clusters, Career Field Technical Content Standards
Test to Content Standards Linkages Another Issue: How long should content standards be available before tests follow? Usually 2-3 years to allow standards to become fully implemented.
Moving Toward CFTCS Assessments • Career Field Technical Content Standards (similar to Career Clusters) • Perkins IV requires states to have valid and reliable technical skill assessments for ALL CTE programs
Moving Toward CFTCS Assessments • Ohio has assessments for a subset of programs, linked to standards that are being phased out • Now moving in the direction of new CFTCS assessments • All modular assessments • Heavy reliance on scenarios, higher cognitive levels • Pathway-level assessments
Moving Toward CFTCS Assessments • General Approach • Crosswalk old standards to new standards • “Pull” test items to new standards & verify linkages • Create test module structure based on postsecondary course delivery • Review & write items to fill gaps • Review and validate items and modules • Set benchmark scores • Field test and refine
Old Standards to New Standards • 2007-2008: Linking old content standards to new CFTCS • Crosswalk database application • Internal staff perform preliminary linkage • Three instructors refine the linkages • Phone conferences to resolve discrepancies • Created linkage product for use by supervisors and instructors
Crosswalk Product • Reports show linkage between older content standards and the new Career Field Technical Content Standards • Results are available online: http://www.cete.org/Projects/OCTCA/CrosswalkReview/products.aspx • Useful for curriculum planning: shows overlap and gaps
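The overlap-and-gap report described above can be sketched as a simple mapping; the competency codes below are invented for illustration and are not Ohio's actual codes:

```python
# Hypothetical crosswalk: old competency codes -> matching new CFTCS
# benchmark codes (all codes here are illustrative only)
crosswalk = {
    "OCAP-1.1": ["CFTCS-LJS-2.3"],
    "OCAP-1.2": ["CFTCS-LJS-2.3", "CFTCS-LJS-2.4"],
    "OCAP-2.1": [],  # no match: content dropped from the new standards
}

def gaps(crosswalk, new_standards):
    """New benchmarks not covered by any old competency:
    these are the spots where new test items must be written."""
    covered = {code for codes in crosswalk.values() for code in codes}
    return sorted(set(new_standards) - covered)

new = ["CFTCS-LJS-2.3", "CFTCS-LJS-2.4", "CFTCS-LJS-2.5"]
uncovered = gaps(crosswalk, new)  # -> ["CFTCS-LJS-2.5"]
```

A report in this shape supports both directions shown on the crosswalk site: old-to-new (where existing items can be reused) and new-to-old (where gaps remain).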
Crosswalk Results • Crosswalk Homepage • Criminal Justice Page • Criminal Justice Results – Old to New • Criminal Justice Results – New to Old • Lookup Feature
Crosswalk Work 2008-2009: Test Item Linkages • Linkage results "pull" test items into new Career Field Technical Content Standards • 2007-2008: item to content standard match results are preliminary • 2008-2009: item to content standard match results are verified by panels of SMEs
Crosswalk Work 2008-2009: Test Item Linkages • Three instructors from each content area review linkages between items and new standards • Instructors come to OSU individually to use a computerized application to indicate strength of match • Final results = final gap analysis by Career Field and pathway
New Career Field Technical Content Standards Assessments: 2009 and Beyond
Development of New CFTCS Assessments • The theme for the new CFTCS assessments is “assessment for articulation” • Goal: create assessments that align with courses being delivered in postsecondary programs • Requires input from postsecondary faculty
Development of New CFTCS Assessments • Assessments at the pathway level • Significant implications for curriculum in some areas • All assessments are modular • Modules completed over the 2-year program • Scenario-based, higher cognitive levels
Development of New CFTCS Assessments • Modules designed to coincide with content for course articulation • Based on the way content is delivered in the community colleges • Create assessments for all pathways over next four years • except where viable industry-based certifications exist
Development of New CFTCS Assessments • Advantages to the new system • Assessments available to all students • Development of secondary-postsecondary relationships • New utility for technical skill assessments (articulation) • Assessments now have added value to the student; greater value to the school/district/state
Link to Postsecondary • Identify postsecondary programs with courses being articulated • Link competencies in the targeted pathways to postsecondary courses via survey • Postsecondary faculty indicate which competencies are addressed by their program courses
Cluster Units of Instruction • Run hierarchical cluster analysis (SPSS) to form preliminary modules • Units are clustered together based on how they are addressed at the postsecondary level
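The slide cites SPSS; as a stand-in, here is a minimal pure-Python single-linkage clustering over hypothetical survey data (all unit and course names are invented), grouping units that postsecondary courses tend to cover together:

```python
from itertools import combinations

# Hypothetical survey results: unit -> set of postsecondary courses
# faculty say address it (names are illustrative, not Ohio's actual data)
coverage = {
    "Patrol Procedures":   {"CJ101", "CJ102"},
    "Traffic Enforcement": {"CJ101", "CJ102"},
    "Criminal Law":        {"CJ110"},
    "Courtroom Procedure": {"CJ110", "CJ111"},
    "Evidence Handling":   {"CJ120"},
}

def jaccard_distance(a, b):
    """1 - |A ∩ B| / |A ∪ B|: units taught in the same courses are close."""
    return 1.0 - len(a & b) / len(a | b)

def single_linkage(names, dist, threshold):
    """Greedy agglomerative clustering: repeatedly merge any two clusters
    whose closest members are within `threshold` of each other."""
    clusters = [{n} for n in names]
    merged = True
    while merged:
        merged = False
        for i, j in combinations(range(len(clusters)), 2):
            if min(dist(a, b) for a in clusters[i] for b in clusters[j]) <= threshold:
                clusters[i] |= clusters.pop(j)
                merged = True
                break
    return clusters

modules = single_linkage(
    list(coverage),
    lambda u, v: jaccard_distance(coverage[u], coverage[v]),
    0.5,
)
# Units covered by the same courses land in the same preliminary module
```

The resulting clusters are only preliminary; as the next slide notes, human judgment then refines the statistical grouping into the final module structure.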
Create Test Module Structure • Create module content structure based on the cluster analysis • Apply human judgment to refine the statistically-based clustering
Create Test Items • Run item writing workshops • Facilitated groups of secondary and postsecondary faculty (~12 SMEs: 6 secondary, 6 postsecondary) • Review/revise current items • Write scenarios and new items to fill gaps
Validate Items & Modules • Run item & module review workshops using ~12 secondary and postsecondary faculty • Content validate items • Item linkage judgments • Item essentiality judgments • Item quality judgments • Content validate modules • Judgments re: Content domain representativeness
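Essentiality judgments like those above are often summarized with Lawshe's content validity ratio (CVR); the deck does not say Ohio uses this exact statistic, so treat it as one plausible way to quantify the panel's ratings:

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of
    panelists rating an item 'essential' and N is the panel size.
    Ranges from -1 (no one says essential) to +1 (everyone does)."""
    half = n_panelists / 2
    return (n_essential - half) / half

# e.g. 10 of a 12-member secondary/postsecondary panel rate an item essential
cvr = content_validity_ratio(10, 12)  # (10 - 6) / 6 = 0.667
```

Items with low or negative CVR would be candidates for revision or removal before the module-level representativeness review.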
Validate Items & Modules: Set Benchmark Scores • Two benchmarks = three performance levels • Limited (Novice): below Benchmark 1 • Proficient: between Benchmark 1 and Benchmark 2 • Advanced (Expert): at or above Benchmark 2
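Assuming the conventional reading of the benchmark ladder (Benchmark 1 as the lower cut score, Benchmark 2 as the upper), classifying a score is straightforward; the cut values used here are hypothetical:

```python
def performance_level(score, benchmark1, benchmark2):
    """Map a raw score onto the three performance levels
    using two cut scores (benchmark1 < benchmark2)."""
    if score < benchmark1:
        return "limited"
    if score < benchmark2:
        return "proficient"
    return "advanced"

# Hypothetical cut scores of 60 and 80 on a 100-point module
level = performance_level(72, 60, 80)  # -> "proficient"
```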
Field Test • Create and publish field test modules • Field test the modules (2009-2010) • Conduct item and test analysis • Create final forms (Fall 2010) • Repeat the process each year for four years.
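Classical item analysis of field-test data usually reports each item's difficulty (proportion correct) and discrimination (point-biserial correlation with the rest of the test); the deck does not detail Ohio's analysis, so this is a generic sketch on toy data:

```python
import math
from statistics import mean, pstdev

def item_stats(responses):
    """Classical item analysis on a 0/1 response matrix
    (rows = students, columns = items). Returns per-item
    (difficulty, discrimination) pairs."""
    n_items = len(responses[0])
    out = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        rest = [sum(row) - row[i] for row in responses]  # total minus this item
        p = mean(item)                                   # difficulty: proportion correct
        if p in (0.0, 1.0) or pstdev(rest) == 0:
            out.append((p, 0.0))                         # degenerate item: no spread
            continue
        m1 = mean(r for r, it in zip(rest, item) if it == 1)
        m0 = mean(r for r, it in zip(rest, item) if it == 0)
        r_pb = (m1 - m0) / pstdev(rest) * math.sqrt(p * (1 - p))
        out.append((p, r_pb))                            # point-biserial discrimination
    return out

# Toy field-test data: four students, three items
stats = item_stats([[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 1]])
```

Items that are too easy, too hard, or negatively discriminating would be flagged before the final forms are assembled.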
Lessons Learned • Difficulty running workshops during the school year (summer is better) • Postsecondary relationships need to be nurtured • Concerns about students' ability to pass scenario-based, higher cognitive level tests • Curriculum issues
Recommendations for Maintaining The System • Create a standing committee for each pathway that • reviews pathway standards annually, • reviews tests annually, recommends item replacements, and • generates new items to replace 1/3 of each test each year.
Recommendations for Maintaining The System • Committee members rotate serving 3-year terms • Institute embedded pretest items • (12-15 non-scored items in each module each year) • Development of new test module structures is done in conjunction with new standards development