
Assessment for Articulation: Ohio’s New Assessment System


Presentation Transcript


  1. Assessment for Articulation: Ohio’s New Assessment System • May 14, 2009 • Bob Mahlman, Director, Center on Education and Training for Employment, The Ohio State University • mahlman.1@osu.edu • 614-292-9072

  2. Ohio’s Technical Skills Assessment System • Online assessment through www.webxam.org • Online testing began in 2001 • About 50 Ohio CTE assessments for Secondary • Several national certification assessments • Secure, hierarchical system that tracks results, links to statewide data system • Assessment types • (1) Modular, completed over the 2-year program • (2) End of program

  3. Assessment Maintenance and Revision • Annual revisions are made to • Enhance psychometric quality (reliability & validity) • Keep the test items current • Maintain linkages to current content standards • The vast majority of tests receive some level of revision every year.
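The slides do not spell out how psychometric quality is checked, so the sketch below is only illustrative: Cronbach's alpha, a standard internal-consistency reliability estimate (equivalent to KR-20 when items are scored 0/1), computed in Python on hypothetical response data. Nothing here is CETE's actual code.

```python
# Illustrative only: Cronbach's alpha on a small examinees-by-items score matrix.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: examinees x items matrix (e.g., 0/1 for multiple-choice items)."""
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of examinee total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 examinees x 4 dichotomous items
responses = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Higher values (commonly 0.80 or above for program-level reporting) indicate the items hang together; a low value flags a module for item revision.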

  4. Test to Content Standards Linkages • Major Issue: Maintaining linkages to ever-changing content standards. • [Slide graphic: the progression of Ohio’s standards frameworks: 1991 OCAPs/ITACs, 1995 OCAPs/TCPs, Career Clusters, Career Field Technical Content Standards]

  5. Test to Content Standards Linkages Another Issue: How long should content standards be available before tests follow? Usually 2-3 years to allow standards to become fully implemented.

  6. Moving Toward CFTCS Assessments • Career Field Technical Content Standards (similar to Career Clusters) • Perkins IV requires states to have valid and reliable technical skill assessments for ALL CTE programs

  7. Moving Toward CFTCS Assessments • Ohio has assessments for a subset of programs, linked to standards that are being phased out • Now moving in the direction of new CFTCS assessments • All modular assessments • Heavy reliance on scenarios, higher cognitive level • Pathway-level assessments

  8. Moving Toward CFTCS Assessments • General Approach • Crosswalk old standards to new standards • “Pull” test items to new standards & verify linkages • Create test module structure based on postsecondary course delivery • Review & write items to fill gaps • Review and validate items and modules • Set benchmark scores • Field test and refine

  9. Old Standards to New Standards • 2007-2008: Linking old content standards to new CFTCS • Crosswalk database application • Internal staff perform preliminary linkage • Three instructors refine the linkages • Phone conferences to resolve discrepancies • Created linkage product for use by supervisors and instructors

  10. Crosswalk Product • Reports show linkage between older content standards and the new Career Field Technical Content Standards • Results are available online: http://www.cete.org/Projects/OCTCA/CrosswalkReview/products.aspx • Useful for curriculum planning: shows overlap and gaps

  11. Crosswalk Results • Crosswalk Homepage • Criminal Justice Page • Criminal Justice Results – Old to New • Criminal Justice Results – New to Old • Lookup Feature

  12. Crosswalk Work 2008-2009: Test Item Linkages • Linkage results “pull” test items into new Career Field Technical Content Standards • 2007-2008: item to content standard match results are preliminary • 2008-2009: item to content standard match results are verified by panels of SMEs

  13. Crosswalk Work 2008-2009: Test Item Linkages • Three instructors from each content area review linkages between items and new standards • Instructors come to OSU individually to use computerized application to indicate strength of match • Final results = final gap analysis by Career Field and pathway
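The slides do not show how the gap analysis is produced; the following is a hypothetical sketch of the idea: after the SME panels verify item-to-standard links, list the new CFTCS competencies in each pathway that have no linked WebXam items. The pathway, competency names, and item IDs below are invented for illustration.

```python
# Hypothetical sketch of a gap analysis over SME-verified item-to-standard links.
from collections import defaultdict

# (pathway, competency) -> IDs of existing items verified to match that competency
verified_links = {
    ("Criminal Justice", "Describe constitutional protections"): ["item_204", "item_771"],
    ("Criminal Justice", "Explain rules of evidence"): [],
    ("Criminal Justice", "Demonstrate incident report writing"): ["item_318"],
}

gaps = defaultdict(list)
for (pathway, competency), items in verified_links.items():
    if not items:                       # no verified items cover this competency
        gaps[pathway].append(competency)

for pathway, missing in gaps.items():
    print(f"{pathway}: {len(missing)} competencies need new items")
    for competency in missing:
        print(f"  - {competency}")
```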

  14. New Career Field Technical Content Standards Assessments: 2009 and Beyond

  15. Development of New CFTCS Assessments • The theme for the new CFTCS assessments is “assessment for articulation” • Goal: create assessments that align with courses being delivered in postsecondary programs • Requires input from postsecondary faculty

  16. Development of New CFTCS Assessments • Assessments at the pathway level • Significant implications for curriculum in some areas • All assessments are modular • Modules completed over the 2-year program • Scenario-based, higher cognitive levels

  17. Development of New CFTCS Assessments • Modules designed to coincide with content for course articulation • Based on the way content is delivered in the community colleges • Create assessments for all pathways over next four years • except where viable industry-based certifications exist

  18. Development of New CFTCS Assessments • Advantages to the new system • Assessments available to all students • Development of secondary-postsecondary relationships • New utility for technical skill assessments (articulation) • Assessments now have added value to the student; greater value to the school/district/state

  19. New CFTCS Assessments for 2009

  20. The Process

  21. Link to Postsecondary • Identify postsecondary programs with courses being articulated • Link competencies in the targeted pathways to postsecondary courses via survey • Postsecondary faculty indicated competencies addressed by postsecondary program courses • [Process diagram step: Survey]

  22. Cluster Units of Instruction • Run hierarchical cluster analysis (SPSS) to form preliminary modules • Units clustered together based on how they are addressed at the postsecondary level • [Process diagram step: Cluster Analysis]
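The slides name SPSS for this step; purely as an illustration, here is the same idea in Python with SciPy: each unit of instruction is represented by a 0/1 vector of which postsecondary courses address it, taken from the faculty survey, and hierarchical clustering groups units that are taught in the same courses into preliminary modules. The coverage matrix, the Jaccard distance, and average linkage are all assumptions made for the sketch, not the project's documented settings.

```python
# Illustrative hierarchical clustering of competencies by postsecondary course coverage.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

competencies = ["Unit A", "Unit B", "Unit C", "Unit D", "Unit E"]
# Rows = competencies, columns = postsecondary courses; 1 = course addresses the competency
coverage = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]).astype(bool)

tree = linkage(coverage, method="average", metric="jaccard")
module_ids = fcluster(tree, t=2, criterion="maxclust")   # cut the tree into 2 preliminary modules

for unit, module in zip(competencies, module_ids):
    print(f"{unit} -> preliminary module {module}")
```

How many clusters to cut is itself a judgment call, which is where the human review described on the next slide enters the process.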

  23. Create Test Module Structure • Create module content structure based on cluster analysis • Bring human judgment into the statistically based clustering process • [Process diagram step: Module Creation]

  24. Create Test Items • Run item writing workshops • Facilitated groups of secondary and postsecondary faculty (~12 SMEs: 6 secondary, 6 postsecondary) • Review/revise current items • Write scenarios and new items to fill gaps

  25. Validate Items & Modules • Run item & module review workshops using ~12 secondary and postsecondary faculty • Content validate items • Item linkage judgments • Item essentiality judgments • Item quality judgments • Content validate modules • Judgments re: Content domain representativeness

  26. Validate Items & Modules: Set Benchmark Scores • Set benchmark scores • Two benchmarks = three performance levels: Limited (Novice), Proficient, Advanced (Expert) • [Slide graphic: score scale with Benchmark 1 separating Limited from Proficient and Benchmark 2 separating Proficient from Advanced]
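A minimal sketch of what the two benchmarks do operationally: they are cut scores that map a module score onto the three performance levels. The numeric cuts below are placeholders, not Ohio's actual benchmark values.

```python
# Placeholder cut scores; the real benchmarks come from the standard-setting panels.
BENCHMARK_1 = 60   # Limited/Proficient cut (hypothetical)
BENCHMARK_2 = 85   # Proficient/Advanced cut (hypothetical)

def performance_level(score: float) -> str:
    if score < BENCHMARK_1:
        return "Limited (Novice)"
    if score < BENCHMARK_2:
        return "Proficient"
    return "Advanced (Expert)"

for score in (52, 71, 90):
    print(score, "->", performance_level(score))
```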

  27. Field Test • Create and publish field test modules • Field test the modules (2009-2010) • Conduct item and test analysis • Create final forms (Fall 2010) • Repeat the process each year for four years.
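The slides do not detail the item and test analysis; a classical-test-theory sketch of what that step usually covers is item difficulty (proportion answering correctly) and item discrimination (corrected point-biserial correlation between an item and the rest of the test), computed here on hypothetical field-test responses.

```python
# Illustrative classical item analysis on a small examinees-by-items 0/1 matrix.
import numpy as np

responses = np.array([        # 1 = correct, 0 = incorrect (hypothetical field-test data)
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
])

total = responses.sum(axis=1)
for i in range(responses.shape[1]):
    item = responses[:, i]
    difficulty = item.mean()                          # proportion correct
    rest = total - item                               # total score excluding the item itself
    discrimination = np.corrcoef(item, rest)[0, 1]    # corrected point-biserial
    print(f"item {i + 1}: difficulty = {difficulty:.2f}, discrimination = {discrimination:.2f}")
```

Items with very low or negative discrimination are the usual candidates for revision before the final forms are assembled in fall 2010.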

  28. Lessons Learned • Difficulty running workshops during the school year (summer is better) • Postsecondary relationships need to be nurtured • Concerns with students’ abilities to pass scenario-based, higher cognitive level tests • Curriculum issues

  29. Recommendations for Maintaining The System • Create a standing committee for each pathway that • reviews pathway standards annually, • reviews tests annually, recommends item replacements, and • generates new items to replace 1/3 of each test each year.

  30. Recommendations for Maintaining The System • Committee members rotate, serving 3-year terms • Institute embedded pretest items • (12-15 non-scored items in each module each year) • Development of new test module structures is done in conjunction with new standards development

  31. Questions / Discussion
