
Welcome





Presentation Transcript


  1. Welcome
     • Goals for DQI
       • Establish a future vision for Perkins Accountability
       • Collaborate on Standardization
       • Establish concrete state-recommended alternatives for standardizing measurement definitions and performance indicators
     • Overview of Agenda and Materials

  2. Post Data Quality Institute
     • Convene a forum for Skill Attainment (1S2) and conference calls for the Non-Trad measures
     • Continue NSWG calls
     • Develop a report from the DQI on state recommendations
     • Further explore recommendations at Regional Meetings
     • Present alternatives at the CTE State Directors meeting
     • Establish a transition period for implementation

  3. Accountability In Transition: Federal & State Panel
     Federal Perspective
     Corinne Sauri, Policy Specialist, PRES

  4. Perkins III
     • Built on the accountability provisions of the Perkins Act of 1990
     • Increased emphasis on academics and on holding CTE students to the same standards as non-CTE students
     • Required new disaggregated reporting with a system of rewards and consequences

  5. Core Indicator Framework
     • Designed by OVAE with input from states and other stakeholders
     • Not meant to be a comprehensive list of acceptable measures, but rather a tool for states in developing measures
     • Perkins requires four core measures for every state to report, but allows states to develop and define their own performance measures
     • OVAE has limited authority under Perkins III to set measures, but may provide guidance and must approve state plans

  6. Data Integrity?
     • Data collection under Perkins III has left the program open to criticism and the threat of elimination:
       • National Assessment of Vocational Education
       • Office of Inspector General Report
       • Program Assessment Rating Tool
       • Budget Requests

  7. NAVE
     • “Despite serious commitment among state administrators, technical measurement and data quality problems hinder widespread use of program performance data for program management at the state or local levels.”
     • The quality of Perkins performance reporting varies considerably by indicator, by state, and sometimes within a state.

  8. OIG Report
     • Invalid measures:
       • Many states use the same indicator for different sub-indicators.
       • Some states use the same measure for multiple sub-indicators.
       • OVAE can offer better guidance in the state plan approval process.
     • 57% of states do not provide complete performance data.
     • Lack of improvement strategies and of reporting at the sub-indicator level.

  9. PART and Perkins Funding
     • FY 2004: PART rating “Ineffective”
       • Unclear program purpose
       • Quality of state data is deficient
       • Unavailable data on student outcomes
       • Lack of demonstrated state progress on core indicators
     • FY 2006 budget: Perkins funding eliminated; PART and NAVE cited

  10. Perkins Reauthorization
      • Secretary Spellings’ letter on the Senate Perkins bill:
        • Cited PART results
        • Requested authority to establish common measures to assess program performance for Perkins
        • Requested authority to negotiate new specific measures and targets with each state

  11. Report to Congress
      • Perkins III requires OVAE to report on states’ progress in achieving performance targets.
      • Missing and incomplete data from states.
      • State-to-state comparison charts are required
        • Impossible when concentrators are defined differently; we’re comparing apples to oranges to bananas to coconuts.

  12. Conclusions
      • Data-driven era of accountability
        • Need for standardization of data
      • Expectation of demonstrable results
        • Congress expects results as a return on investment
      • Basic questions about data integrity cannot be addressed without common measures among states

  13. Accountability In Transition: Federal & State Panel
      Federal Perspective
      Sharon Miller, Director, DHSPCE

  14. Vision for Perkins Accountability
      • To achieve greater consistency across the nation regarding definitions of:
        • Concentrator
        • Secondary academic attainment
        • Secondary completion
        • Secondary transition
        • Postsecondary completion
        • Postsecondary placement and retention

  15. Vision for Perkins Accountability
      • Concentrator
        • Currently about five different approaches to definitions
        • Needs to maximize the number of students counted without including those who take only one course
        • Needs to be distinguished from a completer

  16. Vision for Perkins Accountability
      • Secondary academic attainment
        • Currently about seven different approaches to definitions
        • Half of the states are aligning their measure to NCLB
        • Need to align both the test and the proficiency level

  17. Vision for Perkins Accountability
      • Secondary completion
        • Considerable commonality among definitions
        • Need to include more students who concentrate (not just 12th graders)

  18. Vision for Perkins Accountability
      • Secondary and postsecondary placement
        • Roughly forty states use UI wage records to track students
        • Three states share information about students across state lines
        • Need to build the capacity of all states to use administrative record systems

  19. Vision for Perkins Accountability
      • Results of greater consistency:
        • Heightened ability to communicate our outcomes to key constituents
        • More support for our programs and services
        • Greater ability to use data for program improvement

  20. Accountability In Transition: Federal & State Panel
      Federal Perspective
      John Haigh, Chief, Accountability Branch, DHSPCE

  21. Perkins III: Overview
      History of Perkins III Accountability: CIF, Rounds 1-5, Institutes, Evaluations, NSWG

  22. Perkins III: Overview
      Define key terms:
      • Measurement approach (CRT)
      • Codes (1S1, etc.)
      • Measures (N & D; see the sketch below)
      • Concentrator, Completer, Participant
      • Baselines
      • Quality indicators (scope, coverage, alignment, timing, reliability)
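“Measures (N & D)” refers to reporting each core indicator as a numerator over a denominator, e.g., the share of a state’s concentrators who achieve the measured outcome. A minimal sketch of that rate calculation follows; the Student fields and the skill-attainment outcome flag are hypothetical illustrations, not an official Perkins data layout.

```python
from dataclasses import dataclass

@dataclass
class Student:
    is_concentrator: bool   # met the state's concentrator definition (hypothetical flag)
    attained_skills: bool   # achieved a 1S2-style skill-attainment outcome (hypothetical flag)

def measure_rate(students):
    """Performance rate = numerator / denominator.

    Denominator: concentrators in the reporting cohort.
    Numerator: those concentrators who achieved the outcome.
    """
    denominator = [s for s in students if s.is_concentrator]
    if not denominator:
        return None  # no reportable rate for an empty cohort
    numerator = [s for s in denominator if s.attained_skills]
    return len(numerator) / len(denominator)

cohort = [Student(True, True), Student(True, False), Student(False, True)]
print(measure_rate(cohort))  # 0.5: one of two concentrators attained skills
```

The sketch also shows why definitions matter for comparability: if two states apply different is_concentrator rules, the denominators differ and the resulting rates are not comparable.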

  23. Perkins III: Overview
      • What is the CAR (Consolidated Annual Report)?
      • What are negotiations?
      • What is the process for awarding incentives & sanctions?

  24. Perkins III: Overview
      Roadmap for where we’ve been to date with activities:
      • Past Data Quality activities
      • Past Program Quality activities
      • NSWG discussions

  25. Perkins III: Overview
      • Conference On Accountability & Core Performance Measurement (February 4-5, 1999, Kansas City, Missouri)
      • Reporting Results: Strategies To Improve Data Quality, Regional Technical Assistance Workshop (July 25-27, 2000, Phoenix, Arizona)
      • Reporting Results: Strategies To Improve Data Quality, Regional Technical Assistance Workshop (August 14-15, 2000, Portland, Oregon)
      • Introducing the Data Quality Initiative: Data Quality Update, National Association of State Directors of Vocational Education Consortium (December 6, 2000, San Diego, California)

  26. Perkins III: Overview
      • Data Quality Initiative National Institute (national conference): Strategies For Improving Data Quality, joint state-OVAE conference (February 1-2, 2001, New Orleans, Louisiana)
      • Improving Validity and Reliability: State Technical Assistance Meeting, American Vocational Information Association (May 13-15, 2001, Reno, Nevada)
      • Train-the-Trainer Workshop (state workshop): Working With Local Educators To Collect Quality Data, joint state-federal conference (August 23-24, 2001, Chicago, Illinois)
      • National Career and Technical Education Leadership Conference (November 27-29, 2001, Washington, DC)

  27. Perkins III: Overview
      • Program Quality Institute: Strategies For Improving Program Quality (May 13-14, 2002, Jacksonville, Florida)
      • Program Quality Institute: Strategies For Improving Program Quality (August 8-9, 2002, Atlanta, Georgia)
      • National Career and Technical Education Leadership Conference (May 7-8, 2003, Washington, DC)
