
Pouring a Foundation for Program Improvement with Quality SPP/APR Data

Presentation Transcript


  1. Pouring a Foundation for Program Improvement with Quality SPP/APR Data OSEP’s message regarding Indicators 1, 2, 13 and 14 - data collection and improvement strategies Ruth Ryder USDE/OSEP/MSIP

  2. Updates • Status of revisions to information collection (Indicator/Measurement Table) • Received recommendations that Indicator 13 be revised • Longer • Shorter • Tweaked • In process

  3. Updates • Status of OSEP’s review of the APR and revised SPP submissions • Opportunity for Clarification • Response Table • Determinations • Letters in early June

  4. OSEP Review Process • State contacts did initial review • Division did facilitated review • Division leadership “triaged” all Status Tables • Opportunity for Clarification • Developing Response Tables

  5. Indicators 1 and 2 • Only a few States continued to do the comparison to all youth • Many States are using 618 State-reported data • Many States revised their improvement activities, usually adding more specific activities for “out years”

  6. Indicator 1 and 2 Issues • Some States could not provide 06-07 data (05-06 data were provided) • Great variations in calculation methodologies (more using cohort) • Not even close to meeting targets • Improvement activities – (“kitchen sink” approach or minimalist approach)
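
To make the "calculation methodologies" point concrete, here is a minimal sketch contrasting a single-year event rate with a four-year cohort rate for Indicator 1 (graduation). The function names and sample figures are illustrative assumptions, not OSEP's prescribed measurement.

# Hypothetical sketch: two common ways a State might compute a graduation rate.
# Figures and names are illustrative only, not OSEP's required methodology.

def event_rate(graduates, all_exiters):
    """Single-year event rate: graduates with regular diplomas as a share of
    all youth with IEPs who left secondary school that year."""
    return 100.0 * graduates / all_exiters

def cohort_rate(graduated_in_four_years, entering_cohort):
    """Four-year cohort rate: follows the students who entered grade 9 together
    and asks how many earned a regular diploma within four years."""
    return 100.0 * graduated_in_four_years / entering_cohort

# The same underlying students can produce very different percentages,
# which is one source of the variation OSEP sees across States.
print(f"Event rate:  {event_rate(850, 1200):.1f}%")   # 70.8%
print(f"Cohort rate: {cohort_rate(850, 1500):.1f}%")  # 56.7%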

  7. Indicator 13 • All States submitted data • We questioned the validity and reliability of a few States • State compliance ranged from 4.9% to 100% • About 15 States were below 50% compliance • Many States could not demonstrate timely correction of previously identified noncompliance

  8. Indicator 13 Issues • What exactly are States reporting to us? • More than half of the States are using the NSTTAC checklist or some variation • Remaining States are using their own checklists and it’s often hard to tell what requirements they are evaluating • What does timely correction look like for this indicator?
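
As a rough illustration of what a checklist-based Indicator 13 review can report, here is a hedged sketch that scores sampled IEP files against a short list of checklist items and computes a compliance percentage. The items and the all-items-met rule are placeholders, not the NSTTAC checklist itself.

# Hypothetical sketch of a checklist-based Indicator 13 file review.
# The items below are placeholders, not the actual NSTTAC checklist.

CHECKLIST = [
    "measurable postsecondary goals",
    "goals updated annually",
    "transition services identified",
    "courses of study included",
    "student invited to the IEP meeting",
]

def file_is_compliant(results):
    """A file counts as compliant only if every checklist item is met."""
    return all(results.get(item, False) for item in CHECKLIST)

def state_compliance(reviewed_files):
    """Percent of reviewed IEP files meeting all checklist items."""
    compliant = sum(file_is_compliant(f) for f in reviewed_files)
    return 100.0 * compliant / len(reviewed_files)

# Example: three reviewed files, two fully compliant -> 66.7%
fully_met = {item: True for item in CHECKLIST}
one_gap = dict(fully_met, **{"transition services identified": False})
print(f"Indicator 13 compliance: {state_compliance([fully_met, fully_met, one_gap]):.1f}%")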

  9. Indicator 14 • With a few exceptions, States were able to give us data • About 8 States did not provide valid and reliable data • Problems with the denominator • Included only graduates rather than all school leavers

  10. Indicator 14 Issues • What do the reported data represent? • Many States did not describe the respondent group • Can’t determine if the respondent group is representative of the population • Small sample sizes • Improvement activities focus on data collection
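
One way to see the representativeness concern concretely is to compare the respondent group's composition with the full population of school leavers on a characteristic such as exit type. A minimal sketch follows; the categories, counts, and the three-percentage-point tolerance are illustrative assumptions, not an OSEP requirement.

# Hypothetical sketch: checking whether Indicator 14 survey respondents resemble
# the leaver population they are meant to represent. All numbers are made up.

def shares(counts):
    total = sum(counts.values())
    return {group: 100.0 * n / total for group, n in counts.items()}

population  = {"graduated": 700, "dropped out": 250, "aged out": 50}
respondents = {"graduated": 180, "dropped out": 30,  "aged out": 10}

pop_pct, resp_pct = shares(population), shares(respondents)
for group in population:
    gap = resp_pct[group] - pop_pct[group]
    if gap > 3:
        note = "over-represented"
    elif gap < -3:
        note = "under-represented"
    else:
        note = "ok"
    print(f"{group:12s} population {pop_pct[group]:5.1f}%  respondents {resp_pct[group]:5.1f}%  {note}")

# In this made-up example, dropouts respond far less often than they appear in the
# population, so the reported post-school outcomes would look better than they are.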

  11. The Challenges: 2007 • From our review of the Feb 2007 submissions we identified patterns of challenges – • The Basics • Data • Compliance • Improvement

  12. The Successes and Challenges: 2008 • Successes • The Basics – Much better, States provided the required information, etc. • Data – Much better, correct measurement, correct year • Compliance – More accurate data, more evidence of timely correction • Improvement Activities – Many States revised and/or added

  13. The Successes and Challenges: 2008 • Challenges • The Basics – Keep up the good work! • Data – Reconciling database data with monitoring system data, calculation methodologies for 1 and 2 • Compliance – Documenting timely correction, improving performance • Improvement Activities – Purposeful, linked, sequenced, evidence-based

  14. Improvement Activities: External TA Analysis Categories • Improve data collection and reporting • Improve systems administration and monitoring • Build systems and infrastructures of technical assistance and support • Provide technical assistance/training/professional development

  15. (Continued) • Clarify/examine/develop policies and procedures • Program development • Collaboration/coordination • Evaluation • Increase/adjust FTE

  16. One State’s Perspective on Making the Grade with the SPP/APR • Attend as many OSEP-funded TA offerings as possible • Provide accurate and reliable data, and if you can’t, explain why and what you’re doing about it • Analyze data by local programs • Develop standard headings, stems, and data formats to use for all indicators

  17. One State’s Perspective on Making the Grade with the SPP/APR • Maintain documentation that you: • Identify noncompliance at the local level • Identify research-based improvement activities that are a match to the identified problems • Require and approve corrective action plans with appropriate timelines • Oversee timelines and require proof of correction (evidence of success)

  18. Examples

  19. What You’re Doing is Working! • From 1987 to 2003: • Postsecondary enrollment rose from 15% to 32% • 4-year college enrollment rose from 1% to 9%

  20. What You’re Doing is Working! • More academic coursework • More above-average grades • More congruency between age and grade level • More support services

  21. What You’re Doing is Working! • More students with disabilities are exiting with a standard diploma • From 1996 to 2006, rates rose from 42% to 56% • Fewer students with disabilities are dropping out • From 1996 to 2006, rates declined from 47% to 26%
