Data Detective Strategies


Presentation Transcript


  1. Data Detective Strategies Connecticut Department of Education National Reporting System (NRS) Webinar October 3, 2006 Ajit Gopalakrishnan, ajit.gopalakrishnan@ct.gov

  2. Detective Work Assignments • Data Quality • Accountability • Program Improvement / Research

  3. Data Quality Examples • Data Entry Checks: • Drop-down lists control data entry • Zip code lists are obtained from the post office • Only approved assessments can be selected for entry • Scaled score conversions for each approved test form are embedded in the data system • Goals must be established before a student is enrolled • Combined adult education and GED database
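
As a rough illustration of entry-time checks like those on this slide, here is a minimal Python sketch. The field names, the approved-assessment list, the zip code list, and the scaled-score table are illustrative assumptions, not the actual Connecticut data system.

```python
# Hypothetical entry-time validation rules; lists and field names are assumptions.
APPROVED_ASSESSMENTS = {"CASAS", "TABE"}                          # assumed approved list
VALID_ZIP_CODES = {"06103", "06106", "06511"}                     # assumed postal-service list
SCALED_SCORE_TABLE = {("CASAS", "Form 185"): {40: 225, 41: 227}}  # assumed conversions

def validate_entry(record, has_goal_on_file):
    """Return a list of entry errors; an empty list means the record may be saved."""
    errors = []
    if record["zip_code"] not in VALID_ZIP_CODES:
        errors.append("Zip code not found in postal-service list")
    if record["assessment"] not in APPROVED_ASSESSMENTS:
        errors.append("Assessment is not on the approved list")
    if (record["assessment"], record["form"]) not in SCALED_SCORE_TABLE:
        errors.append("No embedded scaled-score conversion for this test form")
    if not has_goal_on_file:
        errors.append("A goal must be established before the student is enrolled")
    return errors
```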

  4. Data Quality Examples (cont.) • Online data verification provides warnings (anomalies) and errors (mistakes) after data is entered. Errors must be corrected. Examples: • No attendance hours entered (monthly or daily) • Missing assessments • Minimum credits earned to award a credit diploma • Invalid dates • Age verification
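
The sketch below shows one way such post-entry verification might separate warnings from errors. The record fields, the credit minimum, and the age cutoff are illustrative assumptions, not the system's actual rules.

```python
from datetime import date

def verify_record(record):
    """Return (warnings, errors): warnings flag anomalies, errors must be corrected."""
    warnings, errors = [], []
    if record.get("monthly_hours", 0) == 0 and record.get("daily_hours", 0) == 0:
        warnings.append("No attendance hours entered (monthly or daily)")
    if not record.get("assessments"):
        warnings.append("No assessments on file")
    if record.get("diploma_awarded") and record.get("credits_earned", 0) < 20:  # assumed minimum
        errors.append("Credits earned fall below the minimum for a credit diploma")
    if record.get("exit_date") and record["exit_date"] < record["entry_date"]:
        errors.append("Exit date precedes entry date")
    age = (date.today() - record["birth_date"]).days // 365
    if age < 16:  # assumed minimum age for adult education
        errors.append("Age verification failed")
    return warnings, errors
```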

  5. Data Quality Examples (cont.) • Reasonableness tests conducted by the Department: • Consistency across reports • Longitudinal comparisons of enrollment and performance • Appropriate test form selection • Timing of pre- and post-tests • Social Security Number validity
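
Two of these tests lend themselves to a short sketch: a common SSN format check and a minimum interval between pre- and post-tests. The 40-day minimum is an illustrative assumption, not the published NRS or CASAS rule.

```python
import re
from datetime import timedelta

# Common format check: rejects area 000, 666, 900-999, group 00, and serial 0000.
SSN_PATTERN = re.compile(r"^(?!000|666|9\d{2})\d{3}-(?!00)\d{2}-(?!0000)\d{4}$")

def ssn_is_plausible(ssn):
    """Format-level validity only; not an identity verification."""
    return bool(SSN_PATTERN.match(ssn))

def post_test_too_early(pre_date, post_date, min_days=40):
    """Flag pre/post pairs administered closer together than the assumed minimum."""
    return (post_date - pre_date) < timedelta(days=min_days)
```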

  6. Data Quality Examples (cont.) • Required local staff roles provide checks and balances with respect to data: • Program Facilitator • Data administrator • Data entry staff • Training expectations • Required attendance at training sessions • Follow up with non-attendees • Expected use of reports to monitor data quality locally

  7. Accountability • A Data-Driven Framework is used to evaluate program performance annually relative to state averages and NRS targets by comparing: • Program recruitment to census need in the community • Program retention and utilization rates to state averages • NRS goal-setting rates to state averages • Pre-post testing rates and the percent making scaled score gains to state averages • NRS level completion rates to NRS targets • The measures are weighted differently.
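
A weighted comparison of this kind might be computed as in the sketch below. The measure names, the weights, and the ratio-to-benchmark scoring are illustrative assumptions; the framework's actual weights are not given in the slides.

```python
# Assumed weights for the five comparisons listed above.
WEIGHTS = {
    "recruitment_vs_census_need": 0.15,
    "retention_vs_state_avg": 0.20,
    "goal_setting_vs_state_avg": 0.15,
    "pre_post_gain_vs_state_avg": 0.25,
    "nrs_completion_vs_target": 0.25,
}

def composite_score(program, benchmark):
    """Weighted average of each program measure relative to its benchmark."""
    total = 0.0
    for measure, weight in WEIGHTS.items():
        ratio = program[measure] / benchmark[measure] if benchmark[measure] else 0.0
        total += weight * min(ratio, 1.5)  # cap so one measure cannot dominate
    return total

# Programs whose composite falls well below 1.0 might be targeted for on-site monitoring.
```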

  8. Accountability (cont.) • An overall analysis of these program process and outcome measures helps to target programs for on-site monitoring. • The profile report, which contains this information, has focused program attention on key quality indicators of adult education. • Enrollment and assessment outcomes are monitored each semester.

  9. Program Improvement/Research • Results from the accountability framework target areas for improvement. • Current research and questions for reflection guide program improvement efforts. • Key priority areas based on the data include: • Learning gains and attendance • Goal attainment and goal type • GED results – Transition to Postsecondary • Longitudinal participation

  10. Hours Attended and Learning Gains in ESL – Sample Program

  11. Percent of Exiters Earning a Diploma – Statewide Data

  12. GED Score Analysis-Sample Program • Of those who passed the GED: • 46% passed without achieving a score of at least 450 in each of the subject areas but achieved 2,250 overall; and • Only 17% passed with a score of at least 500 in each of the subject areas.
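
The breakdown on this slide can be expressed as a short classification, sketched below under the assumption of the 2002-series GED rules in effect in 2006 (five subject tests, passing requires a per-subject minimum plus a 2,250 total). The 450 and 500 per-subject thresholds come from the slide; the underlying series rules are an assumption.

```python
def classify_passers(passers):
    """passers: list of dicts mapping the five GED subject names to standard scores."""
    passed_on_total_only = 0  # reached 2,250 overall but fell below 450 in some subject
    strong_passers = 0        # scored at least 500 in every subject
    for scores in passers:
        values = scores.values()
        if sum(values) >= 2250 and min(values) < 450:
            passed_on_total_only += 1
        if min(values) >= 500:
            strong_passers += 1
    n = len(passers)
    return passed_on_total_only / n, strong_passers / n
```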

  13. The Transition Gap – The CASAS Scale [chart: CASAS scale marked from 200 to 255, showing the C Level Test, the D Level Test (about 245), the approximate average functioning level of graduates (about 235), and an ability level for future success at 255; the distance between the two is labeled the transition gap]

  14. Adult H.S. Credit – Sample Program

  15. Questions? Thank you.
