Data Detective Strategies Connecticut Department of Education National Reporting System (NRS) Webinar October 3, 2006 Ajit Gopalakrishnan, ajit.gopalakrishnan@ct.gov
Detective Work Assignments • Data Quality • Accountability • Program Improvement / Research
Data Quality Examples • Data Entry Checks (a validation sketch follows this list): • Drop-down lists control data entry • Zip codes are validated against a list procured from the post office • Only approved assessments can be selected for entry • Scaled score conversions for each approved test form are embedded in the data system • Goals must be established before a student is enrolled • Combined adult education and GED database
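The sketch below illustrates how entry-time rules like these might be enforced in code. It is a minimal, hypothetical example: the field names, the approved-assessment list, and the conversion table are illustrative assumptions, not the actual Connecticut data system.

```python
# Hypothetical sketch of entry-time validation rules like those listed above.
# Values and field names are assumptions for illustration only.

APPROVED_ASSESSMENTS = {"CASAS", "TABE"}                # drop-down style controlled list
SCALED_SCORE_TABLE = {("CASAS", "Form 81R", 25): 221}   # (test, form, raw score) -> scaled score

def validate_entry(record):
    """Return a list of entry errors; an empty list means the record may be saved."""
    errors = []
    if record.get("assessment") not in APPROVED_ASSESSMENTS:
        errors.append("Assessment is not on the approved list.")
    if not record.get("goals"):
        errors.append("A goal must be established before the student is enrolled.")
    key = (record.get("assessment"), record.get("form"), record.get("raw_score"))
    if key not in SCALED_SCORE_TABLE:
        errors.append("No scaled-score conversion exists for this test form and raw score.")
    return errors

print(validate_entry({"assessment": "CASAS", "form": "Form 81R",
                      "raw_score": 25, "goals": ["Obtain GED"]}))  # -> []
```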
Data Quality Examples (cont.) • Online data verification provides warnings (anomalies) and errors (mistakes) after data is entered; errors must be corrected (see the sketch below). Examples: • No attendance hours entered (monthly or daily) • Missing assessments • Minimum credits earned to award a credit diploma • Invalid dates • Age verification
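A minimal sketch of the warning/error distinction follows. The specific thresholds (the credit minimum, for example) and field names are assumptions, not the system's actual rules.

```python
# Hypothetical sketch of a post-entry verification pass that separates
# warnings (anomalies worth reviewing) from errors (must be corrected).
from datetime import date

def verify(record, today=date(2006, 10, 3)):
    warnings, errors = [], []
    if record.get("attendance_hours", 0) == 0:
        warnings.append("No attendance hours entered.")
    if not record.get("assessments"):
        warnings.append("Missing assessments.")
    if record.get("exit_date") and record["exit_date"] > today:
        errors.append("Exit date is in the future (invalid date).")
    if record.get("diploma") == "credit" and record.get("credits", 0) < 20:  # threshold assumed
        errors.append("Fewer than the minimum credits for a credit diploma.")
    return warnings, errors

print(verify({"attendance_hours": 0, "assessments": ["CASAS pre-test"],
              "exit_date": date(2007, 1, 15), "diploma": "credit", "credits": 12}))
# -> one warning (no attendance hours) and two errors (future exit date, too few credits)
```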
Data Quality Examples (cont.) • Reasonableness tests conducted by the Department (two are sketched below): • Consistency across reports • Longitudinal comparisons of enrollment and performance • Appropriate test form selection • Timing of pre- and post-tests • Social Security Number validity
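The sketch below shows two such tests: the minimum gap between pre- and post-test dates, and a simplified structural check on Social Security Numbers. The 40-day gap is an assumed value, and the SSN check covers only basic structural rules.

```python
# Hypothetical sketch of two Department-side reasonableness tests.
import re
from datetime import date

MIN_GAP_DAYS = 40  # assumed minimum instructional gap between pre- and post-test

def pretest_posttest_gap_ok(pre: date, post: date) -> bool:
    """Flag post-tests administered too soon after the pre-test."""
    return (post - pre).days >= MIN_GAP_DAYS

def ssn_looks_valid(ssn: str) -> bool:
    """Simplified structural check: 9 digits, no 000/666/9xx area, no all-zero group or serial."""
    m = re.fullmatch(r"(\d{3})-?(\d{2})-?(\d{4})", ssn)
    if not m:
        return False
    area, group, serial = m.groups()
    return area not in {"000", "666"} and not area.startswith("9") \
        and group != "00" and serial != "0000"

print(pretest_posttest_gap_ok(date(2006, 1, 10), date(2006, 3, 15)))  # True (64 days apart)
print(ssn_looks_valid("000-12-3456"))                                 # False (invalid area)
```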
Data Quality Examples (cont.) • Required local staff roles provide checks and balances with respect to data: • Program facilitator • Data administrator • Data entry staff • Training expectations • Required attendance at training sessions • Follow-up with non-attendees • Expected use of reports to monitor data quality locally
Accountability • A data-driven framework is used to evaluate program performance annually, relative to state averages and NRS targets, by comparing: • Program recruitment to census need in the community • Program retention and utilization rates to state averages • NRS goal-setting rates to state averages • Pre-/post-testing rates and the percent making scaled score gains to state averages • NRS level completion rates to NRS targets • The measures carry different weights in the overall evaluation (a weighted-score sketch follows this list).
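The sketch below shows one way such a weighted comparison could be computed: each measure is scored as the program's rate relative to its benchmark (state average or NRS target), then combined. The weights, measure names, and sample rates are illustrative assumptions, not the actual framework's values.

```python
# Hypothetical sketch of a weighted program-performance comparison.
# Weights are assumed and sum to 1.0; they are not the framework's actual weights.
WEIGHTS = {
    "recruitment": 0.15,
    "retention": 0.15,
    "goal_setting": 0.15,
    "pre_post_gain": 0.25,
    "level_completion": 0.30,
}

def composite_score(program_rates, benchmarks):
    """Weighted sum of program-to-benchmark ratios; 1.0 means 'at benchmark' overall."""
    score = 0.0
    for measure, weight in WEIGHTS.items():
        ratio = program_rates[measure] / benchmarks[measure]
        score += weight * min(ratio, 1.5)  # cap so one strong measure cannot dominate
    return score

program = {"recruitment": 0.08, "retention": 0.62, "goal_setting": 0.70,
           "pre_post_gain": 0.41, "level_completion": 0.38}
state   = {"recruitment": 0.10, "retention": 0.60, "goal_setting": 0.65,
           "pre_post_gain": 0.45, "level_completion": 0.40}
print(round(composite_score(program, state), 2))  # about 0.95 with these made-up numbers
```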
Accountability (cont.) • An overall analysis of these program process and outcome measures helps target programs for on-site monitoring. • The profile report, which contains this information, has focused program attention on key quality indicators of adult education. • Enrollment and assessment outcomes are monitored each semester.
Program Improvement/Research • Results from the accountability framework target areas for improvement. • Current research and questions for reflection guide program improvement efforts. • Key priority areas based on the data include: • Learning gains and attendance • Goal attainment and goal type • GED results – Transition to Postsecondary • Longitudinal participation
GED Score Analysis – Sample Program • Of those who passed the GED: • 46% reached the 2,250 total without scoring at least 450 on every subject-area test; and • only 17% passed with a score of at least 500 on every subject-area test. (A worked check of these categories follows.)
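The worked check below classifies a passing candidate's score profile, assuming the 2002-series GED rules: five subject tests, with passing requiring at least 410 on each test and a 2,250 total (a 450 average). The sample scores are made up.

```python
# Worked check of the score categories above, under assumed 2002-series GED rules.

def classify(scores):
    """Return a label for a candidate's score profile (five subject-test scores)."""
    if min(scores) < 410 or sum(scores) < 2250:
        return "did not pass"
    if all(s >= 500 for s in scores):
        return "passed with 500+ on every test"
    if all(s >= 450 for s in scores):
        return "passed with 450+ on every test"
    return "passed on the 2,250 total despite a test below 450"

print(classify([430, 440, 470, 480, 450]))  # total 2,270 -> passes on the total alone
```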
[Chart: The CASAS Scale, scores 200 to 255. The approximate average functioning level of graduates (about 235) sits below the ability level for future success (about 255); the C Level and D Level test ranges are marked, and the distance between the two points is labeled "The Transition Gap."]
Questions? Thank you.