

  1. Using Data for Program Quality Improvement Stephanie Lampron, Deputy Director

  2. Session Overview • The Title I, Part D Data Collection • Importance of Data Quality and Data Use • Actively Using Data for Program Improvement

  3. The Title I, Part D Data Collection

  4. What are Title I, Part D and NDTAC? • Title I, Part D (TIPD) of the Elementary and Secondary Education Act (ESEA), as amended in 2001 • Subpart 1: State agency programs • Subpart 2: Local educational agency (LEA) programs • National Evaluation and Technical Assistance Center for the Education of Children and Youth Who Are Neglected, Delinquent, or At-Risk (NDTAC)

  5. NDTAC's Mission Related to Data and Evaluation • Develop a uniform evaluation model for State Education Agency (SEA) Title I, Part D, programs • Provide technical assistance (TA) to States to increase their capacity for data collection and their ability to use those data to improve educational programming for neglected and delinquent (N or D) youth

  6. Background: NDTAC’s Role in Reporting and Evaluation Specific to Title I, Part D, Collections • TA prior to collection: webinars, guides, and tip sheets • TA during collection: data reviews, direct calls, and summary reports for ED • Data analysis and dissemination: GPRA, Annual Report, and online Fast Facts • Related TA: data use and program evaluation

  7. TIPD Basic Reporting and Evaluation Requirements Where do requirements come from? • Elementary and Secondary Education Act, amended in 2001 (No Child Left Behind) • Purpose of Title I, Part D (Sec. 1401) • Program evaluation for Title I, Part D (Sec. 1431-Subpart 3) How does ED use the data? • Government Performance and Results Act (GPRA) • Federal budget requests for Title I, Part D • Federal monitoring • Provide to NDTAC for dissemination

  8. Collection Categories for TIPD in the Consolidated State Performance Report (CSPR) • Types/number of students and programs funded • Demographics of students within programs • Academic and vocational outcomes • Pre-posttesting results in reading and math

  9. Title I, Part D in Pennsylvania

  10. Local Education Agency (S2) Academic Outcomes * 2010-11 data are preliminary

  11. Long-term Students' Improvement in Reading (Subpart 2) * 2010-11 data are preliminary

  12. Long-term Students' Improvement in Math (Subpart 2) * 2010-11 data are preliminary
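
The reading and math improvement figures above are built from the pre/posttest results that Subpart 2 programs report (slide 8). As a rough illustration only, the sketch below shows how an analyst might compute the share of long-term students whose scores improved from student-level records; the file name, the column names, and the 90-day definition of "long-term" are assumptions, not the CSPR specification.

```python
import pandas as pd

# Hypothetical student-level extract; names are illustrative, not the CSPR layout.
students = pd.read_csv("subpart2_students_2010_11.csv")

# Assumption: "long-term" means 90 or more days in the program.
long_term = students[students["days_in_program"] >= 90]

for subject in ["reading", "math"]:
    pre = long_term[f"pre_{subject}"]
    post = long_term[f"post_{subject}"]
    tested = pre.notna() & post.notna()      # students with both a pre- and a posttest
    improved = tested & (post > pre)
    pct = 100 * improved.sum() / tested.sum()
    print(f"{subject}: {improved.sum()} of {tested.sum()} long-term students improved ({pct:.1f}%)")
```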

  13. Data Quality & Data Use

  14. Functions of Data • Help us identify whether goals are being met (accountability) • Tell our departments, delegates, and communities about the value of our programs and the return on their investments (marketing) • Help us replace hunches and hypotheses with facts concerning the changes that are needed (program management and improvement) • Help us identify root causes of problems and monitor success of changes implemented (program management and improvement)

  15. Why Is Data Quality Important? You need to TRUST your data because they inform: • Funding decisions • Technical assistance (TA) needs • Student/facility programming

  16. What Is “high data quality”? If data quality is high, the data can be used in the manner intended because they are: • Accurate • Consistent • Unbiased • Understandable • Transparent

  17. What data are the most useful? Useful data are those that can be used to answer critical questions and are… • Longitudinal • Actionable (current, user-friendly) • Contextual (comparable, part of bigger picture) • Interoperable (matched, linked, shared) Source: Data Quality Campaign

  18. Should you use data that are of lower quality? YES!! You can use these data to… • Become familiar with the data and readily identify problems • Know when the data are ready to be used more broadly or how they can be used • Incentivize and motivate others

  19. Data Quality Support Systems • Ensure systems, practices, processes, and/or policies are in place • Understand the collection process • Provide/request TA in advance • Develop relationships • Develop multilevel verification processes • Track problems over time • Use the data (even when problematic) • Link decisions (funding, hiring, etc.) to data evidence • Indicate needs to others

  20. Using Data Actively

  21. Essential Steps Related to Data Use • Identify problem or goal to address • Explore & analyze existing data • Develop and implement change • Set targets and goals • Develop processes to monitor and review data

  22. Step 1: Identify concerns or goals Identify your level of interest • State • Facility / School • Classroom Define issues, priorities, or goals • Upcoming decisions • State or district goals or initiatives • Information from needs assessments (or conduct one) Identify how the data will be used and the questions to answer Resource: NDTAC Program Administration Planning Guide, Tool 3, on Needs Assessments

  23. Program Components by Data Function

  24. Focusing the Questions Break the question into inputs and outcomes: • Inputs (what your program contributes): • Teacher education, experience, full-time/part-time • Instructional curriculum • Hours of instruction per week • Outcomes (indicators of results): • Improved posttest scores • Completed high school • Earned GED credentials

  25. Focusing/Refining the Question Weak Question: • Does my school have good teachers? Good Question: • Does student learning differ by teacher? Better Question: • Do students in classes taught by instructors who have more teaching experience have higher test scores than those taught by new teachers?
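
Answering the "better question" above requires data that link each student's test results to the teacher of record. The sketch below is a minimal illustration, assuming a hypothetical merged file with teacher_years_experience and post_test_score columns; the experience bands are arbitrary.

```python
import pandas as pd

# Hypothetical file linking each student's posttest score to the teacher of record.
df = pd.read_csv("student_teacher_outcomes.csv")

# Band teachers by years of experience (cut points are illustrative).
df["experience_band"] = pd.cut(
    df["teacher_years_experience"],
    bins=[0, 3, 10, 50],
    labels=["0-3 years", "4-10 years", "11+ years"],
    include_lowest=True,
)

# Compare average posttest scores and student counts across the bands.
summary = df.groupby("experience_band", observed=True)["post_test_score"].agg(["mean", "count"])
print(summary.round(1))
```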

  26. Step 2: Explore Existing Data • Locate the data you do have • Put them in a useful format • Trends, comparisons • What story are the data telling you? • What jumps out at you about the data? • Are the data telling you something that is timely and actionable? • What questions arise? What are the data not telling you that you wish you knew? • What data could help answer those questions?
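
Putting existing data "in a useful format" often just means turning reported counts into rates and laying them out by year, so that trends and comparisons become visible. A sketch under assumed column names (lea, year, enrolled, earned_hs_credit):

```python
import pandas as pd

# Hypothetical table of Subpart 2 outcomes by LEA and school year.
outcomes = pd.read_csv("lea_academic_outcomes.csv")

# Turn counts into a rate that can be compared across LEAs and years.
outcomes["credit_rate"] = 100 * outcomes["earned_hs_credit"] / outcomes["enrolled"]

# Trend view: one row per LEA, one column per school year.
trend = outcomes.pivot(index="lea", columns="year", values="credit_rate").round(1)
print(trend)

# Comparison view: the most recent year against the statewide rate for that year.
latest = outcomes[outcomes["year"] == outcomes["year"].max()]
state_rate = 100 * latest["earned_hs_credit"].sum() / latest["enrolled"].sum()
print(f"Statewide credit rate, {latest['year'].iloc[0]}: {state_rate:.1f}%")
```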

  27. Local Education Agency (S2) Academic Outcomes

  28. LEA 1: Comparison Data (1): Percent of students earning HS CC (state average vs. LEA average)

  29. Comparison Data (2): Context

  30. Longitudinal data: more context

  31. Do you know enough? Sometimes, the data will lead to more questions and a need for more information… • Compare to other LEAs' facilities • Use student-level data and disaggregate • Look at monitoring information and applications • Collect additional information: surveys, interviews *Keep data quality in mind
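
When aggregate figures raise more questions than they answer, student-level records (where they exist and data quality allows) can be disaggregated to show which facilities or subgroups drive the pattern. A minimal sketch with hypothetical fields; the 0/1 outcome flag and the length-of-stay bands are assumptions.

```python
import pandas as pd

# Hypothetical student-level records for one LEA's facilities.
students = pd.read_csv("lea1_student_records.csv")

# Band students by length of stay (cut points are illustrative).
students["stay_band"] = pd.cut(
    students["days_in_program"],
    bins=[0, 30, 90, 400],
    labels=["1-30 days", "31-90 days", "91+ days"],
    include_lowest=True,
)

# Disaggregate an outcome (0/1 flag for earning HS course credit) by facility and stay.
rates = (
    students.groupby(["facility", "stay_band"], observed=True)["earned_hs_credit"]
    .mean()
    .mul(100)
    .round(1)
    .unstack("stay_band")
)
print(rates)
```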

  32. Step 3: Implement improvement plan • Implement new programming, change, etc. • Set benchmarks, performance targets • In terms of your priorities, where do you want your subgrantees and facilities to be in one year? Two years? Three years? • What performance benchmarks might you set to measure progress along the way? • How will you know when to target a subgrantee or facility for technical assistance? At what point might you sound the alarm?
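
Benchmarks like these can be turned into a simple, repeatable check that flags subgrantees or facilities falling short of a target so TA can be offered before anyone needs to sound the alarm. An illustrative sketch; the 70 and 50 percent thresholds and the column names are assumptions, not Title I, Part D requirements.

```python
import pandas as pd

# Hypothetical annual performance file (columns: subgrantee, year, credit_rate).
perf = pd.read_csv("subgrantee_performance.csv")

TARGET = 70.0   # assumed benchmark: 70% of students earn course credit
ALARM = 50.0    # assumed point at which to escalate beyond routine TA

latest = perf[perf["year"] == perf["year"].max()].copy()
latest["status"] = pd.cut(
    latest["credit_rate"],
    bins=[-0.1, ALARM, TARGET, 100.1],
    labels=["escalate", "offer TA", "on track"],
)
print(latest.sort_values("credit_rate")[["subgrantee", "credit_rate", "status"]])
```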

  33. Step 4: Develop processes for reviewing data Keep using it! • Monitor change and compare against benchmarks • Review data in real time • Share it and discuss it

  34. Keep in mind • Data use is not easy* • Data should be a flashlight, not a hammer* • Change takes time; set realistic goals • “No outcome” can be a useful finding • Aggregated data can usually be shared *Source: Data Quality Campaign

  35. Data Capacity Exists! (Data Quality Campaign, 2011 Report)

  36. Next Step: Data Use (DQC-2011)

  37. Accessible Data – N or D Related Title I, Part D data: • ED Data Express: www.eddataexpress.ed.gov • NDTAC State Fast Facts Pages: http://data.neglected-delinquent.org/index.php?id=01 • Title I, Part D, Annual Report: www.neglected-delinquent.org/nd/data/annual_report.asp • Civil Rights Data Collection (district and school): http://ocrdata.ed.gov/

  38. Accessible Data – N or D Related • OSEP Data Collection: https://www.ideadata.org/default.asp • Youth Risk Behavior Survey (CDC): http://www.cdc.gov/healthyyouth/yrbs/index.htm • OJJDP Juvenile Justice Surveys/Data Book: http://www.ojjdp.gov/ojstatbb/

  39. Resources • NDTAC reporting and evaluation resources: http://www.neglected-delinquent.org/nd/topics/index2.php?id=9 • Data Quality Campaign: www.dataqualitycampaign.org • Data for Action 2011—Empower With Data

  40. Questions? Stephanie Lampron NDTAC Deputy Director slampron@air.org 202-403-6822 NDTAC Data Team • Dory Seidel: dseidel@air.org • Liann Seiter: lseiter@air.org
