
Matching Efforts to Outcomes: Are you really sure?

Matching Efforts to Outcomes: Are you really sure?. Keynote Address for Nonpublic Special Education Programs Annual Conference Hank Bohanon, Ph.D. hbohano@luc.edu http://www.luc.edu/cseit Center for School Evaluation, Intervention and Training (CSEIT), Loyola University, Chicago.


Presentation Transcript


  1. Matching Efforts to Outcomes: Are you really sure? Keynote Address for Nonpublic Special Education Programs Annual Conference Hank Bohanon, Ph.D. hbohano@luc.edu http://www.luc.edu/cseit Center for School Evaluation, Intervention and Training (CSEIT), Loyola University, Chicago

  2. Thank-yous: Sarah Sebert and Paul Nijensohn (ISBE); Barbara Simms (ISBE), Illinois State Technical Assistance Center; Kelly Raucher (ISBE), SEL; Kathy Cox (ISBE), Illinois ASPIRE; Dean David Prasse, Loyola University of Chicago.

  3. Additional thank-yous: Dr. Pamela Fenning, Sara Golomb, Agnieszka Kielian, Lisa Lewis, Dr. Diane Morrison, and Audrey Shulruff (all CSEIT).

  4. Goal and objective: Increase participants' awareness of how program data, decision making, and evaluation efforts can be integrated. Topics: data process, data integration, and examples.

  5. Data Process

  6. Process: What are your questions? What are your data sources? What reports do you need? Who needs access? What resources do you have?

  7. Questions: Align with your targets/strategic plan, with state initiatives, and with training objectives. Evaluation questions: If you train, do people implement? Do they implement with fidelity? Do the interventions sustain? What is the impact on your constituents?

  8. Data: Process: instruments (e.g., fidelity tools); what is your process? Outcomes: performance data, curriculum-based measures, classroom checklists, office discipline referrals, and transition information (e.g., post-secondary).

  9. Instrument development: Item development (align items with your questions) across systems, practice, data, and outcomes, and across schoolwide, classroom, non-classroom, and individual levels. Judgmental validity (expert judgment); concurrent validity (agreement with an established, reliable tool).

  10. Instrument development (continued): Pilot the instrument, run item analysis (e.g., factor analysis), review and update, and balance input with process.
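The concurrent-validity check mentioned above can be made concrete by correlating pilot scores from the new instrument with scores from an established, reliable tool on the same programs. A minimal sketch in Python; the scores and the names `new_tool` and `established` are hypothetical, not from the presentation:

```python
import math

# Hypothetical pilot data: the new fidelity instrument administered
# alongside an established tool on the same five programs.
new_tool = [12, 15, 9, 20, 17]
established = [14, 16, 10, 19, 18]

# Pearson correlation as one rough indicator of concurrent validity.
n = len(new_tool)
mx, my = sum(new_tool) / n, sum(established) / n
cov = sum((x - mx) * (y - my) for x, y in zip(new_tool, established))
sx = math.sqrt(sum((x - mx) ** 2 for x in new_tool))
sy = math.sqrt(sum((y - my) ** 2 for y in established))
r = cov / (sx * sy)
print(f"r = {r:.2f}")
```

A high positive r suggests the new instrument ranks programs similarly to the trusted one; it is only one piece of evidence, alongside the expert-judgment review on the previous slide.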

  11. Access: What types of decisions need to be made based on the data, and who makes them? Statewide personnel, administrators, and direct service providers.

  12. Reports: In what format do you need the data? Graphs, specific instruments, combinations (process and outcomes), summaries and reports, and output files (flat/rectangular format) with common identifiers. Clean your data (follow an SOP).
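The flat/rectangular output file described above can be sketched in a few lines of Python: one row per common identifier, with process and outcome measures side by side. The field names (`fidelity_pct`, `odrs`) and values are illustrative assumptions, not from the presentation:

```python
import csv
import io

# Hypothetical measures keyed by a common student identifier.
fidelity = {"S001": {"fidelity_pct": 85}, "S002": {"fidelity_pct": 60}}
outcomes = {"S001": {"odrs": 2}, "S002": {"odrs": 9}}

# Build one flat (rectangular) file: one row per identifier,
# process and outcome columns combined.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["student_id", "fidelity_pct", "odrs"])
writer.writeheader()
for sid in sorted(fidelity):
    row = {"student_id": sid, **fidelity[sid], **outcomes.get(sid, {})}
    writer.writerow(row)

print(buf.getvalue())
```

A file in this shape can feed both the static summaries and the export-based analyses mentioned later in the talk.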

  13. Resources: The more you can draw upon existing resources, the less time it will take.

  14. Data Integration

  15. Data decision making: Data inputs: data that already exist (with an SOP for cleaning) and additional data from direct assessments (with an SOP for reliability). Data integration: requirements and a common identifier. Technology: National Educational Technology Standards (NETS); a universal data system for educational data.
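The cleaning-before-integration step above can be sketched briefly: records from two systems only join correctly once the common identifier is standardized. The `clean_id` helper and the attendance/referral fields here are hypothetical examples of such an SOP step:

```python
def clean_id(raw):
    """Standardize an identifier: trim whitespace and uppercase."""
    return raw.strip().upper()

# Hypothetical records from two systems; note the messy key in the first.
attendance = {" s001 ": 0.95}
behavior = {"S001": 3}

# After cleaning, the same student lines up across both sources.
merged = {}
for raw, rate in attendance.items():
    sid = clean_id(raw)
    merged[sid] = {"attendance": rate, "referrals": behavior.get(sid)}

print(merged)  # {'S001': {'attendance': 0.95, 'referrals': 3}}
```

Writing the cleaning rules down as code (rather than doing them by hand) is one way to make the SOP repeatable across data pulls.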

  16. Tips: Data cleaning and reliability of data entry.
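One simple check on reliability of data entry is double entry: two staff members enter the same records and percent agreement is computed. A minimal sketch with invented attendance codes; the values are illustrative assumptions:

```python
# Hypothetical double-entry check: the same four records entered twice.
entry_a = ["present", "absent", "present", "tardy"]
entry_b = ["present", "absent", "tardy", "tardy"]

# Percent agreement across fields; low values flag entries for SOP review.
agreements = sum(a == b for a, b in zip(entry_a, entry_b))
percent_agreement = 100 * agreements / len(entry_a)
print(f"{percent_agreement:.0f}% agreement")  # 75% agreement
```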

  17. Reports: Static: graphs, tables, etc., based on your questions. Dynamic: analysis-driven, requiring export of flat files.

  18. Decision making: Statewide: summative and formative (annual and quarterly reports). Administration: formative (professional development, tracking data collection, reports). Direct service: diagnostic (how much of the intervention is in place, what is its impact, and what changes do you need to make?).

  19. Integration of data: example questions, a sample instrument, and a sample report.

  20. Considerations: Key components from three-tiered intervention programs.

  21. OSEP-PBS national standard: students who respond to a continuum of supports. Schoolwide support: 80-90% of students would respond to an effective core academic and behavior curriculum. Group support: 5-15% respond to less intensive academic and behavior support. Individual support: 1-7% respond to intensive academic and behavior support.
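The percentage bands above translate directly into planning numbers. A quick sketch for a hypothetical school of 500 students (the enrollment figure is an assumption; the bands come from the slide):

```python
# Expected student counts per tier for a hypothetical enrollment of 500.
enrollment = 500
tiers = {
    "schoolwide (core) support": (0.80, 0.90),
    "group support": (0.05, 0.15),
    "individual support": (0.01, 0.07),
}

for tier, (low, high) in tiers.items():
    print(f"{tier}: {round(low * enrollment)}-{round(high * enrollment)} students")
```

Rough counts like these help gauge whether a program's group and individual supports are staffed for a plausible caseload.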

  22. Key elements: Systems: administrative commitment, priority for staff, a representative team, an audit of practices, an action plan, a data system, and internal and external coaching. Practices: based on evidence. Data: process and impact; what, and with whom?

  23. Schoolwide supports: Identify the expectations of the setting; evaluate implementation of the core curriculum; develop a team, plan, and supports; directly teach expectations; apply consistent consequences; acknowledge expected behavior; collect data on process, academics, and behavior; communicate with staff; and evaluate on an ongoing basis.

  24. Questions to ask: Does your current system provide information that is efficient, reliable, dependable, and user-friendly? Who needs to see your data? Who are your stakeholders? What is your timeline?

  25. Resources: Visit our center website for more information: http://www.luc.edu/cseit/learning.shtml. A useful example of reports: http://www.pbisillinois.org/. An example of coaching and data collection: http://flpbs.fmhi.usf.edu/coachescorner.asp.

  26. Thanks!
