
Using Individual Project and Program Evaluations to Improve the Part D Programs



Presentation Transcript


  1. Using Individual Project and Program Evaluations to Improve the Part D Programs Dr. Herbert M. Baum

  2. Session Theme # 3 Part D programs have their own priorities for evidence, which project directors need to support

  3. Objectives By the end of this presentation you will: • Know the difference between program measurement and program management • Understand how to use logic models for developing program measures • See examples of how current program information is being used to enhance the Part D Programs.

  4. The trickle up approach (1) • Each OSEP/RTP Program has a logic model • Each OSEP/RTP Program has an approved series of performance measures. • Each OSEP/RTP Program funds projects

  5. The trickle up approach (2) Number and Type of Measures by OSEP/RTP Program

  6. The trickle up approach (3) • How does each OSEP/RTP Program know that it is meeting its targets? • Your projects need to provide that data. • How does each OSEP/RTP Program use the data to manage? • Keeping project officers informed of where the Program is relative to the target. • Using the measures to direct new grant applications to address how they will help OSEP/RTP meet their targets. • By meeting targets, OSEP/RTP is in a stronger position to ask for additional funding.

  7. Performance Measurement

  8. Performance Management

  9. Evaluation in Performance Management

  10. Performance Management Steps • Assess – review program purpose and design • Plan – set strategic performance targets • Measure – measure program performance • Analyze – evaluate program results • Improve – implement program enhancements • Sustain – manage program effectiveness

  11. Evaluation in the Context of PART • The Program Assessment Rating Tool (PART) was developed to assess the effectiveness of federal programs and to help inform management actions, budget requests, and legislative proposals directed at achieving results. • PART assesses if and how program evaluation is used to inform program planning and to corroborate program results. • The four sections of a PART review: • Program purpose and design (20%) • Strategic planning (10%) • Program management (20%) • Program results (50%)

  12. Evaluation and PART (2) • Question 2.6 asks, “…[a]re independent evaluations of sufficient scope and quality conducted on a regular basis, or as needed to support program improvements, and evaluate effectiveness and relevance to the problem, interest, or need?” • Question 4.5 asks if “independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results.”

  13. What is a logic model? “The underlying rationale for the evaluand's design, usually an explanation of why the various components of the program (for example) have been created and what (and how) each of them is supposed to contribute towards achieving the desired outcomes. Some logic models include environmental factors, some do not. Note that we are talking about the alleged ‘theory of operation,’ and the evaluation may discover considerable discrepancies between this (the view of the designers and possibly also the managers of the program) and the views of the service deliverers, who are the hands-on staff engaged in dealing with the recipients of service or product.”

  14. Simplified Logic Model • Inputs: What the program needs to accomplish its outcomes. • Activities: What programs do to accomplish their outcomes. • Outputs: What programs produce to accomplish their outcomes. • Outcomes: What changes the program expects based on its inputs, activities, and outputs [short-term, intermediate, and long-term (impact)].
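(Not part of the original slides.) The four-component logic model above can be sketched as a small data structure, which makes explicit how performance measures hang off the outcomes. All names and example values here are hypothetical illustrations, not OSEP data:

```python
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    """Minimal logic-model sketch: inputs -> activities -> outputs -> outcomes."""
    inputs: list = field(default_factory=list)       # what the program needs
    activities: list = field(default_factory=list)   # what the program does
    outputs: list = field(default_factory=list)      # what the program produces
    outcomes: dict = field(default_factory=dict)     # expected changes, keyed by time horizon


model = LogicModel(
    inputs=["funding", "project officers"],
    activities=["train personnel", "monitor grants"],
    outputs=["grantees", "trained scholars"],
    outcomes={
        "short-term": ["increased supply of fully qualified personnel"],
        "long-term": ["increased retention of fully qualified personnel"],
    },
)


def percent(numerator: int, denominator: int) -> float:
    """A performance measure is typically a percentage checked against a target,
    e.g. 'percentage of projects incorporating evidence-based practices'."""
    return 100.0 * numerator / denominator


print(percent(18, 24))  # e.g. 18 of 24 projects -> 75.0
```

The point of the sketch is only that measures are derived from the outcomes column of the model, not from inputs or activities alone.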

  15. OSEP Personnel Development Program – Logic Model A blueprint to enhance understanding of the Program. Goal: To improve results for children with disabilities and their families. [Diagram, summarized:] • CONTEXT: Federal law & regs; project officers; evidence-based & best practices; research; program & grants policy; technology. • INPUTS (Program Investments): Time; funding. • ACTIVITIES: Develop priorities & manage competitions; monitor grants; train personnel; redesign & build models & networks for collaboration; develop and disseminate resources. • OUTPUTS (Participation): Grantees; faculty; students in IHEs; SEAs & LEAs; lead agencies; practitioners; administrators; children; families. • SHORT-TERM OUTCOMES: Increased supply of fully qualified personnel* with awareness and knowledge of EBP & best practices; increased collaboration among SEAs, IHEs, LEAs & lead agencies; increased training opportunities. • INTERMEDIATE AND LONG-TERM OUTCOMES: Increased placement of fully qualified* personnel; improved personnel development infrastructures; increased retention of fully qualified* personnel in the workforce (schools & programs, educational & lead agencies, & IHEs). Process measures track activities and outputs; outcome measures track outcomes. *Fully Qualified = Highly Qualified for special education teachers; Qualified for paraprofessionals/aides; Fully Certified for administrators/coordinators, for related or supportive services in a school setting, or for teachers, related services, or supportive services in early intervention/early childhood.

  16. Measure 1.1 (Annual) The percentage of Special Education Personnel Preparation projects that incorporate evidence-based practices in the curriculum. Measure 1.2 (Long-Term) The percentage of scholars completing Special Education Personnel Preparation funded training programs who are knowledgeable and skilled in evidence-based practices for infants, toddlers, children, and youth with disabilities.

  17. Measure 2.1 (Annual) The percentage of Special Education Personnel Preparation funded scholars who exit training programs prior to completion due to poor academic performance.

  18. Measure 2.2 (Long-Term) The percentage of low incidence positions that are filled by personnel who are fully qualified under IDEA. Measure 2.4 (Annual) The percentage of Special Education Personnel Preparation funded degree/certification recipients who are working in the area(s) for which they were trained upon program completion and who are fully qualified under IDEA.

  19. Measure 2.5 (Long-Term) The percentage of degree/certification recipients who maintain employment for 3 or more years in the area(s) for which they were trained and who are fully qualified under IDEA.

  20. Example from the Personnel Development Program Measure 1.1 of 2 (Annual) The percentage of Special Education Personnel Preparation projects that incorporate evidence-based practices in the curriculum.

  21. Example from the TA&D Program Measure 1.1 of 1 (Long-term) The percentage of school districts and service agencies receiving Special Education Technical Assistance and Dissemination services regarding scientifically- or evidence-based practices for infants, toddlers, children, and youth with disabilities that implement those practices.

  22. Questions Contact information: Herbert M. Baum, Ph.D. ICF Macro 301-572-0816 Herbert.M.Baum@macrointernational.com
