
Implementation Drivers

Implementation Drivers. Allison Metz, Ph.D., Karen Blase, Ph.D., Dean L. Fixsen, Ph.D., Rob Horner, Ph.D., George Sugai, Ph.D. Frank Porter Graham Child Development Institute University of North Carolina – Chapel Hill. SPDG Program Meeting Project Directors Conference


Presentation Transcript


  1. Implementation Drivers Allison Metz, Ph.D., Karen Blase, Ph.D., Dean L. Fixsen, Ph.D., Rob Horner, Ph.D., George Sugai, Ph.D. Frank Porter Graham Child Development Institute University of North Carolina – Chapel Hill SPDG Program Meeting Project Directors Conference The OSEP TA Center on State Implementation and Scaling-up of Evidence-Based Practices (SISEP) July 20, 2009

  2. What are Implementation Drivers?
  • Implementation Drivers are mechanisms that help to develop, improve, and sustain one's ability to implement an intervention to benefit students (Competency Drivers).
  • Implementation Drivers are mechanisms to create and sustain hospitable organizational and systems environments for effective educational services (Organization Drivers).

  3. [Diagram: Competency Drivers (Selection, Training, Coaching, Performance Assessment/Fidelity) viewed through an Implementation Lens, leading to Student Benefits. © Fixsen & Blase, 2008]

  4. Recruitment and Selection
  Purposes:
  • Select for the "unteachables"
  • Screen for prerequisites
  • Make expectations explicit
  • Allow for mutual selection through the interview process
  • Improve the likelihood of retention after "investment"
  • Improve the likelihood that training, coaching, and supervision will result in implementation

  5. Recruitment and Selection
  Implementation Best Practices:
  • Job or role descriptions are clear about accountability and expectations
  • Sampling of skills and experience is related to the "new practices" and expectations
  • Interactive interview process
  • Skilled interviewers
  • Using data for integration and compensatory features:
    • Feed interview information forward to trainers, coaches, and school administrators

  6. Pre-Service and In-Service Training
  Purposes:
  • Knowledge acquisition
  • Basic skill development
  • "Buy-in"
  Implementation Best Practices:
  • Theory grounded (adult learning)
  • Skill-based
  • Behavior rehearsals vs. role plays
  • Knowledgeable feedback providers
  • Practice to criteria
  • Feedback to Selection and feed-forward to Supervision
  • Data-based (pre- and post-testing)

  7. Supervision and Coaching
  Purposes:
  • Ensure implementation
  • Develop good judgment
  • Ensure fidelity
  • Provide feedback to selection and training processes

  8. Coaching
  Implementation Best Practices:
  • Design a Coaching Service Delivery Plan
  • Develop accountability structures for coaching: coach the coach!
  • Regular satisfaction feedback from employees and volunteers
  • Regular review of adherence to the Coaching Service Delivery Plan
  • Look at data: fidelity, teacher satisfaction with support, skill acquisition

  9. [Diagram: Competency Drivers (Selection, Training, Coaching, Performance Assessment/Fidelity) and Organization Drivers viewed through an Implementation Lens, leading to Student Benefits. © Fixsen & Blase, 2008]

  10. Performance Assessment
  Purposes:
  • Ensure implementation
  • Reinforce teachers and build on strengths
  • Develop skills and abilities
  • Measure fidelity
  • Interpret outcome data
  • Feedback to schools, district(s), and Implementation Teams on the functioning of:
    • Recruitment and Selection practices
    • Training programs (pre- and in-service)
    • Supervision and Coaching systems

  11. Performance Assessment
  Implementation Best Practices:
  • Transparent processes: orientation to the what, when, how, and why
  • Use of multiple data sources:
    • Content
    • Competency
    • Context
  • Tied to positive recognition, not used "punitively"

  12. [Diagram: Competency Drivers (Selection, Training, Coaching, Performance Assessment/Fidelity) and Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention), leading to Student Benefits. © Fixsen & Blase, 2008]

  13. Decision Support Data Systems
  Purposes:
  • Provide information to assess the effectiveness of new educational practices and strategies
  • Guide further program and practice development
  • Celebrate success
  • Engage in continuous quality improvement
  • Be accountable for quality infrastructure (are the Drivers "working"?) and for outcomes

  14. Decision Support Data Systems
  Implementation Best Practices:
  • Includes intermediate and longer-term outcome measures
  • Includes process measures (fidelity)
  • Measures are "socially important"
  • Useful data are:
    • Reliable (standardized protocols, trained data gatherers)
    • Reported frequently (e.g., weekly, quarterly)
    • Reported at relevant and "actionable" levels (e.g., student, classroom, school)
    • Widely shared
    • Practical to collect
    • Useful for and used for making decisions (PDSA)
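The "actionable levels" idea above (fidelity data rolled up from student to classroom to school so each audience can act on it) can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not part of the NIRN framework: the school names, scores, and field layout are invented for the example.

```python
# Hypothetical sketch of "actionable level" reporting for a decision
# support data system: fidelity scores aggregated from the student
# level up to classroom and school levels. All data are invented
# for illustration.

from collections import defaultdict

# Each record: (school, classroom, student, fidelity_score 0-100)
records = [
    ("Lincoln", "3A", "s1", 90),
    ("Lincoln", "3A", "s2", 70),
    ("Lincoln", "4B", "s3", 85),
    ("Adams",   "2C", "s4", 60),
]

def rollup(records):
    """Average fidelity at the classroom and school levels."""
    by_class = defaultdict(list)
    by_school = defaultdict(list)
    for school, classroom, _student, score in records:
        by_class[(school, classroom)].append(score)
        by_school[school].append(score)
    avg = lambda xs: sum(xs) / len(xs)
    return (
        {k: avg(v) for k, v in by_class.items()},
        {k: avg(v) for k, v in by_school.items()},
    )

class_avg, school_avg = rollup(records)
print(class_avg[("Lincoln", "3A")])  # 80.0
print(school_avg["Adams"])           # 60.0
```

In a real system the same roll-up would run on a regular cadence (weekly or quarterly, as the slide suggests) and feed a PDSA review at each level.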

  15. Facilitative Administrative Supports
  There is no such thing as a purely administrative decision; they are all decisions about quality education!
  Purposes:
  • Facilitates moving through implementation stages
  • Ensures effective use of the Competency Drivers
  • Takes the lead on Systems Interventions
  • Utilizes data for improvement
  • Looks for ways to make the work of teachers and staff easier and more effective

  16. Facilitative Administrative Supports
  Implementation Best Practices:
  • An Implementation Team (e.g., school or district leadership team) is formed and functional
  • Uses feedback to make changes in the Implementation Drivers
  • Revises policies and procedures to support the new way of work
  • Solicits and uses feedback from teachers and staff
  • Reduces administrative barriers

  17. Systems Intervention
  Purposes:
  • Identify barriers and facilitators for the new way of work
  • Create a "hospitable" environment for the new way of work
  • Contribute to cumulative learning in multi-site projects

  18. Systems Intervention
  Implementation Best Practices:
  • Match the leadership level needed to intervene
  • Engage and grow "champions" and "opinion leaders"
  • Objectively document barriers
  • Establish formal PEP-PIP cycles
  • Use Transformation Zones to:
    • Identify systems issues
    • Create time-limited, barrier-busting processes
    • Make constructive recommendations and assist in implementing and evaluating them (PDSA)

  19. Integrated and Compensatory
  Integrated:
  • Consistency in philosophy, goals, knowledge, and skills across these processes
  Compensatory:
  • Assessment of weaknesses and strengths in Driver functioning
  • Installation of Drivers at multiple levels of the system (teacher, school, district, etc.)

  20. For More Information
  Dean L. Fixsen, Ph.D. | 919-966-3892 | fixsen@mail.fpg.unc.edu
  Karen A. Blase, Ph.D. | 919-966-9050 | blase@mail.fpg.unc.edu
  State Implementation and Scaling-up of Evidence-Based Practices (SISEP)
  National Implementation Research Network
  www.scalingup.org
  http://nirn.fpg.unc.edu

  21. Thank You
  We thank the following for their support:
  • Annie E. Casey Foundation (EBPs and cultural competence)
  • William T. Grant Foundation (implementation literature review)
  • Substance Abuse and Mental Health Services Administration (implementation strategies grants; NREPP reviews; SOC analyses of implementation; national implementation awards)
  • Centers for Disease Control & Prevention (implementation research contract)
  • National Institute of Mental Health (research and training grants)
  • Office of Juvenile Justice and Delinquency Prevention (program development and evaluation grants)
  • Office of Special Education Programs (Capacity Development Center contract)
  • Administration for Children and Families (Child Welfare Leadership Development contract)
