
Some Considerations for Developing, Implementing, & Sustaining Clinical Experiences

This article discusses the importance of clinical experiences in improving student outcomes and highlights the use of evidence-based practice and implementation-science principles for selecting and teaching effective interventions. It also examines the training components for preparing candidates to use those practices and offers insights on sustaining change in education systems.


Presentation Transcript


  1. Some Considerations for Developing, Implementing, & Sustaining Clinical Experiences Larry Maheady, PhD Exceptional Education Department SUNY Buffalo State maheadlj@buffalostate.edu June 28, 2017

  2. Some Important Considerations • Function of clinical experiences • Increased emphasis on improving “student outcomes” as basis for clinical experiences & partnerships • Use of EBP as a decision-making process for deciding which practices to use and how to teach candidates to use them fluently • Use of implementation science principles to sustain change

  3. Shared Vision for Clinical Experiences

  4. Using Evidence-Based Practice as a Decision-Making Process in Selection & Monitoring of Effects

  5. Two Ways to Think about Evidence-Based Practice • A practice/intervention that has met some evidentiary standard, AND • A broader framework for decision-making (Detrich, 2015)

  6. Evidence-Based Practice as a Decision-Making Process • EBP is a decision-making approach that places emphasis on evidence to: • guide decisions about which interventions to use; • evaluate the effects of any intervention. [Venn diagrams: decisions sit at the intersection of best available evidence, professional wisdom, and student needs (Detrich, 2008; Sackett et al., 2000)]

  7. Identify the most important practices • Should have rigorous empirical support • Relevant to "high priority" instructional needs • Used frequently in classrooms (high incidence) • Broadly applicable across content areas (high prevalence)

  8. Challenge for Practitioners • Solve a specific problem for specific students in a specific context • Research can vary in strength (weak to strong) & can be more or less relevant • Even with insufficient evidence, decisions must be made • The question is not "no evidence" but the "best available evidence" • Even if an EBP is implemented well, its impact must still be evaluated • We can't predict which students will benefit; no intervention will work for all students • Progress monitoring is practice-based evidence about the effects of an evidence-based practice (Detrich, 2013)

  9. Rigor × Relevance Decision Matrix • Strong rigor, high relevance: preferred outcome; implementation priority • Weak rigor, high relevance: try out & monitor effects • Strong rigor, low relevance: adapt & monitor effects • Weak rigor, low relevance: don't implement
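
Read as a decision rule, the matrix maps each rigor/relevance pairing to an action. Here is a minimal Python sketch of that rule; the function name and string labels are our own, for illustration only:

```python
# Hypothetical encoding of the rigor-by-relevance matrix above;
# illustrative only, not code from the presentation.

def rigor_relevance_decision(rigor: str, relevance: str) -> str:
    """Map a practice's evidentiary rigor and its relevance to an action."""
    decisions = {
        ("strong", "high"): "Preferred outcome; implementation priority",
        ("weak", "high"): "Try out; monitor effects",
        ("strong", "low"): "Adapt & monitor effects",
        ("weak", "low"): "Don't implement",
    }
    return decisions[(rigor.lower(), relevance.lower())]

print(rigor_relevance_decision("strong", "high"))
# -> Preferred outcome; implementation priority
```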

  10. Essential or Core Practices

  11. How should candidates be prepared to use practices?

  12. Training Components and Attainment of Outcomes in Terms of Percent of Participants (Joyce & Showers, 2002) [Table not reproduced in the transcript: training components (theory, demonstration, practice, coaching) versus the percent of participants attaining each outcome]

  13. Continuum of Options for Developing Practice • Any of these options may be useful for improving practice; some are much more useful than others • From low impact/low effort to high impact/high effort: case-study instruction & application papers; in-class simulations, inter- & micro-teaching; early clinical experiences & tutoring programs; student teaching; clinical year & coaching

  14. Impact × Effort Decision Matrix • High impact, low effort: preferred outcome; implementation priority • Low impact, low effort: try out & monitor effects • High impact, high effort: positive outcome; conduct a cost-benefit analysis • Low impact, high effort: don't implement
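
The same pattern applies here. A parallel sketch of the impact/effort rule, again hypothetical:

```python
# Hypothetical encoding of the impact-by-effort matrix above,
# mirroring the rigor/relevance rule sketched earlier.

def impact_effort_decision(impact: str, effort: str) -> str:
    """Map an option's impact on practice and required effort to an action."""
    decisions = {
        ("high", "low"): "Preferred outcome; implementation priority",
        ("low", "low"): "Try out; monitor effects",
        ("high", "high"): "Positive outcome; run a cost-benefit analysis",
        ("low", "high"): "Don't implement",
    }
    return decisions[(impact.lower(), effort.lower())]

print(impact_effort_decision("high", "high"))
# -> Positive outcome; run a cost-benefit analysis
```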

  15. How can we sustain changes?

  16. A Systems Perspective • Students do well when teachers (candidates & cooperating) do well • Teachers do well when they are effectively supported at the building and university levels • Principals/EPPs are effective when they are supported by the district & university • Districts/universities perform well when they are supported by the State Education Agency • This requires an aligned system, with data about both student performance and implementation

  17. Takeaways • Improving P-12 student outcomes should drive decision-making at all levels (classroom, building, district, & university) • Evidence-based practice (EBP) is systematic decision-making that can improve professional practice at all levels • EPPs & partners should be guided by principles of implementation science • Avoid politics of distraction (e.g., class size, vouchers, charter schools, grade retention, summer school)

  18. [Figure: practices plotted by empirical support and relevance]

  19. Effect Sizes for Selected Practices • Formative evaluation = .90 • Feedback = .73 • Spaced practice = .71

  20. Effectiveness Cost Ratio = Effect Size/Cost Per Student
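
As a worked illustration of this ratio, here is a short Python sketch; the intervention names, effect sizes, and per-student costs are hypothetical placeholders, not figures from the presentation:

```python
# Minimal sketch of the effectiveness-cost ratio above.
# All interventions, effect sizes, and costs below are hypothetical.

def effectiveness_cost_ratio(effect_size: float, cost_per_student: float) -> float:
    """Effect size divided by cost per student (higher is better)."""
    return effect_size / cost_per_student

# Hypothetical candidates: (effect size, annual cost per student in dollars)
interventions = {
    "peer tutoring": (0.55, 20.0),
    "class-size reduction": (0.20, 800.0),
}

# Rank candidates by how much effect each dollar buys.
ranked = sorted(
    interventions.items(),
    key=lambda item: effectiveness_cost_ratio(*item[1]),
    reverse=True,
)
for name, (es, cost) in ranked:
    print(f"{name}: ECR = {effectiveness_cost_ratio(es, cost):.4f}")
```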

  21. [Figure: options plotted by impact on practice and effort]

  22. Consider Interteaching

  23. Interteaching Process • Provide students with a preparation guide in advance • Students work in pairs or small groups for 1/2 to 2/3 of class time • Students complete interteaching record forms • Instructor reviews the records & develops clarifying lectures

  24. Nature of the Problem • In education, innovations come and go within 18-48 months (Latham, 1988). • Adelman & Taylor (2003): "Optimally, sustainability should be a focus from the day a project is implemented. With most projects, the pressure of just becoming operational often postpones such a focus until well into the 2nd year."

  25. Excellent Evidence for What Doesn't Work • Disseminating information alone (research findings, mailings, & practice guidelines) • Training alone, no matter how well done, does not lead to successful implementation • Implementation by edict/accountability • Implementation by "following the money" • Implementation without changing support roles & responsibilities (Fixsen et al., 2008)

  26. Why Such a Short Life Span? • High effort: the innovation is more difficult than expected; causes too much change; takes too much time • Poor system design: supporters leave; personnel lack training; external funds run out; inadequate supervision; no accountability; no consequences for early termination

  27. Even Well-Tested Programs Fail to Sustain • Elliott & Mihalic (2004) reviewed replications of Blueprint Model Programs (violence- and drug-prevention programs) in community settings • Critical elements in site readiness: • A well-connected local champion • Strong administrative support • Solid training: adhere to requirements for training, skills, and education; hire all staff before scheduling training • Good technical assistance • Fidelity monitoring • Some measure of sustainability

  28. Cultural Analysis and Sustainability • Diffusion of Innovations (Rogers, 2003) • Diffusion is a kind of social change, defined as the process by which alteration occurs in the structure and function of a social system. When new ideas are invented, diffused, and adopted or rejected, leading to certain consequences, social change occurs. • Diffusion of innovation is a social process, even more than a technical matter. • The adoption rate of an innovation is a function of its compatibility with the values, beliefs, and past experiences of the individuals in the social system.

  29. Cultural Analysis and Sustainability • Harris (1979): practices are adopted and maintained to the extent that they have favorable, fundamental outcomes at a lower cost than alternative practices. • Fundamental outcomes are subsistence and survival.

  30. Important Funding Outcomes for Cultural Institutions • Schools: average daily attendance • Schools: unit cost for a classroom • Special education: # of students identified • Special education services are often specified as # of minutes per session or # of sessions per week • Mental health services: # of clients seen per unit of time • These all represent process measures rather than outcome measures

  31. Implications of Current Measures • If the key outcome is survival of the cultural practice, then innovations in service must accomplish these outcomes at a much lower cost than current practice. • Nothing in the current unit of analysis specifies effectiveness as a critical dimension of the practice.

  32. How Can We Increase Sustainability of Practices? • When developing innovative practices, demonstrate how they address basic funding outcomes for schools. • Monitor performance outcomes: even though these are not directly tied to fundamental outcomes, the larger culture expects schools to educate students in a safe environment. • Find champions who are part of the system: the champion should control important reinforcers for others within the system and needs to plan on "sticking around."

  33. How Can We Increase Sustainability of Practices? • Provide proactive technical assistance: help solve the real problems of implementation. • Monitor the integrity of implementation: without monitoring, the system is likely to drift back to previous practices. • Anticipate 3-5 years before the system is fully operational; this emphasizes the need to plan for multigenerational support. • Use external funding and support with extreme caution.

  34. QUESTIONS?

  35. What considerations should participants think about to create coordinated & comprehensive fieldwork in their own contexts?
