Systematic Evaluation Model to Ensure the Integrity of MTSS Implementation in Florida 2011 FASP Conference November 4th, 2011 Kevin Stockslager Kelly Justice Beth Hardcastle
Advance Organizer • Accountability and Evaluation • MTSS and Program Evaluation in the Schools • Example of an MTSS Evaluation Model • Review of Potential Data Sources • Surveys • Self-Assessments • Permanent Product Reviews
What does… • Accountability mean to you? • Evaluation mean to you?
Accountability in Florida • Increasing focus on accountability over the last decade • Examples include: • School grading • AYP (Adequate Yearly Progress) • Special education rules • DA (Differentiated Accountability) • FEAPs (Florida Educator Accomplished Practices) & teacher evaluation systems
Impact of Accountability
Positives:
• Establishes and maintains standards for performance
• Reinforces use of data to monitor student outcomes
• Reinforces need to examine resource use
• Student outcome rather than process focus
• Success stories
Criticisms:
• Lack of educator involvement
• Controversy
• Consequence driven
• Compliance driven
• Conflicting requirements
• "Duck and cover" approach
Accountability & Evaluation Issues • Compliance driven versus informative evaluation • Evaluation often done to meet accountability requirements • Evaluation can serve to help integrate and improve school and district services • Evaluation is fundamental to MTSS • MTSS has the potential to: • Be viewed as one more thing we have to do OR • Help address accountability & evaluation demands through the multi-tier framework
Important MTSS Evaluation Issues • Stakeholders should be involved in all aspects of planning and carrying out the evaluation process, as well as in decision-making • Goals established through planning should drive the process • Information is gathered to: • Determine where you currently are (needs) • Take ongoing looks at how things are working • Make decisions about what to keep doing and what to change or eliminate
MTSS Evaluation Issues cont. • The data you collect should be driven by the evaluation questions you want to answer • Are students meeting expectations? Academically? Behaviorally? Social-emotionally? • Are we implementing MTSS with fidelity? • Do we have the capacity to implement successfully? • Do staff buy into implementing MTSS? *Example questions
Table Top Activity • Brainstorm and discuss some additional evaluation questions that you might want to answer at your schools • (2-3 minutes then report out)
How Are Students Performing? Examples of data sources • Academics • FCAT • FAIR • Core K-12 • End of Course Exams • Behavior • Attendance • Tardies • Suspensions • Discipline referrals • Global Outcomes • Graduation Rates
Are Schools Implementing MTSS with Fidelity? Examples of data sources • Curriculum and Instruction/Intervention • Principal walkthroughs • Lesson plans • Intervention Documentation Worksheets • Components of MTSS and Data-Based Problem-Solving* • BOQ, PIC, BAT • SAPSI, Tier I & II CCCs, Tier III CCCs * See http://flpbs.fmhi.usf.edu/ and http://floridarti.usf.edu for more information
Do We Have the Capacity to Implement MTSS with Fidelity? Examples of data sources • Leadership Team structure and functioning • Organizational charts • Minutes/meeting summaries • SAPSI, BOQ, PIC • Staff knowledge and skills • FEAPs & teacher evaluation system • Staff development evaluations • Work samples • Resources allocated to match needs • SIP, DIP • Master calendar/schedule • School rosters • Resource maps
Do Staff Buy Into Implementing MTSS? Examples of data sources • Leadership vision and commitment • SAPSI, BOQ, PIC • Required and non-required plans • Staff buy in • SAPSI, BOQ, PIC • District/school staff and climate surveys • Dialogue • Brief interviews with key personnel
Table Top Activity • Mock Small-Group Planning and Problem-Solving Process
Small-Group Planning and Problem-Solving Process • What is our desired goal? • Brainstorm the resources and barriers to achieving our goal • Select a barrier/group or related barriers to address first • Brainstorm strategies to reduce or eliminate our selected barrier • Develop an action plan to reduce or eliminate our selected barrier • Include who, what, when (Be specific!) • Develop a follow-up plan for each action • Include who, what, when • Develop a plan to evaluate the reduction or elimination of our chosen barrier • Develop a plan to evaluate progress towards achieving our goal from Step 1
Mock Small-Group Planning and Problem-Solving • Goal: Develop and implement a data-based evaluation system in my school and/or district • Brainstorm the resources and barriers to achieving our goal • Select a barrier/group or related barriers to address first • Brainstorm strategies to reduce or eliminate our selected barrier
Perceptions of RtI Skills Survey Assessing Perceptions of Skills Integral to PS/RtI Practices
Briefly… • Role of survey data • Beliefs Survey • Perceptions of Practices Survey
Perceptions of Skills The likelihood of embracing new practices increases when: • Educators understand the need for the practice • Educators perceive they either have the skills to implement the practice or will be supported in developing required skills (Showers, Joyce, & Bennett, 1987)
Description and Purpose Perceptions of RtI Skills Survey
Perceptions of Skills—Description and Purpose • Theoretical Background: • Assess educators’ perceptions of the skills they possess to implement PS/RtI (Problem Solving/Response to Intervention) • Understand perceptions of skills and how they change as a function of professional development, to facilitate PS/RtI implementation
Description of Survey • Assesses skills/amount of support needed for: • Applying PS/RtI practices to academic content • Applying PS/RtI practices to behavior content • Data manipulation and technology use • 20 items; 5-point Likert scale • 1 = I do not have the skill at all (NS) … 5 = I am highly skilled in this area and could teach others (VHS)
Purpose of Instrument Purpose of the Perceptions of RtI Skills Survey: • Assess impact of professional development • Identify “comfort level” with PS/RtI practices to inform PD and resource allocation
Administration Procedures & Scoring Perceptions of RtI Skills Survey
Administration Procedures: Intended Audience • Who should complete? • SBLT (School-Based Leadership Team) members • Instructional staff • Who should use results? • SBLTs • DBLTs (District-Based Leadership Teams)
Directions for Administration • Methods for administration/dissemination • Completed individually • Anonymity • Opportunity for questions • Role of school principal—explain the “why” • Role of RtI coach/coordinator/SBLT member • Frequency of use: resources, rationale, recommendations
Scoring Two techniques to analyze survey responses: • Mean rating for each item calculated to determine average perceived skill level • Frequency of each response option selected calculated for each item
Calculating Item Mean • Overall assessment of perceived skills of educators within a school/district • Can be done at the domain (factor) and/or individual item level • Domain level: examine patterns in perceived skills re: academic content, behavior content, data manipulation/technology use • Item level: identify specific skills staff perceive themselves as possessing vs. skills in need of support
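The item-level and domain-level calculations are simple to script. Below is a minimal sketch in Python (pandas); the item names, response values, and item-to-domain mapping are hypothetical placeholders, not the survey's actual export format or published factor key.

```python
import pandas as pd

# One row per respondent, one column per item; responses coded 1-5.
# Item names and values are illustrative, not actual survey data.
responses = pd.DataFrame({
    "item_01": [3, 4, 2, 5],
    "item_02": [2, 3, 3, 4],
    "item_03": [5, 4, 4, 5],
    "item_04": [4, 4, 3, 5],
})

# Hypothetical mapping of items to domains (factors); the real
# mapping comes from the survey's factor structure
domains = {
    "academic": ["item_01", "item_02"],
    "behavior": ["item_03", "item_04"],
}

# Item-level means: average perceived skill level for each item
item_means = responses.mean()

# Domain-level means: average each respondent's items within a
# domain, then average across respondents
domain_means = {
    name: responses[cols].mean(axis=1).mean()
    for name, cols in domains.items()
}
print(item_means, domain_means, sep="\n")
```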
Calculating Frequency of Response Options • Provides information on range of perceived skill levels • Can be used to determine what percentage of staff may require little, some, or high levels of support to implement PS/RtI • Informs professional development decisions
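A companion sketch for the frequency technique, again with illustrative data: it tallies how many respondents selected each of the five options per item and converts the counts to percentages, which is what feeds the little/some/high-support groupings described above.

```python
import pandas as pd

# Illustrative responses coded 1 (NS) through 5 (VHS)
responses = pd.DataFrame({
    "item_01": [3, 4, 2, 5, 4],
    "item_02": [2, 3, 3, 4, 1],
})

# Count how often each option was selected per item; reindex so
# options nobody chose still appear as zero
counts = responses.apply(
    lambda col: col.value_counts().reindex(range(1, 6), fill_value=0)
)
percentages = counts.div(len(responses)).mul(100)
print(percentages)
```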
Answering Evaluation Questions • Use data to inform evaluation questions • Use data to answer broad/specific questions • Align analysis and data display with evaluation questions • Consider available technology resources to facilitate analyses of data—online administration, automatic analysis, knowledge and skill of personnel
Technical Adequacy Perceptions of RtI Skills Survey
Technical Adequacy Content validity: • Item set developed to represent perceived skills important to implementing PS/RtI • Reviewed by Educator Expert Validation Panel (EEVP) Construct validity: • Factor analysis conducted using sample of 2,184 educators • Three resultant factors
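As a rough illustration of the construct-validity step, the sketch below fits a three-factor model to a simulated response matrix using scikit-learn's FactorAnalysis. The original analysis may well have used different software, extraction, and rotation choices; everything here, including the varimax rotation, is an assumption for demonstration only.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulated stand-in for the validation sample
# (2,184 respondents x 20 items); NOT the actual survey data
rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(2184, 20)).astype(float)

# Fit a three-factor model, mirroring the three resultant factors;
# varimax rotation is an assumed choice, not the survey's documented one
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(X)

# Loadings: rows are factors, columns are items; inspecting which
# items load on which factor is how the domains get interpreted
print(fa.components_.shape)  # (3, 20)
```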
Technical Adequacy (cont.) Internal Consistency Reliability: • Factor 1 (Perceptions of RtI skills applied to academic content): α = .97 • Factor 2 (Perceptions of RtI skills applied to behavior content): α = .97 • Factor 3 (Perceptions of Data Manipulation and Technology Use Skills): α = .94
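For reference, coefficient alpha can be computed directly from the item scores. A minimal sketch, assuming items is a respondents-by-items array for one factor; the data below is simulated to share a common signal so alpha comes out high, and is not the validation sample.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated items that share a common signal (not actual survey data)
rng = np.random.default_rng(0)
signal = rng.normal(size=(200, 1))
items = signal + rng.normal(scale=0.5, size=(200, 7))
print(round(cronbach_alpha(items), 2))
```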
Interpretation and Use of Data Perceptions of RtI Skills Survey
Interpretation & Use of Data • Three domains: • Perceptions of skills applied to academic content • Perceptions of skills applied to behavior content • Perceptions of data manipulation and technology use skills • Three methodologies: • Calculate mean at the domain level • Calculate mean at the item level • Calculate frequency/percentage of staff who selected each response option • Identify specific skills/skill sets for PS/support
Interpretation & Use of Data (cont.) • Sharing data with stakeholders: • DBLTs, SBLTs, instructional staff • Use data to: • Develop/adjust PD goals • Design training/coaching activities • Facilitate consensus-building discussions re: rationale for PD, patterns, barriers
Facilitating Discussions Sample guiding questions… • To what extent do you believe your school possesses the skills to use school-based data to evaluate core instruction (Tier 1)? Supplemental instruction (Tier 2)? • Based on what staff have learned about data-based decision-making, how consistent are those skills with PS/RtI practices (i.e., to what degree do teams evaluate the effectiveness of core and supplemental instruction)?