Evaluating the Performance of Specialized Professionals in a MTSS AMM - September 11, 2012
Evaluating Specialized Personnel in a Multi-tiered System of Supports (MTSS) • Student Support Services Project, USF • Bureau of Exceptional Education and Student Services (BEESS) • Bureau of Educator Recruitment, Development, and Retention
Overview • Florida’s New Evaluation System • Student Services Role in a Multi-tiered Support System • Overview of Student Services Personnel Evaluation Model (SSPEM) • Update on Development of ESE Evaluation Model for Specialized Personnel
Link to Resources http://sss.usf.edu/resources/format/presentations/2012/amm.html
Florida’s New Evaluation System Eileen McDaniel, Chief, Bureau of Educator Recruitment, Development, and Retention
Purpose for Personnel Evaluations As set forth in the Student Success Act and Race to the Top, teacher evaluations are: • Designed to support effective instruction and student learning growth. • Used in developing district and school-level improvement plans. • Used to identify professional development needs and to inform other human capital decisions for instructional personnel and school administrators.
Purpose for Personnel Evaluations • Evaluations must differentiate among 4 levels of performance: • Highly effective • Effective • Needs improvement, or for instructional personnel in first 3 years of employment, Developing • Unsatisfactory • State Board of Education must establish student growth standards for each performance level (no date required). • Commissioner must consult with experts, instructional personnel, school administrators and education stakeholders in developing the criteria for the performance levels.
Purpose for Personnel Evaluations To support those objectives, the law also sets forth that teacher evaluations are to be based on sound educational principles and contemporary research in effective practices in three major areas: • The performance of students • Instructional practice • Professional and job responsibilities
Two Major Components of the Evaluation System Instructional Practice is measured by the District’s Instructional Practice Framework
Instructional Practice Section 1012.34, F.S., requires that instructional practice evaluate the following: • For classroom teachers, excluding substitutes: • Florida Educator Accomplished Practices (FEAPs) • For instructional personnel who are not classroom teachers: • FEAPs • May include specific job expectations related to student support Instructional Framework goal: an expectation that all teachers can increase their expertise from year to year, producing gains in student achievement year after year with a powerful cumulative effect
Two Major Components of the Evaluation System Performance of Students is focused primarily on student learning growth • Instructional Practice: 50% • Performance of Students: 50%
Performance of Students Performance of Students. At least 50% of a performance evaluation must be based upon data and indicators of student learning growth assessed annually and measured by statewide assessments or, for subjects and grade levels not measured by statewide assessments, by district assessments as provided in s. 1008.22(8), F.S. • Section 1012.34(3)(a)1., Florida Statutes
Performance of Students For subjects and grades not assessed by statewide assessments: • By 2014-15, districts shall measure growth using equally appropriate formulas. DOE shall provide models. • Allows districts to request, through the evaluation system review process, to: • Use student achievement, rather than growth, or a combination of growth and achievement, for classroom teachers where achievement is more appropriate; • For courses measured by district assessments, include growth on FCAT Reading and/or Mathematics as part of a teacher’s growth measure, with a rationale. In this instance, growth on the district assessment must receive the greater weight.
Performance of Students Section 1012.34(7)(e) • For classroom teachers of courses for which there are no appropriate assessments under s. 1008.22(8), F.S., and the district has not adopted growth measures: • Student growth must be measured by using results of assigned students on statewide assessments, OR • If the teacher’s assigned students do not take statewide assessments, by established learning targets approved by the principal that support the school improvement plan. • The superintendent may assign to instructional personnel in an instructional team the growth of the team’s students on statewide assessments. • These provisions expire July 1, 2015.
Performance of Students • The performance of students represents 50% of a teacher’s evaluation, with performance based on student learning growth • Growth data for 3 years of students assigned to the teacher. • If less than 3 years of data are available, years for which data are available must be used, and percentage of evaluation based on growth may be reduced to not less than 40%. • To meet the above requirement, the development of a fair and transparent measure of student growth is essential.
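A minimal sketch of how the two components might be combined under these weighting rules. The function name, the 0-100 score scale, and the example values are illustrative assumptions, not the statutory formula; actual scales, weights above the statutory minimums, and cut points are set by each district.

```python
def summative_score(practice_score, growth_score, years_of_growth_data):
    """Combine the two evaluation components into one score (illustrative only)."""
    # At least 50% of the evaluation is based on student growth; with fewer
    # than 3 years of growth data, the growth weight may be reduced, but
    # not below 40%.
    growth_weight = 0.50 if years_of_growth_data >= 3 else 0.40
    practice_weight = 1.0 - growth_weight
    return growth_weight * growth_score + practice_weight * practice_score


# Example: only two years of growth data are available for this teacher.
print(summative_score(practice_score=80, growth_score=60, years_of_growth_data=2))
# 0.40 * 60 + 0.60 * 80 = 72.0
```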
Florida’s Value Added Model Overview of the Model to Measure Student Learning Growth on FCAT as developed by the Student Growth Implementation Committee
The Measure: Value-Added Analysis • A value-added model measures the impact of a teacher on student learning by accounting for other factors that may affect the learning process. • These models do not: • Evaluate teachers based on a single year of student performance or proficiency (status model). • Evaluate teachers based on a simple comparison of growth from one year to the next (simple growth).
Value-Added Example The predicted performance represents the level of performance the student is expected to demonstrate after statistically accounting for other factors through the value-added model. The difference between the student’s actual performance and this predicted performance represents the value added by the teacher’s instruction.
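A minimal sketch of that definition, assuming illustrative actual and predicted scores; the model that produces the predictions is described in the following slides, and none of the numbers here are real results.

```python
# A teacher's value-added score is the average gap between what students
# actually scored and what the model predicted for them.
actual_scores    = [315, 290, 342]   # observed developmental scale scores (illustrative)
predicted_scores = [300, 295, 330]   # model predictions given prior scores, etc.

residuals = [a - p for a, p in zip(actual_scores, predicted_scores)]
value_added = sum(residuals) / len(residuals)   # average across the teacher's students
print(value_added)   # positive means students outperformed expectations
```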
Advantages of Value-Added Models • Teachers teach classes of students who enter with different levels of proficiency and possibly different student characteristics. • Value-added models “level the playing field” by accounting for differences in the proficiency and characteristics of students assigned to teachers. • Value-added models are designed to mitigate the influence of differences among the entering classes so that schools and teachers do not have advantages or disadvantages simply as a result of the students who attend a school or are assigned to a class.
Florida’s Value-Added Model Developed by Florida Educators • Florida’s VAM begins by establishing expected growth for each student: • Based on historical data each year. • Represents the typical growth seen among students who earned similar test scores over the past two years and share the other characteristics identified by the committee.
Factors Identified by the SGIC to “Level the Playing Field” To isolate the impact of the teacher on student learning growth, the model developed by the SGIC and approved by the Commissioner accounts for: • Student Characteristics • Classroom Characteristics • School Characteristics
Factors Identified by the SGIC to “Level the Playing Field” • Student Characteristics • Up to two prior years of achievement scores (the strongest predictor of student growth) • The number of subject-relevant courses in which the student is enrolled • Students with Disabilities (SWD) status • English Language Learner (ELL) status • Gifted status • Attendance • Mobility (number of transitions) • Difference from modal age in grade (as an indicator of retention) • Classroom Characteristics • Class size • Homogeneity of students’ entering test scores in the class • School Component
Factors Identified by the SGIC to “Level the Playing Field” The model recognizes that there is an independent factor related to the school that impacts student learning – a school component. • Statistically, it is simply the factors already controlled for in the model, measured at the school level by grade and subject. • May represent the impact of the school’s leadership, the culture of the school, or the environment of the school on student learning. • Acts as another covariate, just like all other factors.
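The sketch below illustrates the general covariate-adjustment idea with an ordinary least-squares fit. The covariates, the values, and the use of a simple regression are illustrative assumptions only; Florida's actual model is more elaborate (for example, the multilevel school component described above).

```python
import numpy as np

# Regress the current-year score on prior scores and student/classroom
# covariates, and treat the residual as growth not explained by those factors.
X = np.array([
    # prior_score_1, prior_score_2, swd_status, class_size  (illustrative columns)
    [300, 280, 0, 22],
    [310, 305, 1, 22],
    [295, 290, 0, 25],
    [320, 310, 0, 25],
    [305, 300, 1, 20],
    [290, 285, 0, 20],
], dtype=float)
y = np.array([315, 320, 300, 335, 318, 298], dtype=float)  # current-year scores

X1 = np.column_stack([np.ones(len(X)), X])      # add an intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)   # fit the expected-score model
predicted = X1 @ beta
residuals = y - predicted                       # per-student unexplained growth
print(residuals.round(1))
```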
What does a teacher’s value-added score represent? • An estimate of a teacher’s impact on student learning, after accounting for other factors that may impact learning. • A score of “0” indicates that students performed no better or worse than expected based on the factors in the model. • A positive score indicates that students performed better than expected. • A negative score indicates that students performed worse than expected. • Individual teacher scores are expressed in terms of Developmental Scale Score (DSS) points.
What does a teacher’s value-added score represent? • If a teacher’s score is “20,” for example, what does a score of 20 points mean? • It means that students, on average, performed 20 DSS points higher than typical. • To account for differences in the FCAT vertical scale across grade levels, subject areas, and years, a teacher’s value-added scores are aggregated into one score, and then transformed into a proportion of an “average year’s growth.” • Proportion of an average year’s growth provides more context and helps describe the magnitude of the gain.
What does a teacher’s value-added score represent? • Thus, if the average amount of growth in a given grade, subject, and year is 40 scale score points, transforming a score of 20 points into a proportion yields a score of 0.50 (20 divided by 40). • Now, one can interpret the raw value-added score of 20 to say that on average students performed 50% higher than an average year’s growth. • These analyses using historical data (and those provided to school districts) use this metric of a proportion of an “average year’s growth.”
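The same arithmetic, written out as a short sketch using the values from the example above:

```python
# A raw value-added score of 20 DSS points, expressed as a proportion of an
# average year's growth of 40 points for that grade, subject, and year.
raw_value_added = 20.0        # teacher's aggregated score, in DSS points
average_years_growth = 40.0   # typical growth for that grade/subject/year

proportion = raw_value_added / average_years_growth
print(proportion)  # 0.50 -> students gained 50% more than an average year's growth
```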
Additional Data Provided through the VAM – Percent Meeting Expectations • In addition to the value-added score, the model also yields information on the number and percent of students who met their statistical performance expectations. • Though these data do not show how far students improved or declined, they do indicate how many students met their expectations. • These data are used in analyzing the disaggregated performance of student subgroups.
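A minimal sketch of that count, assuming illustrative actual and predicted scores:

```python
# Percent meeting expectations: the share of a teacher's students who scored
# at or above their model-predicted score. Values are made up for illustration.
actual    = [315, 290, 342, 301]
predicted = [300, 295, 330, 305]

met = sum(1 for a, p in zip(actual, predicted) if a >= p)
percent_met = 100.0 * met / len(actual)
print(f"{met} of {len(actual)} students ({percent_met:.0f}%) met expectations")
```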
Impact Analyses for the Preliminary 2011-12 VAM Scores • Analyses were run to determine the relationship of teachers’ VAM scores to various classroom characteristics, including the percentage of students with disabilities in a teacher’s classroom. • In all cases, the correlations were negligible, indicating no advantage or disadvantage in the VAM score for teachers based on the group of students they serve.
Correlation of Teacher VAM Score and Percent Students with Disabilities
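A minimal sketch of how such a correlation could be computed. The teacher-level values below are fabricated for illustration and are not the preliminary 2011-12 results.

```python
import numpy as np

# Correlate teachers' VAM scores with the percentage of students with
# disabilities in each classroom. A correlation near zero is consistent with
# the finding that the model neither advantages nor disadvantages teachers
# based on the students they serve.
vam_scores  = np.array([0.15, -0.05, 0.30, 0.02, -0.12, 0.08])
percent_swd = np.array([10.0, 25.0, 5.0, 40.0, 15.0, 30.0])

r = np.corrcoef(vam_scores, percent_swd)[0, 1]
print(f"correlation = {r:.2f}")
```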
Rule Development Activities • Department is currently in rule development on proposed rule 6A-5.0411, Calculations of Student Learning Growth Using Statewide Assessment Data for Use in School Personnel Evaluations • Six workshops were held across the state to gather input • May 21, Naples • May 22, Coconut Creek (Broward) • May 23, Orlando • June 5, St. Augustine • June 6, Pensacola • June 7, Conference Call
Rule Development Activities • This proposed rule focuses on the statutory requirement in Section 1012.34(7) and (8), Florida Statutes, requiring the State Board of Education to adopt rules which establish: • Each formula for measuring student growth that is approved by the Commissioner. • Specific, discrete standards for each performance level to ensure clear and sufficient differentiation in the performance levels and to provide consistency in meaning across school districts. • The measurement of student learning growth and associated implementation procedures.
Rule Development Activities • Seeking Feedback on: • What standard should be used to evaluate and classify teachers and administrators based on VAM data? • What levels of standard error should be applied in determining performance categories? • What circumstances must be satisfied in order for the specific requirements in s. 1012.34(8), F.S., to apply in affecting a summative rating when the standard for learning growth is not achieved?
Rule Development Activities - Timeline • Currently – Continue to gather input on proposed rule • To provide public input, go to https://app1.fldoe.org/rules/default.aspx and submit comment under 6A-5.0411 • September/October 2012 – Publish proposed rule text • No earlier than December 2012 – State Board action on the rule (subject to change) • Implement state standard for VAM results with the 2013-14 teacher evaluation results; Districts will continue to set their own standards for 2012-13
Student Services Role in a Multi-tiered Support System George Batsche, Project Director, Student Support Services Project, USF
MTSS • A Multi-Tiered System of Supports (MTSS) is a term used to describe an evidence-based model of schooling that uses data-based problem-solving to integrate academic and behavioral instruction and intervention. • The integrated instruction and intervention is delivered to students in varying intensities (multiple tiers) based on student need. • “Need-driven” decision-making seeks to ensure that district resources reach the appropriate students (schools) at the appropriate levels to accelerate the performance of ALL students to achieve and/or exceed proficiency.
Why Organize an Evaluation System Around an MTSS Model? • Research supports that an integrated (academic/behavior/social emotional) service delivery system has greater impact on student performance than separate systems • Services and personnel in schools already are organized by levels of intensity of service delivery • Tier 1—What everybody gets—typically general education teacher led • Tier 2—What “some” get—typically more intensive, smaller groups • Tier 3—What “few” get—typically most intensive, specialized
Why Organize an Evaluation System Around an MTSS Model? • Existing and proposed statutes, regulations and practices support a multi-tiered system • IDEIA • NCLB • Learn Act • Achievement Through Prevention (PBIS) Act (SB 541) • Evaluation systems require clear responsibility for levels of service delivery and “stakeholders” who are one focus of the evaluation process
Why Organize an Evaluation System Around an MTSS Model? • Instructional support staff of all types typically provide instruction/intervention at all levels (Tiers 1, 2, and 3) in a school and/or district • School-based research that identifies evidence-based practices is conducted at levels aligned with the Tiers • School-wide (e.g., PBIS, Crisis Prevention) • Classroom level (e.g., group procedures, instructional strategies) • Group level (e.g., academic instruction, social skills training) • Very Small Group/Individual (e.g., therapeutic, intense psychological skills training, academic skills)
Highly Effective Practices: Research • High quality academic instruction (e.g., content matched to student success level, frequent opportunity to respond, frequent feedback) by itself can reduce problem behavior (Filter & Horner, 2009; Preciado, Horner, Scott, & Baker, 2009; Sanford, 2006) • Implementation of school-wide positive behavior support leads to increased academic engaged time and enhanced academic outcomes (Algozzine & Algozzine, 2007; Horner et al., 2009; Lassen, Steele, & Sailor, 2006) • “Viewed as outcomes, achievement and behavior are related; viewed as causes of the other, achievement and behavior are unrelated.” (Algozzine et al., 2011) • Children who fall behind academically will be more likely to find academic work aversive and also find escape-maintained problem behaviors reinforcing (McIntosh, 2008; McIntosh, Sadler, & Brown, 2010)
School-wide Behavior & Reading Support The integration/combination of the two: • Are critical for school success • Utilize the three-tiered prevention model • Incorporate a team approach at school level, grade level, and individual level • Share the critical feature of data-based decision making • Produce larger gains in literacy skills than the reading-only model • (Stewart, Benner, Martella, & Marchand-Martella, 2007)
Student Services Role in an MTSS System • Academic performance of students (educator appraisal factor) is influenced significantly by social, emotional, and behavioral factors—the professional practices of student services personnel • Combining evidence-based instructional strategies with evidence-based strategies to enhance student engagement results in the most dramatic student gains (LESSON STUDY) • Enhancing student engagement (at all levels) is a primary role of student services personnel
Student Services Role in an MTSS System • The continued viability and importance of student services personnel are influenced strongly by the impact of their practices on student performance, particularly academic performance • Services provided by student services personnel have a strong, evidence-based relationship with student academic performance • A blueprint for a clear, explicit relationship between the provision of evidence-based student services practices and positive student outcomes is critical in the context of school accountability • Student services personnel must PLAN in such a way as to demonstrate ACCOUNTABILITY and COMMUNICATE those outcomes.
Overview of SSPEM Developing a State Model for Student Support Services Personnel Evaluations Gria Davison and David Wheeler, Student Support Services Project, USF/BEESS
Fundamental Principles • Fundamental Purpose: Improve academic & behavioral outcomes for students • Reflect a Multi-tiered System of Supports (MTSS) framework. • Align with evidence and research-based practices and professional standards linked to positive student outcomes. • Integrate common practice standards across student services professions. • Support professional growth and continuous improvement.
Fundamental Principles (cont.) Offer a state-approved evaluation framework that is dynamic (flexible and fluid), complies with the Student Success Act, and is available for districts to adopt, adapt, or use as a guide.
Developing the Model • Focus on “practices” component • Crosswalk Professional Practice Standards with FEAPs, Professional Competencies, & Teacher/Principal models • Identify • Domains of practice; Practices; Indicators for each practice (levels of performance/proficiency) • Research/evidence supporting practice • Develop an evaluation rubric • Vet model rubric with key stakeholders