This webinar discusses the context of educator preparation programs in Massachusetts, including regulatory authority and review processes. It also highlights the moral imperative and driving belief behind educator preparation. The challenges and successes of building a strong review process are explored, along with lessons learned.
MA Educator Preparation
Mid-Atlantic Comprehensive Center Webinar
April 29, 2015
Ed Prep: MA Context
• 80 Sponsoring Organizations (SOs/EPPs)
  • IHEs & alternative providers; all expectations are the same
  • 13 SOs currently seek NCATE/TEAC accreditation
  • The largest producers complete 50% of candidates annually
• 1,800+ programs
  • e.g., Math 5-8 Initial, Post-Baccalaureate; Math 8-12 Initial, Baccalaureate
• Approximately 6,500 program completers annually
  • 65% employed in MA public schools
Ed Prep: MA Context
• Regulatory authority:
  • Approval is determined by the Commissioner of Elementary & Secondary Education (not the Board) and is tied to licensing authority
  • A single line of statute grants authority → a set of high-level regulations → the details sit in Guidelines
  • Separate from the Dept. of Higher Education & the Dept. of Early Education & Care
• Three types of review: Informal, Formal, Interim
  • Formal review = every 7 years
• The Prep Team (5 people) sits within the Center for Educator Effectiveness (Evaluation, Recognition Programs, Licensure, Induction, Professional Development)
Moral Imperative & Driving Belief
• Moral Imperative: All children in Massachusetts, especially the students who need them most, must have access to effective teachers and leaders.
• Driving Belief: Preparation programs CAN and SHOULD make educators ready on day one.
Reform Timeline
Program Approval Regulations revised June 2012, bringing: new approval standards, an "effective" requirement for supervising practitioners, new accountability levers, and increased practicum hours.
Before (leading up to June 2012):
• Two sets of pilot reviews (2009 & 2011) to aid in the development of new approval standards
• Draft regulations out for a period of public comment
Since:
• Moratorium on reviews for 2012-13 & 2013-14
• Built out the review process (rubrics, tools, guidance, etc.)
• Micro-pilots of the new process during the 2013-14 year
• Full implementation of the process & standards in 2014-15
New Program Approval Standards
• Communicate a shift / a raised bar
• Emphasis at the organization (unit) level
• Looking closely at K-12 partnerships and systems of continuous improvement
• Focus on outcomes and evidence of impact
• State-available data linkages (n=6):
  • Employment & retention
  • Evaluation ratings & student growth/impact
  • Survey results (candidate, 1-year-out completer, supervising practitioner & hiring employer)
The Challenge
• Building a review process that is:
  • Effective
  • Efficient
  • Consistently rigorous
Building a Strong Review Process
Goal: implement a process that provides a solid evidence base for decision-making.
• Evidence-based decision making for:
  • ESE
  • Sponsoring Organizations
• Understanding best practices
• An "informed researcher" perspective
Important Early Decisions
• Value human judgment
• Emphasize accountability at the organization level
• No longer review program-specific syllabi
• Summative evaluation
• ESE not the expert
• Descriptive of expectations, not prescriptive of approach
• Transparency/communication key
Review Process of Steel: Examples
• Recruitment, selection & training of reviewers
• Needs assessment for all dormant/low-enrollment programs
• Eval Tool
  • Evaluates evidence, not criteria
  • Triangulates evidence: offsite, onsite, and outputs
• New evidence collection methods
  • K-12 partner survey
  • Use of live-polling technology during focus groups
• Vetting panel before release
Promising Early Results
• 34% of programs up for review expired at the needs-assessment stage of the 2014-2015 Formal Review (135/393)
• A similar pattern observed in the 2015-2016 cohort
• Organizations have done things they have never done before (intentional conversations with PK-12 partners, data-driven goal setting, advisory councils of recent completers)
• More differentiation in ratings (exemplary, proficient, needs improvement, unsatisfactory) across and within organizations than we have been able to establish previously
• SOs and reviewers WANT to engage in the process
Greatest Successes
• Quality of our relationship with the field
• Recruiting PK-12 educators as reviewers
• Dialogue around evidence of impact
• Calibration/reliability of ratings & differentiation of results
Ongoing Challenges/Considerations
• Demand on reviewers
• Evidence-driven narratives from SOs
• Inclusion of data in the process
• Establishing the right rewards and consequences
• When/whether to provide examples or exemplars
• Weighting criteria/domains
• SEA human capital
Lessons Learned
• You do not need to set numeric benchmarks for outcome measures to be influential in the judgments being made.
• Reviewers need a mechanism and structure through which to make difficult, high-stakes decisions; otherwise they will avoid doing so.
• People are going to be wary of the unknown, no matter what you do. Creating "early allies" is the most effective way to mitigate anxiety from the field.
Advice
• Separate technical assistance from program evaluation
• Embed accountability within a larger framework of reform initiatives; explicitly draw connections
• Focus on continuous improvement
• Build a system that your state can grow into
• Walk the walk
Rubric / Eval Tool
Sample criteria:
• Admission criteria and processes are rigorous such that those admitted demonstrate success in the program and during employment in the licensure role.
• Admission criteria for post-baccalaureate candidates verify content knowledge upon entrance to the program.
Eval Tool: Annotated Components
• Criterion and Domain labels
• Overall decision indicating whether there is sufficient evidence in support of a criterion being met
• List of pertinent evidence sources to be referenced by reviewers
• Rating determination for evidence
• Box for the reviewer to provide a rationale explaining the rating
• Space for the review team to provide suggestions for improvement relative to the criterion
• Indication of whether an output demonstrates support for the criterion (+), contrasts with the criterion (-), or is inconclusive (?)
Review Criteria
• Descriptive of expectations, not prescriptive of approach
• Emphasize impact
• Examples:
  • Admission criteria: Admission criteria and processes are rigorous such that those admitted demonstrate success in the program and during employment in the licensure role.
  • Diversity criteria: Recruitment efforts yield a diverse candidate pool.
• Not weighted (at this point)
• Criteria are not rated; the evidence is
Worksheets
• Offsite submission
• Linked to criteria
• Streamlined, forcing clarity and the selection of the best evidence
• Manageable for reviewers
• A BIG shift
Key Components on All Worksheets
• Required Documents
• Prompts
• Optional Context
• Optional Additional Documents
Prompts Linked to Criteria
Sample criteria:
• Waiver policy ensures that academic and professional standards of the licensure role are met.
• Recruitment efforts yield a diverse candidate pool.
Reviewers
• Conducted exploratory analysis: other states, past reviewers, current organizations
• Raise the prestige of the role:
  • Cohort model
  • Intentional selection
  • Market teacher leadership
• Build a robust training model
  • Full-day training & ongoing calibration
  • Future online modules/calibration assessments
"I have worked with the state on many initiatives in my role as principal and curriculum coordinator in a regional district. I have to say this was one of the more clear, focused and effective trainings I have been involved in."
Needs Assessments
• Policy context
• Significant impact:
  • 12/30 new programs confirmed in the 2014 Informal Cycle
  • 34% of programs up for review expired at the needs-assessment stage of the 2014-2015 Formal Review (135/393)
• Set a precedent for the review
Evidence-Based Decision Making (flow)
Evidence base → findings & commendations → ratings (Exemplary, Proficient, Needs Improvement, Unsatisfactory) → approval determination (Approved, Approved with Conditions, Not Approved)
2012 Program Approval Standards
• Standard A: Continuous Improvement
• Standard B: Collaboration & Program Impact
• Standard C: Capacity
• Standard D: Subject Matter Knowledge
• Standards E & F: Professional Standards
• Standard G: Educator Effectiveness
Program Approval Standards & Indicators
Hierarchy (diagram): the Standards & Indicators map onto Domains; each Domain contains Strands, and each Strand contains individual Criteria.