
MA Educator Preparation

This webinar discusses the context of educator preparation programs in Massachusetts, including regulatory authority and review processes. It also highlights the moral imperative and driving belief behind educator preparation. The challenges and successes of building a strong review process are explored, along with lessons learned.


Presentation Transcript


  1. MA Educator Preparation Mid-Atlantic Comprehensive Center Webinar April 29, 2015

  2. Ed Prep: MA Context • 80 Sponsoring Organizations (SOs/EPPs) • IHEs & alternative providers → all expectations the same • 13 SOs currently seek NCATE/TEAC accreditation • Largest producers account for 50% of candidates completing annually • 1,800+ programs • E.g., Math 5-8 Initial, Post-Baccalaureate; Math 8-12 Initial, Baccalaureate • Approximately 6,500 program completers annually • 65% employed in MA public schools

  3. Ed Prep: MA Context • Regulatory Authority: • Approval determined by the Commissioner of Elementary & Secondary Education (not the Board) – tied to licensing authority • Single line of statute granting authority → set of high-level regulations → details sit in Guidelines • Separate from the Dept. of Higher Education & the Dept. of Early Education & Care • Three types of review: Informal, Formal, Interim • Formal review = every 7 years • Prep team (5 people) sits within the Center for Educator Effectiveness (Evaluation, Recognition Programs, Licensure, Induction, Professional Development).

  4. Moral Imperative & Driving Belief • Moral Imperative: All children in Massachusetts, especially the students who need them most, must have access to effective teachers and leaders. • Driving Belief: Preparation CAN and SHOULD prepare educators to be ready on day one.

  5. Reform Timeline • Program Approval Regulations revised June 2012 • Since the revision: new approval standards, an "effective" requirement for supervising practitioners, new accountability levers, and increased practicum hours • Two sets of pilot reviews (2009 & 2011) to aid in the development of new approval standards • Draft regulations out for a period of public comment • Moratorium on reviews for 2012-13 & 2013-14 • Built out the review process (rubrics, tools, guidance, etc.) • Micro-pilots of the new process during the 2013-14 year • Full implementation of process & standards in 2014-15

  6. New Program Approval Standards • Communicate a shift / raised bar • Emphasis at the organization (unit) level • Looking closely at K-12 partnerships and systems of continuous improvement • Focus on outcomes and evidence of impact • State-available data linkages (n=6): • Employment & Retention • Evaluation Ratings & Student Growth/Impact • Survey results (candidate, 1-year-out completer, supervising practitioner & hiring employer)

  7. The Challenge • Building a review process that is: • Effective • Efficient • Consistently Rigorous

  8. Building a Strong Review Process • Goal: implement a process that provides a solid evidence base for decision-making • Evidence-based decision making for: • ESE • Sponsoring Organizations • Understanding Best Practices • "Informed Researcher" perspective

  9. Important Early Decisions • Value human judgment • Emphasize accountability at the Org Level • No longer review program-specific syllabi • Summative Evaluation • ESE not the expert • Descriptive of expectations, not prescriptive of approach • Transparency/Communication key

  10. Review Process of Steel: Examples • Recruitment, selection & training of Reviewers • Needs Assessment for all dormant/low-enrollment programs • Eval Tool • Evaluates evidence, not criteria • Triangulates evidence – offsite, onsite and outputs • New Evidence Collection Methods • K-12 Partner survey • Use of live-polling technology during focus groups • Vetting Panel before release

  11. Promising Early Results • 34% of programs up for review expired at the point of needs assessment for the 2014-2015 Formal Review (135/393) • Similar pattern observed in the 2015-2016 cohort • Organizations have done things they have never done before (intentional conversations with PK12 partners, data-driven goal setting, advisory councils of recent completers) • More differentiation in terms of ratings (exemplary, proficient, needs improvement, unsatisfactory) across and within organizations than we have been able to establish previously • SOs and reviewers WANT to engage in the process
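
As a quick check on the headline figure, a one-line computation with the numbers taken straight from the slide:

```python
# 135 of the 393 programs up for review expired at needs assessment.
expired, total = 135, 393
print(f"{expired / total:.1%}")  # 34.4%, reported on the slide as 34%
```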

  12. Questions

  13. Greatest Successes • Quality of relationship with our field • Recruiting PK12 educators as reviewers • Dialogue around evidence of impact • Calibration/reliability of ratings & Differentiation of results

  14. Ongoing Challenges/Considerations • Demand on reviewers • Evidence-driven narratives from SOs • Inclusion of data in the process • Establishing the right rewards and consequences • When/whether to provide examples or exemplars • Weighting criteria/domains • SEA Human Capital

  15. Lessons Learned • You do not need to set numeric benchmarks in order for outcome measures to be influential in the judgments being made • Reviewers need a mechanism and structure through which to make difficult, high-stakes decisions – otherwise they will avoid doing so. • People are going to be wary of and reluctant about the unknown – no matter what you do. Creating "early allies" is the most effective way to mitigate anxiety from the field.

  16. Advice • Separate technical assistance from program evaluation • Embed accountability within a larger framework of reform initiatives – explicitly draw connections • Focus on continuous improvement • Build a system that your state can grow into • Walk the walk

  17. Questions

  18. Just-in-Case Slides

  19. Rubric → Eval Tool • Criteria: Admission criteria and processes are rigorous such that those admitted demonstrate success in the program and during employment in the licensure role. • Criteria: Admission criteria for post-baccalaureate candidates verify content knowledge upon entrance to the program.

  20. Eval Tool • Criterion under review • Overall decision indicating whether there is sufficient evidence in support of the criterion being met • List of pertinent evidence sources to be referenced by reviewers • Domain rating determination for the evidence • Box for the reviewer to provide a rationale explaining the rating • Space for the review team to provide suggestions for improvement relative to the criterion • Indication of whether outputs demonstrate support for the criterion (+), contrast with it (-), or are inconclusive (?)
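
To make the tool's shape concrete, here is a minimal sketch in Python of how one criterion's record might be modeled. The class and field names are illustrative assumptions, not ESE's actual schema; the Rating values are taken from the rating scale on slide 11.

```python
from dataclasses import dataclass, field
from enum import Enum

class Rating(Enum):
    EXEMPLARY = "Exemplary"
    PROFICIENT = "Proficient"
    NEEDS_IMPROVEMENT = "Needs Improvement"
    UNSATISFACTORY = "Unsatisfactory"

class OutputSignal(Enum):
    SUPPORTS = "+"      # outputs support the criterion
    CONTRASTS = "-"     # outputs contrast with the criterion
    INCONCLUSIVE = "?"  # outputs are inconclusive

@dataclass
class CriterionReview:
    criterion: str               # the criterion under review
    evidence_sources: list[str]  # pertinent evidence referenced by reviewers
    sufficient_evidence: bool    # overall decision: is the criterion met?
    domain_rating: Rating        # rating determination for the evidence
    rationale: str               # reviewer's explanation of the rating
    improvement_suggestions: list[str] = field(default_factory=list)
    output_signal: OutputSignal = OutputSignal.INCONCLUSIVE
```

A completed review would then amount to a list of such records, one per criterion.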

  21. Review Criteria • Descriptive of expectations, not prescriptive of approach • Emphasize impact • Examples: • Admission Criteria: Admission criteria and processes are rigorous such that those admitted demonstrate success in the program and during employment in the licensure role. • Diversity Criteria: Recruitment efforts yield a diverse candidate pool. • Not weighted (at this point) • Criteria are not rated; the evidence is

  22. Evaluating Evidence, Not Criteria

  23. Worksheets • Offsite Submission • Linked to criteria • Streamlined – forcing clarity and best evidence choice • Manageable for reviewers • BIG shift

  24. Key Components on All Worksheets • Required Documents • Prompts • Optional Context • Optional Additional Documents
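
A minimal sketch of how one worksheet's contents might be organized, assuming a simple key-per-component layout; the key names are hypothetical and simply mirror the four components named on the slide.

```python
# Hypothetical worksheet skeleton; keys mirror the slide's four components.
worksheet_template = {
    "required_documents": [],             # documents every SO must submit
    "prompts": [],                        # prompts linked to the criterion (slide 25)
    "optional_context": "",               # context the SO may choose to add
    "optional_additional_documents": [],  # extra evidence the SO may attach
}
```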

  25. Prompts Linked to Criteria Criteria: Waiver policy ensures that academic and professional standards of the licensure role are met. Criteria: Recruitment efforts yield a diverse candidate pool.

  26. Reviewers • Conducted exploratory analysis – other states, past reviewers, current orgs • Raise prestige of the role: • Cohort model • Intentional selection • Market → teacher leadership • Build a robust training model • Full-day training & ongoing calibration • Future → online modules/calibration assessments • "I have worked with the state on many initiatives in my role as principal and curriculum coordinator in a regional district. I have to say this was one of the more clear, focused and effective trainings I have been involved in."

  27. Needs Assessments • Policy context • Significant impact • 12/30 new programs confirmed in the 2014 Informal Cycle • 34% of programs up for review expired at the point of needs assessment for the 2014-2015 Formal Review (135/393) • Set precedent for the review

  28. Evidence-Based Decision Making • Evidence base → findings & commendations • Rating scale: Exemplary, Proficient, Needs Improvement, Unsatisfactory • Approval outcomes: Approved, Approved with Conditions, Not Approved
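
The slide pairs the rating scale with the approval outcomes without stating how one maps to the other. The sketch below assumes one plausible mapping, purely for illustration; the actual ESE decision rules are not given in the deck.

```python
def approval_decision(ratings: list[str]) -> str:
    """Map a set of domain ratings to an approval outcome.

    Illustrative assumption only: any Unsatisfactory rating blocks approval,
    and any Needs Improvement rating triggers conditions.
    """
    if "Unsatisfactory" in ratings:
        return "Not Approved"
    if "Needs Improvement" in ratings:
        return "Approved with Conditions"
    return "Approved"  # all ratings Proficient or Exemplary

# Example: one weak domain yields a conditional approval under this assumption.
print(approval_decision(["Exemplary", "Proficient", "Needs Improvement"]))
```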

  29. 2012 Program Approval Standards • Standard A: Continuous Improvement • Standard B: Collaboration & Program Impact • Standard C: Capacity • Standard D: Subject Matter Knowledge • Standard E & F: Professional Standards • Standard G: Educator Effectiveness

  30. Program Approval Standards & Indicators • Standards & Indicators → Domains • Each Domain contains Strands, and each Strand contains one or more Criteria
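
A minimal sketch of that Domain → Strand → Criteria hierarchy in Python; the instance data are invented for illustration, apart from the domain name (Standard A, slide 29) and the admission criterion quoted from slide 19.

```python
from dataclasses import dataclass, field

@dataclass
class Strand:
    name: str
    criteria: list[str] = field(default_factory=list)  # individual review criteria

@dataclass
class Domain:
    name: str
    strands: list[Strand] = field(default_factory=list)

# Hypothetical instance showing the shape of the hierarchy.
example = Domain(
    name="Continuous Improvement",  # Standard A from the 2012 standards
    strands=[
        Strand(
            name="Admissions",  # strand name is an invented placeholder
            criteria=[
                "Admission criteria and processes are rigorous such that those "
                "admitted demonstrate success in the program and in the licensure role."
            ],
        ),
    ],
)
```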
