Charter School Planning Grant Peer Reviews Webinar Presentation by the Public School Academies Unit June 13, 2012
Webinar Purpose • The purpose of this webinar is to: • Provide potential peer reviewers with an initial orientation before engaging in the actual peer review process, • Provide experienced peer reviewers with a refresher on the process, and • Provide potential and current applicants with an understanding of the process and the timelines associated with each step.
Expected Outcomes • Participants will be able to: • describe the peer review process for the CSP planning grant, • describe the consensus review exercise, including the similarities and differences between the PSA unit staff and peer reviewer roles, • understand how cut scores are determined, how recommendations are made, and how award procedures are implemented, and • identify and articulate what a “Level 4 – Excellent” response looks like for some critical questions.
Purpose of Peer Reviews • Federal CSP Grant Requires It • Fair process • Better transparency • Separation of authority • Leverage the expertise in the charter community
Review Team Selection • Teams should include representatives of different stakeholder groups: MDE, school leaders, authorizers, teachers and other SMEs, EMO/CMOs, board members, TA providers, and developers • Peer reviewers are vetted through resume review • Peer reviewers are VOLUNTEERS • Reviewers are assigned to teams based on experience, employment status, and subject matter expertise
Application Assignments • A single application is selected for the norming exercise for its strengths. • Other applications are assigned to teams based on the type of school being proposed and the knowledge of specific team members. • Attempts are made to avoid potential conflicts of interest and to promote fairness and impartiality. • Teams are generally assigned five applications or fewer.
General Timelines • With three rounds a year, at least three peer reviews are necessary • Peer reviewers receive applications within three days of grant window closing • Consensus meeting normally planned for three weeks after grant application window closes • Planning & Dissemination are completed at the same time • Appeals are completed with peer reviews for the next round
Consensus Review Exercise • Two rooms are normally used: one for teams going through the norming process and one for teams that are not. • Rooms are set up for team work with six chairs at each set of tables. • Each table is provided a working laptop with a memory drive that contains the applications and the rubrics for that team. • One peer reviewer from each team is assigned (volunteered) to record the team’s results on that memory drive.
Consensus Review Exercise • Facilitated and moderated by PSA staff members • The process begins with introductory comments • Review teams in one room get started while the teams in the other room start the “norming exercise” • The exercise starts with the first question, moving through the “common” application and reaching a consensus rubric score for each question • The remainder of the common application is then assigned to one of the review teams
PSA Unit Role • Set up the teams, assign and distribute applications • Facilitate/Moderate the Process • Answer technical questions • Collect rubric scores & comments • Verify & collate the scores • Ensure every non-4 score has a comment associated with it • Distribute and collect completed conflict of interest disclosure forms • Distribute and collect evaluation forms
Peer Reviewer Role • Volunteer by notifying the MDE PSA unit • Review this webinar • Provide a resume to the MDE PSA unit • Read and score applications, with comments, before the peer review • Participate in the full-day peer review activity • Work as a team with other peer reviewers • Reach a consensus score on all questions for all applications assigned • Complete the Conflict of Interest Disclosure Form
Establishing a Cut Score • Scores are listed in a table with the application receiving the highest score listed first. • The list is reviewed to determine whether a cut score can be assigned at or near 70%. • The current rubric includes 112 points, which places 70% between 78 and 79 points. • In the event a list were to include one application with a score of 75, one with a score of 78, and another with a score of 79, we would assign a cut score of 77, placing the cut in the natural gap just below the scores at or near the 70% mark.
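The cut-score logic above can be sketched in code. This is a minimal illustration, not MDE's actual procedure: the slide gives only one worked example (75, 78, 79 → cut score 77), so the tolerance used to decide which scores count as "at or near 70%" is an assumption.

```python
def suggest_cut_score(scores, total_points=112, target_pct=0.70, tolerance=2):
    """Sketch of cut-score placement as described in the webinar.

    The cut is placed one point below the lowest score in the group of
    applications at or near the 70% mark, so it falls in the natural gap
    separating that group from lower-scoring applications.
    NOTE: the exact placement rule and the `tolerance` window are
    assumptions inferred from the slide's single example.
    """
    threshold = total_points * target_pct          # 112 * 0.70 = 78.4
    ranked = sorted(scores, reverse=True)          # highest score listed first
    # Scores within `tolerance` points of the threshold (or above it)
    # are treated as the "at or near 70%" group.
    near_or_above = [s for s in ranked if s >= threshold - tolerance]
    if not near_or_above:
        return None  # no application is close enough to 70%
    return min(near_or_above) - 1  # cut in the gap just below the group

print(suggest_cut_score([75, 78, 79]))  # -> 77, matching the slide's example
```

With the slide's example list, the threshold is 78.4 points, the scores of 78 and 79 form the group at or near 70%, and the cut lands at 77, excluding the application scoring 75.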
Recommending Awards • The Superintendent of Public Instruction is the awarding official at MDE, so all recommendations for awards are forwarded to him for his approval. • All applicants with scores above the cut score are recommended for a subgrant, and those with scores below the cut score are not. • The recommendation packet includes the notification letters to all applicants that had their applications reviewed. • No announcement is made until all of the letters are signed and returned to the PSA unit.
Level 4 – Excellent Answers • Did they answer the questions as they are contained in the narrative? • Did they use the rubric to formulate answers to the questions? • Were the responses clear, concise, and simple? • Were they specific and detailed when necessary? • Did they make it easy to find the questions and the answers in the narrative? • Or were the answers confusing, incomplete, or lacking clarity, logic, or common sense?
Examples • Question #2. Provide a thoughtful and detailed description of the unmet educational needs of the community, with enough specificity that it becomes apparent throughout the narrative how the proposed school will serve these unmet needs. Answer. The narrative response to this question contains a series of graphs and charts displaying the MEAP scores of schools within the Detroit metropolitan area along with some demographic data, including free and reduced lunch statistics and ethnic breakdown, as well as some data on high school dropout rates, unemployment, and crime. Did they answer the question?
Examples • No, they didn’t. • The Detroit metropolitan area is not a community • MEAP scores are academic performance indicators • Graduation rates are academic performance indicators • Free and reduced lunch statistics are economic indicators • Unemployment is an economic indicator • Ethnicity is a demographic factor • Crime is an economic indicator linked to unemployment • None of these are “unmet educational needs of the community”
What are Unmet Educational Needs? • Resources to practice • Resources to reach mastery • Resources to participate in experiential learning • Resources to research • Safety and security • Access to science, technology, engineering, math • Access to differentiated instruction • Access to arts, music, theater • Access to special niche programs
Points of Contact • Beatrice Barajas, Secretary • Kim Sidel, Analyst • Neil Beckwith, Consultant • Karla Browning, Consultant • Ron Schneider, Consultant • Mark Eitrem, Supervisor (517) 373-4631 MDEPSAGRANT@MICHIGAN.GOV http://www.michigan.gov/charters