
Space Grant 20th Year Evaluation

PPR Reviewer Training: Program Performance and Results Report Reviewer Training, Atlanta, GA, October 27, 2008.





Presentation Transcript


  1. PPR Reviewer Training: Space Grant 20th Year Evaluation. Program Performance and Results Report Reviewer Training, Atlanta, GA, October 27, 2008.

  2. Agenda • Reviewer Role • Scoring Rubric • Guiding Principles • Rubric Areas • Scoring • Strengths/Weaknesses • Special Considerations • Summary

  3. Reviewers • Reviewers are invited or selected by NASA Headquarters because of their ability to make an expert judgment based on available data. • Reviewers are: • Space Grant Directors • NASA Headquarters personnel • Field Center personnel • Former Space Grant Directors • Other individuals invited by NASA

  4. Reviewer Role • The reviewer's role is to apply knowledge of the Space Grant program to make an independent, unbiased assessment of the assigned consortia.

  5. “In my PPR…”

  6. NASA Education Outcomes • Develop a working understanding of the NASA Education Outcomes: • Contribute to the development of the Science, Technology, Engineering, and Mathematics (STEM) workforce in disciplines needed to achieve NASA’s strategic goals (Employ and Educate). • Attract and retain students in STEM disciplines through a progression of educational opportunities for students, teachers, and faculty (Educate and Engage). • Build strategic partnerships and linkages between STEM formal and informal education providers that promote STEM literacy and awareness of NASA’s mission (Engage and Inspire).

  7. What is a Rubric? • A tool that defines and communicates criteria to assess performance. • Standardizes assessment in areas where a great deal of subjective judgment is required. The reviewer makes a judgment based on the outlined criteria.

  8. Methodology • An expert panel was identified to develop the rubric. The base panel included three Space Grant Program content experts and one measurement professional. • The scoring rubrics are based on and directly aligned with the guidelines. • Consensus was reached among all panel members on the final rubric.

  9. Scoring Categories (chart)

  10. Sample Rubric • In the consortium-specific rubrics, the option “NR” is available and represents “No Rating.” This means that there were no consortium-specific elements.

  11. Guiding Principles • Five Guiding Principles: • Alignment • Rigor • Context • Consistency • Results

  12. Alignment • The PPR Report and data demonstrate alignment with the legislation, program objectives, and NASA programmatic guidance. • The reviewer judges how well the consortium delineates the state needs and aligns its programs with the Space Grant legislation, national program objectives, and NASA programmatic guidance.

  13. Rigor • The PPR Report articulates its purpose and SMART goals and objectives. It articulates a clear understanding of what the consortium was trying to accomplish and how its activities will be assessed. • The reviewer judges how well the consortium articulates its purpose, goals and objectives, and its assessment and evaluation plans.

  14. Context • Context refers to understanding the resources the consortium dedicates to an area. • Context also refers to understanding the level of resources a consortium has based on its grant type (pages 20-21 of the PPR Guidelines). • The reviewer judges how well the consortium justifies the portion of its resources allocated to each program element.

  15. Consistency • The CMIS data, where appropriate, validate the results reported in the PPR Report. Significant inconsistencies might indicate that PPR Report statements are questionable. • The reviewer judges the degree of consistency between the PPR Report analysis and the CMIS data.

  16. Results • The PPR Report and CMIS data give evidence that the consortium is making important achievements. The consortium is able to demonstrate tangible results. • The reviewer judges the results achieved relative to the resources allocated to each program element.

  17. The Guiding Principles create a foundation for each reviewer. This foundation enables the reviewer to make consortium-specific judgments that are independent of other consortia.

  18. Rating Guided by Principles (diagram): Alignment, Rigor, Context, Consistency, Results.

  19. Rubric Areas • The rubric is designed with the same format as the Program Performance and Results Report. • Each element of the PPR Report is unique; because of this uniqueness, a rubric is customized for each element.

  20. Rubric Types • Each programmatic element has three rubric types: • Description • Core Criteria (the number of criteria varies by outcome) • Impact/Results or Evidence of Success

  21. Rubric Areas • Executive Summary and Consortium Impact • Foreword • Consortium Management: • Description • Core Criteria: Strategic Plan, Consortium Structure/Network (Internal), Diversity, Consortium Operations, Resource Management, Collaborations and Partnerships Outside the Consortium • Impact/Results

  22. Rubric Areas: NASA Education Outcome 1 • Fellowship/Scholarship Program • Research Infrastructure • Higher Education. NASA Education Outcome 1, National Program Emphases: • Diversity • Workforce Development • Longitudinal Tracking • Minority Serving Institutions

  23. Rubric Areas: NASA Education Outcome 2 • Precollege Programs. NASA Education Outcome 3 • Public Service Program

  24. Scoring Process • Review the rubric for the section of the PPR Report being assessed. • Read the PPR Report section being assessed. • Consider CMIS data and other data sources associated with the section being assessed. • Using the rubric, make a qualitative judgment on whether the consortium is “excellent,” “good,” or “poor.” • After a qualitative judgment is made on the level of the consortium, make a quantitative judgment on what integer score to assign within the level. • “Close the loop” by reassessing your rating considering the qualitative and quantitative judgments. This is the italicized statement within each rubric qualitative area.

  25. Scoring Process (diagram): 1. Qualitative Judgment • 2. Quantitative Judgment • 3. Close the Loop

  26. Comments • Statement Guidelines: • Maintain self-anonymity • Avoid referencing individuals by name • State complete thoughts • Make specific, concise comments • Maintain objectivity in positive and negative comments

  27. Data • CMIS data may be a starting point. • The CMIS data may not be representative of all data presented in the PPR Report. A consortium may cite data outside the realm of the variables included in the CMIS database. These data should be considered in addition to any available CMIS data.

  28. Reviewer Expertise • Poor or good? Good or excellent? • It is possible that a consortium, in any PPR Report area being judged, has characteristics of poor, good, and/or excellent performance. The expertise of the reviewer is the deciding factor in these cases: the reviewer judges, based on the preponderance of the available evidence, whether the consortium is excellent, good, or poor.

  29. Not Rated • It is possible that the consortium-specific elements were not a focus of the consortium. As noted in the PPR Report Guidelines, the consortium is to state explicitly in the description if an element was not applicable. If the description provides an explicit statement that an element was not a focus, the consortium-specific rubric will be rated “NR.”

  30. Not Rated • Is a consortium evaluation harmed by NRs? No. NRs will not be included in the assessment compilations of criteria and impact/results.
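One way to picture the NR rule: when ratings are compiled, “NR” entries are simply excluded rather than counted as zeros. The sketch below is illustrative only (the actual PPR compilation procedure is not specified in this training material); the function name and the use of a plain average are assumptions.

```python
def compile_scores(ratings):
    """Average a list of reviewer ratings, skipping "NR" (Not Rated) entries.

    Illustrative sketch only: this assumes a simple mean; the real
    assessment compilation may aggregate differently.
    """
    scored = [r for r in ratings if r != "NR"]
    if not scored:
        return None  # every element was NR; nothing to compile
    return sum(scored) / len(scored)

# An NR does not drag the compilation down; it is simply excluded.
print(compile_scores([6, "NR", 5, 7]))  # 6.0, same as averaging 6, 5, 7
print(compile_scores(["NR", "NR"]))     # None
```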

  31. Demographics • Impacts can differ based on state demographics. • The demographics of the state may make it appear that a consortium's impact is insufficient given the resources dedicated to the area. • Refer to the PPR Report Foreword to review the described consortium landscape. • A reviewer from a state with demographics much different from those of the consortium being reviewed should use his or her expertise but not apply an unfair bias against the consortium. (This refers to the “context” guiding principle.)

  32. Grant Types • The PPR Report Guidelines (pages 20-21) outline the Space Grant types: • Designated • Program Grant • Capability Enhancement • An in-depth understanding of the grant types is required so that a consortium's PPR receives a fair review. (This refers to the “context” guiding principle.)

  33. Consortium Concurrence • The reviewer provides no rating related to concurrence. • The Executive Panel will review this requirement.

  34. Summary • The Guiding Principles create a foundation for the reviewers. • Use of the rubric standardizes scoring for the reviewers. • Scoring: • Qualitative (Excellent, Good, Poor, Incomplete) • Quantitative: 7-6 (Excellent), 5-3 (Good), 2-1 (Poor), 0 (Incomplete) • Reviewers are the experts invited or selected to use their knowledge as a basis to make judgments.
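The quantitative bands above map directly onto the qualitative levels. A minimal sketch of that mapping, using the bands exactly as listed in the summary (the function name is illustrative):

```python
def level_for_score(score):
    """Map an integer PPR rating (0-7) to its qualitative level,
    using the bands from the training summary:
    7-6 Excellent, 5-3 Good, 2-1 Poor, 0 Incomplete."""
    if not 0 <= score <= 7:
        raise ValueError("PPR ratings are integers from 0 to 7")
    if score >= 6:
        return "Excellent"
    if score >= 3:
        return "Good"
    if score >= 1:
        return "Poor"
    return "Incomplete"

print(level_for_score(6))  # Excellent
print(level_for_score(3))  # Good
print(level_for_score(0))  # Incomplete
```

This mirrors the two-step scoring process: the reviewer first picks the qualitative level, then chooses an integer within that level's band.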

  35. Activity: Comment Evaluation • The following slides contain actual reviewer comments from the 15th Year Evaluation. • Consider the guidelines reviewed earlier and judge whether the comments are appropriate or inappropriate.

  36. Activity: Comment Evaluation • Effective Comments: • The translation of science.nasa.gov into Spanish provides on-going impact to the Hispanic community in STATE and around the world. Excellent examples of collaboration with NASA Center. Very impressive impact through pre-college efforts -- not only bringing the Program to STATE, but the design and oversight of statewide professional development. This clearly demonstrates alignment and coordination with the state systemic reform efforts. • While the purpose is clear, the description was lacking a discussion of measurable objectives with clearly defined metrics. The description was lacking a discussion of assessment and evaluation plan. According to the CMIS data, there has not been an underrepresented minority student award since 1998. In fact, according to CMIS, that’s the only underrepresented minority student in five years. Student participation research and mentoring with field centers and industry is not as conclusive as it could be. The discussion is a bit too general and appears to center around outreach activities.

  37. Activity: Comment Evaluation • There is not much analysis of what the needs are and how the consortium is organizing its resources to best address those. I would recommend that the director should convene a planning group in his state, including the principals and one or two outside persons, and go through the planning process.

  38. Activity: Comment Evaluation • There is not much analysis of what the needs are and how the consortium is organizing its resources to best address those. (Appropriate comment.) I would recommend that the director should convene a planning group in his state, including the principals and one or two outside persons, and go through the planning process. (Inappropriate comment: it is not the reviewer's role to make recommendations.)

  39. Activity: Comment Evaluation • The strategic implementation plan is clearly derived from the National program’s strategic plan… promotes a variety of activities and is effectively working to meet the needs of its citizens. • Strategic objectives clearly derived from National priorities. Evidence of analysis of state needs. • Very complete.

  40. Activity: Comment Evaluation • The strategic implementation plan is clearly derived from the National program’s strategic plan… promotes a variety of activities and is effectively working to meet the needs of its citizens. (Appropriate comment: states why the area is a strength.) • Strategic objectives clearly derived from National priorities. (Appropriate comment.) Evidence of analysis of state needs. (Inappropriate comment: does not provide a qualitative assessment.) • Very complete. (Inappropriate comment: that’s it?)

  41. Activity: Comment Evaluation • Both are appropriate and comprehensive comments: • The translation of science.nasa.gov into Spanish provides on-going impact to the Hispanic community in STATE and around the world. Excellent examples of collaboration with NASA Center. Very impressive impact through pre-college efforts -- not only bringing the Program to STATE, but the design and oversight of statewide professional development. This clearly demonstrates alignment and coordination with the state systemic reform efforts. • While the purpose is clear, the description was lacking a discussion of measurable objectives with clearly defined metrics. The description was lacking a discussion of assessment and evaluation plan. According to the CMIS data, there has not been an underrepresented minority student award since 1998. In fact, according to CMIS, that’s the only underrepresented minority student in five years. Student participation research and mentoring with field centers and industry is not as conclusive as it could be. The discussion is a bit too general and appears to center around outreach activities.

  42. Activity: Rubric Application • NASA Ties rubric from the 15th Year Evaluation.

  43. Activity: Rubric Application • This consortium was rated as Excellent by all reviewers for this submission (potential identifying information removed). • NASA Ties: Strong ties exist between … NASA Centers. The Consortium works with NASA … through the Undergraduate Student Research Program (USRP). An employee from each Center, generally in the University Affairs Office, is assigned to work with … staff as Center Coordinator for USRP. This relationship is strengthened throughout the program cycle as… staff work closely with Center Coordinators on the application review and selection process, program marketing efforts, student placement and evaluation process… formally became a… member in July 2003 and … serve on our Advisory Council. A … partnership exists with… In this effort, we also work… The Consortium has funded a… position… We continue our relationship… by funding one or two students each year. Additionally, we work with … experiments through two … universities. NASA… supports Consortium … projects and supported an … project. We manage the … Program for NASA … and the… Enterprise. Our ties to… are strengthened through … other joint educational projects. … provides a… administrative coordinator slot for NASA …. was a supporter of the… Experiment Program for which we sponsored … educators. Our working network with NASA Centers continues to expand as our program grows.

  44. Activity: Rubric Application • Why Excellent? • NASA Ties: Strong ties exist between … NASA Centers. The Consortium works with NASA … through the Undergraduate Student Research Program (USRP). An employee from each Center, generally in the University Affairs Office, is assigned to work with … staff as Center Coordinator for USRP. This relationship is strengthened throughout the program cycle as… staff work closely with Center Coordinators on the application review and selection process, program marketing efforts, student placement and evaluation process… formally became a… member in July 2003 and … serve on our Advisory Council. A … partnership exists with… In this effort, we also work… The Consortium has funded a… position… We continue our relationship… by funding one or two students each year. Additionally, we work with … experiments through two … universities. NASA… supports Consortium … projects and supported an … project. We manage the … Program for NASA … and the… Enterprise. Our ties to… are strengthened through … other joint educational projects. … provides a… administrative coordinator slot for NASA …. was a supporter of the… Experiment Program for which we sponsored … educators. Our working network with NASA Centers continues to expand as our program grows.

  45. Site Review • Log into the review site at https://secure.spacegrant.org/20th/review/ • Enter your email and password.

  46. Site Review • Logging in brings you to the review summary page. • This page displays the consortia you will review. Click a consortium name to enter your score and comments. • Scores you have entered and saved will be displayed; scores you still need to enter will be grayed out.

  47. Site Review • Navigation links allow the reviewer to advance to other rubric sections. • Select “Review Summary” to return to the summary page. • Enter your rating by selecting the radio button.

  48. Site Review • You must press “Save Page” to save your data. If you return to the review summary or select the next or previous page without saving, your data are not saved.

  49. Site Review • The “Submit All Program Performance and Results Reviews” button is at the bottom of the Review Summary page. Select this button only when you have completed all reviews; selecting it closes your review process.

  50. Questions? • Content-related questions: Katherine.M.Pruzan@nasa.gov • Technical questions: Mark.Fischer@spacegrant.org
