TRAC Allocations Report
Kent Milfeld, TACC TG Allocations Coordinator
Sept. 20, 2009
Outline • Allocations, Q2 and Q3; Q4 prelims • Operations • Processing Stats • Reviewer Survey Summary • Future: www.teragrid.org/userinfo/access/allocationspolicy.php
Allocations – SU Stats • March 2009 "LRAC Cycle" • June 2009 "MRAC Cycle" (* 2 outliers removed)
Allocations – SU Stats • Sept. 2009 "LRAC Cycle" (* includes 80M SUs surplus from the previous period)
Request/Allocation Distribution • March 2009 • June 2009
Request/Allocation Distribution • March 2009 • Sept. 2009
Processing Stats • TRAC Requests • 95 in March • 88 in June • 125 in September • Adaptive Reviews • 21 (out of 95) in March • 16 (out of 88) in June • Advances • 15 in March • 14 in June • Startup • New + Renew (requests)
System Stats (charts: Requested SUs (M) and Request Count, by system class) • SMP Systems: small total request size; significant number of users • Mid-Range Systems: significant total request size; large number of users • Peta-scale Systems: large total request size; large number of users
Operations: Full Array of Requests • TRAC (Research) • Renewal • Supplement • Advance • Extension • Transfer • Justifications • Startup
Operations (new) • Changes in Meetings: panel discussions convene for one day (the caucus is held the evening before) • Increasing number of Requests • Small Requests (much less than a percent of any resource) receive Adaptive Reviews (not reassessed at Panel Discussions): the same review process carried out by a second committee (2 reviews), a User Services committee of staff reviewers in the FOS • Advances
TRAC Survey Results and Allocations Recommendations
Richard Moore (SDSC), David Hart (SDSC), Dan Katz (UC), Kent Milfeld (TACC), Ralph Roskies (PSC), Craig Stewart (IU)
June 22, 2009
Survey Context • In response to SAB discussion in Jan. 2009 • Survey solicited at the March 2009 TRAC meeting and by email • 24 responses from 43 committee members (56%) • 14 multiple-choice questions, with text comments
Full text of survey statements (respondents asked to rank level of agreement/disagreement)
Good Use of Time
• 4. Makes good use of submitting PIs' time and effort.
• 6. Makes good use of reviewers' time in terms of individual proposal reviews.
• 7. Makes good use of reviewers' time in terms of face-to-face discussion of quarterly submissions.
• 9. Makes good use of reviewers' time in dealing with supplementals/rebuttals between meetings.
Reviewers' Qualifications
• 11. Are reviewers sufficiently qualified to review the range of submissions received?
• 12. Are appropriately assigned to submissions in their fields of expertise.
• 13. Reviewers sufficiently represent the spectrum of disciplines for allocation requests.
Meeting Arrangement
• 8. Evening caucus is a useful part of the allocation review process.
• 10. Three two-day meetings or four one-day meetings per year.
Review Appropriateness
• 1. Recommendations are generally consistent across requests (for size/FOS).
• 2. Recommendations largely commensurate with PI's potential for science or engineering impact.
• 3. Deals appropriately with the dramatic differences in scale between the smaller and larger requests.
• 5. Deals appropriately with annual and multi-year allocations.
Feedback
• 14. The diversity of reviewer styles produces sufficient review feedback to requesting PIs.
(Open-ended request) Comments on the questions and on how to improve the allocations process.
Ranked Responses (ranked from strongest to least agreement) • Q6: Reviewers' time for proposal reviews is good? • Q11: Are current reviewers sufficiently qualified? • Q13: Are all disciplines reflected in the current set of reviewers? • Q2: Sound impact potential of allocation decisions? • Q4: Good use of PI time in the allocation process? • Q9: Reviewers' time for supplemental or rebuttal submissions is good? • Q12: Are current reviewers getting submissions in their fields of expertise? • Q1: Consistent allocation decisions across all requests? • Q14: Do reviewer styles produce sufficient feedback to PIs? • Q3: Appropriate allocations vis-à-vis scale of requests? • Q5: Allocation process handles annual and multi-year allocations? • Q10: Would three 2-day meetings be better than four 1-day meetings? • Q7: Reviewers' time for quarterly submission discussion is good? • Q8: Is the evening caucus useful? (Chart: strength of agreement per question, average shown with +/- 1 std dev; scale: 1 = Strongly Agree, 2 = Agree, 3 = Neutral, 4 = Disagree, 5 = Strongly Disagree)
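For readers curious how the ranked-responses chart is derived, a minimal sketch follows: each question's Likert responses (1 = Strongly Agree through 5 = Strongly Disagree) are averaged, questions are sorted by that average, and the sample standard deviation supplies the +/- 1 std dev band. The response values below are hypothetical placeholders, not the actual survey data.

```python
# Sketch of the "average +/- 1 std dev" ranking behind the chart.
# Scores: 1 = Strongly Agree ... 5 = Strongly Disagree (hypothetical data).
from statistics import mean, stdev

responses = {
    "Q6": [1, 1, 2, 2, 1, 2, 1],  # placeholder per-reviewer ratings
    "Q8": [2, 3, 4, 3, 2, 4, 3],
    "Q10": [2, 2, 3, 4, 3, 3, 2],
}

# Rank questions from strongest agreement (lowest mean) to least.
for q, scores in sorted(responses.items(), key=lambda kv: mean(kv[1])):
    print(f"{q}: mean = {mean(scores):.2f}, +/- 1 sd = {stdev(scores):.2f}")
```

With real data, Q6 would land at the top of this ordering and Q8 at the bottom, matching the slide's ranking.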
Interpretation of Statistical Results • Fairly uniform responses across questions • There may be some modest differences of opinion among questions at the high/low ends of the ranking • Some peculiar results: e.g., the evening caucus drew the least agreement of any question, yet text comments consistently reflected its importance • More insight gained from individual comments
Conclusions from Survey • Statistical results and comments indicate support for current processes • Areas with the most comments • Multi-year allocations process • Several comments that smaller requests get more scrutiny per SU • Consider parallel domain-oriented sessions for greater discussion • Better feedback to PIs from reviewers – establish some guidelines? • Recruiting sufficient numbers of TRAC members, and being vigilant about assigning qualified reviewers, must remain high priorities • Leave number and length of meetings as-is (including caucuses) • Increase adaptive thresholds for smaller requests to lower TRAC workload
Revisiting SAB presentation in Jan. 2009 Observations: • Not enough opportunity for discussion between different reviewers • Reviewers with less FOS expertise can (overly) influence a decision • No chance at the end of the meeting to survey proposals to ensure uniformity of decisions • Some reviews do not explain decisions to proposers (PIs) Suggestions: • Acquire more, better-qualified reviewers; include input from NSF on reviewers • More deliberation in discussions, weighted toward reviewers closer to the PI's FOS • Add science impact to review criteria • Encourage multi-year proposals • Separate panels for life sciences and physical sciences • 3 meetings per year instead of 4 (more time in meetings)
Action Items Derived from Survey Discussions • Recruit sufficient TRAC members across fields of expertise. • Be vigilant in assigning proposals to the best-qualified experts. • Pilot: Parallel Sessions at the September 2009 TRAC Meeting (Plenary Session, 2 Parallel Sessions, Reconciliation Session). • Review Template now has 3 new sections for the 3 Review Criteria (thanks, David Hart). • Multi-year proposals: develop an improved process as soon as possible. • A "last call for inequities" will allow reviewers to assess relative treatment of proposals. • An "adaptive" threshold for Staff Reviews should be used to free up panel discussion time. • For now, maintain the strict COI policy as previously written. • Keep 4 sessions per year (barrier-to-entry issue), 1 day/meeting, and the evening caucus.
Potential recommendations: for discussion with SAB and NSF • Leave the science impact issue as-is, unless NSF requests a change. • Consider a per diem for reviewers (~20% cost increase). • Allow the Allocations Coordinator to waive COI for reviewers who submit requests that are "small" compared to the average request size.
Future/Suggestions • TG-Wide Roaming Access • Has evolved into an ineffective mechanism for using multiple resources. • Mid-range systems to be pooled in TG Extension "D.24": Abe (NCSA), Queen Bee (LONI), Lonestar (TACC), Steele (Purdue) • Provide Proposal Stats for Request Submitters • Proposal "Tracking" (like FedEx) • Minutes of TRAC Meeting (general discussion) • Secure Wiki for Reviewers