The review aims to update grant selection processes to match evolving research landscapes, focus on efficiency, ensure fair evaluations, and improve communication. Recommendations include restructuring committees for better specialization and efficiency, separating scientific evaluations from funding decisions, implementing "binning" of scientific assessments, and focusing on research costs in funding decisions.
Grant Selection Committee Structure Review: Recommendations
Meeting of the Canadian Council of Deans of Science, June 26, 2008
Goal of Review
To ensure that the peer review process can accommodate the rapid emergence of new areas, the increase in research crossing traditional disciplinary boundaries, and the growing workload of many committees.
Advisory Committee Membership
• Adel Sedra (Chair), Dean of Engineering, University of Waterloo
• Mark Bisby, previous VP Research, CIHR
• Elizabeth Cannon, Dean of Engineering, University of Calgary
• Nick Cercone, Dean of Science and Engineering, York University
• Patrick Desjardins, Full Professor, CRC, École Polytechnique
• Michael Gibbons, MBE, Sussex University; previous Secretary General, Association of Commonwealth Universities
• Peter March, Director, Mathematics Division, NSF
• Nils Petersen, Director General, NINT, Edmonton
• Susan Pfeiffer, Dean of Graduate Studies, University of Toronto
• Mario Pinto, Vice-President Research, Simon Fraser University
• Gary Slater, Dean of Graduate Studies, University of Ottawa
• Nancy Van Wagoner, Associate VP Research, Thompson Rivers University
• Warwick Vincent, Professor, CRC, Université Laval
• Carolyn Watters, Dean of Graduate Studies, Dalhousie University
Current System
The current system faces several challenges:
• A changing research landscape
• Rapid development of new areas of research
• Repeated splitting of committees, both to manage increasing workload and to create more specialized committees
Consultation Highlights
• Input from GSC Chairs
• Meetings with Deans, Department Chairs, and Professional Societies
• Web-based survey of Discovery Grants (DG) applicants (>4,400 responses)
• Briefs from universities and professional societies
• Large focus-group session in March 2008
Principles and Goals
Principles: structure and processes that:
• Achieve the objectives of the Program within NSERC’s vision of helping to make Canada a country of discoverers and innovators for the benefit of all Canadians
• Are transparent to applicants and reviewers, and can be easily explained to NSERC stakeholders
• Are expert, fair and efficient
• Effectively allocate funding
Principles and Goals
Specific goals:
• A grant evaluation structure based on a comprehensive analysis of the current research environment
• Protocols that maintain the research community’s confidence in the Program
• A dynamic and flexible structure that responds to a changing research environment
• Consistently high-quality Committee review of proposals in established as well as new and emerging areas
• In-depth review of all proposals through innovative and flexible processes, while ensuring a manageable workload for Committee members, referees and staff
• Effective communication of exciting Canadian research
• Keeping administration costs reasonable
Committee Structure
NSERC is urged to implement a structure based on the “Conference Model”:
• Replace the current 28 GSCs with 10-12 Groups
• Each Group to have a number of Sections, meeting in three or four parallel streams
• Groups to be organized largely along disciplinary lines
• Where appropriate for the area (e.g. Environment), thematic Groups may be established
Separation of Scientific Evaluation and Funding Recommendations
• Scientific evaluations should be made by the Sections
• Funding recommendations should be made by the Groups
[Diagram] Each Section’s members produce a quality evaluation plus a budget assessment; these feed the Group Executive Committee (the Group Chair plus the Section Chairs), which submits funding recommendations to NSERC.
“Binning” of Evaluations
NSERC should implement a scheme for binning scientific evaluations, i.e. grouping them into several discrete levels.
• The scientific assessment to be communicated to applicants
Scientific Assessment
Sections to assess the quality of proposals with a numeric grade on each of the following criteria:
• Scientific or engineering excellence of the researcher(s)
• Merit of the proposal
• Contribution to the training of highly qualified personnel
And:
• The appropriateness of the budget justification
• The relative cost of the proposed program of research (low, medium or high) for the topic area
These ratings classify applications into quality categories, or bins, qualified by a “Cost of Research” factor.
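The assessment step above can be sketched in code. The slides do not say how the three criterion grades combine into a quality category, so the 1-5 grading scale, the averaging rule, the thresholds, and the bin names below are all illustrative assumptions.

```python
from statistics import mean

def assess_proposal(excellence, merit, hqp_training, budget_justified, cost_level):
    """Combine the three criterion grades (assumed 1-5 scale) into a
    quality category, qualified by a cost-of-research factor.
    Averaging rule, thresholds and bin names are illustrative assumptions."""
    overall = mean([excellence, merit, hqp_training])
    if overall >= 4.5:
        category = "outstanding"
    elif overall >= 3.5:
        category = "strong"
    elif overall >= 2.5:
        category = "good"
    else:
        category = "insufficient"
    return {"category": category,
            "budget_justified": budget_justified,
            "cost_of_research": cost_level}  # low / medium / high
```

Note that the budget assessment and the cost-of-research rating ride along with the quality category rather than being folded into it, matching the slides' separation of quality from cost.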
Forced Distribution of Grades
• It is very difficult to compare applications from different fields in an absolute sense
• Therefore, use a forced distribution of applications across bins
• For example, require that a particular category (bin) contain no less than 10 per cent and no more than 20 per cent of the applications
Research Costs
“Research costs” to replace the current “need for funds” selection criterion:
• The availability (or not) of funds from other sources (e.g. federal or provincial funding, or university research support) will provide needed context for understanding the applicant’s total research program
• Applicants should not be penalized for having successfully obtained additional funding from other sources
Research Costs Sections to comment on whether the proposed budget to be funded by NSERC is reasonable and well-justified
Budget Allocations to Groups
• Funding allocation processes to be based on population dynamics and the cost of research
• The “population dynamics” factor to be based on the number of applicants each year in a given discipline
• The cost of research to be based on NSERC studies of reported expenditures and Statistics Canada data
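A minimal sketch of such an allocation: the slides name the two factors (applicant numbers and a cost-of-research measure) but not how they combine, so the simple product weighting, the Group names, and all figures below are assumptions for illustration only.

```python
def allocate_budget(total_budget, groups):
    """Split a total budget across Groups in proportion to
    (number of applicants) x (cost-of-research index).
    The product weighting is an assumed formula, not NSERC's."""
    weights = {g: n_applicants * cost_index
               for g, (n_applicants, cost_index) in groups.items()}
    total_weight = sum(weights.values())
    return {g: total_budget * w / total_weight for g, w in weights.items()}

# Example with invented applicant counts and cost indices:
groups = {"Physics": (300, 1.4),
          "Mathematics": (200, 0.8),
          "Life Sciences": (500, 1.2)}
alloc = allocate_budget(1_000_000, groups)
```

Under this weighting, a discipline with many applicants but cheap research can receive a similar envelope to a smaller, more expensive one, which is the trade-off the two factors are meant to balance.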
Funding Recommendations
• Group Executive Committees will use a grid to translate the recommendations on quality and on cost of research into funding recommendations
• The grid to be appropriate to the research area
• One grid for each group of research areas with similar research costs
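The grid amounts to a lookup from (quality bin, cost-of-research level) to a grant size. The bin names, cost levels, and every dollar figure below are hypothetical; the slides specify only that such a grid exists per group of research areas.

```python
# Hypothetical funding grid for one group of research areas.
# Rows: quality bins; columns: relative cost-of-research levels.
# All dollar values are invented for illustration.
FUNDING_GRID = {
    ("outstanding", "high"):    80_000,
    ("outstanding", "medium"):  55_000,
    ("outstanding", "low"):     35_000,
    ("strong", "high"):         50_000,
    ("strong", "medium"):       35_000,
    ("strong", "low"):          25_000,
    ("good", "high"):           30_000,
    ("good", "medium"):         22_000,
    ("good", "low"):            16_000,
    ("insufficient", "high"):    0,  # below the funding cutoff
    ("insufficient", "medium"):  0,
    ("insufficient", "low"):     0,
}

def recommend_funding(quality_bin, cost_level):
    """Translate a Section's quality bin and cost-of-research rating
    into a funding recommendation via the Group's grid."""
    return FUNDING_GRID[(quality_bin, cost_level)]
```

Because each group of research areas with similar costs gets its own grid, two proposals in the same quality bin can receive different amounts only through the cost dimension, never through ad-hoc budget haggling.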
Periodic Review of the Structure
• A comprehensive review, and if warranted a re-design, approximately every 10 years
• Minor fine-tuning at any time