COST Trans-Domain Proposals
Final Report on TDP-SAB
TDP Pilot Evaluation and Selection Procedure
187th CSO Meeting, Brussels, 15 May 2013
COST Office
TDP-SAB Evaluation
• The Trans-Domain Proposals Standing Assessment Board (TDP-SAB) was introduced in 2008, in parallel with the launch of the Trans-Domain Proposals evaluation, as a means of fostering interdisciplinary and trans-disciplinary approaches within COST
• Preliminary Proposals were evaluated remotely by the TDP-SAB
• A varying number and profile of DC Members / DC Experts were involved at each call, depending on evaluation needs and availability
• Full Proposals were evaluated first remotely and then at a consensus meeting by an External Expert Panel (EEP)
• Given the large variety of topics, a subset of the remote evaluators (typically 40 to 50 per call) was invited to the EEP meeting
• TDP-SAB Hearings: final ranking of the proposals and recommendation for CSO approval
TDP-SAB Composition
• COST Doc. 205/08: 9 DC Chairs, 18 members nominated by the DC Chairs, and the TDP-SAB Coordinator
• An external consultant (Georges Wanet) was appointed by the former CSO President as TDP-SAB Coordinator, for 8 working days a year at EUR 450 per day
• Five Open Calls: 2008-1, 2008-2, 2009-1, 2009-2, 2010-1
• COST Doc. 4115/10: 9 DC Chairs, TDP-SAB members with broad interdisciplinary expertise nominated by the CNCs, and the TDP-SAB Chair
• This form of the TDP-SAB was never fully activated, given the low number of nominated TDP-SAB members (12 in total)
• Therefore, since the 2010-2 call a specific TDP-SAB composition was put in place: 9 DC Chairs, 18 members nominated by the DC Chairs, TDP-SAB members nominated by the CNCs, and the TDP-SAB Coordinator
• Georges Wanet continued to serve as TDP-SAB Coordinator until the 2012-2 call
TDP-SAB Summary of Calls and Approved Actions
• A total of 810 proposals evaluated over 10 calls
• 34 running Actions (+ 4 suggested for approval)
• Average success rate: 4.8% (a rough cross-check is sketched below)
• Actions allocated to Domains for administrative and monitoring purposes
• Actions unevenly distributed across Domains
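A minimal cross-check of the figures above, sketched in Python; the quoted 4.8% average may be computed per call rather than over the pooled totals, so the overall ratio below is only expected to land in the same range.

```python
# Cross-check of the overall TDP success rate from the figures quoted above:
# 810 proposals evaluated over 10 calls, 34 running Actions plus 4 suggested
# for approval.
evaluated_proposals = 810
approved_actions = 34 + 4  # running Actions + Actions suggested for approval

overall_rate = approved_actions / evaluated_proposals
print(f"Overall success rate: {overall_rate:.1%}")  # ~4.7%, consistent with the ~4.8% average
```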
From TDP-SAB to TDP Panel
• Following the 2010 Mid-Term Review of COST, the CSO established the Working Group "Implementation" to prepare measures and decisions for the implementation of the CSO Strategy "Shaping COST for the future"
• In 2012, the CSO decided to use TDP to test a pilot evaluation and selection procedure, aiming at a simple, fair and fast procedure that is fully transparent to proposers
• In the TDP Pilot, the TDP Panel replaces the TDP-SAB as the body supervising the whole TDP Pilot evaluation procedure, ensuring the transparency, fairness and excellence of the process
TDP Pilot Evaluation and Selection Procedure
• New Proposal Template: more concise and aligned with the evaluation criteria
• 1st Step, "Challenge Selection": independent external experts remotely evaluate the scientific and technological soundness of the proposal
• 2nd Step, "Implementation Plan": TDP Panel members, with invited DC Members, remotely evaluate the coherence and feasibility of the implementation plan against the COST mission and criteria; the TDP Panel prepares consensus reports at a dedicated meeting
• Hearings, "Management Capacity": the TDP Panel, with invited DC Chairs, evaluates the proposers' leadership potential, motivation and drive, and organisational and communication skills
• Final ranking list established
Proposal Evaluation – TDP Panel
• Composition: former CNC-nominated TDP-SAB members with broad trans-disciplinary expertise are formally appointed as TDP Panel members; Chair and Vice-Chairs appointed by the Presidency
• Responsible for monitoring the entire TDP Pilot procedure
• Guardian of the transparency, fairness and excellence of the evaluation process
• Involved at all steps of the evaluation:
• 1st Step: validating the independent external experts identified by the COST Office
• 2nd Step: evaluating proposals against the COST mission and criteria
• Hearings: selecting the final list of proposals to be recommended to the CSO for funding
Proposal Evaluation – Timeline (COST Office, TDP Panel and other evaluators supporting the TDP Panel)
• Closing of Registrations: 29/3
• Identification of External Experts (COST Office): now to 5/6
• Collection date: 14/6
• CoI / Eligibility check (COST Office): 14-19/6
• Validation of External Experts (TDP Panel): 14-26/6
• Challenge Selection, remote evaluation by external experts: 1/7 – 19/7
• Quality Check (COST Office): 29/7 – 2/8
• Step 1 feedback: 9/8
• Implementation Plan, remote evaluation (TDP Panel with invited DC Members): 5/8 – 24/8
• Consensus Meeting (TDP Panel): 9-10/9
• Step 2 feedback: 13/9
• Hearings, with DC Chairs invited (final list): 30/9
• JAF: 16-17/10
• CSO approval: 14/11
COST Experts Database
• A newly developed, in-house e-COST feature
• Used for the timely identification of relevant experts
• As of 13 May 2013:
• ~9,773 registered experts
• Gender balance: 65% male, 35% female experts
• Age balance: ~64% of experts are younger than 45
• Additional effort is needed to cover all Research Areas
• Special access allows CNCs to contact the external experts in their country who are invited to evaluate TDP proposals (CNCs are bound by confidentiality)
Allocation of External Experts
• TDP Registrations (closed 29 March): 163 out of 192 intentions to submit are eligible
• Proposal submission deadline: 14 June at 5 pm (1 proposal already submitted)
• External experts are needed for 163 potential proposals
• Based on past TDP experience, 1-2 proposals are assigned per expert
• 4-5 external experts are assigned to each proposal to ensure at least 3 evaluations (total: ~550 experts)
• The TDP Panel validates the experts to assign on the basis of a set of 7-8 experts per proposal available for evaluation (total: ~870 experts available)
• Availability checks to be sent to ~20 experts per proposal (total: ~3,200 availability checks)
• All these steps are supported by a tailor-made, in-house developed IT system on e-COST
• A back-of-the-envelope check of these figures is sketched below
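A minimal back-of-the-envelope sketch in Python of the expert numbers above. The average load of ~1.35 proposals per expert is an assumption (chosen to be consistent with the "1-2 proposals per expert" past experience), so the results only approximate the totals quoted on the slide.

```python
# Rough estimate of the external-expert workload for the 163 eligible proposals.
proposals = 163

assigned_per_proposal = 4.5    # 4-5 experts assigned, to secure at least 3 evaluations
validated_per_proposal = 7.5   # 7-8 validated experts available per proposal
contacted_per_proposal = 20    # availability checks sent per proposal

# Assumption: each expert handles ~1.35 proposals on average (past experience: 1-2).
avg_proposals_per_expert = 1.35

experts_to_assign = proposals * assigned_per_proposal / avg_proposals_per_expert
experts_to_validate = proposals * validated_per_proposal / avg_proposals_per_expert
availability_checks = proposals * contacted_per_proposal

print(f"Experts to assign:   ~{experts_to_assign:.0f}")   # ~540  (slide: ~550)
print(f"Experts to validate: ~{experts_to_validate:.0f}") # ~910  (slide: ~870)
print(f"Availability checks: {availability_checks}")      # 3260  (slide: ~3,200)
```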
Thank you
COST Office
Avenue Louise 149
1050 Brussels, Belgium
www.cost.eu