
NSERC DG Advice



  1. NSERC DG Advice yann-gael.gueheneuc@polymtl.ca Version 0.6.2 2016/06/08

  2. Questions are all welcome at yann-gael.gueheneuc@polymtl.ca

  3. Disclaimer: I cannot be held responsible for the failure or success of your applications, whether or not you follow this advice

  4. Disclaimer: NSERC does not endorse, in whole or in part, the content or the form of these slides in any way

  5. NSERC DG Advice • Each NSERC DG application is evaluated according to 4 criteria and 6 merit indicators • Excellence of the researcher • Merit of the proposal • Contribution to the training of HQP • Cost of research

  6. NSERC DG Advice • Each NSERC DG application is evaluated according to 4 criteria and 6 merit indicators • Exceptional • Outstanding • Very strong • Strong • Moderate • Insufficient

  7. NSERC DG Advice • How are these criteria rated by the reviewers using the indicators? • How can you ease the reviewers’ job and possibly be more successful?

  8. Outline • Process in a nutshell: candidate’s point of view; internal reviewer’s point of view (off-line work, competition week) • Funding decisions: in a nutshell; bins; ER vs. ECR • Criteria and indicators: “values” of criteria; “meanings” of indicators; NSERC rating form; my own rating form • Advice: introduction; excellence of the researcher; merit of the proposal; contribution to the training of HQP; CCV (Form 100); Application (Form 101) • Conclusion • Further readings

  9. Outline • Next section: Process in a nutshell

  10. Process in a Nutshell • From the candidate’s point of view • August 1st: submission of Notice of Intent • NoI, formerly Form 180 • November 1st: final submission of • CCV (formerly Form 100) • Application (formerly Form 101) • Publication samples • March/April: announcement of the results

  11. Process in a Nutshell • Each application is reviewed by • 5 colleagues: first and second internal reviewers and 3 readers • 0 to 5 external reviewers • From the internal reviewer’s point of view • Two main parts • Off-line work, e-mails/readings • Competition week in Ottawa

  12. Process in a Nutshell • Off-line work • August 27th: reception of all the submissions • In 2012, 322 submissions • In 2014, 324 submissions • In 2015, 327 submissions • September 7th: ratings (expertise levels and conflicts) of all the submissions • High, Medium, Low, Very Low, Conflict, X (language)

  13. Process in a Nutshell • Off-line work • September 24th: final choice of the 1st internal reviewers for each application • In 2012, 14 applications as 1st internal reviewer, 15 as 2nd internal reviewer, 17 as reader = 46 • In 2014, 9 applications as 1st internal reviewer, 2 as 2nd internal reviewer, 28 as reader = 39 • But I was session chair that year… • In 2015, 13 applications as 1st internal reviewer, 7 as 2nd internal reviewer, 29 as reader = 49

  14. Process in a Nutshell • Off-line work • October 5th: choice by the 1st internal reviewer of 5 external referees per application • In 2012, 14 applications = 70 referees • In 2014, 9 applications = 45 referees • In 2015, 13 applications = 65 referees • May include referees suggested by the candidate but may also replace all of them

  15. Process in a Nutshell • Off-line work • October 22nd: ratings of applications from other evaluation groups • In 2012, 1 application • In 2014, 0 applications (out of 39 applications screened for possible joint reviews) • In 2015, 0 applications

  16. Process in a Nutshell • Off-line work • Early December: final list of readings • In 2012, 47 applications • In 2014, 39 applications • In 2015, 49 applications

  17. Process in a Nutshell • Off-line work • January/February: reception of the reports from the external referees • In 2012, 123 reports • In 2014, 128 reports • In 2015, 128 reports • February 18th to 22nd: competition week in Ottawa during which each application is discussed and rated

  18. Process in a Nutshell • Off-line work • In 2012 (and every year…), a lot of work! • 322 submissions • 47 applications (including joint reviews) • 70 referees • 123 referee reports • Make it easier for the reviewers!

  19. Process in a Nutshell • Competition week • February 18th to 22nd: competition week in Ottawa during which each application is discussed and rated • 5 days • Very intense, demanding, and tiring

  20. Process in a Nutshell • Competition day • Starts at 8:30am • Divided into • 31 15-minute slots • 2 15-minute breaks • 1 45-minute lunch • Ends at 5:15pm

  21. Process in a Nutshell • Competition slot • In a 15-minute slot, the ratings of an application are chosen by the reviewers • Or the application is “deferred”, to be re-discussed at the end of the day • In 2012, 1 application • In 2014, 4 applications • In 2015, 3 applications

  22. Process in a Nutshell • Competition slot • The 1st internal reviewer gives ratings with justifications, which must be facts from the forms • Summarises the external reviews • The 2nd internal reviewer contrasts, supports, and adds missing facts from the forms • The readers complement or challenge the ratings given by the 1st and 2nd internal reviewers, supported by facts from the forms

  23. Process in a Nutshell • Competition slot • The 1st internal reviewer gives ratings with justifications, which must be facts from the forms • A typical presentation follows this pattern (not exactly the NSERC criteria) • Candidate: career, funding, visibility, publications, HQP record, planned training • Proposal: context, gaps, characteristics (Incremental? Applicable? Feasible?) • External: summary of the referees’ reviews, summary of the provided contributions • Then, the reviewer gives their ratings

  24. Process in a Nutshell • Competition slot • The session chair keeps time strictly • The session chair makes sure that any discussion sticks to the facts

  25. Process in a Nutshell • Competition slot • Ratings are anonymous • Secret electronic vote • The session chair announces the results • Ratings are consensual • If reviewers/readers strongly disagree, the application is re-discussed at the end of the day • I did not see any strong debates: mostly, the 1st and 2nd internal reviewers agreed, backed up by the readers • Some facts were sometimes highlighted and ratings were changed accordingly

  26. Process in a Nutshell • Competition slot • Any criterion rated as Moderate or Insufficient receives comments from the committee, reflecting the reviewers’ consensus • NSERC provided typical comments, for example: “The Evaluation Group was of the opinion that the applicant did not take advantage of the available space to provide compelling information on how the proposed problems could be solved. The applicant is encouraged to situate his/her work and contributions more carefully within the context of the [subject].”

  27. Outline • Next section: Funding decisions

  28. Funding Decisions • In a nutshell • Each proposal is rated secretly by the reviewers after the discussions • The median of the ratings is used for each criterion • For example • Excellence of the researcher: {S, S, M, M, M}, rating is M • Merit of the proposal: {V, V, S, S, M}, rating is S • Impact of HQP: {V, S, S, S, M}, rating is S • The application rating is therefore {M, S, S}

  29. Funding Decisions • Bins • The numeric “values” of the ratings are “added” • For example, {M, S, S} → 2+3+3 = 8 • The application is placed into one of 16 bins • The bins are labelled A through P and correspond numerically to sums of 18 down to 3

  30. Funding Decisions • Bins • Bins A and P are uniquely mapped to {E, E, E} and {I, I, I}, while other bins contain a mix of numerically equivalent ratings, e.g., {V, S, M} is in the same bin as {S, S, S} and {M, S, V} • For example, the application rated {M, S, S} is in bin K • Not all applications in a bin are funded: {S, S, S} may be funded while {M, S, V} is not • Because of the Moderate indicator for the first criterion • The cut-off point depends on the year
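
The bin arithmetic on slides 28-30 can be sketched in a few lines of Python. This is an illustration only, not NSERC’s published algorithm: the numeric indicator values (Exceptional = 6 down to Insufficient = 1) are inferred from the examples above ({M, S, S} → 2+3+3 = 8, with bin sums spanning 18 down to 3), and the helper names are mine.

    from statistics import median

    # Assumed indicator values, inferred from the slides' examples:
    # {M, S, S} -> 2+3+3 = 8, with bin sums running from 18 down to 3.
    VALUES = {"E": 6, "O": 5, "V": 4, "S": 3, "M": 2, "I": 1}

    def criterion_rating(votes):
        # Median of the reviewers' secret votes for one criterion.
        return int(median(VALUES[v] for v in votes))

    def bin_letter(researcher, proposal, hqp):
        # Sum the three criterion medians, then map the sum to
        # bins A (sum 18) through P (sum 3).
        total = (criterion_rating(researcher)
                 + criterion_rating(proposal)
                 + criterion_rating(hqp))
        return chr(ord("A") + 18 - total)

    # The example from slides 28-30: medians M, S, S -> 2+3+3 = 8 -> bin K.
    print(bin_letter(["S", "S", "M", "M", "M"],   # Excellence of the researcher
                     ["V", "V", "S", "S", "M"],   # Merit of the proposal
                     ["V", "S", "S", "S", "M"]))  # Impact of HQP
    # prints: K

As slide 30 notes, the bin alone does not decide funding: {S, S, S} and {M, S, V} both sum to 9 (bin J under this mapping), yet the application carrying the Moderate rating may still fall below the cut-off.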

  31. Funding Decisions • ER vs. ECR • Candidates are divided into • ER: established researchers, who have already applied to (and possibly been funded by) the NSERC DG • ECR: early-career researchers, who are “within two years of the start date of an NSERC eligible position” • ECRs are funded one bin “lower” (better) than ERs

  32. Outline • Next section: Criteria and indicators

  33. Criteria and Indicators • “Values” of criteria • Excellence of the researcher • Merit of the proposal • Contribution to the training of HQP • Cost of research

  34. Criteria and Indicators • “Values” of criteria • Excellence of the researcher • Knowledge, expertise and experience • Quality of contributions to, and impact on, research areas in the NSE • Importance of contributions

  35. Criteria and Indicators • “Values” of criteria • Merit of the proposal • Originality and innovation • Significance and expected contributions to research; potential for technological impact • Clarity and scope of objectives • Methodology and feasibility • Extent to which the scope of the proposal addresses all relevant issues • Appropriateness of, and justification for, the budget • Relationship to other sources of funds (not really important: amounts of previous grants, in particular NSERC DG, should be ignored)

  36. Criteria and Indicators • “Values” of criteria • Contribution to the training of HQP • Quality and impact of contributions • Appropriateness of the proposal for the training of HQP in the NSE • Enhancement of training arising from a collaborative or interdisciplinary environment, where applicable

  37. Criteria and Indicators • “Values” of criteria • Cost of research • Rationale (not really important, but you cannot get more than what you ask for, no matter the merit)

  38. Criteria and Indicators • “Meanings” of indicators • Exceptional • Outstanding • Very strong • Strong • Moderate • Insufficient

  39. Criteria and Indicators • “Meanings” of indicators • Exceptional • I did not see many exceptional ratings, obviously • Outstanding • Very strong • Strong • Moderate • Insufficient

  40. Criteria and Indicators

  41. Criteria and Indicators • NSERC rating form • NSERC provides a 2-page rating form • I found that this rating form does not follow the presentation pattern used during competition slots because it spreads the information around • However, each application was obviously rated according to the 4 criteria and 6 indicators

  42. Criteria and Indicators • NSERC rating form (1/2)

  43. Criteria and Indicators • NSERC rating form (2/2)

  44. Criteria and Indicators • My own rating form, with columns for Researcher, Proposal, and HQP

  45. Outline • Next section: Advice

  46. Advice • Introduction • Reviewers receive 2-3 dozen applications • Overall, upon first review, the quality is impressive, which generates a positive reaction • However, the objective is to discriminate, which initiates a vigorous search for flaws

  47. Advice • Introduction • Reviewers may perceive aspects of applications as confusing, ambiguous, incomplete, or just not compelling • They will not give the benefit of the doubt • I witnessed some excellent researchers (IMHO) receive low ratings because of sloppiness in their applications

  48. Advice • Introduction • Reviewers will most likely “mine” the CCV, application, and publications to make up their minds regarding the 4 criteria • Make it easy for them to mine your applications!

  49. Advice • Introduction • CCV (Form 100) • Is used for two of the three important criteria • Application (Form 101) • Is necessary for the merit of the proposal

  50. Excellence of the Researcher • CCV (Form 100) • Career: pages 1-2 • Funding: pages 3-… • Visibility • “Other Evidence of Impact and Contributions” • Awards, chairing, editorship, organisation, seminars: anything showing external acknowledgments • Publications • “Other Research Contributions” • Quantity vs. quality
