
Writing Proposals with Strong Methodology and Implementation








  1. Writing Proposals with Strong Methodology and Implementation Kusum Singh, Virginia Tech Gavin W. Fulmer, National Science Foundation

  2. Goals • Encourage you to seek funding from NSF for your research. • Help you develop rigorous methodology, data collection and analysis plans that will make your proposal competitive. • Help you consider the level of detail appropriate for implementation projects.

  3. Describing Your Project’s Methodology

  4. Expectations for Methods in DRL • The DRL Programs welcome research using a variety of evidence. • The program is open to qualitative, quantitative, and mixed methods. • Methods must be rigorous and appropriate to the proposed research questions or hypotheses. • Design, methods, and analytic techniques should have a coherent and logical link. • Research methods should be described in adequate detail.

  5. Details of Methods to Include – 1 • Provide a rationale for your research design • Make it clear how the research design and analyses answer the research questions (RQs) • Include a description of study population and sampling method, sample size, expected effect size • Power analysis should inform sample size decision
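The slide's point that a power analysis should inform the sample size can be illustrated with a minimal sketch. This is not NSF guidance and not a substitute for dedicated tools (e.g., G*Power or exact t-based methods); it uses only the normal approximation for a two-sided, two-group comparison, and all numbers are illustrative.

```python
# Hedged sketch: approximate per-group sample size for a two-group,
# two-sided comparison via the normal approximation. Illustrative only;
# exact t-based calculations give a slightly larger n.
from math import ceil
from statistics import NormalDist

def per_group_n(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group n: 2 * ((z_(1-alpha/2) + z_power) / d)^2, rounded up."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = .05, two-sided
    z_beta = z.inv_cdf(power)           # ~0.84 for power = .80
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (Cohen's d = 0.5) at alpha = .05 and power = .80:
print(per_group_n(0.5))  # prints 63 (an exact t-based calculation gives ~64)
```

The takeaway for reviewers is the reasoning chain, not the arithmetic: the expected effect size, alpha, and target power should be stated explicitly, and the proposed sample size should follow from them.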

  6. Details of Methods to Include – 2 • Instruments or protocols to be used • Validity, reliability, and triangulation of measures • Reviewers are cautious about development of new measures • Data analysis plans • Statistical Models, procedures for analysis of text/video/observation data • All of these need to have a rationale for them that connects to your RQs

  7. Quantitative research • Research design (e.g. experimental, quasi-experimental and non-experimental designs, issues of internal & external validity) • Measurement (e.g. data to be collected, constructs, measures, validity & reliability of measures) • Data analysis (e.g. statistical decisions, models & procedures)

  8. Qualitative Research • Identify the methodology as a systematic research design (e.g. case study, discourse analysis, etc.) • Describe how and what data will be collected • Consider issues of validity and triangulation • Include plans for analysis of textual data (coding scheme, themes, etc.) • Find a good balance between a planned approach to analysis and flexibility to respond to findings

  9. Find the Expertise You Need • Content experts are not necessarily methods experts, so partner with research methodologists • Sooner is better than later (in the proposal-writing stage) • Especially necessary if the design is complex or you use innovative methods • Find a colleague • As co-PI or as consultant

  10. Common Missteps in Methods – 1 • Overly generic language and description • “We will use constant comparative methods.” • “We will use HLM.” • Lack of a consistent link between the theory, the RQs, the data collected, and the analyses • Reviewers will notice. • Methods and planned analyses inadequate to answer RQs. • Try developing a matrix of RQs, data/measures, and analyses – even if only for yourself during planning

  11. Common Missteps in Methods – 2 • Too little or too much data without a clear analysis plan • Reviewers will wonder if you understand the task. • Method is novel and not well understood in the field • Needs more detail, examples, and citations to justify that it is appropriate

  12. Summary of Main Points • Articulate clearly your research questions or research hypotheses • Think about the most appropriate and rigorous methods to answer your research questions • Give a clear and concise description of the research methods • Include your rationale for research design decisions • Include a research methods expert in your team • Articulate clearly why your research is important and how it would contribute to theory and practice

  13. Describing an Implementation

  14. Details of Implementation • There are important implementation issues that need to be addressed if your project includes • Curriculum development • Professional development • Interventions

  15. For All Implementation Projects • Consider the method(s) used to gauge the quality of the implementation • Whether as “Fidelity of Implementation” (FOI), Intended/Enacted Curriculum, or other approaches • Be specific on the STEM content, ages/grades, settings • Be clear on the roles of the team • Who will lead PD or curriculum, who will oversee implementation? • Who will collect evaluative data on implementation?

  16. Issues for Curriculum Development • Specify the STEM content of interest and age range(s) for which you are developing curriculum • Specify the role(s) of the PI team, outside experts, participating teachers, or others • Identify the process for development, revision, and field-testing • Provide justification for the design process you will use • Make sure the measures match the materials/curriculum under development

  17. Issues for Professional Development • Be specific on the professional development (PD) • STEM content, grades, and school settings • Role(s) of the PI team, outside experts, participating teachers, or others • Format of professional development (e.g., online, workshops) • Duration and location of PD • Evaluation • Identify the model for PD you will use • Train-the-trainer • Master teacher • Professional Learning Community • Provide justification for the model, the format, and your team’s expertise

  18. Issues for Intervention • Describe development history and its prior use • Provide evidence, if any, for intervention’s potential effects • Describe in detail: • Population and sample; • Setting, duration, and content; • Design process, if the intervention will be revised iteratively

  19. Consider Generalizability • If you are developing a new curriculum/PD model: • How will the intervention, curriculum, or the professional development developed in your setting apply to new settings that may differ from the study? • If you are applying an intervention, PD model, or curriculum adopted from another setting: • How well does that intervention apply to your setting? • Will promising prior results be replicable in this project?

  20. Evaluation Plan • Evaluation should be useful for improving the research project • The design and content of the plan should be appropriate to what would enhance or benefit the project • Formative or summative, internal or external evaluation may be appropriate, depending on the project. • For example, advisory committees can be an appropriate evaluation mechanism for some projects. • Go to the specific session on Project and Program Evaluation later in the conference for more details.

  21. Any Questions?? Don’t be shy.

  22. Thank you! Feel free to contact Kusum Singh for follow-up and tips for finding a good methodologist: ksingh@vt.edu
