
Data Collection Strategies, Methods and Standardization



  1. Data Collection Strategies, Methods and Standardization
July 24, 2006
Jeff Mossey, Assistant Director, Kentucky NSF EPSCoR

  2. Brief Review of Typical Activities
• Annual Reporting (FastLane)
• State Reporting (likely different)
• Brochure Annual Report
• Nuggets/Success Stories
• Newsletters
• Online Reporting Systems
• Diversity and Evaluation Reports/Activities
• Reverse Site Visits

  3. Agenda
• Update on Evaluation Activities
• Identify Some (Evaluation) Problems and Possible Solutions
• Demonstrate the “EPSCoR Connections” Site
• Strengths/Weaknesses/Improvements
• Is this something that we can do as a program?
• Open Discussion of Issues/Concerns/Ideas
• Present it to NSF; National Conference Breakout

  4. Status of Evaluation Activities
• Report from Norman Webb coming soon
• “Three Phases” of activities:
  Phase I – (Conceptual) Framework
  • Conduct Workshops
  • Establish Indicators
  • Develop Report
  Phase II – (Detail) Development
  • Establish metrics and standards
  • Find out what data is available
  Phase III – Implementation and Testing

  5. Six Major Program Indicators
• Human Resource Development
• Research Production
• Research Portfolio Quality
• Research Investments and Materials
• Research Collaboration and Networking
• Research Climate, Culture, and Communications

  6. Norman’s Comments
• Data availability? Both nationally and at the state level (e.g., funding/publications)
• FastLane/EPSCoR Database Relationship
• Common System or Common File Format (see the record-format sketch after this slide)
• Strength of the Connection?
• Attribution to the project (Direct/Indirect)
• Timing of the Reporting (Federal Fiscal Year)
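The “common file format” idea can be made concrete. Below is a minimal sketch in Python, assuming a hypothetical record layout; the field names (state, federal_fiscal_year, indicator, attribution, connection_strength, summary) are illustrative only and not an official EPSCoR or FastLane schema.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical common record for one standardized success story.
# Field names are illustrative, not an official EPSCoR schema.
@dataclass
class SuccessStory:
    state: str                 # reporting jurisdiction, e.g. "KY"
    federal_fiscal_year: int   # timing of the reporting (federal fiscal year)
    indicator: str             # one of the six major program indicators
    attribution: str           # "direct" or "indirect" connection to the project
    connection_strength: int   # strength of the connection, e.g. 1 (weak) to 3 (strong)
    summary: str               # short, nugget-style narrative

story = SuccessStory(
    state="KY",
    federal_fiscal_year=2006,
    indicator="Research Collaboration and Networking",
    attribution="direct",
    connection_strength=3,
    summary="EPSCoR seed investment led to a follow-on NSF award.",
)

# Serialize to a common file format (here JSON) that every
# jurisdiction could exchange with a shared reporting system.
print(json.dumps(asdict(story), indent=2))
```

A shared record like this is what would let state-level data and national data (funding, publications) line up under the same indicators and the same fiscal-year timing.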

  7. Problems We Must Overcome
• LSAMP Program:
  • (Goal) Increase the graduation rate of underrepresented students in the sciences
  • (Metric) Did they go to grad school?
• EPSCoR Program:
  • (Goal) Research Infrastructure Development
  • (Goal) Increased Grant Funding (Competitiveness)
  • (Goal) Diversity
  • (Goal) Economic Development
  • (Goal) Outreach and Education (K-12)
  • (Goal) Networking and Strategic Development with Industrial, Governmental, and Academic Partners

  8. Problems We Must Overcome
[Diagram: EPSCoR Investments/Activities → Meaningful, Aggregated Success Stories]
1.) Use common metrics, stories, and techniques for gathering success indicators
2.) Use common systems for collecting the data
3.) Gather the right metrics/stories

  9. Two Layers of Success
• Aggregated, Impressive Numbers (see the roll-up sketch after this slide)
  “Annually over 200,000 people participate in this program” –LSAMP Brochure
• Nugget Short Stories
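Once stories are captured in a common format, the “impressive numbers” layer is a simple roll-up. A minimal sketch, using hypothetical records and a hypothetical participants field (not part of any official format):

```python
# Minimal sketch: roll standardized story records up into one
# program-wide headline number. Records and field names are
# hypothetical, not an official EPSCoR data format.
stories = [
    {"state": "KY", "participants": 1200},
    {"state": "ME", "participants": 850},
    {"state": "VT", "participants": 430},
]

total = sum(s["participants"] for s in stories)
print(f"Annually over {total:,} people participate across {len(stories)} jurisdictions")
```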

  10. Demo of the “EPSCoR Connections” site: http://www.kynsfepscor.org/test/EpscorTest/test.php

  11. Weaknesses
• Won’t accommodate “other” successes (e.g., NIH awards, Nobel Prizes)
• Progress toward milestones and other “narrative” updates will still be needed
• Programs newer to EPSCoR will have fewer stories

  12. Strengths
• Success stories are program-wide and standardized
• Tracing backward, or “backing into,” a story is easier than constant monitoring “outward”
• Attribution to EPSCoR can vary
• Minimal time commitment
• Older stories aren’t neglected
• By definition, connects EPSCoR to successful outcomes … follow-on awards = increased competitiveness
• Aggregate connections to show an impressive percentage of involvement (see the sketch after this slide)
• Stories (i.e., the history of EPSCoR investments) are archived
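One way to read the “aggregate connections” point: start from follow-on awards (“backing into” the story) and compute the share that trace back to an EPSCoR investment. A minimal sketch; the award list and the epscor_connection flag are hypothetical:

```python
# Minimal sketch: "backing into" stories by starting from follow-on
# awards and checking each for a recorded EPSCoR connection.
# The award records and epscor_connection flag are hypothetical.
awards = [
    {"id": "NSF-0601", "epscor_connection": True},
    {"id": "NSF-0617", "epscor_connection": True},
    {"id": "NIH-0642", "epscor_connection": False},
    {"id": "NSF-0688", "epscor_connection": True},
]

connected = [a for a in awards if a["epscor_connection"]]
pct = 100 * len(connected) / len(awards)
print(f"{pct:.0f}% of follow-on awards trace back to an EPSCoR investment")
```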

  13. Next Steps … Be Proactive!
• Is this something we want to do?
• Standardize the story-gathering process? Develop a “user’s manual”
• Document other evaluation comments
• Share with Norm/Sherry
