
Food and Agricultural Education Information System (FAEIS) Project User Opinion Survey 2005


Presentation Transcript


  1. Food and Agricultural Education Information System (FAEIS) Project User Opinion Survey 2005 Thursday, June 23, 2005 Washington, D.C. H. Dean Sutphin, Professor, Agriculture & Extension Education, Virginia Tech University (sutphind@vt.edu) Yasamin Miller, Director, SRI, Cornell University (yd17@cornell.edu, www.sri.cornell.edu)

  2. Purpose • To understand FAEIS users' experiences with the system and its usefulness • To gather input from FAEIS users in higher education on improvements and changes needed in the future

  3. Objectives 1. Solicit input about the user-friendliness of FAEIS 2. Determine how well FAEIS is meeting its goal of supporting higher education in the food, agriculture, and natural resources sciences 3. Determine how well FAEIS is meeting its goal of supporting the national network of universities 4. Develop a comprehensive view of current and future use of FAEIS based on feedback from experts in the field

  4. Objectives 5. Determine the effectiveness of FAEIS and its future direction in terms of context, input, process, and product 5.1 CONTEXT • Includes the elements of the university system in which FAEIS operates: alignment with the needs of the national network of universities, the reliability of FAEIS, and the interactions that university personnel have with FAEIS

  5. Objectives 5.2 INPUT • Asks the user to consider the human, fiscal, and other resources required for FAEIS to operate within the university system, such as the commitment of staff time and variables related to data entry.

  6. Objectives 5.3 PROCESS • The user is asked to evaluate the process variables (e.g., timing of data entry and help desk activity) required for development, maintenance, and continuous improvement.

  7. Objectives 5.4 PRODUCT • Evaluates characteristics of project outcomes and uses of the system, such as the quality of the report generators, the reliability of system components, and the need for further refinements.

  8. Methodology • Survey Research using Case Study Methodology • Administer Survey Instrument to Population of FAEIS User Group • Focus Group using Expert Panel

  9. Methodology • Structure of the evaluation distributed to the FAEIS panel of experts in Fall 2004 for review and reaction • Review and refinement of the structure based on that input • Decision to delay the evaluation until Spring 2005 • Survey instrument developed by SRI in collaboration with the FAEIS central project staff and the USDA Higher Education administration responsible for the project

  10. Population and Sample • 1862 Land-Grant: 779 • 1890 Land-Grant: 135 • 1994 Land-Grant: 26 • Non-Land-Grant: 419 • Other (US, USDA, Priv, Org): 57 • Total: 1,416

  11. Instrument Development 1. Outlining overall survey objectives 2. Development of survey dimensions 3. Development of items within dimensions 4. Internal testing: administration of the survey to individuals on the FAEIS administrative team 5. Revision of the instrument based upon internal testing

  12. Instrument Development 6. Pilot test via web • N=35 individuals: 10 pre-selected based on qualifications, 25 randomly selected • Personalized invitation e-mails with unique URL links sent on 5/26/05 • Reminder e-mail sent to non-respondents on 5/31/05 • Total: 17/35 completions • Response rate: 48.6%

  13. Instrument Development 7. Revision of the questionnaire based upon pilot results • Panel of experts used to determine instrument validity • Variance in responses examined • Not all questions applicable to all respondent types • Additional response categories added; a few minor changes made • Conclusion: the revised instrument was valid and reliable and able to capture the desired information

  14. Data Collection 8. Launching of tested instrument • Personalized invitation e-mail with unique URL link sent on June 6, 2005

  15. Response Rate Number of Responses by Institution Type (as of June 17) • 1862 Land-Grant: 779 total, 216 completions, 28 refusals, 27.7% response rate • 1890 Land-Grant: 135 total, 25 completions, 3 refusals, 18.5% • 1994 Land-Grant: 26 total, 4 completions, 0 refusals, 15.4% • Non-Land-Grant: 419 total, 99 completions, 10 refusals, 23.6% • Other (US, USDA, Priv, Org): 57 total, 8 completions, 3 refusals, 14.0% • TOTAL: 1,416 total, 352 completions, 44 refusals, 24.8%
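The rates on this slide are simply the ratio of completions to the invited population within each institution type (refusals are tallied separately). The short sketch below is illustrative only, not part of the original survey tooling; the counts are copied from the table above.

```python
# Illustrative only: reproduces the response-rate arithmetic on this slide.
# Counts (population, completions) are copied from the table above.
counts = {
    "1862 Land-Grant": (779, 216),
    "1890 Land-Grant": (135, 25),
    "1994 Land-Grant": (26, 4),
    "Non-Land-Grant": (419, 99),
    "Other (US, USDA, Priv, Org)": (57, 8),
}

for name, (population, completions) in counts.items():
    print(f"{name}: {completions / population:.1%}")

total_population = sum(p for p, _ in counts.values())      # 1,416
total_completions = sum(c for _, c in counts.values())     # 352
print(f"TOTAL: {total_completions / total_population:.1%}")  # ~24.9% (reported as 24.8% on the slide)
```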

  16. Follow-Up with Non-Respondents • Send up to five reminder e-mails to non-respondents • Identify institution types with low response rates and follow up with reminder phone calls

  17. Control for Non-Respondent Error • Comparison of non-respondent and respondent data • Determine reasons for lack of participation in the survey
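The presentation does not say how the non-respondent comparison was carried out; one common approach is to test whether respondents and non-respondents are distributed differently across institution types. A hedged sketch of that idea, using a chi-square test on the counts from the response-rate slide (the choice of test is an assumption here, not the project's documented method):

```python
# Hypothetical non-response bias check; the presentation does not specify a
# statistical test, so a chi-square comparison on institution type is assumed.
from scipy.stats import chi2_contingency

# Completions and non-respondents per institution type, taken from the
# response-rate slide (non-respondents = population - completions).
completions = [216, 25, 4, 99, 8]
non_respondents = [779 - 216, 135 - 25, 26 - 4, 419 - 99, 57 - 8]

chi2, p_value, dof, _expected = chi2_contingency([completions, non_respondents])
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests that response propensity differs by institution
# type, i.e. respondents may not be representative of non-respondents.
```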

  18. In-Depth Analysis and Probing • Use FAEIS Expert Panel and Focus Group • Data Transcribed and Reported Internally • Anonymity of Responses

  19. Final Report • Analyses to portray variables and comparisons on dimensions of particular interest • Conclusions • Recommendations • Executive Summary

  20. Preliminary Findings Who are the respondents? 352 respondents out of 1,416 by June 17 • 82.4% use the FAEIS system and enter data • 4.4% do not enter data, but use the system • 13.2% do not use the system at all (non-use was significantly higher among Non-Land-Grant (20.2%) and Other (62.5%) respondents)

  21. Preliminary Findings: Context Q: Can you obtain student placement data? (significantly more Non-Land-Grant respondents (30%) cannot obtain student placement data) Q: Are we collecting data from the institutions which you consider your peers? Q: Do a sufficient number of institutions report data to allow for meaningful and helpful analysis?

  22. Preliminary Findings: Input Q: How would you rate your ability to obtain the following data in the format required by FAEIS:

  23. Preliminary Findings: Process Q: Are the instructions in the FAEIS system clear?

  24. Preliminary Findings: Process (continued) Q: Is the support offered by the Help Desk adequate?

  25. Preliminary Findings: Product Q: How useful is the information collected in FAEIS:

  26. Preliminary Findings: Product (continued) Q: To what extent does FAEIS add value to other information sources that are available?

  27. Preliminary Findings: Product (continued) • 84.8% do not use Custom Report Builder

  28. Preliminary Findings: Product (continued) Q: Who are users of FAEIS data at your institution?

  29. Preliminary Findings: Product (continued) Q: In general, to what extent is it useful for your institution to participate in the FAEIS database?

  30. Preliminary Thoughts on Conclusions and Recommendations • Need further analyses to explain the high rate of “don’t know” responses. Is this response isolated to a particular population? How can we correct this lack of information? • Preliminary data show a positive view of FAEIS on most variables • Need to determine why it is more difficult for non-land-grant institutions to access data • The Custom Report Builder is in the early stages of release, and current data may not reflect its full potential

  31. Preliminary Thoughts on Conclusions and Recommendations • FAEIS seems satisfactory in terms of the comprehensive elements involving context, input, process, and product • Respondents were not forthcoming with significant new and innovative recommendations for future development • FAEIS seems to be meeting its goals in terms of user-friendliness and meeting the needs of the university community.

  32. Final Report Format • Are there recommendations for the format, style or types of analyses? • Are there recommendations on developing the web report or the written report? • Any suggestions on dissemination of findings?

  33. Questions Your thoughts – opinions?

  34. Thank you.
