Beginning Farmer and Rancher Development Program Best Practices Webinar #2 Evaluating Your BFRDP Project
Introduction and Welcome • Welcome – one of the recommendations of the 2011 PD meeting is to have webinars to share best practices and discuss common issues. • This presentation by Stephanie Ritchie, USDA-National Agricultural Library, is about evaluation of participants and annual reporting of outcomes for all BFRDP grants. • Feel free to ask any questions • This meeting is being recorded and the link will be posted on the BFRDP website
Evaluating Your BFRDP Project Stephanie Ritchie USDA – National Agricultural Library Start2Farm.gov
Goals for This Session • Understand why program evaluation is important • Learn current BFRDP evaluation requirements to improve data collection and reporting process • Discover best practices and tools for program evaluation that will help you be more successful • Explore rules for data collection from individuals • Investigate how to develop instruments
Outcomes: What Are They and Why Should You Care? • Outcomes are the changes that occur as a result of our activities and investments • Government is often asked to account for how the provided resources were used and to demonstrate the outcomes of the projects it funded • This information is used for legislative and funding appropriation decisions, and to demonstrate to the public that taxpayer monies have been put to good use.
NIFA Outcome Reporting Requirements • Acknowledge funding in all materials • Participate in the annual Project Directors' meeting • Provide any information that is freely available to the public to the BFRDP Clearinghouse • Provide requested data through CRIS annual reports • The format of the NIFA system is challenging for BFRDP outcome data • You may send additional information by e-mail • BFRDP outcome data will be used to: • Respond to questions from Congress and the public • Produce two annual reports – overall BFRDP program results for the public and detailed project data for NIFA internal program management • Achieve overall goals: Improve Performance – Enhance Visibility – Increase Accountability
Evaluation to Show BFRDP Successes – CRIS Report • Use Activity and Outcome Measures Recommended by BFRDP • Can be compiled across projects to create an overall picture of BFRDP activities and outcomes • Measures impacts of projects on participating individuals/families/farms
Evaluation for Project Improvement • Projects may wish to include questions in their evaluation instrument about the techniques/methods/practices specific to their work • Examples of what might be evaluated • How did you hear about our program? • Instructor performance • Satisfaction with services received • Please submit your internal program evaluation questions/instruments to Start2Farm Extranet - http://www.s2fdata.org/ • Get an Extranet login at: https://docs.google.com/spreadsheet/viewform?pli=1&formkey=dHBxR0dQRmRLdmNTZzMtb1lCaGtGZ2c6MQ#gid=0
Meta Changes / System Changes • Evaluation to show changes beyond the BFRDP program to the (beginning) farmer community at large • No plan yet to do this type of evaluation • The ability to quantify significant social/economic/environmental change is beyond the scope of BFRDP goals and individual projects • An Education Enhancement Project, or a group of BFR training organizations, could be an appropriate entity to study beginning farmer training in general
Goals for This Session • Understand why program evaluation is important • Learn current BFRDP evaluation requirements to improve data collection and reporting process • Discover best practices and tools for program evaluation that will help you be more successful • Explore rules for data collection from individuals • Investigate how to develop instruments
Current BFRDP Outcome Reporting • Mandated Outcome Measures: • How many people are trained? • How many of the target audience are trained? • How many new farms have been created? • Outcomes not required, but of interest: • What are the best practices for beginning farmer training? • What lessons have been learned by grant projects? • Activity Measures: • Report the amount of materials produced to publicize/recruit • Report the number of workshops/training programs offered • Provide total numbers and demographics of participants attending workshops and training programs • Outcome Measures: • After the training program (at varied intervals), record the # and % of participants with a change in knowledge, attitudes, skills, or intention (KASI) and a change in behavior/approach • Okay to measure both planned (intention) and actual behavior changes
More Successful New Farmers! • Assess right after training: • Increased knowledge • Changed attitude • Increased skills • Intention to adopt new practices • Application of new practices/skills • Assess 1+ years after training (long-term goal): • Increased productivity • Increased profitability/efficiency • Increased environmental sustainability • Increased quality of life
Evaluation Theory: Levels of Evidence • Impacts: • 7. End results • 6. Behavior (practice change) • 5. KASI change (knowledge, attitude, skills, intentions) • Outputs: • 4. Reactions • 3. People involvement • Inputs: • 2. Activities • 1. Resources
New DRAFT Outcome Measurements • Of those who complete any part of a training program (baseline), immediately record the number and percent of participants who: • plan to start farming • are currently farming • plan to continue farming • plan to stop farming • show a change in knowledge • show a change in attitudes • show a change in skills • plan a change in behavior/approach • plan to continue participating in training • Other – explain • One year after the initial training program, record the number and percent of participants who, as a result of your training: • graduated • started farming • did not start farming • continued farming • stopped farming • changed farming/land management practices • changed marketing practices • changed business practices • developed a farm plan • continue to participate in your training programs • Other – explain
New DRAFT Outcome Measurements http://www.s2fdata.org/sites/default/files/outcome_standard_grants_2012draft2.doc
Summary of Data Needed for BFRDP Outcomes Reporting • How many changed? • # of participants, % of participants of the total # of participants • # of target audience versus # of total participants • How much change? • Knowledge, attitude, skill, intention (KASI) change • This may be difficult to measure uniformly across programs – but report the # and % of people who changed, NOT the % of score change (e.g., average score increased from 30% to 90%)
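The reporting rule above (count the people who changed, not the change in their scores) can be sketched in a few lines. This is a hypothetical helper, not part of any BFRDP or CRIS tool; the scores are invented for illustration.

```python
# Hypothetical helper illustrating BFRDP-style reporting: count the
# number and percent of PEOPLE whose knowledge score improved, rather
# than reporting the percent change in the scores themselves.

def count_improved(pre_scores, post_scores):
    """Return (number, percent) of participants whose post-test score
    exceeds their pre-test score."""
    improved = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)
    total = len(pre_scores)
    percent = round(100 * improved / total) if total else 0
    return improved, percent

# Invented scores for five participants (percent correct on a quiz)
pre = [58, 60, 70, 55, 90]
post = [82, 85, 68, 75, 95]
n, pct = count_improved(pre, post)
# Report "4 of 5 participants (80%) increased their knowledge",
# not "average scores rose from 67% to 81%".
print(f"{n} of {len(pre)} participants ({pct}%) increased their knowledge")
```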
Types of Impact • Course-level impact – Introduction to Soils (56 participants): • On a key concepts quiz, participants scored 58% before the course and 82% on average after the course (n=36) • 100% plan to use a soil test and 9 of 15 plan to use a cover crop (n=15) • One year after the course, 83% had tested their soil (n=5) • Ben Davies, who took a class on soils through the program, said, "I think how well the season went this year was because of the soils class. I calculated my amendments, applied the right amounts and the plants grew really well." • Overall project impact: • 406 new farmers participated in 19 courses in 7 counties • 51% currently farming (n=186) • 99% increased knowledge • 63% increased knowledge "a great deal" (n=255) • 34% increased knowledge "a moderate amount" (n=255) • Average increase in real knowledge of 36% (3 grades) (n=115) • 65% plan to adopt 1+ new practices/make a change (n=235) • 40% plan to start farming (n=151) • 49% plan to continue farming (n=151) • [Photo: Students see the benefits of soil aggregates.]
Goals for This Session • Understand why program evaluation is important • Learn current BFRDP evaluation requirements to improve data collection and reporting process • Discover best practices and tools for program evaluation that will help you be more successful • Explore rules for data collection from individuals • Investigate how to develop instruments
So how do we collect the data? • BFRDP Outcome measure: % change in knowledge • Knowledge gain – perceived: • 1. Overall, after completing the course, do you think your knowledge of sheep management has increased: a. A great deal b. A moderate amount c. A little d. Not at all
So how do we collect the data? BFRDP Outcome measure: % Change in knowledge
Knowledge Gain – Real: Pre–Post Test • BFRDP Outcome measure: % change in knowledge • 2010 Sheep Management Short Course (please circle your answers): • 1. Which breed of sheep has the finest (highest quality) wool? Suffolk / Romney / Merino / Cheviot • 2. Which breed will commonly reproduce in the spring as well as in the fall? Dorset / Hampshire / Suffolk / Shropshire • 3. The average length of the estrous, or reproductive, cycle in the ewe is: 10-12 days / 20-21 days / 16-17 days / 28 days • 4. The average length of gestation (pregnancy) in the ewe is: 135-140 days / 154-159 days / 144-150 days / 160-165 days
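A pre/post quiz like the sheep management example above can be scored automatically to get each participant's percent correct. This is a minimal sketch: the responses are invented, and the answer key reflects commonly cited figures rather than the actual course key.

```python
# Hypothetical scoring sketch for a pre/post knowledge quiz. The answer
# key below is illustrative (commonly cited answers), not the real
# course key, and the pre/post responses are invented.

ANSWER_KEY = {1: "Merino", 2: "Dorset", 3: "16-17 days", 4: "144-150 days"}

def score(responses):
    """Percent of quiz questions answered correctly (0-100)."""
    correct = sum(1 for q, ans in responses.items() if ANSWER_KEY.get(q) == ans)
    return round(100 * correct / len(ANSWER_KEY))

pre_test = {1: "Suffolk", 2: "Dorset", 3: "16-17 days", 4: "135-140 days"}
post_test = {1: "Merino", 2: "Dorset", 3: "16-17 days", 4: "144-150 days"}

print(f"pre: {score(pre_test)}%  post: {score(post_test)}%")
```

Scoring the same instrument before and after the course gives the "real" (tested) knowledge gain, as opposed to the perceived gain from the self-rating question.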
Change in Behavior • BFRDP Outcome measure: % planned change in behavior • 4. Listed below are some topics from this class. Please circle the best answer for each item: Did before class (BEFORE) / Plan to do within 6 months (PLAN) / No plans (NO PLAN) / Not applicable (NA) • Take a soil test: BEFORE / PLAN / NO PLAN / NA • Calibrate your sprayer: BEFORE / PLAN / NO PLAN / NA • 5. What else did you learn that you plan to use this year?
Change in Attitude • BFRDP Outcome measure: % change in attitude • 2. Please think back to your knowledge before this class and what it is now, at the end of the class. For each topic listed below, place a B at the point where your knowledge was before the class and an N where your knowledge is now, after the class. • The importance of figuring true costs into estimates: NOT IMPORTANT |----------| VERY IMPORTANT
Compiling Data across multiple classes • Using the same question format makes it easier to compile. • Using question formats that are flexible allows you to still retain detail at the class level and maintain the ability to compile the data.
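When every class uses the same question format, compiling is mechanical: per-class tallies can be summed into program totals while the class-level detail is retained. A minimal sketch, with invented class names and counts:

```python
# Hypothetical sketch: identical question formats let class-level counts
# be summed into program-level totals without losing class detail.
# Class names and response counts are invented for illustration.
from collections import Counter

class_results = {
    "Intro to Soils": Counter(
        {"a great deal": 30, "a moderate amount": 20, "a little": 5, "not at all": 1}
    ),
    "Sheep Management": Counter(
        {"a great deal": 12, "a moderate amount": 8, "a little": 2}
    ),
}

# Counter addition merges the per-class tallies into program totals,
# while class_results still holds the class-level breakdown.
program_totals = sum(class_results.values(), Counter())
n = sum(program_totals.values())
for level, count in program_totals.most_common():
    print(f"{level}: {count} ({round(100 * count / n)}%)")
```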
BFRDP Outcome measure: % changed farming/land management practice – one year after
Our Current Template (when we can't do pre/post testing)
Best Practices/Evaluation Tools • Share under "Evaluation Tools and Best Practices" in the forum at http://www.s2fdata.org/ • Captured so far: • Land Stewardship Project's Animal and Vegetable Farming Skills Evaluations • Core competencies and skills checklist developed at the Northeast Beginning Farmers Coalition meeting • Wanted: YOUR EVALUATION TEMPLATES!!!
Goals for This Session • Understand why program evaluation is important • Learn current BFRDP evaluation requirements to improve data collection and reporting process • Discover best practices and tools for program evaluation that will help you be more successful • Explore rules for data collection from individuals • Investigate how to develop instruments
Data Collection from Individuals • Research projects that involve human subjects should: 1) not place subjects at undue risk; and 2) obtain the uncoerced, informed consent of subjects for their participation • Guidance for USDA is at 7 CFR Part 1c (Title 7—Agriculture, Part 1c—Protection of Human Subjects): http://www.access.gpo.gov/nara/cfr/waisidx_05/7cfr1c_05.html
The Belmont Report Based on The Belmont Report: Ethical Principles and Guidelines for the protection of human subjects of research - April 18, 1979 http://videocast.nih.gov/pdf/ohrp_belmont_report.pdf The three fundamental ethical principles for using any human subjects for research are (as found on http://ohsr.od.nih.gov/guidelines/belmont.html): • Respect for persons: protecting the autonomy of all people and treating them with courtesy and respect and allowing for informed consent. Researchers must be truthful and conduct no deception; • Beneficence: The philosophy of "Do no harm" while maximizing benefits for the research project and minimizing risks to the research subjects; and • Justice: ensuring reasonable, non-exploitative, and well-considered procedures are administered fairly — the fair distribution of costs and benefits to potential research participants — and equally.
Data Collection from Individuals 7 CFR Part 1c applies to all research involving human subjects conducted, supported or otherwise subject to regulation by any federal department or agency. Research activities exempt from this policy: (1) Research conducted in established or commonly accepted educational settings, involving normal educational practices, such as (i) Research on regular and special education instructional strategies, or (ii) research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods. (2) Research involving the use of educational tests (cognitive, diagnostic, aptitude, achievement), survey procedures, interview procedures or observation of public behavior, unless: (i) Information obtained is recorded in such a manner that human subjects can be identified, directly or through identifiers linked to the subjects; and (ii) Any disclosure of the human subjects' responses outside the research could reasonably place the subjects at risk of criminal or civil liability or be damaging to the subjects' financial standing, employability, or reputation.
Institutional Review Boards • Most Universities and some other organizations have Institutional Review Boards (IRBs) that provide oversight for research that involves human subjects • NIFA does NOT require that BFRDP grantees have their research reviewed by such a Board • These reviews are generally in place for medical research, but some behavioral science research is also covered by University/Federal guidelines • Some BFRDP grantees may be subject to this process through their organization
Goals for This Session • Understand why program evaluation is important • Learn current BFRDP evaluation requirements to improve data collection and reporting process • Discover best practices and tools for program evaluation that will help you be more successful • Explore rules for data collection from individuals • Investigate how to develop instruments
Developing Survey Instruments • Survey instruments should be both: • Reliable – "Does our question yield the same result on repeated trials?" Reliability might be checked by measuring over time or with a group of experts • Valid – "Do the questions measure the outcomes stated?" Validity is usually established through review by experts, comparison to recognized valid measurements, or the appearance of correctly measuring the concept (face validity)
Example BFRDP Survey Instrument Design “Content validity was assured through a review of the instrument by this panel of experts consisting of approximately 47 project coalition members from agriculture, higher education, non‐profit organizations, and state agencies (Pedhazur & Schmelkin, 1991). Approximately fourteen individuals were asked to complete a pilot test of the instrument to ensure the online survey instrument functioned correctly. The Virginia Tech Institutional Review Board reviewed the instrument and gave approval for administration.” From Virginia Beginning Farm & Rancher Coalition Project Survey Report, October 2011, Prepared by Matt Benson, PhD Student, Agriculture & Extension Education http://www.vabeginningfarmer.aee.vt.edu/survey-finalreport.pdf and http://s2fdata.org Adapted from the Northeast Beginning Farmers Project survey instrument titled “Beginning Farmer Barrier ID Ranking: Ranking the needs of the beginning farmer...”
CLAIMING CREDIT • You want to show that changes are a result of your training. • To show your influence: • Use the same observation item, survey question, activity (e.g., diagnosing plant diseases), or other data points at baseline, before the program, and after; or • Use a retrospective survey question that measures before and after at the same time, after the program.
Thank You • Contact for more information: Stephanie Ritchie • 301-504-6153 • stephanie.ritchie@ars.usda.gov