Knowing What Audiences Learn: Outcomes and Program Planning
Association of Children's Museums 2003
Institute of Museum and Library Services, Washington, DC, www.imls.gov
Overview
We will
• Distinguish Outcome-Based Planning and Evaluation ("OBE") from other forms of evaluation
• Talk about choosing outcomes
• Talk about the basic elements of a Logic Model (a project or program plan)
• Talk about measuring outcomes
• Review and summarize
What are Outcomes?
• Outcomes are achievements or changes in
• Skill – painting, basketball
• Knowledge – zoology, state capitals
• Behavior – visits museums, reads daily
• Attitude – "I like science," "I love animals"
What are Outcomes?
• Outcomes can be achievements or changes in
• Status – in school, citizen
• Life condition – overweight, healthy
What are Outcomes?
Ella goes to the Zoo's summer program and
• (Immediate) takes more interest in animals
• (Medium) gets better biology grades
• (Long term) becomes a veterinary technician
Outcomes: Where do they come from?
• Social services and the United Way
• The Government Performance and Results Act (GPRA, 1993) and OMB's PART
• Trends in funding
• The need to focus on audience
• The need to communicate museum value
• IMLS
Forms of Evaluation
Formative Evaluation
You want to know which brochure works best to bring school tours to the zoo
Develop two or three prototypes and test them on a sample of your target audience; pick the one that works best
Forms of Evaluation
Process Evaluation
You want to see how efficiently your conservation education program is run
You count the number of participants in program components, and examine how the components are delivered, how long they take, and what they cost
Forms of Evaluation
Summative Evaluation
You want to know if your exhibit program made visitors aware of how they can protect their environment
You run the program, then interview visitors to learn how many might use "green" products more in the future
Forms of Evaluation Impact Evaluation (aggregated outcomes create impact) You want to know if your programs helped increase recycling in your area You find out the rate of recycling, run the program, then see if the recycling rate has increased
Forms of Evaluation Outcome-Based Program Evaluation You want to know if your conservation education program changes participants’ behaviors You identify audience needs, plan services to provide participant-oriented outcomes, and assess program results on a regular basis
What can OBE achieve? • Increase program effectiveness • Provide a logical, focused framework to guide program design • Inform decision-making • Document successes • Communicate program value
What are its limitations? • This is a management tool • OBE may suggest cause and effect – it doesn't try to prove it • OBE settles for contribution, not attribution • OBE uses some of the same methods as research, but …
What are its limitations? OBE is not the same as RESEARCH: • Doesn’t try to compare your program with another, similar program • Doesn’t try to compare your methods with different methods to create a similar result • Accepts “good enough” data
How to develop an outcome-based program: Example • What need do you see? • Many kids don’t understand basic science principles; many girls are intimidated by or uninterested in science • Head Start is an opportunity for early science experience, but many teachers have no science education and don’t know how to make it fun • Parents in at-risk families often have little science knowledge, limited child-development skills, and limited reading skills
How to develop an outcome-based program: Example • Who has the need (target audience)? • Kids need science experience • Head Start teachers could provide experience if they had knowledge and skills • Families might encourage science learning if we made it easy and fun • Who could you work with most easily and effectively? Head Start teachers
How to develop an outcome-based program: Example • What could your museum do? • Provide science education resources for Head Start teachers – learning kits and training to use them • Provide take-home science kits • Partner with public libraries to provide age- and education-appropriate books on science subjects • MESS – Marvelous Explorations through Science and Stories!
Planning for Outcomes
The Logic Model
• Articulates the process and outcomes of your program
• Clarifies each element of your program
• Identifies indicators of change to be measured
• Identifies targets for program impact
Outcomes Logic Model
What goes into outcomes planning: Mission + Influencers → Program Purpose
The model's columns run Inputs, Activities, Services, Outputs, Outcomes, then the outcome-measurement elements:
• Indicators – observable and measurable behaviors or conditions
• Data Sources – sources of information about the conditions being measured
• Applied to (Who) – the population to be measured
• Data Interval (When) – when data is collected
• Targets (Goals) – the amount of impact desired
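As a minimal sketch (not part of the original presentation; the field and variable names are hypothetical), the logic model's elements can be captured as a simple data structure, filled in here with the MESS example from the slides that follow:

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    """One outcome row of a logic model (hypothetical field names)."""
    statement: str      # intended change in skill, knowledge, behavior, or attitude
    indicator: str      # observable, measurable evidence the outcome occurred
    data_source: str    # where the evidence comes from (surveys, records, observation)
    applied_to: str     # the population to be measured
    data_interval: str  # when data is collected
    target: str         # the amount of impact desired

@dataclass
class LogicModel:
    program_purpose: str
    activities: list[str] = field(default_factory=list)  # managerial/administrative work
    services: list[str] = field(default_factory=list)    # what participants receive
    outcomes: list[Outcome] = field(default_factory=list)

# The MESS teacher-confidence outcome, transcribed from the slides below
mess = LogicModel(
    program_purpose="Partner to provide science kits, training, and books "
                    "for Head Start teachers",
    activities=["Create kits", "Design training", "Workshop logistics"],
    services=["Teacher training", "MESS kits and books", "Take-home kits"],
    outcomes=[Outcome(
        statement="Teachers will be more confident helping students "
                  "learn about science",
        indicator="# and % of teachers reporting at least 'some' confidence "
                  "on a 5-point scale",
        data_source="Teacher surveys",
        applied_to="All participating teachers (N = 445)",
        data_interval="End of program year 1",
        target="50%",
    )],
)
```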
Influencers
Influencers include staff, participants, agencies, funding sources, competition, community groups, and professional associations. Each wants to know something different and will use the information differently:
• Your organization – wants to know: Is the program meeting target audience needs? Uses the information to improve the program, end it, or start another program
• Funders – want to know: Who does the program serve? Is the program effective? Use the information to fund the program, increase funding, or promote replication
• Project partners – want to know: Is responsibility equal? Which services produce outcomes? Use the information to change process, add partners, or change responsibilities
Logic Model: Mission
• How do the organization's mission and the program connect?
• Is the link between mission and program purpose reasonable?
• What action words connect the mission to the program?
MESS: Program Purpose
We do what, for whom, for what outcome or benefit?
The Science Museum, Public Libraries, and School District will partner to provide science kits, training, reference materials, and take-home kits for Head Start teachers to increase teachers' ability to make science learning fun, increase kids' science interest and knowledge, and engage parents in science play
Logic Model: Activities/Services
• Program activities make it possible to deliver services to program participants (most "activities" are managerial or administrative)
• Program services engage participants and produce outcomes
• Program services and activities are driven by audience characteristics
MESS: A Museum/Library/School Collaboration
Activities
• Create kits
• Design training
• Workshop logistics
Services
• Teacher training
• MESS kits and books
• Take-home kits
MESS: Target Audience
• Head Start teachers in Alachua Co.
• 3- to 5-year-olds from at-risk families in Alachua Co.
• Families in Alachua Co.
Knowing if you are reaching the intended audience is critical – you must decide
• What information is critical to know?
• How will I get the information?
• What are the confidentiality issues?
Logic Model: Outcomes
Outcomes
• State how you expect people to benefit from your program
• State the intended results of your services
• Describe changes in skills, attitudes, behaviors, knowledge
MESS: Sample Outcomes
• Outcome 1: Teachers will be more confident helping students learn about science
• Outcome 2: Teachers will include science experiences in their classrooms
Logic Model: Indicators
Indicators
• Are measurable conditions or behaviors that can show an outcome was achieved
• Say what you hope to see or know
• Are observable evidence of accomplishments, changes, or gains
• "Stand for" the outcome
MESS Indicators
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The # and % of teachers who say they have at least "some" confidence on a 5-point scale (none, a little, some, a lot, complete confidence)
Logic Model: Data Sources
Data sources are tools and locations for information that will show what happened:
• Pre/post test scores
• Program records
• Assessment reports
• Records from other organizations
• Observations
• Other forms of information
MESS Data Sources
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The # and % of teachers who say they have at least "some" confidence on a 5-point scale
Data Source: Teacher surveys
Logic Model: Applied to (Who)
• Decide if you will measure all participants, completers of the program, or another subgroup
• Special characteristics of the target audience can further clarify the group to be measured
MESS Applied to
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The # and % of teachers who say they have at least "some" confidence on a 5-point scale
Data Source: Teacher surveys
Applied to: All teachers, N = 445
Logic Model: Data Intervals (When)
• Outcome information can be collected at specific intervals (end of program, every 6 months)
• Data can be collected at the end of an activity or phase and as follow-up (after 3 workshops, after 2 years)
• Data is usually collected at program start and end for comparison when increases in skill, behavior, or knowledge are expected
MESS Data Interval
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The # and % of teachers who say they have at least "some" confidence on a 5-point scale
Data Source: Teacher survey
Applied to: All participating teachers, N = 445
Data Interval: At end of program year 1
Logic Model: Targets (Goals)
• Targets (goals) are chosen expectations for outcomes a program hopes to achieve – usually a percentage and/or number
• Influencer expectations affect targets, which can also be based on a program's past performance
MESS Target
Outcome 1: Teachers will be more confident helping students learn about science
Indicator: The # and % of teachers who say they have at least "some" confidence on a 5-point scale
Data Source: Teacher surveys
Applied to: All participating teachers, N = 445
Data Interval: At end of program year 1
Target: 50%
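To make the measurement arithmetic concrete, here is a minimal sketch (not from the original presentation; the survey responses are invented for illustration) of tallying the 5-point confidence scale and checking the result against the 50% target:

```python
# Order of the 5-point confidence scale from the MESS indicator
SCALE = ["none", "a little", "some", "a lot", "complete confidence"]

def evaluate_indicator(responses, threshold="some", target_pct=50.0):
    """Return the count and % of respondents at or above `threshold`,
    and whether the target percentage was met."""
    cutoff = SCALE.index(threshold)
    hits = sum(1 for r in responses if SCALE.index(r) >= cutoff)
    pct = 100.0 * hits / len(responses)
    return hits, pct, pct >= target_pct

# Invented end-of-year responses for 445 participating teachers
survey = (["some"] * 180 + ["a lot"] * 60 + ["complete confidence"] * 10
          + ["a little"] * 140 + ["none"] * 55)
count, pct, met = evaluate_indicator(survey)
print(f"{count} of {len(survey)} teachers ({pct:.1f}%) report at least "
      f"'some' confidence; 50% target met: {met}")
# -> 250 of 445 teachers (56.2%) report at least 'some' confidence; 50% target met: True
```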
What should reports say?
• We wanted to do what?
• We did what?
• So what?
Above all
• Reports should meet influencer needs for information and program results
• Reports should guide program staff to improve outcomes for program participants
What will it cost?
• Assume 7-10% of program costs for program evaluation (non-research)
What will you get?
• Low cost – know numbers, audience characteristics, and customer satisfaction
• Low to moderate cost – know changes in audience skills, knowledge, behaviors, and attitudes
What will you get?
• Moderate to high cost – comparison groups can show attribution of short-term changes to the program
• High cost – long-term follow-up; can attribute long-term changes in the audience to program services (research)
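A quick worked example of the 7-10% rule of thumb (the program budget here is invented for illustration):

```python
# Apply the deck's 7-10% evaluation rule of thumb to a hypothetical budget
program_cost = 200_000  # invented example figure, in dollars
low, high = 0.07 * program_cost, 0.10 * program_cost
print(f"Plan roughly ${low:,.0f}-${high:,.0f} for non-research evaluation")
# -> Plan roughly $14,000-$20,000 for non-research evaluation
```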
For more information Karen Motylewski Institute of Museum and Library Services 1100 Pennsylvania Avenue, NW Washington, DC 20506 202-606-5551 http://www.imls.gov kmotylewski@imls.gov