Building Strong Programs: Lessons from Working in the Schools (WITS) January 31st, 2014
Discussion Topics • Introductions • Overview of WITS • Training, Expectations, & Strategies • Evaluation and Assessments • Small Group Discussions
Bridging the Gap • WITS believes that when mentors focus their energy and time on the success of our city’s students, their volunteer service directly impacts the lives of young people and helps support a better public school system.
WITS Portfolio of Programs • Mid-Day Mentoring • Workplace Mentoring • Early Childhood • Classroom Reading Tutors • Saturday Tutoring • Early Childhood Summer Program • WITSummer in the Parks
Overcoming Obstacles • Model Problem Solving • Name the problem • Talk through possible solutions • Model appropriate problem solving • Encourage the Student • Praise their process • They aren't alone, and neither are you! The Goal: Students see themselves as capable problem solvers and valuable individuals.
Building Student Fluency • Model Fluency • Read expressively • Use character voices • Modulate speed and volume • Encourage the Student • Help the student choose books • Re-read to build confidence The Goal: Students see reading as an opportunity to explore their interests, not a task to complete.
The Five C's of Positive Youth Development (PYD) PYD focuses on the strengths of youth and the positive qualities and outcomes we wish youth to develop: • Competence • Confidence • Connection • Character • Caring
Using Evaluation to Strengthen Programs • Types: • Formative • Summative • Methods: • Experimental • Quasi-experimental • Observational • Considerations: • What is the purpose of the evaluation? • Who will use the evaluation results? • How will they use the evaluation results? • What do stakeholders need from the evaluation?
The Power of Observation (Formative Evaluation) Questions: • Are we meeting our program goals? • Is there consistency across programs? • What immediate improvements can we make? • What changes can/should we make in the future? • Use standardized observation forms
Workplace Mentoring Logic Model Situation: • need for consistent and caring mentors • need for literacy enrichment programming in the afterschool space Priorities/Program Goals: • scaffold student self-efficacy • scaffold student attitude Assumptions: • all students are capable of becoming better readers • students are motivated to participate in WITS programs • volunteers are competent tutors/mentors • students and volunteers form a positive relationship over time Inputs: • human resources: staff; corporate and university volunteer time investment • partners: teacher/principal time investment; experts • WITS Board; Associates Board • financial resources • build knowledge base • other: transportation; bus chaperones; snacks; maintain program materials Outputs: • effective 1:1 literacy-focused mentoring programs • 300 students from 13 schools participate • 19 corporate and university groups provide 600 volunteers • curriculum implemented • volunteers coached to provide students with effective support Outcomes (Student): • improved attitude toward reading • increased self-efficacy • improved reading fluency Outcomes (Volunteer): • sustained school/community partnership • high volunteer satisfaction • high volunteer retention EVALUATION spans the model; identify external factors.
WITS Evaluation Goals • mixed-method approach • multiple data sources • triangulate data • balance formative/summative • base decisions on multiple findings
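The triangulation goal above can be sketched in code. The following is a minimal, hypothetical illustration; the function name, the scores, and the tolerance threshold are all invented for the example and are not real WITS data or tooling.

```python
# Hypothetical sketch: triangulating two independent measures of
# program quality (observer ratings and student survey scores).
# All numbers are illustrative, not real WITS data.

def triangulate(observation_scores, survey_scores, tolerance=0.5):
    """Compare the means of two independent data sources.

    Returns both means and whether they agree within `tolerance`.
    Agreement suggests a finding is robust; disagreement flags the
    program for a closer formative look before acting on either number.
    """
    obs_mean = sum(observation_scores) / len(observation_scores)
    srv_mean = sum(survey_scores) / len(survey_scores)
    return obs_mean, srv_mean, abs(obs_mean - srv_mean) <= tolerance

# Both sources average 4.0 on a 1-5 scale, so the finding agrees.
obs, srv, agree = triangulate([4.0, 3.5, 4.5], [3.8, 4.2, 4.0])
```

The design choice here mirrors the slide: no single data point drives a decision; a conclusion is acted on only when multiple sources point the same way.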
Best practices • align data collection with program goals • think before you collect • weigh pros and cons of types of questions • balance formative and summative • pilot a new assessment • ask the hard/unpopular questions • consult peers • know your research domain
Example: leverage external data Benefits of workplace volunteerism, 2011 Deloitte Volunteer IMPACT Survey
Example: leverage internal data Student-reported satisfaction, Youth Emotional Engagement Benchmark Survey; adapted from Herrera, C. (2004). School-Based Mentoring: A Closer Look. Public/Private Ventures, p. 42.
Example: leverage internal data Length of participation in the Workplace Mentoring program
Common Evaluation Errors Avoid: • failing to leverage the data you already have • collecting more data than you really need or can handle • using the same data points when a project has changed • poor survey/question design • wrong question type • vague wording/jargon • incorrect assumptions • leading questions • double-barreled questions • missing response options • incorrect use of scales
Small Group Discussion Topics 1. Discuss some challenges you have had when working with youth. What changes did you make to your program structure, training, or curriculum due to those challenges? 2. How do you gain feedback from your volunteers/staff? Provide an example of feedback your organization has received and the programmatic changes you have made in response. 3. Discuss manageable formative evaluation activities that you and your organization can add to improve program quality. 4. Discuss realistic summative evaluation activities that you and your organization can add to improve program quality. 5. How should an organization respond to negative evaluation results? How should an organization respond to positive results?
Key ideas • Set clear and appropriate expectations! • Constantly observe and troubleshoot • Think critically (but constructively) • Small organizations can easily take big steps toward evaluating program quality