School Counselors and Program Evaluation
Washington School Counselors Association
John Carey, National Center for School Counseling Outcome Research, UMass Amherst
www.cscor.org
Data-Driven School Counseling Programs • Implement comprehensive programs based on national design and local need • Use data to determine directions (data-driven decision making, needs assessment) • Measure results (program evaluation) • Share successes
What Data Do We Use? ASCA National Model asserts there are three broad categories of data sources: • Student Achievement Data • Achievement-Related Data • Standards and Competency Data
Student Achievement Data 1. Norm-Referenced Standardized Tests • Scores referenced to national average • PSAT, SAT, ACT, Iowa, Metropolitan • Predictive Validity 2. Criterion-Referenced Standardized Tests • Scores referenced to performance standards • State achievement tests (AIMS) • Content related to state curriculum frameworks • Content Validity
Student Achievement Data 3. Performance tests or changes in achievement levels (advancement in Math or English, for example) 4. Portfolios 5. Course grades and GPA 6. Completion of college prep requirements 7. Drop-out rate
Achievement-Related Data • Attendance rates • Behavioral problems • Student attitudes • Discipline referrals • Suspension rates • Drug, Tobacco, and Alcohol use patterns • Parent involvement • Extracurricular activities
Standards and Competency Related Data 1. College Placements 2. Financial Aid Offers 3. Vocational Placements 4. Percentage of students who: • have 4- or 6-year plans • participate in job shadowing • have completed career interest inventories 5. ASCA National Standards
Data Sources vs. Data Types The ASCA National Model identifies three data types for use in program evaluation: • Process Data – What was done for whom? • Perception Data – Attitudes, opinions, beliefs - generally self-report data • Results Data – Objective and measurable student outcomes such as academic achievement, attendance, and disciplinary interventions
Program Evaluation: Process Data • Process Data: What was done for whom? • Who received services? • Ninth graders? Students at risk of failing math? • What did they receive? • Curriculum intervention? Small-group intervention? • When did they receive it? • All year? Twice? For 30 minutes? • Where and How was it provided? • In the classroom? After school?
Program Evaluation: Process Data • Process data alone does not tell us whether or not the student is different (in behavior, attitude or knowledge) as a result of this activity. • Coupled with results data, process data can help identify what factors may have led to success in an intervention.
Program Evaluation: Perception Data • Perception data measures how students are different as a result of an intervention • Did students gain competencies? • Every 10th-grade student completed a career interest inventory. • 85% of 10th graders identified the 4 steps in the career decision-making process. • Did they gain knowledge? • 87% of 9th graders demonstrated knowledge of graduation requirements. • Were there changes in their attitudes or beliefs? • 86% believe that pursuing a career that is non-traditional for their gender is acceptable.
Program Evaluation: Perception Data Differences in student knowledge, competency and attitudes are measured through: • Pre-post tests • What do students know/believe before and after the intervention? • Completion of an activity • Completion of a 4-year plan • Surveys • What do students say they believe or know?
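Where pre-post data are numeric (for example, a short knowledge quiz given before and after a classroom lesson), a paired comparison can show whether scores changed. A minimal sketch, assuming Python with scipy installed; the scores and variable names are hypothetical:

```python
# Minimal pre/post comparison sketch (hypothetical data).
# Assumes scipy is installed: pip install scipy
from statistics import mean
from scipy import stats

# Quiz scores for the same ten students, before and after the lesson.
pre  = [4, 5, 3, 6, 4, 5, 2, 6, 5, 4]
post = [6, 7, 4, 8, 5, 7, 4, 7, 6, 6]

gain = mean(post) - mean(pre)
result = stats.ttest_rel(post, pre)  # paired t-test: same students, two time points

print(f"Average gain: {gain:.1f} points "
      f"(t = {result.statistic:.2f}, p = {result.pvalue:.3f})")
```

A low p-value suggests the change is unlikely to be chance alone, though without a comparison group it cannot rule out causes other than the intervention.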
Program Evaluation: Results Data • Results data is the proof that the intervention has or has not influenced behavior. • An intervention may occur (process data), students may know the information (perception data), but the final question is whether or not the students are able to utilize the knowledge, attitudes and skills to affect behavior (results data).
Program Evaluation: Results Data • Results data can be complex because many factors impact behavior change. • An increase in enrollment at a vocational high school may be due to an intervention implemented for MS students. Conversely, finding no changes in results data does not mean that an intervention has necessarily been unsuccessful.
Clarifying Terms • Research • Action Research • Program Evaluation
Program Evaluation Process • Program Evaluation allows practitioners to evaluate programs and interventions in their specific contexts. • Practices can change immediately and in an ongoing manner as data are collected and analyzed.
Program Evaluation Process 1. Identify the construct(s) 2. Review what is known 3. Develop specific hypotheses or questions you would like to answer and plan the program evaluation accordingly 4. Gather the data 5. Analyze the data 6. Interpret results and disseminate and use findings 7. Evaluate the process
Conducting Program Evaluation: Identify the Construct(s) 1. Review the mission and goals of your program to identify the constructs you would like to look at. Ask yourself: • How are people different as a result of the school counseling program? • What is a question I want to answer? • What do I wish was different?
Conducting Program Evaluation: Identify the Construct(s) • Constructs often have "sub-constructs." • Define the construct/question in clear, specific language. The more specific you are, the easier it will be to measure later. • You can use the ASCA National Model to help you define your construct.
Conducting Program Evaluation: Identify the Construct(s) The ASCA National Model domains • Personal/Social Development Domain • Academic Domain • Career Domain
Conducting Program Evaluation: What is Already Known? 2. Review what is known about your questions. • Has anyone in your organization asked this question before? • Who might have information? • What is the relevant research in professional journals? • What does an Internet search find on this topic?
Conducting Program Evaluation: Develop Hypotheses 3. Develop hypotheses or questions you would like to answer and plan the research process accordingly. • Ask yourself what you think the answer(s) to your question(s) will be. • Identifying the hypothesis helps you identify your biases, which may impact your process. • What is the hypothesis of no effect (the "null hypothesis")? What would the data look like if you were wrong?
Considerations • What are your biases/mental models? How are they impacting the questions you're asking and the places you're looking? • Evaluating findings from research literature and Internet searches: • What is the source? How reliable is it? • What are the strengths/weaknesses of the research design, sampling, effect size, measures used, treatment fidelity, researcher bias, and instrument reliability and validity?
Considerations • Data: • How accurate is the data you’ve chosen to use? • What’s missing? • Instruments: • Reliability and validity • Just because it exists doesn’t mean it’s well done
Considerations • Sampling and Research Design: • Size of sample • Comparability of sample and control group • Matching vs. random assignment • Assuring fidelity of treatment • Doing the same thing across different groups?
Considerations • Ethical Considerations: • Consent • Human Subjects Review • Denying access to interventions - remediation? • Data Analysis • Do you have the capacity to analyze the data?
Conducting Program Evaluation: Develop Hypotheses EXAMPLE: Project Explorers – an after-school program at a vocational high school for MS students Hypotheses/Questions: Does participation in Project Explorers lead to: • Increased enrollment? • Increased awareness of the offerings of the VHS? • Career exploration? • Improved career decision-making abilities?
Activity #1 With your SC program in mind, think about these questions to help you identify constructs: • How are people different after participating in my program? • What is a question I want to answer? • What do I wish was different? • Use these questions to develop a specific, measurable question you would like to answer, and place the answer on the planning tool
Conducting Program Evaluation: Gather Data 4. Gather the data. • What information do you need in order to answer your question? • Use multiple sources of data, or multiple outcome measures, wherever possible (triangulation). • Decide whether you need to consider student achievement data, psychosocial data, career data, school data, process data, perception data, and/or results data.
Conducting Program Evaluation: Triangulation [Diagram: THE QUESTION sits at the center, triangulated by three data sources; built up step by step for the Project Explorers example: Results Data = increased enrollment; Process Data = participation (# of sessions, # of students); Perception Data = survey data.]
Project Explorers Example • Results Data • Did the total number of students enrolling in vocational programs increase when compared to the last three years? • Perception Data • Survey using a combination of validated items from a pre-existing survey (the MCGES) and "hand-written" items
Project Explorers Example (Perception Data) Circle the correct response for questions 1-3. 1) Which school(s) prepares its graduates for skilled employment, such as a mechanic? VHS / My local high school / Both schools 2) Which school(s) prepares its graduates for 2-year colleges, such as Middlesex Community College? VHS / My local high school / Both schools 3) Which school(s) prepares its graduates for 4-year colleges, such as UMass? VHS / My local high school / Both schools 4) List as many of the shops at VHS as you are aware of:
Project Explorers Example (Perception Data) 5) Check all that apply. I have used the following resources to learn about careers: ___ Internet ___ Expert or person working in the field ___ Counselor/Teacher ___ Family member ___ Other (please specify) _____________ ___ I have not researched any career 6) Circle the answer that indicates the level of your confidence for each item. (Copyright laws prohibit duplication of these items – Missouri Comprehensive Guidance Evaluation Survey (MCGES), Lapan.) 7) Please check the 3 areas of employment that you are most interested in at this time. ____ Agriculture ____ Education ____ Arts & Communication ____ Business and Computer Services ____ Construction ____ Health and Hospitality Services ____ Manufacturing ____ Transportation
Project Explorers Example (Process Data) • Process data documents "what was done for whom" • Number of students participating (overall) • Attendance at each shop
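To illustrate the results-data question in the Project Explorers example (did enrollment increase compared to the last three years?), a minimal sketch in Python; all enrollment counts and year labels here are hypothetical:

```python
# Hypothetical enrollment counts at the vocational high school.
baseline_years = {"year 1": 118, "year 2": 112, "year 3": 121}  # three prior years
current = 139  # enrollment in the year after Project Explorers

baseline_avg = sum(baseline_years.values()) / len(baseline_years)
change = current - baseline_avg

print(f"Baseline average: {baseline_avg:.0f} students")
print(f"Current year: {current} ({change:+.0f} vs. baseline)")
```

As the earlier slide on results data cautions, a raw increase like this cannot by itself be attributed to the intervention; many factors impact behavior change.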
Activity #2 • Identify the PROCESS, PERCEPTION, and RESULTS data needed to answer your question / hypotheses • Use the planning tool!
Conducting Program Evaluation: Gather Data • Where is the data? • Does it already exist? • School records, intake forms, test results • Will you generate your own data? • Surveys • Interviews • Observations • Multiple data sources help you more accurately get at the complexity of a situation, whereas one measure or data source will give you a snapshot view.
Conducting Program Evaluation: Gather Data • Select and/or develop the instruments you will use to gather the data. Possibilities include: (more later) • Surveys • Tests (of achievement, aptitude, attitude, etc.) • Behavioral checklists or observations • School records • Performance assessments • Interviews
Conducting Program Evaluation: Gather Data • Identify and follow ethical and legal standards: • No participant should be exposed to physical or psychological harm. • Permission to use confidential data must be obtained. • Participation in a study is always voluntary. • Participants may withdraw from the study at any time. • Participants' privacy rights must be respected.
Conducting Program Evaluation: Gather Data • Identify the group/sample to be studied: • Ideally either the entire population is involved in the study (the class, grade, or school) or the group studied is a random sample of the population. • Stratified sampling uses a smaller sample that has the same proportions as the larger population. • Systematic random sampling chooses every xth student from the whole population (every 4th student on the attendance list, for example).
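A minimal sketch of the two sampling approaches named above, using only Python's standard library; the roster, group sizes, and 10% sampling rate are hypothetical:

```python
import random

# Hypothetical roster: (student_id, grade) pairs for 400 students.
roster = [(i, random.choice([9, 10, 11, 12])) for i in range(1, 401)]

# Systematic random sampling: pick a random starting point,
# then take every 4th student on the list.
start = random.randrange(4)
systematic = roster[start::4]

# Stratified sampling: draw 10% from each grade so the sample
# keeps the same grade proportions as the population.
stratified = []
for grade in (9, 10, 11, 12):
    group = [s for s in roster if s[1] == grade]
    stratified.extend(random.sample(group, max(1, len(group) // 10)))

print(len(systematic), "systematic;", len(stratified), "stratified")
```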
Conducting Program Evaluation: Gather Data • Once data sources and measures are identified, ethical standards are considered, and the sample is identified, data can be gathered! • Ways to collect data include: • Questionnaires or surveys • School records • Interviews • Observation
Collecting Data: Demographics • Regardless of what other types of data you collect and analyze, demographic data is a critical first step • Collect the types of demographic data that are important for making sense of your results, and for describing who contributed to the data you collected
Collecting Data: Demographics • Questions to consider: • What information will I need to adequately describe my sample? • Think of the information you will need to provide to let unfamiliar people know who your program served. • What information will I need to analyze the data the way I want to? • Think of the various ways you can describe your results: for example, the impact on different ethnic groups, special education vs. regular education, etc. (A disaggregation sketch follows the variable lists below.)
Demographic Variables • Ethnicity • Gender • Class (Parent Educational Level) • Language Level (Limited English Proficient) • Low Income (Free or Reduced School Lunch) • Acculturation, Migration (Mobility) • Special Needs • School Performance (GPA, Achievement Quartile)
Demographic Variables (school) • Student Participation in School Programs • Student Participation in Extracurricular Activities • Grade Level • Age • Number of Years in Current School • Parent Participation Level
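A minimal sketch of disaggregating an outcome by demographic variables like those listed above, assuming pandas is available; the column names, values, and scores are all hypothetical:

```python
# Hypothetical outcome data joined with demographic data.
# Assumes pandas is installed: pip install pandas
import pandas as pd

df = pd.DataFrame({
    "grade":      [9, 9, 10, 10, 10, 11, 11, 12],
    "low_income": [True, False, True, False, False, True, False, False],
    "post_score": [6, 7, 5, 8, 7, 4, 7, 8],
})

# Describe the sample, then break results out by subgroup.
print(df["grade"].value_counts().sort_index())        # who was served
print(df.groupby("low_income")["post_score"].mean())  # impact by subgroup
```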
Identifying Effective Surveys • Key questions to consider: • How does the survey relate specifically to my program’s goals? • Is the survey appropriate for my age group of students? • How reliable and valid is the survey?
Creating Your Own Surveys 1. Define Constructs 2. Select/Write Items 3. Write Instructions 4. Test Survey 5. Edit Items and Instructions
Developing Your Own Surveys: Writing Items • Sometimes pre-existing surveys and measures are not able to adequately capture the attainment of the goals of your program, or the question you would like to answer. • Open-ended vs. closed questions: • Open-ended questions can provide rich data, but are hard to summarize. ("What do you think about Project Explorers?" "Why is career planning important?") • Closed questions are most common in surveys because the results are easy to summarize.
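To show why closed questions are easy to summarize, a minimal sketch tallying responses to item 1 of the example survey above; the response data here are hypothetical:

```python
from collections import Counter

# Hypothetical responses to: "Which school(s) prepares its graduates
# for skilled employment, such as a mechanic?"
responses = ["VHS", "VHS", "Both schools", "VHS", "My local high school",
             "VHS", "Both schools", "VHS"]

# Count each answer choice and report percentages.
tally = Counter(responses)
for answer, n in tally.most_common():
    print(f"{answer}: {n} ({n / len(responses):.0%})")
```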