CHAPTER 5: Accountability: Assessing Needs, Determining Outcomes, and Evaluating Programs
Transforming the School Counseling Profession, Fourth Edition
Bradley T. Erford
Accountability
Accountability involves responsibility for professional actions. In the context of a school counseling program, this may involve:
- Identifying and collaborating with stakeholder groups
- Collecting data and assessing the needs of students, staff, and community
- Setting goals and establishing objectives based on data
- Implementing effective interventions
- Measuring the outcomes of interventions
- Using results for program improvement
- Sharing results with major stakeholders
Numerous contemporary factors have called for greater accountability from professional school counselors and an increased focus on assessment.
5 Questions to Address Accountability in the School Counseling Program
1. Is a comprehensive, standards-based program in place? (program evaluation or audit)
2. What are the needs of the school’s student population when compared to these standards? (needs assessment)
3. What services were implemented to address the identified needs and standards? (service assessment)
4. What was the result of the implemented services? (results or outcomes studies)
5. How well is the professional school counselor performing? (performance evaluation/appraisal)
Advantages & Disadvantages of Accountability Studies
ADVANTAGES
- Data is better than perception when guiding decision making.
- Accountability helps demonstrate the necessity, efficiency, and effectiveness of the program.
- Studies can help identify professional development needs.
- Professional school counselors can network to share program results.
- Conducting accountability studies is a professional responsibility and demonstrates commitment to improvement.
DISADVANTAGES
- Outcome measures take some training and skill to develop.
- Quality outcome research and evaluation takes time and resources.
- Many do not understand the nature and purpose of accountability because of misperceptions or previous “bad” experiences.
- Data is sometimes overinterpreted or given undue meaning.
- Comprehensive evaluations are seldom conducted, leaving the “big picture” incomplete.
Accountability: Evaluating Programs and Assessing Outcomes
It is a professional and ethical responsibility for professional school counselors to evaluate their programs to ensure that the services provided to students, school personnel, and parents are of high quality and effective in meeting intended needs.
Without accountability data to back up service provision, school counseling services could be among the first “nonessential services” to go during budget cutbacks.
Needs Assessment
Purposes of a needs assessment:
- Helps professional school counselors understand the needs of various subpopulations of a school community.
- Helps establish the priorities that guide the construction of a comprehensive developmental school counseling program.
- Assesses not only what currently is, but what should be.
Data-Driven Needs Assessment
Data-driven decision making deals with real needs and impact, and begins with an analysis of school-based performance data.
Aggregated data: results are lumped together to show total grade-level or schoolwide (average) results.
- Helpful in understanding how the average student performs in a given class, grade, or school.
- Tells very little about the diversity of learner performance or needs.
- Tells nothing about how various subgroups or subpopulations performed.
Data-Driven Needs Assessment (cont.)
Disaggregated data: results are broken down by subpopulation so that performance differences between and among groups can be analyzed.
- Analysis usually involves intergroup differences (e.g., male vs. female, race, ethnicity).
- Helps provide hard evidence of gaps in student performance, rather than relying on perceptions.
- Provides direction for the types of strategies and interventions needed to close these achievement gaps.
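To make the aggregated/disaggregated distinction concrete, the short sketch below computes one grade-wide average and then breaks the same scores down by subgroup. It is a minimal illustration rather than anything from the Erford text; the student records, subgroup labels, and scores are hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical reading-assessment records for one grade level.
# Each record carries a score plus subgroup labels used for disaggregation.
records = [
    {"student": "A", "gender": "F", "lunch_status": "paid",    "score": 82},
    {"student": "B", "gender": "M", "lunch_status": "free",    "score": 61},
    {"student": "C", "gender": "F", "lunch_status": "free",    "score": 68},
    {"student": "D", "gender": "M", "lunch_status": "paid",    "score": 79},
    {"student": "E", "gender": "F", "lunch_status": "reduced", "score": 74},
]

# Aggregated result: one grade-wide average that hides subgroup differences.
print("Grade-wide mean:", round(mean(r["score"] for r in records), 1))

# Disaggregated results: the same scores broken down by a subgroup variable,
# so performance gaps between groups become visible.
def disaggregate(records, key):
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["score"])
    return {label: round(mean(scores), 1) for label, scores in groups.items()}

print("By gender:      ", disaggregate(records, "gender"))
print("By lunch status:", disaggregate(records, "lunch_status"))
```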
Interpreting Norm-referenced Scores: The Basics
Percentile rank: most easily understood by visualizing a lineup of 100 individuals who share certain characteristics, with the 1st student in line the lowest performing and the 100th student the highest performing. A student’s place in line indicates relative standing compared with other students of the same age across the country (national percentile rank).
Quartile: an interpretive statistic that divides the percentile rank distribution into four segments: Q1 = 0-25, Q2 = 26-50, Q3 = 51-75, Q4 = 76-100.
Interpreting Norm-referenced Scores: The Basics (cont.)
Stanine: short for “standard nine.”
- Divides a normal distribution into nine segments; the 1st represents the lowest level of performance and the 9th represents the highest.
- Unlike quartiles, stanines represent one-half standard deviation units and may contain varying percentages of the population.
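The small sketch below shows one common way these norm-referenced conversions are computed. The norm-group scores are hypothetical, and the formulas follow common textbook conventions (percentile rank as the percent of norm-group scores falling below a raw score; stanines as half-standard-deviation bands around the mean); a particular test publisher’s conversion tables may differ slightly.

```python
from statistics import mean, pstdev

# Hypothetical norm-group raw scores (e.g., a national standardization sample).
norm_scores = [12, 15, 18, 20, 21, 23, 24, 26, 27, 29, 30, 32, 33, 35, 38, 40]

def percentile_rank(score, norm):
    """Percent of norm-group scores falling below the given raw score."""
    below = sum(1 for s in norm if s < score)
    return round(100 * below / len(norm))

def quartile(pr):
    """Map a percentile rank onto the four quartile bands from the slide."""
    if pr <= 25: return "Q1"
    if pr <= 50: return "Q2"
    if pr <= 75: return "Q3"
    return "Q4"

def stanine(score, norm):
    """Stanine via half-standard-deviation bands centered on the norm mean."""
    z = (score - mean(norm)) / pstdev(norm)
    return max(1, min(9, int((z + 0.25) // 0.5) + 5))

for raw in (18, 27, 38):
    pr = percentile_rank(raw, norm_scores)
    print(f"raw={raw}  percentile rank={pr}  {quartile(pr)}  stanine={stanine(raw, norm_scores)}")
```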
Perceptions-based Needs Assessments
Professional school counselors are often interested in what teachers, parents, and students perceive as the primary needs to be addressed in a developmental way.
Frequency of Conducting a Needs Assessment
- It is best to follow a continuous cycle of assessing programmatic needs.
- Breaking needs down into the areas of academic, career, and personal-social development can guide the rotation of needs assessments.
- Assessing needs is part of a much bigger endeavor: implementing curricular changes to continuously improve the comprehensive developmental counseling program.
Populations to be Assessed
- It is most practical and efficient to seek out those stakeholders who are informed and likely to respond, such as teachers, administrators, students, and parents.
- Although it is more difficult to obtain a large number of responses from them, valuable information can also be garnered from community organizations, local businesses, and the general public.
- A higher return rate leads to a lower sampling error, which in turn leads to greater confidence in the accuracy of the results.
- The highest-priority needs should be those agreed on by all or most of the populations assessed.
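As a rough illustration of why a higher return rate increases confidence in the results, the sketch below applies the standard margin-of-error formula for a proportion, with a finite-population correction, at several return rates. The survey size of 600 and the conservative 50/50 split are illustrative assumptions, not figures from the text.

```python
from math import sqrt

def margin_of_error(n_returned, population, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite-population correction."""
    se = sqrt(p * (1 - p) / n_returned)
    fpc = sqrt((population - n_returned) / (population - 1))
    return z * se * fpc

population = 600  # hypothetical number of parent surveys sent out
for rate in (0.10, 0.30, 0.60, 0.90):
    n = int(population * rate)
    print(f"return rate {rate:.0%}: n={n:3d}, margin of error = ±{margin_of_error(n, population):.1%}")
```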
Design Issues in an Efficient Needs Assessment
Stone and Bradley (1994) recommended seven methods for determining needs:
1. Questionnaires and inventories
2. Analysis of records
3. Personal interviews
4. Counseling statistics
5. Classroom visits
6. Use of outside consultants
7. Systematic evaluation of the guidance program
What is most important is that the needs assessment uses objective methods for data gathering and analysis.
Steps to Form an Efficient Needs Assessment
1. Decide what you need to know.
2. Decide on the best approach to derive what you need to know.
3. Develop the needs assessment instrument or method.
4. Enlist the support of colleagues and a few individuals from the target groups to review and try out items for understanding.
5. Implement the final version on the target groups.
6. Tabulate, analyze, and interpret the results.
7. Translate the results into programmatic goals and objectives.
Design Issues in an Efficient Needs Assessment (cont.)
It is important to take the population into consideration when designing a scale:
- For teachers, ask for grade level, number of students in the class, etc.
- For parents, ask for the names of the parent’s children in case responses require contact by the counselor.
- For students, ask for the student’s grade and homeroom teacher’s name.
Questions or response stems should be short, to the point, and easily understood. The reading level of the items should be appropriate for the target audience.
Response Formats
If the purpose is to determine the importance or frequency of a potential problem, it is generally best to use a multipoint scale with 3-5 choices, such as:
- Rarely
- Sometimes
- Frequently
- Most of the time
- Almost always
It is important to provide a clear description of each choice. Be wary of response choices such as:
- “Never” and “always,” which are rarely ever the case.
- “Yes” and “no,” which force an all-or-nothing response.
Design Issues in an Efficient Needs Assessment (cont.)
- It is important to include a frequency count in a needs assessment so that professional school counselors can not only assess the importance of an issue but also determine how many, and which, students are in need of a specific service.
- Computing the results of a needs assessment is probably best accomplished by assigning a number value to each response category and averaging all responses for a given item, as in the sketch below.
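A minimal sketch of that scoring approach follows. The five-point weights, item wording, and responses are hypothetical; the point is simply to show the averaging and frequency-count steps.

```python
from statistics import mean

# Assign a numeric value to each response category (hypothetical 5-point scale).
WEIGHTS = {"Rarely": 1, "Sometimes": 2, "Frequently": 3,
           "Most of the time": 4, "Almost always": 5}

# Hypothetical responses: need statement -> list of category choices from respondents.
responses = {
    "I worry about bullying at school":       ["Frequently", "Sometimes", "Almost always", "Frequently"],
    "I feel stressed about tests":            ["Sometimes", "Rarely", "Sometimes", "Frequently"],
    "I am unsure how to plan for my career":  ["Almost always", "Most of the time", "Frequently", "Almost always"],
}

# Average each item, then rank items so the highest-need areas rise to the top.
item_means = {item: mean(WEIGHTS[c] for c in choices) for item, choices in responses.items()}
for item, avg in sorted(item_means.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{avg:.2f}  {item}")

# A frequency count alongside the average shows how many students report the need often.
high_need = {item: sum(WEIGHTS[c] >= 4 for c in choices) for item, choices in responses.items()}
print("Students answering 'Most of the time' or 'Almost always':", high_need)
```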
Converting Needs to Program Goals and Objectives
If the needs assessment was designed correctly, translating the results into goals and learning objectives is relatively easy.
Step 1: Prioritize the needs in order of importance and their relationship to existing components of the program.
Step 2: Match up or translate the needs into goals included in the national standards.
Step 3: Operationalize the goals through the development of learning objectives.
Program (Process) Assessment
Program assessment is a systematic examination of a program’s content. Program evaluation asks two questions:
1. Is there written program documentation?
2. Is the written program being appropriately implemented?
Program Assessment (cont.)
A program audit should be conducted at the end of the year, and reports should address:
- Program strengths
- Areas in need of improvement
- Long- and short-term goals
ASCA suggested that program criteria be evaluated using the following response choices:
- None (not in place)
- In Progress (begun, but not completed)
- Completed (but perhaps not implemented)
- Implemented (fully implemented)
- Not Applicable (when the criteria do not apply)
Service Assessment
Service assessments are sometimes requested by guidance supervisors and demanded by superintendents and school boards to document how counselors are spending their time. Two types are commonly used:
- Event-topic counts: documenting each time an individual is contacted or provided with a counseling service and the nature of the topic addressed.
- Time logs: documenting and categorizing activities for every minute of the workday.
Unfortunately, service assessments give no information about the quality or effectiveness of counselor interventions.
Results or Outcomes Evaluation
Answers the question, “How are students different as a result of the program?” (Johnson & Johnson, 2003)
Assessment is not a discrete component, but rather an integrated part of a continuous process for program improvement.
The evaluation cycle: Questions → Evidence → Interpretation → Conclusions → Change
Results or Outcomes Evaluation (cont.)
Assessment of school counseling programs IS:
- a way to answer important program-related questions.
- the responsibility of an accountable professional school counselor.
- a cooperative endeavor with other school counseling program advisory committee (SCPAC) members and stakeholders.
- ongoing and evolving.
- a means to a better end: better education for all students.
Assessment of school counseling programs IS NOT:
- the evaluation of an individual professional school counselor.
- an assault on the professional school counselor’s freedom.
- a mandate for standardized tests or curricula.
- all figured out.
Important Assessment Terms
- Evaluation: the measurement of worth; indicates that a judgment will be made regarding the effectiveness of a program.
- Evidence: any data that will help make judgments or decisions; can be quantitative or qualitative.
- Formative evaluation: evaluative feedback that occurs during the implementation of a program; allows corrective action to occur if a program is shown to be off course.
- Summative evaluation: feedback collected at a specific endpoint in an evaluation process.
Important Assessment Terms (cont.)
- Stakeholder: anyone involved in or potentially benefiting from the school counseling program.
- Baseline: any data gathered to establish a starting point.
- Inputs: any resources that go into a program.
- Pretest: a measure administered before a program is implemented.
- Posttest: a measure administered after the program or intervention has been completed.
Sources of Evidence
People: students, teachers, staff, administration, parents, employers, graduates, and community resource agents.
Products: portfolios, performances, ratings from external examiners, observations, local tests, purchased tests, student self-assessments, surveys, interviews, focus groups, and student work.
Practical Program Evaluation Considerations
To be of practical value, assessment must be connected to real program concerns as well as the core values of the school or program. Suggestions:
- Avoid overwhelming the data collectors.
- Focus on only one or several important questions at a time.
- Always select measures that will yield reliable and valid scores for the purposes under study.
- Involve the relevant stakeholders.
- Use a variety of approaches.
- Do not reinvent the wheel; use what you are already doing to generate useful data.
- Don’t be afraid to call upon outside experts to consult on the development and evaluation of a program.
- Start small and build upon what is found to work.
- Remember: different can still be effective.
Aggregated Outcomes
Aggregation can be a valuable tool in evaluating how well school counseling programs meet higher-level standards, such as the ASCA National Standards.
The curriculum begins with a statement of standards, which are further described through a series of outcomes, which in turn are broken down into measurable objectives.
(Slide diagram: Standard 1 is described by Outcomes 1-3, and each outcome is broken down into two measurable objectives, Objectives 1-6.)
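One way to picture this roll-up is as a nested structure in which objective-level results are aggregated upward into outcome-level and then standard-level summaries. The sketch below is illustrative only; the structure mirrors the slide diagram, and the percentages of students meeting each objective are invented.

```python
from statistics import mean

# Hypothetical roll-up: a standard is described by outcomes, and each outcome
# is measured through objectives scored as the proportion of students meeting them.
standard_1 = {
    "Outcome 1": {"Objective 1": 0.82, "Objective 2": 0.74},
    "Outcome 2": {"Objective 3": 0.65, "Objective 4": 0.91},
    "Outcome 3": {"Objective 5": 0.58, "Objective 6": 0.77},
}

# Aggregate objective results into outcome-level results...
outcome_results = {name: mean(objs.values()) for name, objs in standard_1.items()}
# ...and outcome-level results into a single standard-level result.
standard_result = mean(outcome_results.values())

for name, value in outcome_results.items():
    print(f"{name}: {value:.0%} of objectives met on average")
print(f"Standard 1 overall: {standard_result:.0%}")
```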
Designing Outcome Studies
In most instances, professional school counselors should measure outcomes or results by designing a research-type study. Answering the following questions can help determine which research design to use:
- Has the treatment already been implemented?
- Can I randomly assign participants to treatment conditions?
- Can I conduct (one or several) pretests, posttests, or both?
True experimental designs involve randomization of participants and allow causative conclusions to be reached.
Professional school counselors need to select or design an outcome measure with sufficient sensitivity and direct applicability to the issue being studied in order to detect significant results.
Common Designs Used for Outcomes Research
- Pretest-posttest single-group design
- Case study
- Static-group comparison
- Two-sample pretest-posttest design
- Nonequivalent control group design
- Time series design
- Randomized pretest-posttest control group design
- Randomized posttest-only control group design
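As a hedged illustration of the last design in the list, the randomized pretest-posttest control group design, the sketch below compares gain scores for a treatment group and a control group with an independent-samples t-test (scipy is assumed to be available). The scores are invented, and a real outcome study would also examine assumptions, effect size, and practical significance.

```python
from scipy import stats

# Hypothetical pretest/posttest scores from a randomized study of a
# small-group study-skills intervention (treatment) vs. no intervention (control).
treat_pre  = [54, 61, 58, 49, 66, 57, 60, 52]
treat_post = [63, 70, 64, 58, 74, 61, 69, 60]
ctrl_pre   = [55, 60, 57, 50, 65, 58, 59, 53]
ctrl_post  = [57, 62, 58, 51, 66, 58, 61, 54]

# Compare gain scores (post - pre) between the two randomized groups.
treat_gain = [post - pre for pre, post in zip(treat_pre, treat_post)]
ctrl_gain  = [post - pre for pre, post in zip(ctrl_pre, ctrl_post)]

t_stat, p_value = stats.ttest_ind(treat_gain, ctrl_gain)
print(f"mean gain (treatment) = {sum(treat_gain)/len(treat_gain):.1f}")
print(f"mean gain (control)   = {sum(ctrl_gain)/len(ctrl_gain):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```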
Principles to Consider When Developing an Outcome Measure from Scratch
- Use simple language.
- Ask only about things the respondent can be expected to know.
- Make the question specific.
- Define terms that are in any way unclear.
- Avoid yes-no questions.
- Avoid double negatives.
- Don’t ask double-barreled questions.
- Use wording that has been adopted in the field.
- Include enough information to jog people’s memories.
- Look for secondhand opinions only when firsthand information is unavailable.
- Be sensitive to cultural differences.
- Learn how to deal with difficult respondent groups.
Common Data Collection Methods
- Interviews
- Observations
- Written questionnaires, surveys, and rating scales
- Program records and schedules
- Standardized and educator-made tests
- Academic performance indicators
- Products and portfolios
Action Research
Action research allows professional school counselors to focus on changing social, ecological, or client conditions in particular situations by creating a study and intervention to explore and solve a specific problem, usually in the client’s environment. It involves the analysis of data to improve practice and solve practical problems.
Advantages of action research: it requires minimal training; it helps develop effective, practice-based solutions to practical problems; and it creates a collaborative atmosphere in which professionals work together to address and improve conditions affecting students.
Action Research (cont.)
Steps to developing an action research plan:
1. Identify and clarify the research problem.
2. Gather data (from a specific group of individuals).
3. Analyze and interpret the data.
4. Create an action plan (the plan should be supported and validated by the gathered data).
5. Evaluate and reflect (critically and periodically evaluate the plan’s effectiveness).
Reporting the Results
It is important that professional school counselors involve the school counseling program advisory committee (SCPAC) in every step of the report-writing process. A comprehensive report may be helpful for analysis by the SCPAC; however, a 1-2 page executive summary should also be prepared for administrators, system administrators, and the school community. After the results have been reviewed by the SCPAC, professional school counselors, and administration, it is important to release the results of the program outcomes to relevant stakeholder groups at regular intervals.
Reporting the Results (cont.)
Possible ways to disseminate the school counseling program results:
- Written report
- Verbal presentation
- Multimedia presentation
- Journal articles
- Web page
- Videotape
- Posters
- E-mail
- Text messaging
- Newspaper article
Performance Appraisal
- The effectiveness of a school counseling program often relies on the competence and efficiency of its implementers.
- The services provided by, and responsibilities of, the professional school counselor are the most complex of any school employee.
- It is best practice to have counselor supervisors coordinate the appraisal of counselor performance.
- The criteria for any evaluation should always be available well in advance.
Performance Appraisal (cont.)
- Most professional school counselor performance appraisals consist of a rating system that aggregates or averages responses across a variety of categories or work-related responsibilities.
- The primary focus of the appraisal system should be to enhance the quality and competence of all those evaluated.
- It is important that professional school counselors be held to performance standards such as those provided by the ASCA National Model (2005).