D1.HRD.CL9.07 D1.HHR.CL8.08 EVALUATE THE EFFECTIVENESS OF AN ASSESSMENT SYSTEM
1. Plan the evaluation of the assessment system
Performance criteria for this element include:
1.1 Identify the purpose and role of the evaluation
1.2 Define the assessment system
1.3 Identify the needs of the stakeholders
1.4 Identify and obtain resources to enable the evaluation
1.5 Develop an evaluation plan
1.6 Determine the evidence that needs to be captured during the evaluation
1.7 Define the criteria for determining the effectiveness of the assessment system
1.8 Identify cost-effective methods for capturing and analysing evaluation data
1.9 Develop tools to capture identified evidence regarding the effectiveness of the assessment system
1.2 Define the assessment system In order to determine what will be evaluated, it is first necessary to define the assessment system. Each aspect of the system needs to be outlined so that the purpose of the evaluation can be aligned with the relevant parts of the system. The definition of the system should be clear, concise, complete and transparent.
1.3 Identify the needs of the stakeholders Stakeholders are a valuable resource for: • Determining and prioritising key evaluation questions • Trialling data collection methods and tools • Facilitating data collection • Implementing evaluation activities • Increasing the credibility of analysis and interpretation of evaluation information • Ensuring evaluation results are used
1.3 Identify the needs of the stakeholders cont’d Who are the stakeholders? • Learners • Trainers and assessors • Supervisors, managers, and business owners • Government agencies • Co-workers • Customers
1.4 Identify and obtain resources to enable the evaluation Brainstorming the possible resources that may be required to complete the evaluation will provide a clearer understanding of what is realistically achievable and what is not. In developing an idea of resources that may be needed, the following resource components should be considered: • Financial resources, including money to fund staff involved in the evaluation, to purchase necessary materials, to pay for travel, to support data gathering, analysis and interpretation • Physical resources, including equipment, materials, transportation, computer access, venue or office space • Human resources, including internal staff and external personnel, as appropriate
1.5 Develop an evaluation plan The evaluation plan needs to answer questions about the assessment system relating to the: • What? The “What” reflects the description and accomplishments of the system. • How? The “How” answers the question, “How did you do it?” and assesses how the system is being implemented and whether it is operating in alignment with the system’s policies and procedures. • Why it matters? The “Why It Matters” represents how the system makes a difference to the organisation as a whole.
1.6 Determine the evidence that needs to be captured during the evaluation Some examples of evidence that needs to be captured may include: • Rolls and other attendance forms that record participation in assessment • Samples of assessment evidence used by assessors to determine competency • Reports relating to assessment-related meetings held by trainers, assessors and learners/candidates • Information provided to learners regarding their assessment • Sample assessment items, such as tests, requirements for demonstrations, third party reports, observation checklists, portfolios of work • Criteria used for judging competency • Proof of expenditure relating to the cost of assessments • Evidence of learner progress
1.7 Define the criteria for determining the effectiveness of the assessment system In determining the appropriate criteria to be used for the evaluation, it is important to check: • The evaluation questions • The stakeholder needs and desired outcomes • The purpose of the evaluation
1.7 Define the criteria for determining the effectiveness of the assessment system Depending on the circumstances of the program and the evaluation questions, examples of possible criteria include: • Purpose or goals prescribed by law or regulation • Policies or procedures established by internal/external officials • Professional standards or norms • Expert opinions • Prior period’s performance • Performance of other entities or sectors used to benchmark performance
1.8 Identify cost-effective methods for capturing and analysing evaluation data To choose the appropriate methods in your evaluation plan, you will need to: • Keep in mind the purpose, assessment system description, stage of development of the system, evaluation questions, and what the evaluation can and cannot deliver • Confirm that the method(s) fits the question(s); there are a multitude of options, including but not limited to qualitative, quantitative, mixed-methods, multiple methods • Think about what will constitute credible evidence for stakeholders or users • Identify sources of evidence (i.e. persons, documents, observations, administrative databases, surveillance systems) and appropriate methods for obtaining quality (i.e., reliable and valid) data • Identify roles and responsibilities along with timelines to ensure the project remains on-time and on-track • Remain flexible and adaptive, and as always, transparent • Consider the cost of various methods
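One low-cost combination of capture and analysis, for example, is to export questionnaire responses to a spreadsheet or CSV file and tally them with a short script. The sketch below (Python, standard library only) is a minimal illustration, not part of this unit; the file name and the “question”/“rating” column layout are assumptions made for the example.

import csv
from collections import Counter, defaultdict

# Tally Likert-style ratings per question from an exported survey file.
# The file name and column names are assumptions for this illustration.
tallies = defaultdict(Counter)
with open("survey_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        tallies[row["question"]][row["rating"]] += 1

for question, counts in tallies.items():
    total = sum(counts.values())
    print(question)
    for rating, count in sorted(counts.items()):
        print(f"  {rating}: {count} of {total} ({count / total:.0%})")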
1.9 Develop tools to capture identified evidence regarding the effectiveness of the assessment system The basic components of an evaluation design include the following: • The evaluation questions, objectives, and scope • Information sources and measures, or what information is needed • Data collection methods, including any sampling procedures, or how information or evidence will be obtained • An analysis plan, including evaluative criteria or comparisons, or how or on what basis program performance will be judged or evaluated • An assessment of study limitations
1.9 Develop tools to capture identified evidence regarding the effectiveness of the assessment system There are many tools that could be developed and utilised to gather relevant evidence for the evaluation and these could include: • Survey instruments such as questionnaires, diaries, logs, attitude scales, diagnostics • Interview schedules and records • Observation sheets and checklists • Objective product analysis sheets • Identified evidence such as: • Affective evidence, i.e. satisfaction with the program • Cognitive evidence, i.e. relating to knowledge and/or skills gained • Performance or behaviour, i.e. relating to quality of work performed after training/assessment, productivity
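As a minimal sketch of what such a capture tool might look like in practice, the Python example below models a simple observation checklist record; the criteria, field names and class name are hypothetical and are shown only to illustrate the idea of a structured capture tool.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChecklistRecord:
    """One completed observation checklist for a single candidate."""
    candidate: str
    assessor: str
    observed_on: date
    results: dict = field(default_factory=dict)  # criterion -> observed (True/False)

    def all_criteria_met(self) -> bool:
        # Competency is only indicated when every listed criterion was observed.
        return bool(self.results) and all(self.results.values())

# Example usage with hypothetical criteria
record = ChecklistRecord(
    candidate="Candidate A",
    assessor="Assessor B",
    observed_on=date(2024, 5, 1),
    results={"Greets the customer appropriately": True,
             "Follows the food hygiene procedure": True},
)
print(record.all_criteria_met())  # True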
2. Undertake the evaluation of the assessment system
Performance criteria for this element include:
2.1 Trial the data gathering tools and techniques
2.2 Revise the data gathering tools and techniques on the basis of trials conducted
2.3 Collect the identified evidence in accordance with the approved evaluation plan
2.4 Store data
2.5 Analyse the data
2.1 Trial the data gathering tools and techniques After gathering some sample results from different aspects of the evaluation, it is necessary to ask some questions that relate to the data, such as: • Was the process easy to implement? • Did all participants understand what they had to do? • Is the data relevant to the evaluation questions and purpose? • Were the techniques cost effective? • Are there enough people available to conduct the evaluation? • What issues need to be rectified? • Is the reporting mechanism effective? • Can the data be easily stored and retrieved?
2.2 Revise the data gathering tools and techniques on the basis of trials conducted Some issues that may arise from the pilot process could include: • Challenges in gathering the data: time, location, cost, compiling the data, not enough human resources, not enough data records available • Issues with the content: questionnaire questions, survey questions, too many questions or not enough, language challenges for understanding questions • Data doesn’t reflect evaluation questions or purpose
2.3 Collect the identified evidence in accordance with the approved evaluation plan When collecting the identified evidence it is important to: • Use the data gathering tools and techniques as planned • Make sure the evidence is sufficient • Ensure privacy and confidentiality • Check the accuracy of the data • Allow for changes in data collection if there is not enough or inappropriate data gathered • Communicate with stakeholders • Ignore irrelevant data • Follow-up on relevant issues that are identified as part of the evaluation process • Make sure that all relevant data is gathered
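A minimal sketch, assuming the collected evidence has been compiled as a list of records with known required fields, of the kind of sufficiency and completeness check that supports the points above; the field names and the minimum-record threshold are illustrative assumptions, not requirements of this unit.

REQUIRED_FIELDS = {"candidate", "assessor", "date", "outcome"}
MINIMUM_RECORDS = 20  # illustrative sufficiency threshold taken from the evaluation plan

def check_evidence(records):
    """Return a list of data-quality issues found in the collected evidence."""
    issues = []
    if len(records) < MINIMUM_RECORDS:
        issues.append(f"Only {len(records)} records collected; at least {MINIMUM_RECORDS} were planned.")
    for number, record in enumerate(records, start=1):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            issues.append(f"Record {number} is missing: {', '.join(sorted(missing))}")
    return issues

# Example: one record with a missing field raises both a sufficiency and a completeness issue
print(check_evidence([{"candidate": "A", "assessor": "B", "date": "2024-05-01"}]))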
3. Prepare an evaluation report
Performance criteria for this element include:
3.1 Produce a written evaluation report
3.2 Distribute report to stakeholders for comment
3.3 Make a verbal presentation to support the report
3.4 Gather feedback on the report
3.5 Revise draft recommendations on the basis of feedback received
3.6 Determine action to be taken as a result of the evaluation
3.1 Produce a written evaluation report The written report generally mirrors the elements of the evaluation plan and includes: • Title page • Question overview • Intended use and users • Program description • Evaluation focus • Methods • Analysis and interpretation plan • Use, distribution of results, and recommendations
3.1 Produce a written evaluation report cont’d Checklist for an evaluation report: • Provide interim and final reports to intended users in time for use • Tailor the report content, format, and style for the audiences by involving audience members • Include an executive summary • Summarise the description of the stakeholders and how they were engaged • Describe essential features of the system (e.g., in appendices) • Explain the focus of the evaluation and its limitations • Include an adequate summary of the evaluation plan and procedures • Provide all necessary technical information (e.g., in appendices) • Specify the standards and criteria for evaluative judgments • Explain the evaluative judgments and how they are supported by the evidence • Include examples of research/evaluation tools used • List both strengths and weaknesses of the evaluation • Discuss recommendations for action with their advantages, disadvantages, and resource implications • Ensure protections for program clients and other stakeholders • Anticipate how people or organisations might be affected by the findings • Verify that the report is accurate and unbiased • Organise the report logically and include appropriate details • Remove technical jargon • Use examples, illustrations, graphics, charts and stories
3.2 Distribute report to stakeholders for comment When developing your communication or dissemination strategy, carefully consider the following: • With which target audiences or groups of stakeholders will you share findings? • What formats and channels will you use to share findings? • When and how often do you plan to share findings? • Who is responsible for carrying out dissemination strategies?
3.2 Distribute report to stakeholders for comment Some questions to ask about the potential audience(s) are the following: • Who is a priority? • What do they already know about the topic? • What is critical for them to know? • Where do they prefer to receive their information? • What is their preferred format? • What language level is appropriate? • Within what time frame are evaluation updates and reports necessary?
3.3 Make a verbal presentation The following checklist can be used to determine who your audience is: • What is the group size? How many people will be attending the presentation? • What is the average age of the audience? Is there a wide variety of ages represented or are all participants of a similar age? • What jobs or positions do the people in the audience hold? How does this relate to your position? • Why are people attending your presentation? • How much does the audience know about your topic? • Where are you presenting? What time is the presentation? • Is the room air conditioned/heated? Is it small, large, hot, cold or loud? • Has the audience been working all day, or is the presentation very early in the morning?
3.3 Make a verbal presentation When selecting relevant information you should: • Know who your audience is and what they expect from your presentation • Have a clear purpose or goal for your presentation • Know how long your talk will go for • Write an outline for your presentation • Understand clearly the context or why you are presenting • Choose information from appropriate sources that relates to your outline
3.4 Gather feedback on the report A variety of activities can be included in your evaluation plan to solicit stakeholder feedback and facilitate interpretation of evaluation data, including: • Meetings, surveys, feedback forms and interviews
3.5 Revise draft recommendations on the basis of feedback received Any recommendations need to be: • Clearly numbered and linked to the evaluation question • Based upon evidence that is sufficient and valid • Made available to relevant stakeholders in order for them to provide feedback • Reviewed by stakeholders and any changes agreed upon • Revised where necessary, with any changes documented and approved by relevant stakeholders
3.6 Determine action to be taken as a result of the evaluation The actions will be specific to the assessment system that has been evaluated and they could include: • Continuing with the existing assessment system arrangements • Modifying assessment arrangements with existing internal and/or external assessment providers • Discontinuing the existing assessment system arrangements • Moving internal assessment to an external provider and/or moving external assessments to an internal system • Changing external assessment providers • Communicating satisfaction and/or dissatisfaction with assessment service providers • Placing additional specific service requirements on assessment service providers • Altering the internal assessors used, or providing existing assessors with identified training to address deficiencies in their practice • Modifying existing assessment system components due to findings from the evaluation, including changing assessment methods, venues, timings and assessment tools