Training and Instructional Design Unit 6: Assessment This material (Comp20_Unit6) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by Columbia University under Award Number 90WT0004. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
Assessment Learning Objectives • Objective 1: Identify an instructional problem • Objective 2: Plan and implement an instructional needs assessment • Objective 3: Analyze learner, task, and situational characteristics
Evaluation and learner assessment in instructional design. Figure 6.1 (Hall & Zimmerman, 2012).
Purpose of program evaluation • Evaluation determines the extent to which a program achieves its stated goals and objectives • The results can be used to inform: • Decision-making to improve the program • Revisions to program goals and objectives • Budget requests and resource allocation • Progress reports to management
The evaluation plan • How well are the stated training goals being met? • How will the information be collected? • How will the data be used to improve the program?
The evaluation plan (Cont’d – 1) • The assessment plan should outline: • Program goals • Intended outcomes • Methods for gathering and interpreting data • A plan for using the data as evidence to inform program improvement • When and how the evaluation will be conducted
Develop a logic model • Inputs: resources, contributions, investments • Outputs: activities, services, events, products • Outcomes: results or changes for participants
Example of a simple logic model • Inputs: get pills • Outputs: take pills • Outcomes: feel better
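To make the logic model concrete in code, here is a minimal sketch (not part of the original material) that represents the three components as a simple data structure; the class and field names are illustrative assumptions, and the values reproduce the "pills" example above.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic model: inputs -> outputs -> outcomes."""
    inputs: list[str] = field(default_factory=list)    # resources, contributions, investments
    outputs: list[str] = field(default_factory=list)   # activities, services, events, products
    outcomes: list[str] = field(default_factory=list)  # results or changes for participants

# The "pills" example from the slide above
pills_model = LogicModel(
    inputs=["get pills"],
    outputs=["take pills"],
    outcomes=["feel better"],
)
print(pills_model)
```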
Logic model: EHR training workshop. Figure 6.2 (Hall & Zimmerman, 2012).
Purpose of the logic model • The logic model helps us: • Match the evaluation to the program • Determine what and when to measure • Decide on the type of evaluation: process and/or outcome • Determine the testing instruments and procedures • Prioritize where to spend evaluation resources
Assessment and evaluation definitions • Process evaluation: • Determines if activities are delivered to the target population as intended • Outcome evaluation: • Determines if the program successfully produced the desired changes or outcomes • Formative evaluation: • Gathers data to inform program improvement • Summative evaluation: • Makes summary judgments about a program’s performance
Writing assessment plans • For each evaluation objective, we need to identify: • Evaluation questions • Collection methods • Instruments used for data collection
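As one hedged illustration of this structure (not from the original slides), each row of an assessment plan could be modeled as an objective plus its questions, methods, and instruments. The example objective and questions echo the EHR workshop; the method and instrument names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EvaluationObjective:
    """One row of an assessment plan: an objective and how it will be measured."""
    objective: str
    questions: list[str]           # evaluation questions
    collection_methods: list[str]  # how the information will be collected
    instruments: list[str]         # instruments used for data collection

plan = [
    EvaluationObjective(
        objective="Deliver all EHR training sessions to the intended population",
        questions=["Were all the sessions delivered?",
                   "Was the intended population trained?"],
        collection_methods=["attendance tracking", "session schedule review"],
        instruments=["sign-in sheets"],
    ),
]
for row in plan:
    print(row.objective, "->", row.questions)
```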
Review of logic model: EHR training workshop. Figure 6.2 (Hall & Zimmerman, 2012).
Process and formative evaluation questions • How much money and time were invested? • Were all the sessions delivered? • How many people were trained? • Was the intended population trained? • How well were the sessions delivered? • What was the level of participation? • Were participants satisfied?
Process evaluation plan. Table 6.1 (Zimmerman, 2010).
Process evaluation plan (Cont’d – 1). Table 6.2 (Zimmerman, 2010).
Outcome evaluation questions • To what extent did skills improve? • To what extent did behaviors improve? • Which participants demonstrated a change in skills and behaviors?
Outcome / learner assessment • Learner assessments start with good learning objectives • Create learning objectives that: • Focus on student performance • Outline measurable and observable tasks • State how you will know that the objective has been reached
Outcome / learner assessment (Cont’d – 1) • Example objective: by the end of the workshop, participants will be able to recall the steps of the intake process • This objective can be assessed with an oral test, or by observing participants’ performance as they use the computer to complete the task • However, multiple assessment methods are best for authentically measuring student learning
Types of learner assessments. Table 6.3 (Zimmerman, 2010).
Using rubrics • A rubric is a set of criteria used for assessing a piece of work or performance • Includes levels of achievement for each criterion • Each level often carries a numerical score, and a summary score may be produced by adding the criterion scores • May also include space to describe the reasons for each score
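As a rough sketch of how a rubric's numerical scores roll up into a summary score (assuming 0–3 achievement levels; the criterion names and scores below are invented, not the original rubric):

```python
# Achievement level earned per criterion (0 = lowest, 3 = highest).
# Criterion names and scores are illustrative only.
rubric_scores = {
    "information gathering": 3,
    "interpretation and synthesis": 2,
    "authentic use of information": 1,
}

MAX_LEVEL = 3
summary_score = sum(rubric_scores.values())  # summary score: sum of criterion scores
print(f"Summary score: {summary_score} / {MAX_LEVEL * len(rubric_scores)}")
```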
Sample rubric criterion • The performance task requires students to collect and process information, using it for an authentic purpose (Scarborough, 2006; based upon White and Scarborough, 2004)
Sample rubric levels of possible achievement • The task incorporates a variety of information-gathering techniques and information resources; students must interpret and synthesize information, accurately assess the value of the information gathered, and collect the right information for an authentic purpose (e.g., to solve a problem or to apply it in a complex project) • The task requires students to gather and synthesize information, but the value of the information gathered is not assessed, and the information may not be used for a purpose • The task requires students to gather information, but not to interpret it • The task requires no gathering or processing of information
Review of outcome evaluation questions • To what extent did skills improve? • To what extent did behaviors improve? • Which participants demonstrated a change in skills and behaviors?
Outcome evaluation / learner assessment plan. Table 6.4 (Zimmerman, 2010).
How to use evaluation results • Results can open up a discussion about a program’s strengths and weaknesses and inform changes to improve its effectiveness • Compare program objectives with the outcome data, and determine the degree to which each objective was met or not met • Understanding the reasons why a program succeeded or failed can lead to recommendations
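One simple way to operationalize that comparison is to score each objective against its target. This is a sketch with made-up targets and observed values, not data from the course:

```python
# Hypothetical targets vs. observed outcome data for each program objective.
objectives = {
    "recall the steps of the intake process": {"target": 0.90, "observed": 0.82},
    "participant satisfaction":               {"target": 0.85, "observed": 0.91},
}

for name, d in objectives.items():
    pct_of_target = d["observed"] / d["target"] * 100
    status = "met" if d["observed"] >= d["target"] else "not met"
    print(f"{name}: {pct_of_target:.0f}% of target ({status})")
```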
Communicating evaluation results • Evaluation reports should outline: • The objectives and outcomes • How the data were collected and from which sources (e.g., participants) • Barriers encountered • Recommendations
Unit 6: Assessment, Summary • Program evaluation and learner assessment results can be used to inform: • Decision-making to improve the program • Revisions to program goals and objectives • Budget requests and resource allocation • Progress reports to management
Assessment References
Carkhuff, R.R., & Fisher, S.G. (1984). Instructional systems design: Volumes I & II. Amherst, MA: Human Resource Development Press.
Carliner, S. (2003). Training design. Danvers, MA: American Society for Training and Development.
Chatterji, M. (2003). Designing and using tools for educational assessment. Massachusetts: Allyn & Bacon.
Clark, D.R. (2004). Bloom’s taxonomy [monograph on the Internet]. Big Dog & Little Dog’s Performance Juxtaposition. Retrieved June 21, 2010, from http://www.nwlink.com/~donclark/hrd/sat.html.
Clark, D.R. (2004). Instructional system design (ISD). Big Dog & Little Dog’s Performance Juxtaposition.
Gagne, R.M., Wager, W.W., & Golas, K. (2004). Principles of instructional design (5th ed.). California: Wadsworth Publishing.
Reigeluth, C.M. (1999). Instructional-design theories and models: A new paradigm of instructional theory. Mahwah, NJ: Lawrence Erlbaum Associates.
Assessment References (Cont’d – 1) Charts, Tables, and Figures:
Table 6.1: Zimmerman, J. (2010). Process evaluation. Department of Biomedical Informatics, Columbia University Medical Center, New York, NY.
Table 6.2: Zimmerman, J. (2010). Process evaluation. Department of Biomedical Informatics, Columbia University Medical Center, New York, NY.
Table 6.3: Zimmerman, J. (2010). Types of learner assessments. Department of Biomedical Informatics, Columbia University Medical Center, New York, NY.
Table 6.4: Zimmerman, J. (2010). Outcome evaluation/learner assessment plan. Department of Biomedical Informatics, Columbia University Medical Center, New York, NY.
Figure 6.1: Hall, M.V., & Zimmerman, J. (2012).
Figure 6.2: Hall, M.V., & Zimmerman, J. (2012).
Training and Instructional Design Unit 6: Assessment This material (Comp 20 Unit 6) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000013. This material was updated in 2016 by Columbia University under Award Number 90WT0005.