This lecture outlines essential data collection methods for Human Resource Development (HRD) evaluation, including interviews, questionnaires, direct observation, written tests, simulations, and archival performance data. The advantages and limitations of each method are discussed, emphasizing the importance of choosing the right method based on reliability, validity, and practicality. The lecture also covers the types of data used, such as individual performance, systemwide performance, and economic performance data. Considerations about self-report data and research design are highlighted, addressing potential biases and the significance of a well-defined research plan for HRD evaluation.
Advances in Human Resource Development and Management Course Code: MGT 712 Lecture 29
Recap of Lecture 28 • Payoff of Training • Why Do HRD Programs Fail to Add Value? • HRD Process Model • Effectiveness • HRD Evaluation • Purposes of Evaluation • Models and Frameworks of Evaluation • Comparing Evaluation Frameworks
Learning Objectives: Lecture 29 • Data Collection for HRD Evaluation • Data Collection Methods • Advantages and Limitations of Various Data Collection Methods • Choosing Data Collection Methods • Type of Data Used/Needed • Use of Self-Report Data • Research Design • Ethical Issues Concerning Evaluation Research
Data Collection for HRD Evaluation Evaluation efforts require the collection of data to provide decision makers with facts and judgments upon which they can base their decisions. Three important aspects of providing information for HRD evaluation include: • Data Collection Methods • Types of Data • Self-report Data
Data Collection Methods • Interviews • Questionnaires • Direct observation • Written tests • Simulation/Performance tests • Archival performance data
Interviews
Advantages: • Flexible • Opportunity for clarification • Depth possible • Personal contact
Limitations: • High reactive effects • High cost • Face-to-face threat potential • Labor intensive • Trained interviewers needed
Questionnaires
Advantages: • Low cost to administer • Anonymity possible • Honesty increased if anonymous • Respondent sets the pace • Variety of options
Limitations: • Possibly inaccurate data • On-job responding conditions not controlled • Uncontrolled return rate
Direct Observation
Advantages: • Nonthreatening • Excellent way to measure behavior change
Limitations: • Possibly disruptive • Reactive effects are possible • May be unreliable • Need trained observers
Written Tests
Advantages: • Low purchase cost • Easily administered • Readily scored • Quickly processed • Wide sampling possible
Limitations: • May be threatening • Possibly no relation to job performance • Measures only cognitive learning • Reliance on norms may distort individual performance
Simulation/Performance Tests
Advantages: • Reliable • Objective • Close relation to job performance • Includes cognitive, psychomotor, and affective domains
Limitations: • Time consuming • Simulations often difficult to create • High costs of development and use
Archival Performance Data
Advantages: • Reliable • Objective • Job-based • Easy to review • Minimal reactive effects
Limitations: • Lack of knowledge of criteria for keeping/discarding records • Information system discrepancies • Indirect • Need conversion to usable form • Records prepared for other purposes
Choosing Data Collection Methods • Reliability: Consistency of results, and freedom from error and bias, in a data collection method • Validity: Does the data collection method actually measure what we want it to measure? • Practicality: Does it make sense in terms of the resources used to get the data?
Type of Data Used/Needed • Individual performance • Systemwide performance • Economic performance
Individual Performance Data • Individual performance data emphasize the individual trainee’s knowledge and behaviors. • Test scores • Performance quantity, quality, and timeliness • Attendance records • Attitudes
Systemwide Performance Data Systemwide performance data concern the team, division, or business unit in which the HRD program was conducted, and could include data concerning the entire organization. • Productivity • Scrap/rework rates • Customer satisfaction levels • On-time performance levels • Quality rates and improvement rates
Economic Data • Economic data report the financial and economic performance of the organization or unit • Profits • Product liability claims • Avoidance of penalties • Market share • Competitive position • Return on investment (ROI) A complete evaluation effort is likely to include all three types of data.
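ROI is conventionally computed as net program benefits divided by program costs, expressed as a percentage. A minimal sketch with hypothetical figures:

```python
def training_roi(benefits: float, costs: float) -> float:
    """ROI as a percentage: net program benefits relative to program costs."""
    return (benefits - costs) / costs * 100

# Hypothetical figures: a program costing $40,000 that yields $70,000
# in measured benefits (e.g., reduced scrap/rework rates)
print(training_roi(70_000, 40_000))  # -> 75.0, i.e., a 75% return
```

The hard part in practice is not the arithmetic but converting systemwide outcomes such as quality improvements into credible dollar benefits.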
Use of Self-Report Data • The most commonly used type of evaluation data • Collected to measure personality, attitudes, and perceptions • Problems:
Mono-method bias: If both the pre-test and post-test reports come from the same source at the same time (after training), the conclusions may be questionable.
Socially desirable responses: Respondents may report what they think the researcher or their boss wants to hear rather than the truth, or may be afraid to admit that they learned nothing.
Response shift bias: Respondents’ perceptions of their pre-training skills change during the training program and affect their post-training assessment.
• Relying on self-report data alone may therefore be problematic
Research Design Research design is a plan for conducting an evaluation study. • Research design is critical to HRD evaluation. • It specifies in advance: • The expected results of the study • The methods of data collection to be used • How the data will be analyzed
Research Design In the weakest, one-shot approach, outcomes (if they are measured at all) are collected only after the training program has been completed. • Such a one-shot approach may not capture real changes that have occurred as a result of training • We cannot be certain that the outcomes attained were due to the training
Experimental Design: Pretest-Posttest with Control Group • Pretest and posttest: Including both a pretest and a posttest allows the trainer to see what has changed after the training. • Control group: A group of employees similar to those who receive training, but who do not receive it. This group completes the same evaluation measures so its scores can be compared with those of the trained group. • Random assignment: Trainees are assigned at random to the treatment and control groups so that the groups have similar characteristics. This two-group design is often regarded as the minimum acceptable research design for training/HRD evaluation efforts, because the pretest and the control group help rule out the effects of prior knowledge and of events outside the training.
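The logic of this design can be sketched with hypothetical scores: the training effect is estimated as the trained group's pre-to-post gain minus the control group's gain, which nets out pretest exposure, maturation, and events common to both groups.

```python
from statistics import mean

# Hypothetical pretest/posttest scores (0-100) for randomly assigned groups
trained_pre  = [55, 60, 52, 58, 61, 57]
trained_post = [72, 78, 70, 76, 80, 75]
control_pre  = [56, 59, 54, 57, 60, 58]
control_post = [58, 61, 56, 59, 62, 60]

trained_gain = mean(post - pre for pre, post in zip(trained_pre, trained_post))
control_gain = mean(post - pre for pre, post in zip(control_pre, control_post))

# The estimated training effect is the gain beyond what the control
# group achieved over the same period (a difference in gains)
effect = trained_gain - control_gain
print(trained_gain, control_gain, effect)  # -> 18 2 16
```

With real evaluation data, the two groups' gains would also be compared with a significance test rather than by inspecting means alone.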
Experimental Design: Pretest-Posttest with Control Group • Time series design: Repeated measurements before and after training allow the trainer to observe patterns in individual performance over time. • Sample size: The number of people providing data for a training evaluation is often lower than what is recommended for statistical analysis. As a bare minimum, the training and control groups each need at least thirty individuals to have even a moderate chance of obtaining statistically significant results.
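One rough way to gauge the thirty-per-group figure (an addition to the lecture, not part of it) is Lehr's rule of thumb for comparing two group means: roughly 16/d² participants per group give about 80% power at a two-sided alpha of .05, where d is the expected standardized effect size.

```python
import math

def n_per_group(effect_size_d: float) -> int:
    """Lehr's rule of thumb: about 16 / d^2 participants per group for
    ~80% power at two-sided alpha = .05 when comparing two group means."""
    return math.ceil(16 / effect_size_d ** 2)

print(n_per_group(0.8))  # large training effect  -> 25 per group
print(n_per_group(0.5))  # medium training effect -> 64 per group
```

So thirty per group is adequate only when the training effect is expected to be large; detecting a medium-sized effect reliably takes roughly twice that.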
Ethical Issues Concerning Evaluation Research • Confidentiality: When confidentiality is ensured, employees are more willing to participate. • Informed consent: Participants should know the potential risks and benefits of the study before agreeing to take part. Informed consent motivates researchers to treat participants fairly, and it may improve the effectiveness of training by providing complete information. • Withholding training: When the results of training are used for raises or promotions, it seems unfair to place employees in control groups just for the purpose of evaluation. • Use of deception: When an investigator feels a study would yield better results if employees did not realize they were in an evaluation study. • Pressure to produce positive results: Trainers may be under pressure to make sure the evaluation results demonstrate that the training was effective.
HRD Evaluation Steps • Analyze needs. • Determine explicit evaluation strategy. • Insist on specific and measurable training objectives. • Obtain participant reactions. • Develop criterion measures/instruments to measure results. • Plan and execute evaluation strategy.
Summary of Lecture 29 • Data Collection for HRD Evaluation • Data Collection Methods • Advantages and Limitations of Various Data Collection Methods • Choosing Data Collection Methods • Type of Data Used/Needed • Use of Self-Report Data • Research Design • Ethical Issues Concerning Evaluation Research
Reference Books • Werner, J. M., & DeSimone, R. L., Human Resource Development: Foundation, Framework and Application. Cengage Learning, New Delhi.
Thank you!