Explore different methods of recruiting diverse candidates and understand legal challenges in recruitment, as well as effective selection systems and decision-making processes. Learn about the relationships among job analysis, HR planning, recruitment, and selection.
Week 6: Recruitment and Selection
Agenda for Today
• Understand the components of the labor market and different methods of recruiting candidates, especially diverse candidates
• Understand adverse impact and other legal challenges in recruiting
• Understand effective selection systems and the different components of an effective selection tool
• Understand the reliability and validity of selection devices
• Understand different ways to make selection decisions
Relationships Among Job Analysis, Human Resource Planning, Recruitment, and Selection
• Job Analysis: the nature and requirements of specific jobs; produces the job description
• Human Resource Planning: How many employees are needed? When are they needed? What KSAs are needed? Any special qualifications?
• Recruitment: What sources? How are qualified candidates to be recruited? Which recruiters? What inducements? Produces the pool of qualified applicants
• Selection: chooses from the pool of qualified applicants
Developing Recruitment Policies: Understanding the Labor Market
• What defines the limits of a labor market?
  • Geography
  • Education and/or technical background required to perform a job
  • Industry
  • Licensing or certification requirements
  • Union membership
• Who is in the Labor Force Population? All individuals who are available for selection if all possible recruitment strategies are used
• Who is in the Applicant Population? A subset of the labor force population that is available for selection using a particular recruiting approach
• Who is in the Applicant Pool? All persons who are actually evaluated for selection
Methods of Locating Qualified Job Candidates
• Recruiting from Within the Organization
  • Human Resource Information Systems (HRIS)
  • Job Posting and Bidding
• Recruiting from Outside the Organization
  • Advertisements
  • Unsolicited Applications and Resumes
  • Internet
  • Employee Referrals (e.g., Cisco, Trilogy Software)
  • Executive Search Firms
  • Educational Institutions
  • Professional Organizations
  • Labor Unions
  • Public/Private Employment Agencies
  • Temporary Help Agencies; Employee Leasing
Advantages and Disadvantages of Internal and External Recruiting Strategies
Effectiveness of Different Recruitment Sources
Average rating for nine recruitment sources on a 5-point scale, as reported by 201 HR executives (1 = Not Good; 3 = Average; 5 = Extremely Good). (From Terpstra, HR Focus, May 1996)
Realistic Job Previews (RJPs): The Key to Recruitment Success
• Higher performance and lower turnover costs with RJPs
  • Lowers expectations to acceptable levels
  • Vaccinates against job reality
  • Conveys an air of honesty
  • Offers a chance to self-select out of the job
• Effective RJPs provide:
  • Credible, consistent, and understandable information (e.g., 24-hour "recruit cams" or "day-in-the-life" employee profiles)
  • Information relevant to the candidates
  • An optimal amount of negative information
• RJPs are effective both before and after hire
• RJPs work best (1) when few applicants are hired, (2) for entry-level positions, and (3) when unemployment is low
Legal Considerations in Recruiting Decisions
• External recruiting that aims to reduce the occurrence of disparate impact by specifically targeting a wide variety of sources and members of protected classes (e.g., colleges with large female or minority populations, professional associations, spouse relocation/assistance programs)
• Internal recruiting strategies need to be extra vigilant in preventing disparate impact (e.g., referrals, walk-ins)
• Employment advertising must not create an unequal opportunity (e.g., phrases such as "young and enthusiastic" or "recent college graduate" are impermissible)
• Bottom-line question: Are recruiting efforts providing the organization qualified applicants with an appropriate mix of protected-class individuals?
Corporate Recruiting Policies for Diversity
• Passive nondiscrimination: a commitment to treat all races and both sexes equally in all hiring, promotion, and pay decisions, with no attempt to actively recruit minority applicants. Fails to recognize that past discriminatory practices block prospective applicants from seeking present job opportunities.
• Pure diversity-based recruitment: actively expand the applicant pool so that no one is excluded because of past or present discrimination.
• Diversity-based recruitment with preferential hiring: goes further than pure diversity-based recruitment by systematically favoring women and minorities in hiring and promotion decisions.
• Hard quotas: a mandate to hire or promote specific numbers of women or minority-group members.
Recruiting for Diversity
• Use women and members of underrepresented groups
  • as interviewers in HR offices
  • on recruiting trips to high schools, colleges, and job fairs
  • in advertisements
• Establish contacts in the groups targeted for recruitment through community or professional organizations (e.g., Society of Mexican-American Engineers & Scientists, National Society of Black Engineers)
• Offer an additional bonus to search firms for finding qualified minority applicants
• Outreach & Recruitment Directory: A Resource for Diversity Related Recruitment Needs (2001)
• In the end, candidates look for (1) gender and ethnic diversity in the workforce and upper management and (2) the availability of training and career development programs
Evaluating Recruitment Efforts: Adverse Impact
• Selection Rate (S.R.): the ratio of the total number of applicants selected to the total number of applicants
• UGESP: calculate the S.R. (a) for each job category, (b) for both external and internal selection decisions, (c) for each step in the selection process, and (d) by race and sex of applicants
• Yield Ratio: the percentage of job applicants at the beginning of a step in the selection process who move on to the next step in that process (Breaugh, 1997)
• Example of applicant flow data/statistics (also computed in the sketch below):
  • Applicants: 100 → Final interviews: 20 → Offers made: 10 → Hired: 8 → On the job after 1 year: 6
  • Yield ratios: Applicants/Final interviews = 5:1; Final interviews/Offers = 2:1; Offers/Hires = 5:4; Hired/Retained = 4:3
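The yield-ratio and selection-rate arithmetic above is easy to automate. The following Python sketch is our own illustration, not part of the original slides; it uses the stage counts from the example and a hypothetical helper yield_ratio to reduce each pair of consecutive counts to its simplest ratio.

```python
from math import gcd

# Applicant flow data from the slide (counts at each successive stage)
stages = [
    ("Applicants", 100),
    ("Final interviews", 20),
    ("Offers made", 10),
    ("Hired", 8),
    ("On the job after 1 year", 6),
]

def yield_ratio(earlier: int, later: int) -> str:
    """Express earlier:later counts as a reduced ratio, e.g. 100:20 -> 5:1."""
    g = gcd(earlier, later)
    return f"{earlier // g}:{later // g}"

# Yield ratio for each pair of consecutive stages
for (name_a, n_a), (name_b, n_b) in zip(stages, stages[1:]):
    print(f"{name_a} / {name_b}: yield ratio = {yield_ratio(n_a, n_b)}")

# Overall selection rate = number hired / number of applicants
counts = dict(stages)
print(f"Selection rate = {counts['Hired'] / counts['Applicants']:.2f}")  # 8/100 = 0.08
```

Running it reproduces the ratios listed on the slide (5:1, 2:1, 5:4, 4:3) and an overall selection rate of 0.08.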
Common Goals in Recruitment and Selection: Employer and Employee Perspectives
• Recruitment and selection is a mutual matching process between the organization and the individual
• The organization's job requirements are matched against the individual's qualifications, and the organization's rewards are matched against the individual's motivation
• A good match produces person-organization/job fit
Effective Selection Systems
• What constitutes effective job performance?
• What characteristics are needed to do the job successfully?
• What predicts successful job performance?
How to Choose 'Good' Selection Devices
• Selection devices need to be both reliable and valid
• Reliability: an index of the consistency of a measure
  • When using any kind of selection device, organizations must ensure that the characteristics of each candidate are measured in the same way
  • Reliability coefficient: the correlation among repeated applications of the same measure
  • Can range from –1.0 to +1.0 (in practice, the lowest reliability approaches zero)
• Types of reliability
  • Test-retest
  • Parallel forms
  • Internal consistency (across raters/items)
Reliability of Selection Devices
• Reliability is an index of the extent to which an individual's score on a test reflects the actual characteristic being measured rather than irrelevant factors: Observed Score = True Score + Error Score
  • Observed score = the test score
  • True score = what the test would yield with perfect reliability
  • Error score = the influence of extraneous factors
• The greater the reliability, the more closely the observed score approximates the true score (illustrated in the sketch below)
• Measures vary in reliability: high (physical characteristics), moderate (abilities, skills), low (personality)
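To make "Observed Score = True Score + Error Score" concrete, here is a minimal simulation we added (assuming NumPy is available; none of the numbers come from the slides). It generates two administrations of the same test from a fixed set of true scores plus random error and shows that the test-retest reliability coefficient falls as the error component grows.

```python
import numpy as np

rng = np.random.default_rng(0)
n_applicants = 500
# The stable characteristic the test is trying to measure
true_scores = rng.normal(loc=50, scale=10, size=n_applicants)

def test_retest_reliability(error_sd: float) -> float:
    """Correlate two administrations of the same test: observed = true + error."""
    time1 = true_scores + rng.normal(0, error_sd, n_applicants)
    time2 = true_scores + rng.normal(0, error_sd, n_applicants)
    return float(np.corrcoef(time1, time2)[0, 1])

for error_sd in (2, 10, 25):   # small, moderate, large error components
    print(f"error SD = {error_sd:>2}: reliability ~ {test_retest_reliability(error_sd):.2f}")

# As the error score shrinks, the observed score approximates the true score
# and the reliability coefficient approaches 1.0.
```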
Validity of Selection Devices
• Validity: does the selection device measure what we think it measures?
• Validity coefficient: the correlation between the predictor (test) and the criterion (job performance)
  • Can range from –1.0 to +1.0
• The criterion used (job performance) must itself be valid
• Reliability is a prerequisite to validity:
  • You can have reliable measures that are not valid, but you cannot have valid measures that are not reliable
Types of Validation Strategies
• Criterion-related validity: used to predict how well an applicant will perform on the job
  • Predictive: measured when the test results of applicants are compared with their subsequent job performance
  • Concurrent: measured when the employer tests current employees and correlates their scores with their performance ratings
• Content validity: the extent to which a selection method samples the KSAs and other characteristics necessary to perform the job
• Construct validity: the relationship between an abstract characteristic (e.g., conscientiousness) and job performance
Predictive Validation Process
1. The test is administered to all job applicants
2. Job applicants are hired without regard to test scores
3. New employees receive basic orientation and training, then perform the job
4. After a time delay, production records/performance evaluations are used as the criterion of job success
5. Test scores are correlated with the criterion (correlation analysis)
6. If significant predictive validity exists, use the test on future applicants as a predictor of job success
Predictive Validity
• Uses the applicant pool to validate the test
• Procedure (the correlation step is worked through in the sketch below):
  • Administer the new test to a group of applicants
  • Use the old test to select the applicants
  • After some time, collect data on the criterion (job performance)
  • Compute the validity coefficient
• Advantages:
  • Represents the applicant pool
  • Preferred by the EEOC
• Disadvantages:
  • May hire unqualified employees (the new test's scores are not used in the hiring decision)
  • Needs a large number of applicants (at least 30)
  • Time lag (usually about one year between hiring and the collection of criterion data)
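A compact sketch of the final correlation step in predictive validation, using invented data (the scores and variable names are ours): test scores collected at hire are correlated with supervisor ratings gathered about a year later, and that Pearson correlation is the validity coefficient.

```python
import numpy as np

# Hypothetical data: test scores at hire and supervisor ratings one year later
test_scores = np.array([62, 75, 58, 90, 70, 84, 55, 68, 79, 88])
performance = np.array([3.4, 3.1, 2.9, 4.2, 3.8, 3.6, 3.3, 2.8, 4.0, 3.9])

# Validity coefficient = correlation between predictor and criterion
validity = np.corrcoef(test_scores, performance)[0, 1]
print(f"Validity coefficient r = {validity:.2f}")

# Rule of thumb from a later slide: .00-.15 low, .16-.30 moderate, .31+ high.
```

With these made-up numbers r comes out around .70; as a later slide notes, real-world validities for selection methods rarely exceed .50-.60.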
Concurrent Validation Process
1. The test is administered to all present employees holding a particular job, and test scores are determined
2. Production records or evaluations of present employees' job performance are used as the criterion of job success
3. Test scores are correlated with the criterion (correlation analysis)
4. If significant validity exists, use the test on future applicants as a predictor of job success
Concurrent Validity
• Uses current employees to validate the test
• Procedure:
  • Administer the predictor (test) to existing employees
  • Collect criterion (job performance) data immediately
  • Compute the validity coefficient
• Advantages:
  • Convenient, efficient, cheap
  • Saves time; preferred by employers
• Disadvantages:
  • Current employees may differ from applicants in diversity, motivation/anxiety, and on-the-job learning
  • Restriction of range on the criterion (illustrated numerically in the sketch below)
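The restriction-of-range disadvantage can be shown numerically. The simulation below is our own illustration (assuming NumPy): it correlates a predictor with a criterion in a full simulated applicant pool and again among only the top half of scorers, mimicking validation on incumbents who have already been screened.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Simulated applicant pool: criterion = predictor plus independent noise
predictor = rng.normal(size=n)
criterion = 0.6 * predictor + rng.normal(scale=0.8, size=n)

full_r = np.corrcoef(predictor, criterion)[0, 1]

# "Incumbents": only those above the median predictor score were hired/retained
retained = predictor > np.median(predictor)
restricted_r = np.corrcoef(predictor[retained], criterion[retained])[0, 1]

print(f"Validity in full applicant pool:       r = {full_r:.2f}")
print(f"Validity in restricted incumbent group: r = {restricted_r:.2f}")
```

In this simulation the correlation drops from roughly .6 in the full pool to around .4 in the restricted group, which is why concurrent designs can understate a test's true validity.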
Content Validity
• The content of the test reflects the KSAs needed for the job
  • E.g., work sample tests, in-basket simulations
• Not quantitative: no validity coefficient; a matter of judgment
• Procedure:
  • Base the test on a job analysis
  • KSAs are observable, not mental constructs
  • KSAs are not learned on the job
  • Small difference between test content and job content
  • Use an absolute cut-off (can do/can't do) rather than ranking
• Strong "face validity": the test appears to measure what it is supposed to
• Useful where the number of employees is not large enough to justify empirical validation methods
Construct Validity
• Construct: an unobservable attribute (e.g., friendliness, aggression)
• Procedure:
  • Relate the test to other established measures of the construct (e.g., a new friendliness test should correlate with an established measure of Extraversion)
  • The test should be unrelated to irrelevant measures
• EEOC: the new test must be related to an established measure of the construct, and the established measure must have criterion validity (i.e., must be related to effective performance)
• Construct validity therefore incorporates criterion-related validity
• Used less frequently than other types of validity because of questions of legality
Different Selection Methods (validity coefficients in parentheses)
• Application Blanks (.10-.20)
  • Weighted Application Blanks
• Biographical Information (.32-.37)
• Reference Checks (.16-.26)
  • Typically positively biased; references weigh the risk of lawsuits for slander against the risk of lawsuits for negligence in hiring violent employees
• Cognitive Ability Tests (.50)
  • General Aptitude Test Battery (GATB)
  • Measure the "g" factor, a good predictor of job performance
  • Good validity generalization (the extent to which validity coefficients can be generalized across situations)
Selection Methods (contd.)
• Personality Tests (.31)
  • Self-report (MBTI, MMPI, 16PF)
  • Projective tests (Rorschach inkblots, Thematic Apperception Test)
• Integrity Tests (.33-.35)
• Performance and Work Sample Tests (.54)
  • On-the-job behaviors
• Job Knowledge Tests (.45)
  • Level of understanding about a particular job (e.g., civil service exams)
• Assessment Centers (.30-.60)
  • Multiple measures of key job dimensions
  • Panel of assessors
  • In-basket exercises, leaderless group discussions, management games, case analyses, written tests, and interviews
Interviewing
• The most popular technique: interviews (.37)
• Interviews are used to:
  • Assess social skills
  • Assess the ability to work with the individual
  • Provide face validity
  • Meet the expectations of both interviewer and interviewee
• Problems with interviews:
  • Snap judgments: the first impression determines the final decision
  • "Halo effect": one characteristic overshadows the others
  • Interviewer bias
    • "Like me" bias
    • Stereotyping (increases with lack of information)
    • Sex-role expectations
Interviewing Methods
• Effective interviewing
  • Structured interview: standardize the questions asked of all applicants based on a job analysis; determine sample answers in advance; use several raters/interviewers
  • Situational interviews: focus on hypothetical situations (how would you handle xyz?)
  • Behavioral description interviews: focus on actual work behaviors (how did you handle xyz in the past?)
  • Multiple questions for each KSA, formally scored
  • Team/panel interviews (with planned roles)
• Train the interviewer
  • Ask good questions, listen, control bias, document
Meaningfulness of Validity Coefficients (Heneman, Heneman, & Judge, 1997)
• Validity coefficients reflect the ability of selection methods to predict job performance; sometimes training data may be substituted for the performance criterion
• Validity coefficients for selection methods generally do not exceed .50-.60
• Rules of thumb for judging the validity of selection methods:
  • Average validity from .00 to .15: low validity
  • Average validity from .16 to .30: moderate validity
  • Average validity of .31 and above: high validity
Reaching a Selection Decision
• Clinical approach
  • A subjective decision approach; different individuals arrive at different decisions; may be biased
• Statistical approach
  • More objective than, and generally superior to, the clinical approach
  • Identify the most valid predictors and combine the information (the three models below are contrasted in the sketch that follows)
  • Compensatory model: a high score on one predictor (e.g., a cognitive ability test) makes up for a low score on another predictor (e.g., the interview)
  • Multiple cut-off model: requires a minimum level of proficiency on all selection dimensions
  • Multiple hurdle model: a sequential strategy in which only applicants with the highest scores at the initial test stage go on to subsequent stages
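To contrast the three statistical strategies, here is a short sketch of our own (the weights, cut-offs, and candidate scores are invented; the multiple hurdle model is implemented with per-stage minimums, a common formulation). The same pair of predictor scores leads to different decisions under each rule.

```python
# Hypothetical candidate: cognitive ability test and structured interview,
# both scored 0-100.
scores = {"cognitive": 85, "interview": 55}

# Compensatory model: weighted sum; a high score on one predictor can offset
# a low score on another.
weights = {"cognitive": 0.6, "interview": 0.4}
composite = sum(weights[k] * scores[k] for k in scores)
compensatory_pass = composite >= 70

# Multiple cut-off model: a minimum level of proficiency on every predictor.
cutoffs = {"cognitive": 60, "interview": 60}
cutoff_pass = all(scores[k] >= cutoffs[k] for k in scores)

# Multiple hurdle model: predictors are applied sequentially; the candidate
# only reaches the interview hurdle after clearing the cognitive hurdle.
def multiple_hurdle(scores, ordered_hurdles):
    for predictor, minimum in ordered_hurdles:
        if scores[predictor] < minimum:
            return False          # eliminated at this stage
    return True

hurdle_pass = multiple_hurdle(scores, [("cognitive", 60), ("interview", 60)])

print(f"Compensatory composite = {composite:.1f} -> pass: {compensatory_pass}")
print(f"Multiple cut-off -> pass: {cutoff_pass}")
print(f"Multiple hurdle -> pass: {hurdle_pass}")
```

Here the compensatory composite (73) clears its threshold, while the cut-off and hurdle rules both screen the candidate out on the interview score; that is the practical difference among the three models.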