
Innovation and Good Practice in Selection: Job Analysis Project and Progress 2011


Presentation Transcript


  1. Innovation and Good Practice in Selection: Job Analysis Project and Progress 2011
  Mary O’Reilly, Leicester Programme, Co-Chair of Selection Working Party

  2. Background to the Project: The Lancaster Programme
  • 2001: published results of the first job analysis, funded by the Clearing House. Authors: Phillips, Gray and Hatton from the Lancaster Programme
  • 2005: Lancaster began using the established competencies, collecting evidence for them through a written task, interview and presentation
  • 2006: Lancaster began using a written task to screen candidates instead of application forms
  • 2007: Clearing House funded research into the use of application forms and a written test as a first sift of applicants. Reported in 2008; published in 2010 (Simpson and Hemmings)

  3. Background to the Project: other programmes
  • 2005/6: the Leicester Programme funded an occupational psychologist to develop competencies and design an assessment-centre approach to selection based on these
  • 2007: Leicester began screening using a written task
  • 2009: Surrey commenced using a written test for screening; Salomons joined forces with them for 2011
  • 2011: Shrops/Staffs used a written test for screening

  4. Background: National work
  • 2006: working party founded by Roth, Wang, & Reynolds
  • Fast Stream civil service e-tray (email in-tray) task demonstrated at GTiCP
  • http://faststream.civilservice.gov.uk/How-do-I-apply/Example-e-Tray-Excercise/
  • Costs of developing this for our own use were prohibitive
  • 2009: results of a Clearing House survey of all UK programmes on whether there was a will to move to national screening: 54% expressed interest
  • 2009: Selection sub-group of GTiCP formed; in 2010 a bid was made for funding to develop a more up-to-date set of trainee competencies for use at the point of selection

  5. Successful bid in November 2010
  • Helen Baron, independent consultant and selection specialist, engaged to undertake the work
  • Steering group for the project:
  • Mary O’Reilly, Leicester
  • Cathy Amor, Lancaster
  • Laura Simonds, Surrey
  • Matthew Jones-Chesters, UEL
  • Linda Hammond, Salomons
  • Shrops/Staffs rep (Helena Priest) invited to join us in 2011

  6. Approach

  7. Survey

  8. CP Model
  • Intellectual Ability
  • Communication Skills
  • Self-Aware and Open to Learning
  • Personal Maturity
  • Warmth and Empathy
  • Resilience
  • Organised
  • Autonomy and Initiative
  • Motivation and Application
  • Contextual Awareness

  9. Great 8 comparison (Bartram et al., 2000)

  10. Attitudes to a National Shortlisting Tool
  • Sub-sample from the survey engaged in selection: 86 respondents
  • 23 course directors
  • 45 other course staff
  • 13 clinical supervisors
  • 5 trainees

  11. In principle, would you be interested in the development of a national screening test as an aid to shortlisting for your course?
  [Chart of responses: Definitely yes / Probably yes / Possibly – depends on measure / Probably no / Definitely no]

  12. How appropriate do you consider each of the following measures?
  [Chart: % rating each measure appropriate or somewhat appropriate]

  13. Survey Conclusions
  • Most courses interested in national shortlisting
  • Over half interested in a pilot
  • Many willing to share costs with students
  • A variety of assessment options seen as appropriate:
  • Critical reasoning and other cognitive measures
  • Situational judgement
  • English language skills

  14. Options appraisal at June conference
  • Three options presented:
  • No change
  • Limited pilot
  • Wholesale adoption of screening
  • Five groups looked at differing issues:
  • What competencies to measure
  • Development process
  • Evaluation process
  • Logistics and liaising with the Clearing House
  • Mandate and accountability

  15. Key messages
  • In 2011, 2,000 of 3,500 applicants were not shortlisted by any course
  • consider the opportunity cost in staff time (HEI and NHS) of this duplication
  • Local decision-making needs to be retained
  • Current off-the-shelf measures are the preferred way forward
  • a known up-front fee per use, versus the resources needed to develop and scrutinise bespoke measures, is the deciding factor
  • There needs to be proper transparency and accountability (at local programme level and at national GTiCP/DCP level)
  • It is crucial to work closely with the Clearing House on any forward planning

  16. For Screening
  • Time savings within courses, especially with high applicant numbers
  • Potential to reduce duplication of effort across courses
  • Addresses double tick applicants
  • Mapped to the competence model
  • Based on sound selection practice
  • Capable of empirical study to check validity etc.

  17. For ‘status quo’
  • Many courses involve clinicians and service users at the ‘screening by form information’ stage
  • A feeling that “if it ain’t broke, don’t fix it”
  • Concern about the costs of development

  18. Evidence of effectiveness of screening tests in use
  • Lancaster: Daiches and Amor (2006); Simpson et al (2011)
  • Leicester: GTiCP conference 2007
  • Surrey/Salomons: GTiCP conference 2009

  19. The scope for a common screening task
  • To date, 5 programmes are using a screening task
  • In each case this is a written task
  • Two programmes have collaborated to use the same task and process (and are reported as one below)

  20. Comparisons across 4 programmes
  • Completed
  • at the university: all programmes
  • on computer: 3 programmes
  • by hand: 1 programme
  • Marked
  • by computer: 1 programme
  • by hand: 3 programmes
  • Past papers
  • 2 programmes make these available online, 2 do not

  21. Comparisons across 4 programmes
  • Length of exercise: 30 mins to 1 hour (mean 49 mins)
  • Number of questions: 7-14 (mean = 9)
  • 2 programmes allow flexibility in the order of answering questions
  • 2 programmes ask only about quantitative research
  • 2 programmes give equal emphasis to quantitative and qualitative methods
  • No psychological knowledge beyond undergraduate level is assumed
  • Cut-off score: 1 programme has a cut-off score (50%); 3 rank order candidates

  22. Mapping of Tests to Competence Model
  • Intellectual ability and academic rigour
  • Written communication
  • Resilience
  • Organisation
  • Motivation and application
  The following slides show, for each indicator, how many of the four programmes’ tests map to it (e.g. 2/4 = two of the four programmes)

  23. Intellectual ability and academic rigour
  • Analyses and integrates complex/contradictory material: 2/4
  • Synthesises ideas from a variety of sources: 2/4
  • Solves problems creatively: 3/4
  • Critically evaluates information: 4/4
  • Shows good academic and research skills: 4/4
  • All other indicators in this area endorsed by one programme or another

  24. Written communication
  All competencies under written communication are mapped:
  • Precise, well expressed and clear: 4/4
  • Well organised and grammatical: 4/4
  • Adapted to purpose and reader: 2/4
  • Coherently argued: 4/4

  25. Resilience
  • Copes well with pressure: 4/4
  • Tolerates anxiety and uncertainty: 4/4
  • Shows flexibility when required: 2/4
  • Shows confidence in own ability to deal with multiple demands: 2/4

  26. Organisation
  • Is punctual and reliable: 3/4
  • Prioritises tasks and uses time effectively: 3/4

  27. Motivation and application
  • Is committed to completing tasks as well as possible
  • Invests additional effort willingly
  • Is dedicated to a career as a clinical psychologist
  • This competency area was mapped by just one course; arguably, however, it applies to all test takers!

  28. Double Tick
  • A multi-stage selection process can accommodate all double tick candidates
  • All those who meet the essential criteria are invited to sit the screening test
  • A cut-off score must be set below which no candidate can progress to the next stage (e.g. a minimum of 60%)
  • Ranking of candidates is not accepted under double tick
  • All double tick applicants above the cut-off score are then invited to the next stage
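  The cut-off rule above is simple enough to express in a few lines of code. The Python sketch below is purely illustrative and not part of the presentation: the 60% cut-off, the Applicant fields and the function name are all assumptions; only the rule itself, a fixed cut-off with no rank ordering, comes from the slide.

  ```python
  # Illustrative sketch of the double tick shortlisting rule described above.
  # The 60% cut-off, the Applicant fields and the helper name are assumptions;
  # only the "cut-off, not ranking" rule comes from the presentation.
  from dataclasses import dataclass

  CUT_OFF = 60.0  # example minimum screening-test score (%), per the slide's "e.g."

  @dataclass
  class Applicant:
      name: str
      meets_essential_criteria: bool  # eligible to sit the screening test
      test_score: float               # screening-test result (%)

  def next_stage(applicants):
      """All applicants meeting the essential criteria sit the test; everyone
      at or above the fixed cut-off progresses. No rank ordering is applied,
      so double tick applicants are treated identically to all others."""
      tested = [a for a in applicants if a.meets_essential_criteria]
      return [a for a in tested if a.test_score >= CUT_OFF]
  ```

  Because progression depends only on clearing the threshold, not on an applicant's position in a ranked list, the same rule satisfies the double tick requirement without a separate code path for those candidates.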

  29. A mandate to continue
  • Directors have said ‘go on’, and want to take more ownership of the project
  • Steering group made up of directors from across the regional areas
  • Working group
  • Terms of reference to be drawn up by Malcolm Adams

  30. Current working group ideas
  • Proposal to pilot the use of a common measure for the 2013 intake
  • Requesting further funding for development of a common test
  • Perhaps one course centre to take the lead, under the steering group
  • Initial ideas:
  • one day (a Saturday)
  • one test (form and content)
  • computer administration
  • scoring options available: machine-scored for all, but many centres will hand-mark

  31. Logistics
  • Advance notice in Clearing House course entries
  • One common test vs one common form with different content
  • One testing day vs several
  • Regional/area testing centres
  • All applicants to the ‘opted in’ programmes attend one centre for the test
  • No charge to candidates during the pilot

  32. The Future
  • Screening
  • Competencies
  • National Screening?
