IAEWS Benchmark Study September 2011
Background and Purpose • Effort began in January 2011 • Create an industry-led benchmark study of key metrics for job board operations and effectiveness • Challenges in defining the scope and depth of questions • Significant work in developing common definitions with the worldwide community • Use first-year results as a baseline to improve and “dig deeper” in future years
Methodology • Input regarding scope and questions solicited from many boards worldwide • Jobg8.com sponsorship allowed no-cost participation for worldwide boards • Contracted with a professional research company (Critical Insights) to improve online data collection and ensure data security • Relatively short time frame to create, distribute, collect, correlate, and publish results
Participation and Response • Solicited all IAEWS members and other worldwide job boards in June and July • 154 job boards registered to participate in the first-year study • 124 boards submitted some response • 101 boards completed the online survey and were invited to participate in a discussion of the results • Survey closed on August 8th; results published and distributed September 1st • High-level results presented to all IAEWS members today
Who participated? • Reporting region: [chart of participant counts by reporting region]
Performance Metrics • Data collected on 16 different KPIs • Cross-tabs run on 5 different segments (see the sketch below) • This morning’s participants requested that the 2-hour discussion focus on: • Sources of traffic and their relative importance • Comparison of job board features and functionality • Application production rates for different posting types • Expense ratios for marketing, sales, and technology • Sources of job postings
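To illustrate what a segment cross-tab looks like in practice, here is a minimal sketch assuming a hypothetical survey extract. The column names (region, board_type, avg_time_on_site, bounce_rate) and all values are invented for illustration; the study's actual KPI and segment names are not listed in this deck.

```python
import pandas as pd

# Hypothetical survey extract: one row per responding board, with two of
# the 16 KPIs and two of the segmentation fields used for cross-tabs.
responses = pd.DataFrame({
    "region":           ["North America", "Europe", "Europe", "Asia-Pacific"],
    "board_type":       ["Niche", "General", "Niche", "General"],
    "avg_time_on_site": [3.2, 4.8, 4.0, 5.1],    # minutes
    "bounce_rate":      [0.38, 0.44, 0.41, 0.47],
})

# Median of each KPI within each segment, one segment at a time,
# mirroring the study's "cross-tabs run on 5 different segments".
for segment in ["region", "board_type"]:
    print(responses.groupby(segment)[["avg_time_on_site", "bounce_rate"]].median())
```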
Some Interesting Metrics • Average time on site: median = 4.0 minutes • Wide variations by region and type of board • Types of traffic: “window shoppers” vs. visitors who take action (definitions sketched below) • Median = 70% “window shoppers” • Median = 30% “take some action” • Wide variations based on region, type, tenure, and size • Bounce rate: median = 41% • Wide variations based on region, tenure, and type
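The deck does not spell out how these traffic metrics were defined. Assuming the common web-analytics conventions (a bounce is a single-page session; “take some action” means any application, registration, or similar conversion in the session), the split could be computed roughly as in this hypothetical sketch:

```python
# Hypothetical per-session log for one board: pages viewed and whether
# the visitor took any action (applied, registered, posted a resume).
sessions = [
    {"pages": 1, "took_action": False},
    {"pages": 5, "took_action": True},
    {"pages": 3, "took_action": False},
    {"pages": 1, "took_action": False},
]

total = len(sessions)
bounce_rate = sum(s["pages"] == 1 for s in sessions) / total
take_action_rate = sum(s["took_action"] for s in sessions) / total
window_shopper_rate = 1 - take_action_rate  # browsed without acting

print(f"bounce rate:      {bounce_rate:.0%}")
print(f"window shoppers:  {window_shopper_rate:.0%}")
print(f"took some action: {take_action_rate:.0%}")
```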
Some Interesting Metrics • Significant variations based on region, type, tenure, and size
Some Interesting Metrics • Performance: generating applications or candidates to an ATS • ATS postings • Median = 5.0 applications per posting • Large variations based on region, size, tenure, and type (range of 1-22) • “Email” postings • Median = 3.3 applications per posting • Large variations based on region, size, tenure, and type (range of 1-28) • 40% of respondents checked “Unknown”
Summary of This Morning’s Discussion Session • Great opportunity for real discussion of common issues facing our industry • Candid discussion and sharing of some best practices • Further individual analysis by each participant to compare results against their niche and set goals for where they want to be by next year • Confirmation that this study should be done again next year • Great suggestions on how to improve the instrument, process, and participation
2011 Study: Lessons Learned • Solicit members to participate in a steering committee by late fall • Scope, questions, promotion, response requirements • Use the 2011 study and comments from the member discussion to better craft questions and probe deeper on certain issues • Complete the survey by early summer • Use the steering committee to improve the survey • Focus on some additional areas of job board performance • Allow a longer discussion period at the fall IAEWS meeting
Final Thoughts • A great first-year effort by the Association and the members who participated • If you did not participate, you missed the opportunity for some good data • Continuous improvement through continuous measurement and analysis • Watch the IAEWS newsletter for an opportunity to participate in the steering committee or as a survey respondent next year.