Integrated Web Assessment Solutions


    1. Integrated Web Assessment Solutions
    International Military Testing Association, 2003

    3. Supporting organizational capability by effective assessment of KSA-MVA
    Users:
    - Organizational psychologists
    - Recruiters
    - Human Resources or Personnel departments
    - Training & Development, O.D., Learning
    - Sales
    - Vocational counselors & outplacement
    - Employee Assistance Programs (EAP)
    - Marketing research (surveys)
    - Consultants

    4. Supporting organizational capability by effective assessment of KSA-MVA
    Hundreds of standardized, “off-the-shelf” instruments are now available on-line:
    - Knowledge, skill, and aptitude tests
    - Scored job application forms
    - BioData screens
    - Cognitive skills tests
    - Personality inventories
    - 360-degree tools
    - Integrity tests
    - Emotional intelligence scales
    - Management development and leadership assessment instruments

    5. Supporting organizational capability by effective assessment of KSA-MVA
    Dozens of reputable publishers sell on-line assessment products:
    - Pan Testing
    - The Psychological Corporation
    - The Test Agency
    - ECPA
    - ETS
    - Pearson Reid London House
    - Ramsay
    - Sigma
    - Skillcheck
    - Hogan

    6. Supporting organizational capability by effective assessment of KSA-MVA
    Proprietary tests for corporate and government clients (e.g., custom selection) are increasingly embedded in web assessment software.
    - Used by Fortune 100 firms, government clients, and small firms
    - Goals: maximizing talent sourcing and filtering, and human capital development

    7. Supporting organizational capability by effective assessment of KSA-MVA

    8. Agenda
    - Supporting organizational capability through effective assessment of skills, capabilities, and personality
    - Trends in using on-line tools for assessment
    - Analyzing potential ROI and the business benefits
    - Hints and pitfalls based on the experience of global companies
    - Case studies

    9. Supporting organizational capability by effective assessment of KSA-MVA
    On-line testing is a subset of well-established Computer-Based Testing (CBT), leveraging web delivery of test user interfaces and harvested data in the service of distributed assessment capabilities.
    - Researchers found that 53% of test profiles hand-scored by trained personnel contained errors, which makes hand-scoring arguably unethical on accuracy grounds alone. Allard, Butler, Faust, & Shea (1995). Errors in hand scoring objective personality tests: The case of the Personality Diagnostic Questionnaire-Revised (PDQ-R). Professional Psychology: Research and Practice, 26, 304-308.
    - Test examinees tend to divulge more information to a computer than to a human examiner. Hart, R., & Goldstein, M. (1985). Computer assisted psychological assessment. Computers in Human Services, 1, 69-75.

    10. Supporting organizational capability by effective assessment of KSA-MVA
    - “Current test and assessment tools will probably be replaced by electronic versions or by new tests with exclusive on-line use.” Harris, W.G. (1999). Association of Test Publishers. WWW.TESTPUBLISHERS.ORG.
    - Assessment professionals are comfortable using computer-based test administration, with over 85% in one sample having done so. McMinn, M., Ellens, B., & Soref, E. (1999). Assessment, 6(1), 74.
    - Computerized administration and scoring of tests have become a generalized practice. Silzer, R., & Jeanneret, R. (1998). Anticipating the future: Assessment strategies for tomorrow. In R. Jeanneret & R. Silzer (Eds.), Individual psychological assessment: Predicting behaviors in organizational settings (pp. 445-477). San Francisco: Jossey-Bass.

    11. Supporting organizational capability by effective assessment of KSA-MVA
    - Psychometric research on computerized tests yields conclusive support for their stability and validity. Alexander, J., & Davidoff, D. (1990). Psychological testing, computers, and aging. International Journal of Technology and Aging, 3, 47-56.
    - Computerized testing formats are judged acceptable by users, rated as easy to use, and apparently preferred to conventional paper-and-pencil and interactive testing in perceived comfort. Campbell, K., Rohlman, D., Anger, W., Kovera, C., Davis, K., & Grossmann, S. (1999). Test-retest reliability of psychological and neurobehavioral tests self-administered by computer. Assessment, 6(1), 21-32. Hart, R., & Goldstein, M. (1985). Computer assisted psychological assessment. Computers in Human Services, 1, 69-75.

    12. Supporting organizational capability by effective assessment of KSA-MVA
    - Testing provides probative evidence that the employer met its duty to investigate reasonably an applicant’s fitness. Companies…can reduce their exposure to negligent hiring claims. SHRM Legal Report, 1999.
    - “A well-designed Internet system can be far more secure than many local computer network systems or intranets, and certainly more secure than paper and filing cabinets….In most areas, Internet-based assessment provides the potential for higher levels of control and security…Test scores are also potentially far more secure.” Bartram, D. (2000). Internet recruitment and selection: Kissing frogs to find princes. International Journal of Selection and Assessment, 8(4), 261-274.

    13. Trends in using online tools for assessment
    The Integrated Web Assessment Services Model:
    - Replaces paper/pencil, PC-specific, and LAN models
    - The Internet as software dispensary: access from any Net machine becomes viable
    - XML integration of test data flows with other applications (a sketch follows below)
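To make the XML integration point concrete, here is a minimal Java sketch of the kind of test-result payload that could flow from the assessment platform into other applications. The element names, candidate ID, instrument, and score are illustrative assumptions, not any vendor's actual schema.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Minimal sketch of a hypothetical <testResult> payload; all names and
// values are invented for illustration.
public class TestResultXml {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();

        Element result = doc.createElement("testResult");
        result.setAttribute("candidateId", "C-10042");    // hypothetical ID
        result.setAttribute("instrument", "ObjectRecognition");
        doc.appendChild(result);

        Element score = doc.createElement("score");
        score.setTextContent("87");                       // scaled score
        result.appendChild(score);

        Element completedAt = doc.createElement("completedAt");
        completedAt.setTextContent("2003-06-14T10:32:00Z");
        result.appendChild(completedAt);

        // Serialize to stdout; in practice this payload would flow to an
        // ATS, HRIS, or scoring service over the web.
        var transformer = TransformerFactory.newInstance().newTransformer();
        transformer.setOutputProperty(OutputKeys.INDENT, "yes");
        transformer.transform(new DOMSource(doc), new StreamResult(System.out));
    }
}
```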

    14. Trends in using online tools for assessment
    Just-in-Time Delivery:
    - Assessments can be ordered by qualified test administrators 24/7, from anywhere
    - Testing becomes mobile, matching the increased mobility of the test administrator & company

    15. Trends in using online tools for assessment
    Actuarial-statistical data and reports:
    - Scoring templates and PC-based diskettes no longer needed
    - Ordering, distribution, administration, scoring, and report delivery are totally electronic
    - Elimination of scoring errors & improved standardization

    16. Trends in using online tools for assessment

    17. Trends in using online tools for assessment Integration - Applicant Tracking & Staffing Management Applications

    18. Trends in using online tools for assessment
    2003-2010: mass adoption of web-based assessment
    - New technology making testing more convenient: XML, wireless device access, mobile testing platforms
    - High-stakes testing moving to the Internet: standardized education tests, government pre-employment testing, certification
    - Web services allowing integration of “best of breed” component applications via the Internet (see the sketch below)
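As an illustration of the web-services integration the deck predicts, the sketch below pushes an XML result document to a hypothetical partner endpoint over HTTPS. The URL and payload are invented for the example; a real deployment would follow the partner's published service contract.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hedged sketch: posting a test-result document to an assumed partner
// web service. Endpoint and schema are illustrative only.
public class ResultPush {
    public static void main(String[] args) throws Exception {
        String payload = """
                <testResult candidateId="C-10042">
                  <score>87</score>
                </testResult>""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://assess.example.com/api/results")) // hypothetical URL
                .header("Content-Type", "application/xml")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        // HTTPS supplies the encrypted transport the deck describes; the
        // client verifies the server certificate by default.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Partner responded: " + response.statusCode());
    }
}
```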

    19. Trends in using online tools for assessment
    - Multi-cultural/multi-language test calibration
    - Use of Java technology in addition to Active Server Pages
    - Enhanced delivery of images and interactive assessment stimuli
    - Multi-hurdle testing integrating remote and proctored testing sessions as progressive filters
    - Ongoing systemic integrations with other HR applications

    20. Assessment Security
    - Many leading firms deploy remote testing
    - The Internet is a secure channel that delivers a test at the right time and place
    - For high-stakes testing, delivery of the test can be locked down to specific PCs (sketched below)
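A minimal sketch of what locking delivery to specific PCs could look like, assuming each proctored site registers the addresses of its approved test machines; the IPs below are placeholders.

```java
import java.util.Set;

// Sketch of gating high-stakes test delivery to registered machines.
// Addresses are placeholder values, not a real site's configuration.
public class DeliveryGate {
    private static final Set<String> REGISTERED_TEST_PCS =
            Set.of("203.0.113.10", "203.0.113.11"); // hypothetical site IPs

    static boolean mayDeliverTest(String clientIp) {
        return REGISTERED_TEST_PCS.contains(clientIp);
    }

    public static void main(String[] args) {
        System.out.println(mayDeliverTest("203.0.113.10")); // true: registered PC
        System.out.println(mayDeliverTest("198.51.100.7")); // false: blocked
    }
}
```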

    21. Assessment Security
    - Data transmission is 128-bit SSL encrypted
    - Scoring is “near real time,” and scoring algorithms are held offline
    - The ISP continually runs intrusion analytics

    22. Assessment Security: Online Testing
    - Consists of the presentation of a graphical user interface through which a candidate gains access to the test battery
    - No access to the testing pages unless directly referred and supervised by an authorized test administrator

    23. Assessment Security: Test Administration
    - All testing is done under the control of the test administrator
    - Functional security is completely granular, allowing rights to be granted or revoked for each individual function of the application (see the sketch below)
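One way to read "completely granular" functional security is a per-administrator rights table, as in this hypothetical sketch; the function names are illustrative, not the product's actual rights model.

```java
import java.util.EnumSet;
import java.util.HashMap;
import java.util.Map;

// Sketch of per-function rights that can be granted or revoked for each
// administrator account. Rights and account names are invented.
public class FunctionalRights {
    enum Right { SCHEDULE_TEST, START_SESSION, VIEW_SCORES, EXPORT_DATA }

    private final Map<String, EnumSet<Right>> grants = new HashMap<>();

    void grant(String admin, Right right) {
        grants.computeIfAbsent(admin, a -> EnumSet.noneOf(Right.class)).add(right);
    }

    void revoke(String admin, Right right) {
        grants.getOrDefault(admin, EnumSet.noneOf(Right.class)).remove(right);
    }

    boolean allowed(String admin, Right right) {
        return grants.getOrDefault(admin, EnumSet.noneOf(Right.class)).contains(right);
    }

    public static void main(String[] args) {
        FunctionalRights acl = new FunctionalRights();
        acl.grant("proctor01", Right.START_SESSION);
        System.out.println(acl.allowed("proctor01", Right.START_SESSION)); // true
        System.out.println(acl.allowed("proctor01", Right.EXPORT_DATA));   // false
    }
}
```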

    24. Assessment Security: Test Scoring
    - Test scoring is performed off-line
    - The target user for this aspect of the system is an automated process; there is no human intervention
    - Test scoring is a batch process isolated from system users (sketched below)
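A sketch of scoring as an isolated, fully automated batch step: one deterministic answer key applied identically to every response record, which is also what eliminates hand-scoring errors. The key and responses are invented for illustration.

```java
import java.util.List;
import java.util.Map;

// Sketch of off-line batch scoring with no human in the loop.
// Answer key and candidate data are hypothetical.
public class BatchScorer {
    private static final Map<Integer, String> ANSWER_KEY =
            Map.of(1, "B", 2, "D", 3, "A"); // hypothetical keyed answers

    record Response(String candidateId, Map<Integer, String> answers) {}

    static int score(Response r) {
        int total = 0;
        for (var entry : ANSWER_KEY.entrySet()) {
            if (entry.getValue().equals(r.answers().get(entry.getKey()))) {
                total++;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        List<Response> batch = List.of(
                new Response("C-10042", Map.of(1, "B", 2, "D", 3, "C")));
        // Every record is scored by the same deterministic rule.
        batch.forEach(r -> System.out.println(r.candidateId() + " -> " + score(r)));
    }
}
```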

    25. Assessment Security: Data Transfer
    - Data is transferred to test centers so that candidates can be scheduled and verified at the testing center
    - Raw data transfers to the client are also routine
    - Transfer is an automated process that does not involve human interaction

    26. Lessons learned from enterprise clients
    Select the right tools and medium for maximal impact:
    - Fit for purpose
    - Specific to organisational needs
    - Global or local scope
    - Web-based
    - Open, standard technologies
    - Scalability is critical

    27. Lessons learned from enterprise clients
    - Bells and whistles often complicate the testing process and drive up costs unnecessarily; keep it simple
    - The platform needs to be “open” to interfacing with your HRIS/ATS systems
    - A large part of the I/O consideration is now focused on technology AND content

    28. Lessons learned from enterprise clients
    - The customer is always right, sometimes (often test or software skills, seldom both; partnering is essential)
    - Efficiencies of scale and ROI are measurable and compelling; set up the ROI matrix before launch
    - The Internet is a dynamic system and requires constant adjustment; shrink-wrap and do-it-yourself web testing offerings invite problems

    29. Case Study – TSA Mandate
    - Staffing of security screeners at 429 federalized U.S. airports
    - Multi-faceted assessment center process to hire the best candidates
    - An automated solution: responsive, flexible, customer focused
    - Vendor team designed to deliver the best possible solution: “Best-of-Breeds”
    - Drs. Elizabeth Kolmstetter & Anne Quigley

    30. Case Study – TSA Mandate
    Assess, hire, train, and deploy:
    - Over 35,000 Federal Security Directors and support staff, screener supervisors, team leaders, and screeners
    - 22,000+ baggage screeners/team leaders/supervisors

    31. Case Study – TSA Mandate
    - Assess a large number of candidates rapidly
    - Complex multi-step, multi-vendor assessment process
    - Need for replication at multiple sites concurrently
    - Administrative, analytic, and monitoring tools

    32. Case Study - TSA Solution

    33. Case Study – TSA Solution
    Solution: integrated web-based process
    - Integrate content from multiple providers
    - Integrated “back office” for multiple vendors
    - Use the Internet as the data “backbone”

    34. Case Study – TSA Solution – Five process factors

    35. Case Study – TSA Solution
    Integrated web assessment process flow, major steps:
    - Application and qualification
    - Proctored testing
    - Multi-phase assessment center process

    36. Case Study – TSA Solution

    37. Case Study – TSA Application and Qualification
    - Job posting
    - Candidate job application form
    - Candidate application received
    - Candidate notified of eligibility
    - Scheduling of proctored testing

    38. Case Study – TSA Proctored Testing
    Proctored testing sites:
    - Web-based scheduling
    - Web-based test delivery
    - Network of testing sites across the U.S.
    - Mobile sites for remote or collateral coverage
    - Explicit protocol followed
    - Sites uniform in equipment
    - IP address control

    39. Case Study – TSA Proctored Testing
    - Timed test battery, multi-“vendor”
    - Java applet technology
    - Connectivity needed for upload and download only
    - Encrypted intra-session back-up file (sketched below)
    - Encrypted transmission of data
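A hedged sketch of an encrypted intra-session backup, assuming AES via the standard javax.crypto API: partial responses are encrypted locally so a session can resume after a failure without exposing answers on disk. The algorithm mode, key handling, and data are illustrative only; a production system would use an authenticated cipher mode and managed keys.

```java
import java.nio.charset.StandardCharsets;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Illustrative encrypt/decrypt round trip for an in-session backup file.
public class SessionBackup {
    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();

        byte[] partialAnswers = "Q1=B;Q2=D".getBytes(StandardCharsets.UTF_8);

        Cipher encrypt = Cipher.getInstance("AES");
        encrypt.init(Cipher.ENCRYPT_MODE, key);
        byte[] backup = encrypt.doFinal(partialAnswers); // written to local disk

        // On resume, the session decrypts the backup and continues the test.
        Cipher decrypt = Cipher.getInstance("AES");
        decrypt.init(Cipher.DECRYPT_MODE, key);
        System.out.println(new String(decrypt.doFinal(backup), StandardCharsets.UTF_8));
    }
}
```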

    40. Case Study – TSA Proctored Testing
    Testing systems and interface ergonomics:
    - Mouse point-and-click response
    - Mouse tutorial
    - Scheduled breaks
    - Time display
    - Answers display
    - Simple review commands

    41. Case Study – TSA Proctored Testing
    - Personality inventory
    - Screener Object Recognition Test
    - Mental Rotation Test
    - English Proficiency Test
    - Alternate forms
    - Extended-time versions for ADA accommodations

    42. Case Study – TSA Proctored Testing
    - Scored in 2-5 minutes (off-line servers)
    - Cut score analyses for multiple positions (sketched below)
    - Decision immediately available
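The cut-score step can be pictured as a table of position-specific thresholds applied to one battery score, so a decision is available as soon as off-line scoring finishes. The positions and cut values here are hypothetical, not TSA's actual cuts.

```java
import java.util.Map;

// Sketch of position-specific cut scores applied to a single battery score.
// Thresholds are invented for illustration.
public class CutScores {
    private static final Map<String, Integer> CUT_BY_POSITION = Map.of(
            "Screener", 70,
            "TeamLeader", 80,
            "Supervisor", 85);

    static boolean eligible(String position, int batteryScore) {
        return batteryScore >= CUT_BY_POSITION.get(position);
    }

    public static void main(String[] args) {
        int batteryScore = 82; // hypothetical composite score
        CUT_BY_POSITION.keySet().forEach(p ->
                System.out.println(p + ": " + eligible(p, batteryScore)));
    }
}
```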

    43. Case Study – TSA Assessment Center
    - Dynamic performance forms and data stations
    - Administered at assessment centers across the U.S. & territories
    - Sequential process selects out/advances candidates based on scoring rules
    - Traveling candidate record
    - Authorized user access 24/7 for real-time review

    44. Case Study – TSA Assessment Center
    Dashboard status tracking:
    - Candidate status from testing to job offer
    - Importing of third-party vendor data
    - Exporting to other applications (HRIS)
    - Real-time review capability

    45. Case Study – TSA Assessment Center

    46. Case Study – TSA Assessment Center

    47. Case Study – TSA Assessment Center Interview
    - Customized interview per candidate, driven by scalar analysis of the proctored testing battery
    - Two concurrent interviewers
    - Interview data/findings entered directly into the system in vivo

    48. Case Study – TSA Assessment Center
    - Data consolidated and scored according to pre-established rules
    - Scores automatically analyzed for eligibility cuts
    - Candidate exited or advanced based on interview scores (sketched below)
    - Status step updated in the Dashboard
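The advance-or-exit rule might look like the following sketch, with the threshold and status names invented for illustration; the deck does not publish TSA's actual rules.

```java
// Sketch of the multi-hurdle status move: a consolidated interview score is
// checked against a pre-established cut, and the candidate's Dashboard
// status is advanced or exited accordingly.
public class HurdleDecision {
    enum Status { ADVANCED, EXITED }

    static Status decide(int interviewScore) {
        final int eligibilityCut = 75; // hypothetical pre-set rule
        return interviewScore >= eligibilityCut ? Status.ADVANCED : Status.EXITED;
    }

    public static void main(String[] args) {
        System.out.println(decide(81)); // ADVANCED: moves to the next hurdle
        System.out.println(decide(62)); // EXITED: selected out, status updated
    }
}
```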

    49. Case Study – TSA Innovations: Real Time Review
    - Users need immediate access to status data to calibrate critical hiring
    - Web technology enables point-and-click access to real-time data, including required summary analytics

    50. Case Study – TSA Innovations: Open Content
    - Integration of multi-partner assessment content
    - Multi-hurdle select-out model adapted due to stringent hiring eligibility criteria
    - Content “travels” with the electronic candidate record through the sequence
    - Aggregate content data posted to TSA for analysis

    51. Case Study – TSA Innovations: Open System
    - XML data flows (web services)
    - Import of 3rd-party assessment data and exports to required databases and applications
    - Dynamic status moves triggered by data decision rules
    - Integration of cutting-edge data management with traditional testing and assessment practices

    52. Case Study – TSA Volume
    Job applications: 59,689
    Computerized tests scheduled: 45,874
    Computerized tests completed: 29,256
    Scheduled for Phase II: 9,316
    Assessed in Phase II: 6,692
    Computerized testing centers: 450
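Read as a funnel, these figures imply that roughly 77% of applicants were scheduled for testing (45,874 of 59,689), about 64% of scheduled tests were completed (29,256 of 45,874), roughly 32% of completers were scheduled for Phase II (9,316 of 29,256), and about 72% of those scheduled were actually assessed in Phase II (6,692 of 9,316).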

    53. Summary
    The web assessment services era:
    - Seamless XML data-flow integrations
    - “Best-of-Breed” vendor collaboration for the client
    - Automated multi-hurdle selection
    - Real-time review of relevant data
    - Synthetic reporting capabilities
    - Multi-site data harvesting
    - Scalability and cost containment
    - Leading government and corporate clients are demanding the model
