
PERSONAL AND SITUATIONAL FACTORS IMPACTING ON CBT PRACTICES IN DEVELOPING COUNTRIES

Cheryl Foxcroft, Andrea Watson & Bronwyn Seymour. Admissions and Placement Assessment Programme, University of Port Elizabeth, SOUTH AFRICA.


Presentation Transcript


  1. PERSONAL AND SITUATIONAL FACTORS IMPACTING ON CBT PRACTICES IN DEVELOPING COUNTRIES Cheryl Foxcroft, Andrea Watson & Bronwyn Seymour Admissions and Placement Assessment Programme University of Port Elizabeth SOUTH AFRICA

  2. CBT promises to make assessment more authentic and real-world, and it is revolutionizing all aspects of assessment (design, development, delivery, scoring, reporting). The advent of CBT is also driving advances in psychometrics. BUT it also brings certain challenges with it. For example, what impact do the computer and computer familiarity have on test performance? How do human and computer factors interact to impact on test performance? The CBT should be sufficiently easy and pleasant to use that the test-taker can focus on the content of the items and on completing the test to the best of his/her ability, rather than being preoccupied with trying to correctly interpret and use the testing interface.

  3. According to the draft International Guidelines on Computer-based and Internet Delivered Testing (Guidelines 33.1 & 33.2): among other things, when using a CBT, the assessment practitioner (test user) should collect data on protected or minority groups to monitor for possible adverse impact. This is especially important where there is inequality of access.
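
As a concrete illustration of the adverse-impact monitoring the guidelines call for, the sketch below applies the widely used four-fifths rule to group pass rates. The group labels, cut score, and data are hypothetical assumptions for illustration; the guidelines do not prescribe this particular statistic.

```python
# Minimal sketch: flagging possible adverse impact of a CBT cut score
# using the four-fifths (80%) rule. All names and data are hypothetical.
from collections import defaultdict

def pass_rates(records, cut_score):
    """records: iterable of (group, score) pairs; returns the pass rate per group."""
    passed, total = defaultdict(int), defaultdict(int)
    for group, score in records:
        total[group] += 1
        if score >= cut_score:
            passed[group] += 1
    return {g: passed[g] / total[g] for g in total}

def adverse_impact_flags(rates):
    """Flag groups whose pass rate is below 80% of the highest group's rate."""
    best = max(rates.values())
    return {g: rate / best < 0.8 for g, rate in rates.items()}

records = [("group_a", 71), ("group_a", 64), ("group_a", 58),
           ("group_b", 52), ("group_b", 47), ("group_b", 66)]
rates = pass_rates(records, cut_score=60)
print(rates, adverse_impact_flags(rates))
```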

  4. 95% Internet Use

  5. Levels of computer literacy
  • 16% low computer familiarity; 32% low computer familiarity
  • At UPE: 17% of those tested were first-time computer users
  • Only 63% of test-takers rated themselves as "computer literate" (37% not computer literate)

  6. What should be monitored? Perception of CBT – especially in developing countries where PBT still predominates and practitioners view CBT with scepticism due to low levels of computer familiarity.
  What do assessment practitioners think?
  • CB tests can add value and save time.
  • But it is problematic to use CBT where there are low levels of computer literacy.
  • CBT is costly (hardware & software).
  • Practitioners need training and need to be computer literate – they feel threatened.

  7. What should be monitored? Perception of CBT – continued.
  What does previous research indicate?
  • Most test-takers, even those unfamiliar with computers, respond positively to CBT.
  • In seven years of administering CBTs, only one test-taker out of 319 000 expressed a preference for PBTs (Bugbee, 1996).

  8. What should be monitored? Perception of CBT – continued.
  How do test-takers perceive CBT?
  • Most test-takers were positive and preferred CBT to PBT.
  • 78% of them completed the sentence "The thing that I liked most about today’s testing was …" with "using a computer"!

  9. What should be monitored?
  • Perception of CBT
  • Extent of technological sophistication
  • Level/extent of computer familiarity
  • Anxiety
  • Gender
  • Ethnicity/Culture
  • Level of test-wiseness

  10. How should the monitoring be done? (ITC Guidelines 33.1 & 33.3)
  • Collect data on numbers tested in the various groups that are important to monitor.
  • Explore/monitor group differences in test scores.
  • Collect data on computer-related experiences and CBT experience.
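
One way to explore and monitor group differences in test scores, as the slide suggests, is to express each comparison as a standardized effect size. The sketch below computes Cohen's d with a pooled standard deviation; the data and group names are hypothetical, and the presentation does not state which effect-size formula was actually used.

```python
# Minimal sketch: Cohen's d for the difference between two groups' CBT scores.
# The scores below are hypothetical illustrations.
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

experienced = [58, 63, 55, 70, 66, 61]   # hypothetical experienced computer users
first_time = [48, 52, 55, 61, 47, 50]    # hypothetical first-time users
print(f"Cohen's d = {cohens_d(experienced, first_time):.2f}")
```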

  11. Technological Sophistication
  • Although research indicates that test-takers with little or no computer familiarity can learn to use a testing system effectively, not enough research has been done on test-takers who are not familiar with technology in general.
  • Consequently, for assessment practitioners who practise CBT in developing countries that are not functioning at the same level of technological sophistication as first-world countries, it is imperative to research the impact of technological sophistication on CBT performance.

  12. Technological Access in South Africa – severely lacking:
  • In rural areas: restricted access to power, telephones, ATMs.
  • In schools: 70% have no computers; 80% have no media centres; 34% have no telephones (some have cell phones); 47% have no power (grid, solar, generators).

  13. Effect sizes: technology levels and CB test performance [chart]

  14. First-time vs ‘experienced’ computer users: effect sizes of the impact of exposure to technology on CBT performance [chart]

  15. Effect sizes for technology exposure and CBT performance for black and white test-takers [chart]

  16. Computer Familiarity
  • A review of the literature on computer familiarity and CBT performance revealed conflicting results.
  • Some researchers found that prior computer experience significantly affected the performance of college students on a CBT maths test.
  • Other researchers found no evidence of a positive relationship between computer experience and CBT performance.

  17. Student sample (n = 830)
  • First-time users: 11.7%; experienced users: 88.3%
  • How often? Every day 37.8%; once a week 24.3%; X a month 16.1%; once a month 9.0%; never 12.7%
  • Rating of ability: excellent 17.5%; good 38.1%; fair 29.4%; poor 3.6%; no response 11.5%

  18. Comparison of performance of first-time & more experienced users [chart]

  19. Computer familiarity – academic performance: predicted vs actual [chart]

  20. Data on computer familiarity can be gathered via self-report questionnaires (e.g., rating how often the test-taker uses a computer and for what purposes; rating level of anxiety at the end of the test session). BUT ways of measuring computer familiarity and anxiety more directly, rather than relying only on self-report, need to be explored.
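
A minimal sketch of how such self-report data might be captured alongside a test session is shown below. The item wording, scale anchors, and scoring are hypothetical assumptions, not the questionnaire actually used at UPE.

```python
# Minimal sketch: recording self-reported computer familiarity and post-test anxiety.
# Item wording, scale anchors, and scoring below are hypothetical.
from dataclasses import dataclass, field

FREQUENCY_SCALE = {
    "never": 0, "once a month": 1, "a few times a month": 2,
    "once a week": 3, "every day": 4,
}

@dataclass
class SelfReport:
    test_taker_id: str
    usage_frequency: str                           # one of the FREQUENCY_SCALE keys
    purposes: list = field(default_factory=list)   # e.g. ["email", "word processing"]
    anxiety_rating: int = 1                        # 1 (calm) .. 5 (very anxious), rated at end of session

    def familiarity_score(self) -> int:
        return FREQUENCY_SCALE[self.usage_frequency.lower()]

report = SelfReport("UPE-0421", "once a week", ["email", "internet"], anxiety_rating=4)
print(report.familiarity_score(), report.anxiety_rating)
```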

  21. Undertake computer usability studies with behavioural observations. [Photos: participant side and observer side of the UPE Usability Laboratory; equipment supplied by Noldus Technology in the Netherlands]

  22. Anxiety: unfamiliarity with the computer is linked to greater anxiety, which is associated with lowered test performance.

  23. Effect sizes: anxiety & test performance. Student sample (N = 496): anxious – yes 10.7%, no 89.3% [chart]

  24. Anxious – academic performance: predicted vs actual [chart]

  25. Culture: impact of culture and frequency of computer use on CBT performance [chart]

  26. Gender: males vs females [chart]

  27. Challenge to fair assessment practices: how can CBT be used fairly with test-takers who have low levels of computer familiarity and are technologically unsophisticated?

  28. Do not use computerised tests? Will equivalent PBTs always be available?
  • Allow test-takers more time to become familiar with a computer? Use expanded tutorials and practice tests.
  • Develop norms or cut scores for test-takers with different levels of computer familiarity.
  • Extensive briefing and debriefing needed.
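
The option of developing norms or cut scores for test-takers with different levels of computer familiarity could, for example, be prototyped as within-group percentile norms, as in the sketch below. The familiarity groups and raw scores are hypothetical illustrations.

```python
# Minimal sketch: within-group percentile norms by level of computer familiarity.
# Group labels and raw scores are hypothetical.
from bisect import bisect_right
from collections import defaultdict

def build_norm_tables(records):
    """records: iterable of (familiarity_group, raw_score); returns sorted scores per group."""
    tables = defaultdict(list)
    for group, score in records:
        tables[group].append(score)
    return {group: sorted(scores) for group, scores in tables.items()}

def percentile_rank(norm_table, raw_score):
    """Percentage of the norm group scoring at or below raw_score."""
    return 100.0 * bisect_right(norm_table, raw_score) / len(norm_table)

norms = build_norm_tables([
    ("first_time", 45), ("first_time", 52), ("first_time", 60),
    ("experienced", 55), ("experienced", 63), ("experienced", 70),
])
# A raw score of 52 is interpreted against the first-time users' own norm group.
print(percentile_rank(norms["first_time"], 52))
```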

  29. Rigorously research the impact of computer familiarity, anxiety, and technological sophistication on test performance by undertaking bias analysis for different:
  • cultural groups
  • gender groups
  • levels of test-wiseness
  • quality of schooling
  • Develop a more effective measure to tap HCI (human-computer interaction) factors.
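
Item-level bias analysis of the kind proposed here is often carried out with a differential item functioning (DIF) statistic. The sketch below computes a simplified Mantel-Haenszel common odds ratio for one item, matching test-takers on total score; this is only one possible DIF method, the data are hypothetical, and the presentation does not specify which bias-analysis technique would be used.

```python
# Minimal sketch: Mantel-Haenszel DIF check for a single item, matching on total score.
# Group labels, scores, and responses are hypothetical.
from collections import defaultdict

def mantel_haenszel_odds_ratio(responses):
    """
    responses: iterable of (group, total_score, item_correct), where group is
    "reference" or "focal" and item_correct is a bool.
    Returns the MH common odds ratio across total-score strata.
    """
    strata = defaultdict(lambda: {"A": 0, "B": 0, "C": 0, "D": 0})
    for group, total_score, correct in responses:
        cell = strata[total_score]
        if group == "reference":
            cell["A" if correct else "B"] += 1   # A: reference correct, B: reference incorrect
        else:
            cell["C" if correct else "D"] += 1   # C: focal correct, D: focal incorrect
    numerator = denominator = 0.0
    for cell in strata.values():
        n = sum(cell.values())
        numerator += cell["A"] * cell["D"] / n
        denominator += cell["B"] * cell["C"] / n
    return numerator / denominator if denominator else float("inf")

data = [("reference", 20, True), ("reference", 20, False), ("focal", 20, True),
        ("focal", 20, False), ("reference", 25, True), ("focal", 25, False)]
# An odds ratio far from 1 suggests the item may function differently across groups.
print(round(mantel_haenszel_odds_ratio(data), 2))
```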

  30. THE END

  31. Gender: effect sizes for gender and technology on CBT performance [chart]
