Computer-based Experiments: Obstacles • Stephanie Bryant, University of South Florida • Note: See Bryant, Hunton, and Stone (BRIA 2004) for complete references
Obstacles—Overview • Technology Skills Needed • Threats to Internal Validity • Getting Participants
Obstacles (Cont'd) • Technology Skills Needed • "Proficiency" in software or programming
Tools for Computer-based Experiments • Develop using a scripting language or applications software • Applications software for Web experiments • Example software packages: Raosoft, Inquisite, PsychExps • More expensive software but cheaper development & maintenance costs? Easier to use; features limited to those built into the software • Scripting languages • Examples: ColdFusion, PHP, JSP (Java Server Pages), CGI (Common Gateway Interface) • Software is cheap or free but development & maintenance costs are higher? Difficult for non-programmers; more features, more customizable (see the sketch below) • A third option: combine scripting languages & applications software
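To make the trade-off concrete, here is a minimal sketch of the scripting-language approach, written in Python rather than the PHP or ColdFusion named above: a CGI script that reads one experiment response and records it. The field names (participant_id, answer) and the output file are hypothetical; a production experiment would validate input and write to a database.

```python
#!/usr/bin/env python3
# Minimal CGI response handler (sketch). Field names are hypothetical.
import os
import datetime
from urllib.parse import parse_qs

# Parse the submitted form fields from the CGI query string.
params = parse_qs(os.environ.get("QUERY_STRING", ""))
participant = params.get("participant_id", ["anonymous"])[0]
answer = params.get("answer", [""])[0]

# Append the response with a timestamp; a real experiment would
# validate the input and write to a database instead of a flat file.
with open("responses.csv", "a") as out:
    out.write(f"{datetime.datetime.now().isoformat()},{participant},{answer}\n")

# Emit the HTTP header, a blank line, then the confirmation page.
print("Content-Type: text/html")
print()
print("<html><body>Thank you! Your response was recorded.</body></html>")
```

Even this toy version shows why the scripting route carries higher development and maintenance costs: every feature the applications packages build in must be written and debugged by hand.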
Applications Software: Raosoft Products (EZSurvey, SurveyWin, InterForm) • Difficulty index (1 = hard, 10 = easy): 8 • Do not provide all the needed functionalities: no randomization or response-dependent questions (i.e., straight surveys only) • Limited formatting capabilities • Expensive, with no educational prices ($1,500–$10,000) • Compare SurveyMonkey.com at $19.95/month
Applications Software: Inquisite • Difficulty index (1 = hard, 10 = easy): 8 • Expensive ($10,000) but supports most of the needed functionalities • Supporting all desired functionalities in complex applications requires the Software Development Kit (SDK), an additional $2,000 (though it may soon be available for free)
Applications Software: PsychExps • PsychExperiments is a Web site created and maintained by University of Mississippi psychology professor Ken McGraw • A "collaboratory" • http://psychexps.olemiss.edu/ • Free! • Requires users to download & install the applications software • Many existing scripts (e.g., randomization)
Obstacles (Cont'd) • Steep learning curves involved • On-campus support is sometimes available • Can hire programmers or graduate students to help with programming
Obstacles (Cont'd) • Validity Considerations: • Statistical Conclusion Validity • Internal Validity • Construct Validity • External Validity
Statistical Conclusion Validity (The extent to which two variables can be said to co-vary) • Increased sample size and statistical power (e.g., Ayers, Cloyd et al.) • Use the Web to recruit participants! • Decreased or eliminated data entry errors • Capture data directly into a database (see the sketch below) • Increased variability in experimental settings • Difficult to control in Web experiments • People complete experiments in their own ("natural") settings with various computer configurations (browsers, hardware) • McGraw et al. (2000) note that this Web-experiment noise is compensated for by large sample sizes • System downtime • Software coding errors (e.g., Barrick 2001; Hodge 2001)
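A minimal sketch of capturing responses directly into a database at submission time, which is how Web experiments avoid manual data entry and its transcription errors. It assumes a SQLite backend; the table and column names are hypothetical.

```python
# Sketch: record each response the moment it is submitted, so no
# manual transcription step (and no transcription error) ever occurs.
import sqlite3
import datetime

conn = sqlite3.connect("experiment.db")
conn.execute("""CREATE TABLE IF NOT EXISTS responses (
                    participant_id TEXT,
                    condition TEXT,
                    answer TEXT,
                    submitted_at TEXT)""")

def record_response(participant_id, condition, answer):
    """Insert one response row directly into the database."""
    conn.execute(
        "INSERT INTO responses VALUES (?, ?, ?, ?)",
        (participant_id, condition, answer,
         datetime.datetime.now().isoformat()),
    )
    conn.commit()

record_response("p001", "treatment", "agree")
```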
Internal Validity (Correlation or Causation?) • Decreased potential diffusion of treatment • Unlikely that participants will learn information intended for one treatment group and not another • Increased participant drop-out rates across treatments • A higher drop-out rate in Web versus laboratory experiments could create a participant self-selection effect that makes causal inferences problematic • Mitigate by placing requests for personal information and monetary rewards at the beginning of the experiment (Frick et al. 1999; McGraw et al. 2000) • Completion rates approached 86% when some type of monetary reward was offered (Musch and Reips 2000)
Internal Validity (Correlation or Causation?) • Controlling "cheating" • Multiple submissions by a single participant • Identification by e-mail address, logon ID, password, or IP address • Randomization (a control) • Computer scripts are available for randomly assigning participants to conditions (see the sketch below) • Complete scripts published in Baron and Siepmann (2000, 247) and Birnbaum (2001, 210-212)
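A minimal sketch of both controls named on this slide — not the published Baron and Siepmann (2000) or Birnbaum (2001) scripts: random assignment to conditions, and rejection of repeat submissions keyed on an identifier such as an e-mail address or IP address. The condition names are hypothetical.

```python
# Sketch: two internal-validity controls for a Web experiment.
import random

CONDITIONS = ["control", "treatment_a", "treatment_b"]  # hypothetical
seen_ids = set()  # identifiers that have already submitted

def assign_condition():
    """Randomly assign an arriving participant to one condition."""
    return random.choice(CONDITIONS)

def accept_submission(identifier):
    """Reject multiple submissions from the same identifier."""
    if identifier in seen_ids:
        return False  # already participated; discard this submission
    seen_ids.add(identifier)
    return True

if accept_submission("participant@example.com"):
    print("Assigned to:", assign_condition())
```

Note that simple random.choice yields only approximately equal cell sizes; the published scripts also cover blocked designs, which guarantee balanced conditions.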
Construct Validity (Generalizability from observations to higher-order constructs) • Decreased demand effects & other experimenter influences • Rosenthal (1966, 1976); Pany (1987) • Decreased participant evaluation apprehension • Rosenberg (1969) • Does the "naturalism" of the setting decrease?
Getting ParticipantsWeb-based Experiments • Explosion of WWW Use • 172 million computers linked to WWW • 90% of CPAs conduct internet research • 60% of US population has WWW access
Getting Participants • Internet Participant Solicitation • Benefits • Large sample sizes (power) possible • Availability of diverse, world-wide populations • Interactive, multi-participant responses • Real-time randomization of question order • Response-dependent questions (branching; see the sketch below) • Authentication and authorization • Multimedia (e.g., graphics, sound) • On-screen clock
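A minimal sketch of a response-dependent (branching) question flow, written as a console script for brevity; a Web experiment would implement the same logic across form submissions. All question text and answer options are hypothetical.

```python
# Sketch: the follow-up question shown depends on the earlier response.
def ask(prompt, choices):
    """Prompt until the participant gives one of the allowed answers."""
    answer = input(f"{prompt} {choices}: ").strip().lower()
    while answer not in choices:
        answer = input(f"Please answer one of {choices}: ").strip().lower()
    return answer

uses_software = ask("Do you use audit software?", ["yes", "no"])
if uses_software == "yes":
    ask("How often do you use it?", ["daily", "weekly", "monthly"])
else:
    ask("Why not?", ["cost", "training", "other"])
```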
Getting Participants • Web-based • Post notices in places your target population is likely to visit • Access ListServs • PC-based • Student involvement requirement? • USF process
USF Process • Mandatory participation in one experiment per semester • Experimetrix site used to manage participation • https://experimetrix2.com/soa/
A Final Caveat: What Can Go Wrong… Will • Cynical, but realistic • Plan carefully • Develop contingency plans • Consider costs and benefits • The greatest potential for behavioral accounting research (BAR) Web experiments is as yet unrealized • The biggest hurdle is the required knowledge, but it can be overcome