Teaching Research Ethics or Learning in Practice? Preventing Fraud in Science
Dies Natalis Lecture, ISS, The Hague, 9 October 2014
Kees Schuyt, PhD, LL.M, Sociology professor emeritus, University of Amsterdam; chair, National Office of Research Integrity (2006-2015)
Two phenomena, five topics
• Scientific integrity (what it is and isn't)
• Data management (good and bad practices)
Five topics:
• What do we want to prevent?
• Good and bad practices
• Why does it happen? - Tentative explanations
• What is to be done? - Rules or principles
• Educating, learning, mentoring
1. What do we want to prevent?
• History of fraud in science (Baltimore case (1986-1996) as turning point; US Office of Research Integrity, 1994)
• Broad and Wade (1983); Van Kolfschooten (1996, 2012); Grant (2008)
• Levelt report on the Stapel case (2011/2012)
• What can we learn from incidents (outliers)? (teamwork; the system is not watertight: good data management)
Scientific integrity
• Integrity is a self-chosen commitment to professional values (B. Williams 1973)
• Resnik: “striving to follow the highest standards of evidence and reasoning in the quest to obtain knowledge and to avoid ignorance” (The Ethics of Science, 1998)
• Integrity is context-bound, e.g. fabulation in novels and fabulation in science; leading values in science (Merton 1949)
• Codes of Conduct: NL 2005/2012; ESF 2010
Violations
• Violations of the game rules of science, FFP: fabrication (or fabulation), falsification, plagiarism
• Difference between F and P?
2. Good and bad practices
• Questionable research practices (trimming, cooking, pimping, sloppiness, careless data management, not archiving)
• Drawing the line (raw data, co-authorship, impolite behaviour)
Trimming and cooking (Babbage 1830)
• Trimming: “consists of clipping off little bits here and there from those observations which differ most in excess of the mean, and in sticking them on to those which are too small”
• Cooking: “to give ordinary observations the appearance and character of those of the highest degree of accuracy. One of its numerous processes is to make multitudes of observations, and out of these to select only those which agree, or very nearly agree”
Metaphorically: “if a hundred observations are made, the cook must be very unlucky if he cannot pick out fifteen or twenty which will do for serving up” (Charles Babbage, Reflections on the Decline of Science in England and Some of its Causes, 1830; 1989 edition edited by Hyman)
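To make Babbage's point concrete, here is a hypothetical sketch (not part of the original lecture) of what “cooking” does to a data set: a hundred noisy observations are made, and only the twenty that agree most closely are “served up”. The numbers, variable names, and the use of the median as the selection criterion are illustrative assumptions only.

```python
# Hypothetical illustration of Babbage's "cooking" (not from the lecture):
# make a hundred observations, then "serve up" only the twenty that agree best.
import random
import statistics

random.seed(42)

true_value = 10.0
observations = [random.gauss(true_value, 2.0) for _ in range(100)]  # honest, noisy data

# "Cooking": keep only the 20 observations that lie closest to the sample
# median, and quietly discard the other 80.
center = statistics.median(observations)
cooked = sorted(observations, key=lambda x: abs(x - center))[:20]

print("honest mean %.2f  (sd %.2f, n=100)" %
      (statistics.mean(observations), statistics.stdev(observations)))
print("cooked mean %.2f  (sd %.2f, n=20)" %
      (statistics.mean(cooked), statistics.stdev(cooked)))
# The cooked subset reports a much smaller spread, i.e. a spurious
# "highest degree of accuracy" - exactly what Babbage warned against.
```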
Four main distinctions:
• honest vs. dishonest, fraudulent
• good vs. bad practices
• controversies vs. dishonest research
• game rules vs. goal rules
Data management
The scientific research cycle:
• 3 strong controlling points: grants, peer review, scientific community
• 2 weak points: the primary process and data archiving
• Wide variations between disciplines: is everything okay?
• Bad to good practices: single vs. teamwork
• Scale of research: international data gathering; protocols
Variations in data and in data gathering
• Experimental design data (lab)
• Stem cells, MRI-scan data
• Mathematical data, logical analysis
• Survey data (pen and pencil)
• Public data (time series, economic data, population figures, official statistics)
• Historical data (archives)
• Anthropological field observation
• Simulations
3. Why does it happen?
• Three main explanations:
  - Publication pressure: from whom to whom?
  - Sloppy science
  - Pressure from contract research
• Alternative tentative explanatory scheme: misplaced ambition, loose mentoring, ignoring early signals, poor peer review, no institutional response
Contract research
• What is the problem? Köbben 1995: scientific independence; pressure from above (“Yes, Minister”); conflicts of interest
• Research biases? Biomedical research; Roozendaal
• Patents, secrecy, firm's data not public
• Remedies: “good fences make good neighbours” (R. Frost), applied to contracts
• Research codes, guidance committees, High Prestigious Research Group (HPRG)
• Conclusion: be an HPRG: high integrity, high skills, independent
4. What is to be done?
• Learn from best practices across disciplines
• Peer pressure before peer review; a data manager and/or statistical counseling; open discussions to keep alert (not too often!)
• Scientific pledge or oath-taking!?
• Lowering publication pressure? (causality!)
• Educating ethics in science, integrated into data-management courses
5. Educating, learning, mentoring
• The sixpack:
  a. learning rules, discussing ethics
  b. training research skills (e.g. advanced statistics, philosophy of science)
  c. good mentoring (becoming a good scientist)
  d. oath-taking (!?)
  e. online learning, the dilemma game
  f. reading Being a Scientist
• Select your own best combination
Thank you very much indeed for your attention