
Polywater






  1. Polywater A Case of “Pathological Science” Mentor: Dr. James K. Drennen III Elise Jesikiewicz Chloe Drennen Drew Finton AJ Fogl Patrick Erickson

  2. Outline • Background of the Polywater Case • “Pathological Science” • Personal Responsibility for Prevention • Responsibility of the Scientific Community • Consequences of Ethical Misconduct • Summary and Conclusions

  3. Background of the Polywater Case The Creation of “Anomalous Water”1 • 1962 – First reports of a new type of water came from the Soviet scientists Nikolai Fedyakin and Boris Deryagin. • “Anomalous Water” reportedly had properties extremely different from those of normal water. • This new substance could only be made in one of two ways. 1. McKinney, W.J. Experimenting on and Experimenting with: Polywater and Experimental Realism. The British Journal for the Philosophy of Science. 1991. 42(3): 295-307.

  4. Background of the Polywater Case “Anomalous Water” becomes Polywater1 • The Western hemisphere largely ignored the Russian research. • In 1969, Ellis Lippincott (an American) published an article about polymer water (Polywater)2. • This resulted in more research being conducted on Polywater. 1. McKinney, W.J. Experimenting on and Experimenting with: Polywater and Experimental Realism. The British Journal for the Philosophy of Science. 1991. 42(3): 295-307. 2. Lippincott, E.R., et al. Polywater. Science. 1969. 164(3887): 1482-1487.

  5. Background of the Polywater Case Research Conducted on Polywater • Scientists throughout the world began to study Polywater. • Lippincott, et al.1 • Allen and Kollman2 • Rousseau and Porto3 • Kurt Vonnegut’s “Cat’s Cradle” and the media made the Polywater case popular. 1. Lippincott, E.R., et al. Polywater. Science. 1969. 164(3887): 1482-1487. 2. Allen, L.C. and P.A. Kollman. A Theory of Anomalous Water. Science. 1970. 167(3924): 1443-1454. 3. Rousseau, D.L. and S.P.S. Porto. Polywater: Polymer or Artifact? Science. 1970. 167(3926): 1715-1719.

  6. Background of the Polywater Case The Truth is Revealed!1 • Since Polywater could only be formed in small quantities, controlling contamination was extremely difficult. • Scientists eventually discovered that the substance was just water with contaminants! • In 1973, Deryagin admitted that his discovery was due to contaminants, not the creation of Polywater. 1. McKinney, W.J. Experimenting on and Experimenting with: Polywater and Experimental Realism. The British Journal for the Philosophy of Science. 1991. 42(3): 295-307.

  7. Background of the Polywater Case How does this case exemplify a breach in ethics? • This was not a case of a deliberate breach of ethics, but rather of poor scientific research and a failure to do more research before drawing conclusions. • Reasons scientists breached ethical standards in this case: • Cold War politics1 • The desire to discover something novel before other scientists 1. van Brakel, J. Polywater and Experimental Realism. The British Journal for the Philosophy of Science. 1993. 44(4): 775-784.

  8. “Pathological Science” “Error is not a fault of our knowledge, but a mistake of our judgment giving assent to that which is not true. . . It is in man’s power to content himself with the proofs he has, if they favor the opinion that suits with his inclinations or interest, and so stop from further research.” - John Locke1 1. Rousseau, D.L. Case Studies in Pathological Science. American Scientist. 1992. 80: 54-63.

  9. “Pathological Science” • Defined by Irving Langmuir and Denis Rousseau as errors in science created by a loss of objectivity1 • Characteristics1 • Difficult to detect; produce seemingly reliable results repeatedly • Well-accepted theories are often overlooked • Self-delusion; loss of objectivity • The Polywater case is not the only example 1. Rousseau, D.L. Case Studies in Pathological Science. American Scientist. 1992. 80: 54-63.

  10. “Pathological Science” Cold Fusion1 • The process of fusing the nuclei of two atoms into one larger nucleus, releasing large amounts of energy, with neutron emission as a telltale signature • A potential source of cheap, clean, and virtually unlimited energy • Researchers claimed that a simple electrode could be used to “squeeze” nuclei together 1. Rousseau, D.L. Case Studies in Pathological Science. American Scientist. 1992. 80: 54-63.

  11. “Pathological Science” Cold Fusion • Two groups of scientists were simultaneously researching this phenomenon in the late 1980s • They disregarded the terms of an agreement and prematurely submitted results for publication without verifying their work

  12. “Pathological Science” Infinite Dilution1 • A biologically active sample is diluted so many times that the active molecules are no longer evident, yet the sample supposedly retains its original biological effects • The basis for the practice of homeopathic medicine today 1. Rousseau, D.L. Case Studies in Pathological Science. American Scientist. 1992. 80: 54-63.

  13. “Pathological Science” Infinite Dilution1 • In 1988, Jacques Benveniste and colleagues infinitely diluted samples of an allergen and conducted experiments on human basophils to determine if it would still produce an allergic response • The samples were so dilute that the probability was low that any allergen molecule would contact a basophil, but the researchers still claimed to observe an allergic response • They failed to produce the same results in a double-blind experiment in which researchers did not know which cells had been exposed to the allergen 1. Rousseau, D.L. Case Studies in Pathological Science. American Scientist. 1992. 80: 54-63.
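The dilution argument above can be made concrete with a quick back-of-the-envelope calculation. The following sketch is illustrative only: the starting amount (one mole) and the 1:100-times-30 ("30C") dilution scheme are assumptions for the sake of arithmetic, not figures from the Benveniste experiments.

```python
# Back-of-the-envelope check: serial dilution quickly removes every
# molecule of the original solute. All numbers here are illustrative
# assumptions, not data from the Benveniste study.

AVOGADRO = 6.022e23  # molecules per mole


def expected_molecules(moles_start: float, dilution_factor: int, steps: int) -> float:
    """Expected number of solute molecules left after `steps` serial dilutions."""
    return moles_start * AVOGADRO * dilution_factor ** -steps


# Assume one mole of allergen, diluted 1:100 thirty times (a "30C" dilution).
remaining = expected_molecules(1.0, 100, 30)
print(f"{remaining:.2e}")  # a value far below one molecule
```

The point is that the expected count falls some 37 orders of magnitude below a single molecule, which is why a continued "biological effect" conflicts with well-accepted theory.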

  14. “Pathological Science” • All three cases have the characteristics of pathological science • Difficult to detect; produce seemingly reliable results repeatedly • Well-accepted theories are often overlooked • Self-delusion; loss of objectivity • In all cases, the desire to be on the cutting edge of research with such grand implications led these scientists to conduct their research with a single-minded, biased approach

  15. Personal Responsibility for Prevention Why does every scientist have this personal responsibility? • One’s reputation is at risk • Misconduct can negatively impact future publications and work

  16. Personal Responsibility for Prevention • Responsibility not to engage in poor scientific practices • Use proper experimental controls • Use proper experimental design • Design to find truth/reality • Do not design to find a certain result • Having a hypothesis is completely different from attempting to produce a certain result • Submit work to peer review and independent study, not to the media • Keep emotions, personal agendas, and prejudices out of experimental designs and procedures

  17. Personal Responsibility for Prevention • Responsibility to evaluate and hold others accountable • Participate in the peer review process • Be aware of the motivations behind other experiments • Respond with skepticism when friends and family ask about “ground-breaking” claims • Teach others, especially non-scientists, to be skeptical • Help prevent pseudo-scientific claims from becoming pervasive in the non-scientific community • Be an active participant in the scientific community

  18. Responsibility of the Scientific Community Recognizing Other Cases • The most important part of preventing this particular breach of ethics, at the level of the scientific community, is to recognize a particular topic or group of papers generated from poor science. • Next, the scientific community must be made aware of this breach of ethics. • Once the idea is discredited, it should no longer be used in any new publications, thereby preventing a further breach of ethics.

  19. Responsibility of the Scientific Community • In order to recognize trends in failed information epidemics, more epidemics must be studied so criteria can be established. • These criteria can then be used to detect other cases.

  20. Responsibility of the Scientific Community • Several indicators have been found in the publication records of unsuccessful information epidemics1 • Rapid growth/decline in the number of publishing researchers • Epidemic growth/decline in the number of publications • Several distinct disciplines involved (though its usefulness as an indicator is in doubt) 1. Ackerman, Eric. "Indicators of failed information epidemics in the scientific journal literature." Scientometrics 66.3 (2006): 451-466. SciFinder. Web. 10 June 2013.
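As a rough illustration, the first two indicators above could be checked programmatically against yearly publication counts. This is a minimal sketch under assumed thresholds and hypothetical data, not Ackerman's actual criteria or figures.

```python
# Illustrative sketch: flag a topic whose yearly publication counts show
# epidemic-like growth followed by collapse. The doubling threshold and
# the sample counts below are assumptions for illustration only.

def looks_epidemic(counts: list[int], growth_threshold: float = 2.0) -> bool:
    """True if counts at least double year-over-year at some point
    and later fall back to a fraction of the peak."""
    if len(counts) < 3:
        return False
    rapid_growth = any(later >= growth_threshold * earlier
                       for earlier, later in zip(counts, counts[1:])
                       if earlier > 0)
    peak = max(counts)
    collapsed = counts[-1] <= peak / growth_threshold
    return rapid_growth and collapsed


# Hypothetical yearly paper counts for a Polywater-like episode vs. a steady field.
polywater_like = [3, 12, 90, 160, 60, 15, 4]
steady_field = [40, 44, 47, 51, 55, 58, 60]
print(looks_epidemic(polywater_like), looks_epidemic(steady_field))  # True False
```

A real detector would of course need validated thresholds drawn from studies of many such episodes, which is exactly the point of the slide that follows.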

  21. Responsibility of the Scientific Community If more studies are conducted on other literature arising from poor science, perhaps a better method for recognizing unethical material can be developed, thereby preventing the future spread of false knowledge.

  22. Responsibility of the Scientific Community Methods of Prevention • Better editing and review of papers before they are published • Responsibility must fall on the journals that chose to publish papers containing information derived from poor science • Scientists lose credibility by publishing results based on poor science • A journal should also lose credibility for not checking the information it publishes

  23. Responsibility of the Scientific Community • Educating scientists on proper procedures and methods • Being unbiased when conducting experiments • Correct interpretation of data

  24. Consequences of Ethical Misconduct • Publishing Before the Data • “It is a capital mistake to theorize before one has data.”1 - Sherlock Holmes • Any such publication will have a negative impact • Analogous to the Sun magazine vs. National Geographic 1. Diamond, Arthur M. "The Career Consequences of a Mistaken Research Project." American Journal of Economics and Sociology 68.2 (2009): 387-411. WorldCat. Web. 9 June 2013.

  25. Consequences of Ethical Misconduct • Scientists feel the negative effects for a lifetime • Citation value is greatly decreased • Loss of $13,000-$19,000 per year • Loss of job mobility, lower salary, less impact in the community • Loss of citation value has the greatest impact • The institution is also punished • It employs a scientist with low credibility

  26. Summary and Conclusions • Two types of ethical misconduct • Deliberate Misconduct • Fudging the numbers • Plagiarism • Projecting data • Chronological Misconduct • Results announced without concrete evidence • Publication without adequate data • Peer review process not completed

  27. Thank you for Listening! Questions?
