Real and Perceived Risks: The Cognitive Science Perspective


Presentation Transcript


  1. Real and Perceived Risks: The Cognitive Science Perspective
  Topics: Theories of Risk; The Cognitive Science Perspective

  2. Relevant Theories
  • We will be focusing on several distinct theoretical approaches to risk this semester:
    • Cognitive science
    • Risk society
    • Governmentality
    • A cultural approach
    • Edgework
  • Our goal is to be able to describe:
    • The basic features/arguments of each theory
    • Similarities and differences across the theories
    • Examples to illustrate the basic features, similarities, and differences of the theories
    • Limitations of the theories

  3. Relevant Theories
  • There are many ways to compare these theories; two are (see Lupton 1999):
  • Realist/objective vs. relativist/subjective views of risk
    • Realist/objective – risks are real; we can calculate the probability of negative events and try to manage them
    • Relativist/subjective – risks are socially constructed; the reality of risk isn't the critical issue, it is instead the perception of risk and how risk is used that matter most
  • Macro vs. micro focus
    • Macro – the focus is on society/community/group/culture
    • Micro – the focus is on the individual

  4. Cognitive Science
  • The general paradox:
    • We are safer now than at any other point in history (according to experts and objective statistics/data, e.g., on disease, life expectancy, etc.)
    • Despite this, we face new risks from new technologies
    • We react to these new risks with fear, and we are more afraid now than we have ever been
  • Our fears don't match 'the reality' of risks:
    • We are less afraid of bigger risks (e.g., car accidents, accidents in the home)
    • We are more afraid of lesser risks (e.g., nuclear power)

  5. Cognitive Science
  • Recognizing this gap between 'reality' and 'perceptions', researchers have focused on:
    • Better understanding the gap between 1) the scientific, industrial, and governmental community and 2) the general public
    • How to better communicate with and educate the public about risk (a public that is now more critical and distrustful)

  6. Cognitive Science
  • The cognitive science perspective takes a realist/objective view of risk
    • "Risk is the idea that something might happen, usually something bad" (Ropeik and Gray 2002: 4)
  • Four components of risk (from Ropeik and Gray 2002: 4), combined in the sketch below:
    • Probability – calculating the chances of a certain outcome; both objective and subjective probabilities are used; this is done in the name of quantitative science
    • Consequences – the severity of the outcome
    • Hazard – the outcome must be hazardous to be a risk
    • Exposure – if something is hazardous but we are not exposed to it, then it is not a risk
  • It also has a micro focus – the focus is on laypeople's perceptions
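  To make the four components concrete, here is a minimal sketch in Python (not from Ropeik and Gray; the function name and numbers are hypothetical) in which hazard and exposure act as gates while probability and consequence severity combine into an expected-loss style score:

    # Minimal illustration (hypothetical, not from the readings) of how the four
    # components of risk might be combined into a single expected-loss style score.

    def risk_score(hazardous: bool, exposed: bool,
                   probability: float, consequence: float) -> float:
        """Combine the four components of risk.

        hazardous   -- the outcome is actually harmful
        exposed     -- we actually encounter the hazard
        probability -- chance of the bad outcome occurring (0 to 1)
        consequence -- severity of the outcome, on any consistent scale
        """
        # No hazard, or no exposure to the hazard, means there is no risk at all.
        if not (hazardous and exposed):
            return 0.0
        # Otherwise, risk scales with both how likely and how severe the outcome is.
        return probability * consequence

    # Hypothetical numbers: a common-but-mild outcome vs. a rare-but-severe one.
    print(risk_score(True, True, probability=0.10, consequence=2))     # 0.2
    print(risk_score(True, True, probability=0.001, consequence=500))  # 0.5
    print(risk_score(True, False, probability=0.10, consequence=2))    # 0.0 (no exposure)

  Experts' risk assessments tend to track a calculation like this; as the next slides argue, laypeople's risk perceptions weight other qualities as well, which is where the gap comes from.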

  7. “Perceptions of Risk” – Paul Slovic
  Important concepts from the Slovic reading:
  • Risk assessment and risk perceptions
  • The factors (Dread Risk and Unknown Risk), also:
    • Voluntariness
    • Controllability
    • Possible benefits
  • Signal potential and higher order impacts

  8. “Perceptions of Risk” – Paul Slovic
  • Experts assess risk based on objective facts, such as death rates
  • Laypeople, by contrast, have developed 'heuristics' or 'mental strategies' to make sense of what is a threat
  • These differences lead to the gap between 'risk assessments' and 'risk perceptions'
  • Biases that shape people's perceptions:
    • Difficulty understanding probabilities
    • Media coverage – our knowledge is filtered through the media, which focuses on mishaps and threats (it sensationalizes them)
    • Misleading personal experiences
    • Anxiety in life
  • Also, new evidence doesn't always lead people to change their opinions (i.e., when they have strong, existing views)

  9. “Perceptions of Risk” – Paul Slovic
  • The factors:
    • Dread risk
    • Unknown risk
  • Other important components:
    • Voluntariness and equity
    • Controllability of the risk
    • Benefits of the technology/activity

  10. “Perceptions of Risk” – Paul Slovic
  • Higher order impacts and signal potential
  • The impacts/consequences of an accident can be direct and/or indirect
    • Direct – harm to the immediate victims
    • Indirect – harm to anyone/anything else that is not an immediate victim
  • Higher order impacts – the extent to which an accident/unfortunate event has effects beyond direct harm to the immediate victims
    • Indirect costs (monetary and non-monetary) to the group responsible, to other groups in the same industry, and to other groups in other industries
    • E.g., increased regulation, increased construction and operation costs, reduced operation, greater public opposition, reliance on other energy sources, a hostile view toward new technology, loss of credibility
  • Examples:
    • Three Mile Island
    • The BP spill in the Gulf
  • Signal potential
    • The signal potential of an event, and hence its social impact, is related to where a risk fits into the two-factor structure (a toy illustration follows below)
    • This determines the extent of higher order impacts
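  As a toy illustration of the two-factor structure (hypothetical ratings and a simple averaging scheme, not Slovic's actual survey data or factor analysis), a hazard can be placed in the dread/unknown space by averaging ratings on characteristics associated with each factor; hazards landing high on both would be expected to carry the most signal potential:

    # Toy illustration only: placing hazards in a dread/unknown space using
    # made-up 1-7 ratings. Real psychometric studies derive the two factors from
    # survey data via factor analysis; this simply averages a few characteristics
    # commonly associated with each factor.

    DREAD_ITEMS = ["uncontrollable", "catastrophic_potential", "fatal", "involuntary"]
    UNKNOWN_ITEMS = ["unobservable", "new", "delayed_effects", "unknown_to_science"]

    hazards = {
        # Hypothetical ratings for two stock examples from the lecture.
        "nuclear power": {"uncontrollable": 6, "catastrophic_potential": 7, "fatal": 6,
                          "involuntary": 6, "unobservable": 6, "new": 5,
                          "delayed_effects": 6, "unknown_to_science": 5},
        "driving a car": {"uncontrollable": 3, "catastrophic_potential": 2, "fatal": 5,
                          "involuntary": 2, "unobservable": 2, "new": 1,
                          "delayed_effects": 1, "unknown_to_science": 1},
    }

    def factor_score(ratings, items):
        """Average the ratings for the characteristics tied to one factor."""
        return sum(ratings[i] for i in items) / len(items)

    for name, ratings in hazards.items():
        dread = factor_score(ratings, DREAD_ITEMS)
        unknown = factor_score(ratings, UNKNOWN_ITEMS)
        print(f"{name}: dread risk = {dread:.1f}, unknown risk = {unknown:.1f}")

  In this made-up example nuclear power scores high on both factors while driving scores low, which mirrors the gap the lecture describes: the activity experts rank as the bigger statistical killer is not the one people dread most.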

  11. A more recent summary (from Ropeik and Gray 2002: 16-17)
  People are more afraid of:
  • New risks they are not familiar with – e.g., West Nile virus
  • Human-made risks (vs. natural risks) – e.g., radiation from nuclear power and cell phones vs. the sun
  • Imposed risks – e.g., asbestos vs. first-hand smoke
  • Risks with fewer benefits – e.g., people like San Francisco, so they live there despite earthquakes
  • Risks that kill in awful ways – e.g., being killed by a shark vs. heart disease
  • Risks they can't control – e.g., flying on a commercial jet vs. driving oneself
  • Risks from places, people, corporations, and governments they don't trust – e.g., would you rather drink a glass of clear liquid offered by a chemical company or by Oprah?
  • Risks they are more aware of – e.g., terrorism in 2001
  • Risks when uncertainty is high – which explains why we fear new technology
  • Risks to their children compared to risks to themselves
  • Risks that can directly affect them personally

  12. Critiques
  The problems, according to Arnoldi, are:
  • The belief that risk perceptions are irrational
    • People's perceptions are all but dismissed, since the goal is to better educate them and to eradicate the misperceptions
  • Science is limited, yet the cognitive science perspective gives too much power to experts
  • Cognitive science researchers haven't examined the larger social and cultural forces that shape risk perceptions

  13. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA (1996)

  14. The Book:
  • Rachel Carson Prize (1998) – for a book-length work of social or political relevance in the area of science and technology studies
  • Robert K. Merton Award – given annually in recognition of an outstanding book on science, knowledge, and technology
  • Nominated for the National Book Award and the Pulitzer Prize (for achievements in newspaper and online journalism, literature, and musical composition)
  • Vaughan is also the 2006 winner of the ASA's Public Understanding of Sociology Award for her exceptional influence as a public intellectual over the past several decades
  • Vaughan testified before the Columbia Accident Investigation Board in 2003 and worked with the board to write a chapter on the social causes of the Columbia disaster
  • This is not light reading – don't expect to skim it quickly
  • The focus in chapter 3 is on the 'native' view of risk perceptions related to the O-rings on the space shuttle's solid rocket boosters
    • The critical concept is the normalization of deviance
    • The chapter briefly discusses the nature of risk related to space travel and the shuttle program
    • The chapter focuses on the structure of NASA as well as its parts suppliers (and the built-in conflict)
    • It also focuses on the details of risk assessment related to the O-rings – in the day-to-day events within and external to NASA well before the 1986 disaster
  • I would suggest that you read it in stages; don't read for detail, but you should be able to write a summary of the chapter's main ideas (in about 2-3 paragraphs)

  15. References
  • Lupton, Deborah. 1999. Risk. London: Routledge.
  • Ropeik, David and George Gray. 2002. Risk: A Practical Guide for Deciding What's Really Safe and What's Really Dangerous in the World Around You. Boston: Houghton Mifflin Company.
