A Psychological Approach to How Trust is Built and Lost in the Context of Risk J. Richard Eiser University of Sheffield, UK Mathew White Friedrich-Schiller Universität, Jena, Germany
Structure of this talk • How risk depends on human decisions. • Decisions and their consequences. • Trust as a social judgement about decision-makers and information sources. • ‘Marginal trust’ – changes in trust as a consequence of specific events. • Contributory factors – negativity bias, cognitive consistency, diagnosticity, decision types • Conclusions.
Risk depends on human decisions • Risk involves uncertainty about the likelihood of events and the value of their consequences. • Risk arises from interactions between people and their social and physical environment. • Risk depends not only on physical conditions but also on human actions and decisions (e.g. Chernobyl, Hurricane Katrina, the Kashmir earthquake).
Risks are social • Poor decisions exacerbate risk for ourselves and others. • We often rely on others to manage and alleviate risks on our behalf. • We often rely on others to inform us about risks and advise us what to do. • Inequality within and between societies increases vulnerability and limits access to help and information.
Hence… • Understanding risk involves understanding not only physical conditions but also how people make decisions. • Risk perception implies judgements about the quality of our own and others’ decisions. • Experts should make higher quality decisions and/or give higher quality information (or else they’re not experts).
What do we mean by ‘quality’? • Within the context of risk management: • Ability to discriminate danger from safety. • Use of an appropriate criterion for balancing different costs and benefits. • Within the context of risk communication: • These, plus… • Avoidance of bias due to personal interest. • Use of an appropriate criterion for warning about danger (neither too alarmist nor too complacent).
Decisions and their consequences • In an uncertain environment, we need to differentiate between safety and danger. • Some situations are clearly safe, others are clearly dangerous. • What happens in between? • An approach derived from the psychology of perception: Signal Detection Theory.
Discriminating danger [Diagram: a continuum from Safety to Danger, with a cautious criterion placed near the safe end, a risky criterion near the dangerous end, and an ambiguous region ('?') in between.]
Decision-outcome combinations • When deciding whether something is safe or dangerous, there are four possibilities: • Dangerous – treat as dangerous (“Hit”). • Dangerous – treat as safe (“Miss”). • Safe – treat as dangerous (“False alarm”). • Safe – treat as safe (“Correct all clear”).
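These four outcomes map directly onto Signal Detection Theory's standard indices. As a toy illustration (ours, not the talk's), hit and false-alarm rates can be converted into a discrimination index (d′) and a criterion placement using only the Python standard library:

```python
# A toy signal-detection sketch (our illustration): recover the
# discrimination index d' and the criterion placement from hit and
# false-alarm rates, under the equal-variance Gaussian model.
from statistics import NormalDist

def sdt_indices(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    z = NormalDist().inv_cdf               # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)     # ability to tell danger from safety
    # c > 0: reluctant to say "dangerous"; c < 0: quick to warn (cautious)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, c

# An expert who flags 90% of genuinely dangerous cases but also
# raises a false alarm on 20% of safe ones:
d, c = sdt_indices(0.90, 0.20)
print(round(d, 2), round(c, 2))   # 2.12 -0.22: good discrimination, slightly cautious
```

Two experts with the same d′ can still behave very differently: criterion placement alone shifts the balance between misses and false alarms.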
Consequences • These different combinations can have different costs and benefits. • Misses can often appear more costly than false alarms. • An excessively precautionary approach can deprive users of benefits of a technology, and/or expose them to alternative, perhaps greater, risks (e.g. using cars after a train crash).
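To make the cost asymmetry concrete, here is a minimal expected-cost sketch; the base rate, error rates and costs below are all made-up numbers for illustration, not figures from the talk:

```python
# Toy expected-cost comparison of a cautious vs a risky criterion.
# Every number below is an assumption chosen for illustration.
def expected_cost(p_danger, miss_rate, fa_rate, cost_miss, cost_fa):
    # cost = P(danger) * P(miss | danger) * cost_miss
    #      + P(safe)   * P(false alarm | safe) * cost_fa
    return p_danger * miss_rate * cost_miss + (1 - p_danger) * fa_rate * cost_fa

P_DANGER = 0.01   # assumed base rate of genuine danger

cautious = expected_cost(P_DANGER, miss_rate=0.02, fa_rate=0.30,
                         cost_miss=1000, cost_fa=5)
risky = expected_cost(P_DANGER, miss_rate=0.20, fa_rate=0.05,
                      cost_miss=1000, cost_fa=5)
print(f"cautious = {cautious:.2f}, risky = {risky:.2f}")
# cautious is about 1.7, risky about 2.2 here; but raise cost_fa (lost
# benefits, substituted risks such as driving after a train crash) and
# the ordering can flip.
```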
Trust as a social judgement • Trust in experts implies a positive judgement of the quality of their decisions and/or information. • Trust can depend on implicit estimates of others’ competence, impartiality and honesty. • If ‘experts’ are seen as having a vested interest, this may undermine trust. • Decision-makers who share one’s interests and values are more trusted.
Example 1: Mobile Phones • Respondents rated different sources of information about possible health risks of mobile phones in terms of: • Trust. • Knowledge. • Warning criterion (how much evidence source would need before warning). • Industry seen as knowledgeable, but reluctant to warn and therefore distrusted.
[Charts: mean ratings of each information source (scientists, medics, environmentalists, government, media, industry) on trust, knowledge and warning criterion.]
Example 2: Contaminated land. • Local residents rated different sources of information about possible health risks of contaminated land in terms of: • Trust. • Expertise at judging how safe or dangerous. • Bias in decision-making/communication. • Openness. • Having residents’ own interests at heart. • Perceived expertise does not guarantee trust without impartiality, openness and shared values.
How much would you trust what each of the following might tell you about risks from contaminated land?
If there was contaminated land in your neighbourhood, how able do you think each of the following would be to judge how safe or dangerous it was?
Conclusions of surveys • Baseline levels of trust only partly reflect perceived expertise. • Perceived self-interest, openness and shared values are also important. • Need for an experimental approach to unconfound these factors. • Need to examine how specific events may influence marginal trust.
Marginal trust • Many policy makers know that public trust is low • But how can they build it and avoid losing it? • Four psychological insights: 1) Negativity bias 2) Desire for cognitive consistency 3) Information diagnosticity 4) Decision outcome types (Misses, False Alarms, etc.)
1) Negativity bias • "Bad is stronger than good" (Baumeister et al., 2001; Rozin & Royzman, 2001) • Positive information → small increase in trust; negative information → large decrease in trust • Trust is easier to lose than to gain ('trust asymmetry') • "Trust comes on foot and leaves on horseback"
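One way to see what "easier to lose than gain" implies: a toy updating model (our own illustration, not a model from the talk) in which bad news carries a larger learning weight than good news:

```python
# A toy trust-updating model (our illustration): negative information
# carries a larger learning weight than positive information, so trust
# falls fast and recovers slowly.
W_POS, W_NEG = 0.05, 0.25   # assumed asymmetric weights

def update_trust(trust: float, valence: int) -> float:
    """valence: +1 for good news, -1 for bad news; trust stays in [0, 1]."""
    weight = W_POS if valence > 0 else W_NEG
    target = 1.0 if valence > 0 else 0.0
    return trust + weight * (target - trust)

trust = 0.6
for news in (-1, +1, +1, +1, +1, +1):   # one scandal, then five good stories
    trust = update_trust(trust, news)
print(round(trust, 2))   # 0.57: five pieces of good news fail to undo one bad one
```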
[Chart: 45 hypothetical "events" at a nuclear power plant, each rated for its effect on trust (Slovic, 1993). Trust-increasing items range from strong oversight and safety policies (e.g. 'Local board authority can close plant', 'Responsive to any sign of problems') down to mild good news (e.g. 'No problems in past year', 'Contribute to local charities'); trust-decreasing items range from mild lapses (e.g. 'Officials live far away', 'No public hearings') down to severe breaches (e.g. 'Plant covered up problem', 'Records were falsified').]
• Mean rated impact on trust: negative events -4.73 vs positive events +3.07; F(1,102) = 82.64, p < 0.001, ηp² = .45
• Terrible news for anyone hoping to build trust!
2) Desire for cognitive consistency • People want stability in their belief structures • We tend to trust good news about things (or from people) we like, but not about things (or people) we don't (Hovland, Janis & Kelley, 1953) • People don't like nuclear power • So the greater effect of bad news may be due to a confirmatory bias • What about a less negatively viewed industry?
Negativity or cognitive consistency?
• Sample = 68 students
• Attitudes (-3 to +3): nuclear = -.47; pharmaceuticals = +.50, p < 0.01
• DV = trust change (Slovic, 1993; Cvetkovich et al., 2002): "How would your level of trust in the management of a particular nuclear power (pharmaceutical) plant be affected by the following information?" ('Much less trust' -3 to 'Much more trust' +3)
Negativity or cognitive consistency?
• 12 events (6 positive & 6 negative), attributed to either a nuclear power or a pharmaceutical plant
• F(1,66) = 8.16, p < 0.01, ηp² = .11
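For readers who want to run this kind of test themselves, a hypothetical analysis sketch using pandas and statsmodels (the file and column names are invented, and for simplicity it treats everything as between-subjects, whereas the original design may have had a repeated-measures structure):

```python
# Hypothetical sketch (invented file and column names): test whether
# the effect of information valence on trust change depends on industry.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("trust_ratings.csv")   # columns: trust_change, valence, industry
model = smf.ols("trust_change ~ C(valence) * C(industry)", data=df).fit()
print(anova_lm(model, typ=2))           # main effects + valence x industry interaction
```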
Negativity or cognitive consistency? • So ‘Trust Asymmetry’ isn’t ubiquitous • Replicated in other domains (e.g. additives) • Good news for already trusted sources but doesn’t help distrusted sources build trust • Fortunately there is more to the story
[The same 45-event chart (Slovic, 1993), revisited.]
• Look at the variance! Some good news is very good for trust; some bad news is not so bad for trust
• Unpacking why might help us build trust
3) Information diagnosticity • We make the world simpler by categorising others, e.g. friendly/unfriendly, honest/dishonest • The information we use varies in diagnosticity, i.e. how well it differentiates people • One important aspect is specificity: does the information relate to a single event or to many events? • Jo took £10 from the till… a) last Wednesday, or b) every day last week
3) Information diagnosticity • The Slovic (1993) items differed in specificity: A) High specificity (events): "A plant official is found to have lied about a safety matter." B) Low specificity (policies): "There is careful selection and training of plant employees." • Trust should be more affected by policy (low specificity) than by event (high specificity) information • We re-analysed the data in terms of events vs policies
Re-analysis of Slovic (1993) [Chart: mean trust change for positive and negative events (high specificity) vs positive and negative policies (low specificity).]
Re-analysis of Slovic (1993) + new study
• Valence: reanalysis F(1,102) = 82.64, p < 0.001; new study F(1,35) = 7.61, p < 0.01
• Specificity: reanalysis F(1,102) = 3.89, p = 0.051; new study F(1,35) = 12.19, p < 0.001
• Valence × Specificity: reanalysis F(1,102) = 118.17, p < 0.001; new study F(1,35) = 13.26, p < 0.001
3) Information diagnosticity • Trust asymmetry exists for events (high specificity) but not for policies (low specificity): a) bad events have large negative effects on trust; b) good events have a small positive effect; c) good and bad policies have similarly large effects • Conclusion: promote positive policies, not events!
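The pattern in points a) to c) is an interaction between valence and specificity. A small sketch with made-up cell means (illustrative values on the -3 to +3 trust-change scale, not the actual Slovic or re-analysis data) shows how the asymmetry can be quantified:

```python
# Made-up cell means chosen to mirror the pattern described above
# (illustrative only, not the real data).
means = {
    ("event", "positive"): 0.5,     # good events: small gain
    ("event", "negative"): -2.5,    # bad events: large loss
    ("policy", "positive"): 2.0,    # good policies: large gain
    ("policy", "negative"): -2.2,   # bad policies: large loss
}

def asymmetry(specificity: str) -> float:
    """Excess impact of bad over good news at one specificity level."""
    return abs(means[(specificity, "negative")]) - means[(specificity, "positive")]

print(round(asymmetry("event"), 2))    # 2.0: clear trust asymmetry for events
print(round(asymmetry("policy"), 2))   # 0.2: near-symmetric for policies
```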
4) Event types
An engineer judged whether reactor operations were safe or dangerous; the reactor really was one or the other, giving four combinations:
• Really dangerous, thought "dangerous": A) HIT
• Really dangerous, thought "safe": B) MISS
• Really safe, thought "dangerous": C) FALSE ALARM
• Really safe, thought "safe": D) ALL CLEAR
• Which engineer would you trust/distrust most?
• Risk communication: what if you learned that some of them had tried to cover up their mistakes?
• Our final psychological insight again suggests it's a little more complicated
Predictions
H1) Discrimination ability: correct > incorrect
(Hits & All Clears trusted more than False Alarms & Misses)
H2) Response bias: caution > risk
(Hits & False Alarms trusted more than All Clears & Misses; benefits of a Hit loom larger, costs of a Miss loom larger)
H3) Communication bias: transparency > reticence
(Open trusted more than Closed)
Predictions [Chart: predicted trust change for Hits, False Alarms, All Clears and Misses under open vs closed communication.]
4) Event types
• 189 students, three different scenarios: 1) nuclear power (tank corrosion), 2) travel vaccine (before a holiday), 3) computer virus (in the university library)
• Between-participants design per scenario: 2 (discrimination ability: correct/incorrect) × 2 (response bias: "safe"/"dangerous") × 2 (communication bias: "open"/"closed")
• DV = trust change
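A quick sketch of how the eight cells of this design enumerate (the labels are our paraphrase of the slide, not the study's exact wording):

```python
# Enumerating the eight cells of the 2 x 2 x 2 between-participants
# design described above.
from itertools import product

discrimination = ("correct", "incorrect")               # was the judgement right?
response_bias = ("judged 'safe'", "judged 'dangerous'")
communication = ("open", "closed")                      # admitted vs covered up

for i, cell in enumerate(product(discrimination, response_bias, communication), 1):
    print(i, *cell)
# Each participant read one cell per scenario (nuclear power, travel
# vaccine, computer virus); the DV was change in trust.
```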
Nuclear power [Chart: trust change for Hits, False Alarms, All Clears and Misses under open vs closed communication in the nuclear power scenario.]
Travel vaccines [Chart: trust change for Hits, False Alarms, All Clears and Misses under open vs closed communication in the travel vaccine scenario.]
Computer viruses [Chart: trust change for Hits, False Alarms, All Clears and Misses under open vs closed communication in the computer virus scenario.]
Summary
• Correct decisions (Hits & All Clears) increased trust, as predicted
• A 'False Alarm effect': false alarms actually increased trust
• Closed Misses: big falls in trust!
• Trust change generalises from exemplar to category (from a specific doctor to doctors in general)
• But: 1) only a single event was tested (would repeated false alarms produce a 'cry wolf' effect?); 2) might Misses sometimes be preferred (e.g. on legal/rights grounds)?