The Philosophy of Risk • Martin Sewell • URMPM World Congress 2012, “The Human Factor in Risk”, London, 8–9 September 2012
Risk • In my (Bayesian) mind, risk is uncertainty. • Uncertainty is best described using a probability distribution, and the broader the distribution, the greater the uncertainty. • We live in a largely deterministic universe; the only truly random processes are quantum mechanical. • However, the movement of just three bodies can be chaotic (and therefore unpredictable), so in practice life is full of uncertainty, and risk.
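To make the “broader distribution, greater uncertainty” point concrete, here is a minimal sketch (not part of the original talk) comparing the differential entropy of a narrow and a broad normal distribution; the use of SciPy and the particular standard deviations are illustrative choices of mine.

```python
# A minimal sketch: a broader probability distribution carries greater
# uncertainty, here measured by differential entropy (in nats).
from scipy.stats import norm

narrow = norm(loc=0.0, scale=1.0)   # small spread: low uncertainty
broad = norm(loc=0.0, scale=5.0)    # large spread: high uncertainty

# For a normal distribution, entropy = 0.5 * ln(2 * pi * e * sigma^2),
# so it grows with the standard deviation sigma.
print(f"entropy (sigma = 1): {float(narrow.entropy()):.3f}")
print(f"entropy (sigma = 5): {float(broad.entropy()):.3f}")
```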
Bayesianism • Normative—how rational agents should behave. • A Dutch book is a gambling term for a set of odds and bets that guarantees a profit, regardless of the outcome of the gamble. • At the very least, one who practices self-consistent reasoning should not be susceptible to having a Dutch book made against them. • If an individual is not susceptible to a Dutch book, their previsions are said to be coherent. • A set of betting quotients is coherent if (Ramsey 1926; de Finetti 1937; Shimony 1955) and only if (Kemeny 1955; Lehman 1955) they satisfy the axioms of probability. • Bayes’ theorem is merely the calculus for updating a probability in the light of new evidence, so the validity of the formula itself is not controversial, but it does presuppose the applicability of probability. • By definition, an individual is a Bayesian to the extent that they are willing to put a probability on a hypothesis. • Science is essentially applied Bayesian inference (Sewell 2012).
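Since Bayes’ theorem is described above as the calculus for updating a probability in the light of new evidence, a small worked sketch may help; the prior and likelihoods below are hypothetical numbers chosen purely for illustration.

```python
# A minimal sketch of Bayesian updating: P(H | E) from P(H), P(E | H) and P(E | not H).

def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior probability of hypothesis H after observing evidence E."""
    evidence = likelihood_h * prior + likelihood_not_h * (1.0 - prior)
    return likelihood_h * prior / evidence

# Start with a prior of 0.10, then observe the same piece of evidence twice.
p = 0.10
p = bayes_update(p, likelihood_h=0.80, likelihood_not_h=0.20)
p = bayes_update(p, likelihood_h=0.80, likelihood_not_h=0.20)
print(f"posterior after two observations: {p:.3f}")
```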
We are poor Bayesians • The finite frequency theory of probability defines the probability of an outcome as the frequency of the number of times the outcome occurs relative to the number of times that it could have occurred. • For our ancestors in the environment of evolutionary adaptedness, a quick count of the number of predators approaching was likely to have been a useful heuristic for survival, which may explain why we make fewer errors when dealing with relative frequencies than when we are faced with (Bayesian) probabilities. • Fast and frugal frequency-based probability, rather than Bayesian methods, has evolved (Gigerenzer and Hoffrage 1995; Cosmides and Tooby 1996). This leads to failing to take sufficient account of, or even ignoring, prior probabilities, which is known as base rate neglect.
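Base rate neglect is easiest to see with numbers. The screening-test figures below are hypothetical, but the contrast between the single-event probability format and the natural-frequency format follows Gigerenzer and Hoffrage (1995).

```python
# A minimal sketch of base rate neglect with hypothetical screening-test numbers.
base_rate = 0.01        # P(condition): the prior that is often neglected
sensitivity = 0.90      # P(positive | condition)
false_positive = 0.09   # P(positive | no condition)

p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
p_condition = sensitivity * base_rate / p_positive
print(f"P(condition | positive test) = {p_condition:.2f}")   # roughly 0.09, not 0.90

# The same problem in natural frequencies: of 1000 people, 10 have the
# condition and 9 of them test positive; of the 990 without it, about 89
# also test positive, so only 9 of the roughly 98 positives are genuine.
```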
Knightian uncertainty The American economist Frank Knight was the first to explicitly make a conceptual distinction between risk and uncertainty (Knight 1921): • risk—outcome governed by a known probability distribution • uncertainty—outcome governed by an unknown probability model
Unknown unknowns On 12 February 2002, United States Secretary of Defense Donald Rumsfeld made the following statement during a press briefing in which he addressed the absence of evidence linking the government of Iraq with the supply of weapons of mass destruction to terrorist groups: ‘[T]here are known knowns; there are things we know that we know. There are known unknowns; that is to say, there are things that we now know we don’t know. But there are also unknown unknowns—there are things we do not know we don’t know.’ Rumsfeld was ridiculed at the time for obfuscation, but the statement actually makes sense.
Black Swan Events In his book The Black Swan: The Impact of the Highly Improbable (Taleb 2010), Nassim Taleb defines a Black Swan event as having the following three attributes: • Rarity—It is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. • Extreme impact—It carries an extreme impact. • Retrospective (though not prospective) predictability—In spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.
Nassim Taleb Taleb (2010) claims that: • History is dominated by Black Swan events—unexpected high-impact rare events that are beyond the realm of normal expectations. • The probability of these high-impact rare events is very small and cannot be calculated. • Due to psychological biases, people are blind to uncertainty and unaware of the hugely significant role of these rare events in historical affairs.
Elie Ayache • Traditionally, taking a Bayesian perspective, we map probabilities onto states of the world. • In his book The Blank Swan: The End of Probability (Ayache 2010), Elie Ayache replaces probability with a market-generated price: contingent claims → market → prices
Elie Ayache: An Assessment • Ayache proclaims the ‘end of probability’, but the subjective Bayesian Bruno de Finetti had already famously noted that ‘probability does not exist’. • Where markets exist, Ayache’s thesis makes sense, e.g. bookies’ odds imply that P(Brazil win 2014 FIFA World Cup) = 0.22 P(England win 2014 FIFA World Cup) = 0.05 • However, markets do not always exist when we are interested in a probability, e.g. P(rain tomorrow). • We need probability in order to conduct science.
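The World Cup probabilities above are implied by bookmakers’ odds. The sketch below shows one conventional way of backing probabilities out of decimal odds; the odds themselves are made up, and the normalisation step simply removes the bookmaker’s margin (the overround).

```python
# A minimal sketch: convert (hypothetical) decimal odds into implied probabilities.

def implied_probabilities(decimal_odds):
    """Map {outcome: decimal odds} to normalised implied probabilities."""
    raw = {outcome: 1.0 / odds for outcome, odds in decimal_odds.items()}
    overround = sum(raw.values())            # > 1 because of the bookmaker's margin
    return {outcome: p / overround for outcome, p in raw.items()}

odds = {"Brazil": 4.2, "England": 17.0, "Rest of the field": 1.25}  # illustrative only
for outcome, p in implied_probabilities(odds).items():
    print(f"{outcome}: {p:.2f}")
```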
Risk aversion • Risk aversion exists when an individual prefers a guaranteed payoff to an uncertain payoff with the same expected value. • Wealth is generated by a multiplicative process. • In order to maximize growth of wealth, one must maximize the expected value of the logarithm of wealth after each period (Kelly 1956; Breiman 1961). • If one is risk neutral in terms of log(wealth), because the log utility function is concave, it follows that one must exhibit a small degree of risk aversion regarding wealth. • Normatively, we have a tendency towards slight risk aversion with respect to utility generated by a multiplicative process.
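The Kelly (1956) point above, that maximizing the expected logarithm of wealth determines how much risk to take, can be illustrated numerically. The even-money bet with a 55% win probability below is a hypothetical example, not one from the talk.

```python
# A minimal sketch of the Kelly criterion: choose the fraction of wealth to bet
# that maximizes the expected logarithm of wealth after the bet.
import numpy as np

p_win, net_odds = 0.55, 1.0   # hypothetical even-money bet, 55% win probability

def expected_log_growth(fraction):
    """E[log(wealth after bet / wealth before bet)] for a given bet fraction."""
    return (p_win * np.log(1 + net_odds * fraction)
            + (1 - p_win) * np.log(1 - fraction))

fractions = np.linspace(0.0, 0.99, 1000)
best = fractions[np.argmax([expected_log_growth(f) for f in fractions])]
print(f"growth-optimal fraction (numerical): {best:.3f}")
print(f"Kelly formula (p*b - q)/b:           {(p_win * net_odds - (1 - p_win)) / net_odds:.3f}")
```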
Prospect theory Prospect theory (Kahneman and Tversky 1979; Tversky and Kahneman 1992) provides a descriptive account of decision making under risk. People tend to be risk averse, and will pay for insurance, but can also be risk seeking for low probability events, such as playing the lottery.
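For completeness, the value and probability-weighting functions of cumulative prospect theory can be written down directly. The sketch below is only illustrative: it uses the median parameter estimates reported by Tversky and Kahneman (1992) and, for simplicity, a single weighting function (the gains parameter).

```python
# A minimal sketch of the cumulative prospect theory value and weighting functions,
# using the median parameter estimates of Tversky and Kahneman (1992).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Concave for gains, convex and steeper for losses (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

print(value(100.0), value(-100.0))   # a loss looms larger than an equal gain
print(weight(0.01), weight(0.99))    # 0.01 is overweighted, 0.99 underweighted
```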
Enlightenment thinking • The British philosopher John Gray argues that Enlightenment thinking aimed to supplant Christianity with a scientific view of the world, but could do so only if it was able to satisfy the hopes it had implanted (Gray 2008). • The Enlightenment belief that humanity is an inherently progressive species is a by-product of Christianity. • Human knowledge increases in a cumulative fashion; science progresses and allows us to improve our material conditions. • Thanks to economic growth, modern societies become richer. • However, we cannot expect improvements in ethics, politics, society or humanity. Theories of such progress are myths, which rely on a teleological view and answer the human need for meaning. • History is not a movement in the direction of a universal goal or a march to a better world; human history has no overall meaning. • Gray (2008) states that humans are not becoming more civilized and that conflicts are becoming more savage; in contrast, Pinker (2011) presents evidence that violence is declining. • The basis of all of our Western Civilisation utopias (ideologies) is the false elevation of humans to be above and separate from nature. We’re only animals, albeit intelligent ones. • Such ideologies assume that man is good but has been rendered bad by some historical condition that must be overcome.
The Paradox of Increasing Risk • Technological progress has led to increasing efficiency. • David Ricardo’s law of comparative advantage has led to increasing globalization. • The above two points have led to fewer supply chain disruptions. • Nassim Taleb argues that reducing vulnerability to small shocks may increase the severity of large ones. • Hyman Minsky claims that in a capitalist economy stability is inherently destabilizing. • An analysis of the Dow Jones Industrial Average shows that the long-term trend in stock market volatility has been upwards since about 1960, so it could be that risk, in general, is increasing. • Examples of recent Black Swan events include the terrorist attacks in the US on 11 September 2001 and the 2008–2012 global financial crisis.
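The volatility claim above can be checked by computing a rolling annualised standard deviation of daily returns. The sketch below uses synthetic random-walk prices as a stand-in; actual Dow Jones Industrial Average data would need to be obtained separately.

```python
# A minimal sketch of a rolling volatility calculation on synthetic price data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 5000))))  # synthetic random walk

log_returns = np.log(prices).diff()
rolling_vol = log_returns.rolling(window=252).std() * np.sqrt(252)  # annualised (252 trading days)
print(rolling_vol.dropna().tail())
```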
The Future: Social Discount Rate • How much do we care about the future? • How much should we care about the future? • If we wish to perform a cost-benefit analysis on a future public sector project (such as climate change mitigation), we must choose a discount rate that reflects society’s preference for present benefits over future benefits.
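The choice of discount rate matters enormously over long horizons, which is why it dominates cost-benefit analyses of projects such as climate change mitigation. The benefit, horizon and rates in the sketch below are hypothetical.

```python
# A minimal sketch: present value of a future benefit under different discount rates.

def present_value(benefit, years, rate):
    """Discount a benefit received `years` from now at a constant annual rate."""
    return benefit / (1.0 + rate) ** years

benefit, years = 1_000_000.0, 100   # e.g. a benefit realised a century from now
for rate in (0.01, 0.03, 0.07):
    print(f"discount rate {rate:.0%}: present value {present_value(benefit, years, rate):,.0f}")
```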
The Social Discount Rate: A Prescription • Although humans are simply vehicles that have evolved as if to help ensure that their genes survive in perpetuity, all that is required of individuals is that their ultimate motivation is to reproduce; we therefore seek to maximize gene replication within our lifetime, but not beyond. • During a lifetime, the risk that a reward will not be available generally decreases as one approaches the time that the reward is expected, which leads to a hyperbolic discount function. • This account is descriptive, but as we cannot transcend our genes (Moxon 2010), a prescriptive social discount rate must accommodate our motivational set, so the prescriptive rate optimally coincides with the descriptive one. • An individual’s discount function is hyperbolic and reaches 100% at the end of their lifetime. An equitable social discount function should average the population’s individual discount functions.
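The contrast between the hyperbolic discounting described above and standard exponential discounting is easy to state in code; the one-parameter hyperbolic form and the parameter values below are conventional illustrations rather than figures from the talk.

```python
# A minimal sketch contrasting exponential and hyperbolic discount functions.

def exponential_discount(delay, r=0.05):
    """Standard constant-rate discounting: value falls geometrically with delay."""
    return (1.0 + r) ** -delay

def hyperbolic_discount(delay, k=0.05):
    """One-parameter hyperbolic discounting: V = 1 / (1 + k * delay)."""
    return 1.0 / (1.0 + k * delay)

for delay in (1, 10, 50):   # delays in years
    print(delay, round(exponential_discount(delay), 3), round(hyperbolic_discount(delay), 3))
```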
The Media and Biases • Our ancestors living in the environment of evolutionary adaptedness would only experience or witness events taking place within the environment of their own tribe. • Our minds evolved during a time when the number of times we experienced an event would have been fairly representative of the probability of it recurring. Of course, ‘extreme’ events have always been more memorable than mundane events. • What is the effect of newspapers and other media reporting news? • News, by definition, is unpredictable (otherwise, it would have been reported yesterday). • If we cannot predict something, it is a surprise; so news is surprising, and the news most likely to be reported is the most surprising. • This means that rare events, such as a man being killed by a shark, are likely to be heavily reported, while dying of diabetes, for example, is far more common but goes largely unreported. • The media therefore creates a biased impression of the world around us.
Availability • Availability (or saliency) (Tversky and Kahneman 1973) is a cognitive heuristic whereby we rely on knowledge that is readily available, rather than examine other alternatives or procedures. That is, we make decisions based on how easily things come to mind (which is usually whatever is newsworthy). • Modern man is far more likely than his ancestors to hear about events that he is unlikely to experience (such as an airplane crash). • This likely biases our judgement of risks, leading us to overestimate the probability of high-impact, low-probability events.
Conclusions • Probability does not exist, but is necessary if we wish to remain self-consistent and conduct science (science is essentially Bayesian inference). • Don’t expect risk to decline. • Avoid naive optimization at the expense of robustness (Taleb 2010)—Mother Nature includes redundancy, e.g. two eyes, two lungs, two kidneys. • We can’t change human nature in any radical sense, so it is better not to try. • In-group/out-group biases are natural and market forces are inevitable, so they elude the control of government. • Don’t expect or force people to make sacrifices in the present for the sake of the distant future. • It is our utility relative to others that matters.
References • AYACHE, Elie, 2010. The Blank Swan: The End of Probability. Chichester: Wiley. • BREIMAN, L., 1961. Optimal gambling systems for favorable games. In: Jerzy NEYMAN, ed. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume I. Berkeley: University of California Press, pp. 65–78. • COSMIDES, Leda, and John TOOBY, 1996. Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty. Cognition, 58(1), 1–73. • de FINETTI, Bruno, 1937. La prévision: Ses lois logiques, ses sources subjectives. Annales de l’Institut Henri Poincaré, 7(1), 1–68. Translated into English as ‘Foresight: Its logical laws, its subjective sources’ in H. E. Kyburg, Jr and H. E. Smokler, eds. Studies in Subjective Probability. New York: Wiley (1964), pp. 93–158. • GIGERENZER, Gerd, and Ulrich HOFFRAGE, 1995. How to improve Bayesian reasoning without instruction: Frequency formats. Psychological Review, 102(4), 684–704. • GRAY, John, 2008. Black Mass: Apocalyptic Religion and the Death of Utopia. Penguin Books. First published by Allen Lane in 2007. • KAHNEMAN, Daniel, and Amos TVERSKY, 1979. Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–292. • KELLY, Jr, J. L., 1956. A new interpretation of information rate. The Bell System Technical Journal, 35(4), 917–926. • KEMENY, John G., 1955. Fair bets and inductive probabilities. The Journal of Symbolic Logic, 20(3), 263–273. • KNIGHT, Frank H., 1921. Risk, Uncertainty and Profit. Boston, MA: Hart, Schaffner & Marx; Houghton Mifflin Company. • LEHMAN, R. Sherman, 1955. On confirmation and rational betting. The Journal of Symbolic Logic, 20(3), 251–262. • MOXON, Steve, 2010. Culture is biology: Why we cannot ‘transcend’ our genes—or ourselves. Politics and Culture, 1. • PINKER, Steven, 2011. The Better Angels of Our Nature. New York: Viking Books. • RAMSEY, Frank Plumpton, 1926. Truth and probability. In: R. B. BRAITHWAITE, ed. The Foundations of Mathematics and Other Logical Essays. London: Kegan Paul, Trench, Trübner (1931), Chapter VII, pp. 156–198. • SEWELL, Martin, 2012. The demarcation of science. Young Statisticians’ Meeting, Cambridge, 2–3 April 2012. • SHIMONY, Abner, 1955. Coherence and the axioms of confirmation. The Journal of Symbolic Logic, 20(1), 1–28. • TALEB, Nassim Nicholas, 2010. The Black Swan: The Impact of the Highly Improbable. Second ed. New York: Random House Trade Paperbacks. • TVERSKY, Amos, and Daniel KAHNEMAN, 1973. Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. • TVERSKY, Amos, and Daniel KAHNEMAN, 1992. Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5(4), 297–323.