“Is this the right room for an argument?” “I’ve told you once…”
In the spirit of Winston Churchill (“Madam, we’ve already established that; now we are trying to establish the price”), I offer a syllogism:
• Human beings differ from one another in their susceptibility to carcinogenesis (that is, in their individual risk at a particular exposure);
• A single number (a cancer potency factor, an EDxx, a risk at an exposure below the POD, an MOE, etc.) can correctly predict individual risk only for someone at one point within the spectrum of human susceptibility;
• Therefore, this number will under-predict risk to everyone who is more susceptible than this person, and over-predict risk to everyone who is less susceptible.
• Only on Planet EPA does 1 + 2 ≠ 3.
“(We’ve already established that; now, by how much…?)”
How many of us have our cancer risks under-estimated by EPA, and by how much, concerns me, because it leads to under-regulation. Others may well be concerned with the converse (over-estimation of individual risk). Everyone (even the economists) should be concerned with whether EPA’s estimates of population risk (“body counts”) are biased low:
Population risk = (mean risk) × (size of population)
Mean risk = (potency) × (mean susceptibility) × (mean exposure)
Mean susceptibility > (median susceptibility) whenever susceptibility is right-skewed
A reasonably homogeneous wealth distribution: the typical citizen earns $100,000/yr; the 2% in each “tail” differ from the typical by a factor of 10 (that is, some earn $10,000; others earn $1 M).
Mean income = $194,000 (the mean is about twice the median).
Now suppose we introduce another source of variability, such that the typical income remains unchanged, but the “tails” diverge from the typical by a factor of 1,000 (that is, some earn $100; others earn $100 M).
Mean income now = $39,000,000 (the mean is 390 times the median).
If we are uncertain whether the variability is small or large, we cannot know the mean to within a factor of 200.
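The slide’s two means can be reproduced if income is treated as lognormal, with the 2% “tails” read as the ±2σ points of the distribution, so that σ = ln(tail factor)/2. That reading is my assumption rather than something stated on the slide, but it matches both quoted figures. A minimal sketch:

```python
import math

def lognormal_mean(median, tail_factor, z=2.0):
    # sigma chosen so the +/- z-sigma points differ from the median
    # by tail_factor (z = 2.0 approximates the slide's 2% tails)
    sigma = math.log(tail_factor) / z
    # for a lognormal, mean = median * exp(sigma^2 / 2)
    return median * math.exp(sigma ** 2 / 2)

print(lognormal_mean(100_000, 10))     # ~194,000: mean ~2x the median
print(lognormal_mean(100_000, 1_000))  # ~39,000,000: mean ~390x the median
```

The ratio of the two means is about 200, which is the factor-of-200 ambiguity the slide warns about.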
“TMI” (Too Much Information) on Interindividual Variability in Exposure
[Figure: 260 million (identical) large, spherical rodents, each 6 feet tall and 154 pounds]
Suggested reading: “Life is Lognormal” (http://stat.ethz.ch/~stahel/lognormal/)
[Figure: lognormal density (σ = 1), with the mode, median, mean, and 95th percentile marked]
Mean = median × exp(σ²/2)
95th %ile = median × exp(1.645σ)
Therefore, max(95th/mean) = 3.9
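The 3.9 figure follows from the two relations above; since the slide gives only the result, here is a short reconstruction of the optimization over σ:

```latex
\[
\frac{X_{0.95}}{\text{mean}}
  = \frac{m\,e^{1.645\,\sigma}}{m\,e^{\sigma^{2}/2}}
  = e^{\,1.645\,\sigma \,-\, \sigma^{2}/2},
\qquad
\frac{d}{d\sigma}\Bigl(1.645\,\sigma - \tfrac{\sigma^{2}}{2}\Bigr) = 0
\;\Longrightarrow\; \sigma = 1.645,
\]
\[
\max_{\sigma}\,\frac{X_{0.95}}{\text{mean}}
  = e^{\,1.645^{2}/2} \approx 3.87 \approx 3.9 .
\]
```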
[Table: for a range of σ(ln X) values and the corresponding geometric standard deviations, the mean value and the probability “mass” above the 90th and 95th percentiles]
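If “mass” here means the share of the population total (equivalently, of the mean) contributed by people above a given percentile (my reading of the lost table, not a certainty), that share has a closed form for the lognormal, Φ(σ − z_p), which follows from the lognormal partial expectation. A sketch:

```python
from math import erf, sqrt

def Phi(x):
    # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def mass_above_percentile(sigma, z_p):
    # share of a lognormal's total mass lying above its p-th percentile:
    # E[X; X > x_p] / E[X] = Phi(sigma - z_p)
    return Phi(sigma - z_p)

for sigma in (0.5, 1.0, 2.0, 3.0):
    print(sigma,
          round(mass_above_percentile(sigma, 1.282), 3),   # above 90th %ile
          round(mass_above_percentile(sigma, 1.645), 3))   # above 95th %ile
```

At σ(ln X) = 2, for example, the 5% most susceptible people account for roughly 64% of the population total.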
Uncertainty in the Sample Mean Drawn from a Lognormal Population (from Finkel, Risk Analysis, 1990)
[Figure: ratio of the 95th to 5th percentiles of the sample mean, plotted against σ(population), for three sample sizes: N = 10 (ratio up to ≈ 720), N = 100 (≈ 27), N = 1000 (≈ 5.2)]
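A Monte Carlo sketch of the same phenomenon; the population σ of 2.0 is illustrative only, since Finkel’s figure plots the spread as a function of σ(population):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mean_spread(sigma, n, trials=10_000):
    # draw `trials` samples of size n from a lognormal population and
    # report the 95th/5th percentile ratio of the resulting sample means
    means = rng.lognormal(mean=0.0, sigma=sigma, size=(trials, n)).mean(axis=1)
    lo, hi = np.percentile(means, [5, 95])
    return hi / lo

for n in (10, 100, 1_000):
    print(n, round(sample_mean_spread(sigma=2.0, n=n), 1))
```

The spread shrinks as N grows, but for a heavy-tailed population it remains wide even at N = 1000.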
Human Interindividual Variability in Steps along the Pathway to Carcinogenesis (Hattis and Barlow, Human and Ecological Risk Assessment, 1996)
[Table: category; σ(ln X) (90% c.i.); # data sets]
[Figure: from A. Finkel, chapter in Low-Dose Extrapolation of Cancer Risks, 1995]
From Science and Judgment in Risk Assessment (NRC 1994): “Recommendation: EPA should adopt a default assumption for susceptibility … EPA could choose to incorporate into its cancer risk estimates for individual risk a “default susceptibility factor” greater than the implicit factor of 1 that results from treating all humans as identical. EPA should explicitly choose a default factor greater than 1 if it interprets the statutory language [in the Clean Air Act Amendments of 1990: “the individual most exposed to emissions”] to apply to an individual with high exposure and above-average susceptibility.” “It is possible that ignoring variations in human susceptibility may cause significant underestimation of population [cancer] risk.”
A Colossal Non Sequitur: “The EPA has considered [the NAS recommendation] but has decided not to adopt a quantitative default factor for human differences in susceptibility [to cancer] when a linear extrapolation is used. In general, the EPA believes that the linear extrapolation is sufficiently conservative to protect public health. Linear approaches from animal data are consistent with linear extrapolation on the same agents from human data (Goodman and Wilson, 1991; Hoel and Portier, 1994).” -- EPA, Proposed Guidelines for Carcinogen Risk Assessment (1996)
[Figure labels: tobacco smoke, saccharin, nickel, cadmium, PCBs, asbestos, arsenic, estrogens, reserpine]
Arguments in Favor of Protecting for “Unidentifiable Variability” • provides impetus to advance the science • already being done for exposure variation • already being done for economic variation • Congressional intent • evidence of public perception • done, without challenge, in OSHA’s methylene chloride (MeCl2) rule
Recommendations in Light of Human Variability
1. Communicate more honestly that current estimates may be “plausible upper bounds,” but if so, only for average people.
2. Resist efforts to arbitrarily remove purported “conservatism” in estimates, and to require that “best estimates” be used.
3. Replace “default” models with more sophisticated ones only if sufficient human data exist to generalize the conclusions.
4. Develop better safeguards so that individual genetic information can be ascertained and acted upon (esp. when truly a “last resort”), rather than closing the door on the information and thereby over-exposing the minority (or the majority).
Recommendations (cont.)
5. Also consider variability in “exposure” to economic harm, with the ultimate goal of replacing [equation] with [equation] and its PDF.
NAS, Science and Decisions (2008):
“An assumption that the distribution is lognormal is reasonable, as is an assumption of a difference of a factor of 10 to 50 between the median and upper 95th percentile people… It is clear that the difference is significantly greater than the factor of 1, the current implicit assumption in cancer risk assessment. In the absence of further research leading to more accurate distributional values or chemical-specific information, the committee recommends that EPA adopt a default distribution or fixed adjustment value for use in cancer risk assessment. A factor of 25 would be a reasonable default value to assume as a ratio between the median and upper 95th percentile persons’ cancer sensitivity for the low-dose linear case, as would be a default lognormal distribution. … For some chemicals, as in the 4-aminobiphenyl case study below, variability due to interindividual pharmacokinetic differences could be greater.”
The suggested default of 25 will have the effect of increasing the population risk (average risk) relative to the median person’s risk by a factor of 6.8: for a lognormal distribution, the mean-to-median ratio equals exp(σ²/2). When the 95th-percentile-to-median ratio is 25, σ = ln(25)/1.645 = 1.96, and the mean exceeds the median by a factor of 6.8. If the risk to the median human were estimated to be 10⁻⁶ and a population of one million persons were exposed, the expected number of cases of cancer would be 6.8 rather than 1.0.
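The committee’s arithmetic checks out; a few lines reproduce it:

```python
import math

ratio_95_to_median = 25            # NAS suggested default
z_95 = 1.645                       # standard normal 95th percentile
sigma = math.log(ratio_95_to_median) / z_95
mean_over_median = math.exp(sigma ** 2 / 2)

print(round(sigma, 2))             # 1.96
print(round(mean_over_median, 1))  # 6.8

# 10^-6 median risk in an exposed population of one million:
expected_cases = 1e-6 * 1_000_000 * mean_over_median
print(round(expected_cases, 1))    # 6.8 expected cases, not 1.0
```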
Conclusions on Susceptibility and Defaults: • Distributions accounting for uncertainty and interindividual variability are preferable to point estimates. • EPA has stated for 25+ years that its point estimates of cancer risk are “plausible upper bounds, and could be as low as zero”: the first statement is false, and the second is misleading (a linear term of zero in the LMS [linearized multistage] polynomial is a totally different concept from “zero potency”). • A plausible upper bound would account for the most basic characteristic of human beings (biological individuality); a zero lower bound would require a sensible attitude towards defaults and departures therefrom.
“There are eight degrees of charity, each one higher than the other. The act of charity than which there is none higher is a gift or loan, or offer of partnership or employment, which enables the recipient to attain self-sustenance. Of lesser degree is charity to the poor wherein donor and recipient are unknown to each other. And lesser still, wherein the donor is unknown to the recipient. And lesser than these, wherein the recipient is unknown to the donor. Of yet lower degree is unsolicited alms put into the hands of the poor, and of lower degree still, alms which have been solicited. Inferior to these is charity which, though insufficient, is cheerfully given. The least charity of all is that which is grudgingly done.”
- Rabbi Moses ben Maimon (“Maimonides”) (1135-1204)
From Meditation 17: Nunc Lento Sonitu Dicunt, ‘Morieris’ [this bell, tolling for another, says “Thou must die”]
“…Any man’s death diminishes me, because I am involved in mankind; and therefore never send to know for whom the bell tolls; it tolls for thee.”
- John Donne (1572-1631)
The new decision-making paradigm partially (very partially) adopted by the Science and Decisions committee should actually start not with “problem formulation” but with “solution formulation.”
[The “old” (current) way:]
Signal of harm (bioassay, epidemiology) → What is the risk from the substance? → What is the acceptable concentration of the substance?
[A possible new way: “solution-focused risk assessment”:]
Signal of harm (bioassay, epidemiology) → What product(s) or process(es) lead to exposures? → What alternative product(s) or process(es) exist? → Which alternative(s) best reduce overall risk, considering cost?
Main Assertions: • We’ve gotten so far away from grounding risk assessment in a decision-making context that we increasingly refer to as “decisions” things that really are nothing more than pronouncements about risk. EPA “decides” that the NAAQS for ozone should be 75 ppb; OSHA “decides” that workplace air should contain less than 5 µg/m³ of chromium(VI). But these say merely that IF such levels were achieved, a certain acceptable amount of harm would persist. Worse yet, even if we assume perfect implementation and enforcement of controls [that may never have been defined in the “decision” process], at best these “decisions” will achieve a defined amount of exposure reduction, but not necessarily ANY risk reduction, because of risk-versus-risk effects! • If we’re going to decide rather than merely opine, the fundamental chicken-and-egg question is whether risk assessment questions should precede or follow risk management questions. You are more likely to choose the relatively best decision if you think your way from solutions to problems, rather than dissecting the problem until you are ready [or told that you must be ready!] to “find a solution to the well-understood problem.” The earlier in the process you think about what can be done, the more likely you are to think of better ways to do it: solutions that cannot possibly occur to you after the problem has been defined in such a way as to exclude them.
Other Ways To Recognize SFRA • It reverses the original Red Book paradigm (in which risk management doesn’t begin until risk assessment has “defined the problem”) into one in which a (preliminary) risk management step starts the process and harnesses risk assessment to evaluate solutions. • It shifts the balance more towards design standards and away from pure performance standards—but more than that, it attempts to capture the best features of both. • It combines risk assessment with more holistic decision frameworks such as life-cycle analysis, green design, and inherently safe production processes. It puts risk assessment to work comparing different ways of controlling hazards from the same “functional unit,” in LCA-speak.
Other Guises of SFRA (cont.) • It shifts attention away from continued angst over the performance of risk assessment, and instead picks up on advice (first?) offered by Bernie Goldstein in 1993: “It is time for risk assessors to stop being defensive. Because risk management is broke is no reason to fix risk assessment.” • It changes risk-versus-risk assessment from a theoretical footnote (or a monkey wrench to justify turning our back on risks) to an integral feature of any analysis—from “what are some of the benefits from less exposure to one substance?” to “what are the total benefits of various actions designed primarily to reduce exposure to one substance?” • It restores risk assessment to a central place in environmental policy, just in time to avoid alternative paradigms that do away with it altogether in the name of replacing “paralysis by analysis” with “who needs analysis?”
Small Steps (Lip Service?) to SFRA: • PCCRAM (the Presidential/Congressional Commission on Risk Assessment and Risk Management): “In some cases, examining the options may help refine a risk analysis.” • Ecological Risk Assessment’s “Problem Formulation (PF)/Planning & Scoping (P&S)”: these improve on the Red Book by vowing to “address the needs of the decision-maker.” HOWEVER, these stop well short of evaluating solutions, because they concentrate on the (welcome) expansion of the definition of the true problem at hand. PF (by its very name and by its practice) is more about designing the risk assessment itself than about designing the optimal solution; P&S is about bringing decision-makers and stakeholders to the table, and ensuring that they consider multiple stressors, but does not imply scoping the solutions.
An Example of “SFRA 2.0”
[Diagram: Question → Analytic Exercise(s) → Pronouncement, with responses by airplane painters and by foam assemblers]
Example of Different Paradigms
[Diagram: Question → Analytic Exercise(s) → Pronouncement, with responses by foam assemblers and by airplane painters]
* An unpainted 747 weighs 500 lbs. less than a painted one; American Airlines saves 7 million gallons of jet fuel per year by eliminating paint (about 0.5% of its total fuel consumption).