U.S. Earthquake Frequency Estimation - Ratemaking for Unusual Events • CAS Ratemaking Seminar, Nashville, Tennessee, March 11-12, 1999 • Stuart B. Mathewson, FCAS, MAAA, M.EERI • ICAT Managers
Introduction • Tough to price high-severity, low-frequency coverage • Often priced without actuarial involvement • Catastrophe portion of property coverage is an obvious example • Property actuaries now face tough problems
Introduction • New methods to analyze catastrophes allow for much better ratemaking • The frequency estimates are key • Among major perils, earthquake frequency is the most problematic • This is a brief survey of methods, sources, and current issues in seismic frequency estimation
Experts • Seismologists and Geologists • Seismological Society of America (SSA) • U.S. Geological Survey (USGS) • Cal. Div. of Mines & Geology (CDMG) • Southern California Earthquake Center (SCEC) • Earthquake Engineering Research Institute (EERI) • Others for Central U.S., Pacific NW, etc.
Seismologists’ Methods • Slip Rate Analysis • Plate Tectonics • For seismicity at plate boundaries • Scientists can measure the rate at which one plate moves in relation to another
Seismologists’ Methods • Slip Rate Analysis • Measure overall slip • Amount of slip correlated to amount of energy released - measured by Magnitude • Observe displacement in historical event • Calculate return time for that event
Seismologists’ Methods • Slip Rate Analysis • Simple example • San Andreas Fault moves about 2 inches per year • 1906 San Francisco earthquake had maximum displacement of about 20 feet • This gives a return time of 120 years
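The arithmetic in the simple example is easy to check directly. Below is a minimal sketch of the slip-rate return-time calculation, assuming (as the example does) that all accumulated plate motion is released in repeats of the same event; the figures are the illustrative ones from the slide, not a seismological model.

```python
# Slip-rate return-time sketch using the illustrative figures above.
# Assumption: accumulated plate motion is released entirely in repeats
# of the same characteristic event, so return time = event slip / slip rate.

def return_time_years(event_slip_inches: float, slip_rate_in_per_yr: float) -> float:
    """Years needed to re-accumulate the slip released in one event."""
    return event_slip_inches / slip_rate_in_per_yr

# 1906 San Francisco: ~20 ft (240 in) maximum displacement;
# San Andreas slip rate ~2 in/yr  ->  ~120-year return time.
print(return_time_years(20 * 12, 2.0))  # 120.0
```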
Seismologists’ Methods • Slip Rate Analysis • Real world much more complicated • Faults are not simple lines at the plate boundary • For instance, Southern California has a complex system of faults • Even Northern California is not simple • Scientists apportion the accumulated slip among the individual faults
Seismologists’ Methods • Slip Rate Analysis • Works well where plate tectonics gives a measure of slip • Other approaches necessary elsewhere, or as supplement to slip rate
Seismologists’ Methods • Gutenberg-Richter Relationship • Log-linear relationship between magnitude and frequency • log N = a - b*M, where N is the number of events of magnitude M or greater • Fitted to actual experience • Used to project large events beyond the historical record
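As a rough illustration of how the relationship is used, the sketch below fits a and b to hypothetical annual counts of events at or above each magnitude and then extrapolates to a magnitude outside the record. The catalog numbers are invented for illustration only; real fits are done on regional catalogs.

```python
import numpy as np

# Gutenberg-Richter sketch: fit log10(N) = a - b*M to hypothetical annual
# counts of events with magnitude >= M, then project a rate for a large
# magnitude beyond the historical record. Counts are invented.
mags   = np.array([4.0, 4.5, 5.0, 5.5, 6.0])
counts = np.array([80.0, 25.0, 8.0, 2.5, 0.8])   # events per year with M >= mag

slope, a = np.polyfit(mags, np.log10(counts), 1)
b = -slope                                        # b is conventionally reported as positive

m_big = 7.0
rate = 10 ** (a - b * m_big)                      # projected events/year with M >= 7
print(f"a = {a:.2f}, b = {b:.2f}, rate(M>={m_big}) = {rate:.4f}/yr "
      f"(~{1 / rate:.0f}-year return time)")
```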
Seismologists’ Methods • Paleoseismic research • Washington-Oregon • Oregon - Study of buried soils beneath marshes to show evidence of subsidence • 16 disturbance events over 7,500 years • Return times of nearly 500 years, if all disturbances were caused by earthquakes
Seismologists’ Methods • Paleoseismic research • Washington-Oregon • Washington - Study of buried soils beneath marshes to show evidence of subsidence • One very large shallow earthquake about 1,000 years ago on a fault through Seattle
Seismologists’ Methods • Paleoseismic research • Washington-Oregon • Pacific Northwest has potential for a great subduction earthquake • Great Subduction Earthquake of January 26, 1700 • Evidence from Japanese tsunami records and local traditional stories
Seismologists’ Methods • Paleoseismic research • New Madrid • Great earthquakes of 1811-12 • Trench and date sand blows • Dated large events at about A.D. 900 and 1300 (in addition to 1811-12), with two others possible in the last 2,000 years • Magnitudes of events not known, but large enough to cause sand blows (> 7.5?)
Seismologists’ Methods • Paleoseismic research • New Madrid • This implies a return time of about 500 years for large events - maybe 7.5 • Some were larger than others - scientists’ estimates of 400-1,100 years for 8.0+ • An additional event between 1400 and 1600?
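A small sketch of how these paleoseismic dates translate into a frequency, using the dated events from the previous slide (~A.D. 900, ~A.D. 1300, 1811-12). Treating occurrence as Poisson is an illustrative assumption on my part, not a claim from the talk.

```python
import numpy as np

# New Madrid recurrence sketch from the dated events above.
event_years = np.array([900, 1300, 1811])
mean_interval = np.diff(event_years).mean()       # ~455 years
annual_rate = 1.0 / mean_interval

# Probability of at least one such event in the next 30 years under a
# Poisson (memoryless) assumption.
p_30yr = 1.0 - np.exp(-annual_rate * 30)
print(f"mean interval ~{mean_interval:.0f} yr, annual rate {annual_rate:.4f}, "
      f"P(event in 30 yr) ~{p_30yr:.1%}")
```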
Seismologists’ Methods • Paleoseismic research • Southern California • Trenching • Landers EQ (7.3) - Multiple faults, some not broken for over 10,000 years • San Andreas fault - one site showed 10 events over 2,000 years, but they show clustering
Sources • USGS Open-File Report 88-398 • 1988 study of probabilities on major faults in Northern and Southern California • Probabilities of certain events in the next 30 years • Bay Area (7.0) 50% • So. San Andreas (7.5-8.0) 60% • San Jacinto (6.5-7.0) 50%
Sources • USGS Circular 1053 • 1990 study updating probabilities on major faults in Northern California • Bay Area (> 7) 67% • Hayward North 20% => 28% • Hayward South 20% => 23% • Peninsula San Andreas 20% => 23% • Add Rogers Creek => 22%
Sources • SCEC paper “Seismic Hazards in Southern California: Probable Earthquakes, 1994 to 2024” • Updated study on Southern California Earthquakes • Southern California ( > 7 ) 80-90%
New Hazard Maps • Series of maps covering the U.S. (USGS) and California (USGS/CDMG) showing probabilistic estimates of peak ground acceleration • Various return times • Shown as exceedance probabilities • Example -- peak ground acceleration with 10% probability of exceedance in 50 years
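For ratemaking, these exceedance probabilities convert directly into annual frequencies under the Poisson assumption that is standard in probabilistic hazard mapping; the same conversion applies to the 30-year probabilities quoted on the earlier slides. A minimal sketch:

```python
import math

# Convert an exceedance probability over a time horizon into an annual
# rate and return period, assuming Poisson (time-independent) occurrence.
def annual_rate(prob_exceed: float, horizon_years: float) -> float:
    return -math.log(1.0 - prob_exceed) / horizon_years

# 10% in 50 years -> ~475-year return period (the classic hazard-map level)
rate = annual_rate(0.10, 50)
print(f"annual rate {rate:.5f}, return period ~{1 / rate:.0f} years")

# 67% in 30 years (Bay Area > 7, per USGS Circular 1053) -> implied annual rate
print(f"Bay Area implied annual rate ~{annual_rate(0.67, 30):.4f}")
```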
Non-California Sources • For other areas including Central US, Pacific Northwest, South Carolina, Salt Lake City, etc. • Sources listed earlier, plus local universities and state geologists
Ratemaking Issues • Loss costs are very sensitive to model frequencies • E.g., sensitivity to the New Madrid 8.0+ frequency assumption
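A toy sensitivity check makes the point concrete: the expected annual loss from a single large scenario scales directly with its assumed frequency. The conditional loss figure below is an invented placeholder; the 400-1,100-year range is the scientists' estimate for New Madrid 8.0+ quoted earlier.

```python
# Toy sensitivity of loss cost to the New Madrid 8.0+ frequency assumption.
# The conditional (given-event) loss is an invented placeholder amount.
conditional_loss = 2_000_000_000              # portfolio loss if the event occurs

for return_time in (400, 700, 1100):          # years; range quoted earlier
    expected_annual_loss = conditional_loss / return_time
    print(f"return time {return_time:>5} yr -> "
          f"expected annual loss ${expected_annual_loss:,.0f}")
```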
Current Thoughts • Gutenberg-Richter vs. Characteristic Earthquake • Seismologists disagree on this • Gutenberg-Richter seems to apply for a region, maybe not a single fault • But, how big a region?
Current Thoughts • Simplified example • Characteristic earthquake model: a single magnitude 7.0 • Gutenberg-Richter: magnitudes ranging from 6.0 to 7.5
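A rough numerical comparison of the two hypotheses, constraining both to release the same long-run seismic moment. All figures here (the 250-year characteristic return time, the b-value of 1, the Hanks-Kanamori moment relation) are illustrative assumptions, not numbers from the talk.

```python
import numpy as np

# Characteristic earthquake vs. truncated Gutenberg-Richter, with both
# models releasing the same long-run seismic moment. Illustrative only.
def moment(m):
    # Hanks-Kanamori: log10(M0 in N*m) = 1.5*M + 9.1
    return 10 ** (1.5 * m + 9.1)

char_rate = 1.0 / 250                         # assumed: one M7.0 per 250 years
moment_budget = char_rate * moment(7.0)       # N*m per year

# Truncated G-R alternative: b = 1 over 0.1-magnitude bins from 6.0 to 7.4,
# scaled so total moment release matches the budget.
mags = np.round(np.linspace(6.0, 7.4, 15), 1)
rel = 10 ** (-1.0 * mags)
rates = rel * moment_budget / np.sum(rel * moment(mags))

print(f"G-R rate of M>=7.0: {rates[mags >= 7.0].sum():.5f}/yr "
      f"vs characteristic {char_rate:.5f}/yr")
print(f"G-R rate of M>=6.0: {rates.sum():.5f}/yr")
```

In this setup the G-R model spreads the moment across many moderate events plus occasional ones above 7.0, so the two hypotheses imply very different event frequencies for the same strain accumulation -- which is exactly why the choice matters for loss costs.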
Current Thoughts • “The Paradox of the Expected Time until the Next Earthquake,” by Sornette and Knopoff • Paper in the SSA Bulletin challenges conventional wisdom • The chance of a quake in an area may not increase with time since the last one • Earthquakes may cluster in time
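The underlying question is how the conditional probability of the next event behaves as time since the last one grows. The sketch below contrasts a simple renewal model (where the conditional probability rises with elapsed time) with a Poisson model (where it is constant); Sornette and Knopoff's point is that for clustered, heavy-tailed recurrence distributions it can even fall. The lognormal parameters here are invented for illustration.

```python
import math

# Conditional probability of an event in the next 30 years, given the time
# elapsed since the last one. Lognormal recurrence parameters are invented.
def lognormal_cdf(t: float, median: float = 150.0, sigma: float = 0.5) -> float:
    return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2.0))))

def conditional_prob(elapsed: float, horizon: float = 30.0) -> float:
    f0, f1 = lognormal_cdf(elapsed), lognormal_cdf(elapsed + horizon)
    return (f1 - f0) / (1.0 - f0)

for elapsed in (50, 100, 150, 200):
    print(f"{elapsed:>3} yr since last event: P(event in next 30 yr) = "
          f"{conditional_prob(elapsed):.3f}")

# Poisson (memoryless) model: the answer never depends on elapsed time.
print(f"Poisson: {1.0 - math.exp(-30.0 / 150.0):.3f}")
```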
Current Thoughts • The Enigma in the SCEC report - D.D.Jackson at EERI, 1998 (and earlier) • The SCEC estimates of probability are much higher than the historical record (150 yrs) suggests -- by about a factor of 2
Current Thoughts • The Enigma in the SCEC report - D.D.Jackson at EERI, 1998 (and earlier) • Why? • Non-earthquake creep - latest research => No • Lucky? - Maybe, but not too likely • Or …….
Current Thoughts • The Enigma in the SCEC report - D.D.Jackson at EERI, 1998 (and earlier) • Jackson suggests that earthquakes larger than the Ft. Tejon earthquake of 1857 are possible and necessary to use up the strain • Perhaps an 8.6 earthquake is possible every 1000 years (Richter)
Current Thoughts • The Enigma in the SCEC report - D.D.Jackson at EERI, 1998 (and earlier) • Good news? Maybe • If there is one huge event, we would then project significantly fewer 6’s and 7’s
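The "fewer 6's and 7's" point is a strain (moment) budget argument, and its scale is easy to check with the standard moment-magnitude relation; using that relation here is my assumption, as the talk does not show this calculation.

```python
# Rough moment comparison behind the "fewer 6's and 7's" argument.
# Hanks-Kanamori: log10(M0 in N*m) = 1.5*M + 9.1.
def moment(m: float) -> float:
    return 10 ** (1.5 * m + 9.1)

# One M8.6 releases the moment of roughly 250 M7.0 events, so spending the
# strain budget on rare great earthquakes leaves far less for moderate ones.
print(moment(8.6) / moment(7.0))   # ~251
```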
Current Thoughts • 1998 SSA Meeting • Researchers disputed ‘great earthquake’ theory • Historical record may be skewed
Conclusion • Earthquake frequency is key to model-driven rates, but carries much uncertainty • Scientific community has done much to help • Scientists are still not in agreement
Conclusion • Work will continue to progress, and estimates will change • We, as ratemakers, must understand assumptions in models … and the sensitivities.