Commercial Property Size of Loss Distributions Glenn Meyers Insurance Services Office, Inc. Casualty Actuaries in Reinsurance June 15, 2000 Boston, Massachusetts
Outline • Data • Classification Strategy • Amount of Insurance • Occupancy Class • Mixed Exponential Model • “Credibility” Considerations • Limited Classification Information • Program Demonstration • Goodness of Fit Tests • Comparison with Ludwig Tables
Separate Tables For • Commercial Property (AY 1991-95) • Sublines • BG1 (Fire and Lightning) • BG2 (Wind and Hail) • SCL (Special Causes of Loss) • Coverages • Building • Contents • Building + Contents • Building + Contents + Time Element
Exposures • Reported separately for building and contents losses • Model is based on combined building and contents exposure • Even if time element losses are covered
Classification Strategy • Amount of Insurance • Big buildings have larger losses • How much larger? • Occupancy Class Group • Determined by data availability • Not used: Construction Class, Protection Class
Potential Credibility Problems • Over 600,000 Occurrences • 59 AOI Groupings • 21 Occupancy Groups • The groups could be “grouped” but: • Boundary discontinuities • We have another approach
The Mixed Exponential Size of Loss Distribution • μi's vary by subline and coverage • wi's vary by AOI and occupancy group, in addition to subline and coverage
The Mixed Exponential Size of Loss Distribution • μi = mean of the ith exponential distribution • For the higher μi's, a higher severity class will tend to have higher wi's.
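The model formula itself is an image in the original deck; the mixed exponential CDF it refers to has the standard form

F(x) = \sum_{i} w_i \left( 1 - e^{-x/\mu_i} \right), \qquad w_i \ge 0, \quad \sum_{i} w_i = 1.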
The Fitting Strategy for each Subline/Coverage • Fit a single mixed exponential model to all occurrences • Choose the wi's and μi's that maximize the likelihood of the model. • Toss out the wi's but keep the μi's • The wi's will be determined by the AOI and the occupancy group.
Varying the Wi's by AOI Prior expectations • Larger AOIs will tend to have higher losses • In mixed exponential terminology, larger AOIs will tend to have higher wi's for the higher μi's. • How do we make this happen?
Solution • Let W1i’s be the weights for a given AOI. • Let W2i’s be the weights for a given higher AOI. • Given the W1i’s, determine the W2i’s as follows.
Step 1: Choose 0 ≤ d11 ≤ 1. Shifting weight from the 1st exponential to the 2nd exponential increases the expected claim cost.
Step 2: Choose 0 ≤ d12 ≤ 1. Shifting weight from the 2nd exponential to the 3rd exponential increases the expected claim cost.
Steps 3 and 4 are similar. Step 5: Choose 0 ≤ d15 ≤ 1. Shifting weight from the 5th exponential to the last exponential increases the expected claim cost.
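The formulas for the d-steps are images in the original deck. One plausible parameterization consistent with the description, offered here only as an assumption rather than as ISO's exact formula, is that step j moves a fraction (1 - d1j) of the weight currently on the jth exponential up to the (j+1)th exponential:

W_j \leftarrow d_{1j} W_j, \qquad W_{j+1} \leftarrow W_{j+1} + (1 - d_{1j}) W_j^{\text{old}}, \qquad j = 1, \dots, 5.

Because the means μi increase with i, each such step can only raise the expected claim cost, which enforces the prior expectation that larger AOIs have larger losses.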
Estimating W's (for the 1st AOI Group) and d's (for the rest) Let: • Fk(x) = CDF for the kth AOI group • (xh, xh+1] = the hth size-of-loss interval • nhk = number of occurrences in interval h for AOI group k Then the log-likelihood of the data is given by:
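The log-likelihood expression on this slide is an image in the original; for grouped occurrence counts it takes the standard form

\ell = \sum_{h} \sum_{k} n_{hk} \, \log \left[ F_k(x_{h+1}) - F_k(x_h) \right].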
Estimating W’s (for the 1st AOI Group) and d’s (for the rest) • Choose W’s and d’s to maximize log-likelihood • 59 AOI Groups • 5 parameters per AOI Group • 295 parameters! Too many!
Parameter Reduction • Fit W's for AOI = 1, and d's for AOI = 10, 100, 1,000, 10,000, 100,000 and 1,000,000 (note: AOI is coded in 1,000s) • The W's for other AOIs are obtained by linear interpolation on log(AOI) • The interpolated W's go into the log-likelihood function. • 35 parameters per occupancy group
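A minimal sketch of the log(AOI) interpolation step described above, in Python with NumPy. The anchor weight values are illustrative toy numbers, not ISO's fitted parameters, and only three exponentials are shown for brevity (the slides imply six).

import numpy as np

# Anchor AOIs (coded in 1,000s) at which weight vectors were fitted directly.
anchor_aoi = np.array([1, 10, 100, 1_000, 10_000, 100_000, 1_000_000])

# One weight vector per anchor AOI (each row sums to 1); toy values for illustration.
anchor_w = np.array([
    [0.70, 0.25, 0.05],
    [0.60, 0.30, 0.10],
    [0.50, 0.33, 0.17],
    [0.40, 0.35, 0.25],
    [0.30, 0.37, 0.33],
    [0.22, 0.38, 0.40],
    [0.15, 0.38, 0.47],
])

def weights_for_aoi(aoi):
    """Interpolate each weight w_i linearly on log(AOI), then renormalize."""
    x = np.log(anchor_aoi)
    w = np.array([np.interp(np.log(aoi), x, anchor_w[:, i])
                  for i in range(anchor_w.shape[1])])
    return w / w.sum()  # guard: keep the weights summing to 1

print(weights_for_aoi(250))  # weights for an AOI of 250 (i.e., $250,000)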
On to Occupancy Groups • Let W be a set of W's that is used for all AOI amounts for an occupancy group. • Let X be the occurrence size data for all AOI amounts for an occupancy group. • Let L[X|W] be the likelihood of X given W, i.e. the probability of X given W.
There's No Theorem Like Bayes' Theorem • Let W1, W2, …, Wn be n parameter sets. • Then, by Bayes' Theorem:
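The formula on this slide is an image in the original transcript; in the notation above, Bayes' Theorem gives

\Pr\{W_k \mid X\} = \frac{L[X \mid W_k] \, \Pr\{W_k\}}{\sum_{j=1}^{n} L[X \mid W_j] \, \Pr\{W_j\}}.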
Bayesian Results Applied to an AOI and Occupancy Group • Let wik be the ith weight that Wk assigns to the AOI/occupancy group. • Then the wi's for the AOI/occupancy group are:
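The missing formula is presumably the posterior-weighted average of the candidate weights (the symbol wik is introduced here for readability, not taken from the slide image):

w_i = \sum_{k=1}^{n} w_{ik} \, \Pr\{W_k \mid X\}.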
What Does Bayes' Theorem Give Us? • Before: • A time-consuming search for parameters • Credibility problems • If we can get suitable Wk's, we can reduce our search to n W's. • If we can assign prior Pr{Wk}'s, we can solve the credibility problem.
Finding Suitable Wk’s • Select three Occupancy Class Group “Groups” • For each “Group” • Fit W’s varying by AOI • Find W’s corresponding to scale change • Scale factors from 0.500 to 2.000 by 0.025 • 183 Wk’s for each Subline/Coverage
Prior Probabilities • Set: • Final formula becomes: • An updated prior can be based on Pr{Wk|X}.
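Both formulas on this slide are images in the original deck. A plausible reconstruction, assuming the prior is set uniform over the n candidate weight sets, is

\Pr\{W_k\} = \frac{1}{n}, \qquad w_i = \frac{\sum_{k} w_{ik} \, L[X \mid W_k]}{\sum_{j} L[X \mid W_j]}.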
The Classification Data Availability Problem • Focus on Reinsurance Treaties • Primary insurers report data in bulk to reinsurers • Property values in building size ranges • Some classification, state and deductible information • Reinsurers can use ISO demographic information to estimate the effect of unreported data.
Database Behind PSOLD 30,000+ records (for each coverage/line combination) containing: • Severity model parameters • Amount of insurance group • 59 AOI groups • Occupancy class group • State • Number of claims applicable to the record
Constructing a Size of Loss Distribution Consistent with Available Data Using ISO Demographic Data • Select relevant data • Selection criteria can include: • Occupancy Class Group(s) • Amount of Insurance Range(s) • State(s) • Supply premium for each selection • Each state has different occupancy/class demographics
Constructing a Size of Loss Distribution for a “Selection” • Record output: Layer Average Severity (LAS) • Combine all records in the selection: LASSelection = Wt Average(LASRecord), using each record's claim count as the weight
Constructing a Size of Loss Distribution for a “Selection” Where: • wi = ith overall weight parameter for the selection • wij = ith weight parameter for the jth record • Cj = claim weight for the jth record
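The weighted-average formula that the "Where:" clause refers to is presumably

w_i = \frac{\sum_j C_j \, w_{ij}}{\sum_j C_j},

i.e. the record weights are averaged using the records' claim counts, as stated on the previous slide.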
The Combined Size of Loss Distribution for Several “Selections” • Claim weights for a “selection” are proportional to Premium ÷ Claim Severity • LASCombined = Wt Average(LASSelection) • Using the “selection” total claim weights • The definition of a “selection” is flexible
The Combined Size of Loss Distribution for Several “Selections” • Calculate the wi's for each group for which you have pure premium information. • Calculate the average severity for the jth group.
The Combined Size of LossDistribution for Several “Selections” • Calculate the group claim weights • Calculate the weights for the treaty size of loss distribution
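A sketch of the calculation in the last three bullets, assuming Pj denotes the premium (or expected loss) supplied for the jth group, wij its ith model weight, and Sj its model average severity (the unlimited mean is shown for simplicity):

S_j = \sum_i w_{ij} \, \mu_i, \qquad C_j \propto \frac{P_j}{S_j}, \qquad w_i^{\text{treaty}} = \frac{\sum_j C_j \, w_{ij}}{\sum_j C_j}.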
The Deductible Problem • The above discussion dealt with ground up coverage. • Most property insurance is sold with a deductible • A lot of different deductibles • We need a size of loss distribution net of deductibles
Size of Loss Distributions Net of Deductibles • Remove losses below the deductible • Subtract the deductible from the loss amount [Chart: relative frequency by size of loss, before and after the deductible]
Size of Loss Distributions Net of Deductibles • Combine over all deductibles: LASCombined Post Deductible = Wt Average(LASSpecific Deductible) • Weights are the number of claims over each deductible.
Size of Loss Distributions Net of Deductibles • For an exponential distribution, the severity distribution net of a deductible is the same exponential (memoryless property) • So we need only adjust the frequency, i.e. the wi's
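In symbols, this is the memoryless property of the exponential distribution: for a deductible D,

\Pr\{X - D > x \mid X > D\} = \frac{e^{-(x+D)/\mu}}{e^{-D/\mu}} = e^{-x/\mu},

so the severity distribution above the deductible is the same exponential, and only the claim frequency changes.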
Adjusting the wi's • Dj = jth deductible amount • wij = weight for the ith exponential, adjusted for deductible Dj • Wi = combined weight for the ith exponential over all deductibles
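The adjustment formulas are images in the original deck. A plausible reconstruction, assuming mj denotes the number of claims exceeding deductible Dj (the weights named two slides back): the probability that a ground-up claim from the ith exponential exceeds Dj is proportional to wi e^{-Dj/μi}, so

w_{ij} = \frac{w_i \, e^{-D_j/\mu_i}}{\sum_k w_k \, e^{-D_j/\mu_k}}, \qquad W_i = \frac{\sum_j m_j \, w_{ij}}{\sum_j m_j}.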
Goodness of Fit - Summary • 16 Tables • Fits ranged from good to very good • Model LAS was not consistently over or under the empirical LAS for any table • Model unlimited average severity • Over empirical 8 times • Under empirical 8 times
A Major Departure from Traditional Property Size of Loss Tabulations • PSOLD tabulates losses in dollars, varying by amount of insured value • Traditionally, property size of loss distributions have been tabulated as a % of insured value.
Fitted Average Severity as % of Insured Value [Chart: fitted average severity as a % of insured value, by AOI; the next slide enlarges part of this area]
Fitted Average Severity as % of Insured Value Eventually, loss distributions based on a percentage of AOI will produce layer costs that are too high.
PSOLD Demonstration • No Information • Size of Building Information • Size + Class Information • Size + Class + Location Information