General Iteration Algorithms by Luyang Fu, Ph. D., State Auto Insurance Company Cheng-sheng Peter Wu, FCAS, ASA, MAAA, Deloitte Consulting LLP 2007 CAS Predictive Modeling Seminar Las Vegas, Oct 11-12, 2007
Agenda • History and Overview of Minimum Bias Method • General Iteration Algorithms (GIA) • Conclusions • Demonstration of a GIA Tool • Q&A
History on Minimum Bias • A technique with long history for actuaries: • Bailey and Simon (1960) • Bailey (1963) • Brown (1988) • Feldblum and Brosius (2002) • A topic in CAS Exam 9 • Concepts: • Derive multivariate class plan parameters by minimizing a specified “bias” function • Use an “iterative” method in finding the parameters
History on Minimum Bias • Various bias functions have been proposed for minimization over the years • Examples of multiplicative bias functions proposed in the past:
History on Minimum Bias • Then, how to determine the class plan parameters by minimizing the bias function? • One simple way is the commonly used "iterative" methodology for root finding: • Start with a random guess for the values of xi and yj • Calculate the next set of values for xi and yj using the root-finding formula for the bias function • Repeat the steps until the values converge • Easy to understand and can be programmed in almost any tool
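The iterative steps above can be sketched in code. This is a minimal illustration, not the presenters' tool: it fits a two-way multiplicative class plan r_ij ≈ x_i * y_j using Bailey's balanced update; function names and data are invented for the example.

```python
# Minimal sketch of the iterative minimum bias procedure for a
# two-way multiplicative class plan (r_ij ~ x_i * y_j), using the
# balanced bias function of Bailey (1963).
import numpy as np

def fit_multiplicative(r, w, tol=1e-10, max_iter=100):
    """r: observed relativities (I x J), w: exposures (I x J)."""
    I, J = r.shape
    x = np.ones(I)          # start with a flat guess for each factor
    y = np.ones(J)
    for _ in range(max_iter):
        # balanced updates: x_i = sum_j w*r / sum_j w*y, and symmetrically for y
        x_new = (w * r).sum(axis=1) / (w * y).sum(axis=1)
        y_new = (w * r).sum(axis=0) / (w.T @ x_new)
        if np.max(np.abs(x_new - x)) < tol and np.max(np.abs(y_new - y)) < tol:
            x, y = x_new, y_new
            break
        x, y = x_new, y_new
    return x, y

# Data generated from known factors is recovered (up to a scale split):
true_x, true_y = np.array([1.0, 1.5]), np.array([0.8, 1.2, 1.0])
r = np.outer(true_x, true_y)
w = np.ones_like(r)
x, y = fit_multiplicative(r, w)
assert np.allclose(np.outer(x, y), r)
```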
History on Minimum Bias • For example, using the balanced bias functions for the multiplicative model:
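The formula referenced on this slide did not survive transcription. A reconstruction of the balanced bias condition and the resulting updates, as they appear in the standard minimum bias literature (Bailey 1963), is:

```latex
% Balanced bias function for the multiplicative model r_{ij} \approx x_i y_j:
\sum_j w_{ij}\,(r_{ij} - x_i y_j) = 0 \quad \text{for each } i
% Solving for x_i (and symmetrically for y_j) gives the iterative updates:
x_i = \frac{\sum_j w_{ij}\, r_{ij}}{\sum_j w_{ij}\, y_j},
\qquad
y_j = \frac{\sum_i w_{ij}\, r_{ij}}{\sum_i w_{ij}\, x_i}
```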
History on Minimum Bias • Past minimum bias models with the iterative method:
Iteration Algorithm for Minimum Bias • Theoretically, the quantity minimized here is not "bias". • Bias is defined as the difference between the expected value of an estimator and the true value: bias = E(x̂) - x. If E(x̂) = x, then x̂ is an unbiased estimator of x. • To be consistent with statistical terminology, we name our approach the General Iteration Algorithm.
Issues with the Iterative Method • Two questions regarding the "iterative" method: • How do we know that it will converge? • How fast/efficiently will it converge? • Answers: • Numerical Analysis or Optimization textbooks • Mildenhall (1999) • Efficiency is a lesser issue given modern computing power
Other Issues with Minimum Bias • What is the statistical meaning behind these models? • More models to try? • Which models to choose?
Summary on Historical Minimum Bias • A numerical method, not a statistical approach • Gives "best" answers in the sense that the specified bias functions are minimized • Uses an "iterative" root-finding methodology to determine parameters • Easy to understand and can be programmed in many tools
Connection Between Minimum Bias and Statistical Models • Brown (1988) • Show that some minimum bias functions can be derived by maximizing the likelihood functions of corresponding distributions • Propose several more minimum bias models • Mildenhall (1999) • Prove that minimum bias models with linear bias functions are essentially the same as those from Generalized Linear Models (GLM) • Propose two more minimum bias models
Connection Between Minimum Bias and Statistical Models • Past minimum bias models and their corresponding statistical models
Statistical Models - GLM • Advantages include: • Commercial software and built-in procedures are available • Statistical characteristics, such as confidence intervals, are well determined • Computationally efficient compared to the iterative procedure
Statistical Models - GLM • Issues include: • Requires more advanced statistical knowledge • Lack of flexibility: • Reliance on commercial software / built-in procedures • Cannot fit mixed (multiplicative and additive) models • Assumes a pre-determined distribution from the exponential family • Limited distribution selections in popular statistical software • Difficult to program from scratch
Motivations for GIA • Can we unify all the past minimum bias models? • Can we completely represent the wide range of GLM and statistical models using Minimum Bias Models? • Can we expand the model selection options that go beyond all the currently used GLM and minimum bias models? • Can we fit mixed models or constraint models?
General Iteration Algorithm • Starting with the basic multiplicative formula • The alternative estimates of x and y: • The next question is: how to roll up x_{i,j} to x_i, and y_{j,i} to y_j?
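The formulas on this slide were lost in transcription. A reconstruction consistent with the notation used in the rest of the deck (r_{ij} observed relativity, w_{ij} exposure) is:

```latex
% Basic multiplicative model:
r_{ij} \approx x_i\, y_j
% Cell-level (alternative) estimates of each factor:
x_{i,j} = \frac{r_{ij}}{y_j}, \qquad y_{j,i} = \frac{r_{ij}}{x_i}
```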
Possible Weighting Functions • First and the obvious option - straight average to roll up • Using the straight average results in the Exponential model by Brown (1988)
Possible Weighting Functions • Another option is to use the relativity-adjusted exposure as the weight function • This is the Bailey (1963) model, or the Poisson model by Brown (1988).
Possible Weighting Functions • Another option: using the square of relativity-adjusted exposure • This is the normal model by Brown (1988).
Possible Weighting Functions • Another option: using relativity-square-adjusted exposure • This is the least-square model by Brown (1988).
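The four weighting options on the preceding slides can be written in one roll-up formula. This reconstruction assumes the weight notation u_{ij} (not used in the deck) for clarity:

```latex
% Rolling up the cell estimates x_{i,j} = r_{ij}/y_j with weights u_{ij}:
x_i = \frac{\sum_j u_{ij}\,\left(r_{ij}/y_j\right)}{\sum_j u_{ij}}
% Straight average (exponential model):                u_{ij} = 1
% Relativity-adjusted exposure (Bailey/Poisson):       u_{ij} = w_{ij}\, y_j
% Square of relativity-adjusted exposure (normal):     u_{ij} = (w_{ij}\, y_j)^2
% Relativity-square-adjusted exposure (least squares): u_{ij} = w_{ij}\, y_j^2
```

Note that u_{ij} = w_{ij} y_j reproduces the balanced update x_i = Σ_j w_{ij} r_{ij} / Σ_j w_{ij} y_j.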
General Iteration Algorithms • So, the key to generalization is to apply different "weighting functions" to roll up x_{i,j} to x_i and y_{j,i} to y_j • Propose a general weighting function of two factors, exposure and relativity: W^pX^q and W^pY^q • Almost all minimum bias models published to date are special cases of GMBM(p,q) • There are also more modeling options to choose from, since in theory there is no limitation on the (p,q) values to try in fitting data: comprehensive and flexible
2-parameter GIA • The 2-parameter GIA updates with the exposure- and relativity-adjusted weighting function are:
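A sketch of the 2-parameter update in code: roll up the cell estimates r_ij / y_j with the general weight w_ij^p * y_j^q. This is illustrative code under the deck's stated weighting scheme, not the authors' tool; function names are invented.

```python
# One GMBM(p, q) iteration for the multiplicative model r_ij ~ x_i * y_j.
import numpy as np

def gia_step(r, w, x, y, p, q):
    """Update x and y using the weight w^p * (other factor)^q."""
    u = (w ** p) * (y ** q)               # weights for rolling up x (I x J)
    x = (u * (r / y)).sum(axis=1) / u.sum(axis=1)
    v = (w.T ** p) * (x ** q)             # weights for rolling up y (J x I)
    y = (v * (r.T / x)).sum(axis=1) / v.sum(axis=1)
    return x, y

def fit_gia(r, w, p=1, q=1, n_iter=50):
    x, y = np.ones(r.shape[0]), np.ones(r.shape[1])
    for _ in range(n_iter):
        x, y = gia_step(r, w, x, y, p, q)
    return x, y

# With p=1, q=1 this reproduces the Bailey (1963) balanced updates:
true_x, true_y = np.array([1.0, 2.0]), np.array([0.5, 1.0, 1.5])
r = np.outer(true_x, true_y)
x, y = fit_gia(r, np.ones_like(r))
assert np.allclose(np.outer(x, y), r)
```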
2-parameter GIA and GLM • GMBM with p=1 is the same as a GLM with the variance function: • Additional special models: • 0&lt;q&lt;1: the distribution is Tweedie, for pure premium models • 1&lt;q&lt;2: not in the exponential family • -1&lt;q&lt;0: the distribution is between gamma and inverse Gaussian • After years of technical development in GLM and minimum bias, at the end of the day, all of these models are connected through the game of "weighted average".
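The variance-function formula dropped out of the transcript. A reconstruction consistent with the special cases listed on this slide (Tweedie for 0&lt;q&lt;1, gamma-to-inverse-Gaussian for -1&lt;q&lt;0) is:

```latex
% GMBM(p=1, q) corresponds to a GLM with variance function
V(\mu) = \mu^{\,2-q}
% q = 2: normal;  q = 1: Poisson;  q = 0: gamma;  q = -1: inverse Gaussian
% 0 < q < 1: Tweedie, variance power between 1 and 2 (pure premium)
% 1 < q < 2: variance power between 0 and 1, not an exponential family
```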
3-parameter GIA • One published model is not covered by the 2-parameter GMBM: the Chi-squared model by Bailey and Simon (1960) • Further generalization uses a concept similar to the link function in GLM: f(x) and f(y) • Estimate f(x) and f(y) through the iterative method • Calculate x and y by inverting f(x) and f(y)
3-parameter GIA • Propose 3-parameter GMBM by using the power link function f(x) = x^k
3-parameter GIA • When k=2, p=1 and q=1 • This is the Chi-squared model by Bailey and Simon (1960) • The underlying assumption of the Chi-squared model is that r^2 follows a Tweedie distribution with a variance function
Mixed GIA • For commonly used personal line rating structures, the formula is typically a mixed multiplicative and additive model: • Price = Base*(X + Y) * Z
Constraint GIA • In the real world, the values of most pricing factors are capped due to market and regulatory constraints
Numerical Methodology for GIA • For all algorithms: • Use the mean of the response variable as the base • Starting points: 1 for multiplicative factors; 0 for additive factors • Use the latest relativities in the iterations • All the reported GIAs converge within 8 steps for our test examples • For mixed models: • In each step, adjust the multiplicative factors of each rating variable proportionally so that its weighted average is one. • For the last multiplicative variable, adjust its factors so that the weighted average of the product of all multiplicative variables is one.
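The per-variable rescaling step described above can be sketched as follows. This is an illustration of the normalization idea only, with invented names and data: the scale removed from the factors is absorbed into the base rate, so fitted values are unchanged.

```python
# Rescale one rating variable's multiplicative factors so that their
# exposure-weighted average is 1, pushing the removed scale into the base.
import numpy as np

def normalize_factors(factors, exposure_by_level, base):
    """Return rescaled factors (weighted average 1) and the adjusted base."""
    avg = np.average(factors, weights=exposure_by_level)
    return factors / avg, base * avg

factors = np.array([0.9, 1.2, 1.5])
exposure = np.array([100.0, 50.0, 50.0])
new_factors, new_base = normalize_factors(factors, exposure, base=500.0)

# The weighted average of the rescaled factors is exactly 1,
# and base * factor is unchanged at every level.
assert np.isclose(np.average(new_factors, weights=exposure), 1.0)
assert np.allclose(new_base * new_factors, 500.0 * factors)
```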
Conclusions • 2- and 3-parameter GIA can completely represent GLM and minimum bias models • Can fit mixed models and models with constraints • Provides additional model options for data fitting • Easy to understand and does not require advanced statistical knowledge • Can be programmed in many different tools • Calculation efficiency is not an issue because of modern computing power.
Demonstration of a GIA Tool • Written in VB.NET and runs on Windows PCs • Approximately 200 hours for tool development • Efficiency statistics: