
Distinguishing the Forest from the Trees 2006 CAS Ratemaking Seminar



Presentation Transcript


  1. Distinguishing the Forest from the Trees 2006 CAS Ratemaking Seminar Richard Derrig, PhD, Opal Consulting www.opalconsulting.com Louise Francis, FCAS, MAAA Francis Analytics and Actuarial Data Mining, Inc. www.data-mines.com

  2. Data Mining • Data Mining, also known as Knowledge-Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. In order to achieve this, data mining uses computational techniques from statistics, machine learning and pattern recognition. • www.wikipedia.org

  3. The Problem: Nonlinear Functions

  4. An Insurance Nonlinear Function: Provider Bill vs. Probability of Independent Medical Exam

  5. Common Links for GLMs The identity link: h(Y) = Y The log link: h(Y) = ln(Y) The inverse link: h(Y) = 1/Y The logit link: h(Y) = ln(Y/(1 − Y)) The probit link: h(Y) = Φ⁻¹(Y), the inverse of the standard normal CDF
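The link functions on this slide can be written out explicitly. A minimal sketch in Python (the function names are ours, not from the presentation; `statistics.NormalDist.inv_cdf` supplies the probit):

```python
import math
from statistics import NormalDist

def identity_link(y):
    return y

def log_link(y):
    return math.log(y)

def inverse_link(y):
    return 1.0 / y

def logit_link(y):
    # maps a probability in (0, 1) onto the whole real line
    return math.log(y / (1.0 - y))

def probit_link(y):
    # inverse standard normal CDF
    return NormalDist().inv_cdf(y)
```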

  6. Nonlinear Example Data

  7. Desirable Features of a Data Mining Method: • Any nonlinear relationship can be approximated • A method that works when the form of the nonlinearity is unknown • The effect of interactions can be easily determined and incorporated into the model • The method generalizes well on out-of-sample data

  8. Decision Trees • In decision theory (for example, risk management), a decision tree is a graph of decisions and their possible consequences (including resource costs and risks), used to create a plan to reach a goal. Decision trees are constructed in order to help with making decisions. A decision tree is a special form of tree structure. • www.wikipedia.org
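Operationally, a fitted decision tree is just a cascade of nested if/else rules on the predictors. A toy illustration in the spirit of the fraud application (the variable names and thresholds are invented for illustration, not taken from the study):

```python
# Toy decision tree as nested rules. The thresholds (5000, 2) are
# hypothetical; a tree-fitting algorithm would learn them from data.
def ime_decision(provider_bill, prior_claims):
    """Decide whether to request an Independent Medical Exam (IME)."""
    if provider_bill > 5000:
        if prior_claims > 2:
            return "request IME"
        return "review manually"
    return "no IME"
```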

  9. Different Kinds of Decision Trees • Single Trees (CART, CHAID) • Ensemble Trees, a more recent development (TREENET, RANDOM FOREST) • A composite or weighted average of many trees (perhaps 100 or more) • There are many methods to fit the trees and prevent overfitting • Boosting: Iminer Ensemble and Treenet • Bagging: Random Forest
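Bagging, as used by Random Forest, can be sketched in a few lines: fit many trees on bootstrap resamples of the data and average their predictions. A minimal stdlib-only sketch using one-split "stumps" as the base trees (helper names are ours; real implementations grow deeper trees and randomize the candidate predictors as well):

```python
import random

def fit_stump(xs, ys):
    """Fit a one-split regression stump: choose the threshold on x that
    minimizes the squared error of left/right mean predictions."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml = sum(left) / len(left)
        mr = sum(right) / len(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    if best is None:  # every x identical in this sample: predict the mean
        m = sum(ys) / len(ys)
        return lambda x: m
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def bagged_predict(xs, ys, n_trees=100, seed=0):
    """Average the predictions of n_trees stumps, each fit on a
    bootstrap resample (sampling rows with replacement)."""
    rng = random.Random(seed)
    n = len(xs)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(s(x) for s in stumps) / n_trees
```

Averaging many trees smooths the single tree's step function and reduces its variance, which is the point of the ensemble methods compared in this study.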

  10. The Methods and Software Evaluated 1) TREENET 2) Iminer Tree 3) SPLUS Tree 4) CART 5) Iminer Ensemble 6) Random Forest 7) Naïve Bayes (Baseline) 8) Logistic (Baseline)

  11. CART Example of Parent and Children Nodes: Total Paid as a Function of Provider 2 Bill

  12. CART Example with Seven Nodes IME Proportion as a Function of Provider 2 Bill

  13. CART Example with Seven Step Functions: IME Proportions as a Function of Provider 2 Bill

  14. Ensemble Prediction of Total Paid

  15. Ensemble Prediction of IME Requested

  16. Naive Bayes • Naive Bayes assumes conditional independence of the predictors given the class • The probability that an observation will have a specific set of values for the independent variables is the product of the conditional probabilities of observing each of the values given category cj
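The conditional-independence product on this slide can be sketched directly: estimate the class priors and each per-feature conditional probability by counting, then multiply. A minimal sketch (our own helper, without the Laplace smoothing a production implementation would add):

```python
from collections import defaultdict

def train_naive_bayes(rows, labels):
    """rows: list of feature tuples; labels: class label per row.
    Returns a function mapping a feature tuple to posterior class
    probabilities via P(c) * prod_i P(feature_i = v_i | c)."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(int)   # (class, feature index, value) -> count
    for row, c in zip(rows, labels):
        class_counts[c] += 1
        for i, v in enumerate(row):
            feat_counts[(c, i, v)] += 1
    n = len(rows)

    def posterior(row):
        scores = {}
        for c, nc in class_counts.items():
            p = nc / n  # class prior
            for i, v in enumerate(row):
                # naive (conditional independence) assumption:
                p *= feat_counts[(c, i, v)] / nc
            scores[c] = p
        total = sum(scores.values())
        return {c: s / total for c, s in scores.items()} if total else scores

    return posterior
```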

  17. Bayes Predicted Probability IME Requested vs. Quintile of Provider 2 Bill

  18. Naïve Bayes Predicted IME vs. Provider 2 Bill

  19. The Fraud Surrogates used as Dependent Variables • Independent Medical Exam (IME) requested • Special Investigation Unit (SIU) referral • IME successful • SIU successful • DATA: Detailed Auto Injury Claim Database for Massachusetts • Accident Years (1995-1997)

  20. Results for IME Requested

  21. TREENET ROC Curve – IME, AUROC = 0.701
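AUROC (area under the ROC curve), the comparison metric used throughout these results, has a direct probabilistic reading: it is the chance that a randomly chosen positive case is scored above a randomly chosen negative one, with ties counting half. A small sketch of that pairwise computation (our own helper, not the presenters' code):

```python
def auroc(scores, labels):
    """AUROC by direct pairwise comparison: the fraction of
    (positive, negative) pairs where the positive scores higher,
    with ties counting 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

An AUROC of 0.5 is no better than chance; 1.0 is perfect ranking, so the 0.701 above reflects moderate discriminatory power.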

  22. Ranking of Methods/Software – 1st Two Surrogates

  23. Ranking of Methods/Software – Last Two Surrogates

  24. Plot of AUROC for SIU vs. IME Decision

  25. Plot of AUROC for SIU vs. IME Favorable

  26. References • Breiman, L., J. Friedman, R. Olshen, and C. Stone, Classification and Regression Trees, Chapman & Hall, 1993. • Friedman, J., Greedy Function Approximation: A Gradient Boosting Machine, Annals of Statistics, 2001. • Hastie, T., R. Tibshirani, and J. Friedman, The Elements of Statistical Learning, Springer, New York, 2001. • Kantardzic, M., Data Mining, John Wiley and Sons, 2003.
