
ACTUARIAL DATA MANAGEMENT (ADM) IN A HIGH-VOLUME TRANSACTIONAL PROCESSING ENVIRONMENT

Learn about the critical role of Actuarial Data Management (ADM) in optimizing performance and analytical processes for actuarial staff operating in high-volume transactional processing environments, along with key tools and modern roles in end-user computing.


Presentation Transcript


  1. ACTUARIAL DATA MANAGEMENT (ADM) IN A HIGH-VOLUME TRANSACTIONAL PROCESSING ENVIRONMENT Joe Strube and Bryant Russell GMAC Insurance, Southfield, Michigan

  2. Goal of the ADM Function • Equip the Actuarial Staff with the data resources necessary to excel in the performance of their functions • Not simply “get the actuaries data” • Add value to their analytical processes • C.A.T. Criteria • Complete (Collect, Consolidate, Derive) • Accurate (Clean dirty/distorted data) • Timely (Prioritize data resource deliveries) • Possibly assume responsibility for next stage
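As a rough illustration of the C.A.T. criteria, the sketch below runs simple completeness, accuracy, and timeliness checks on an extract before it is handed to the actuaries. The field names (policy_id, transaction_id, paid_loss, valuation_date) are hypothetical, and pandas is assumed.

    # Minimal sketch of C.A.T. (Complete, Accurate, Timely) checks on an extract.
    # All column names are hypothetical.
    import pandas as pd

    def cat_checks(extract: pd.DataFrame, as_of: pd.Timestamp) -> dict:
        return {
            # Complete: no missing keys or loss amounts
            "complete": extract[["policy_id", "paid_loss"]].notna().all().all(),
            # Accurate: no negative paid losses and no duplicated transaction keys
            "accurate": bool((extract["paid_loss"] >= 0).all()
                             and not extract["transaction_id"].duplicated().any()),
            # Timely: the extract is valued at least through the requested as-of date
            "timely": pd.to_datetime(extract["valuation_date"]).max() >= as_of,
        }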

  3. What Is A High-Volume Transactional Processing Environment (HVTPE)? • Frankly, It’s Arbitrary! • Online Transaction Processing System (OLTP) • Mainframe/Midrange Server Extract Files • Operational Data Store (ODS) • Data Warehouse (DW) • Data Mart (DM) • Desktop DB • Consider your most granular actuarial data resource • Are 1 million or more transactions added per month? • If YES, you’re operating in an HVTPE
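The "one million transactions added per month" test is easy to automate. A minimal sketch, assuming a pandas DataFrame of transactions with a posting-date column (the column name trans_date is hypothetical):

    import pandas as pd

    def is_hvtpe(transactions: pd.DataFrame, threshold: int = 1_000_000) -> bool:
        """Return True if any calendar month adds `threshold` or more transactions."""
        monthly_counts = (
            transactions
            .assign(month=pd.to_datetime(transactions["trans_date"]).dt.to_period("M"))
            .groupby("month")
            .size()
        )
        return bool((monthly_counts >= threshold).any())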

  4. ADM, An Outgrowth of End User Computing • Rockart & Flannery Research Study (1983) • Sloan School of Management at MIT • Interviewed 250 People • 3 Fortune 50 Manufacturers • 2 Major Insurance Companies • 3 Sizable Canadian Companies • Identified Six Types of End Users • Non-Programming End Users • Command Level Users • End User Programmers • Functional Support Personnel • End User Computing Support Personnel • DP Programmers

  5. The Square Root of 12,345,678,987,654,321 is 111,111,111

  6. As Technology Evolves, So Do Deliverables • Mainframe to Minicomputer to Microcomputer (PC) • Complex mainframe programs • Originally produced reports • Historical data files • Data downloads • Data Management Technology Branches Out • Data Warehousing • Data Marts • Online Analytical Processing (OLAP) • Extraction-Transformation-Loading Software (ETL) • Meta Data Repositories • Decision Support Systems • Data Profiling/Cleaning/Integration Software • Data Mining
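To make the ETL branch of that list concrete, here is a minimal extract-transform-load sketch. The source file, column names, and target table are hypothetical, and sqlite3 stands in for an industrial-strength warehouse platform.

    import sqlite3
    import pandas as pd

    # Extract: pull raw transactions from a flat-file extract (hypothetical path).
    raw = pd.read_csv("policy_transactions_extract.csv")

    # Transform: standardize types and derive an accident-year field.
    raw["accident_date"] = pd.to_datetime(raw["accident_date"])
    raw["accident_year"] = raw["accident_date"].dt.year

    # Load: append the cleaned rows into an analytical store.
    with sqlite3.connect("actuarial_mart.db") as conn:
        raw.to_sql("claim_transactions", conn, if_exists="append", index=False)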

  7. Modern Roles in End User Computing • Non-Programming End Users (Business Manager, Process Modeler, Trainer) • Command Level Users (HR Rep, Accountant, Claim Analyst, Market Analyst) • End User Programmers (Actuary, Financial Analyst, Strategic Planner) • Functional Support Personnel (Data Manager/Administrator, Actuarial Technician) • End User Computing Support Personnel (Help Desk, User Hotline, DSS Analyst, DW Support Team) • DP Programmers (a.k.a. Systems Analyst) (Internal/Outside Contractor, Technical Consultant)

  8. Key Roles for ADM in a HVTPE • The Actuary • The Actuarial Technician • Information Technology Dept. (IT)

  9. HVTPE and The Role of the Actuary • HVTPE offers the opportunity to work with detailed, granular data • Classification Analyses • GLM Analyses • Loss Distributions • Data Mining • Comparison of data sources across functional areas becomes possible • Policy Year vs. Accident Year vs. Calendar/Accident Year “slices” can be reconciled more easily • Common data source overcomes the “my data-your data” syndrome
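Because the transactions carry both policy effective dates and accident dates, the same source can be sliced by policy year or by accident year and the two views reconciled to the same paid-loss total. A sketch with hypothetical column names:

    import pandas as pd

    def year_slices(txns: pd.DataFrame) -> tuple[pd.Series, pd.Series]:
        """Aggregate paid losses by policy year and by accident year from one source."""
        policy_year = pd.to_datetime(txns["policy_effective_date"]).dt.year
        accident_year = pd.to_datetime(txns["accident_date"]).dt.year
        py_view = txns.groupby(policy_year)["paid_loss"].sum()
        ay_view = txns.groupby(accident_year)["paid_loss"].sum()
        # Both slices should reconcile to the same grand total.
        assert abs(py_view.sum() - ay_view.sum()) < 0.01
        return py_view, ay_view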

  10. The Role of the Actuary • HVTPE is inherently multidimensional • Transactional data is very granular, detailed • Multiple views can be aggregated from common source (transactional data) • Very useful for examining interactions between factors • As a “source”, HVTPE allows actuarial data repositories to be created, stored, and maintained over time • Actuary can select data based on actuarial value • Less reliance on non-actuarial reports
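Because the granular transactions are the common source, any two rating factors can be crossed on demand to examine interactions. A minimal two-way view, with hypothetical factor and measure names:

    import pandas as pd

    def two_way_view(txns: pd.DataFrame, row_factor: str, col_factor: str) -> pd.DataFrame:
        """Pivot transactional detail into a two-way loss view for interaction checks."""
        return txns.pivot_table(
            index=row_factor,      # e.g. "vehicle_class" (hypothetical)
            columns=col_factor,    # e.g. "contract_term" (hypothetical)
            values="incurred_loss",
            aggfunc="sum",
            fill_value=0.0,
        )

    # Example usage: two_way_view(txns, "vehicle_class", "contract_term")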

  11. Q: Why not let the actuary do it all? • Many actuaries are well-versed in database and analytical software • But data from the HVTPE is just the first step • Data extraction is the input to analyses • Actuarial work typically requires more than just a summary report of historical experience • The data extraction process can overwhelm traditional desktop tools • Over 1 million transactions per month • Even data storage can overwhelm the desktop resources available to the actuary
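One practical consequence: at a million-plus rows per month, a full extract rarely fits comfortably in a desktop tool, so ADM-side processing typically summarizes in chunks before anything reaches the actuary's workstation. A sketch with hypothetical file and column names:

    import pandas as pd

    # Stream a multi-million-row extract in chunks and keep only the summary,
    # so the desktop never has to hold the full transactional detail.
    totals = None
    for chunk in pd.read_csv("monthly_transactions.csv", chunksize=250_000):
        part = chunk.groupby(["coverage", "accident_year"])["paid_loss"].sum()
        totals = part if totals is None else totals.add(part, fill_value=0.0)

    if totals is not None:
        totals.to_csv("paid_loss_summary.csv")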

  12. Q: Why not let others (ADM) do it all? • Data specialization enables more robust process of gathering, storing, reporting data • Data skills are specialized • Software and hardware can be “industrial strength” • Monitoring, balancing, aggregating are important, but non-actuarial tasks • HVTPE is not a static environment • Changes to data definitions • Addition of new data elements • Addition of new data sources
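Balancing is one of those important but non-actuarial tasks: each load should tie back to the source system's control totals before the data is released. A minimal sketch, with illustrative totals:

    def balanced(source_control_total: float, loaded_total: float,
                 tolerance: float = 0.01) -> bool:
        """Confirm the loaded extract ties to the source system's control total."""
        return abs(source_control_total - loaded_total) <= tolerance

    # Example: refuse to publish the month's data if the load is out of balance.
    if not balanced(source_control_total=12_456_789.10, loaded_total=12_456_789.10):
        raise RuntimeError("Load out of balance with source control totals")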

  13. Q: Why not let others (ADM) do it all? • Value of actuarial data elements may be seen as secondary to other functional areas • Required level of detail for actuarial analysis is different • Historical retention periods are different • Specific data elements may be uniquely valuable to the actuary – other areas would not gather/maintain these. • Actuarial data needs are dynamic • Summary level varies by type of actuarial analysis • Variables included can range from few to many • Not realistic to try & build all possible aggregations ahead of time (OLAP tools notwithstanding)

  14. The Role of the Actuary • Identify the value of actuarial data • Critical Data Elements • Actuarially Valuable Data Elements • Nonessential Data Elements (for actuarial analysis) • Determine required level of detail • Granularity of data (e.g. at transactional level) • Historical time periods and retention periods • Definitions of derived data (e.g. books of business, classes, etc.) • Support the Value Proposition of Data Dictionary • What does the data mean? What are meaningful values? • Have definitions, coding, accuracy, completeness changed over time?
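The data dictionary the actuary helps support can be as simple as one structured record per data element, capturing meaning, valid values, actuarial classification, and any changes in definition over time. A minimal sketch; all names and values are illustrative:

    from dataclasses import dataclass, field

    @dataclass
    class DataElement:
        """One data-dictionary entry: what the element means and how it has changed."""
        name: str
        definition: str
        valid_values: list[str]
        actuarial_class: str                      # "critical", "valuable", or "nonessential"
        definition_changes: list[str] = field(default_factory=list)

    veh_class = DataElement(
        name="vehicle_class",
        definition="Vehicle classification used for service-contract rating (illustrative)",
        valid_values=["A", "B", "C", "D"],
        actuarial_class="critical",
        definition_changes=["Illustrative: class D split out of class C in a later period"],
    )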

  15. The Role of the Actuarial Technician • Data Facilitator • Building Inspector • Lawyer • Guinea Pig • Data Supplier • Fulcrum • Sculptor • Magician

  16. The Role of IT Management • Manage Infrastructure • Manage Corporate Projects

  17. Data Management Processes • Data Modeling, Metadata, Data Dictionary • Data Extraction, Profiling, Quality • Data Integration, Transformation • Data Loading • Not Data Retrieval or Reporting
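Data profiling, part of the extraction/quality stage above, can be sketched as a simple per-column survey of an extract (pandas assumed; the output columns are illustrative):

    import pandas as pd

    def profile(extract: pd.DataFrame) -> pd.DataFrame:
        """Per-column profile: type, distinct values, missing rate, and a sample value."""
        rows = []
        for col in extract.columns:
            non_null = extract[col].dropna()
            rows.append({
                "column": col,
                "dtype": str(extract[col].dtype),
                "distinct": extract[col].nunique(dropna=True),
                "pct_missing": round(extract[col].isna().mean() * 100, 2),
                "example": non_null.iloc[0] if len(non_null) else None,
            })
        return pd.DataFrame(rows)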

  18. There are 336 dimples on a golf ball.

  19. GMAC Insurance Case Study • Vehicle Service Contracts • Multiple period contracts – under 12 months to over 84 months • Multiple countries and business partners • Multiple data sources • Over 1 million transactions per month

  21. Q & A

  22. In Nawlins . . . It’s mandatory to jazz things up!
