Discover how Decision Support Systems can improve child support program performance using data mining, predictive analytics, and strategic planning. Learn about the synergy between compliance and performance through a comprehensive business intelligence approach.
Data, Data – Where is my Data?! Utilizing a Data Warehouse
Presenters:
• Jeff Cohen, Vermont IV-D Director
• Jeanette Savoy, Central Registry Supervisor, CDHS, CSE
• Keith Horton, Georgia IV-D Director
May 23, 2011, 1:30 – 3:00 p.m.
Decision Support Systems: Using Data to Improve Child Support Program Performance • Jeff Cohen, Vermont Office of Child Support
Agenda Decision Support for Performance • Strategic Value • Decision Support Components • Reporting • Dashboards • Data Mining • Predictive Analytics • OCSE model data warehouse
Performance Drivers: • Incentive Formula • Self Assessment • Strategic Plans • State Legislature • State Performance Audits • Public Expectations
Building Blocks for Excellence (diagram): Business Results, Customer Focus, Process Management, Strategic Plan, Human Resources, and Leadership, all supported by a foundation of Information and Analysis
Objective: Management by fact • ‘What gets measured gets done’
DSS Architecture (diagram): source systems (CS case, welfare, DOC, and other data sources) feed source data extraction; ETL (extract, transform, load) moves the data through a staging area into the data warehouse and data marts (metadata, summary, and detail); presentation/OLAP services then support end-user applications such as report writers, ad hoc query, data mining, and forecasting & projections
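The ETL flow at the heart of this architecture can be sketched in a few lines of Python. This is a minimal illustration under assumed names, not the OCSE model warehouse itself: the table, columns, and inlined records are hypothetical stand-ins for real source extracts.

```python
# Minimal ETL sketch: extract, transform, load into a warehouse store.
# All table/column names and sample values here are hypothetical.
import sqlite3

import pandas as pd

# Extract: in production these rows would come from the CS case,
# welfare, and DOC source systems; here they are inlined.
cases = pd.DataFrame({
    "case_id":      [101, 102, 103],
    "current_due":  [400.0, 250.0, 300.0],
    "current_paid": [400.0, 100.0, 0.0],
})

# Transform: standardize and derive the measures the marts will need.
cases["pct_current_paid"] = cases["current_paid"] / cases["current_due"]

# Load: write the detail table into the warehouse (in-memory here).
conn = sqlite3.connect(":memory:")
cases.to_sql("fact_collections", conn, index=False)

# Summary layer: the aggregates that feed reports and dashboards.
summary = pd.read_sql(
    "SELECT SUM(current_paid) * 100.0 / SUM(current_due) AS pct_paid "
    "FROM fact_collections", conn)
print(summary)  # pct_paid is about 52.6 for this sample data
```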
Example Star Schema (diagram): a central fact table (Current $, Arrears $, # of Cases) joined to four dimensions:
• Time Dimension: Fiscal Year, Quarter, Month, Day
• Demographic Dimension: Income, Gender, Location, Education
• Ownership Dimension: Region, County, Worker, Case #
• Case Status Dimension: IV-D Type, Status Type
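To make the schema concrete, here is one way it could be declared in SQLite; the demographic dimension is omitted for brevity, and every name and type below is illustrative rather than drawn from an actual state system.

```python
# Illustrative star schema: one fact table keyed to three dimensions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_time      (time_key INTEGER PRIMARY KEY,
                            fiscal_year INT, quarter INT, month INT, day INT);
CREATE TABLE dim_ownership (owner_key INTEGER PRIMARY KEY,
                            region TEXT, county TEXT, worker TEXT, case_num TEXT);
CREATE TABLE dim_status    (status_key INTEGER PRIMARY KEY,
                            ivd_type TEXT, status_type TEXT);
CREATE TABLE fact_support  (time_key   INTEGER REFERENCES dim_time(time_key),
                            owner_key  INTEGER REFERENCES dim_ownership(owner_key),
                            status_key INTEGER REFERENCES dim_status(status_key),
                            current_dollars REAL,
                            arrears_dollars REAL,
                            num_cases       INTEGER);
""")

# A typical star join: total arrears by region and fiscal year.
rows = conn.execute("""
    SELECT o.region, t.fiscal_year, SUM(f.arrears_dollars)
    FROM fact_support f
    JOIN dim_ownership o ON o.owner_key = f.owner_key
    JOIN dim_time      t ON t.time_key  = f.time_key
    GROUP BY o.region, t.fiscal_year
""").fetchall()
```

The payoff of the star layout is exactly this kind of query: any fact can be sliced by any combination of dimensions with simple joins and a GROUP BY.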
Some DSS Uses and Demos • Dashboard • Reports • Drilling • Data Mining • Predictive analytics • Linking
Demo: Statewide Strategic Plan – Parentage Establishment for Region and Worker (PDF)
Data Mining – The Problem, The Objective (illustrative scatter plot: Height vs. Weight)
Lift: Random List vs. Model-Ranked List (chart: cumulative % hits vs. % of cases reviewed) • 5% of a random list contains 5% of the targets, but 5% of the model-ranked list contains 21% of the targets • Lift at 5% of the list reviewed = 21% / 5%, i.e., 4.2 times better than random
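The lift figure on the chart is just a ratio, and it can be computed directly from a scored case list. A minimal sketch, assuming each case carries a model score and a 0/1 target label (both invented here):

```python
# Lift = (share of targets found in the top slice of the model-ranked
# list) / (share of cases reviewed). Scores and labels are made up.
def lift_at(scores, labels, fraction):
    ranked = sorted(zip(scores, labels), key=lambda p: p[0], reverse=True)
    k = max(1, int(len(ranked) * fraction))
    hits_in_top = sum(label for _, label in ranked[:k])
    return (hits_in_top / sum(labels)) / fraction

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.1, 0.05]
labels = [1,   1,   0,   0,   1,   0,   0,   0]
print(lift_at(scores, labels, 0.25))  # top 2 of 8 cases hold 2 of 3 targets: lift is about 2.67

# The slide's example is the same arithmetic:
print(0.21 / 0.05)  # 4.2 -> 4.2x better than a random list
```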
Colorado’s Experience: Business Intelligence Grant – Performance Dashboard • Presented by: Jeanette Savoy, Supervisor of the Colorado Central Registry
Purpose • Synergistic relationship between compliance and performance • Initiate system caseload analysis capability using business intelligence tools • Replace monthly exception-based reports • Improve individual and county caseload performance
Emphasis • Accurate representation of information • Clear understanding by CSE workers • Ability to drill down to case level to specify actions needed
Project Goal The CSe-Tools Performance Dashboard will give CSE staff the tools to view caseload health and identify actions to help improve caseload management and program effectiveness, as measured by the four key performance indicators.
Dashboard Design Principles • Keep it simple • Provide information quickly and clearly • Minimize distractions and unnecessary embellishments that can create confusion • Maintain consistency with the design to ensure accurate interpretation
Project Development • Business Intelligence Workgroup • County, State and Federal representation • Review proposed solutions • Provide input on specific functionality • Elicit support, participation and cooperation • Project Development Team • Small group of programmers • Development of both data warehouse and performance dashboard
Data Warehouse (Closet) • Provides appropriate information for the dashboard without overloading the main production database • A full “warehouse” was cost-prohibitive • The initial “closet” will be expanded in incremental steps
CSe-Tools • Browser-based application toolkit • Front-end application for statewide system • Interfaces with statewide system using web services and file transfers • Search and reporting capabilities • Drill down capabilities to case and financial detail information from the statewide system
Performance Dashboard • Prominently displayed in the middle of the CSe-Tools homepage • Initial and immediate portrayal of caseload health on a single screen • Visual display of prioritized information • Ability to drill down to a list of cases that require action
Performance Dashboard (cont.) • Specific to the following CSE Key Performance Indicators (KPI) • Paternity Establishment Percentage • Percent of Cases with Support Orders • Percent of Current Support Paid • Percent of Arrears Cases with a Payment KPI = quantifiable measurement that reflects an organization’s critical success factors
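Each KPI is a straightforward ratio. Here is a hedged sketch of the computations following the standard federal definitions; all counts and dollar totals below are invented for illustration.

```python
# The four CSE key performance indicators as ratios; every input is a
# made-up example value, not real caseload data.
def pct(numerator, denominator):
    """Percentage, guarding against a zero denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Hypothetical statewide counts and dollar totals:
children_with_paternity   = 9_000
children_out_of_wedlock   = 10_000
cases_with_support_orders = 7_500
total_ivd_cases           = 10_000
current_support_collected = 4_200_000.0
current_support_owed      = 7_000_000.0
arrears_cases_paying      = 3_600
arrears_cases             = 6_000

kpis = {
    "Paternity Establishment Percentage":      pct(children_with_paternity, children_out_of_wedlock),
    "Percent of Cases with Support Orders":    pct(cases_with_support_orders, total_ivd_cases),
    "Percent of Current Support Paid":         pct(current_support_collected, current_support_owed),
    "Percent of Arrears Cases with a Payment": pct(arrears_cases_paying, arrears_cases),
}
for name, value in kpis.items():
    print(f"{name}: {value:.1f}%")
```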
Work Lists • Based on logic of KPIs • Redesigned after implementation based on feedback from grant participants • Use of “tags” (colored dots) to identify a set of criteria indicating the type of action that may be needed
“Tags” • Green – NCP employer is verified; wage withholding is active • Blue – NCP employer is verified; wage withholding is inactive • Black – NCP is in the Department of Corrections • Peach – NCP employer is not verified; reciprocal case is not initiating; NCP address is verified
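One way to picture how these tags could be assigned in code is sketched below; the case field names, and the precedence when several criteria match, are assumptions for illustration, not Colorado's actual rules.

```python
# Sketch of work-list tag assignment; field names and rule precedence
# are hypothetical, not the CSe-Tools implementation.
def assign_tag(case: dict) -> str:
    if case.get("ncp_in_doc"):
        return "black"  # NCP is in the Department of Corrections
    if case.get("employer_verified"):
        if case.get("wage_withholding_active"):
            return "green"  # employer verified, withholding active
        return "blue"       # employer verified, withholding inactive
    if (not case.get("initiating_reciprocal")
            and case.get("ncp_address_verified")):
        return "peach"  # employer unverified, address verified,
                        # reciprocal case is not initiating
    return "untagged"

print(assign_tag({"employer_verified": True,
                  "wage_withholding_active": False}))  # -> blue
```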
KPI: Percent of Current Support Paid • The gold area shows the entire measure amount (i.e., total current support owed for the worker's YTD caseload) • The vertical bar shows the goal (i.e., the MSO goal for the YTD caseload) • The parallel bar shows progress toward the goal (i.e., total MSO paid for the YTD caseload) • Hover over each element to see the numerical amounts
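The gauge described above is essentially a bullet graph, which is easy to mock up. The sketch below uses matplotlib with invented dollar amounts; the hover-for-detail behavior of the real dashboard is omitted.

```python
# Bullet-graph mock-up of the "Percent of Current Support Paid" gauge;
# all dollar figures are invented for illustration.
import matplotlib.pyplot as plt

owed, goal, paid = 1_000_000, 650_000, 540_000  # hypothetical YTD amounts

fig, ax = plt.subplots(figsize=(6, 1.6))
ax.barh(0, owed, height=0.6, color="gold")        # entire measure amount
ax.barh(0, paid, height=0.25, color="steelblue")  # progress toward the goal
ax.axvline(goal, color="black", linewidth=2)      # the goal marker
ax.set_yticks([])
ax.set_xlabel("Current support ($, YTD)")
ax.set_title("Percent of Current Support Paid")
plt.tight_layout()
plt.show()
```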
Evaluation • Data analysis for • Percent of Current Support Paid • Percent of Arrears Cases with a Payment • Post-implementation surveys and interviews
Evaluation (cont.) • Statistical findings were not valid, due to: • Low number of demonstration participants • Short time period (17 months) for the grant • Inability to develop assumptions and findings representative of the State • Inability to isolate the impact of individual variables
Evaluation (cont.) • Post-implementation surveys and interviews provided a wealth of information • Lessons learned will help ensure a successful rollout of the Performance Dashboard in Colorado • Valuable information for other states interested in implementing a performance dashboard
Lessons Learned • Training: Key to success • Two-fold • Functionality of the dashboard, especially if new technology is involved • How to use the dashboard to manage a caseload • Define clear expectations • Replacement vs. supplemental tool • Resistance to change
Lessons Learned (cont.) • Value depends on caseload size • Less value for workers with smaller caseloads or from smaller counties • Ten large counties account for 80% of the State's caseload, making the dashboard very valuable to Colorado
Lessons Learned (cont.) • Support must come from the TOP down • Real-time interface is critical • More information is not always better • Ability to create personalized work lists • Identify cases reported on multiple work lists
Lessons Learned (cont.) • Functionality to record notes on work lists minimizes duplicated research and allows continuous analysis at a case level • Matrix of appropriate actions for each work list / tag is helpful for less experienced workers
Finale • Final grant report submitted September 30, 2010 • Statewide rollout to commence July 2011