
Using Data Analytics to Support Decision-Making

Learn how big data and data analytics can be used to support decision-making in the public sector. Discover the power of modern computing to unlock insights and transform data into usable information. Explore a real-world example of using big data to address the perception of excessive overhead spending.


Presentation Transcript


  1. Using Data Analytics to Support Decision-Making May 22, 2015

  2. Agenda • What is ‘big data’?

  3. Big Data … • The first documented use of the term “big data” was by NASA scientists in 1997 to describe a problem they had with visualization (i.e. computer graphics) which “provided an interesting challenge for computer systems: data sets are generally quite large, taxing the capacities of main memory, local disk, and even remote disk” • Wikipedia defined big data (before the Oxford English Dictionary did) as “an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process using on-hand data management tools or traditional data processing applications.” • A 2011 big data study by McKinsey defined big data (#3 in Press’s list) as “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze” • Source: Gil Press, “12 Big Data Definitions: What’s Yours?”, Forbes, September 3, 2014

  4. Computing Power … • It is said that computing power doubles every 18–24 months • Americans went to the moon in 1969 • Since then, computing power has doubled 23 to 31 times, an increase by a factor of between roughly 8.4 million and 2,147 million (assuming a base of 2)
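
A quick back-of-the-envelope check of that arithmetic (the only inputs are the 18- and 24-month doubling periods and the 1969–2015 span from the slide):

```python
# Back-of-the-envelope check of the "doubling since 1969" claim.
months = (2015 - 1969) * 12              # ~552 months between Apollo 11 and this talk

doublings_slow = round(months / 24)      # doubling every 24 months -> 23 doublings
doublings_fast = round(months / 18)      # doubling every 18 months -> 31 doublings

print(f"{2 ** doublings_slow:,}")        # 8,388,608 (~8.4M-fold increase)
print(f"{2 ** doublings_fast:,}")        # 2,147,483,648 (~2,147M-fold increase)
```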

  5. Our Data Sources • Within the public sector (all levels), we have a myriad of data sources • Financial systems • Human resource systems • Client service statistics (various) • Property related data • Asset utilization data (cars, etc.) • For the most part these systems do not talk to each other

  6. So How Does This Apply … • Not here to sell you a business intelligence solution • My assumption for the remainder of the presentation is that you can assemble the data you need to perform your analysis • Most of the time you do not actually need to invest in a complex solution, because not all the data is meaningful to all your decision-makers • Decision-makers at different levels have different information needs

  7. Data Analytics: Science or Art • DATA: facts that can be analyzed or used to make decisions • INFORMATION: the result of applying data processing techniques to data to give it meaning • KNOWLEDGE: human faculty resulting from interpreted information; understanding that germinates from a combination of data, information, experience and individual interpretation; it is repeatable • INTELLIGENCE: the ability to apply knowledge to manipulate one’s environment; using questions to focus effort and establish parameters for predictive modeling

  8. Data Analytics is both Science and Art • Knowledge approach (science): • Mine the data (typically using software) to see if there are previously unknown relationships / information buried in the data which would be useful to decision-makers • Focus is on understanding the past • Intelligence approach (blend art with science): • Determine what information is needed to support decision-makers • Mine the data to answer these specific questions • Use past information to predict possible future outcomes

  9. A Current Example … • Problem Statement • There is a perception that too much is spent on overhead (internal services) • How would you use ‘Big Data’, modern computing power and the previous discussion to address this perception?

  10. Using the Intelligence Approach … • What question are we answering for decision-makers? • What is a reasonable level of overhead spending for a department? • What sources of data do we have to transform into information? • Departmental Financial Information • Central Financial Reporting and Management System • Financial Statements • Public Accounts

  11. Data Transformed to Information • We selected the Public Accounts of Canada as our data • Accessible and common reporting structure • No customization in data presentation • What data processes did we use to convert the data? • Cost accounting techniques were applied • Direct and Indirect cost categorizations • Organizations analyzed over four years • Categorize like-functions (Method of Intervention)

  12. Converting Information to Knowledge • Departments are not directly comparable • Previous clustering exercise to identify like-organizations • Developed own clustering methodology • Resourcing Strategy • Consolidation of all studies related to internal services • Allocation of costs to the lowest level of the Program Activity Architecture (PAA) • Internal Service assessment • Development of a model for the assessment of overhead, including the application of stepped-variable cost theory

  13. Interesting Tidbits of Knowledge Gained • Cost driver for overhead • 88% of overhead costs can be linked to full-time equivalents (FTE) • Overhead costs are mostly some form of labour • 86% of overhead costs are capacity related • 66% are Personnel • 20% are Professional Services • Fully loaded FTE costs • Average FTE costs $34K/FTE • G&C Average FTE costs $43K/FTE
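
A finding like “88% of overhead costs can be linked to FTEs” is essentially a regression result. The sketch below shows one way to reproduce that style of analysis with an ordinary least-squares fit; the department figures are invented placeholders, not the Public Accounts data used in the study:

```python
import numpy as np

# Hypothetical per-department figures: overhead spending ($M) and FTE counts.
# These are placeholders for illustration, not the study's data.
overhead_m = np.array([21.0, 64.0, 115.0, 9.0, 48.0, 182.0])
ftes       = np.array([600,  1800, 3500,  250, 1500, 5200])

# Ordinary least-squares fit: overhead ~ a * FTE + b
a, b = np.polyfit(ftes, overhead_m, deg=1)

# R^2 indicates how much of the variation in overhead the FTE count explains.
predicted = a * ftes + b
ss_res = np.sum((overhead_m - predicted) ** 2)
ss_tot = np.sum((overhead_m - overhead_m.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"overhead ≈ ${a * 1e3:.0f}K per FTE + ${b:.1f}M fixed (R² = {r_squared:.2f})")
```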

  14. From Knowledge to Intelligence • Allows for comparison based on categorization of organizations through self-identification metrics (MOI, department type) and scale • Will allow for: • Establishing baselines for costs of • Programs • Internal services • Identifying Most Efficient Organizations (MEO) and exploring what makes them so effective and/or efficient

  15. Clustering of Departments • [Chart: departments (DND, CSC, RCMP, ESDC, CFIA, PS, AAFC, DFO, ACOA, WED, QED, LAC, CBSA, CIC, EC, DFAIT, NRC, CRA, PCO, AANDC) plotted by complexity (High / Medium / Low) against net total expenditures, grouped into five clusters: < $300M, $300M – $900M, $900M – $2B, $2B – $4B, and > $4B]
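
A minimal sketch of the scale dimension of that clustering, assuming net total expenditures is the only grouping variable (the real exercise also considered complexity, and the department figures below are placeholders):

```python
# Assign departments to expenditure bands mirroring the slide's five clusters.
# The mapping of cluster labels to bands and the spending figures are assumed.
BANDS = [
    (300e6,        "Cluster A (< $300M)"),
    (900e6,        "Cluster B ($300M - $900M)"),
    (2e9,          "Cluster C ($900M - $2B)"),
    (4e9,          "Cluster D ($2B - $4B)"),
    (float("inf"), "Cluster E (> $4B)"),
]

def cluster(net_total_expenditures: float) -> str:
    """Return the expenditure cluster for a department."""
    for upper_bound, label in BANDS:
        if net_total_expenditures < upper_bound:
            return label
    return BANDS[-1][1]

for name, spend in {"Dept X": 250e6, "Dept Y": 1.4e9, "Dept Z": 6.2e9}.items():
    print(name, "->", cluster(spend))
```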

  16. Manipulating our Environment • [Chart, for illustration purposes only: GC internal services (IS) cost performance in relation to other governments; effectiveness is measured by departmental IS MAF assessment results, and efficiency by departmental IS cost ratios in relation to the median cluster cost ratios (“capacity”); together these provide a reasonable proxy for a department’s IS cost performance, both blended and for each IS line]

  17. Way Ahead … Cost Factors Manual • Develop a Cost Factors Manual for the Government of Canada • Method of Intervention → $/FTE in internal services • Method of Intervention costs by type and scale of department • Department profile for overhead efficiency • Use existing measures developed by EMS • Comparing overhead costs within clusters • Ultimately, answer the question: are the overhead costs incurred by an organization reasonable?
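
The “comparing overhead costs within clusters” idea reduces to a simple ratio test against a cluster benchmark. A hedged sketch, assuming internal-services cost per FTE is the chosen metric and the cluster median is the benchmark (all values are placeholders):

```python
from statistics import median

# Hypothetical internal-services cost per FTE ($K) for departments in one cluster.
cluster_is_cost_per_fte = {
    "Dept A": 31.0, "Dept B": 36.5, "Dept C": 29.0, "Dept D": 41.2, "Dept E": 33.8,
}

benchmark = median(cluster_is_cost_per_fte.values())

for dept, cost in cluster_is_cost_per_fte.items():
    ratio = cost / benchmark
    flag = "above the cluster median" if ratio > 1 else "at or below the cluster median"
    print(f"{dept}: {ratio:.0%} of median ({flag})")
```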

  18. Leverage Intelligence in Cost Estimating • Predictive Costing: • A managerial decision-making process which considers uncertainty in identifying estimated costs; statistical methods are used to incorporate uncertainty in the estimated cost • The evidence gathered through this process could feed the inputs to statistical models of costs at the entity-of-government level • Evidence-based modeling to strengthen cost estimates provided to decision-makers for investment or pricing decisions

  19. Risk: Using Information as Intelligence • A lack of contextual information when decisions are made hinders decision-makers’ ability to understand the financial risks associated with a project: • Costing, by its very nature, is based on estimates. Estimates are subject to change based on a number of factors including: scope, framing assumptions, options analyzed, fluctuations in key inputs, and implementation schedule • In providing a ‘single number’ to decision-makers, sufficient contextual information for informed decision-making is often not provided • For projects that extend over several years, decision-makers and Canadians are surprised when actual costs vary significantly from earlier estimates

  20. The Information Level & Cost Estimating • According to a joint McKinsey & Co. / University of Oxford study, IT projects with budgets of more than $15M on average: • Run 45% over budget • Run 7% behind schedule • Deliver 56% less functionality than planned • The Government Accountability Office of the United States has identified: • Our conclusion was that without realism and objectivity, bias and over-optimism creep into estimates … and the estimates tend to be too low • The ability to generate reliable cost estimates is a critical function. Without this ability, agencies are at risk of experiencing cost overruns, missed deadlines and performance shortfalls. Furthermore, cost increases often mean the government cannot fund as many programs as promised or deliver them as promised

  21. A Number Versus a Range • Traditional cost estimating provides a single value (points A to D) • Sensitivity analysis is the practice of using what-if scenarios (usually worst, most likely and best cases) to represent financial risk • Cost estimating through sensitivity analysis generates a range of possible outcomes within an upper and lower cost estimate boundary • Over time, the cost estimate range tightens and more closely represents a single value • However, it is only at project close-out that a point estimate is appropriate, because uncertainty has been eliminated • [Chart: cost ($) over time, with upper and lower cost estimate boundaries converging toward a single value through decision points A to D]
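
For contrast with the statistical approach on the next slides, a what-if sensitivity analysis can be as simple as re-pricing the point estimate under best, most-likely and worst assumptions. The adjustment factors below are illustrative; they roughly echo the ratios that appear later on slide 28:

```python
# Three-scenario (what-if) sensitivity analysis around a point estimate.
# The scenario factors are illustrative assumptions.
point_estimate = 530_000

scenarios = {
    "best case":        0.93,   # e.g. favourable rates and less configuration effort
    "most likely case": 0.99,
    "worst case":       1.34,   # e.g. schedule slip drives up salaries and config days
}

for name, factor in scenarios.items():
    print(f"{name}: ${point_estimate * factor:,.0f} ({factor:.0%} of base case)")
```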

  22. Using Statistical Modeling to Determine the Range • Statistical modeling is a scientific approach to estimating that eliminates the potential bias of what-if scenarios • In statistically modeled cost estimates, simulations are run in which random numbers are generated, based on data, to estimate the expected cost of the project; these simulations are collated and analyzed as a statistical distribution • Models provide a mathematical basis to determine the project’s sensitivity to risk and assess the reasonableness of potential costs over thousands of individual assessments • [Chart: cost ($) over time with upper and lower cost estimate boundaries; the statistical model of a cost estimate at a decision point is shown as a dotted line]

  23. An Example to Illustrate the Different Approaches to Cost Estimating • Acquisition of a COTS IT solution • Initial acquisition price: $100K • Licenses: $0.5K/user (200 users) • Implementation Costs • Configuration: $1K/day (50 days) • Salaries: 3 staff for $250K • Training development: $20K • Training delivery: $1K/day (10 sessions) • Overhead can be absorbed within existing levels
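
Summing the listed items directly gives the deterministic figure used as the base case on the following slides (roughly $530K; no assumptions beyond the amounts on this slide):

```python
# Point (deterministic) estimate: simply sum the listed cost elements.
costs = {
    "acquisition":          100_000,
    "licences":             0.5e3 * 200,   # $0.5K per user x 200 users
    "configuration":        1e3 * 50,      # $1K per day x 50 days
    "salaries":             250_000,       # 3 staff
    "training development": 20_000,
    "training delivery":    1e3 * 10,      # $1K per session x 10 sessions
}

total = sum(costs.values())
print(f"Point estimate: ${total:,.0f}")    # ~$530,000
```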

  24. As a Typical Cost (Point) Estimate …

  25. As a Cost Estimate Range …

  26. As a Statistical Model …
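
A minimal sketch of what “as a statistical model” could look like for this example: each uncertain input is given a probability distribution and the total cost is simulated many times. The distributions and spreads below are assumptions for illustration only (they are not taken from the presentation’s Annex A, so the resulting percentiles will not match the figures on the next slide exactly):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000                                           # number of simulated outcomes

# Uncertain inputs, each with an assumed (illustrative) distribution.
config_days  = rng.triangular(45, 50, 70, n)          # most likely 50 days, could slip
salaries     = rng.normal(250_000, 15_000, n)         # 3 staff, some rate uncertainty
training_dev = rng.triangular(18_000, 20_000, 30_000, n)
users        = rng.integers(190, 231, n)              # licence uptake may vary

total = (
    100_000                      # acquisition price (treated as firm)
    + 500 * users                # licences at $0.5K per user
    + 1_000 * config_days        # configuration at $1K per day
    + salaries
    + training_dev
    + 1_000 * 10                 # 10 training sessions at $1K each
)

print(f"66th percentile: ${np.percentile(total, 66):,.0f}")
print(f"95th percentile: ${np.percentile(total, 95):,.0f}")
print(f"P(total <= $530K base case): {np.mean(total <= 530_000):.1%}")
```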

  27. Modelled Cost Estimate Results • [Chart: simulated cost distribution; 95% certainty = $585,365; 66% certainty = $559,630; certainty of remaining at or below the $530,000 base case maximum = 11.81%]

  28. A Comparison of the Methodologies • Typical cost estimate (Base Case (BC)): $530,450 • Sensitivity analysis (3 what-if scenarios): • Best Case: $492,999 (93% of BC) • Most Likely: $524,270 (99% of BC) • Worst Case: $707,816 (134% of BC) • Statistically modeled (1M simulations using random numbers within probability distributions, Annex A): • Best Case: $530,000, very unlikely (100% of BC) • Most Likely: less than $559,630 (106% of BC) • Worst Case: less than $585,365 (110% of BC)

  29. Key Conclusions of a Statistical Model Versus a Traditional Cost Estimate • Risk-intolerant cost estimate: $585K • Potential financial risk versus base case: $55K • The 95% confidence interval represents the expected costs should all the identified and modelled risks materialize; there is only a 5% chance that the actual costs could exceed the cost estimate • More risk-tolerant cost estimate: $560K • Potential financial risk versus base case: $30K • The 66% confidence interval represents that there is only a one-in-three chance that the actual costs could be higher than the cost estimate • Chance of exceeding the base case (very high): 88% • There is an 88% chance that the actual costs will be more than the base case cost estimate of $530K

  30. Statistical Modeling Can Identify and Prioritize the Most Significant Risks • Models can be analyzed to determine which risk functions have the most impact on the spread of the cost estimate • Mathematical techniques are used to determine the correlation between cost elements and their impact on the estimate • This analysis will identify the cost input factors with the largest variability, or those which cause cascading impacts • This allows management to focus on the areas of most significance in decision-making
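
One common “mathematical technique” for this ranking is to correlate each simulated input with the simulated total, which is the basis of a tornado-style chart. The sketch below reuses the kind of illustrative samples from the earlier Monte Carlo sketch (the inputs and distributions are assumptions, not the presentation’s model):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n = 50_000

# Simulated uncertain cost elements ($), with illustrative distributions.
inputs = {
    "configuration": rng.triangular(45, 50, 70, n) * 1_000,
    "salaries":      rng.normal(250_000, 15_000, n),
    "training dev":  rng.triangular(18_000, 20_000, 30_000, n),
    "licences":      rng.integers(190, 231, n) * 500,
}
total = sum(inputs.values()) + 100_000 + 10_000      # plus the firm elements

# Rank inputs by their correlation with total cost: the strongest correlations
# mark the drivers of spread where management attention matters most.
ranked = sorted(
    inputs.items(),
    key=lambda kv: abs(np.corrcoef(kv[1], total)[0, 1]),
    reverse=True,
)
for name, samples in ranked:
    print(f"{name}: r = {np.corrcoef(samples, total)[0, 1]:+.2f}")
```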

  31. The New Cost Estimating Conversation • Statistical modelling supports a ‘new’ conversation with respect to cost estimating: • Decision-makers are given an estimated cost range • Sensitivity of the cost estimate to risk is disclosed • Key variables impacting risk can be identified and nuanced • However, this approach requires a shift in the paradigm of how decisions are made: • Telling the ‘story’ of a project’s costs requires more context • For complex proposals the funding approach may need to change • Significant training and change management would be required

  32. Funding and Pricing are Different but Related to Cost Estimating • Funding, pricing and cost estimating are linked but different • Cost estimating is the basis of the analysis, but funding and pricing can diverge from the estimate for a variety of reasons • Funding and pricing should be based on accepted practices: • Full Cost – all costs are included in the calculation • Incremental Cost – only related costs are included • Stepped-Variable Cost – only new capacity is included
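
A toy illustration of how the three bases can yield very different figures for the same service; the cost breakdown and the capacity step below are invented for illustration only:

```python
# Toy comparison of full, incremental and stepped-variable costing for one service.
# All figures are invented placeholders.
existing_overhead   = 400_000  # corporate overhead that exists with or without this service
direct_costs        = 250_000  # staff and materials consumed directly by the service
new_capacity_needed =  60_000  # extra capacity (e.g. one more analyst) the demand triggers

full_cost        = direct_costs + new_capacity_needed + existing_overhead
incremental_cost = direct_costs + new_capacity_needed
stepped_variable = new_capacity_needed                 # only the new capacity step

print(f"Full cost:             ${full_cost:,}")
print(f"Incremental cost:      ${incremental_cost:,}")
print(f"Stepped-variable cost: ${stepped_variable:,}")
```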

  33. In Practice … • Full Cost • People who do not pay taxes, or where the service is considered to be beyond the services covered by taxes paid • External clients when not covered by an appropriation • Introduces the concept of Private versus Public Benefit • Incremental Cost • Service providers for internal clients when there is no appropriation • Citizens, for services in excess of the tax base • Stepped-Variable Cost • For incremental capacity requirements to deliver a service

  34. Trading Services between Organizations • Organization A reviews its costs and decides that it may be more cost-effective to have another organization provide an internal service • Organization B agrees to provide an internal service and proposes a fee to Organization A for the service • What approach would Organization B typically take? In your opinion, is it appropriate? Why or why not?

  35. The Typical Approach … • Organization B will likely use an incremental cost approach as their position is that their appropriation was not provided for this purpose • Appropriate from a departmental perspective, but … • Decision-makers who provide appropriations in a public setting are doing so from an entity perspective • From an entity perspective, stepped variable costing is the appropriate approach

  36. Discussion • Why would decision-makers take this position? • Is the typical incremental approach appropriate? • If we are to take a capacity-based approach, what would be the typical impact on service arrangements? • What risks does this represent to the service provider? • How about to the organization purchasing the service?

  37. What is the Typical Cost Profile? • Variable costs typically follow a log-normal distribution
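
A quick way to see what a log-normal cost profile implies: most draws sit below the mean, but a long right tail pulls the average (and any worst-case estimate) upward. The parameters below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Variable cost per unit of output, drawn from an illustrative log-normal profile.
costs = rng.lognormal(mean=np.log(100), sigma=0.5, size=100_000)

print(f"median:          {np.median(costs):.0f}")         # ~100, the 'typical' cost
print(f"mean:            {costs.mean():.0f}")              # ~113, pulled up by the tail
print(f"95th percentile: {np.percentile(costs, 95):.0f}")  # occasional much higher costs
```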

  38. Costing Methodology for Pricing or Funding Proposals • The Government of Canada proposes a seven-step model predicated on activity-based costing • What would be the impact of the costing approach with these cost profiles?

  39. Main Pitfall: Using Information as Intelligence for Decision-Making • Activity-based costing is information, not intelligence • The system has capacity challenges (surplus & deficit) • Financial lapse • Bottlenecks (financial risk and production risk) • Delta between planned and actual staffing • Prices that are not updated can erode appropriations and may not respect decision-makers’ intentions at the time the price was set • Resource management decisions should be based on intelligence, not information; otherwise sub-optimal decisions will likely be made

  40. OCG Strategy to Strengthen Cost Estimating in Government • A multi-pronged approach is required which provides: • A consistent and balanced capacity within departments to develop sufficiently robust cost estimates for decision-making purposes • Contextual information to enable decision-makers to understand a project’s potential financial risks • Comparative information to strengthen the reasonableness assessment of cost estimates performed by TBS • An Office of the Comptroller General focus, as a policy centre, on the most significant cabinet documents from a cost estimating perspective

  41. Consistent and Balanced Department Cost Estimating Capacity • Not all organizations require the same degree of cost estimating capacity • Annex A provides a strategy, including the establishment of a professional cost estimating designation in Canada, to enhance this capacity across government • Currently, only DND has a dedicated costing team who hold professional cost estimating accreditation • CCE and DND/D Cost S will exchange senior analysts to trial a cost estimating analyst development program • [Chart: departmental cost estimating capacity from low (access to accredited cost estimators if required) through a dedicated costing team with partial accreditation to high (a dedicated costing team with professional accreditation)]

  42. Modeling to Enhance Decision-Makers’ Understanding of Project Risk • Models provide a mathematical basis to determine the project’s sensitivity to risk and assess the reasonableness of potential funding levels over thousands of individual assessments • Models reduce the impact of bias in the development of cost estimates because of their methodological approach • Assumptions and evidentiary standards are foundational pieces in the development of models – bad data is accounted for scientifically • [Chart: cost ($) over time with upper and lower cost estimate boundaries; the statistical model of a cost estimate at a decision point is shown as a dotted line]

  43. Strengthening Central Agency Oversight Through TB Policy Instruments • The Directive on Assets and Liabilities and the Directive on Internal and External Charging, in conjunction with the Guideline on CFO Attestation, are the transformative pieces to strengthen central agency oversight and trigger the modernization of cost estimating within the Public Service (Annex B provides a timeline of the anticipated release strategy) • The Directive on Assets and Liabilities should establish the minimum information which must be presented to decision-makers when a complex proposal is being considered for acquisition • The Directive on Internal and External Charging should establish pricing and cost recovery principles founded on prudent risk management from an entity perspective • The Guideline on CFO Attestation should focus on assumptions and risk assessments, and confirm that sufficient information is provided to support the requested decision-making (including, potentially, working papers)

  44. Comparative Information to Permit Reasonableness Testing • A Government of Canada Cost Factors Manual will be developed to provide reasonableness tests • Direct program spending across like-departments (size, complexity and type) based on TBS/EMS Methods of Intervention (i.e. regulatory, service to the public, etc.; list provided at Annex D) • Internal service costs across like-departments, for departmental spending and in support of like-programs • An assessment of spending compared to funding for specific complex proposals, to incorporate learning into cost estimating • Integrate costing metrics with EMS performance metrics

  45. Comptroller General Focus on the Most Significant Cabinet Proposals • [Chart: a matrix plotting departmental cost estimating capacity (low to high) against intended central agency oversight (low to high); the quadrant labels are: no due diligence reviews, maintains standards; develops standards and coordinates training; ‘due diligence’ reviews within resource and time constraints; deeper reviews of targeted assessments]

  46. Implementation Plan: Parallel Paths to Get There • [Chart: roadmap plotting initiatives by maturity of departmental cost estimating and by central agency oversight (low to high), including: variable funding models for complex projects; UFA amended; key departments have dedicated and accredited staff; ICEAA Canadian designations; CCE deep dives; ICEAA partnership with institutions; statistically modeled cost estimates; ICEAA Canadian content; Directive on Assets and Liabilities; Directive on Internal and External Charging; GoC Cost Factors Manual; guide to cost estimating of IT-enabled projects; pilot statistical model with a key department; CSPS costing courses; Guide on Costing; Guideline on CFO Attestation; Guideline on Cost Estimating of Capital Assets promulgated]
