NOT GOVERNMENT POLICY
Performance Management Framework Reference Guide
Better Practice Guideline for improving service areas, service standards and targets reported in the State Budget
Performance Unit, Department of the Premier and Cabinet, in collaboration with Queensland Treasury and Trade
August 2012
PURPOSE
Annual review of service areas, service standards and targets:
• Provide more relevant and appropriate performance information
• Improve alignment between whole-of-government direction and agency service delivery
• Decrease the reporting burden
• Address issues and risks identified by the Auditor-General
VALUE CHAIN FOR THE QUEENSLAND PUBLIC SECTOR
Services create value for clients, stakeholders and the community, influencing trust and confidence. The value chain runs from client, stakeholder and community expectations and opinions, through whole-of-government direction and agency business direction, to agency service delivery, underpinned by governance and performance management.

Whole-of-government direction:
• Objectives for the community – Getting Queensland Back on Track (pledges)
• WoG priorities and strategies informed by political and cross-jurisdictional commitments
• Performance reported in WoG reports

Agency business direction:
• Strategic plan articulates the purpose, vision and objectives of the agency
• Performance indicators measure whether outcomes achieve agency objectives
• Performance reported in the annual report

Agency service delivery:
• Services delivered using the agency's capabilities (e.g. human, financial, information, physical assets and ICT) and business processes
• Performance measured using service standards and other measures
• Performance reported in the annual report and Service Delivery Statements

Public sector agencies should deliver services that are valued by clients and other stakeholders.
SERVICE DELIVERY STATEMENTS (SDS)
Performance reported through the Service Delivery Statements forms an integral part of the policy development cycle.

Improves decision making: performance information is useful for evaluating policy decisions and assessing the extent to which service areas are achieving their objectives. Government should monitor its service standards to better understand the outcomes of its policy decisions, identify areas for improvement and develop the best and most appropriate solutions to issues facing Queenslanders.

Improves accountability: performance information is useful for examining whether government services are being delivered in accordance with the agency's business direction. Government should regularly assess whether a service is being delivered efficiently and effectively, and be transparent with stakeholders about its performance.
SERVICE AREAS
An exemplary service area has a clear purpose and delivers outputs and outcomes that help the agency achieve its objectives.

When a new service structure or a new service area is proposed, agencies must develop the following key elements for each service area:
• the purpose (objective) of the service area, explaining how the service area contributes to the achievement of agency objectives
• supporting contextual information for each service area (such as related services, and stakeholders and their information needs)
• a balanced set of service standards and targets.

Service areas should:
• be aligned to the agency's objectives
• clearly state their purpose (objective) and identify their clients and other stakeholders
• be named so that clients and stakeholders can easily understand the purpose of the service area from its name
• deliver service outputs (i.e. the products and engagements* the service will deliver).

* Engagement: interactions, connections and relationships developed between Government and its stakeholders (including clients).
SERVICE STANDARDS
A successful service standard measures 'the right thing' and 'measures it right'.

Service standards are set with the aim of defining a level of performance that is appropriate for the service and is expected to be achieved. They provide information on whether the government is 'doing the right things' by measuring how efficiently and effectively it delivers its services to clients and stakeholders. This information also provides evidence that the government is doing the things it said it would do, and 'doing it right'. For this to occur, the government must be 'measuring the right things' and 'measuring it right'.

Are we measuring the right things?
Service standards work best when there is clarity about what is being measured and why. The right things to measure are ultimately influenced by client and stakeholder expectations, which inform the whole-of-government direction and agencies' business directions. Public sector performance in Queensland has often been measured in terms of what the government has done (e.g. measures of input, process and activity), but better results can be achieved by including service standards that measure the efficiency and effectiveness of its services. Communicating this information to the community is critical.

Are we measuring it right?
Service standards also work best when there is clarity about how the measure itself is constructed; the MEASURING IT RIGHT slide below sets out the types of measures to use.
MEASURING THE RIGHT THINGS
To properly measure 'the right things', there must be a clear line of sight between the sources of 'the right things' and the measures of 'the right things'. The agency business direction is informed by 'the right things', operational plans describe the services needed to deliver the agency's objectives, and service standards should measure how well the agency has delivered those services.

Sources of 'the right things':
• Client and stakeholder expectations and consultation
• Whole-of-government priorities and strategies
• Ministerial Charter Letters
• Cross-jurisdictional commitments through COAG
• Benchmarks and industry standards (results, process or better practices)

Measures of 'the right things' should be:
• Relevant to what the agency is aiming to achieve
• Attributable – capable of being influenced by the agency's actions
• Comparable – with either past periods or similar measures elsewhere
• Well defined and easy to understand
• Reliable, credible and able to be measured consistently
• Measurable – a clear and transparent standard of success
• Timely – performance data can be produced regularly and quickly
• Achievable – aim for improved standards, but remain attainable
• Cost-effective in terms of gathering and processing the data
• Credible – supported by stakeholders, research and/or industry standards

Source: Boyle, R. (2009). 'Performance reporting: Insights from international practice', IBM Center for The Business of Government.
MEASURING IT RIGHT
To successfully 'measure it right', service standards should primarily be measures of efficiency and effectiveness: they should measure both the efficiency of the output and the effectiveness of the outcome. A worked illustration follows this slide.

Outcome focused:
• Efficiency: measures of efficiency reflect how capabilities (inputs/resources) are used to produce outputs, expressed as a ratio of capabilities (inputs/resources) to outputs. Efficiency measures generally assess how well an agency uses its available capabilities (resources) to deliver its outputs.
• Effectiveness: measures of effectiveness describe the quantifiable extent of the effect of the service on recipients (i.e. the outcome they experience) as a result of the level and quality of the service provided. Standards of effectiveness include 'cost' effectiveness (the cost to provide the desired outcome) and 'service' effectiveness (how well the service achieves its stated purpose [objective]).

Input/output focused:
• Activity: measures of activity count the number of service instances, service recipients or other activities for the service. They demonstrate the volume of work being undertaken and can often be converted into efficiency measures by combining them with input measures.
• Process: measures of process measure throughput, or the means by which the agency delivers the service, rather than the service itself. They demonstrate how the agency delivers services, rather than how effectively services are delivered.
• Input: measures of input measure the resources consumed in delivering a service, either as an absolute figure or as a percentage of total resources. Input measures demonstrate what it costs to deliver a service.
• Quality: measures of quality measure how well a service is being delivered against specific criteria such as timeliness and client/stakeholder satisfaction.
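As a worked illustration of the distinction, efficiency and effectiveness can be expressed as simple ratios. The dollar amounts and volumes below are hypothetical and are not drawn from the guideline:

\[ \text{Efficiency} = \frac{\text{inputs}}{\text{outputs}} = \frac{\$2.4\text{m total cost}}{60{,}000 \text{ transactions completed}} = \$40 \text{ per transaction} \]

\[ \text{Cost-effectiveness} = \frac{\text{inputs}}{\text{outcomes}} = \frac{\$2.4\text{m total cost}}{48{,}000 \text{ client matters resolved}} = \$50 \text{ per matter resolved} \]

\[ \text{Service effectiveness} = \frac{\text{outcomes achieved}}{\text{outcomes intended}} = \frac{48{,}000 \text{ matters resolved}}{60{,}000 \text{ matters lodged}} = 80\% \]

The same input and activity data underpin all three ratios: the efficiency ratio shows the cost of producing the output, while the effectiveness ratios show whether, and at what cost, the intended outcome was actually achieved.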
SERVICE PROCESS
The Report on Government Services (RoGS) 'service process framework' demonstrates how efficiency and effectiveness are measured.

[Diagram: under external influences, a service area converts inputs, through processes, into outputs that contribute to outcomes against the service objectives. Efficiency relates inputs to outputs; service effectiveness relates outputs and outcomes to the service objectives; cost-effectiveness relates inputs to outcomes.]

Adapted from Steering Committee for the Review of Government Service Provision (2012). Report on Government Services 2012, Productivity Commission, Canberra (Ch. 1, p. 13).
TARGET SETTING
Setting appropriate targets is as important as developing the service standard itself. The checklist below summarises the characteristics of a well-set target; a hypothetical example follows this slide.

Target checklist:
• The target does not promote adverse results (e.g. efficiency improving to a level that substantially decreases quality)
• The target indicates the desired direction of performance (e.g. > x or < x)
• The target is challenging, but achievable
• The target is a clear and quantified measure against which the agency can assess performance
• The target is expressed as an absolute number (i.e. avoid the use of words), a range, a percentage or a ratio
• The target is congruent with objectives and targets set in other government publications
• The target is at or above minimum regulatory standards and benchmarks
• Service standards that measure regulatory/policy compliance should be reviewed, and agencies should consider removing them

Refer to A Guide to the Queensland Government Performance Management Framework for more information on setting targets.
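A hypothetical example of a service standard and target that satisfies the checklist; the measure, figures and target are illustrative only and are not drawn from the guideline:

• Service standard: average cost per completed client transaction (a measure of efficiency)
• 2011-12 estimated actual: $48
• 2012-13 target: ≤ $45 (the '≤' indicates the desired direction of performance – lower is better)
• Companion quality measure: overall client satisfaction of at least 80 per cent, retained so that the efficiency gain does not promote an adverse result for service quality.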
PRINCIPLES
Principles to assist agencies when reviewing service areas, service standards and targets:
1. Provide more relevant and appropriate performance information that highlights the efficiency and effectiveness of agency service delivery
2. Increase alignment between the Government's objectives for the community, strategic plans and agency services
3. Decrease the reporting burden on agencies
4. Improve consistency across agencies
5. Encourage high quality data management
6. Allow for trend analysis
Principle 1: Efficiency and effectiveness
Provide more relevant and appropriate performance information that highlights the efficiency and effectiveness of agency service delivery.

[Diagram: the RoGS service process framework shown on the SERVICE PROCESS slide – external influences, service objectives, inputs, processes, outputs and outcomes, with the efficiency, cost-effectiveness and service effectiveness relationships.]

Adapted from Steering Committee for the Review of Government Service Provision (2012). Report on Government Services 2012, Productivity Commission, Canberra (Ch. 1, p. 13).
Principle 2: Clear line of sight
Increase alignment between the Government's objectives for the community, agency strategic plans and agency services.

[Diagram: the value chain for the Queensland public sector shown on the VALUE CHAIN slide – client, stakeholder and community expectations and opinions; whole-of-government direction; agency business direction; agency service delivery; underpinned by governance and performance management. Services create value for clients, stakeholders and the community, influencing trust and confidence.]
Principle 3: Decrease the reporting burden
Decrease the reporting burden on agencies by aligning service standards with existing measures of efficiency or effectiveness already collected and reported by the agency for other purposes, such as:
• COAG agreements
• the Report on Government Services
• external benchmarks
• industry standards
By increasing alignment with existing measures, the reporting burden will be decreased.
Principle 4: Consistency
There is a need to improve consistency across agencies in the SDS. Inconsistencies and issues include client satisfaction measures, presentation and sub-headings, common terms, overly technical and complex language, excessive amounts of descriptive text in the service standard, and compliance with 'regulatory timeframes'.

Mitigation strategies:
• Client satisfaction: encourage agencies to measure clients' and stakeholders' satisfaction with the overall service, and with the service's timeliness, ease of access, staff, quality and outcome. Agencies should refer to the Performance Management Framework Reference Guide Measuring Client Satisfaction, published by the Department of the Premier and Cabinet.
• Presentation and sub-headings: permit only one level of sub-headings under each service area (i.e. no sub-sub-headings).
• Common terms: ensure agencies use consistent language (e.g. "people with disabilities", not "the disabled").
• Technical language: minimise the use of overly complex or technical language that could confuse readers.
• Excessive text: encourage the use of notes to provide context and understanding for the reader, rather than overly descriptive service standards. Treasury has agreed to provide additional space for notes.
• Regulatory timeframes: service standards that measure the "delivery of XYZ service within regulatory timeframes" are not measures of efficiency and are suggested for deletion or amendment (e.g. conversion into a measure of efficiency).
Principle 5: Data management
High quality data management is necessary for improved performance reporting. A data dictionary is required from every agency and should be modelled on the recommendations made in the two sources below; a sketch of a possible data dictionary entry follows this slide.

QAO Better Practice Guide: Performance Reviews, July 2010 – elements of relevant and robust performance information:
• relevant, appropriate and aligned with externally reported measures
• accurate, reliable and readily accessible to managers
• information presented clearly, with a basis for comparison provided for all data
• performance measures regularly reviewed.

ABS Data Quality Framework (No. 1520.0), May 2009 – seven dimensions of quality:
• Institutional environment: collection agencies should build a culture that focuses on quality, with an emphasis on objectivity and professionalism. Considering the institutional environment associated with a statistical product is important because it enables an assessment of the surrounding context, which may influence the validity, reliability or appropriateness of the product.
• Relevance: to be relevant, the collection agency must stay abreast of the information needs of its users.
• Timeliness: lengthy delays between the reference period and data availability, or between advertised and actual release dates, can have implications for the currency or reliability of the data.
• Accuracy: how well the data portray reality, which has clear implications for how useful and meaningful the data will be for interpretation or further analysis.
• Coherence: the use of standard concepts, classifications and target populations promotes coherence, as does the use of common methodology across surveys. Coherence indicates whether the dataset can be usefully compared with other sources to enable data compilation and comparison.
• Interpretability: the availability of information to help provide insight into the data, enabling the information to be understood and utilised appropriately.
• Accessibility: the ease of access to data by users, including the ease with which the existence of information can be ascertained, and the suitability of the form or medium through which the information can be accessed. Accessibility relates directly to the capacity of users to identify the availability of relevant information, and then to access it in a convenient and suitable manner.
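A minimal sketch of what a single data dictionary entry might record, expressed as a Python dictionary purely for illustration. The field names, values and helper calculation are assumptions based on the QAO and ABS guidance above, not a prescribed template:

```python
# Hypothetical data dictionary entry for one service standard.
# Field names are illustrative only; agencies should align their own templates
# with the QAO Better Practice Guide and the ABS Data Quality Framework.
service_standard_entry = {
    "measure_name": "Average cost per completed client transaction",
    "measure_type": "efficiency",  # ratio of inputs to outputs
    "definition": "Total service area cost divided by completed transactions",
    "data_sources": ["financial system", "client transaction register"],
    "collection_frequency": "quarterly",
    "reporting_frequency": "annual (SDS and annual report)",
    "data_custodian": "Manager, Performance and Reporting",      # institutional environment
    "quality_checks": [
        "reconcile cost data to audited financial statements",   # accuracy
        "apply the same counting rules across all regions",      # coherence
    ],
    "basis_for_comparison": "prior-year results and external benchmarks",
    "last_reviewed": "2012-08",                                  # regular review
}

# Example use: calculate the current value of the measure from raw figures
# (the figures are hypothetical).
total_cost = 2_400_000           # inputs: total cost in dollars
completed_transactions = 60_000  # outputs: completed transactions
value = total_cost / completed_transactions
print(f"{service_standard_entry['measure_name']}: ${value:.2f} per transaction")
```

Recording the definition, data sources, custodian and quality checks in one place is what lets the reported figure be reproduced and compared on a consistent basis from year to year.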
Principle 6: Trend analysis
Consistent reporting of service standards over time enhances transparency, provides a clear assessment of achievements, and informs policy analysis, development and evaluation.

[Chart: performance plotted over time.]
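For illustration, a hypothetical trend of the kind that consistent reporting makes visible (the figures are illustrative only and continue the example used earlier):

Average cost per completed client transaction
2009-10 actual: $52 | 2010-11 actual: $50 | 2011-12 actual: $48 | 2012-13 target: ≤ $45

Because the measure is defined and reported on the same basis each year, readers can see whether performance is moving in the desired direction and how far it sits from the target.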
CONTINUOUS IMPROVEMENT
The Auditor-General expects significant improvements to the government's performance information so that it highlights value to clients and stakeholders.

• While crediting the government with improvements, the Auditor-General has criticised the low proportion of service standards that are measures of efficiency or effectiveness.
• The Auditor-General has indicated in previous reports that he expects significant improvements to performance information once the PMF is fully implemented (i.e. from September 2011).
• Report No. 4, 2007 – "Better performance information is needed for the department, the Minister and all stakeholders, including Parliament, for a more informed government"
• Report No. 1, 2008 – performance reporting is "Failing to answer questions such as Has the agency achieved what it intended to do? Is this better than last year? Is this good enough? Were these activities needed in the first place? Could they have done this for less money?"
• Report No. 7, 2009 – "A comparatively small number of measures in the agencies' 2009-10 SDS could be considered measures of efficiency or effectiveness".

"As a general principle it is suggested that the Service Delivery Statements become a more focused, succinct document reporting on fewer, yet more meaningful targets of performance."
– Service Delivery and Performance Commission 2007, Report on Strengthening Performance Management in the Queensland Government