Outcome Measurement Across Community Services
Experiences from Other Jurisdictions and the ACT Current Context
Admir Meko – ACTCOSS, December 2013
Purpose of Presentation • What is the current discourse on Outcome Measurement in Community Services in Australia? • Present relevant experiences from other jurisdictions. • Discuss and reflect on which approaches can be incorporated in the sector.
Focus of this Presentation Effects of sector reform initiatives on measuring impact: 1. Government-initiated and/or supported initiatives in measuring outcomes; 2. Cross-sector and sub-sector initiatives; 3. Shared (and not) outcome measurement – • any tool used by more than one organisation; • aiming to measure impact; • building information collectively about what works. The session aims to be both informative and consultative.
Limitations of Presentation • Our research is not finalised; • Difficult to include every single activity; • Many sporadic initiatives and broader processes are under way; • Lack of evaluation and assessment of current initiatives to date; • Difficult to predict sudden shifts in policies and procedures; • Limited time for presenting.
What has Driven the Discourse for Outcome Measurement in Social Services? At the Organisational Level: • The Demand from Funders including Gov; • The need from within Organisations; • A shift from focusing only on Inputs/Outputs; • Limitations and/or Inability to Report on the Broader and Long-term Impact.
At the National and State/Territory Level • Sector reform activities in the last 5 years; • Recommendations from the Productivity Commission report – Contribution of the Not-for-Profit Sector; • Continuous changes in procurement, funding and reporting mechanisms; • A vision and demand for more collaboration in the delivery of government-funded programs.
What is Happening at the National Level? The Productivity Commission Report (2010) highlighted: • Growing calls from the sector for accountability and demonstration of impact; • Purchasing arrangements putting pressure on Gov–NFP sector relations; • The need for a unified framework for clearer governance and accountability; • The need to improve arrangements for effective sector development; and, more importantly, it recommended a nationally agreed measurement and evaluation framework – to build a greater understanding of the outcomes and impacts of the sector and underpin enhanced evaluation within the sector.
Coalition Government – Pre-election • Each agency funded by the Commonwealth Gov – one single contract; • Flexible contract negotiation – ??? • Reduce reporting to a single annual report; • Important: replacement of time-consuming and costly system/s of data collection with a number of cross-sector evaluation programs – no detailed info on this so far.
Factors for a Successful Shared Outcome Approach (Eibhlin et al., 2013)
Defining Shared Measurement (Foundation Strategy Group, 2009)
Victoria Sector Reform (Shergold 2013) – Three Main Pillars • Improving System Funding • Improving System Operation • Improving Outcomes Recommendation: development of an outcomes framework through a partnership between the government and community service organisations. It should establish metrics against which the delivery of beneficial social impact will be audited, monitored, measured and reported over time. • Individual government departments should clearly articulate the outcomes sought from government investment in the services they fund and, wherever possible, link funding to the achievement of those outcomes; • Policy development and program design should be based on the collection of data, research, analysis and evaluation of outcome performance.
Victoria – Current Context • Community Sector Reform Council (CSRC); • Some orgs are already applying outcome measurement tools; • No sector-wide models; • Some innovation in outcome focused funding: • Services Connect • Victorian Homelessness Action Plan – Innovation Action Projects • Victoria’s Vulnerable Children’s Strategy • Youth Partnerships
New South Wales During 2010–2011 – realignment of the Community Services Grant Program.
EIPP RBA Project – Program Logic and Evaluation Pathways: • EIPP Program Logic Diagram; • EIPP Youth and Family Support Results Logic Diagram; • EIPP Child and Family Support Results Logic Diagram; • EIPP Intensive Family Support Results Logic Diagram; • EIPP Intensive Family Preservation Results Logic Diagram; • EIPP Child and Family Support Performance Measures; • EIPP Youth and Family Support Performance Measures; • EIPP Intensive Family Support Performance Measures; • EIPP Intensive Family Preservation Performance Measures.
Additional Processes (all services) • Performance Monitoring Framework – five standardised activities: • self-assessment – capacity and capability for data collection and reporting; • desktop review; • monitoring and review meeting (if required); • performance improvement planning (if required); • the decision to continue funding a service.
Queensland • Human Services Quality Framework (December 2012) – less red tape, common standards across services. Six standards replace the previous service standards for Disability Service & Advocacy, Community and Children Services. • Common Service Agreement (from July 2010) – a consistent model for funding similar services. • Output Funding and Reporting (2012–2015) – moving from input-based to output-based funding, supported by the Outputs Catalogue and the OASIS online reporting tool.
Western Australia • Partnership Forum (2010) – a forum for Gov and the NFP sector to discuss and resolve issues. • Delivering Community Services in Partnership Policy (2011). • More focus on outcomes (qualitative), especially from NFPs, though no shared measurement. • Services are still in the process of understanding the changes in the policy.
South Australia No explicit top-down approach identified yet, though one generated by NFPs – 'Together SA'. • In 2009 Community Centres SA ran an RBA model for planning, implementation, evaluation, etc. • Interest from the Department of Communities and Social Inclusion. • Transformed into a Collective Impact model. • Results – an RBA Community of Practice; an alliance between agencies – moving beyond individual agendas through coordination; many training activities. • It will be interesting to follow up, especially the Gov involvement and potential further expansion.
Tasmania • Quality and Safety Standards Framework; • Standards and Performance Pathways (SPP) – an online portal by TasCOSS; • Strategic Policy Team within Community Services Relationship Units (similar to Fed. and Victoria) – a single point of contact for all issues; • Joint project 'Working Together to Make a Difference' – started November 2013.
Joint Project 'Working Together to Make a Difference' A three-year capacity building strategy aiming to: • Embed outcomes measurement; • Build knowledge, skills and networks; • Develop shared language and understanding; • Develop outcomes for specific service areas; • Initiate a 'Collective Impact' strategy; • Be able to measure progress in outcomes.
Two Streams of Activities Stream 1 • Resources and Evidence Building; • Training and Seminars; • Networks and Peer Learning. Stream 2 • Raise Awareness of 'Collective Impact'; • Develop Strategic Intent amongst orgs; • Develop a Coordinated Plan for intended outcomes; • Develop and Disseminate Knowledge.
Australian Capital Territory Development of the Purchasing Framework (2010 – to date) Important notes: information taken from the CSD website; anecdotal evidence suggests the final product may change; the development of the Blueprint for Human Services may bring further recommendations.
The Reasoning for the Framework It intends to address the following: • A short to medium term approach to service development; • A maximum three-year funding cycle; • Funding is not linked to population outcomes; • There are different funding approaches across ACT human service agencies; • Outputs are the focus of transactional contracting; • There is no common prequalification framework for human services; and • Limited outcome-based quality reporting.
Objectives All funded services will be required and expected to have: • A Prequalification Framework; • A maximum 10-year Service Funding Agreement consisting of a fixed term with option periods and agreed performance milestones (this may decrease to five years); • An Outcomes Based Quality Framework; • Standard common terms and conditions in their Service Funding Agreements; and • to work within a common performance reporting framework.
Mid to Long Term Results • Increase the capacity to measure the achievement of specific population results over a 10-year period, with the potential to measure intra-generational outcomes over a longer period; • Provide common reporting systems across all ACT Government funded human services; • Offer incentives for ACT Government funded human services as they improve the quality of service delivery; • Improve efficiency, and reduce duplication and costs for all service providers.
Main Pillars • Prequalification Framework – quality assurance, capability, eligibility of provision; • Reporting Framework – Outcomes, Indicators and Outputs to measure progress; • External Audits – to measure Efficiency and Effectiveness; • Sector Management – Sector Development, Contract Management, Quality and Improvement; • Quality Improvement Framework – procurement tools, Centralised Contract and Grants Processing, Tendering process and Standard Terms and Conditions for all services.
What We Know • Difficult to determine what it would look like. • CYFSP has its own Output & Outcome Reporting Framework. • Housing & Homelessness is developing its own through the Joint Pathways Group. Meanwhile: • Individual agencies are collecting data and measuring outcomes for their own purposes – RBA, SROI, Outcomes Star, etc. • Need for more knowledge and understanding from both Gov and NFP. • Without government support, cross-sector outcome measurement is difficult.
Main Issues • How to incorporate different sectors into one Outcomes Reporting Framework? Is it doable? • One reporting tool vs orgs choosing their own. • Are top-down approaches more effective? Some of the best initiatives are bottom-up models. • Who should be responsible for data analysis – Gov, each agency or an independent body – and how are the data used to drive change?