Grant Funded Services - Debrief North West Performance Leads 6th February 2007
Overview • Purpose of Data Collection • NWPL Sub Group Work • Non-Care Managed PIs • Feedback Survey Results • Types of Scheme • Your Comments • Conclusions • Areas for Discussion
Purpose of New Data Collection The Health & Social Care Information Centre states: • The need for this information has been heightened by the Government's increasing emphasis on preventative services and a recognition of the important role played by the voluntary sector, as implied, for example, by the recent White Paper "Our Health, Our Care, Our Say". • This signals an intention to invest further in such forms of service in the future, as one means of meeting the needs of our ageing population in a more focused and cost-effective way.
NWPL Sub Group Work • It was agreed that a sub group would meet to discuss the requirements of the new return; of particular concern was the inclusion of a scope wider than Social Services. • The sub group agreed to amend the Standard Organisation form to meet the requirements of the LA Collection Form (e.g. Numbers in Care Plans). • The sub group also compiled a list of questions and submitted them to the Health & Social Care Information Centre.
Scope of the Data Collection • The scope of the Data Collection was amended in late September by the Information Centre, moving away from the initial decision to include council-wide services. • Further guidance was issued, restricting the scope of the data collection to: ‘Schemes which do not form part of a formal care package agreed by social services, but which are nevertheless funded in whole or in part by social services’
Sharing of Performance Data • NWPL agreed to share Non-Care Managed Performance Indicators in mid-January, prior to submission of the Return, purely for comparison on a regional basis. • NWPL have expressed concern about the accuracy and comparability of these results, as differences in calculation, data quality, double counting etc. may skew them. • The PI results for the 19 NWPL authorities are as follows:
Overview of Feedback Survey • The feedback survey was designed to inform this debrief on the different aspects of the new Statutory Return as experienced by NWPL authorities. • 18 NWPL authorities completed a simple feedback survey. • The results are as follows.
Feedback Results – No. of Schemes • Total number of schemes included in the Data Collection – 791. • Total number of those schemes operating in the sample week – 764 (97%). • 79 schemes (10%) were specifically for BME clients. • 53% of the BME schemes were provided by just two NW authorities. • The number of schemes per authority ranged from 8 to 126; just three authorities account for almost half of all schemes.
Feedback Results – Sample Week • 89% of all respondents used the suggested sample week, 13th–19th November 2006. • 11% (two authorities) chose an alternative week: 20th–26th November or 11th–17th December.
Feedback Results – Survey Types • 44% used the amended form agreed by the NWPL sub-group tasked to focus on the GFS Return. • 33% used the Standard Form provided by the Information Centre. • 17% designed their own form, either adding further questions or simplifying the form and guidance. • A small number used a combination of forms and/or actual information provided by the organisation.
Feedback Results – Data Collection • 83% of authorities used a combination of methods, with e-mail, post and telephone being the preferred options. • Three authorities used postal collection only. • Approximately 10% of figures were reported as estimates, indicating that 90% of the figures provided were actuals.
Feedback Results – Numbers in Care Plans • 44% were unable to complete this section of the Return; most of these used the Standard Form, and some authorities were also unable to collect the data using the amended form. • Only one authority estimated figures for this section of the Return, based on a cross-matching exercise using nominal data (sketched below). Of the authorities that were able to collate this information: • 33% collated some information on numbers in Care Plans from an amended form. • 6% collated actual information directly from the organisations. • 11% reported that they used a combination of both forms and actual figures provided by organisations.
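For illustration only, here is a minimal sketch of the kind of nominal cross-matching that estimate implies: counting scheme users whose identifying details also appear in care plan records. The record layout (name and dob fields) and the crude exact-match rule are assumptions; the Return does not describe the authority's actual method.

```python
# Hypothetical sketch of a nominal cross-matching exercise: estimating how
# many scheme users also appear in social services care plan records by
# matching on name and date of birth. Field names and the matching rule are
# assumptions, not the authority's actual method.

def normalise(name):
    """Crude normalisation so 'J. Smith' and 'j smith' compare equal."""
    return " ".join(name.lower().replace(".", "").split())

def estimate_care_plan_overlap(scheme_users, care_plan_records):
    """Count scheme users whose (name, DOB) pair appears in care plan data."""
    care_plan_keys = {(normalise(r["name"]), r["dob"]) for r in care_plan_records}
    return sum(1 for u in scheme_users
               if (normalise(u["name"]), u["dob"]) in care_plan_keys)

# Made-up example records:
scheme_users = [{"name": "J. Smith", "dob": "1931-04-02"},
                {"name": "A. Patel", "dob": "1940-11-17"}]
care_plans = [{"name": "j smith", "dob": "1931-04-02"}]
print(estimate_care_plan_overlap(scheme_users, care_plans))  # -> 1
```

In practice an exercise like this under-counts whenever names are recorded inconsistently, which is one reason only an estimate was possible.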
Feedback Results – Individualised Data • All but one authority reported that they were unable to obtain individualised data. • The authority that succeeded in obtaining individualised data reported that it had added extra questions to the survey form, asking organisations to state the numbers accessing more than one scheme and/or a Care Plan (a worked illustration follows).
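As a worked illustration of why that extra question matters, the sketch below shows how a "more than one scheme" count can correct a headline total for double counting. It is an assumption-laden sketch, not the authority's actual calculation.

```python
# Illustrative only: correcting a headline user count for double counting,
# given the kind of extra question one authority added to the survey form.
# Assumes each multi-scheme user accessed exactly two schemes, so removing
# one duplicate per person yields an upper bound on unique users.

def estimate_unique_users(scheme_counts, multi_scheme_users):
    """scheme_counts: per-scheme user counts reported by an organisation.
    multi_scheme_users: reported number of people accessing >1 scheme."""
    raw_total = sum(scheme_counts)         # multi-scheme users counted at least twice
    return raw_total - multi_scheme_users  # remove one duplicate per person

# Example: three schemes report 40, 25 and 15 users (raw total 80) and the
# organisation states that 10 people used more than one scheme.
print(estimate_unique_users([40, 25, 15], 10))  # -> at most 70 unique users
```

Without such a question, the raw total of 80 would overstate activity, which is exactly the concern raised in the comments and conclusions below.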
Feedback Results – Monitoring Data • 67% of authorities reported that they regularly collect monitoring data from voluntary organisations. • 33% do not collect regular monitoring data. • Quarterly is the most common collection frequency. • Collection frequencies range from monthly and quarterly to six-monthly and annual.
Feedback Results – Engaging Voluntary Organisations On a scale of 1–4, with 1 being easy and 4 being extremely difficult: • 67% of respondents rated a score of 2 (Moderate). • 22% of respondents rated a score of 3 (Difficult); this relates to four authorities, two of which had the highest numbers of schemes. • One authority rated a score of 1 (Easy); this authority had the fewest schemes. • One authority rated a score of 4 (Extremely Difficult).
Feedback Results – Data Quality • 83% of respondents reported moderate data quality, with some clarification required from organisations. • 17% reported that the data was of good quality, with no further clarification required. • None of the authorities reported poor data quality.
Feedback Results – Comments (1/3) ‘Found it very difficult to trust the data submitted by the vol.orgs as only saw whole figures and not how they arrived at these’ ‘Difficult to see how the data will be used as could have double counting across schemes, therefore not a reflection on actual numbers. The goal posts kept changing so wasn’t clear what you could include and what you couldn’t. How on earth do you know if they are being Care Managed if you only get a whole number and not any other personal details?’
Feedback Results – Comments (2/3) ‘Some aspects were easier than others. We held a workshop which was well attended and well received. Collection of data was much harder with issues such as communication and availability’. ‘Many of the schemes provide a service to a diverse range of clients, including a range of client groups, age groups and ethnic groups. Some were concerned that the template generalised their work and did not reflect the diversity of the clients they support’.
Feedback Results – Comments (3/3) ‘The smaller organisations did not provide a costing, only time. As the very nature of their organisation is voluntary, they found it difficult to complete this’. ‘It was not always easy to communicate with the voluntary organisations as they were not always available via phone or email. You had to chase a number of times before you received a response’. ‘Difficulty sometimes in defining which services were mainstream funded and which were grant assisted’.
CONCLUSIONS • PAF data and the survey provide only a partial picture of preventative service provision. • Reporting by main Primary Client Type generalises the data. • Concern that the survey of Non-Care Managed support is not an accurate reflection of activity, due to double counting between schemes and/or Care Plans. • The lack of robust collection systems, data validation and audit trails increases the risk of inaccuracies in the data reported by voluntary organisations. • Reporting of Care Plan data was only made possible by the NWPL changes to the Standard Form. • Only one authority was able to collect individualised data.
AREAS FOR DISCUSSION • Where does accountability lie for the validation and quality of the data supplied in the GFS Return? • How do we start to measure individualised data to resolve the double counting issue, given issues around Data Protection and service users’ desire for anonymity? (One possible direction is sketched below.) • How do we measure outcomes for service users of Grant Funded/Preventative Services? • How can we make best use of this valuable data to provide a holistic approach to the planning, development and commissioning of preventative services? • Next steps?
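On the individualised data question, one direction sometimes discussed for this kind of problem (offered here purely as a hedged sketch, not an agreed NWPL or Information Centre approach) is pseudonymised matching: each organisation hashes the same identifying fields with a shared secret, so the authority can de-duplicate individuals across schemes and Care Plans without ever seeing identities. The shared key and field choices below are placeholders, and any real scheme would still need Data Protection advice.

```python
# Hypothetical sketch: pseudonymised tokens for de-duplication without
# sharing identities. SHARED_KEY and the identifying fields are placeholders.
import hashlib
import hmac

SHARED_KEY = b"distributed-out-of-band"  # placeholder secret, agreed offline

def pseudonym(name, dob):
    """Keyed hash of normalised identifiers: the same person always maps to
    the same opaque token, so tokens can be matched across organisations
    while the key guards against guessing identities from common names."""
    data = "{}|{}".format(name.strip().lower(), dob).encode()
    return hmac.new(SHARED_KEY, data, hashlib.sha256).hexdigest()

# Two schemes report the same person; the authority counts one unique token.
tokens_scheme_a = {pseudonym("J. Smith", "1931-04-02")}
tokens_scheme_b = {pseudonym("j. smith ", "1931-04-02")}
print(len(tokens_scheme_a | tokens_scheme_b))  # -> 1 unique service user
```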