This report summarizes the findings from the NHS National Staff Survey 2018 for NHS South Tyneside CCG. It provides an overview of results, compares them to external and internal benchmarks, and includes Dartboard charts.
NHS National Staff Survey 2018 Management report NHS South Tyneside CCG February 2019
Contents
Executive summary
Introduction
Overview of results
Positive score summary
External benchmarks
Internal benchmarks
Dartboard charts
Appendix 1: Results poster
Appendix 2: How your scores are calculated
Executive summary • Section 1
Executive summary (part 1 of 2)
This document summarises the findings from the NHS National Staff Survey 2018, carried out by Picker on behalf of NHS South Tyneside CCG. Picker was commissioned by 66 Clinical Commissioning Group organisations to run their survey; this report presents your results in comparison to those organisations. A total of 90 questions from the survey can be positively scored, 82 of which can be compared historically between NSS17 and NSS18. Your results include every question where your organisation had the minimum required 11 respondents.
*Chart shows the number of questions that are better, worse, or show no significant difference.
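The minimum-respondents rule mentioned above can be illustrated with a short sketch. This is a hypothetical example, not Picker's implementation: it simply drops any question answered by fewer than 11 staff before results are reported.

```python
# Hypothetical sketch of the suppression rule: questions with fewer than
# 11 respondents are excluded from reported results.
MIN_RESPONDENTS = 11

def reportable_questions(response_counts):
    """Return the questions that meet the minimum-respondents threshold.

    response_counts maps question code -> number of staff who answered it.
    """
    return {q: n for q, n in response_counts.items() if n >= MIN_RESPONDENTS}

# Example usage with made-up counts
counts = {"Q2a": 54, "Q2b": 53, "Q4d": 9}
print(reportable_questions(counts))  # Q4d is suppressed (only 9 respondents)
```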
Introduction • Section 2
Survey background
The NHS National Staff Survey runs every year, and all eligible trusts in England are required to conduct it. As an approved survey contractor, we worked with 66 Clinical Commissioning Group organisations on the 2018 NHS National Staff Survey. This report shows your results in comparison to the average of those organisations (the “Picker Average”).
Methodology
The questionnaire used for the NHS National Staff Survey 2018 was developed by the NHS Staff Survey Coordination Centre together with the NHS Advisory Board. NHS England has published comprehensive guidelines on which staff must be included in the survey, available here: http://www.nhsstaffsurveys.com/Caches/Files/ST18_Participating%20organisations%20guidance.pdf The guidelines also include a copy of the mandated core survey.
Reporting
This report uses the “positive score” as its primary unit of measurement. This allows you to compare your results historically, and to other similar organisations, on a question-by-question basis for all questions that can be positively scored. For detailed information about positive scores, significant differences and sample sizes, please see Appendix 2.
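Appendix 2 gives the authoritative definition of the positive score. As a rough illustration only (an assumption, not the exact NSS method), a question's positive score can be thought of as the percentage of respondents who chose a response option classed as positive for that question:

```python
# Illustrative positive-score calculation (see Appendix 2 for the exact
# definition used in this report). Here we assume the score is simply the
# percentage of respondents whose answer is classed as "positive".
def positive_score(responses, positive_options):
    """responses: list of answer strings (None for unanswered);
    positive_options: set of answers counted as positive.
    Returns a whole-number percentage, or None if nobody answered."""
    answered = [r for r in responses if r is not None]
    if not answered:
        return None
    positive = sum(1 for r in answered if r in positive_options)
    return round(100 * positive / len(answered))

# Example: Q2a "look forward to going to work", positive = Often/Always
answers = ["Always", "Often", "Sometimes", "Rarely", "Often", "Never"]
print(positive_score(answers, {"Often", "Always"}))  # -> 50
```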
How to use this report
When deciding which areas to act upon, a useful approach is to look at a particular section and follow these steps:
1. Identify any questions where you wish to highlight the results. The positive score summary is the first step in picking out any questions where the results are significantly different to the Picker Average. This allows you to feed back on where your organisation performs better than the average, as well as where you may wish to focus improvement activities.
2. Review your organisation's performance over time. Our report highlights significant changes from your previous survey and longer-term trends over the last several years. Are there particular areas which have been improving or declining over time?
3. Consider how your organisation compares externally. Look at the External Benchmark charts to see the range of scores and how you compare with the other organisations that took part in the survey. This will give you an indication of where you stand and what a realistic ambition is for areas where you may wish to improve.
4. Compare areas within your organisation. Go to the Internal Benchmark section to see where good practice could be shared and which areas may need attention.
Overview of results • Section 3
Survey activity
Overall response rate: 61% (total returned as a % of total eligible)
Average response rate for similar organisations: 78%
Response totals:
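As a worked example of the arithmetic behind the figures above (the counts below are illustrative only, not the CCG's actual response totals):

```python
# Response rate = total surveys returned / total eligible staff, as a percentage.
# The counts below are made up purely to show the arithmetic.
returned = 61
eligible = 100
response_rate = 100 * returned / eligible
print(f"Overall response rate: {response_rate:.0f}%")  # -> 61%
```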
League table: overall positive score
The overall league table shows how your overall positive score ranks against the overall positive score of every other Clinical Commissioning Group organisation that ran the NHS National Staff Survey with Picker this year.
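A league table of this kind amounts to a simple sort. The sketch below uses hypothetical organisation names and scores, ranked from highest to lowest overall positive score:

```python
# Hypothetical sketch of an overall positive-score league table.
scores = {
    "CCG A": 78,
    "NHS South Tyneside CCG": 74,
    "CCG B": 71,
    "CCG C": 69,
}

league_table = sorted(scores.items(), key=lambda item: item[1], reverse=True)
for rank, (org, score) in enumerate(league_table, start=1):
    print(f"{rank}. {org}: {score}%")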
League table: historic positive score
The historical league table shows how your overall positive score changed from last year, and how this change compares to the changes seen in other Clinical Commissioning Group organisations that ran the NHS National Staff Survey with Picker this year.
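The historic league table ranks organisations by the year-on-year change in their overall positive score rather than by the score itself. A minimal sketch with made-up figures:

```python
# Hypothetical sketch of the historic league table: organisations ranked by
# year-on-year change in overall positive score (2018 minus 2017).
scores_2017 = {"CCG A": 75, "NHS South Tyneside CCG": 70, "CCG B": 73}
scores_2018 = {"CCG A": 78, "NHS South Tyneside CCG": 74, "CCG B": 71}

changes = {org: scores_2018[org] - scores_2017[org] for org in scores_2018}
for org, change in sorted(changes.items(), key=lambda item: item[1], reverse=True):
    print(f"{org}: {change:+d} percentage points")
```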
Positive score summary • Section 4: historical and organisation type comparison
This section compares your latest results to your historical scores, as well as to the Picker Average, across a 5-year period. The average scores have been calculated from all Clinical Commissioning Group organisations that commissioned Picker to conduct their survey.
How to read the tables
The historical comparison tables contain positive scores, where higher scores indicate better performance. For an in-depth explanation of positive scoring, see Appendix 2. Coloured cells show where a particular score is significantly different to the score in the column to its left (e.g. your organisation compared to the average, or this year to last year): green cells indicate a significantly improved score, and red cells a significantly worse score. For an in-depth explanation of significance testing, see Appendix 2. The left-hand section of the table contains historical scores, showing all your positive scores for previous years; the right-hand side shows your score for this year against the average for similar organisations.
Example table: historical comparisons (columns grouped under “Historical” and “Organisation type”)
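Appendix 2 describes the significance testing used to colour the cells. Purely as an illustration of the general idea (not the exact method used in this report), a common way to compare two positive scores is a two-proportion z-test, sketched below with made-up respondent counts:

```python
# Illustrative two-proportion z-test for comparing two positive scores.
# Appendix 2 describes the actual method used in this report; this sketch
# just shows the general idea with made-up counts.
import math

def two_proportion_z(pos_a, n_a, pos_b, n_b):
    """Return the z statistic for the difference between two proportions."""
    p_a, p_b = pos_a / n_a, pos_b / n_b
    pooled = (pos_a + pos_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Example: 40 of 55 positive this year vs. 30 of 52 positive last year
z = two_proportion_z(40, 55, 30, 52)
print(f"z = {z:.2f}; significant at the 5% level if |z| > 1.96")
```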
External benchmarks • Section 5: comparison of your results against other organisations
External benchmarking compares the experiences in your organisation with those in other Clinical Commissioning Group organisations. This allows you to understand where your performance sits in relation to the overall trend (i.e. the “Picker Average”). Each blue bar shows the range of performance for a specific question, which helps to highlight where improvements are possible or where resources could valuably be concentrated.
How to read the tables
Example chart: Your job (part 1 of 3), with scores running from Worse Score (left) to Better Score (right).
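The blue range bars can be thought of as the spread of scores across all participating organisations for a question. A hedged sketch, using hypothetical data, of how such a range and the Picker Average might be derived:

```python
# Hypothetical sketch of an external benchmark for one question: collect the
# positive scores of all participating organisations, then report the range,
# the average ("Picker Average"), and where your own score sits.
question_scores = [58, 62, 65, 67, 70, 71, 74, 76, 80]  # made-up scores
your_score = 71

picker_average = sum(question_scores) / len(question_scores)
print(f"Range: {min(question_scores)}%-{max(question_scores)}%")
print(f"Picker Average: {picker_average:.0f}%")
print(f"Your score: {your_score}% "
      f"({'above' if your_score > picker_average else 'at or below'} average)")
```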
External benchmark charts (each shows scores from Worse Score to Better Score):
Your job (parts 1–3)
Your managers (parts 1–2)
Your health, wellbeing and safety at work (parts 1–3)
Your personal development
Your organisation
Background information
Internal benchmarks • Section 6: comparison of results within your organisation
Internal benchmarking charts allow you to compare experiences easily within your organisation, showing you where the problem areas and top performers are across every positively scored question.
How to read the tables
Each chart shows how people across your organisation responded to a particular question. Each coloured bar represents a different aspect of performance: positive responses (green), neutral responses (amber), and negative responses (red).
Example chart: Your job, Q2a. Often/always look forward to going to work.
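The positive/neutral/negative split shown in these charts can be illustrated with a short sketch. The team names and responses below are hypothetical; the report's own charts break results down by group within your organisation:

```python
# Hypothetical sketch of an internal benchmark breakdown for one question:
# the share of positive, neutral and negative responses within each team.
from collections import Counter

responses_by_team = {
    "Team A": ["positive", "positive", "neutral", "negative", "positive"],
    "Team B": ["neutral", "negative", "negative", "positive"],
}

for team, responses in responses_by_team.items():
    counts = Counter(responses)
    total = len(responses)
    shares = {k: round(100 * counts.get(k, 0) / total)
              for k in ("positive", "neutral", "negative")}
    print(team, shares)
```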
Internal benchmark charts: Your job
Q2a. Often/always look forward to going to work
Q2b. Often/always enthusiastic about my job
Q2c. Time often/always passes quickly when I am working
Q3a. Always know what work responsibilities are
Q3b. Feel trusted to do my job
Q3c. Able to do my job to a standard I am pleased with
Q4a. Opportunities to show initiative frequent in my role
Q4b. Able to make suggestions to improve the work of my team/dept
Q4c. Involved in deciding changes that affect work
Q4d. Able to make improvements happen in my area of work