
'Quality'? Indicators from combining Likert items and other measures from surveys

Explore the intricacies of combining Likert items and other survey measures into quality indicators, presented by Professor Diana Kornbrot of the University of Hertfordshire. Understand how the way components are aggregated can change results that are used for critical decision-making. Learn about indicator ranking in public survey statistics, higher education evaluation, and economic development benchmarking. Get insights on the importance of weighting, normalization procedures, and their impact on ranking measures. Discover the nuances of principal components analysis and the role of output measures in league rankings.



Presentation Transcript


  1. 'Quality'? Indicators from combining Likert items and other measures from surveys
  Professor Diana Kornbrot, University of Hertfordshire, d.e.kornbrot@herts.ac.uk

  2. Scene Setting
  • Background experience
    • Evaluating higher education
    • Benchmarking for the HEA, student experience
    • Psychological measurement and mathematical modelling
  • Public indicator statistics are composite
    • Components all of the same type
      • 1-5 ordinal: agreement or quality (Likert items); money
    • Components mixed
      • Economic development: 1-5 ordinal + money + mortality + literacy
      • Higher education: 1-5 ordinal + money + failure + employment
  • The aggregation method used to combine components changes the results (a sketch follows this slide)
  • Important decisions are made on indicator ranks
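
As a hedged illustration of the point that the aggregation method changes the results, here is a minimal Python sketch of a composite built from mixed components (an ordinal Likert mean, money, mortality, literacy). The units, figures and equal weights are invented for illustration, not data from the talk; changing either the normalisation or the weights can reorder the units.

import pandas as pd

# Illustrative data only: three units "A", "B", "C" and four mixed components.
data = pd.DataFrame(
    {
        "likert_mean": [3.8, 4.2, 3.1],   # mean of 1-5 ordinal items
        "income": [21000, 34000, 18000],  # money
        "mortality": [7.2, 5.1, 9.8],     # lower is better
        "literacy_pct": [92, 99, 85],
    },
    index=["A", "B", "C"],
)

# Min-max normalise each component to 0-1, flipping the "lower is better" one.
norm = (data - data.min()) / (data.max() - data.min())
norm["mortality"] = 1 - norm["mortality"]

# Equal weights here; any other choice of weights can reorder the units.
weights = {"likert_mean": 0.25, "income": 0.25, "mortality": 0.25, "literacy_pct": 0.25}
composite = sum(w * norm[col] for col, w in weights.items())
print(composite.sort_values(ascending=False))   # composite index, best first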

  3. Getting the Data
  • Higher education
    • HESA http://www.hesa.ac.uk/holisdocs/home.htm
      • By institution
      • By subject
      • Pay to get by institution by subject
    • HOLIS: search by subject area, university, gender, degree class
    • THES http://www.thes.co.uk/statistics/university_performance/
    • PUSH http://www.push.co.uk/pushguide/shortlist.jsp £14.95
    • UCAS
  • Economic development

  4. Times Higher Education Supplement: League Tables 2006
  • Entry standards: A-level points, ST 100
  • Student-to-staff ratio: 8 - 34, ST 100
  • Student satisfaction: Likert (4 min - 20 max), 13.7 - 16.1, ST 150
  • Research assessment exercise: Pubs 0 - 7, bizarre, ST 200
  • Library and computing spending: £s, 3-year average
  • Facilities spending: £s, 3-year average
  • Good honours: % of graduates, ST 100
  • Completion: % of entrants
  • Graduate destinations: % known, ST 100
  • Salaries 2004/05
    • VC pay 2004/2005: £104k - £305k
    • Average FT salary 2004/2005: £21.7k - £43.2k
  (a sketch of combining such measures follows this slide)
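
A hedged Python sketch of how measures like those above might be combined: each measure is scaled so that the best-scoring institution receives the measure's stated maximum (100, 150 or 200), the scaled scores are summed, and institutions are ranked. The institutions and raw figures are made up, and the scaling rule is an assumption about what the "ST" points mean, not the published THES method.

import pandas as pd

# Invented raw figures for three hypothetical institutions.
raw = pd.DataFrame(
    {
        "entry_standards": [320, 410, 280],             # A-level points
        "staff_per_student": [1 / 20, 1 / 12, 1 / 28],  # ratio inverted so higher is better
        "student_satisfaction": [14.1, 15.8, 13.9],     # 4-20 Likert-based scale
        "rae": [4.5, 6.2, 3.0],                         # research assessment score
    },
    index=["Uni X", "Uni Y", "Uni Z"],
)

# Assumed maximum points per measure, following the 100/150/200 pattern above.
max_points = pd.Series({"entry_standards": 100, "staff_per_student": 100,
                        "student_satisfaction": 150, "rae": 200})

scaled = raw / raw.max() * max_points        # top scorer on a measure gets its full points
league = scaled.sum(axis=1).sort_values(ascending=False)
print(league)                                # simple league table, best first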

  5. National Student Satisfaction
  • Form
    • For each statement, show the extent of agreement or disagreement:
      5 Definitely agree, 4 Mostly agree, 3 Neither agree nor disagree, 2 Mostly disagree, 1 Definitely disagree, N/A Not applicable
    • The quality from my university for each feature was:
      5 Excellent, 4 Good, 3 Satisfactory, 2 Weak, 1 Poor, N/A Not applicable
  • Examples
    • 8. I have received detailed comments on my work
    • 19. The course has helped me to present myself with confidence
  • Topics (a scoring sketch follows this slide)
    • The teaching on my course: Q1-4
    • Assessment and feedback: Q5-9
    • Academic support: Q10-12
    • Organisation and management: Q13-15
    • Learning resources: Q16-18
    • Personal development: Q19-22
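
A minimal Python sketch of turning Likert responses like these into topic scores: each topic score is the mean of its items on the 1-5 scale, ignoring N/A answers. The question groupings follow the slide (first three topics shown); the response data are randomly generated and purely illustrative.

import numpy as np
import pandas as pd

# Topic blocks as listed on the slide.
topics = {
    "teaching": ["Q1", "Q2", "Q3", "Q4"],
    "assessment_feedback": ["Q5", "Q6", "Q7", "Q8", "Q9"],
    "academic_support": ["Q10", "Q11", "Q12"],
}

# 100 simulated students answering items Q1-Q12 on the 1-5 agreement scale.
rng = np.random.default_rng(0)
responses = pd.DataFrame(
    rng.integers(1, 6, size=(100, 12)).astype(float),
    columns=[f"Q{i}" for i in range(1, 13)],
)
responses.iloc[::17, 3] = np.nan             # sprinkle in some N/A answers

# Topic score = mean over students of the mean of that topic's items (N/A ignored).
topic_scores = {name: responses[items].mean(axis=1).mean()
                for name, items in topics.items()}
print(topic_scores)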

  6. Making the League Table
  • Actual aggregation procedure
    • Normalize so the maximum is 100
    • Decide on weighting
    • Aggregate
    • Do the weightings matter? (see the sketch after this slide)
  • Desirable?
    • Normalize to z-scores
    • Distinguish input and output measures
    • Exploratory principal components?
    • Univariate multiple regression > canonical input measure?
    • MANOVA? > canonical input, output measure?
  • Rank
  • Categorize league? Where are the big differences?
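
A hedged Python sketch of why the choice between "normalize so the maximum is 100" and z-scores matters: with the same two invented measures, the two recipes put three hypothetical institutions in different orders.

import pandas as pd

# Invented figures chosen so the two recipes disagree.
raw = pd.DataFrame(
    {"satisfaction": [15.9, 14.0, 15.0],   # narrow range, close to its maximum
     "spend": [600, 2100, 1500]},          # wide range
    index=["Uni X", "Uni Y", "Uni Z"],
)

max100 = raw / raw.max() * 100             # scale so each measure's best score is 100
zscore = (raw - raw.mean()) / raw.std()    # scale so each measure has mean 0, sd 1

ranks = pd.DataFrame({
    "rank_max100": max100.sum(axis=1).rank(ascending=False),
    "rank_zscore": zscore.sum(axis=1).rank(ascending=False),
})
print(ranks)   # the two orderings differ, so the aggregation recipe matters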

  7. Principal Components: 2 components

  8. Principal Components
  • Oblimin rotation
    • Component correlation: .199
  • Varimax rotation (a varimax sketch follows this slide)
    • Component 1: 41.0% of variance
    • Component 2: 41.0% of variance
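
A minimal sketch in Python (not the analysis behind the slide's figures) of extracting two principal components from a set of measures and applying a varimax rotation, using only numpy. The input matrix is random stand-in data; an oblique oblimin rotation, as reported above, would need a dedicated library.

import numpy as np

def varimax(loadings, gamma=1.0, max_iter=50, tol=1e-6):
    """Kaiser varimax rotation of a (variables x components) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    last = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0)))
        )
        rotation = u @ vt
        if last and s.sum() / last < 1 + tol:
            break
        last = s.sum()
    return loadings @ rotation

# Random stand-in for (institutions x measures) league-table data.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))

R = np.corrcoef(X, rowvar=False)                 # correlation matrix of the measures
eigval, eigvec = np.linalg.eigh(R)               # eigenvalues in ascending order
top2 = np.argsort(eigval)[::-1][:2]              # keep the two largest components
loadings = eigvec[:, top2] * np.sqrt(eigval[top2])
print(varimax(loadings).round(2))                # rotated loadings, variables x 2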

  9. Input and Output

  10. League Measures: Raw

  11. League Measures: Aggregate

  12. On Evaluating HE
  • Misleading if not by discipline
  • Shows ideas
  • League tables uninformative
    • Need predictors
    • Weighting matters?
  • Benchmarking e-learning
    • Untested target: embed e-learning
  • Implications
    • Better tools from public statistics

  13. On Indicators Generally
  • Economics very similar
  • Can show almost anything
  • Needs sound basis
    • Predictors
    • Weighting?

  14. Thank You
  Kornbrot, Radical Statistics Meet, 24 Feb 2007: Quality of Public Survey Statistics

  15. Original Abstract
  • Single quality indicators are often derived from weighted sums of survey items, including ordinal Likert items. It is widely believed that the weights and combining procedures do not make much difference to the final rankings or scores. This presentation challenges that assumption using examples from the National Student Survey of UK higher education and country development indices.
