Explore the intricacies of combining Likert items and other survey measures to assess quality indicators by Professor Diana Kornbrot from the University of Hertfordshire. Understand how aggregating components can change results and be used for critical decision-making. Learn about indicator ranking in public survey statistics, higher education evaluation, and economic development benchmarking. Get insights on the importance of weighting, normalization procedures, and the impact on ranking measures. Discover the nuances of principal components analysis and the role of output measures in league rankings.
'Quality'? Indicators from combining Likert items and other measures from surveys Professor Diana Kornbrot University of Hertfordshire d.e.kornbrot@herts.ac.uk
Scene Setting • Background Experience • Evaluating Higher Education • Benchmarking for HEA, Student Experience • Psychological Measurement and Mathematical Modelling • Public Indicator Statistics are Composite • Components all same type • 1-5 Ordinal: agreement or quality (Likert items); Money • Components mixed • Economic Development: 1-5 Ordinal + Money + Mortality + Literacy • Higher Education: 1-5 Ordinal + Money + Failure + Employment • Aggregation Method from Components Changes Results • Important Decisions made on Indicator Ranks Kornbrot, Radical Statistics Meet 24 Feb 2007: Quality of Public Survey Statistics
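The claim that the aggregation method changes results can be made concrete with a small sketch. The numbers and institution names below are invented for illustration, not taken from any survey: two weighting schemes over the same normalized component scores reverse the ranking.

```python
# Hypothetical illustration (invented scores, not NSS or THES data):
# the same two institutions, ranked under two different weighting schemes.
scores = {
    "Uni A": {"satisfaction": 0.90, "spending": 0.40},
    "Uni B": {"satisfaction": 0.70, "spending": 0.80},
}

def composite(components, weights):
    """Weighted sum of normalized component scores."""
    return sum(weights[k] * v for k, v in components.items())

w1 = {"satisfaction": 0.8, "spending": 0.2}   # satisfaction-heavy
w2 = {"satisfaction": 0.2, "spending": 0.8}   # spending-heavy

rank1 = sorted(scores, key=lambda u: composite(scores[u], w1), reverse=True)
rank2 = sorted(scores, key=lambda u: composite(scores[u], w2), reverse=True)
# rank1 puts Uni A first; rank2 puts Uni B first.
```

The reversal is the whole point: decisions made on indicator ranks inherit the (often arbitrary) choice of weights.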
Getting the Data • Higher Education • HESA http://www.hesa.ac.uk/holisdocs/home.htm • By institution • By subject • Pay to get by institution by subject • HOLIS Search by Subject Area, University, Gender, Degree Class • THES http://www.thes.co.uk/statistics/university_performance/ • PUSH http://www.push.co.uk/pushguide/shortlist.jsp £14.95 • UCAS • Economic Development
Times Higher Education Supplement • League Tables 2006 Entry standards A level points ST 100 Student-to-staff ratio 8 - 34 ST 100 • Student satisfaction (4min-20max) Likert 13.7 – 16.1 ST 150 Research assessment exercise Pubs 0 -7, bizarre ST 200 Library and computing spending £s 3 year average Facilities spending £s 3 year average • Good honours % of graduates ST 100 • Completion % entrants • Graduate destinations % known ST 100 • Salaries 04-05 VC Pay 2004/2005 £104k-£305k Average FT Salary 2004/2005 £21.7k-£43.2k
National Student Satisfaction • Form • For each statement, show extent of agreement or disagreement • 5 Definitely agree, 4 Mostly agree, 3 Neither agree nor disagree, 2 Mostly disagree, 1 Definitely disagree, N/A Not applicable • The quality from my University for each feature was: • 5 Excellent, 4 Good, 3 Satisfactory, 2 Weak, 1 Poor, N/A Not applicable • Examples • 8. I have received detailed comments on my work • 19. The course has helped me to present myself with confidence • Topics • The teaching on my course Q1-4 • Assessment and feedback Q5-9 • Academic support Q10-12 • Organisation and management Q13-15 • Learning resources Q16-18 • Personal development Q19-22
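Turning 1-5 Likert item responses into a topic score typically starts with an item mean that excludes N/A responses. The sketch below is a minimal illustration of that step with invented responses; it is not the NSS's actual scoring procedure, which the slide does not specify.

```python
# Sketch: per-topic mean of 1-5 Likert responses, with None standing in
# for "N/A" and excluded from the average. Responses are invented.
responses = {
    "assessment_feedback": [5, 4, None, 3, 4],   # e.g. Q5-9, one N/A
    "academic_support": [4, 4, 5],               # e.g. Q10-12
}

def topic_mean(items):
    """Mean of the valid (non-N/A) responses, or None if all are N/A."""
    valid = [x for x in items if x is not None]
    return sum(valid) / len(valid) if valid else None

topic_scores = {topic: topic_mean(items) for topic, items in responses.items()}
```

Even this step involves a choice (excluding N/A versus imputing it) that feeds into the final indicator.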
Making the League Table • Actual Aggregation Procedure • Normalize so maximum is 100 • Decide on weighting • Aggregate • Do weightings matter? • Desirable? • Normalize to z-scores • Distinguish input and output measure • Exploratory principal components? • Univariate multiple regression > canonical input measure? • MANOVA? > canonical input, output measure? • Rank • Categorize League? Where are big differences
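The two normalizations on this slide behave differently: scaling so the maximum is 100 preserves ratios to the top performer, while z-scores centre and standardize. A minimal sketch with invented spending figures:

```python
# Invented £-per-student figures for three institutions (illustrative only).
spending = [1200.0, 800.0, 400.0]

# Normalization 1: scale so the maximum is 100.
max_100 = [100 * x / max(spending) for x in spending]

# Normalization 2: standardize to z-scores (population mean and SD).
mean = sum(spending) / len(spending)
sd = (sum((x - mean) ** 2 for x in spending) / len(spending)) ** 0.5
z_scores = [(x - mean) / sd for x in spending]
```

Under max-100 scaling the middle institution keeps two-thirds of the top score; under z-scoring it sits exactly at zero. Weighted sums of the two versions need not rank institutions the same way.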
Principal Components 2 Components
Principal Components • Oblimin • Component correlation .199 • Varimax • Component 1: 41.0 %variance • Component 2: 41.0 %variance
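An unrotated exploratory principal components analysis can be sketched in a few lines of numpy; note this uses randomly generated data purely for shape, and the rotated solutions on the slide (oblimin, varimax) would require a separate factor-rotation step not shown here.

```python
# Minimal unrotated PCA sketch via eigendecomposition of the correlation
# matrix. Data are random placeholders, not league-table measures.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))               # 50 institutions, 4 measures
Xz = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each measure

corr = Xz.T @ Xz / len(Xz)                 # correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]          # largest component first
explained = eigvals[order] / eigvals.sum() # proportion of variance
loadings = eigvecs[:, order]               # unrotated component loadings
```

The proportions in `explained` are what the slide reports as "% variance" per component.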
Input and Output
League Measures Raw
League Measures Aggregate
On Evaluating HE • Misleading if not by discipline • Shows ideas • League Tables Uninformative • Need predictors • Weighting Matters? • Benchmarking E-Learning • Untested target: Embed E-Learning • Implications • Better tools from public statistics
On Indicators Generally • Economics Very Similar • Can show almost anything • Needs sound basis • Predictors • Weighting?
Thank You
Original Abstract • Single quality indicators are often derived from weighted sums of survey items, including ordinal Likert items. It is widely believed that the weights and combining procedures don't make much difference to the final rankings or scores. This presentation challenges this assumption using examples from the National Student Survey of UK higher education and country development indices.