On “Quality and Equity in the Performance of Students and Schools”
Robert M. Hauser
Vilas Research Professor of Sociology
The University of Wisconsin-Madison
About Ch. 4 of PISA 2006: Analysis
• Valuable
• Impressive
• Complex
• Well-documented
• But the analysis can usefully be amended and extended
Context: The Coleman-Campbell Report (1966)
• Substantial equality in school resources and facilities
• Most outcome variation within schools
• Absolute outcome differentials increased with grade; relative positions unchanged
• Student SES contributed to outcome differentials
• Black-white differentials not explained
Coleman-Campbell: Four consequences
• Report suppressed
• Extensive and useful critical responses, e.g., Mosteller-Moynihan
• Shift of research and policy from resources to outcomes
• Shift to longitudinal thinking about achievement processes
PISA 2006: Science Achievement
• Has anything changed?
• Narrower scope (science)
• Narrower age (15)
• Broader geography (57 countries)
• But has not gone beyond the findings of 1966
PISA 2006: Assessment
• Real value in a comparative overview
• Reliance on a small set of variables limits what we can learn
• Over-reaches in the effort to draw policy implications
Problems: Two examples
• Effects of school socioeconomic context
  • Discredited more than 35 years ago
  • Largely a consequence of poor model specification
• Reading policy from within- and between-school slopes of science achievement on SES, and dispersion around those lines
  • Ignores roles of time and place in the schooling process
Outcome: Overall Science Achievement
“The overall impact of home background on student performance tends to be similar for science, mathematics and reading in PISA 2006. Therefore, to simplify the presentation and avoid repetition, this chapter limits the analysis to [overall] student performance in science, the focus area in 2006.”
What’s wrong here?
• If science is not different, what is the report about?
  • General academic achievement?
  • Why invest in measuring scientific knowledge and skill?
• Does homogeneity in school and SES effects really hold?
  • Across countries?
  • Across regions?
• Loss of statistical power
Between- and Within-School Variance in Science Achievement 1
• Wonderfully descriptive graphic (Fig. 4.1)
• Dramatic inter-country differences (4.7% in Finland vs. 69.6% in Bulgaria)
• But the reference value is the average OECD variance
• Country order would change using the actual share of between-school variance in each nation:
  • U.S.: 29.1 vs. 23.2
  • Hungary: 60.5 vs. 70.4
  • Bulgaria: 69.6 vs. 55.0
  • Azerbaijan: 17.9 vs. 51.8
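The point about the reference value can be sketched numerically. This is a minimal illustration with made-up variance components (not the actual PISA figures): expressing each country's between-school variance as a percentage of a common reference variance can order countries differently than expressing it as a share of each country's own total variance.

```python
# Sketch: between-school variance share computed two ways.
# All variance components here are illustrative, not actual PISA values.
def shares(between, within, ref_total):
    pct_of_ref = 100 * between / ref_total           # PISA convention: % of average OECD variance
    pct_of_own = 100 * between / (between + within)  # % of the country's own total variance
    return pct_of_ref, pct_of_own

ref = 100.0  # hypothetical average OECD total variance
# Country A: low total variance; country B: high total variance
a = shares(between=30, within=40, ref_total=ref)
b = shares(between=35, within=70, ref_total=ref)
# Relative to the common reference, B looks more stratified (35 > 30),
# but relative to its own total variance, A is the more stratified system
# (42.9% vs. 33.3%), so the country ordering reverses.
print(a, b)
```

The same reversal is what the slide's U.S., Hungary, Bulgaria, and Azerbaijan pairs display with the actual data.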
Between- and Within-School Variance in Science Achievement 2
• Shares don’t add up in the OECD
  • Average between schools: 33.1
  • Average within schools: 68.1
• Actual OECD
  • Average between schools: 33.1
  • Average within schools: 66.9
• All 57 countries
  • Average between schools: 36.1
  • Average within schools: 63.9
Between- and Within-School Variance in Science Achievement 3
• What does substantial between-school variance tell us?
  • A single system with supposedly equal resources and opportunities, vs.
  • A world-wide collection of schools
    • Varying organization and processes
    • Multiple grade levels
• What is surprising here?
The PISA SES Index 1
• Accounts for half or more of between-school variance in most countries
• Accounts for a small fraction of within-school variance in most countries
• Why?
The PISA SES Index 2
• Differential bias
  • Aggregated SES at the school level is highly reliable
  • Almost all of the unreliability is at the student level (within schools) – reliabilities of 0.52 to 0.80
  • How can we compare countries?
• Unmeasured heterogeneity
  • School SES is positively correlated with the “good” and negatively correlated with the “bad”
  • But we don’t know what’s what
• The school SES correlation poses a question, not an answer
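The differential-bias point follows from classical attenuation: measurement error in student-level SES shrinks the within-school slope toward zero by the reliability factor, while school-mean SES is nearly error-free. A minimal simulation, with an assumed true slope and a reliability in the cited 0.52–0.80 range:

```python
# Sketch of differential attenuation under classical measurement error.
# true_slope and the error variances are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
true_slope = 40.0    # hypothetical achievement points per SD of true SES
reliability = 0.6    # within the 0.52-0.80 range cited for student reports

ses_true = rng.standard_normal(n)
achieve = true_slope * ses_true + rng.normal(0, 60, n)
# Observed SES = true SES + noise, scaled so Var(true)/Var(observed) = reliability
ses_obs = ses_true + rng.normal(0, np.sqrt(1 / reliability - 1), n)

# OLS slope of achievement on observed SES: attenuated toward
# true_slope * reliability = 24
obs_slope = np.cov(ses_obs, achieve)[0, 1] / np.var(ses_obs)
print(round(obs_slope, 1))
```

Because the attenuation factor differs across countries (0.52 vs. 0.80 is a large range), the within-school slopes being compared are biased by different amounts, which is exactly why cross-country comparison is problematic.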
The PISA SES Index 3: Construction
• IRT scaling
• Three components
  • Highest parent’s education
  • Highest parent’s occupational status (ISEI)
  • Count (?) of many possessions
    • 3 possessions unique to each country
• Weights assigned in relation to a principal components analysis
• Standardized to mean = 0, sd = 1 in the OECD
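The weighting-and-standardizing step can be sketched as follows. This is a schematic stand-in, not the actual PISA procedure (which uses IRT scaling for the possessions component): standardize the three components, weight by the first principal component, and rescale the composite to mean 0, sd 1.

```python
# Sketch of a PCA-weighted SES composite. The data and correlation
# structure are simulated stand-ins, not the actual PISA components.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
# Correlated stand-ins for parental education, ISEI, and home possessions
base = rng.standard_normal(n)
educ = base + rng.standard_normal(n)
isei = base + rng.standard_normal(n)
posn = base + rng.standard_normal(n)
X = np.column_stack([educ, isei, posn])

Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each component
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
w = eigvecs[:, -1]                                 # first principal component weights
index = Z @ w
index = (index - index.mean()) / index.std()       # mean = 0, sd = 1
print(index.mean().round(2), index.std().round(2))
```

Note what the slide's "deconstruction" then emphasizes: the weights come out differently wherever the component correlations differ, so the composite is not the same instrument in every country.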
The PISA SES Index 4: Deconstruction
• Not the same in each country (or comparable across years of the PISA study)
• Standardization eliminates the metrics
• Claim that separate effects of the components cannot be estimated because of collinearity
• What are the effects of the several components of SES?
  • Do those effects differ among countries? Or population groups within countries?
• Use the MIMIC model (Hauser-Goldberger 1971)
The MIMIC model
SES variables → Composite → Achievements
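The diagram's structure can be written out. In the MIMIC (Multiple Indicators, Multiple Causes) specification, a latent composite is caused by the observed SES variables and is in turn indicated by the observed achievements; the notation below is a generic sketch of that setup, not Hauser and Goldberger's exact parameterization.

```latex
% Latent composite \eta caused by observed SES variables x_j
% (education, ISEI, possessions):
\eta = \sum_j \gamma_j x_j + \zeta
% Observed achievements y_k (science, mathematics, reading)
% as indicators of the composite:
y_k = \lambda_k \eta + \epsilon_k
```

Because the cause coefficients γ are estimated jointly with the indicator loadings λ, the separate contributions of the SES components are identified even when the components are correlated, which answers the collinearity objection above.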
Achievement Regressions 1
• SES accounts for
  • 20.2 percent of achievement variance across OECD countries
  • An average of 14.4 percent within OECD countries
• The relationship is close to linear
Achievement Regressions 2: Country Differences in Means
• No description of the adjustment method
• Spreadsheet: means were compared on individual country slopes
  • Confounds slope and composition differences
• Country positions change when adjusted on the average (OECD) slope
  • Mexico, Turkey, Azerbaijan, Brazil, Colombia, and Thailand: 11 to 17 points lower
  • Indonesia, Macao-China, and Tunisia: 24 to 29 points lower
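The adjustment issue can be made concrete. All numbers below are illustrative: adjusting a country's mean to a common SES value with its own slope mixes slope differences into the comparison, while a single pooled slope holds composition constant on the same terms for every country.

```python
# Sketch of covariate (regression) adjustment of country means.
# The means, slopes, and reference SES are hypothetical.
def adjusted_mean(mean_y, mean_ses, slope, ref_ses=0.0):
    """Predicted achievement if the country's mean SES were ref_ses."""
    return mean_y - slope * (mean_ses - ref_ses)

# A low-SES country (mean SES -1.0, raw mean 420) with a steep own slope:
own    = adjusted_mean(420, -1.0, slope=45)  # adjusted on its own slope
pooled = adjusted_mean(420, -1.0, slope=30)  # adjusted on a common pooled slope
print(own, pooled)  # 465.0 vs. 450.0: a 15-point gap from the slope choice alone
```

Differences of this size are in line with the 11-to-29-point shifts the slide reports for Mexico, Indonesia, and the others.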
Achievement Regressions 3: Percentage of Variance Explained (R²)
• A measure of the strength of the association between the PISA SES Index and achievement on the combined science scale
• “On average across OECD countries, 14.4% of the variation in student performance in science within each country is associated with the PISA index of economic, social and cultural status. This figure is significantly higher than the OECD average in Luxembourg, Hungary, France, Belgium, the Slovak Republic, Germany, the United States, New Zealand and the partner countries Bulgaria, Chile, Argentina and Uruguay.”
But values of R² are not comparable
• R² depends jointly on the regression slope, the SES variance, and the unexplained (error) variance; countries can differ on any one of these
What to do?
• Compare the error variances among countries
• And the picture is very different!
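Why the error variance is the better comparison can be shown in a few lines. In this illustrative calculation, two "countries" share an identical slope and identical unexplained variance, yet their R² values differ substantially just because their SES variances differ:

```python
# Sketch: R^2 is not comparable across populations with different SES variance.
# slope and err_var are illustrative, held identical for both "countries".
slope = 40.0        # achievement points per SD of SES
err_var = 3600.0    # same unexplained variance in both

for ses_var in (0.5, 2.0):  # country A vs. country B
    explained = slope**2 * ses_var
    r2 = explained / (explained + err_var)
    print(ses_var, round(r2, 3))  # 0.182 vs. 0.471
```

By the R² criterion, country B looks far more SES-stratified; by the error-variance criterion (arguably the equality-of-opportunity measure), the two are identical.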
Achievement Regressions 4: Countries, Achievement, and SES
• Figure 4.10 cross-classifies countries by unadjusted achievement and percentage of variance explained
• Figure 2 cross-classifies countries by SES-adjusted achievement and error variance
• The two pictures are completely different
An example: The U.S.
• Figure 4.10: The U.S. is near the center, slightly below the OECD average in science achievement and above average in percentage of variance explained.
• Figure 2: The U.S. is slightly above average in science achievement (for all nations) and far above average in error variance (equality of educational opportunity).
Within- and Between-School Regressions: Figures 4.11 and 4.12
Within- vs. Between-School Slopes 1
• Plainly, the between-school slopes are much steeper than the within-school slopes in almost every country
• The average ratio is 3:1
• What does the report make of this?
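The slide's later point that the slope gap reflects model incompleteness can be previewed with a simulation. In this illustrative setup (all parameters assumed), an omitted school-level factor correlated with school SES makes the between-school slope roughly three times the within-school slope even though the individual-level SES effect is constant:

```python
# Sketch: between-school slope exceeding the within-school slope when an
# omitted school-level factor is correlated with school SES. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n_schools, n_per = 500, 40
school_ses = rng.standard_normal(n_schools)
# Unmeasured school-level factor (resources, climate, ...) tied to school SES
school_effect = 30 * school_ses + rng.normal(0, 10, n_schools)
ses = school_ses[:, None] + rng.standard_normal((n_schools, n_per))
y = 15 * ses + school_effect[:, None] + rng.normal(0, 40, (n_schools, n_per))

def ols_slope(x, y):
    return np.cov(x.ravel(), y.ravel())[0, 1] / np.var(x.ravel())

within = ols_slope(ses - ses.mean(axis=1, keepdims=True),
                   y - y.mean(axis=1, keepdims=True))   # demeaned: ~15
between = ols_slope(ses.mean(axis=1), y.mean(axis=1))   # school means: ~45
print(round(within), round(between))
```

The roughly 3:1 ratio here is driven entirely by the omitted school-level variable, so the slope contrast by itself cannot identify which of the report's many candidate mechanisms is at work.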
Within- vs. Between-School Slopes 2
“peer group effects … a better learning environment and access to better educational resources at school … the manner in which students are allocated to schools … classes and programmes within schools … fewer disciplinary problems, better teacher-student relations, higher teacher morale, and a general school climate that is oriented towards higher performance. … a faster-paced curriculum. … Talented and motivated teachers … parents … more engaged”
Within- vs. Between-School Slopes 3
• But in listing everything, the report tells us nothing
• No evidence beyond a difference in slopes
• And the report claims not to suggest causality (after doing so)
• But wait, there’s more …
Within- vs. Between-School Slopes 4
• How did the disparity between the two sets of slopes grow so much larger visually?
• Figure 4.12 compares the effects of the interquartile range of school-level SES (25th to 75th percentiles)
• With the effects of the distance between the 40th and 60th percentiles of student-level SES
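The size of that visual inflation is easy to quantify. Assuming an approximately normal SES distribution (an assumption for illustration), the interquartile range is about 2.7 times as wide as the 40th-to-60th-percentile gap, so the between-school bars are stretched by that factor relative to the within-school bars before any slope difference enters:

```python
# Sketch: width of the IQR vs. the 40th-60th percentile gap for a
# standard normal distribution (normality assumed for illustration).
from statistics import NormalDist

nd = NormalDist()
iqr = nd.inv_cdf(0.75) - nd.inv_cdf(0.25)      # ~1.35 sd
narrow = nd.inv_cdf(0.60) - nd.inv_cdf(0.40)   # ~0.51 sd
print(round(iqr / narrow, 2))                  # ~2.66
```

Combined with the roughly 3:1 slope ratio, the figure thus displays something like an eightfold between/within contrast.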
Why are the slopes different?
• The model is woefully incomplete
• Other background variables?
  • Immigration
  • Language
  • Sibling size
• Student psychological characteristics
• Student and parent activities
• Unreliability of student reports
• Absence of any school characteristics (Ch. 5)
Comparing Within and Between-School Regressions: Policy Implications?
Why not?
• The PISA Index of SES obscures as much as it illuminates
  • Content varies
  • No metric
  • Differential reliability
• Again, the model is woefully incomplete
• Achievement develops across time – across about 10 years by age 15
Summation
• The basic outline of the report is right, but
• Much more to be done to address SES-achievement relationships
• More detailed analyses would yield richer policy implications
• It’s worth the effort