[Resampled Range of Witty Titles] Understanding and Using the NRC Assessment of Doctorate Programs Lydia Snover, Greg Harris & Scott Barge Office of the Provost, Institutional Research Massachusetts Institute of Technology • 2 Feb 2010
Overview

*NB: All figures/data in this presentation are used for illustrative purposes only and do not represent a known institution.

• Background & Context
• Approaches to Ranking
• The NRC Model: A Modified Hybrid
• Presenting & Using the Results
Background & Context
• Introduction
• History of NRC Rankings
• MIT Data Collection Process
Participating MIT Programs
Section 2: Approaches to Ranking
How do we measure program quality?
• Use indicators ("countable" information) to compute a rating:
   • Number of publications
   • Funded research per faculty member
   • Etc.
• Try to quantify more subjective measures through an overall perception-based rating:
   • Reputation
   • "Creative blending of interdisciplinary perspectives"
Section 3: The NRC Approach
So how does the NRC blend the two? The NRC used a modified hybrid of the two basic approaches:
• In total, a four-step, indicator-based process, carried out by field
• The process yields two sets of indicator weights developed through faculty surveys:
   • "Bottom-up": importance ratings of individual indicators
   • "Top-down": perception-based ratings of a sample of programs
• Multiple iterations (re-sampling) model "the variability in ratings by peer raters"*

*For more information on the rationale for re-sampling, see pp. 14-15 of the NRC Methodology Report.
STEP 1: Gather raw data on programs from institutions, faculty, and external sources. Random University (RU) submitted data for its participating doctoral programs.
STEP 2: Use faculty input to develop weights:
• Method 1: Direct prioritization of indicators: "What characteristics (indicators) are important to program quality in your field?"
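A minimal sketch of what Method 1's "direct" weights could look like, assuming (hypothetically; the scale and data below are illustrative, not the NRC's) that each faculty member rates each indicator's importance and the ratings are averaged and normalized:

```python
import numpy as np

# Hypothetical survey responses: each row is one faculty member's
# importance rating (0-4 scale, assumed) for each of 5 indicators.
survey = np.array([
    [4, 3, 1, 2, 4],
    [3, 4, 2, 1, 3],
    [4, 4, 1, 3, 2],
], dtype=float)

# "Direct" weights: mean importance per indicator, normalized to sum to 1.
direct_weights = survey.mean(axis=0)
direct_weights /= direct_weights.sum()
print(direct_weights.round(3))
```

The normalization simply ensures the weights can be applied as a weighted average of the standardized indicator values.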
• Method 2: A sample of faculty each rate a sample of 15 programs, from which indicator weights are derived (via principal components and regression).
STEP 3: Combine both sets of indicator weights and apply them to the raw data: weights × program data = rating.
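Step 3 reduces to a weighted sum. A minimal sketch with made-up numbers (the weights and indicator values below are purely illustrative):

```python
import numpy as np

# Hypothetical combined indicator weights (summing to 1) and one
# program's standardized indicator values.
weights = np.array([0.35, 0.25, 0.15, 0.15, 0.10])
program_indicators = np.array([1.2, 0.4, -0.3, 0.8, 1.5])

# A program's rating is the weighted sum of its indicator values.
rating = float(weights @ program_indicators)
print(round(rating, 3))  # → 0.745
```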
STEP 4: Repeat steps 500 times for each field:
A) Randomly draw ½ of faculty "important characteristics" surveys
B) Calculate "direct" weights
C) Randomly draw ½ of faculty program-rating surveys
D) Compute "regression-based" weights
E) Combine weights
F) Repeat (A)-(E) 500 times to develop 500 sets of weights for each field
G) Randomly perturb institutions' program data 500 times*
H) Use each pair of iterations (one perturbation of data (G) + one set of weights (F)) to rate programs and prepare 500 ranked lists
I) Toss out the lowest 125 and highest 125 rankings for each program and present the remaining range of rankings

*For more information on the perturbation of program data, see pp. 50-51 of the NRC Methodology Report.
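The resampling loop and the trimming in step (I) can be sketched as follows. This is a toy simulation, not the NRC's code: the perturbation model, weight resampling, and program data are all stand-ins, but the ranking and trim-to-middle-250 logic mirrors the steps above.

```python
import numpy as np

rng = np.random.default_rng(42)
n_programs, n_iter = 20, 500

# Hypothetical baseline ratings; each iteration perturbs them to stand in
# for one resampled set of weights plus one perturbation of program data.
base_ratings = rng.normal(size=n_programs)
ranks = np.empty((n_iter, n_programs), dtype=int)
for i in range(n_iter):
    perturbed = base_ratings + rng.normal(0, 0.5, n_programs)
    # Rank 1 = highest-rated program in this iteration.
    ranks[i] = (-perturbed).argsort().argsort() + 1

# Step (I): discard each program's 125 lowest and 125 highest rankings
# and report the surviving range (the middle 250 of 500 rankings).
sorted_ranks = np.sort(ranks, axis=0)
low = sorted_ranks[125]             # best surviving rank per program
high = sorted_ranks[n_iter - 126]   # worst surviving rank per program
for p in range(3):
    print(f"program {p}: rank range {low[p]}-{high[p]}")
```

This is why the NRC reports a *range* of rankings per program rather than a single number: the spread of the middle 250 rankings expresses the uncertainty in the rating process itself.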
Section 4: Presenting & Using the Results
What are the indicators?
What will the results look like?
• TABLE 1: Program values for each indicator, plus overall summary statistics for the field
• TABLE 2: Indicators and indicator weights: one standard deviation above and below the mean of the 500 weights produced for each indicator through the iterative process (plus a locally calculated mean)

*n.s. in a cell means the coefficient was not significantly different from 0 at the p = .05 level.
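The ±1 standard deviation interval reported in Table 2 is straightforward to compute from the 500 resampled weights. A minimal sketch with simulated weights (the distribution below is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical: the 500 resampled weights for a single indicator.
weights_500 = rng.normal(loc=0.2, scale=0.03, size=500)

# Table 2 reports the mean and one standard deviation above and below it.
mean = weights_500.mean()
sd = weights_500.std()
low, high = mean - sd, mean + sd
print(f"weight range: {low:.3f} to {high:.3f} (mean {mean:.3f})")
```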
• TABLE 3: Range of rankings for RU's Economics program alongside other programs, with overall and dimensional rankings
• TABLE 4: Range of rankings for all of RU's programs
Q&A
Resources: for more information…
• The full NRC Methodology Report: http://www.nap.edu/catalog.php?record_id=12676
• Helpful NRC Frequently Asked Questions page: http://sites.nationalacademies.org/pga/Resdoc/PGA_051962