Using CEM Data: Target Setting, Monitoring & Reporting
Belfast, March 6th 2013
Neil Defty, Business & Development Manager, CEM
Neil@cem.dur.ac.uk
Key Questions for Target Setting
• What type of valid and reliable predictive data should be used to set the targets?
• Should students be involved as part of the process (ownership, empowerment etc.)?
• Should parents be informed of the process and outcome?
Key points to consider might include:
• Where has the data come from?
• What (reliable and relevant) data should we use?
• Enabling colleagues to trust the data: staff training
• Communication with parents and students
• Challenging, NOT demoralising, students
• Storage and retrieval of data
• Consistency of understanding of what the data means and does not mean
Measuring Value Added – Terminology
[Chart: exam grade plotted against baseline score, showing the VA trend line (regression line), individual residuals, and the regions of +ve and -ve VA either side of the line.]
Measuring Value Added – An Example
[Chart: results (A*–U) in Subject A and Subject B plotted against baseline score (low to high ability), with the national trend line marking the 'average' student; three students (Aldwulf, Beowulf, Cuthbert) sit above, on, and below the line, illustrating +ve VA (+2 grades) and -ve VA (-2 grades).]
The position of the national trend line is of critical importance.
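To make the trend-line mechanics concrete, here is a minimal sketch in Python. The baseline scores, exam points, and straight-line fit are all invented for illustration; CEM's actual regression model is more sophisticated than a simple line of best fit.

```python
# Minimal sketch of the value-added idea: fit a trend line of exam
# points against baseline score, then read off each student's residual.
# All data here are invented; this is not CEM's actual model.
import numpy as np

baseline = np.array([45.0, 50.0, 55.0, 60.0, 65.0, 70.0])  # baseline test scores
points   = np.array([34.0, 40.0, 40.0, 46.0, 52.0, 58.0])  # exam points achieved

# "National trend line": predicted points for a given baseline score.
slope, intercept = np.polyfit(baseline, points, 1)
predicted = slope * baseline + intercept

# Residual: achieved minus predicted. Positive = above the trend line
# (+ve VA), negative = below it (-ve VA).
residuals = points - predicted
for b, r in zip(baseline, residuals):
    print(f"baseline {b:.0f}: residual {r:+.1f} points")
```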
Some Subjects are More Equal than Others…
[Chart: A-Level grade outcomes (A*–E) compared across subjects for students of the same ability, showing differences of more than one grade between subjects.]
Burning Question: What is my Value Added score?
Better Question: Is it important?
Value Added Charts – Pre-16
[Chart: VA scores divided into bands – performance above expectation (good practice to share?), performance in line with expectation, and performance below expectation (problem with teaching and learning?).]
Danger of Relying on Raw Residuals Without Confidence Limits
Which subjects cause most concern?
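The sketch below shows why raw residuals mislead. A subject's VA score (its mean standardised residual) is only 'interesting' if it falls outside the band of statistical noise, and that band shrinks as the number of results grows. The residuals and the approximate 95% band here are illustrative assumptions, not CEM's exact calculation.

```python
# Why confidence limits matter: a subject with a few large raw residuals
# can sit comfortably inside its limits, while a subject with many modest
# residuals can fall outside them. All numbers are invented.
import math

subjects = {                       # subject -> standardised residuals
    "Maths":   [0.4, -0.1, 0.6, 0.2, 0.3, 0.5, -0.2, 0.4],
    "History": [0.9, -1.1, 1.2],   # big raw residuals, but only 3 results
}

for name, res in subjects.items():
    n = len(res)
    va = sum(res) / n                                  # VA score
    sd = math.sqrt(sum((r - va) ** 2 for r in res) / (n - 1))
    limit = 1.96 * sd / math.sqrt(n)                   # approx 95% band
    flag = "interesting" if abs(va) > limit else "within expectation"
    print(f"{name}: VA {va:+.2f}, 95% limit ±{limit:.2f} -> {flag}")
```

Run this and Maths (many consistent small residuals) is flagged as interesting, while History (three noisy results) is not, despite its larger raw residuals.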
Value Added Charts – Post-16
SPC Chart
[Chart: yearly VA scores from 2000 to 2010 plotted as a statistical process control chart, with bands for performance above expectation (good practice to share?), in line with expectation, and below expectation (problem with teaching and learning?).]
Subject Summary – Current Year
Subject Summary – 3-Year Average
A2 English Literature – Statistical Process Control (SPC) Chart
[Chart: the subject's VA scores for 2008, 2009 and 2010 plotted year by year.]
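A rough illustration of the SPC logic: treat each year's VA score as a point on a control chart and check it against limits derived from the spread across years. The scores below are invented, and the ±2 standard deviation limits are a common SPC convention rather than CEM's own calculation.

```python
# Sketch of the SPC idea: flag years whose VA score strays outside
# control limits set from the historical spread. Data are invented.
import statistics

va_by_year = {
    2000: 0.10, 2001: -0.05, 2002: 0.15, 2003: 0.00, 2004: -0.10,
    2005: 0.05, 2006: -0.15, 2007: 0.10, 2008: -0.35, 2009: -0.50,
    2010: -0.70,
}

centre = statistics.mean(va_by_year.values())   # centre line
sigma = statistics.stdev(va_by_year.values())   # year-to-year spread

for year, va in va_by_year.items():
    if abs(va - centre) > 2 * sigma:
        status = "outside control limits - investigate"
    else:
        status = "in control"
    print(f"{year}: VA {va:+.2f} ({status})")
```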
A2 English Literature – Student Level Residuals (SLR) Report
[Scatter plot of individual students' residuals: general underachievement?]
A2 English Literature – Student Level Residuals (SLR) Report
[Scatter plot of individual students' residuals: too many U grades?]
Other things to look for…
• Why did these students do so badly?
• Why did this student do so well?
• How did they do in their other subjects?
Summary of Process
• Examine the Subject Summary
• Determine 'interesting' (i.e. statistically significant) subjects
• Look at the 3-year average as well as a single year, if available
• Look at trends in the 'interesting' subjects
• Examine student data – scatter graphs
• Identify students over- / under-achieving (see the sketch below)
• Any known issues?
• Don't forget to look at over-achieving subjects as well as under-achieving ones
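As a sketch of the student-level step, the snippet below flags individual standardised residuals beyond an assumed cut-off of ±2. Both the threshold and the names/data are illustrative, not CEM's rules.

```python
# Once a subject is flagged, scan individual standardised residuals
# for over- and under-achievers. Threshold and data are assumptions.
students = {                      # student -> standardised residual
    "Aldwulf": -2.4, "Beowulf": 0.3, "Cuthbert": 2.1, "Dunstan": -0.7,
}

THRESHOLD = 2.0                   # assumed cut-off for "worth a conversation"
for name, z in sorted(students.items(), key=lambda kv: kv[1]):
    if z <= -THRESHOLD:
        print(f"{name}: {z:+.1f} - underachieving, any known issues?")
    elif z >= THRESHOLD:
        print(f"{name}: {z:+.1f} - overachieving, good practice to share?")
```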
GCSE or Baseline Test?
• Do students with the same GCSE score from feeder schools with differing value-added have the same ability?
• How can you tell whether a student has underachieved at GCSE, and so maximise their potential?
• Has a student got very good GCSE scores through the school's effort rather than their ability alone?
• Does school GCSE value-added limit the ability to add value at KS5?
• Can you add value at every Key Stage?
The Effect of Prior Value Added
[Diagram: three students, each with average GCSE = 6, from schools performing beyond expectation (+ve value-added), in line with expectation (0 value-added), and below expectation (-ve value-added).]
Do these 3 students all have the same ability?
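A toy illustration of the question, under the loud assumption that a feeder school's value-added simply shifts GCSE grades up or down. This is not CEM's method, just a way to see the argument that identical GCSE scores can mask different underlying ability.

```python
# Toy model: same average GCSE, different feeder-school value-added,
# hence (under our crude assumption) different underlying ability.
students = [
    ("Student 1", 6.0, +0.5),   # school added value beyond expectation
    ("Student 2", 6.0,  0.0),   # school in line with expectation
    ("Student 3", 6.0, -0.5),   # school below expectation
]

for name, avg_gcse, school_va in students:
    underlying = avg_gcse - school_va   # crude adjustment, not CEM's method
    print(f"{name}: average GCSE {avg_gcse}, estimated ability {underlying}")
```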
Same School – Spot the Difference?
[Charts: the same school's value-added calculated with GCSE as the baseline and with a baseline test as the baseline.]
[Charts: value-added compared against all schools versus against independent schools only.]
[Charts: value-added compared against all schools versus against FE colleges only.]
Questions:
• How does the unit of comparison used affect the value-added data, and what implications does this have for your understanding of performance? (See the sketch below.)
• Does this have implications for self-evaluation?
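As a sketch of the first question: the same result can earn positive VA against one comparison cohort and negative VA against another, because each cohort fixes its own trend line. Both lines below are invented for illustration.

```python
# The comparison cohort moves the trend line, which moves the residual.
# Both trend lines here are hypothetical.
def predict(baseline, slope, intercept):
    return slope * baseline + intercept

baseline, achieved = 60.0, 50.0
all_schools = predict(baseline, 0.8, -2.0)   # hypothetical national line
independent = predict(baseline, 0.8,  4.0)   # hypothetical selective line

print(f"vs all schools:      residual {achieved - all_schools:+.1f}")
print(f"vs independent only: residual {achieved - independent:+.1f}")
```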
Definitions:
• Residual – the difference between the points the student attains and the points attained on average by students from the CEM cohort with similar ability
• Standardised Residual – the residual adjusted to remove differences between qualification points scales and for statistical purposes
• Average Standardised Residual – the 'Value Added score' for any group of results
• Subject VA – the average of standardised residuals for all students' results in a particular subject
• School VA – the average of standardised residuals for all students' results in all subjects for a school / college
• Confidence Limit – the area of statistical uncertainty within which any variation from 0 is deemed 'acceptable', and outside of which could be deemed 'important'
(See the sketch below for how these definitions fit together.)
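A short sketch tying the definitions together, assuming a standardised residual is the raw residual divided by the spread of residuals in the cohort; CEM's exact standardisation may differ, and the results are invented.

```python
# Residual -> standardised residual -> subject VA and school VA.
# Standardisation here (divide by cohort spread) is an assumption.
import statistics

# (student, subject, raw residual in points) - invented results
results = [
    ("Aldwulf",  "Maths",   +4.0),
    ("Aldwulf",  "English", -2.0),
    ("Beowulf",  "Maths",   -6.0),
    ("Cuthbert", "English", +8.0),
]

spread = statistics.stdev(r for _, _, r in results)
standardised = [(s, subj, r / spread) for s, subj, r in results]

# Subject VA: average standardised residual over one subject's results.
for subject in ("Maths", "English"):
    vals = [z for _, subj, z in standardised if subj == subject]
    print(f"{subject} VA: {sum(vals) / len(vals):+.2f}")

# School VA: average standardised residual over all results.
all_z = [z for _, _, z in standardised]
print(f"School VA: {sum(all_z) / len(all_z):+.2f}")
```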