Rob Smith: CEM Inset Provider
Working with Colleagues, Parents and Students
Course: Using CEM Data in Practice, Day 2, Session 3
Thursday 28th February 2013
therobsmith@hotmail.com
Interpreting IPRs: Exercise
Have a look at the three IPRs on the following pages. What do the scores suggest about the students, and how would you use this information to aid the teaching and learning process for each of them?
[IPR excerpt: Proof-Reading 88, PSA 108]
Case Study 1
You are given data relating to an institution where students completed the ALIS computer adaptive test. These students were chosen because they show significant differences between the various parts of the test. Remember that scores are standardised around 100.
a) Are there any apparent mismatches between the subjects being followed and this data?
b) What support can be given to those students who have weaknesses in Vocabulary or Mathematics?
c) How might predictions made for these students be tempered in the light of the inconsistencies in the test components and the missing average GCSE points scores?
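A minimal sketch of what "standardised around 100" means in practice, assuming the usual convention of mean 100 and standard deviation 15; the cohort marks below are invented for illustration, not real ALIS data.

from statistics import mean, stdev

def standardise(raw_scores):
    # Rescale raw marks so the cohort mean becomes 100 and the SD becomes 15.
    m, s = mean(raw_scores), stdev(raw_scores)
    return [round(100 + 15 * (x - m) / s) for x in raw_scores]

# Invented raw Vocabulary marks for a hypothetical cohort:
cohort = [34, 41, 28, 45, 39, 31, 50, 36]
print(standardise(cohort))  # 100 is the cohort average; 115 is one SD above it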
Case Study 2
What are the strengths and weaknesses of this A/AS level student? To use the IPR (Individual Pupil Record), familiarise yourself with the terms standard score, band, stanine, percentile and confidence band.
a) Which AS/A level subjects might be avoided?
b) This student chose English, Film Studies, Music Technology and Psychology. Is this a good choice? Do you foresee any problems?
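For reference, this hedged sketch shows how a standardised score relates to the percentile and stanine reported on an IPR, using the standard normal-curve definitions; CEM's exact banding conventions may differ slightly.

from math import erf, sqrt

def percentile(score, mu=100.0, sd=15.0):
    # Percentage of the cohort expected to score below this standardised score.
    z = (score - mu) / sd
    return 100 * 0.5 * (1 + erf(z / sqrt(2)))

def stanine(score, mu=100.0, sd=15.0):
    # Stanines cut the normal curve into nine half-SD-wide bands, with 5 in the middle.
    z = (score - mu) / sd
    return max(1, min(9, round(2 * z + 5)))

for s in (88, 100, 108):  # e.g. the Proof-Reading and PSA scores quoted earlier
    print(s, round(percentile(s)), stanine(s))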
Case Study 3
Here is the Individual Pupil Record from the ALIS computer adaptive test taken in Year 12 for a current Year 13 student. This student had a high positive value added in every GCSE subject, as measured using MidYIS as a baseline (average GCSE score 7.44). On the next page are her A level predictions and chances graphs.
Why are the predictions different? Are the chances graphs useful here?
Using the PARIS software and adjusting the predictions for prior value added in these subjects, A*s are predicted in three of the four subjects from a GCSE baseline. Doing the same from the adaptive test baseline, solid Bs might be predicted in all three. It is also worth looking at the value added at GCSE. See the commentary that follows.
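To see why the two baselines can disagree, here is an illustrative sketch of the underlying idea: each prediction is a cohort regression line applied to a different baseline measure, so a student with unusually high GCSE value added looks stronger from the GCSE baseline than from the adaptive test. All coefficients and the adaptive test score below are invented for illustration; they are not PARIS or ALIS values.

def predict_points(baseline, slope, intercept):
    # A CEM-style prediction is a cohort regression line applied to a baseline score.
    return intercept + slope * baseline

# Invented coefficients for one subject (old UCAS-style points: A* = 140, A = 120, B = 100):
from_gcse = predict_points(7.44, slope=20.0, intercept=-30.0)  # average GCSE baseline
from_cat = predict_points(112, slope=1.0, intercept=-12.0)     # hypothetical adaptive test baseline
print(f"From GCSE baseline: {from_gcse:.0f} points (about an A)")
print(f"From adaptive test: {from_cat:.0f} points (about a B)")
# High GCSE value added inflates the GCSE baseline relative to the adaptive test,
# so the two regression lines give different predictions for the same student.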
Commentary
The differences between the predictions from the GCSE baseline and from the computer adaptive test are interesting for some students, and they can run in either direction. Here there has been very large value added at GCSE, which may or may not be sustainable at A level. This student's history is shown below.
The value added at GCSE is between one and two grades (against the all-institution data from Year 7) and significantly positive in individual subjects (against the independent school data from Year 9).
If we measure this student's value added next year from an average GCSE score of 7.44 alone, it will not tell the whole story; we need to look at the value added from the adaptive test as well. The chances graphs should be used with extreme caution here, and a growth mindset is vital if they are shared with students.
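A chances graph is essentially a probability distribution of grades around the predicted score, which is why it must never be read as a personal prediction. This sketch shows one way such a graph can be generated, assuming outcomes spread roughly normally around the prediction; the grade boundaries and the spread are assumptions, not CEM's published figures.

from math import erf, sqrt

def norm_cdf(x, mu, sd):
    return 0.5 * (1 + erf((x - mu) / (sd * sqrt(2))))

def chances(predicted, sd=25.0):
    # Probability of landing in each grade band around the predicted points score.
    bands = [("A*", 130), ("A", 110), ("B", 90), ("C", 70), ("D", 50), ("E", 30)]
    out, upper = {}, float("inf")
    for grade, lower in bands:
        out[grade] = norm_cdf(upper, predicted, sd) - norm_cdf(lower, predicted, sd)
        upper = lower
    out["U"] = norm_cdf(30, predicted, sd)  # whatever falls below the E boundary
    return out

for grade, p in chances(100).items():
    print(f"{grade}: {p:.0%}")
# Every grade carries some probability: the graph describes a distribution,
# not a guarantee, which is why the caution above matters.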
Case Study 4
• A school uses the Yellis "predictions" to give target grades for each GCSE subject a pupil is taking.
• This target grade is called a "Baseline Suggested Grade" (BSG).
• Through the progress reporting system, teachers are asked to assess current progress against this BSG and to suggest the likely outcome at the end of the course (a hedged sketch of such a flagging rule follows this list).
• Over the page is a pupil's IPR followed by an end-of-term report.
• If you were the pupil's Form Teacher, how would you approach a discussion with his/her parents at a Parents' Evening?
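As an illustration of the progress-reporting idea, here is a hypothetical flagging rule comparing a teacher's current assessment against the BSG; the grade points and traffic-light thresholds are invented, not the school's actual policy.

GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def progress_flag(bsg, current):
    # Green = at or above the BSG, amber = one grade below, red = further below.
    gap = GRADE_POINTS[current] - GRADE_POINTS[bsg]
    return "green" if gap >= 0 else "amber" if gap == -1 else "red"

print(progress_flag("B", "A"))  # green: working above the suggested grade
print(progress_flag("B", "C"))  # amber: one grade below, worth a conversation
print(progress_flag("B", "D"))  # red: feedback to parents should explain why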
Colleagues
• Subject Teachers
• Heads of Department
• Pastoral Staff
• Managers
Subject Teachers/HODs
Common objections:
• "This will be interpreted as a personalised prediction."
• "The data doesn't work for this particular student."
• "You're raising false expectations – he'll never get that result."
• "You're making us accountable for guaranteeing particular grades – when the pupils don't get them we'll get sacked and the school will get sued."
Subject Teachers/HODs
Remind them that:
• Baseline data can give useful information about a pupil's strengths and weaknesses, which can assist teaching and learning.
• "Predictions" are not a substitute for their professional judgement.
Reassure them that:
• It is not a "witch hunt".
• Value added data is used to assess pupil performance, not teacher performance!
Pupils
• Make sure they know why they are taking the test.
• Make sure they take it seriously.
• Make sure they don't deliberately mess it up in order to lower their BSGs!
• Be prepared to look for clear anomalies and re-test if necessary.
• Explain the chances graphs to them clearly.
Parents
• Make sure they know why the pupils are taking the test.
• Explain the results to them.
• Explain, as often as needed, that the chances graphs and BSGs do NOT give personalised predictions.
• Ensure that they receive good-quality feedback from staff when ambers or reds are awarded.
• Encourage them to ask lots of questions.