Some lessons from researching practical science summative assessment
Ros Roberts
STEM 2010: Beyond the Classroom
Cambridge Botanical Gardens, 17th May 2010
Starting points: 1
• Summative assessment structure should encourage fieldwork in teaching, including:
  • Observations – 'seeing the world through a biologist's / scientist's eyes'
  • Solving problems / investigations
• Assessment structures should not distort practice in teaching and learning (cf. the 'routinisation' of a very limited range of Sc1 coursework)
Starting points: 2
• Assessment in a high-stakes summative assessment system must be reliable
• There are tensions between the reliability and validity of performance assessment
Assessing performance
• Assessing 'the ability to do an investigation out of the classroom'
• Via written accounts
• Up to 10 investigations are needed to get a reliable aggregate mark (see the sketch below). Valid and reliable, but impractical.
• Therefore reliability is increased, but validity reduced, by using fewer assessments and:
  • restricting the type of investigation (which can then be practised / routinised)
  • not really assessing 'doing', but giving credit mainly for the substantive ideas (predictions, hypothesising, background, etc.) which are also given credit in other parts of the exam system
• [Research suggests that the 'ability to investigate' is not directly correlated with other traditional measures of ability.]
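The claim that up to ten investigations are needed before an aggregate mark becomes dependable can be illustrated with the Spearman-Brown prophecy formula, which relates the reliability of a single task to the reliability of a mark aggregated over k comparable tasks. The sketch below only illustrates the shape of that relationship: the single-task reliability of 0.3 is an assumed value chosen for illustration, not a figure taken from the research referred to on this slide.

```python
def spearman_brown(single_task_reliability: float, k: int) -> float:
    """Reliability of a mark aggregated over k comparable tasks (Spearman-Brown)."""
    r = single_task_reliability
    return k * r / (1 + (k - 1) * r)

# Assumed, illustrative reliability for a single written investigation account.
r1 = 0.3

for k in (1, 2, 5, 10):
    print(f"{k:2d} investigation(s) -> aggregate reliability ~ {spearman_brown(r1, k):.2f}")
```

With these assumed numbers the aggregate mark only approaches the level usually expected of high-stakes assessment at around ten tasks, which is why cutting the number of tasks (and routinising them) trades validity for reliability.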
Alternatives?
• Assessing 'the ability to do an investigation out of the classroom'
• Can't be done in a mass assessment system
• So, what's the next best thing?
• Could focus on the ideas employed when investigating
Ideas in investigations – e.g. factors affecting shrimp distribution
What is to be assessed?
• Substantive ecological / biological ideas
• Concepts of evidence (the thinking behind the doing)
• Observation
• Skills – both written and practical
Domain specification
• i.e. what to teach and what to assess
• Clear articulation is essential for curriculum and assessment
• e.g. the Domain Specification for Procedural Understanding
  • Concepts of evidence – 'the thinking behind the doing'
  • These are the ideas that are needed to develop a procedural understanding
  • Validated against 'pure' and 'applied' workplace science
  • A sub-set is required for GCSE to enable pupils to solve practical problems and evaluate others' evidence
  • Validated by teachers attending our development sessions
• http://www.dur.ac.uk/rosalyn.roberts/Evidence/cofev.htm
Bring your own data:
• What did you investigate?
• Pick one example from your investigation where you were careful to make your design valid. Describe what you did, and explain how and why it helped to make the design valid.
• Pick one example from your investigation where you made a series of repeat measurements in order to take an average. Copy the series down. Explain how you decided that you had done enough measurements to believe that the average was reliable (one possible line of reasoning is sketched after this list).
• How did you present your data so that patterns could be easily identified? Explain why you chose this method.
• What pattern did you observe in the data you generated? Did all the data fit the pattern? If not, do you have any idea why?
• How did the evidence you obtained from the investigation relate to the original questions you were attempting to answer?
• How much do you trust the data you obtained? Explain your answer.
• Justify your choice of instruments in the investigation.
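As a hedged illustration of the reasoning behind the repeat-measurement question, the sketch below uses invented readings (assumptions for illustration, not data from any pupil's investigation) and shows one common way of judging whether enough repeats have been taken: checking whether the running mean and its standard error settle down as readings are added.

```python
import statistics

# Invented repeat readings for a single condition (illustrative assumption only).
readings = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]

for n in range(2, len(readings) + 1):
    sample = readings[:n]
    mean = statistics.mean(sample)
    spread = statistics.stdev(sample)   # sample standard deviation
    sem = spread / n ** 0.5             # standard error of the mean
    print(f"after {n} readings: mean={mean:.2f}, spread={spread:.2f}, standard error={sem:.3f}")

# One possible decision rule: stop repeating when an extra reading barely shifts
# the mean and the standard error is small compared with the differences that
# matter in the investigation.
```

This is only one defensible line of reasoning; the question as worded asks for the pupil's explanation of how they judged 'enough measurements', rather than any particular calculation.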
Possible response
Summary
• Valid and reliable summative assessment of performance ('doing an investigation') in a mass system is not possible
• Assessment should encourage fieldwork teaching and not distort practice
• Assessing the 'ideas' that have been taught (including with fieldwork) will be more reliable and, if they are specified well in the curriculum and assessment, will also have validity