
OECD country practices on reporting and implications for Australia

This document discusses the OECD's country practices on reporting and their implications for Australia. It covers topics such as curriculum, assessment, reading literacy, social equity, and the importance of evaluation in education systems.


Presentation Transcript


  1. OECD country practices on reporting and implications for Australia. Barry McGaw, Director for Education, Organisation for Economic Co-operation and Development. 2005 Curriculum Corporation Conference, Curriculum and assessment: Closing the gap. Brisbane, Australia, 2–3 June 2005

  2. National reflections on PISA results.

  3. Germany seeking more specific comparisons. [Quadrant chart: reading literacy (vertical axis) plotted against social equity (horizontal axis, defined as the OECD regression slope minus the country's regression slope), dividing countries into four quadrants: high quality/high equity, high quality/low equity, low quality/high equity, low quality/low equity.] Source: OECD (2001) Knowledge and skills for life, Table 2.3a, p. 253.
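The equity index on this slide's horizontal axis (OECD regression slope minus the country's regression slope) can be sketched numerically. The sketch below uses entirely synthetic student data, not PISA figures: scores are regressed on a socio-economic index, and a country whose scores depend less steeply on background than the pooled OECD data receives a positive index (higher social equity).

```python
import numpy as np

def gradient(ses, score):
    """Slope of a least-squares regression of score on socio-economic status."""
    return np.polyfit(ses, score, 1)[0]

rng = np.random.default_rng(0)

# Synthetic, purely illustrative data: one country's students ...
ses_country = rng.normal(0, 1, 500)
score_country = 500 + 25 * ses_country + rng.normal(0, 80, 500)

# ... and a pooled "OECD" sample with a steeper socio-economic gradient.
ses_oecd = rng.normal(0, 1, 5000)
score_oecd = 500 + 40 * ses_oecd + rng.normal(0, 90, 5000)

# A flatter country gradient than the pooled gradient means results depend
# less on student background, i.e. a positive (more equitable) index.
equity_index = gradient(ses_oecd, score_oecd) - gradient(ses_country, score_country)
print(round(equity_index, 1))
```

With these invented slopes (40 for the pool, 25 for the country) the index comes out positive, placing the hypothetical country on the high-equity side of the chart.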

  4. Denmark’s concern about efficiency. [Scatter plot: reading literacy plotted against cumulative expenditure per student to age 15 ($US equivalent, PPP).] Source: OECD (2001) Knowledge and skills for life, Fig. 3.7a, p. 91.

  5. Place of national assessments
  • Recognition that more could be known domestically
    • Denmark judged to have no ‘culture of evaluation’
    • Some countries report that international comparisons have stimulated domestic evaluations
  • Why bother?
    • Weighing a pig won’t make it fatter.
    • Without weighing the pig, how can you know how well the feeding regime is working?
  • Purposes of assessment
    • Accountability – summative
    • Improvement – diagnostic, formative
  • Scope
    • Sample or census?
    • Depends on whether the focus is on the system, schools, or students

  6. Driving system reform may be helped by having disaggregated data. Improving systems by monitoring and improving units: education, health.

  7. A UK health example: Reducing wait time in Accident and Emergency Units

  8. % of A&E patients waiting no more than 4 hrs. [Chart: 2001–2004 trend; no improvement occurring. Annotations: target first introduced in National Health Service Plan, June 2000; Public Service Agreement announced, June 2001.] Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.

  9. % of A&E patients waiting no more than 4 hrs. [Chart: 2001–2004 trend. Annotation: Dept of Health taskforce begins.] Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.

  10. % of A&E patients waiting no more than 4 hrs. [Chart: 2001–2004 trend. Annotations: Dept of Health taskforce begins; Accident & Emergency included in hospital star ratings.] Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.

  11. % of A&E patients waiting no more than 4 hrs. [Chart: 2001–2004 trend. Annotations: Dept of Health taskforce begins; Accident & Emergency included in hospital star ratings; incentive scheme introduced; target revised to take account of clinical exceptions; performance management; tailored support for specific problems.] Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.

  12. % of A&E patients waiting no more than 4 hrs. [Chart: 2001–2004 trend.] Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.

  13. Driving system reform may be helped by having disaggregated data. Improving systems by monitoring and improving units: schools? teachers?

  14. Data form and data use
  • Breadth of data
    • What we measure signals what we value.
    • Does what we don’t measure signal what we don’t value?
    • Watch for unintended consequences.
  • Type of data
    • student performances
    • measurements of current performance
    • estimates of value added by school
    • comparisons with ‘like’ schools
    • other data on schools – input, process, outcomes?
  • Uses of data
    • school (and system) only
    • public use
    • results or rank orders
    • website accessibility (UK, Norway, Just4kids, Standard & Poor’s)
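The slide's "estimates of value added by school" can be illustrated with one common, purely schematic approach: regress current attainment on prior attainment across all students, then average each school's residuals. The `value_added` helper and the data below are invented for illustration and are not the method of any particular system; real value-added models adjust for many more factors.

```python
import numpy as np

def value_added(prior, current, school_ids):
    """Mean residual per school from a regression of current on prior attainment.

    A positive value suggests pupils scored above what their intake would
    predict. Illustrative sketch only, not an official value-added model.
    """
    slope, intercept = np.polyfit(prior, current, 1)
    residuals = current - (slope * prior + intercept)
    return {s: residuals[school_ids == s].mean() for s in np.unique(school_ids)}

rng = np.random.default_rng(1)

# Three hypothetical schools of 200 pupils each, with identical intakes
# but different built-in "school effects" (+5, 0, -5 score points).
schools = np.repeat(["A", "B", "C"], 200)
prior = rng.normal(50, 10, 600)
effect = np.where(schools == "A", 5, np.where(schools == "B", 0, -5))
current = 10 + 0.9 * prior + effect + rng.normal(0, 8, 600)

va = value_added(prior, current, schools)
```

Because the three schools share the same intake distribution here, the residual means recover the built-in school effects; with real data, differences in intake are exactly what make "like school" comparisons contentious.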

  15. Do assessment programmes make a difference?
  • Research evidence
    • a little, but it is positive – programmes improve systems
    • Hanushek, E.A. & Raymond, M.E. (2004) The effect of school accountability systems on the level and distribution of student achievement, Journal of the European Economic Association, 2(2–3), 406–415.
  • System evidence
    • England reports improvement among the poorest-performing schools
    • Less improvement among the next 15%
  • Importance of monitoring the monitoring
    • Evaluating trends in performance
      • overall
      • for subgroups (as in US No Child Left Behind Act requirements)
    • Evaluating interventions intended to improve performance

  16. Thank you.
