Measuring Learning and Improving Education Quality: International Experiences in Assessment
John Ainley
South Asia Regional Conference on Quality Education for All
New Delhi, India, October 24-26, 2007
Quality education for all
• Shift from provision to outcomes
• Emergence of large-scale assessment programs
• Developments in methods and reporting
• Developments in applications
• Assessments used to:
  • Monitor variations over time in relation to:
    • Established standards / criteria
    • Changes in policy and practice
  • Map variations within countries to establish action targets:
    • Regions and sub-regions
    • Sub-groups of students
  • Contextualise national patterns:
    • In relation to international patterns
    • In relation to comparable countries
Large-scale assessment surveys
• Conducted at various levels:
  • International
  • Regional
  • National
  • Sub-national (state or province)
• Provide information at various levels:
  • System
  • School
  • Classroom
  • Parent and student
• Indicate what is valued
• Impact teaching and learning
• Drive change in policy and practice
International assessment studies

OECD PISA
• Population and samples: 15-year-olds in school; probability-proportional-to-size (PPS) school sample; random selection of students within schools
• Domains: reading literacy, mathematics literacy, science literacy
• Cycle: three years, since 2000

IEA studies
• Populations and samples: Grade 4, Grade 8 and Grade 12; PPS school sample; random selection of classrooms
• Domains: reading (PIRLS, Grade 4), mathematics (TIMSS), science (TIMSS)
• Cycle: TIMSS every four years since 1994/95, with antecedents back to 1964; PIRLS every five years since 2001
• Other studies: ICCS 1999 and 2009
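The PPS sampling noted for both programs selects schools with probability proportional to their size, so that students across large and small schools end up with roughly comparable selection probabilities. A minimal sketch of systematic PPS selection, assuming a hypothetical frame of (school_id, enrolment) pairs; operational surveys add stratification and special handling of very large schools:

```python
import random

def pps_systematic_sample(frame, n):
    """Systematic PPS selection: schools are laid end to end on a line
    whose total length is total enrolment, and n equally spaced points
    pick the schools whose spans they fall in. `frame` is a list of
    (school_id, enrolment) pairs (hypothetical field names); a school
    larger than the sampling interval can be selected more than once."""
    total = sum(size for _, size in frame)
    interval = total / n                    # distance between selection points
    point = random.uniform(0, interval)     # random start
    chosen, cumulative = [], 0
    for school_id, size in frame:
        cumulative += size
        while point <= cumulative:
            chosen.append(school_id)        # this school's span covers the point
            point += interval
    return chosen

# e.g. draw 2 schools from a tiny illustrative frame:
frame = [("A", 300), ("B", 1200), ("C", 500), ("D", 1000)]
print(pps_systematic_sample(frame, 2))
```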
International assessment studies (continued)

OECD (PISA)
• Framework: expert development, consultation, future needs
• Domain coverage: rotated booklet design
• Data sources: school and student questionnaires (teacher questionnaire an option in 2009)
• Psychometrics: one-parameter IRT
• Reporting: scale with a standard deviation of 100; proficiency bands

IEA (TIMSS & PIRLS)
• Framework: curriculum analysis (opportunity to learn, OTL), common elements, what is taught
• Domain coverage: rotated booklet design
• Data sources: school, student and teacher questionnaires
• Psychometrics: three-parameter IRT
• Reporting: scale with a standard deviation of 100; proficiency bands
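For reference, the one-parameter and three-parameter IRT models named above have standard forms; the equations below are supplied as background rather than taken from the slides:

```latex
% One-parameter (Rasch) model, the approach used for PISA scaling:
P(X_i = 1 \mid \theta) = \frac{\exp(\theta - b_i)}{1 + \exp(\theta - b_i)}
% Three-parameter logistic model, used for TIMSS and PIRLS:
P(X_i = 1 \mid \theta) = c_i + (1 - c_i)\,\frac{1}{1 + \exp\!\left(-a_i(\theta - b_i)\right)}
% theta: student proficiency; b_i: item difficulty;
% a_i: item discrimination; c_i: pseudo-guessing parameter.
```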
Regional assessment studies
• Latin America
  • Latin American Laboratory for Assessment of the Quality of Education (LLECE)
  • Second Regional Comparative and Explanatory Study (SERCE)
  • Language, mathematics and science
• Africa
  • Southern Africa Consortium for Monitoring Educational Quality (SACMEQ)
  • Supported through IIEP
National assessment studies
• NAEP (USA): sequences over many years
• Key stage assessments (United Kingdom)
• Latin America (Puryear, 2007): rare in 1980, common by 2005
• Vietnam: 2001, 2007
• Australia
Sub-national assessments
• Typically in federal systems
• Australian state assessments
  • Equating at benchmark levels
  • Transition to a national assessment in 2008
• Germany
• Canada (Ontario)
Issues in national and international assessment surveys
• Domains and sub-domains assessed
• Census or sample
• Analysis
• Reporting
Assessment domains
• Typically:
  • Language (literacy, reading)
  • Mathematics (numeracy)
  • Sometimes science
• Coverage within domains
  • Multiple matrix designs
  • Rotated booklets to ensure coverage (see the sketch below)
• Other domains
  • Sample studies
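Rotated booklet (multiple-matrix) designs let an assessment cover far more items than any one student sees. A minimal sketch of a cyclic rotation, assuming hypothetical cluster labels, in which every item cluster appears in the same number of booklets and adjacent booklets share clusters for linking:

```python
def rotated_booklets(clusters, booklet_size):
    """Cyclic rotated-booklet design: booklet k contains `booklet_size`
    consecutive item clusters starting at cluster k, so each cluster
    appears in exactly `booklet_size` booklets and neighbouring booklets
    overlap (a sketch, not an operational design)."""
    n = len(clusters)
    return [[clusters[(k + j) % n] for j in range(booklet_size)]
            for k in range(n)]

# e.g. seven item clusters, three per booklet:
booklets = rotated_booklets(["C1", "C2", "C3", "C4", "C5", "C6", "C7"], 3)
# booklet 1 = [C1, C2, C3], booklet 2 = [C2, C3, C4], ..., booklet 7 = [C7, C1, C2]
```

A design like this gives every cluster equal exposure; operational designs additionally balance the position of each cluster within booklets.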
Grades or ages assessed
• Define the population by:
  • Age
  • Grade
• One grade or several?
  • One grade: end of the common period of schooling
  • Multiple grades: end of primary school, end of common secondary school, mid-primary school
Sample or census
• Advantages of a census
  • Reporting to schools, teachers and parents
  • Enough data to identify disadvantaged groups
  • Enough data to identify regional variations
• Advantages of sample studies
  • Cost-effective
  • Minimal disruption to school teaching programs
  • Can cover a wider range of areas
• Combinations of census and sample
  • Census for literacy and numeracy
  • Samples for other domains
Analysis issues
• Item response theory
  • Development of a common scale for student performance and item difficulty
  • Differences in detail between programs
• Vertical equating (long scales)
  • Overlapping common items
  • Common-person equating studies
• Horizontal equating (equating over time)
  • Common items carried over each cycle
  • Common-person equating studies (see the linking sketch below)
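One simple way to carry a scale across cycles with common items is mean-mean linking: the new calibration is shifted by the average difference in the common items' difficulty estimates. A sketch under that assumption; the operational procedures in these surveys are considerably more elaborate:

```python
def mean_mean_link(b_old, b_new):
    """Mean-mean common-item linking on a Rasch difficulty scale:
    estimate the shift between two calibrations as the mean difference
    in the difficulties of the items they share; subtracting it from
    every new-form difficulty puts the new cycle on the old scale."""
    return sum(n - o for o, n in zip(b_old, b_new)) / len(b_old)

# common items calibrated in two cycles (hypothetical values, in logits):
old = [-0.8, 0.1, 0.9, 1.4]
new = [-0.6, 0.3, 1.1, 1.6]
print(mean_mean_link(old, new))  # ~0.2 logits: the new calibration ran higher
```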
Reporting assessment data
• Reporting scales
  • Typically the mean for one grade is fixed (e.g. 400), with a standard deviation of 100 (see the transformation sketched below)
  • Examine distributions for different groups
• Proficiency bands (standards-referenced)
  • Defined in terms of item difficulties
  • Bands of equal width in difficulty
  • Describe what the items in a band represent
  • Report percentages in each band
• Standard-setting exercises
  • Define standards in terms of a proficient standard or a minimum competency
  • Panels of expert judges
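The reporting scale described above is a linear transformation of the underlying proficiency estimates; a sketch of the usual form, assuming the reference grade's mean is fixed at 400 as in the example:

```latex
% Reported score s from a proficiency estimate \theta, fixing the
% reference grade's mean at 400 and its standard deviation at 100:
s = 400 + 100 \, \frac{\theta - \bar{\theta}_{\mathrm{ref}}}{\sigma_{\mathrm{ref}}}
```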
Scale descriptions
• Provide an interpretation of scores
• Monitor student development
• Identify developmental continua
• Plan student learning
• Progress maps at state and school level
Establishing expected standards
• Consultation: what should a student be able to do?
• Different standards
  • Minimum competency
  • Proficient
  • Advanced
• Provide a basis for simple comparisons (see the standard-setting sketch below)
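The slides do not name a particular standard-setting procedure; one widely used approach is the Angoff method, in which each panellist estimates, item by item, the probability that a minimally proficient student would answer correctly. A sketch with hypothetical ratings:

```python
def angoff_cut_score(judge_ratings):
    """Angoff-style standard setting: the cut score is the judges' mean
    sum of per-item probabilities that a minimally proficient student
    answers correctly (one common method among several)."""
    sums = [sum(ratings) for ratings in judge_ratings]  # one total per judge
    return sum(sums) / len(sums)

# three judges rating five items (hypothetical probabilities):
ratings = [
    [0.9, 0.7, 0.5, 0.6, 0.4],
    [0.8, 0.6, 0.6, 0.5, 0.5],
    [0.9, 0.8, 0.4, 0.6, 0.3],
]
print(angoff_cut_score(ratings))  # ~3.03 of 5 raw-score points
```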
Achievement in relation to:
• Most students in the state
• The system average
• A defined benchmark
Uses of assessment
• Public information
  • About the system overall
  • About sections of the education system
• Accountability
• Directing resources and interventions
  • Groups of students
  • Levels of schooling
  • Schools
  • Individual students
• Defining learning progress
  • Establishing progress maps
  • Establishing standards
  • Providing examples of student work at different levels
• Evaluating programs and research: understanding "what works"
Public information
• Stimulating demand for education
• Identifying areas of need
  • Indigenous students
  • Boys' reading
  • How wide is the gap?
• Providing comparisons internationally
  • Staying the same
  • Relative change
Directing interventions
• Identifying disadvantaged students
  • Based on social characteristics
  • Based on diagnostic information (requires a census)
• Allocating funds
  • Chile: bottom 10% of schools
  • Australian states: bottom 15% of schools
  • Focus on the early years
• Providing a basis for intervention
  • In most education systems
  • Use of consultants to work with schools
  • Easier with census assessment
  • Education action zones
Evaluation and research
• Evaluating what works
  • Starting school
  • Approaches in early childhood
  • Impact of policy interventions
• Using data longitudinally
  • What contributes to enhanced growth
  • Value-added measures (NSW Smart Schools; see the sketch below)
  • Studying later progress (e.g. PISA Longitudinal)
• Uses of assessment data
  • Linkage to other data about schools
  • Literacy and numeracy in the middle years
  • Literacy development of boys
  • Effective teaching for literacy
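Value-added measures of the kind mentioned above ask how much students grew beyond what their starting point predicted. A deliberately crude sketch using simple least-squares residuals; real value-added models, including those behind the NSW measures, adjust for much more:

```python
import statistics

def value_added(prior, current):
    """Regress current scores on prior scores (simple least squares)
    and return each student's residual: the growth not predicted by
    where they started (a sketch, not an operational model)."""
    mx, my = statistics.fmean(prior), statistics.fmean(current)
    beta = sum((x - mx) * (y - my) for x, y in zip(prior, current)) / \
           sum((x - mx) ** 2 for x in prior)
    alpha = my - beta * mx
    return [y - (alpha + beta * x) for x, y in zip(prior, current)]

# hypothetical prior- and current-cycle scale scores for five students:
prior = [410, 455, 500, 520, 600]
current = [450, 470, 540, 610, 630]
print(value_added(prior, current))  # positive residual = above-expected growth
```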
Conclusions
• Assessment programs have grown
  • International, regional, national and sub-national
  • They have begun to influence policy and practice
  • Complementary roles at different levels
• Emergent design principles
  • Described scales and standards referencing
  • Higher-order skills and thinking
  • Domain coverage
  • Varied methods and formats
• Enhancing application
  • Report meaningfully
  • Provide interpretation
  • Balance pressure and support