Explore how research-driven decisions and assessment practices can enhance educational quality, address rising assessment costs, and promote equity and inclusivity, with a focus on implementable solutions and the role of assessment in driving quality.
The 13th Conference of the Southern Africa Association for Educational Assessment (SAAEA)
Cyprian Cele
May 21, 2019, Gaborone, Botswana
EMPHASISED YESTERDAY • Decisions driven by research and assessment information • Strengthening cooperation and harmonisation • Rising cost of assessment • Quality to back up quantity • Possibilities in assessment for educational quality • Outcome: Implementable solutions
WHY ARE WE HERE? • Share our research and assessment products in relation to their use in improving the quality of education: • better instruction and learning • accountability purposes, to improve the performance of education implementers and to improve learning • educational policy formulation
WHY HERE (CONT) • We are also looking at • how assessment practices and reports are being used to foster equity and inclusivity of learners • the extent to which ICT has been harnessed for better assessment practice and use • how learners who prefer to follow different educational pathways could be served with alternative assessments.
CONSISTENCY WITH SDGs • The SDGs were born at Rio+20 in 2012 • They replaced the MDGs, effective 2016 • The EFA thrust within the MDGs is continued by Goal 4 of the SDGs • EFA emphasised increasing primary enrolment; SDG 4 emphasises the quality of education • Quality cannot be attained without action: looking back, making the necessary adjustments and going forward.
DRIVING FORWARD • Vehicle: Curriculum • Road: Institutional infrastructure • Driver: Teacher and manager • Traffic officers: Educational planners/supervisors • Cameras: Researchers and assessors • GPS: ASSESSMENT INFORMATION
WHAT ASSESSMENTS ARE WE TALKING ABOUT? • Summative examinations • Survey studies • Other standardised tests, not discussed in this presentation • As we review them, focus on whether they are best suited to supplying information on our themes.
SUMMATIVE EXAMINATIONS • End of primary • End of Junior Secondary • End of Senior Secondary • Technical • Business • Other relevant tertiary levels • Pre-primary?
PURPOSES OF THE EXAMINATIONS • Gauge achievement levels of individuals • Selection • Certification • Other uses may be argued for
EXAMINATION DESIGN • Written papers (free-response or selection) • Practical • Coursework • Projects • Portfolio • Assessment on a continuous basis is being advocated more and more
OUTPUT • All the work done by a learner is processed to obtain scores • Scores may be raw marks or IRT scales • Scores are converted to grades or proficiency levels according to rules • Grades in different subjects are combined to obtain a division (some systems have no divisions); see the sketch below
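As an illustration of the conversion rules above, here is a minimal Python sketch. The grade boundaries, grade points and division cut-offs are all hypothetical, for illustration only; real examination boards set them per subject and per sitting.

```python
# Minimal sketch of the marks -> grades -> division pipeline described above.
# All boundaries and cut-offs here are hypothetical.

GRADE_BOUNDARIES = [(80, "A"), (70, "B"), (60, "C"), (50, "D"), (0, "E")]
GRADE_POINTS = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}  # lower aggregate = better

def mark_to_grade(mark: int) -> str:
    """Map a raw mark (0-100) to a letter grade using fixed cut-offs."""
    for cutoff, grade in GRADE_BOUNDARIES:
        if mark >= cutoff:
            return grade
    return "E"

def grades_to_division(grades: list) -> str:
    """Combine subject grades into a division via an aggregate of grade points."""
    aggregate = sum(GRADE_POINTS[g] for g in grades)
    if aggregate <= 8:
        return "Division I"
    if aggregate <= 16:
        return "Division II"
    return "Division III"

marks = {"Mathematics": 74, "English": 61, "Science": 83}
grades = [mark_to_grade(m) for m in marks.values()]
print(grades, grades_to_division(grades))  # ['B', 'C', 'A'] Division I
```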
COMMUNICATING TO STAKEHOLDERS AND PUBLIC • Individual performance is reported through the school, at subject level and at an overall or specified level • Percentages are computed by • subject • gender • district/region (a sketch follows below) • Qualitative reports on performance of learners • Statistical analysis of item performance • Comparisons with previous years
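A minimal sketch of such disaggregation, assuming a hypothetical results table with per-learner marks and an invented pass mark of 50:

```python
# Minimal sketch: disaggregating a pass rate by subject, gender and district.
# The table, column names and 50-mark pass threshold are all hypothetical.
import pandas as pd

results = pd.DataFrame({
    "subject":  ["Maths", "Maths", "English", "English"],
    "gender":   ["F", "M", "F", "M"],
    "district": ["North", "South", "North", "South"],
    "mark":     [62, 48, 71, 55],
})
results["passed"] = results["mark"] >= 50

# Percentage passing, broken down by each reporting dimension
for dim in ["subject", "gender", "district"]:
    print(results.groupby(dim)["passed"].mean().mul(100).round(1))
```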
SURVEY STUDIES • National Examples • Lesotho National Assessment of Educational Progress (LNAEP) • National Assessment of Progress in Education (NAPE) in Uganda.
REGIONAL EXAMPLES • Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) • Monitoring of Learning Achievement (MLA)
INTERNATIONAL EXAMPLES • Progress in International Reading Literacy Study (PIRLS) • Trends in International Mathematics and Science Study (TIMSS).
SURVEY STUDY PURPOSES • Reporting achievement at system level • Evaluating programmes • International/regional comparisons • Trend studies
DESIGN • Few subjects per cycle, often one or two • Literacy and numeracy have often been studied • Often a written test and a questionnaire • Selection and supply-type items (emphasising higher-order thinking skills, HOTS)
PROCESS • A learning level is decided on, e.g. Standard Four • A representative sample is obtained • Where IRT is applied, a large sample is required • Content is agreed upon; regional or international studies do not target a particular curriculum (which may be a compromise)
PROCESSING • Responses are coded • Coded responses show the weaknesses and strengths displayed by learners • Scores are then derived from the codes, as summarised under OUTPUT below
OUTPUT • Scores (IRT) obtained from the codes (see the note below) • Statistical analysis showing • overall performance • subgroup performance • comparisons: regional, international, trend • Profile of learners
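For orientation on IRT scaling: the simplest model in the family, the one-parameter (Rasch) model, relates the probability that learner $p$ answers item $i$ correctly to the learner's ability $\theta_p$ and the item's difficulty $b_i$. This is a standard textbook formulation shown for illustration, not necessarily the exact model any named study uses:

$$P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{e^{\theta_p - b_i}}{1 + e^{\theta_p - b_i}}$$

Abilities estimated this way place learners on a common scale, which is what makes the subgroup, regional, and trend comparisons above possible across different test forms.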
DO THE PROCESS AND PRODUCT HELP IN • improving instruction and learning • making accountability decisions • fostering equity and inclusivity • This presentation concentrates on these themes • Reservations about reliability and validity are assumed non-existent; otherwise press 'HOME' on the GPS.
QUESTIONS AND ANSWERS • I am asking questions • Questions point to what I do not know, and I would be happy to listen to papers and discussions covering them • There may be more important questions you have addressed • Papers and discussions will show us • the extent to which we have used assessment reports • whether we should package our feedback differently • research done but not harnessed • other issues on the use of information from assessment
INTERNAL ASSESSMENT AND LEARNING IMPROVEMENT • We may have papers documenting how teachers use internal assessment to improve their teaching strategies and thereby learning • How do teachers analyse learner responses to sieve out instructional improvement points? • How is this practice enforced at school level? • Could teachers be supported better in this?
EXTERNAL EXAMINATION REPORTS TO SUPPORT LEARNING • Do the reports reach schools: • managers • heads of departments • classroom teachers • Are the reports understood? • How is the information used to improve teaching and learning? • Is the tail wagging the dog: teaching to the test? • Are the reports found adequate, or is improvement needed?
EXAMINATION USE (CONT) • How are disaggregated data being used for instructional purposes: gender, region, etc.? • Observed: 15 - 9 + 5 - 3; 15 - (9 + 5) - 3 (worked through below) • Anything more from the massive data? • Use of ICT to provide item-level information: costs vs value? • Do teachers sieve out their instructional deficiencies from the examination results? • Are there effective instructional strategies out there?
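One plausible reading of the observed pair (the interpretation is mine; the slide does not spell it out): evaluated left to right, the expression gives a different result from the incorrectly bracketed version, exactly the kind of misconception that item-level data can surface:

$$15 - 9 + 5 - 3 = 8, \qquad 15 - (9 + 5) - 3 = -2$$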
SURVEY STUDIES (exams +) • Do sampled learners do their best in a low-stakes setting? • Do reports reach all, including those not sampled? • Are the results used for instructional improvement? • Reforms stimulated by these studies
ACCOUNTABILITY AND LEARNING IMPROVEMENT • Accountability is about value for money • Praise/reward if performance is good; otherwise punish • The teacher is in the front line • Can she explain poor performance? • The school manager oversees the teacher? • Should learner performance be part of performance contracting? • At what level should we draw the accountability line?
ACCOUNTABILITY FOR LEARNING IMPROVEMENT II • Are assessment reports enough for making accountability decisions? • Instances of applying accountability to drive the quality of education • What benefits/challenges have been encountered in applying accountability? • Does accountability have an impact on school-based scores? Any survival tactics? • Are there ways of applying accountability with minimal adverse effects?
CHICKEN OR EGG FOR ACCOUNTABILITY • Make the teacher happy before accountability, or vice versa: • housing • medicals • scholastic provisions • salary
TEACHER READINESS • How well was the teacher trained? • What in-service training does the teacher get? • Corrective inspection
EQUITY AND INCLUSIVITY • Does accommodation equalise task demands for all learners: • Blind • Deaf • Physical disability • Do the accommodated tests measure the same constructs? • Do we cater for the specially gifted in our assessments?
INCLUSIVENESS EXPANDED • Communities: • Do we make allowance for opportunity to learn? • Comparability within and between boards
CONCLUDING REMARKS • Assessment is a powerful tool. • It is the eye of the educator. • The challenge is to act on what we see in order to guide education towards quality.