CREOLE, 21 October 2009
Using English in final examinations: Rationales and realities
SPINE team
What is SPINE?
Student Performance in National Examinations: the dynamics of language in school achievement (SPINE)
www.bristol.ac.uk/spine (ESRC/DfID RES-167-25-0263)
Bristol team: Rea-Dickins, Yu, Afitska, Sutherland, Olivero, Erduran, Ingram, Goldstein
Zanzibar team: Z. Khamis, Mohammed, A. Khamis, Abeid, Said, Mwevura
LTA is high stakes: more than 50% of school-aged children leave school at the end of Basic Education unsuccessful
Case 1: context Khadija is 15 years old and in Form 2 of secondary school. Her learning in primary school was entirely through the medium of Kiswahili, with English taught as a subject. On transition to secondary school she experienced an abrupt shift away from Kiswahili as the medium of instruction: from Form 1 she was expected to learn all her subjects through English. In reality, however, both her L1 (Kiswahili) and her L2 (English) are used when she is taught Maths, Science, Geography and her other subjects. Yet at the end of Form 2 she is required to demonstrate her subject learning in formal school examinations entirely through the medium of English.
[SPINE project diagram]
The Policy & the Politics: historical perspectives • Continuous assessment (CA) was introduced in Tanzanian schools in 1976, following the Musoma Resolution (NECTA Guidelines 1990:1). • In the Musoma Resolution, Nyerere (1974) (see Lema et al 2006:111-112) criticised the examination system that had existed since independence and argued for the need to change it. • He maintained that the assessment procedures favoured competitive, written, theoretically and intellectually focused examinations.
The Policy & the Politics: historical perspectives • He saw the need to make assessment procedures friendlier, less competitive and more practically focused. Accordingly, he directed that students’ assessment must: • include work that enables students to function effectively in their environments; • cover students’ performance in both theoretical and practical work; • be carried out throughout the year.
The Policy & the Politics: historical perspectives • The National Examination Council of Tanzania (NECTA) started to include projects, character assessment, exercises and tests in the assessment system. Teachers were then given responsibility to: • measure students’ ability to use their acquired knowledge and skills in their local environments; • keep continuous records of students’ progress.
Tanzania and Zanzibar Education Policies Both the Tanzania Education and Training Policy (1990:79) and the Zanzibar Education Policy (2006:28) stated that continuous assessment (based on teacher assessment), combined with the final exam, would be the basis for awarding certificates at secondary education level. • The Zanzibar Education Master Plan of 1996-2006 emphasized consistency between teacher assessment and national examinations for selection purposes and quality control (p.43)
Language policy and Assessment • Students’ assessment is affected by language policy because students have to demonstrate their knowledge and abilities through a language ratified by the system. • The Zanzibar Education Policy of 1995 (Kiswahili version, Section 5.72, p.28), the Zanzibar Education Master Plan of 1996-2006 (p.43) and the Education Policy of 2006 (p.38) all emphasized the use of English as the medium of instruction for all subjects in secondary education except Kiswahili and Islamic studies.
Language policy and Assessment • As a corollary, English, the language of instruction for science at secondary level, has automatically become the language of assessment, for both continuous assessment and examinations. • This implies that students’ opportunities to demonstrate their knowledge in examinations are limited if they are not proficient in the language.
Coursework assessment as part of national examining: implementation issues At Form II level, Teacher Assessment (TA) was implemented based on the following assessment procedures: • class work • oral questions • homework • weekly tests • terminal tests
Continuous Assessment format & Guidelines • Term: ______ Class: ______ Subject: ______
Continuous Assessment format & Guidelines • Note: Coursework assessment will be obtained from class work, homework and weekly tests, or as otherwise directed by the Department or the Ministry. All the marks scored by the pupil on these activities within a particular month will be combined and converted to a mark out of ten (10%) before being entered in the relevant space on this form. • Source: Ministry of Education Working Guidelines: Guideline No. 9 (1995)
Continuous Assessment format & Guidelines Additional notes from another version • If the term is shorter or longer than four (4) months, four activities chosen by the teacher should be picked at equal intervals. The teacher is advised to give as many tests as he or she can in a term, but should choose only four for recording. • In calculating a pupil’s mark for each subject, the terminal exam itself contributes 60% and classroom (coursework) assessments contribute 40%
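The arithmetic implied by these guidelines can be sketched as follows. This is a minimal illustration only, assuming four monthly coursework marks out of 10 (together forming the 40% coursework component) and a terminal exam marked out of 100; the function name and the sample marks are hypothetical, not from the guidelines.

```python
def combined_mark(monthly_marks, exam_mark):
    """Combine coursework and terminal exam per the 60/40 guideline.

    monthly_marks: four monthly coursework marks, each out of 10,
                   which together form the 40% coursework component.
    exam_mark:     terminal exam mark out of 100, scaled to 60%.
    """
    assert len(monthly_marks) == 4 and all(0 <= m <= 10 for m in monthly_marks)
    coursework = sum(monthly_marks)   # out of 40
    exam = exam_mark * 0.6            # out of 60
    return coursework + exam          # combined mark out of 100

# e.g. monthly marks 7, 8, 6, 9 (coursework = 30) and an exam mark of 70 (scaled to 42)
print(combined_mark([7, 8, 6, 9], 70))  # 72.0
```

As the slides note, teachers interpreted how the monthly 10% mark should be derived in different ways, so in practice the coursework component was not computed uniformly.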
Coursework assessment as part of national examining: implementation issues • The language to be used is as indicated in the lesson plan. Source: Ministry of Education Working Guidelines: Guideline No. 9 (1995) • These notes appear to contradict the first version, which may lead to confusion: the first version requires the teacher to average the marks in a month, while the second requires the teacher to choose one activity from which the marks are recorded.
Coursework assessment as part of national examining: implementation issues • Homework, class work, oral questions and weekly tests make up 40%, and the terminal exam makes up 60%, together making 100%. • The resulting mark out of 100 is sent to the Department of Curriculum and Examination for further processing. There was variability among teachers on: • the specific activities that constitute teacher assessment marks;
Coursework assessment as part of national examining: implementation issues • the number of activities to take marks from each month; • how to handle students’ absences; • how to derive the 10% monthly mark: some picked the activity on which students did best, some picked at random, and some took an average; • understanding of what assessment is to be done; • awareness of the assessment guidelines;
Coursework assessment as part of national examining: implementation issues • the percentage of marks that goes into the final examination; • the sharing of teachers’ knowledge and practice of the assessment guidelines within schools; • the level of monitoring of Teacher Assessment by head teachers. • As a result, students within the same school and across different schools are assessed in different ways.
Overall, teachers were observed using L1 during lessons in 49% of cases: some of the time in 20% of cases, most of the time in 11%, and rarely in 18%.
Teachers presented concepts clearly most of the time in 28% of cases and some of the time in 62%; their presentation of concepts was unclear (6%) or confusing (3%) in 9% of cases.
Teachers frequently probed pupils’ comprehension in only 8% of cases observed; they sometimes probed in 24%, rarely probed in 42%, and did not probe at all in 24%.
Teachers provided feedback to learners using a wide range of strategies in just 1% of cases, with hardly any or no feedback in 19% of cases. In 78% of cases, teachers were observed using either some range (24%) or a limited range (54%) of feedback strategies.
In 83% of cases, learners rarely (28%) or almost never/never (55%) provided extended responses; 16% of pupils provided extended responses some of the time, and only 2% of learners were observed doing so most of the time.
Findings
Student performance on exam items English reading comprehension question: how whales resemble man. 45 students took this item: • 35.6% = no answer • 26.7% = wrong answer • 28.9% = partially correct answer • 8.8% = correct answer
Interview: D1, who didn’t answer Q3, explains
D1: “because I did not understand by this this … resemble” (lines 115-117)
Int: “If I tell you that resemble means ‘to look like’ … can you do the question now?”
D1: “Yes”
Int: “OK so what’s the answer?”
D1: “Man … is warm blooded … and whales also … whales have lungs and man also have lungs …” (122-133)
Linguistic accommodations make a difference Modifications to the item • e.g. changing the word ‘below’ to ‘under’/‘younger’: cf. “below 14 years” (referring to the age cell in the item), which was interpreted in 3 different ways: • as including the 14 year olds (3+2+5+4+2=16) • as the cells to the left of the cell containing 14 (10, 11, 12, 13) • as the cell below the cell containing 14 (which says 2)
Biology: responses to original item • No answer = 67.4% • Wrong answer = 21.7% • Partially correct answer = 6.5% • Correct answer = 4.4%
Biology: on locusts Questions: a) In which picture do you think the locust will/may die? b) Why do you think it will/may die?
Original item modified: • Greater contextualisation • Simplification of instruction • 2 structured parts: A & B • Visual cues to support information retrieval • Rephrasing of the item • Altering item layout
Results • Original item: only 32.6% of students wrote an answer; just under 11% gave a partially correct or correct answer. • Modified item: 100% responded to this item; 42% gave a partially correct answer to Part A; 53% gave a partially correct answer to Part B.
ENG and MATH [chart]
ENG and BIO [chart]
How about KISWAHILI & ARABIC? • Although the other two languages (Kiswahili and Arabic) are also significant predictors of students’ performance in maths, biology and chemistry, they are less capable of explaining the variances than ENGLISH. • KISWAHILI explains (275.993-188.954)/275.993=31.54% of the CHEM total variance, (114.968-76.585)/114.968=33.39% of the BIO total variance, and (90.4-75.678)/90.4=16.29% of the MATH total variance. • ARABIC explains (275.993-203.452)/275.993=26.28% of the CHEM total variance, (114.968-88.569)/114.968=22.96% of the BIO total variance, and (90.4-71.885)/90.4=20.48% of the MATH total variance.
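The variance-explained figures above all follow one pattern: compare the total variance from the baseline model with the residual variance once the language predictor is added. A minimal sketch (the function name is ours; the illustrative figures are the biology/Kiswahili values from the slide):

```python
def variance_explained(total_var, residual_var):
    """Proportion of variance explained when a predictor is added:
    (total - residual) / total."""
    return (total_var - residual_var) / total_var

# Kiswahili as a predictor of biology scores (figures from the slide):
pct = variance_explained(114.968, 76.585) * 100
print(round(pct, 2))  # 33.39
```

The same calculation reproduces the other percentages reported for Kiswahili and Arabic across chemistry, biology and maths.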
Summary of the multilevel models (a) • It is very clear that ENGLISH is a significant and substantial predictor of students’ performance in MATH, BIO & CHEM. • The school-level variances in the intercept-only (‘cons’) models, as well as in the models including ENGLISH as the single explanatory variable, demonstrate that a substantial proportion of the variance is attributable to school factors.
Summary of the multilevel models (b) • There is not much improvement in the fit of the models (measured by the change in the percentage of school-level variance in the total variance). • It is therefore essential to collect further school- and pupil-level data to examine which factors (e.g. English language learning opportunities at home and at school, academic English proficiency) account for the variances, and by how much (in the tradition of school effectiveness studies) – hence our plan for nationwide data collection using pupil and headteacher questionnaires and a vocabulary knowledge test.
Using English in high stakes examinations (SSA, UK, TIMSS, PISA): the realities