Literacy Assessment: An Overview S. Venkatraman, AIMS-UIS, UNESCO Bangkok
Source of literacy statistics: common scenario
• Most literacy data come from censuses; however, some household and labour force surveys also collect literacy data.
• The most common method of collecting literacy data in these censuses and surveys is to ask a question such as: “Are you literate or not?” or “Can you read and understand a letter or a newspaper easily, with difficulty, or not at all?”
• No scientific assessment test is involved, and the result is dichotomous literacy information: how many are literate and how many are not.
• These literacy statistics are called “reported literacy statistics”.
Background
No data on the status of literacy:
• What is available is inadequate for policy: it is based on proxies (self-declaration, years of schooling) that have proven to be poor predictors of literacy.
• It is difficult to judge the adequacy of current programmes.
• There is no means to track literacy and numeracy profiles reliably over time, so reforms cannot be judged.
The need for better data has become more pressing due to the profound impact that literacy and numeracy have been shown to have on social cohesion and the preservation of cultural and ethnic minorities, rates of overall economic development, population health, and the efficiency of education systems.
• If future needs for literacy information are to be met, an efficient and effective literacy assessment methodology is required that will produce reliable and comparable data.
UIS LAMP Background
“Literacy is the ability to identify, understand, interpret, create, communicate and compute, using printed and written materials associated with varying contexts. Literacy involves a continuum of learning in enabling individuals to achieve their goals, develop their knowledge and potential, and participate fully in their community and wider society.”
Sources of literacy statistics: other possibilities
• There are specialized surveys that use an assessment tool (based on cognitive testing models) to test individuals’ skill levels; these provide more comprehensive information on literacy.
• Some countries have attempted testing, but some of these efforts are mini-tests, focus only on students, or are ad hoc or sub-national.
• The IALS (International Adult Literacy Survey) and ALL (Adult Literacy and Life Skills Survey) use an assessment methodology that ensures comparability over time and across population subgroups.
• These surveys provide a literacy profile of the individuals tested and estimates for the population.
• Cognitive science has made significant progress in identifying the components of reading ability, which can help identify what lies behind poor performance or a low skill level. These surveys also measure the degree of reading and language use in different contexts.
• These literacy statistics are called “(assessment) tested literacy statistics”.
Sources of literacy data [reconstructed from diagram]:
• Censuses and household surveys (declarations & mini-tests): regular data collection, large population coverage.
• Comparative cross-national assessment surveys and ad hoc literacy assessment surveys: LAMP, ALL and IALS give in-depth data on literacy skills and people’s backgrounds; PISA, SACMEQ and PIRLS give data on students.
• Programme evaluations and individual diagnostics (literacy tests as part of wider evaluations): in-depth data on higher literacy levels.
Issues in current literacy statistics
Like other statistics, literacy statistics have many data issues. Among them, the main concerns for both data producers and users are:
• definitions and their comparability (over time and across population subgroups),
• availability,
• timeliness, and
• quality of the data.
Emerging trend
• Traditionally, literacy is defined as ‘a person’s ability to read and write, with understanding, a simple statement about one’s everyday life’. But dividing the world into literates and illiterates oversimplifies the nature of literacy.
• It is increasingly accepted that there is a continuum of literacy skills and that they can be applied in a functional way, i.e., in everyday situations such as reading a bus schedule or using a computer.
• For example, in the Cambodia Literacy Assessment Survey, literacy is defined by three categories (complete illiterate, semi-illiterate and literate), and literates are further divided into three levels (basic, medium and self-learning).
• IALS and ALL measured literacy in three skill domains (prose, document, and quantitative), each divided into five levels.
FIVE LEVELS OF LITERACY (used for assessment)
Level 1 indicates persons with very poor skills: the individual may, for example, be unable to determine the correct amount of medicine to give a child from information printed on a package.
Level 2 respondents can deal only with material that is simple, clearly laid out, and in which the tasks involved are not too complex. It denotes a weak level of skill, but one more hidden than Level 1: it identifies people who can read, but test poorly. They may have developed coping skills to manage everyday literacy demands, but their low proficiency makes it difficult for them to face novel demands, such as learning new job skills.
Level 3 is considered a suitable minimum for coping with the demands of everyday life and work in a complex, advanced society. It denotes roughly the skill level required for successful secondary school completion and college entry. Like the higher levels, it requires the ability to integrate several sources of information and solve more complex problems.
Levels 4 and 5 describe respondents who demonstrate command of higher-order information-processing skills.
Which literacy skills are assessed and how?
Story line:
• Why do we care about skills?
• How do we think about skills?
• Are there differences in the stock of skills?
• How have stocks changed over the past decade?
• Do these differences matter to individuals?
• Do these differences matter to macro-economic performance?
• How will skill flows transform the stock: youth, adult education and training, immigration and skill loss?
• Multiple disadvantage, high ICT use, wage inequality and growth
• Implications for policy
• How does a survey/assessment respond to policy needs?
LAMP
The UIS launched the LAMP (Literacy Assessment Monitoring Programme) initiative, which aims to define and measure a spectrum of literacy skills in a country for the population aged 15+ at the household level. It provides detailed information on the distribution of skills in a given population.
LAMP instruments: a background questionnaire and component-skill tests (prose, document & numeracy questions) in two modules, one for low-skilled and one for high-skilled respondents.
LAMP incorporates a new conception of literacy:
• Literacy is a tool that one uses to respond to new and unfamiliar reading (and numeracy) tasks.
• Literacy is the ability to identify, understand, interpret, create, communicate and compute, using printed and written materials associated with varying contexts.
• Literacy involves a continuum of learning in enabling individuals to achieve their goals, develop their knowledge and potential, and participate fully in their community and wider society.
• The factors that underlie performance are largely, but not completely, the same across languages and cultures.
LAMP incorporates a new conception of literacy (continued):
• Literacy includes both learning to read and reading to learn:
• Learning to read involves mastery of the components that underpin fluent and automatic reading.
• Reading to learn involves mastery of texts and tasks of increasing difficulty.
To be placed at a level, adults must answer 80% or more of the items at that level correctly.
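The 80% placement rule above can be read as a simple threshold check over each level's items. A minimal sketch in Python, assuming a hypothetical per-level list of item results; the actual LAMP scoring relies on psychometric scaling, not this literal rule:

```python
def place_level(responses_by_level):
    """Place a respondent at the highest level where at least 80%
    of that level's items were answered correctly.

    responses_by_level: hypothetical dict mapping level (int) to a
    list of bools (True = item answered correctly).
    """
    placed = 0
    for level in sorted(responses_by_level):
        items = responses_by_level[level]
        if items and sum(items) / len(items) >= 0.8:
            placed = level
        else:
            break  # once a level is failed, no higher placement
    return placed

# Example: the respondent masters Levels 1 and 2 but not Level 3.
responses = {
    1: [True, True, True, True, True],    # 100% correct
    2: [True, True, True, True, False],   # 80% correct
    3: [True, False, False, True, False], # 40% correct
}
print(place_level(responses))  # -> 2
```

The sketch treats placement as cumulative (a respondent placed at Level 2 must also have passed Level 1), which matches the idea of a continuum of skill levels.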
LAMP incorporates a new way to assess literacy levels directly: a household survey and a reliable test.
LAMP instrument flow [reconstructed from diagram]:
• Background Questionnaire
• Filter Module (locator items)
• Module A (low-skilled): prose, document, numeracy; Book 1 with component skills (“Learning to Read”)
• Module B (high-skilled): prose, document, numeracy; Book 2 (“Reading to Learn”)
• Output: estimates for Levels 1, 2, 3, 4/5 + components
LAMP “Reading to learn” measures: The application of automatic and fluent reading to solve everyday problems involving print and/or numbers • Prose literacy – the knowledge and skills needed to understand and use information from texts including editorials, news stories, brochures and instruction manuals. • Document literacy – the knowledge and skills required to locate and use information contained in various formats, including job applications, payroll forms, transportation schedules, maps, tables and charts. • Numeracy – the knowledge and skills required to effectively manage the mathematical demands of diverse situations.
LAMP “Learning to Read” measures: the component skills that make up reader profiles, measured by timed and untimed tests of:
• 1. Alphanumeric perceptual knowledge and familiarity: recognise the letters of the alphabet and single-digit numbers.
• 2. Word recognition: recognise common words that appear frequently in print. These common words are expected to be in the listening/speaking vocabulary of a speaker of the target language.
• 3. Decoding and sight recognition: produce plausible pronunciations of novel or pseudo-words by applying knowledge of the sight-to-sound correspondences of the writing system, accurately, rapidly and with ease.
• 4. Sentence processing: process simple written sentences and apply language skills to comprehend them accurately, rapidly and with ease.
• 5. Passage reading: process simple written passages and apply language skills to comprehend them accurately, rapidly and with ease.
How data from LAMP can be applied: • To better understand the social and economic costs of low literacy at both the individual and macro level • To understand the cost of inaction • To argue for increased resources • To allocate available funds optimally • To target population sub-groups that are judged to be at risk • To design more efficient and effective educational programs for adults • To monitor trends in performance • To market literacy programs • To increase the effectiveness of investments in tertiary education
Literacy Module for Household Surveys
Why is a literacy module needed?
This is an initiative developed by the AIMS unit, UNESCO Bangkok.
Objectives
• To develop and pilot-test a simple tool and methodology which:
• provides more informative literacy statistics than are currently available in most countries,
• requires only limited financial, technical and operational resources, and
• links to other socio-economic information.
• To develop a module and a manual on the analysis of literacy and relevant data from household surveys.
• To serve as a monitoring tool for literacy initiatives.
Key Features
• The module tries to capture the literate environment of an individual in three settings: i. household, ii. community and iii. school.
• It also looks at the individual’s behaviour in relation to that literate environment, and at the use of literacy in these settings.
• The module also looks at reading and writing, including an inventory of languages spoken at the household, community and school levels (reading & writing).
• It looks at the ‘social environment’ (as an indicator of a literate population in a given setting).
The instruments and data collection:
• A literacy questionnaire module consisting of a set of about 10 questions, to be attached to existing household surveys. This will provide information on characteristics known to underlie differences in literacy skills.
• A manual on analysing literacy and other relevant data from household surveys.
The Module
• Identification particulars
• Questions divided into 4 blocks:
• Access to reading materials
• Language background
• Education background
• Use of literacy skills
Other information necessary: age, sex, place of residence, educational attainment (formal/non-formal -> level, grade or number of years in school/programme).
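The module's structure described above can be sketched as a small data record. Only the block names and the identification fields come from the slides; every individual question shown is a hypothetical illustration, not the actual instrument:

```python
# Sketch of the literacy module as a data record. Block names follow
# the slides; the example questions are hypothetical placeholders,
# not the real instrument items.
LITERACY_MODULE = {
    "identification": ["age", "sex", "place of residence",
                       "educational attainment"],
    "access_to_reading_materials": [
        "Are there newspapers, books or magazines in the household?",  # hypothetical
    ],
    "language_background": [
        "Which languages are spoken in the household?",  # hypothetical
    ],
    "education_background": [
        "Highest level, grade or years completed (formal/non-formal)?",  # hypothetical
    ],
    "use_of_literacy_skills": [
        "Have you read or written anything in the past week?",  # hypothetical
    ],
}

# The four question blocks, excluding the identification particulars:
question_blocks = [k for k in LITERACY_MODULE if k != "identification"]
print(len(question_blocks))  # -> 4
```

Keeping the identification particulars separate from the four question blocks mirrors how the module is meant to piggyback on a host household survey, which already collects the background variables.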
Both Tools:
• LAMP and the Literacy Module are two very different tools; the Module is NOT a substitute for LAMP.
• LAMP’s new vision is more about country-level analysis and using country contexts (cost reduction), with use of regional expertise.
• LAMP is about cognitive testing, while the Literacy Module is about the literate environment and the individual’s behaviour in using it.
Some final thoughts:
• Issue of definition: different types of assessments (sub-national / simplified / ad hoc).
• Other figures used are often about infrastructural and access issues; literacy is also about ‘usage’ and ‘impact’.
• What are the problems and advantages of adapting a tool like the Module?
• Are there similar issues in conducting a full-scale assessment and other student learning outcome surveys?
Thank You! s.venkatraman@unescobkk.org