Cognitively-Based Assessment Enabled by Technology
Eva L. Baker
UCLA Graduate School of Education & Information Studies
Center for the Study of Evaluation (CSE)
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
AERA 44.38, April 2001
Technology Principles for the Design and Use of Educational Information
• Problem definition
• Assessment
• Data interpretation and representation
• Examples and inferred principles
• Key research
Problem
• Global notions of assessment design: matched or aligned to standards, illustrating a preferred format, normed interpretation
• Naive view that mere access to data will improve performance
• Policy now expects multiple purposes to be served by limited assessment(s)
• One-at-a-time mentality
• Assessment "systems" remain to be achieved
To Be Productive in Technology-Based Assessment/Improvement Systems
• Design reusable components: tasks, data modules, scoring protocols, reporting (see the sketch after this list)
• Specify details guiding the integration of system elements
• Plan for rapidly changing technology
• Include in the system data elements, user models, and interpretive options
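A minimal sketch of what such reusable components might look like, assuming a Python implementation; every class and function name here is a hypothetical illustration, not part of any CRESST system.

    from dataclasses import dataclass
    from typing import Protocol

    @dataclass(frozen=True)
    class AssessmentTask:
        # A reusable task keyed to a cognitive demand rather than a fixed test.
        task_id: str
        cognitive_demand: str   # e.g., "explanation" or "problem solving"
        domain: str             # subject matter in which the demand is implemented
        prompt: str

    class ScoringProtocol(Protocol):
        # Any scorer (rubric-based, expert-modeled, automated) fits this interface.
        def score(self, task: AssessmentTask, response: str) -> float: ...

    @dataclass(frozen=True)
    class DataElement:
        # The unit a data module stores and a reporting component consumes.
        task_id: str
        learner_id: str
        score: float

    def administer(task: AssessmentTask, learner_id: str,
                   response: str, scorer: ScoringProtocol) -> DataElement:
        # Integration point: tasks, scorers, and data modules meet only at
        # these interfaces, so each piece can be replaced as technology changes.
        return DataElement(task.task_id, learner_id, scorer.score(task, response))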
Assessment Design Strategy
• Start with cognitive demands
• Guide task development, test integration, and scoring elements
• Implement in subject matter domains or skills (soft or hard)
• Monitor precursor or developmental sequence
• Review for linguistic appropriateness
• Determine key data elements or processes to be collected
Families of Cognitive Demands: Both Domain-Dependent and Domain-Independent Features
Authoring Tools
• Assessment tasks and tests
• Data representation
• Interpretation
• Public reporting
CRESST Authoring System Plan: Part 1
• Templates based on current model-based assessments (see the sketch after this list)
• Web-based, with expert and peer review
• Automated scoring using extant or expert-based systems
• Correspondence with "content and performance standards" or other system goals
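One way to read the template idea in code, as a hypothetical sketch: the template fixes the task structure and cognitive demand, and authors supply only the domain content. The template text and names are invented for illustration.

    from string import Template

    # A hypothetical template for a model-based explanation task: the task
    # structure and cognitive demand are fixed; only domain content varies.
    EXPLANATION_TEMPLATE = Template(
        "Read the passage on $topic. Write an essay explaining $focus "
        "to $audience, using evidence from the passage."
    )

    def author_task(topic: str, focus: str, audience: str) -> str:
        # Authors fill the slots; the filled task would then pass through
        # Web-based expert and peer review before entering the item pool.
        return EXPLANATION_TEMPLATE.substitute(
            topic=topic, focus=focus, audience=audience)

    print(author_task("the water cycle",
                      "why rainfall varies across regions",
                      "a fellow student"))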
Principles for Assessment Design Today
• Contain cost through automation
• Start with pervasive rather than ephemeral elements (e.g., cognitive demands)
• Implement in content and skill domains
• Assess and correct linguistic complexity and other likely sources of construct-irrelevant variance (see the sketch after this list)
• Generate reusable structures, including support by users (teachers, administrators, publishers)
• Link to other existing system elements
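Part of the linguistic-complexity review could plausibly be automated with surface measures. A hypothetical sketch, assuming simple length-based heuristics; a real screen would use stronger measures.

    import re

    def flag_linguistic_complexity(prompt, max_sentence_words=25, max_word_len=12):
        # Crude surface screen: long sentences and very long words are
        # flagged for human review, not rewritten automatically.
        flags = []
        for sentence in filter(None, re.split(r"[.!?]+\s*", prompt)):
            words = sentence.split()
            if len(words) > max_sentence_words:
                flags.append(f"Long sentence ({len(words)} words)")
            flags.extend(f"Long word: {w}" for w in words if len(w) > max_word_len)
        return flags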
Automation: Part 2
• Depends on realization of "Learnome" maps of domains
• Proofs of concept in literacy, geography, math, technical skills, chemistry
• Selection of primitives or objects
• Links to Web-enabled content classification
• Default conditions supporting validity for purpose, reliability, and flexibility
• Interactive user trials in real and controlled settings
Principles for a Rapidly Changing World
• Automate design based on Learnome primitives
• Provide technological support for test administration
• Automate data collection for on-the-fly technical quality monitoring (see the sketch after this list)
• Create "add an egg" versions with talkies
• Develop comparability indices
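One possible reading of on-the-fly technical quality monitoring, sketched under the assumption that classical item difficulty is the statistic of interest; the thresholds and names are illustrative only.

    from statistics import mean

    item_scores = {}   # item_id -> accumulating list of 0/1 scores

    def record(item_id, correct):
        # Collect each response as it arrives during administration.
        item_scores.setdefault(item_id, []).append(1 if correct else 0)

    def quality_report(lo=0.2, hi=0.9):
        # Recompute item difficulty (proportion correct) on the fly; items
        # outside [lo, hi] are flagged for review while the test still runs.
        return {item: (round(mean(s), 2), "ok" if lo <= mean(s) <= hi else "FLAG")
                for item, s in item_scores.items()}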
System Data Interpreter(s) and Reporting Systems
• Early version (QSP): an intuitive data manager for novice users, with disaggregation, query-based reporting, and a longitudinal story for an individual, unit, institution, or program (see the sketch after this list)
• Multiple purposes: feedback, evaluation, accountability, and individual diagnosis
• Additional data for meeting requirements or supporting validity interpretations
• Top-down, bottom-up
• Massive differences in user knowledge requirements and expectations
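A hypothetical sketch of the kind of disaggregation a QSP-style data manager might support: group longitudinal records by a user-chosen attribute and summarize. The records and field names are invented for illustration.

    from statistics import mean

    # Hypothetical longitudinal records: one row per learner per year.
    records = [
        {"learner": "a", "unit": "School 1", "year": 1999, "score": 310},
        {"learner": "b", "unit": "School 1", "year": 1999, "score": 280},
        {"learner": "a", "unit": "School 1", "year": 2000, "score": 330},
        {"learner": "b", "unit": "School 1", "year": 2000, "score": 295},
    ]

    def disaggregate(rows, by):
        # Group by a user-selected attribute (unit, program, year, ...) and
        # report mean scores: the core of a query-based longitudinal story.
        groups = {}
        for row in rows:
            groups.setdefault(row[by], []).append(row["score"])
        return {key: round(mean(v), 1) for key, v in sorted(groups.items())}

    print(disaggregate(records, by="year"))   # {1999: 295.0, 2000: 312.5}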
New Version
• Expanded user set
• User-selected data elements and representations
• Expanded local flexibility
• Scenarios to simulate the consequences of selected actions on groups, schools, or the system (see the sketch after this list)
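The scenario feature might amount to applying an assumed intervention effect to current data and recomputing summaries; a what-if sketch with invented numbers.

    from statistics import mean

    # Hypothetical current scores for two groups within one school.
    current = {"program": [540, 560, 525], "comparison": [550, 545, 530]}

    def simulate(groups, target, effect):
        # What-if: apply an assumed score effect to the target group and
        # return projected means to compare against the status quo.
        return {g: round(mean(s + effect if g == target else s for s in scores), 1)
                for g, scores in groups.items()}

    # Projected means under an assumed +10-point effect for the program group:
    print(simulate(current, target="program", effect=10))
    # {'program': 551.7, 'comparison': 541.7}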
Report Card Generator
• Automated representations of extant data elements (see the sketch after this list)
• Iconic, metaphorical, intuitive
• Multiple media, including the Web
• For institutions, programs, and individuals
• Inexpensive and fast
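A hypothetical sketch of an automated report card generator: extant data elements in, a simple intuitive representation out (plain text here; an actual generator would target Web media and iconic displays).

    def report_card(name, results, target=0.70):
        # Render each measure as a bar plus an at-a-glance mark, a plain-text
        # stand-in for the iconic, Web-delivered representations described above.
        lines = [f"Report card: {name}"]
        for measure, pct in results.items():
            bar = "#" * round(pct * 10)
            mark = "met target" if pct >= target else "below target"
            lines.append(f"  {measure:<10} {bar:<10} {pct:>4.0%}  {mark}")
        return "\n".join(lines)

    print(report_card("School 1", {"Reading": 0.82, "Math": 0.64}))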
Next-Generation Reporting
• Multiple metaphors
• Intuitive, dynamic, and progressive
• Extensible and portable
• User selection of options based on personal mental model
Principles for Data Representation and Interpretation
• Explicit user models: purposes and element preferences
• Responsive timing
• Local automation of some functions
• Representation flexibility
• System supports for mental models and partial knowledge
Key Research
• Learnome mapping and primitive development
• Limits of on-the-fly technical quality supports
• Flexibility by mental model of user(s)
• Updates of "prescription" selection and scenario building
• Integrating the user in the representation