Technology: Access to the Future
Regional Meetings

Ellen B. Mandinach
Naomi Hupert
EDC Center for Children and Technology
WWW.edc.org/CCT

November 21, 2003 | January 23, 2004 | February 20, 2004 | March 19, 2004
“A paradox gradually became evident: The more a technology, and its usages, fits the prevailing educational philosophy and its pedagogical application, the more it is welcome and embraced, but the less of an effect it has. When some technology can be smoothly assimilated into existing educational practices without challenging them, its chances of stimulating a worthwhile change are very small.”
Salomon and Almog, 1998
According to Papert (1987), if an instructional technology is so harmless that it can be easily integrated into existing pedagogical practices without many changes, then it will be equally harmless in making an instructional difference.
Question: What is the difference between elephants mating and the implementation of educational technology? Answer: There is a lot of dust and noise, and nothing happens for a very long period of time!
Think of a pointillist painting by Seurat or an impressionist work by Monet:
• Step up to the painting.
• Step all the way back.
• Compare what you see.
That is precisely what evaluation and assessment require: taking multiple perspectives on the same phenomena and getting different feedback.
Question: How can the accountability issue be addressed?
Answer: With great difficulty. Use a number of measurement strategies, asking different but related questions, all concerned with various aspects of the learning process and outcomes.
Assessment, Evaluation, and Research
Different purposes, questions, and objectives:
• Assessment - the measurement of learner performance. It can be part of an evaluation, but the two are not synonymous.
• Evaluation - the examination of a system, product, or program (formative, summative, or both) to determine how it is functioning, how it is being implemented, and how it can be improved.
• Research - encompasses assessment, evaluation, and much more.
Evaluation and Research: Some Tradeoffs
• Is likely to be most effective when it is both formative and summative.
• Should be used for planning as well as a systematic research tool.
• Must consider the information desired by the stakeholders.
• Must use measures that will maximize the potential for detecting impact.
• Must choose between standardized tests and targeted assessments.
• Must choose between experimental and other designs.
• Must choose between comparison/control and matched groups.
Some Caveats
• All parties involved need to change their conceptions of proof of successful implementation of technology and its impact on teaching and learning.
• Need for continuous adjustments in:
  • pedagogical philosophy
  • assessment techniques
  • strategies, roles, priorities, and schedules
What does it mean to say, “it works”?
Is there enough implementation of the technology to enable measurement? There must be enough technology implementation to produce the desired outcomes. If technology is used for only a small amount of instructional time, each student's daily exposure will be limited, and the targeted outcomes are unlikely to show substantial effect sizes (see the sketch below).
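As a rough illustration of why limited exposure matters, here is a minimal power-analysis sketch (not from the presentation; the effect-size values and the use of the statsmodels library are assumptions for illustration). It shows how many students per group a study would need in order to detect a small effect versus a moderate one.

```python
# Minimal sketch: sample sizes needed to detect small vs. moderate effects.
# Illustrative values only; not taken from the presentation.
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()

for effect_size in (0.2, 0.5):  # Cohen's d: small vs. moderate effect
    n_per_group = power_analysis.solve_power(
        effect_size=effect_size,
        alpha=0.05,   # conventional significance level
        power=0.80,   # conventional target power
        ratio=1.0,    # equal group sizes
    )
    print(f"d = {effect_size}: about {n_per_group:.0f} students per group")

# Small effects require several hundred students per group;
# moderate effects require far fewer.
```

The point of the sketch is simply that thin implementation pushes expected effects toward the small end, which in turn demands far larger samples (or longer exposure) before any impact can be detected.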
Evaluation/Research Issues
• Experimental paradigm:
  • rule out alternative explanations
  • verify the hypothesized causal relationship
  • control for contaminating influences
  • random assignment of students to experimental and control groups
  • controlled application of the stimuli
• Barriers to and issues in the use of an experimental design:
  • selection, acquisition, and installation of hardware and software
  • training and ongoing support of teachers
  • enlisting the support, encouragement, and participation of students, teachers, administrators, school board members, and parents
  • actual classroom implementation (for as long as it takes)
Quasi-Experiment Versus Formative Experiment
• Not sufficiently powerful
• Not sufficiently sensitive to changes
• Inadequate evaluative question - Does it work?
“Rarely does one study produce an unequivocal and durable result: multiple methods, applied over time and tied to evidentiary standards, are essential to establishing a base of scientific knowledge.”
Shavelson & Towne, 2002
Methodological Implications for Technology-Based Educational Reform Efforts • Longitudinal Design • Multiple Methods • Hierarchical Analyses • System Dynamics
Longitudinal Design
Sacrifice the quasi-experimental design in favor of ongoing longitudinal data collection and analyses that are carried out continuously and indefinitely. Requirements that make quasi-experiments difficult to implement in classroom settings:
• pre- and post-measures
• experimental and control groups
• random assignment of students
• control over how and when the “stimulus” is administered
Multiple Methods
Educational innovation efforts often generate multiple outcomes that require the triangulation of traditional and nontraditional data collection and assessment methods. For example:
• Think-aloud protocols
• Classroom observations
• Peer observations
• In-depth interviews with teachers, students, and administrators
• Content analyses of essays
• Paper-and-pencil assessments
• Performance assessments
• Notebooks and portfolios
• Performance on traditional tests, assignments, and projects
• Many data items routinely generated in the operation of any school (e.g., GPA, tardiness, attendance, drop-out rates, course-taking patterns)
• Gather data at multiple time points
• Case studies
Bottom line: Identify consistent patterns across data sources.
Hierarchical Analyses
Examine impact at different levels of analysis:
• Student learning - processes and outcomes
• Classroom dynamics - the changing patterns of interactions between students and teachers
• The school as a social organization - its structure and functions
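One way to make "different levels of analysis" concrete is a multilevel (random-intercept) model in which students are nested within schools. The sketch below is a minimal, hypothetical example: the variable names (score, tech_hours, school) and the simulated data are assumptions for illustration, not part of the presentation.

```python
# Minimal sketch of a hierarchical (multilevel) analysis:
# students nested in schools, with a random intercept for each school.
# Variable names and simulated data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, students_per_school = 20, 30

school = np.repeat(np.arange(n_schools), students_per_school)
tech_hours = rng.uniform(0, 5, size=school.size)           # weekly technology use
school_effect = rng.normal(0, 5, size=n_schools)[school]   # school-level variation
score = 60 + 2.0 * tech_hours + school_effect + rng.normal(0, 8, size=school.size)

data = pd.DataFrame({"score": score, "tech_hours": tech_hours, "school": school})

# Random-intercept model: a fixed effect of technology use,
# with schools as the grouping factor.
model = smf.mixedlm("score ~ tech_hours", data, groups=data["school"])
result = model.fit()
print(result.summary())
```

The same idea extends to a classroom level between students and schools; the design choice being illustrated is simply that impact is estimated at each level rather than by pooling all students as if they were independent.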
System Dynamics
• A school is an interdependent, multilayered system. One has to understand what is happening across layers and levels.
• Reform efforts take place in real-world contexts that are composed of many interrelated, dynamic factors.
• Any attempt at educational reform will have multiple components and multiple impacts, and they will interact across levels of organization over time.
• Asking a single question about impact or outcome is naive. Thus, we need to approach educational reform systemically.
“The character of education not only affects the research enterprise, but also necessitates careful consideration of how the understanding or use of results can be impeded or facilitated by conditions at different levels of the system. Organizational, structural, and leadership qualities all influence how the complex education system works in practice.”
Shavelson & Towne, 2002
How do you know if the technology is “working”?
• Questions - better, more, what ifs; what and how students ask
• Deeper understanding
• More engaged and on task
• How they interact
• Challenged
• Argumentation
• Application
• Independence and self-directedness
• Planfulness and organizational skills
• Increased problem solving and logical analysis
• Responses in discussion
• Decision making
• See the light bulb go on - the aha experience
• See the obvious
• Students in charge of their own learning
• Success is being able to handle failure and learn from it
What Does NCLB Want?
• To determine with scientific rigor: WHAT WORKS.
• Translation: The impact of the intervention must be to: INCREASE TEST SCORES!
What is Required by the What Works Clearinghouse and NCLB
• Randomized, controlled, experimental studies, using the medical model of research.
• Not matched comparisons.
• Not quasi-experimental designs.
• Causality must be established, ruling out plausible alternative explanations.
• Small, focused “interventions.”
• Limited teacher professional development components.
• Short-term.
• School patterns are not changed.
• Students are the unit of assignment, not classrooms or schools (see the sketch below).
• No contextualization.
• Foremost, there must be valid, reliable scientific evidence that the intervention improves student achievement.
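To illustrate what "unit of assignment" means in practice, here is a minimal sketch that contrasts randomizing individual students with randomizing whole classrooms (cluster randomization). The roster and classroom names are hypothetical and purely for illustration.

```python
# Minimal sketch: student-level vs. classroom-level (cluster) random assignment.
# Roster and classroom names are hypothetical.
import random

random.seed(42)

roster = {
    "Room 101": ["Ana", "Ben", "Carla", "Dev"],
    "Room 102": ["Elena", "Farid", "Gia", "Hugo"],
    "Room 103": ["Iris", "Jonas", "Kira", "Leo"],
    "Room 104": ["Mara", "Nico", "Omar", "Pia"],
}

# Student as the unit of assignment (the design the slide above describes):
students = [name for room in roster.values() for name in room]
treatment_students = set(random.sample(students, k=len(students) // 2))

# Classroom as the unit of assignment (cluster randomization):
rooms = list(roster)
treatment_rooms = set(random.sample(rooms, k=len(rooms) // 2))

print("Student-level treatment group:", sorted(treatment_students))
print("Classroom-level treatment group:", sorted(treatment_rooms))
```

Cluster randomization keeps intact classrooms together, which is usually more practical in schools, but it requires analyses that account for the clustering; randomizing individual students is the design the clearinghouse criteria listed above call for.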
The Medical Model as the Gold Standard
• The Institute of Education Sciences (IES) in the US Department of Education invokes the medical model of research as the standard toward which all research should strive.
• Yet is this gold standard achievable?
• Is it the right gold standard, or a silver bullet?
• For example, can an instructional “intervention” be examined in the same way as a course of pharmaceutical treatment?
Research and Evaluation Methodology Required by NCLB: Randomized Field Trials (RFTs)
The rationale for RFTs is the quest for unambiguous information in education.
“To be scientific, the design must allow direct, empirical investigation of an important question, account for the context in which the study is carried out, align with a conceptual framework, reflect careful and thorough reasoning, and disclose results to encourage debate in the scientific community.”
Shavelson & Towne, 2002
The Six Guiding Principles of Scientific Inquiry (Not the Seven Deadly Sins)
• Pose significant questions that can be investigated empirically (ruling out counter-interpretations and bringing evidence to bear on alternative explanations)
• Link research to relevant theory
• Use methods that permit direct investigation of the question
• Provide a coherent and explicit chain of reasoning
• Replicate and generalize across studies
• Disclose research to encourage professional scrutiny and critique
Which Really Is the Driving Factor - Research Questions or Methods?
• The questions should drive the research methodology, not the other way around.
• Unfortunately, all too often the reverse has been happening because of political pressures.
• The result is mandated questions, mandated methods, and potentially mandated answers as well.
The Question Should Drive the Research Design
• What is happening (e.g., descriptions of population characteristics)?
• Is there a systematic effect (where systematic means causal)?
• How or why does it happen?
• Need to account for contextual factors.
• Replicability of patterns across groups and time.
Evaluation
• Should be meaningful and constructive. The results and information should benefit the students, teachers, school, and district.
• Should not be punitive.
• Should be informative, providing information on what is going on, how to improve, or other important questions.
Designing Evaluations
• Use targeted evaluations.
• Match your goals to data collection activities - that is, let the questions drive the methods.
• Use measurable components.
• Consider the design before it is implemented.
• Be flexible. Things change, and the evaluation design must change accordingly.
Numerous Caveats to RFTs
• Fidelity of implementation
• Variability of treatment
• Overlap between treatment and control groups
• Adequacy of outcome measures
• Multiple-treatment interference
• Relevance of the control condition to policy issues
• External validity
NCLB Goals: Impact on Students
• Primary Goal - To improve student academic achievement through the use of technology in elementary schools and secondary schools.
• Additional Goals
  • To assist every student in crossing the digital divide by ensuring that every student is technologically literate by the time the student finishes the eighth grade, regardless of the student’s race, ethnicity, gender, family income, geographic location, or disability.
  • To encourage the effective integration of technology resources and systems with teacher training and curriculum development to establish research-based instructional methods that can be widely implemented as best practices by State educational agencies and local educational agencies.
NCLB Questions: Impact on Students
• Is academic achievement improving with effective technology use?
• Are students acquiring 21st century skills through effective technology use?
• Are students more engaged in learning through effective technology use?
Necessary Conditions
• Effective Practice - Is classroom practice characterized by powerful, research-based strategies that effectively and appropriately use technology?
• Educator Proficiency - Are educators proficient in implementing, assessing, and supporting a variety of technology-based teaching and learning practices?
• Robust Access Anywhere, Anytime - Do students and staff have robust access to technology anywhere, any time, to support effective designs for teaching and learning?
• Digital-Age Equity - Is the digital divide being monitored and addressed through resources and strategies aligned to a 21st century vision?
• Vision and Leadership - Is there a 21st century vision? Is the education system transforming into a high-performance learning organization?
Information and Communication Technology Literacy
• ICT literacy is more than just the mastery of technical skills. It also includes cognitive skills and the application of those skills and knowledge.
• ICT literacy is seen as a continuum of skills and abilities, from simple, everyday tasks to complex applications.
A Working Definition
ICT literacy is using digital technology, communications tools, and/or networks to access, manage, integrate, evaluate, and create information in order to function in a knowledge society.
Source: ICT Literacy Panel, 2002.
ICT Proficiency Skills
• ACCESS - knowing about and knowing how to collect and/or retrieve information.
• MANAGE - applying an existing organizational or classification scheme.
• INTEGRATE - interpreting and representing information; it involves summarizing, comparing, and contrasting.
• EVALUATE - making judgments about the quality, relevance, usefulness, or efficiency of information.
• CREATE - generating information by adapting, applying, designing, inventing, or authoring information.
Three Proficiencies
• Cognitive Proficiency - the desired foundational skills of everyday life at school, at home, and at work. Literacy, numeracy, problem solving, and spatial/visual literacy demonstrate these proficiencies.
• Technical Proficiency - the basic components of digital literacy, including a foundational knowledge of hardware, software applications, networks, and elements of digital technology.
• ICT Proficiency - the integration and application of cognitive and technical skills. Seen as enablers, these proficiencies allow individuals to maximize the capabilities of technology. At the highest level, ICT proficiencies result in innovation, individual transformation, and societal change.
The SETDA Technology Literacy Working Definition Technology literacy is the ability to responsibly use appropriate technology to communicate, solve problems, and access, manage, integrate, evaluate, and create information to improve learning in all subject areas and to acquire lifelong knowledge and skills in the 21st century.
Sources of 21st Century Skill Definitions
www.ncrel.org/engauge/skills/sources.htm
• The enGauge 21st-Century Skills
• National Education Technology Standards
• SCANS (Secretary’s Commission on Achieving Necessary Skills)
• A Nation of Opportunity: Building America’s 21st Century Workforce
• Preparing Students for the 21st Century
• Standards for Technological Literacy, Content for the Study of Technology
• Being Fluent with Information Technology
• Information Literacy Standards for Student Learning
• Growing Up Digital
Interactive Assessment and Evaluation
• Assessment can be used both as a teaching tool and as an evaluation mechanism.
• Assessment and instruction should be iterative - a continuous, interactive feedback loop that provides information to both student and instructor.
• Assessments should help students evaluate their learning processes and outcomes; they should facilitate learning.
• Techniques should capitalize on the affordances of the technology.
• Online assessment should not be restricted by time constraints or resources.