
Learning, Assessment, and Accountability: Priorities for Educational Reform

Learning, Assessment, and Accountability: Priorities for Educational Reform. Eva L. Baker. UCLA Graduate School of Education & Information Studies National Center for Research on Evaluation, Standards, and Student Testing (CRESST) British Columbia Ministry of Education Conference Victoria, BC


Presentation Transcript


  1. Learning, Assessment, and Accountability: Priorities for Educational Reform. Eva L. Baker, UCLA Graduate School of Education & Information Studies, National Center for Research on Evaluation, Standards, and Student Testing (CRESST). British Columbia Ministry of Education Conference, Victoria, BC, March 2004

  2. Today’s Topics • YOU can make a difference in educational reform • Coping strategies for lurching change • Review key aspects of current policy reform • Improving assessment, learning, and transfer • New paths from CRESST work • Principles and criteria for improvement

  3. Intellectual Goals for Reform Mastery • Understand deep goals of reform and how they can be achieved • Managing assessment knowledge: research-based procedures for use • Development of social capital in education

  4. Policy Context • Assumption: Schools are not meeting goals • Need for new instruments and mechanisms • Trade process for accountability, individual scores • Now specify process, accept most outcomes • Devolve responsibility to states and LEAs • Innovations—charters, private managers, vouchers • Retain federal authority—Adequate Yearly Progress

  5. 15-Year History of U.S. Attempts • 1989 NCTM Math Standards, NCEST • America 2000, Goals 2000, Improving America’s Schools Act, VNT, NCLB • Policies have remained relatively consistent—political sides change • NCLB expanded national policies • Bipartisan support • Early in implementation with known problems • Lessons learned?

  6. No Child Left Behind • Builds on standards and assessments of IASA • Annual testing in Grades 3-8 plus high school • Growth targets set by states so that all children reach “proficiency level” in 12 years • 95% participation required • Disaggregated groups reporting • Tests and proficiency definition left to states • Options if school “fails” • High failure rates

  7. Policy Limits • Accountability crux • Consequences based on outcomes (AYP targets) • Approaches imported from centrally controlled systems; agreement on outcomes—training • Contrasts to schoolhouse traditions—local control • Weak curriculum/instructional “alignment” • Tests may not be sensitive or conceptually connected to instruction and learning • Recruiting and maintaining quality staff

  8. Early Policy Consequences • External, varying standards and tests from states • Unrealistic targets (AYP) plus • Short timeline to serious sanctions • Ergo: raised scores become the only evidence of learning • Neither likely to measure “high standards” nor to create assessment results that respond to quality instruction • Growing enthusiasm for use of classroom assessment for accountability • Test prep increases • Popularity of reform unstable

  9. Gaps in Practice for the “Theory of Action” of Accountability: Goals are aligned with instruction, tests measure the goals, and feedback on results improves instruction and learning • Problem 1: Alignment is Asserted • At best, links tests with some standards • No complete standards-instruction-test-results loop • No common technical approach to document alignment • Wrong metaphor (geometric congruence)

  10. Description: Extra comfort for senior dogs. Our popular orthopedic pet bed, made extra thick for aging dogs. A full 4" of medical grade convoluted foam supports bones and joints, and the elevated headrest provides proper neck and spine alignment. http://www.petdiscounters.com/dog/beds/cu_orthopedic.html

  11. Joyful manifestation of the heart’s desire. http://www.powerofyoga.com/

  12. Always check Alignment readings before and after work is performed. http://www.fly-ford.com/StepByStep-Front-Series.html

  13. How to Monitor and Improve Alignment • Count items for each standard • Understand weighting of results • Analyze, review, and share lessons that exhibit standards and promote transfer • Examples using teacher assignments • Support collaboration
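
As an illustration of the first two bullets only (not part of the original presentation), the sketch below shows a minimal alignment audit: count the items mapped to each standard and compute each standard's share of the total score weight, flagging standards with no items at all. The item list, standard codes, and point values are hypothetical.

```python
# Minimal alignment-audit sketch; the mapping below is invented for illustration.
from collections import defaultdict

items = [  # (item_id, standard_id, points) -- hypothetical test blueprint
    ("q1", "STD.3.1", 1), ("q2", "STD.3.1", 1), ("q3", "STD.3.2", 2),
    ("q4", "STD.3.4", 4), ("q5", "STD.3.2", 1),
]
all_standards = ["STD.3.1", "STD.3.2", "STD.3.3", "STD.3.4"]

counts = defaultdict(int)   # items per standard
points = defaultdict(int)   # score points per standard
for _, std, pts in items:
    counts[std] += 1
    points[std] += pts

total_points = sum(points.values())
for std in all_standards:
    weight = points[std] / total_points if total_points else 0.0
    flag = "  <-- not assessed" if counts[std] == 0 else ""
    print(f"{std}: {counts[std]} items, {weight:.0%} of score weight{flag}")
```

Even this toy tally makes the weighting visible: a standard can be nominally "covered" by a single one-point item while another standard carries most of the score.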

  14. Gaps in Practice for “Theory of Action” in Assessment and Testing • Problem 2: Assessment Design and Reporting • Multiple purposes, uses, and audiences • Limited designs and types • Unresolved quality issues in traditional testing • A better path

  15. Assessment Purposes: needs sensing, system monitoring, accountability, program evaluation, improvement, achievement, certification, progress, diagnosis, selection, placement, comparisons

  16. Review of Achievement Testing Traditions • Any new approach is compared to the extant commercial standard • Familiar, inexpensive, “trustworthy,” independent of particular learning and teaching, correlated, national norms • One purpose—one test framework • Too many tests with no clear evidence related to accreted purpose(s) • Optimize measurement efficiency

  17. What Should a Coherent Assessment System Do? • Detect differences in instruction • Partially guide educational improvement • Impact positively on instructional practice • Reflect current views of learning and sustained performance • Support fairness

  18. What Should a Coherent Assessment System Do? (Cont’d) • Promote transfer of learning to new applications • Represent the real range of cognitive task demands • Exhibit technical quality for intended purpose(s) • Support enthusiasm for teaching and learning

  19. How to Support Deep Learning? Families of Cognitive Demands • Scientifically based components of school learning • Based on syntheses and targeted research • Map assessment demands to learning processes and products first rather than to psychometrics • Re-emphasize focused thinking, self-management, and transfer of learning skills

  20. Cognitive Families of Intellectual Capital: content understanding, learning, teamwork and collaboration, problem solving, metacognition / learning to learn, communication

  21. From Science to Models to Templates [diagram: scientific findings yield domain-independent principles and a cognitive demands model; combined with subject-matter content, these produce subject-matter-specific models and templates A, B, and C]

  22. From Templates to Tasks [diagram: each template (A, B, C) generates multiple assessment tasks]

  23. Domain-Independent Definition: Content Understanding • Domain-independent set of principles: • Understanding is based on the demonstrated relationships among principled declarative and procedural knowledge • Ability to express critical relationships • The quality of the relationships is judged from an expert knowledge perspective

  24. Domain-Independent Definition:Problem Solving • Depends upon finding the problem (if masked) • Using knowledge to identify critical barriers and ways around them • Selecting procedures to follow, recognizing impasses, and adjusting plan • Has knowledge, metacognitive, motivational, analytic, and feedback components

  25. Ontology

  26. Research-Based Model: Deep Understanding of Content (Domain Independent) • Principles or themes (big ideas) • Key prior knowledge • Explicit relationships • Avoid misconceptions • Expert performance-based scoring

  27. Template Ingredients (Specifications) • Task(s) • Format(s) • Prompt(s) and requirements • Scoring • Directions • Sample
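
As a rough illustration only (not CRESST's actual tooling), the ingredient list above can be pictured as a reusable specification object. The class name, field names, and example values below are assumptions made for the sketch; the rubric dimensions are borrowed from the Hawaiian history example later in the deck.

```python
# Sketch of a template specification; field names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class AssessmentTemplate:
    """One reusable specification, mirroring the ingredient list above."""
    name: str
    task: str              # what the student is asked to do
    response_format: str   # e.g., written explanation, knowledge map
    prompt_frame: str      # prompt text with slots for subject-matter content
    scoring: tuple         # rubric dimensions
    directions: str
    sample: str = ""       # optional sample response or anchor paper

    def instantiate(self, **content):
        """Fill the prompt frame with content to produce a concrete task."""
        return self.prompt_frame.format(**content)

explanation = AssessmentTemplate(
    name="Explanation (Template #1)",
    task="Explain the topic to a classmate who missed class",
    response_format="constructed written response",
    prompt_frame="Write an essay explaining {topic}, drawing on {sources}.",
    scoring=("general impression", "principles and themes", "prior knowledge",
             "relevant concrete examples", "avoidance of misconceptions"),
    directions="Read the source materials, then write your essay.",
)

print(explanation.instantiate(topic="the Bayonet Constitution",
                              sources="the primary source excerpts"))
```

A single specification like this, filled with different subject-matter content, yields many parallel tasks, which is the template-to-task step pictured on slide 22.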

  28. Common Attributes of Template for Deep Understanding of Content • Present primary source materials in each domain • Student required to integrate prior knowledge and principles • Scored by subject matter experts against expert performance

  29. Three Templates for the Model of Deep Understanding of Content • Explanation • Explanation with explicit knowledge • Graphical representation of relationships

  30. Content Understanding Template #1: Explanation • An array of primary source materials • A prompt that asks for an explanation in context • Constructed (written) answer • Evaluated by means of a scoring rubric that embodies key elements of the learning model

  31. Content Knowledge Prompt: Hawaiian History Writing Assignment (Bayonet Constitution) Imagine you are in a class that has been studying Hawaiian history. One of your friends, who is a new student in the class, has missed all the classes. Recently, your class began studying the Bayonet Constitution. Your friend is very interested in this topic and asks you to explain everything that you have learned about it. Write an essay explaining the most important ideas you want your friend to understand. Include what you have already learned in class about Hawaiian history, and what you have learned from the texts you have just read. While you write, think about what Thurston and Liliuokalani said about the Bayonet Constitution, and what is shown in the other materials. Your essay should be based on two major sources: 1. The general concepts and specific facts you know about Hawaiian history, and especially what you know about the period of the Bayonet Constitution. 2. What you have learned from the readings yesterday. Be sure to show the relationships among your ideas and facts.

  32. Excerpts from Hawaiian History Primary Source Documents LILIUOKALANI For many years our sovereigns had welcomed the advice of American residents who had established industries on the Islands. As they became wealthy, their greed and their love of power increased. Although settled among us, and drawing their wealth from resources, they were alien to us in their customs and ideas, and desired above all things to secure their own personal benefit. Kalakaua valued the commercial and industrial prosperity of his kingdom highly. He sought honestly to secure it for every class of people, alien or native. Kalakaua’s highest desire was to be a true sovereign, the chief servant of a happy, prosperous, and progressive people. And now, without any provocation on the part of the king, having matured their plans in secret, the men of foreign birth rose one day en masse, called a public meeting, and forced the king to sign a constitution of their own preparation, a document which deprived [him] of all power and practically took away the franchise from the Hawaiian race.

  33. Content Knowledge Prompt (Cont’d) It may be asked, “Why did the king give them his signature?” I answer without hesitation, because he had discovered traitors among his most trusted friends and because the conspirators were ripe for revolution, and had taken measures to have him assassinated if he refused. It has been known ever since that day as “The Bayonet Constitution,” and the name is well-chosen; for the cruel treatment received by the king from the military companies. [text continues] *From Hawaii’s Story by Hawaii’s Queen, Liliuokalani (Boston: Lee and Shepard Publishers, 1898). • Explain to your friend who missed class the reasons for, and the differences between, the Queen’s and the Senator’s approaches to Hawaii’s future. • Scoring Rubric • General impression (on task) • Principles and themes • Prior knowledge • Relevant concrete examples • Avoidance of misconceptions

  34. Template #2: Prior Knowledge and Explanation • Explicit measurement of knowledge domain in the explanation • Adds short-answer or selected response • Helps interpret explanation performance

  35. Consider the two systems shown above, a balloon and a rocket being launched into space. Using what you know about physics and the applicable laws, write an essay explaining and comparing the forces present in each system. In your essay discuss all the major similarities and differences between the two systems. Also address the following questions in your essay: In which direction will the balloon travel once it is released? How is this similar to the rocket system? How is it different? Explain your answers using what you know about forces. If you placed both of these systems in space, would you expect the movement of the rocket and the balloon to change compared to their movement through air? Would anything else change? Explain why a rocket starts off moving slowly and gets faster and faster as it climbs into space. Does the same thing happen to the balloon? Why or why not?

  36. Template #3: Knowledge Representation • Same prompts • Key aspects of ideas, supporting facts and views, and their relationships • Relationship is explicit • Organizational options • Core and peripheral • Hierarchical • Cause-and-effect • Chronological • Expert scoring

  37. History

  38. Environmental Science

  39. Online Shot Depiction

  40. Genetics

  41. Bicycle Pump

  42. Measuring Transfer • Vary • Content complexity • Number of task elements to address, including distracters or irrelevant content • Graphical support or distraction • Need to prioritize requirements • Linguistic demands

  43. Measuring Transfer (Cont’d) • Response types • Constructed response modes • Length • Response support/prompts • Degree of stringency in criteria
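
Purely as a sketch of the idea (not CRESST's actual procedure), the dimensions listed on these two slides can be treated as a design matrix: crossing a few levels per dimension enumerates the family of transfer variants a task could take. The specific levels below are hypothetical placeholders.

```python
# Enumerate transfer variants by crossing variation dimensions; levels are invented.
from itertools import product

dimensions = {
    "content_complexity": ["core principle only", "principle plus boundary case"],
    "irrelevant_content": ["none", "one distracter passage"],
    "graphical_support": ["supportive diagram", "no diagram"],
    "response_length": ["paragraph", "full essay"],
    "scoring_stringency": ["lenient rubric", "strict rubric"],
}

variants = [dict(zip(dimensions, combo)) for combo in product(*dimensions.values())]
print(f"{len(variants)} task variants")   # 2^5 = 32 in this toy example
print(variants[0])
```

In practice only a subset of these cells would be administered, but the enumeration makes explicit how far each variant sits from the originally taught task.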

  44. Evidence for Model-Based Assessment (MBA) • Across age ranges (preschool to adult) • Reliable scores • Teachable • Impact long-range outcomes (HS exit exam) • Automated scoring using a subset of common domain-independent (DI) elements across topics • Cost low, quality maintained • Reusable elements

  45. CRESST Validation Studies • Score reliability • Task and rater generalizability • Stability of student performance over time • Relationships among measures • Instructional sensitivity • Opportunity to Learn (OTL) • Effect of school composition on performance • Cut-score modeling
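
The first two bullets come down to agreement and generalizability analyses. As a generic illustration only (not the specific analyses CRESST reports), the sketch below computes chance-corrected agreement between two raters, Cohen's kappa, on hypothetical 1-5 rubric scores.

```python
# Generic rater-agreement sketch; the score vectors are hypothetical.
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' rubric scores."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    categories = np.union1d(a, b)
    p_observed = np.mean(a == b)
    # expected agreement from each rater's marginal score distribution
    p_expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

# hypothetical 1-5 rubric scores from two raters on the same ten essays
rater_1 = [3, 4, 2, 5, 3, 3, 4, 2, 5, 4]
rater_2 = [3, 4, 3, 5, 3, 2, 4, 2, 5, 4]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")
```

A full generalizability study would go further, partitioning score variance across tasks, raters, and occasions rather than stopping at a two-rater agreement index.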

  46. CRESST Validity Criteria for Tests and Assessments at Any Level • Fairness • Cognitive complexity • Content domain • Instructionally sensitive • Transfer and generalization • Learning-focused • Validity evidence reported for each use • Trustworthy • Credible

  47. Criteria for Judging Utility of Any Assessment Design • Promote learning of the curriculum • Support cognitive complexity and content richness • Avoid unnecessary language complexity • Support transfer • Reusable components, i.e., templates or objects to save renewal cost • Economical (future, on-the-fly, open-ended scoring) • Engage teachers in challenging instruction • Fair and public

  48. Criteria for Useful Assessments in Classrooms • Validity—detects differences in instruction • Samples the domains claimed to be measured • Provides information about where to focus attention rather than success/lack of success • Integrates cognitive skills and content • Includes transfer for situations and response types • Economical, transparent, and usable • Develops rather than constrains teacher growth
