
How do we know what they know?

Presentation Transcript


  1. How do we know what they know? FASS Meeting Orlando, FL April 5, 2004 Arthur Eisenkraft (eisenkraft@att.net)

  2. Are we listening? • The optometrist • The Duracell competition • Two objects falling in a vacuum • The cord of wood

  3. No Child Left Behind • Enormous concern about NCLB and other high-stakes assessments. • NCLB – a potential nightmare: • AYP – “need of improvement” • public embarrassment • best students leaving, scores dropping further, closing of schools. • What can we do? • This year alone, 26,100 of the nation’s 91,400 schools have been labeled “schools that need improvement.” (Sam Dillon, 1 in 4 Schools Fall Short Under Bush Law, N.Y. Times, January 27, 2004, at A21)

  4. NCLB – Advice • Assessment in Support of Instruction and Learning: Bridging the Gap Between Large-Scale and Classroom Assessment – Workshop Report (2003), Board on Testing and Assessment, Mathematical Sciences Education Board, Center for Education • www.nap.edu

  5. The Deborah Meier Amendment • A basic test should first be taken by the folks we honor by electing to office. • The people who legislate or mandate a test should be required first to take it themselves to ensure that it's measuring what they think it is. It's a form of validity checking. • They might even have their scores posted! • Seeking Alternatives to Standardized Testing • By Jay Mathews, Washington Post Staff Writer, Tuesday, February 17, 2004; 10:46 AM

  6. Today’s Discussion • Formative classroom assessment can positively impact instruction and is therefore our best route to having students perform better on all tests. • What can teachers do? • What can we do to support teachers?

  7. Brief History of Assessment • When did it all begin? • No Child Left Behind • FCAT • New York State Regents

  8. How do we assess? • Please discuss • Observation • Portfolio • Projects • Questioning • Paper and pencil • Interview • Presentation • Checklist • Skills • Self assessment • Quizzes • Conferences

  9. Classroom Assessment • The Grade Book • Tests • Quizzes • Homework • Class participation (?) • Lab reports • Attendance (X) • Projects • The Final Exam • Local • State – high stakes • These are often treated as summative, though they do inform as formative. • Other formative assessments include: • Questions in class • Practice tests

  10. Get tests back immediately • They can then be used for formative assessment. • How can anyone simply continue instruction when a tool that reveals student understanding is in hand? • Easily measured by supervisors and students alike

  11. Formative Assessment • The value of formative assessment (Paul Black): students often have limited opportunities to understand or make sense of topics because many curricula have emphasized memory rather than understanding. Textbooks are filled with facts that students are expected to memorize, and most tests assess students’ abilities to remember the facts.

  12. What Goes Wrong? • Tests that do not correlate with understanding • Force Concept Inventory (FCI) • Regents exam question on moving galaxies • Private Universe videotapes

  13. What Goes Wrong? • Tests that do not correlate with understanding • Force Concept Inventory (FCI) • Regents exam question on moving galaxies • Private Universe videotapes • We’re not testing what we teach • Harris cartoon of mouse and maze

  14. What Goes Wrong? • Tests that do not correlate with understanding • Force Concept Inventory (FCI) • Regents exam question on moving galaxies • We’re not testing what we teach • Harris cartoon of mouse and maze • We’re not teaching what we test • “Waldo” phenomenon

  15. Improvements • Rubrics • Clearly defined grading schema – matrix • B++++ and A----- • Have students help create the rubric • Ownership • Motivation • Have students self-evaluate with the rubric

  16. The grading rubric +/- • Student Grade | Teacher Grade • A | A • A | C • C | C

  17. Improvements • Rubrics • Clearly defined grading schema – matrix • B++++ and A----- • Student and teacher comparisons: A,A or C,C or A,C or C,A • All require very different discussions • Saphier: effective instruction • Testing for understanding • How do you know what the students know?

  18. Cognitive Empathy With references from The Skillful Teacher by Jon Saphier

  19. Checking for Understanding Knowing when students don’t understand suggests that teachers have means for checking for understanding. What means do we have for checking for understanding?

  20. Checking for understanding • Presses on • No test returned for 3 weeks • After the math lesson, “here are your 25 problems” • “Take a clean sheet, we’re going on” • No clue there are kids in the room • Never asks students to explain • Incorrect response – “can anyone else answer?”

  21. Checking for understanding • Presses on • Reads cues • Their looks • Eye contact • Nodding heads • Asleep or awake • Misbehavior • I can see it in their eyes

  22. Checking for understanding • Presses on • Reads cues • “dipsticks” • White-boarding • Short quiz • Raise the hand • Choral answers • Cards with A or B • Raise fingers with 1 (index) or 2 • List answers on the board – which answer is the best?

  23. Checking for understanding • Presses on • Reads cues • “dipsticks” • Uses recall questions • What do we already know? • List examples • Who invented…? • Where’s Waldo? • Definitions • Name the parts of the microscope • The scientific method • Restating what is already known

  24. Checking for understanding • Presses on • Reads cues • “dipsticks” • Uses recall questions • Uses comprehension questions • Explain • Justify • Compare • Apply • Calculate • Why • Summarize

  25. Checking for understanding • Presses on • Reads cues • “dipsticks” • Uses recall questions • Uses comprehension questions • Anticipates confusion • Photosynthesis, Krebs cycle • Understanding that some kids are literal • Underground railroad • Misconceptions research • Look at prior knowledge • Teachers examining their own assumptions

  26. A TEST for Checking for Understanding How do you know that a student understands? What evidence do you have? How often should you be able to answer this question?

  27. The National Science Education Standards (NSES) • Less emphasis on: Assessing what is easily measured • More emphasis on: Assessing what is most highly valued • Less emphasis on: Assessing to learn what students do not know • More emphasis on: Assessing to learn what students understand

  28. Instructional Models • Karplus • three-phase learning cycle • exploration, invention and discovery • Lawson • exploration, term introduction, and concept application • Bybee’s 5E • Engage, explore, explain, elaborate, evaluate • 7E – a clarification of the 5E

  29. 4 Q Assessment Model • What does it mean?

  30. 4 Q Assessment Model • What does it mean? • How do we know?

  31. 4 Q Assessment Model • What does it mean? • How do we know? • Why should I believe?

  32. 4 Q Assessment Model • What does it mean? • How do we know? • Why should I believe? • Why should I care?

  33. The other four questions • What did you say? • Should we take notes? • When is class over? • Will this be on the test?

  34. Challenges • Identify whom we are testing • Students • Teachers • Schools and districts

  35. Challenges • Identify for what purpose • (from Classroom Assessment and the NSES, NRC) • Help students learn • To illustrate and articulate the standards for quality work • To inform teaching • To guide curriculum selection • To monitor programs • To provide a basis for reporting concrete accomplishments to interested parties • For accountability • Certification • Reporting individual achievement • Grading • Placement • Promotion • Accountability to parents and taxpayers • (from High Stakes Assessments, NRC)

  36. Challenges • Are we trying to use ONE instrument • for all (students, teachers, schools)? • for all purposes? • Understanding vs. belief • Mazur’s student taking the FCI

  37. www.nap.edu
