Critical Thinking Assessment


Presentation Transcript


  1. Critical Thinking Assessment. February 4, 2010. Jack Bautsch, Brian Palmer, Barbara Goldner, Ann Murkowski, Verna Swanljung, Aryana Bates, Deanna Li, Tracy Furutani, Cesily Crowser.

  2. Deciding on a writing prompt was an arduous process. We agreed as a group that it needed to be as open-ended as possible. This is what we came up with.

  3. Write a short essay addressing the following: Barack Obama recently became President of the United States. The cartoon can have many meanings. 1. What does the cartoon say to you? 2. Does it leave anything out? 3. Which of the classes you have taken here at North help you to think about your answer? For your information: the words listed in this cartoon are the names of some important people and events in United States history.

  4. I think it would have been more useful for assessing critical thinking if we had added one more question asking them to show their evidence.

  5. We learned that next time, if we want to evaluate something, we have to be specific and ask for it.

  6. As far as the rubric goes, it is clear in my mind, at least, that this is just a model for us, one that has been vetted by the AAC&U. It isn't some pie-in-the-sky ideal that we are all going to use to evaluate our assignments.

  7. We decided to try this tool, see where it fails, and then tweak the definitions in a way that works for us. We wanted to see how we could make it work.

  8. The prompt question is, “What does this cartoon say to you?” That requires a relationship with what is going on inside the illustration.

  9. I am not so sure that many people have assimilated and can relate to this mountain of names, so in some respects this prompt is exclusionary. It puts some people at a disadvantage.

  10. Based on my group’s experience and the other group’s scores, it seems to me the scores we found are remarkably low.

  11. I can see two possible conclusions: either our students are not achieving the learning outcome, or our assessment is flawed.

  12. One of the rubric dimensions concerned considering other perspectives. But if the prompt asks, “What does the cartoon say to you?”, you are not going to tap into that.

  13. If we are going to do this again, we need to ask questions that will elicit the kinds of answers we wish to assess.

  14. Our group felt the same. The rubric did not match the task at all.

  15. We had some people who answered the question really well, but if you take the rubric as written, they fell short.

  16. We are working with Civic Engagement now, starting anew. We have a similar rubric from the AAC&U, so now the task is to create a curriculum around that rubric.

  17. That is so different from where this project is. This one starts with the curriculum we already have and looks for whatever critical thinking we can find in it.

  18. When I decide what I want my students to know from this class and I write a test, I have this rubric in my mind.

  19. If they cannot do these basic things, then they most likely will not pass my class.

  20. Beyond that baseline, of course, there is more to know, and it is good when they know more.

  21. From there I construct my tests based upon what I see.

  22. This rubric on critical thinking is a good rubric, if we can find the right questions, ones that speak to young people from Running Start and to students from different countries.

  23. The assessment must be unbiased, assume no prior knowledge, and allow everyone to speak from a personal viewpoint.

  24. I was talking to Deanna before we started. Our math students have to do critical thinking, so how would we apply this rubric?

  25. I could not think of a way to do it. I felt like a duck out of water.

  26. I didn't understand the rubric, yet I had to evaluate someone else's work. It was not a very good feeling.

  27. You and I come from a similar perspective, and I do teach critical thinking. I can see assessing some of my assignments with this rubric.

  28. But using this rubric for this cartoon was a problem. I am not used to reading and evaluating this kind of thing.

  29. We were so lucky to have Aryana, someone with a social sciences perspective. We had to spend time together to understand how Aryana looks for evidence in this type of work.

  30. The second problem was that the papers were pretty poor. Do we say they are all poor, or do we put them in context?

  31. The students were given this setting and this amount of time. Do we take that into account, or do we just say that we do not see evidence of satisfactory critical thinking?

  32. We are seeing that this rubric is not structured from a mathematics and sciences perspective. For me it did not make sense to use it on a piece of writing that had been done in a very short period of time.

  33. It is easier to use from a particular disciplinary perspective, for sure.

  34. I am feeling better about what we are doing with the next set of rubrics, where we are working inside our own classes. There it is more meaningful to me and more meaningful to my students.

  35. For the Information Literacy and Civic Engagement Essential Learning Outcomes, groups of faculty are working together, starting from the rubric. We are finding places in our classes where we can assess one or several of the items.

  36. We are starting from the rubric to line up assignments that are germane to our curriculum. We are from different disciplines, but we all teach Information Literacy. We are sharing how we make that clear and how our students demonstrate that competency in our different classes.

  37. This approach seems much more relevant. I am working on the Civic Engagement outcome. I know I really want to do this, but as a parent educator, how do I work on it? Where do I start?

  38. Then I thought, “Oh! Kindergarten readiness.” I teach parents how to be involved in the school and advocate for their child.

  39. It was interesting to take the rubric and ask myself where it fits, as opposed to using a given rubric to assess something outside of its context.

  40. Applying the Critical Thinking rubric to this cartoon task left me feeling bad. How can I give these students a zero when they didn’t even know what we were looking for?

  41. I know these names that are in the cartoon. I have a relationship with these names. But some people didn't know; they thought it was somebody standing with a priest or a judge.

  42. It seems to me many people got stuck on the fact that they didn't understand or have a relationship with the message. When we asked them how it touched them in some way, nobody had the courage to say, "This doesn't touch me at all." That would have made an interesting essay.

  43. In our group we had an international student who admitted she didn't know what the cartoon was about or its context, but she was nevertheless able to show that she could take something foreign and think about it.

  44. That was a demonstration of critical thinking, talking about what she did and did not know in an intelligent way.

  45. I think the cartoon lacks clarity for many people. The question, "Does it leave anything out?" implies an understanding of what should be in it. If the cartoon is not clear or logical, you are not going to be able to answer the question.

  46. When you were developing this assessment, what was the first question you started with, the one you said was more sophisticated?

  47. We started with a text-based question and threw that out in favor of an image-based question. The major reason was that we didn't want to bias the assessment against students who were reading in a second language and who had only an hour to read, comprehend, and respond.

  48. So we moved from very specific to super open-ended.
