
Assessing When Numbers Don’t Count




Presentation Transcript


  1. Assessing When Numbers Don’t Count Binghamton University March 23, 2007

  2. Today’s Objectives • Define what a discourse-based, or qualitative assessment method is • Review a few of these methods • Work with some case studies to better understand how these methods might be used • Discuss the uses and limitations of these methods, especially with regard to assessment

  3. The Pressure to Assess • This week, the Spellings Commission meets to discuss several issues, one of which is student learning outcomes • There is consistent pressure to use standardized tests and surveys; we currently use one of them, the NSSE • Standardized tests and surveys give us perspective on how we compare to other institutions • We might be able to gain a “value added” perspective

  4. Weaknesses of the Standardized Test/Survey Approach • We obtain a “macro” perspective, but may not gain a “micro” perspective • We have little control over the questions • “Value added” is still difficult to establish with standardized tests • Faculty see little value in standardized tests, especially in interdisciplinary studies and the liberal arts • Standardized tests and surveys often assume they can measure most of what concerns faculty with respect to teaching and learning

  5. Forging the Middle Ground: Discourse-Based Assessment Methods • Allow for the discovery of the unanticipated • Relevant to interdisciplinary study • Maximize faculty/staff input where discourse is highly valued • Good to use when the number of objectives outweighs the time available to assess student learning • Great contextualizer • When performed carefully and compared with other methods, a great return on investment

  6. Types of Qualitative Assessment • Focus groups • Expert panels • Open-ended surveys • Ethnographic studies (participant observations) • Portfolio reviews • Primary trait scoring • Delphi panels

  7. Qualitative Assessment is an Inductive Process: Defined Focus → Observations → Analysis → Summary Report → Comparison with Other Assessments → Action

  8. Expert Panel • A type of focus group • For assessment purposes, the focus is on a particular assignment or performance • Not a simple conversation; it is conducted methodically, precisely, and systematically • Often involves convenience or random samples of a homogeneous population • Must include carefully written questions • Might include a survey or other assessment technique as part of the process

  9. Krueger’s 10 Quality Factors in Focus Group (Expert Panel) Research • Clarity of purpose • Appropriate environment • Sufficient resources • Appropriate participants • Skillful moderator • Effective questions • Careful data handling • Systematic and verifiable analysis • Appropriate presentation • Honoring the participant, clarity, and method

  10. Expert Panel Procedure: Select Focus → Select Experts → Write Questions → Use Additional Method? → Logistics → Conduct Expert Panel (How to Assess Agreement?) → Report Results
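One simple answer to the “How to Assess Agreement?” question in the procedure above is pairwise percent agreement. A minimal sketch in Python, assuming each expert rates the same assignment on a shared scale; the panel data and function name are hypothetical illustrations, not part of the original slides.

```python
from itertools import combinations

def percent_agreement(ratings):
    """Share of expert pairs who gave identical ratings (illustrative only)."""
    pairs = list(combinations(ratings.values(), 2))
    return sum(a == b for a, b in pairs) / len(pairs) if pairs else 0.0

# Hypothetical panel: four experts rating one student performance
panel = {"expert_a": "proficient", "expert_b": "proficient",
         "expert_c": "satisfactory", "expert_d": "proficient"}
print(f"Pairwise agreement: {percent_agreement(panel):.0%}")  # 50%
```

More formal statistics (e.g., Cohen’s or Fleiss’ kappa) correct for chance agreement, but a raw percentage is often enough to start the discussion a panel is meant to provoke.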

  11. Final Thoughts on Expert Panels • A good method for assessing “ineffable outcomes” • Better when expert panel questions and conversations are grounded in standards and/or student learning objectives • Also good when the specific focus is a particular assignment or performance • It is advisable to use a secondary method either prior to or during the expert panel • Does not control for anonymity among respondents

  12. Primary Trait Scoring • The focus is on one particular assignment, performance, etc., that reflects several aggregate student learning outcomes • Rate each outcome according to a scale—e.g., proficient, satisfactory, unsatisfactory • The idea is to look for trends, rather than raw numbers, that spark discussion

  13. Primary Trait Scoring—Procedure • Choose an assignment in which students demonstrate summative knowledge, skills, or competencies • Carefully rate student performance according to the scale • Place checkmarks in each column • Look for visual trends (see the sketch below) • Discuss why these trends occur, on what basis the ratings were made, and what specific issues the analysis reveals • Combine with other findings, or make plans for action
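To make the “checkmarks in each column” step concrete, here is a minimal sketch in Python that tallies ratings per outcome and prints one mark per rating, so trends are visible at a glance. The outcomes, ratings, and scale values are hypothetical.

```python
from collections import Counter

SCALE = ["proficient", "satisfactory", "unsatisfactory"]

# Hypothetical ratings: one entry per student, per outcome
ratings = {
    "critical thinking":  ["proficient", "satisfactory", "satisfactory", "unsatisfactory"],
    "written expression": ["proficient", "proficient", "satisfactory", "proficient"],
}

for outcome, marks in ratings.items():
    counts = Counter(marks)
    # Print one checkmark ("x") per rating, grouped by scale level
    row = "   ".join(f"{level}: {'x' * counts[level]:<4}" for level in SCALE)
    print(f"{outcome:<20} {row}")
```

The point, as the slide says, is not the numbers themselves but the visual pattern that gets faculty talking.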

  14. Example of Primary Trait Analysis

  15. Primary Trait Analysis—Final Thoughts • Method to get faculty or staff to talk about what assessment results mean • A good starting point toward developing a rubric • Enables discussion, which can lead to further discovery • “Simple, stress free, and easy”

  16. Delphi Panel Introductory Exercise • Divide into 3 groups • Get out a piece of paper and individually write down your answer to: “What do students have the most difficulty with when they first come to college (as first-year students)?” • Try to create frequency counts—combine like answers and tally them • Discuss
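The “combine like answers and tally them” step is a frequency count. A minimal sketch in Python, assuming the group has already normalized similar answers into identical wording; the sample responses are hypothetical.

```python
from collections import Counter

# Hypothetical responses, already combined into like wording
answers = ["time management", "time management", "homesickness",
           "study skills", "time management", "study skills"]

for answer, count in Counter(answers).most_common():
    print(f"{answer}: {count}")
# time management: 3
# study skills: 2
# homesickness: 1
```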

  17. Questions for Groups • What do these counts say about the difficulties students might have when they start? • Take a look at the most “popular” answer—does it ordinarily achieve “majority vote” status? • Even in cases where “majority vote” status is achieved, might less popular answers indicate group consensus?

  18. Introduction to the Delphi Method • Combination of at least 3 methods—open-ended survey, closed-ended survey, and expert panel • Unlike expert panel, attempts to maximize anonymity of respondents to control for power dynamics among these respondents • Assumes highly motivated groups of experts (faculty or staff) willing to participate in more than one round of questions

  19. Introduction to the Method • Find a homogeneous group of experts who can comment on either one assignment or specific student learning outcomes • Create an open-ended survey in which respondents are asked to identify strengths and weaknesses in student performance with reference to a specific standard or student learning outcome • Content-analyze the responses by combining like responses, placing the number of times each was mentioned in parentheses • “Cut and paste” these onto a survey, and ask respondents to indicate the extent to which they agree with each on a 4- or 5-point scale • Report those responses that indicate consensus • If needed, move to a third round, in which respondents rank the consensus items
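A minimal sketch of the second-round consensus check, in Python. The slides do not specify a consensus rule, so the threshold here (at least 75% of experts rating an item 4 or 5 on a 5-point scale) is an assumption, and the round-2 data are hypothetical.

```python
import statistics

# Hypothetical round-2 results: each combined response (with its round-1
# mention count in parentheses) maps to the panel's 1-5 agreement ratings
round2 = {
    "students struggle with time management (5)": [5, 4, 5, 4, 4],
    "students struggle with citation style (2)":  [3, 2, 4, 2, 3],
}

CONSENSUS_SHARE = 0.75  # assumed rule: >= 75% of experts rate the item 4 or 5

for item, scores in round2.items():
    share = sum(s >= 4 for s in scores) / len(scores)
    verdict = "consensus" if share >= CONSENSUS_SHARE else "no consensus"
    print(f"{item}: median={statistics.median(scores)}, agree={share:.0%} -> {verdict}")
```

Items that reach consensus are reported; the rest can be dropped or carried into a third, ranking round.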

  20. Strengths of Delphi Method • A way of addressing “ineffable outcomes” • Can be used to designate the most agreed-upon student learning objectives that faculty have communicated • Can be used to gather information from employers, internship supervisors, alumni, etc. about specific items of interest

  21. Limitations of Delphi Method • Can be time consuming • Takes some knowledge of statistics • Not a method that can be used by itself; usually results need to be compared with direct assessments of student learning

  22. Today’s Activities • Separate into three groups; select a group note taker • If you have not already, read the case study packets • As a group, discuss the questions at the end of the case study—debate, applaud, etc.—do something active • Write your answers on the provided sheet of paper
