This presentation discusses the use of indirect assessments in evaluating student learning, focusing on criteria, validity, and writing effective survey items. It offers guidance on common issues and considerations in conducting indirect assessments.
Introduction to Indirect Student-Learning Assessment (Part II) • Dr. Wayne W. Wilkinson • September 28, 2016 • ITTC Faculty Center, Arkansas State University
Refresher: Indirect Assessments • Require the inference of student learning: • No direct evidence or demonstration • Common topics of indirect assessments: • Perceptions of successfully meeting program outcomes • Satisfaction/attitudes/feelings toward program • Utility of program
Criteria: The Survey Must Have a Purpose • Standards used to help make evaluative judgments • Program outcomes, program quality, program utility . . . • Common issues • Choices of proper criteria to use (e.g., program outcomes) • Disagreements over definition (what does “communicate effectively” mean?)
Conceptual vs. Actual Criteria • Conceptual criteria: theoretical constructs; the ideal set of quality factors • Actual criteria: measures of the conceptual criteria; operational definitions
Conceptual and Actual Criterion Relations • Conceptual criteria are theoretical, so the actual criteria should overlap with them as much as possible • Criterion deficiency: parts of the conceptual criterion the actual measure fails to capture • Criterion contamination: parts of the actual measure unrelated to the conceptual criterion • Criterion relevance: the overlap between the actual and conceptual criteria
Content and Face Validity • Content Validity: the degree to which an assessment includes the relevant aspects of the criteria • Determined by Subject Matter Experts (one common way to quantify their judgments is sketched below) • Face Validity: do the items seem legitimate as a measure of the criteria? • Reactions and attitudes of test takers
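The slides say content validity is determined by Subject Matter Experts but do not name a procedure. A minimal sketch of one common quantification, Lawshe's content validity ratio (CVR), is shown below; the panel size, item names, and ratings are hypothetical and not from the presentation.

```python
# Sketch (not from the presentation): Lawshe's content validity ratio,
# CVR = (n_e - N/2) / (N/2), where n_e is the number of experts rating an
# item "essential" and N is the total number of experts on the panel.

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Return Lawshe's CVR for one survey item; ranges from -1 to +1."""
    half = n_experts / 2
    return (n_essential - half) / half

# Hypothetical panel: 8 subject matter experts rate each draft item.
ratings = {                       # item -> number of "essential" ratings
    "communicates_effectively": 7,
    "career_preparation": 3,
}

for item, n_essential in ratings.items():
    cvr = content_validity_ratio(n_essential, n_experts=8)
    print(f"{item}: CVR = {cvr:+.2f}")   # items near -1 are candidates for removal
```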
Types of Items • Open-ended • Restricted (closed-ended) • Partially open-ended • Rating scales
Rating Scales • Likert-type scale: • Aggregated ratings of favorable or unfavorable statements (a scoring sketch follows below) • Semantic differential: • Bipolar adjectives • Measure several variables on a common scale
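A minimal sketch of how Likert-type items are typically aggregated: unfavorably worded statements are reverse-coded so that higher values always mean a more favorable response, and the item scores are then averaged. The items, responses, and 5-point scale below are hypothetical, not drawn from the presentation.

```python
# Sketch (illustrative only): scoring a Likert-type scale by reverse-coding
# unfavorably worded items and averaging across items.

SCALE_MAX = 5  # assumes a 5-point scale (1 = strongly disagree ... 5 = strongly agree)

# Hypothetical respondent answers; items marked reverse=True are unfavorable statements.
items = [
    {"text": "The program prepared me to analyze data.",  "response": 4, "reverse": False},
    {"text": "The program wasted my time.",               "response": 2, "reverse": True},
    {"text": "I would recommend the program to others.",  "response": 5, "reverse": False},
]

def item_score(item: dict) -> int:
    """Reverse-code unfavorable items so higher always means more favorable."""
    if item["reverse"]:
        return SCALE_MAX + 1 - item["response"]
    return item["response"]

scale_score = sum(item_score(i) for i in items) / len(items)
print(f"Aggregated scale score: {scale_score:.2f} (out of {SCALE_MAX})")
```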
Rating Scale Issues • Number of points on scale: • Odd or even (neutral point) • Labeling of scale: • Number of anchors • Numerical values • Item-rating scale congruence
Writing Good Items • Avoid very long items • Avoid negative statements (no, none, not, etc.) • Aim for a 5th-6th grade reading level (a reading-level check is sketched below) • Avoid "check all that apply" items
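The presentation recommends a 5th-6th grade reading level but does not say how to check it. One option, sketched below under that assumption, is the Flesch-Kincaid grade-level formula, 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59; the syllable counter is a rough heuristic and the draft item is hypothetical.

```python
# Sketch (assumption: Flesch-Kincaid grade as a stand-in for "reading level";
# the presentation does not name a formula). Syllable counting is approximate.
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as runs of vowels; always at least 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

draft_item = "The program helped me choose the right analysis for my research question."
print(f"Estimated grade level: {fk_grade(draft_item):.1f}")  # aim for roughly 5-6
```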
Writing Good Items • Avoid items that express more than one thought (double-barreled items), e.g., "The program aided my understanding of selecting and conducting the proper analysis for specific research questions." • Avoid evaluative assumptions, e.g., "The program provided an excellent preparation for my career."
Assembling Your Survey • Survey space and justification • Question organization: • Keep related items together • Question order can have unintended effects • Place sensitive-topic items after less sensitive ones • Graphic navigation paths
Survey Evaluation Session • Survey goal: English BA Program Indirect Assessment • Justification for each item • Criteria content validity • Contamination or deficiency • Item issues • Common indirect assessment topics: • Perceptions of successfully meeting program outcomes • Satisfaction/attitudes/feelings toward program • Utility of program