Announcements • We will work on the Terminology Treasure Hunt in class in two weeks. I need a few volunteers to help me bring materials over to class next week. • Don't delay starting your standardized test review/critique. Please contact Brigid (brigidov@unm.edu) if you or your partner cannot come in on a Thursday from 2:00 to 5:00, when she is scheduled in the Ed Diag office.
APA Tip of the Day: Quoting online sources • Include the author – if a person is not listed, use the name of the organization (e.g., Council for Exceptional Children). • Include the year – look carefully; often a year is listed at the bottom of the page in small print. • Include the page number, paragraph number, or, if neither of these is available, the section heading for the quoted text. See APA manual, 2010, pp. 171-172
APA Examples From APA manual, 2010, p. 172: • In their study, Verbunt, Pernot, and Smeets (2008) found that "the level of perceived disability in patients with fibromyalgia seemed best explained by their mental health condition and less by their physical condition" (Discussion section, para. 1). Note: Discussion is the complete section heading. • "Empirical studies have found mixed results on the efficacy of labels in educating consumers and changing consumption behavior" (Golan, Kuchler, & Krissof, 2007, "Mandatory Labeling Has Targeted," para. 4). Note: "Mandatory Labeling Has Targeted" is NOT the complete section heading; therefore, it appears in quotation marks.
Topic: Norm-referenced vs. criterion-referenced tests October 8, 2013
data collection • measurement • evaluation • testing • assessment
Common Purposes of Assessment • Identification of atypical learning needs (i.e., disability and/or gifted/talented). • Determination of language proficiency. • Evaluation of current academic performance. • Accountability.
Prereferral Intervention This is a GENERAL EDUCATION process; it should not automatically lead to a referral for special education evaluation.
Special Education Assessment Processes • Screening • Evaluation (initial) • Re-evaluation • Ongoing data collection (i.e., classroom-based assessment)
Screening • Quick evaluation(s) in area(s) of concern. • May be administered by a specialist or classroom teacher. • Typically used to rule out the need for further assessment.
Special Education Evaluation • Team Process • Parent Consent • Parent participation • Non-discriminatory: • cultural & linguistic bias • appropriate instruments • multifaceted assessment
More useful purposes of assessment • What helps this student learn best? (Needed supports, scaffolding, cueing & prompting strategies.) • Patterns of language use by context. • Available supports for learning. • Patterns of progress toward specific learning goals. • Influence of the learning environment on learning, performance, and behavior.
Important! Our assessment methods MUST match our purpose.
Effective assessment means using the right tool for the job.
Quick Write: Why is it important for special educators to understand the language of testing if they probably won't be administering any diagnostic assessments?
Common Assessment Techniques • Standardized Assessments: • norm-referenced and/or • criterion-referenced • other (neither NR nor CR) • Non-standardized (informal) assessments: • norm-referenced • criterion-referenced • other (neither NR nor CR)
Other kinds of assessments: • Developmental scales • Dynamic assessments • Language samples Additional instruments that might be used in assessment: • Family/child history • Interviews • Other documents • Observations
Standardized vs. informal measures Standardized Tests: Tests that are "designed by test specialists and administered, scored, and interpreted under standard conditions" (Linn & Gronlund, 2000, p. 44).
Quick Question: • Are classroom-based assessments usually standardized? • Why or why not?
Norm-referenced Tests Describe “performance in terms of the relative position held in some known group (e.g., typed better than 90 percent of the class members).” (Linn & Gronlund, 2000, p. 42) NR assessments compare individual performance against others’ performance.
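To make the norm-referenced idea concrete, here is a minimal sketch in Python (not from the slides; the scores and the percentile_rank helper are invented for illustration) that locates one student's raw score relative to a norm group:

```python
def percentile_rank(score, norm_group):
    """Percent of the norm group scoring below the given score.

    A norm-referenced interpretation: the result describes relative
    standing ("better than X percent"), not what the student can do.
    """
    below = sum(1 for s in norm_group if s < score)
    return 100 * below / len(norm_group)

# Hypothetical norm-group scores on a typing test (words per minute).
norm_group = [22, 25, 28, 30, 31, 33, 35, 38, 41, 45]

print(percentile_rank(38, norm_group))  # 70.0 -> typed better than 70% of the group
```

Note that the result says nothing about what the student can actually do; it only places the score within the comparison group.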
Quick Question: • Would norm-referenced classroom-based assessments be appropriate for students identified with special education needs? • Why or why not?
Criterion-referenced Assessments Describe "the specific performance that was demonstrated." (Linn & Gronlund, 2000, p. 42) They are used to compare individual performance against a preset standard (criterion).
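By contrast, a criterion-referenced interpretation ignores the group entirely. A minimal sketch, again with an invented score and a hypothetical criterion, continuing the typing example above:

```python
# Hypothetical preset standard (criterion): 40 words per minute.
CRITERION_WPM = 40

def meets_criterion(score, criterion=CRITERION_WPM):
    """Criterion-referenced interpretation: compare the performance
    itself to a preset standard, ignoring how anyone else scored."""
    return score >= criterion

print(meets_criterion(38))  # False -> has not yet demonstrated the target skill
```

The same 38 wpm that looked strong norm-referenced (better than 70 percent of the group) fails the preset criterion, which is why the two interpretations answer different questions.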
Strengths and limitations Limitations, NOT weaknesses.
“There is no such thing as a ‘good’ or ‘bad’ test in the abstract, and… there is no such thing as the one ‘best’ test, even for a specific situation.” (Bachman & Palmer, 1996, p. 6)
Six aspects of test quality: • Reliability • Construct validity • Authenticity • Interactiveness • Impact • Practicality
Individual Activity: Individually, come up with at least three sentences using the word “reliable” and at least three sentences using the word “valid.” These sentences do not have to have anything to do with assessment or evaluation -- they can be the kinds of things you would say in ‘real life.’
Definitions Reliability: 1) "Reliability refers to the results obtained with an assessment instrument and not to the instrument itself." 2) "An estimate of reliability always refers to a particular type of consistency" (e.g., consistency over time, across raters, or across tasks).
Definitions, cont. Reliability: 3) “Reliability is a necessary but not sufficient condition for validity.” 4) “Reliability is primarily statistical.” (Linn & Gronlund, 2000, pp. 108-109)
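Because reliability is "primarily statistical," one concrete way to estimate a particular type of consistency is a test-retest estimate: correlate the same students' scores from two administrations of the same test. A minimal Python sketch with invented scores:

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between two score lists of equal length."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores from the same six students on two administrations
# of the same test, two weeks apart (test-retest consistency).
time1 = [55, 60, 62, 70, 75, 80]
time2 = [57, 58, 65, 72, 73, 82]

print(round(pearson_r(time1, time2), 2))  # 0.97 -> highly consistent over time
```

A coefficient near 1.0 suggests the instrument yields consistent results over time for this group; per the slides, that consistency is necessary but not sufficient for validity.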
Definitions, cont. Validity: 1) "Validity refers to the appropriateness of the interpretation of the results of an assessment procedure for a given group of individuals, not to the procedure itself." 2) "Validity is a matter of degree; it does not exist on an all-or-none basis."
Definitions, cont. Validity: 3) "Validity is always specific to some particular use or interpretation. No assessment is valid for all purposes." 4) "Validity is viewed as a unitary concept based on various kinds of evidence."
Definitions, cont. Validity: 5) "Validity involves an overall evaluative judgment. It requires an evaluation of the degree to which interpretations and use of assessment results are justified by supporting evidence and in terms of the consequences of those interpretations and uses." (Linn & Gronlund, 2000, pp. 75-76)
Definitions, cont. “Validity is an evaluation of the adequacy and appropriateness of the interpretations and use of assessment results.” (Linn & Gronlund, 2000, p. 73)
Cautions related to use of “validity” "Validity refers to the appropriateness of the interpretation of the results of an assessment procedure for a given group of individuals, not to the procedure itself." "Validity is a matter of degree; it does not exist on an all-or-none basis."
Cautions, cont. "Validity is always specific to some particular use or interpretation. No assessment is valid for all purposes." "Validity is a unitary concept." "Validity involves an overall evaluative judgment." (Linn & Gronlund, 2000, pp. 75-76)
Main Points: 1) A test must be reliable for the interpretation to be valid. 2) Reliability, in and of itself, is not enough -- the interpretation must also be valid for the individual and specified purpose of assessment. 3) Validity refers to the interpretation of the test results, not to the test.
Main Points, cont.: 4) As a special educator, the most important thing you need to learn about standardized tests is how to interpret assessment results -- are they valid FOR THIS CHILD AT THIS TIME?
Please take some time for the mid-semester course evaluation.