Learn key considerations for technical assessments, including types, levels, development methods, administration, and analysis. The discussion covers validity, reliability, ethical and legal issues, and the assessment development cycle, with guidance from Dr. John M. Townsend.
What Should You Consider When Talking About Technical Assessments? John M. Townsend, Tennessee Board of Regents, NACTEI 27th Annual Conference
Why are you assessing? The most important question • For Perkins IV? • For program improvement? • For student attainment?
If for student attainment: What level is appropriate? • Entry level • Mastery level • Industry-certification level
When do you assess? • End-of-program • End-of-course • End-of-pathway • Cluster-related
Norm- or Competency-Referenced? • Norm-referenced: comparison among individuals; demonstrates changes in students as a group • Competency-referenced: attainment by an individual; measurable learning outcomes based on industry standards
Buy or Develop? • Standardized instruments • Centralized state-developed instruments • Local assessments recognized by the state
Performance or Cognate? • Performance: what the student should be able to do; assessed with a checklist; only a limited number of areas can be assessed in the specified time • Cognate: what the student should know; assessed with test items; many areas can be assessed in a specified time
Building an Assessment • Adopt – seek items or assessments that you can use as is • Adapt – modify items or assessments to fit your standards • Build – create items or assessments when suitable ones cannot be found
Table of Specifications (blueprint) • Determine the competencies to be assessed • Determine competency groups (3-10) • Determine the percentage of each cognitive typology per group
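To make the blueprint concrete, below is a minimal Python sketch that turns group weights and cognitive-level weights into per-cell item counts. The competency groups, weights, and levels are hypothetical placeholders, not taken from the slides.

TOTAL_ITEMS = 60  # hypothetical test length

# Percentage of the assessment devoted to each competency group (3-10 groups).
group_weights = {
    "Safety": 0.20,
    "Tools and Equipment": 0.30,
    "Core Procedures": 0.35,
    "Documentation": 0.15,
}

# Share of each group's items at each cognitive level (Bloom-style placeholders).
level_weights = {"Recall": 0.40, "Application": 0.40, "Analysis": 0.20}

for group, g_w in group_weights.items():
    for level, l_w in level_weights.items():
        n_items = round(TOTAL_ITEMS * g_w * l_w)
        print(f"{group:<20} {level:<12} {n_items:>3} items")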
Development Questions • Are the cost and time worth the benefit of the test? • Who should develop the assessment? • What are the test-security issues?
Multiple-choice item elements: • Stem • Correct response • Foils or distractors
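For illustration only, a hypothetical item (not drawn from the slides) showing the three elements: Stem: Which instrument measures electrical current? Correct response: Ammeter. Foils or distractors: Voltmeter, Ohmmeter, Oscilloscope.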
Fixed-Form v. Variable-Form • Fixed-form: high reliability; needs high security; limited number of test items needed • Variable-form: good reliability; lesser security needed; requires a large pool of test items
Assessment Administration • Decide on the medium for administration • Identify who should conduct the assessment • Determine when the assessment activities should occur
Reporting issues: • Grading of the assessment • Assessment improvement needs • Reports needed
Assessment Review • Validity – the degree to which certain inferences can be made from test scores. • Reliability – the degree of consistency between two or more measures of the same thing.
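Reliability is often estimated with an internal-consistency coefficient; a minimal Python sketch of one common estimate, Cronbach's alpha, follows (the score matrix is fabricated for illustration).

scores = [  # rows = examinees, columns = items (1 = correct, 0 = incorrect)
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

k = len(scores[0])  # number of items
item_vars = [variance([row[i] for row in scores]) for i in range(k)]
total_var = variance([sum(row) for row in scores])

# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")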
Item & Assessment Analysis Find a good statistician, psychometrician, or educational psychologist!
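Expert help is best, but the two workhorse item statistics are easy to sketch: difficulty (the proportion answering correctly) and discrimination (here, the point-biserial correlation between the item and the rest of the test). A minimal Python sketch with fabricated data:

import statistics

scores = [  # rows = examinees, columns = items (1 = correct, 0 = incorrect)
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]

def point_biserial(item, rest):
    # Pearson correlation between a dichotomous item and the rest score.
    mi, mr = statistics.mean(item), statistics.mean(rest)
    si, sr = statistics.stdev(item), statistics.stdev(rest)
    cov = sum((a - mi) * (b - mr) for a, b in zip(item, rest)) / (len(item) - 1)
    return cov / (si * sr)

for i in range(len(scores[0])):
    item = [row[i] for row in scores]
    rest = [sum(row) - row[i] for row in scores]  # total excluding this item
    difficulty = sum(item) / len(item)
    print(f"Item {i + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {point_biserial(item, rest):.2f}")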
Ethical Issues in Testing See the Code of Fair Testing Practices in Education, developed by the Joint Committee on Testing Practices (American Educational Research Association, American Psychological Association, and National Council on Measurement in Education).
Legal Issues • Accommodations for members of special populations • Access to test items by parents and others • Use of assessments in decisions about passing or graduation
Development Cycle • Competency learning outcome development • Blueprint development • Test-item development • Field Testing • Item-analysis
Development Cycle (continued) • Set cut scores • Test form development • Assessment administration • Reporting of results
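The slides do not prescribe a method for setting cut scores; the modified Angoff procedure is one widely used approach: each judge estimates, for every item, the probability that a minimally competent examinee answers it correctly, and the cut score is the sum of the averaged estimates. A minimal Python sketch with fabricated ratings:

judge_ratings = {  # judge -> one probability estimate per item
    "Judge A": [0.80, 0.60, 0.70, 0.50],
    "Judge B": [0.75, 0.65, 0.60, 0.55],
    "Judge C": [0.85, 0.55, 0.65, 0.45],
}

n_items = len(next(iter(judge_ratings.values())))

# Average the judges' estimates item by item, then sum across items.
item_means = [
    sum(ratings[i] for ratings in judge_ratings.values()) / len(judge_ratings)
    for i in range(n_items)
]
cut_score = sum(item_means)
print(f"Recommended cut score: {cut_score:.1f} of {n_items} items")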
Contact Information: Dr. John M. Townsend Executive Director, Workforce Development Office of Academic Affairs Tennessee Board of Regents 1415 Murfreesboro Pike, Suite 350 Nashville, TN 37217 615-366-4428 John.Townsend@tbr.edu