Putting it All Together and Bringing it Home
Workshop Flow
• The construct of MKT
  • Gain familiarity with the construct of MKT
  • Examine available MKT instruments in the field
• Assessment Design
  • Gain familiarity with the Evidence-Centered Design approach
  • Begin to design a framework for your own assessment
• Assessment Development
  • Begin to create your own assessment items in line with your framework
• Assessment Validation
  • Learn basic tools for how to refine and validate an assessment
  • Plan next steps for using assessments
Assembly of Final Version of Test
• Select refined items that
  • Are aligned with the content and processes laid out in the test specification (i.e., via expert panel review)
  • Perform appropriately in think-alouds and pilot testing
  • Perform appropriately in statistical analyses of field-test results (IRT analyses, coefficient alpha, factor analysis; see the sketch below)
• Lay out the test booklet
  • Sequence items so that a few easy items appear at the beginning and items of different difficulty levels are distributed throughout the form
• Create multiple forms, if applicable
• Review and finalize test administration directions
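As a rough illustration of two of the statistical checks named above, here is a minimal Python/NumPy sketch, assuming dichotomously scored (0/1) field-test responses; the data, function names, and item ordering rule are hypothetical, and real analyses would also include IRT and factor analysis, which are not shown.

```python
# Minimal sketch: coefficient alpha and item difficulty from 0/1 field-test data.
# The matrix and names below are illustrative, not from the workshop materials.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Coefficient alpha for a respondents-by-items score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_difficulty(scores: np.ndarray) -> np.ndarray:
    """Proportion correct (p-value) for each item."""
    return scores.mean(axis=0)

# Hypothetical pilot data: 5 respondents x 4 items
pilot = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
])

print("alpha:", round(cronbach_alpha(pilot), 3))
print("difficulty (p-values):", item_difficulty(pilot))

# Sequencing idea from the slide: place a few of the easiest items
# (highest p-values) first, then distribute the rest across the form.
easiest_first = np.argsort(-item_difficulty(pilot))
print("item order by easiness:", easiest_first)
```

In practice these statistics would be computed on the full field-test sample and reviewed alongside expert-panel and think-aloud evidence before items are selected for the final form.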
Purpose of Technical Documentation
To provide test users with the information needed to make sound judgments about the nature and quality of the test, the resulting scores, and the interpretations based on those scores.
Technical Documentation Standards
Provide information in the technical documentation on the following:
• Nature of the test
• Intended use
• Description of the test development process
• Technical information related to scoring and interpretation
• Technical information related to evidence of validity and reliability
• Guidelines for test administration and interpretation
Some Examples of Technical Documentation
• The NAEP 1998 Technical Report: This report provides details on the instrument development, sample design, data collection, and data analysis procedures for the 1998 National Assessment of Educational Progress (NAEP) in the subject areas of reading, writing, and civics. Nearly 448,000 public- and nonpublic-school students across the United States were assessed at the national (grades 4, 8, and 12) and state (grades 4 and 8) levels. http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2001509
• TIMSS 2003 Technical Report: http://timss.bc.edu/PDF/t03_download/T03TECHRPT.pdf
Important Caution
• Our emphasis is on evaluating programs, not individual teachers.
• We are not discussing assessments that will have consequences for individual teacher evaluation or employment decisions.
Resources for Assessment Development
• Personnel
• Money
• Time
It will take as little or as much of these as you can put into it.
Types of Personnel
[Process diagram: the assessment development stages (domain analysis; domain modeling/design pattern; define test specs; define item templates and specs; develop and refine the item pool; assemble the test; collect and analyze validity data; document technical information) annotated with the personnel involved: content experts, assessment design specialists, test operations experts, research participants, and a psychometrician.]
Selected Major Time Considerations
[Process diagram highlighting the most time-intensive stages: domain modeling; developing and refining items; expert panel reviews; think-alouds; pilot testing; field testing; and duplication, scoring, and data analysis.]
Financial Considerations
[Process diagram highlighting the main cost drivers across the same stages: labor; outside expertise; participant stipends; materials, duplication, and scanning; scoring; and data analysis.]
Activity #6: Bringing This to Your Work
• Reflect on what we have discussed in this workshop:
  • MKT
  • Assessment Design
  • Assessment Development
  • Assessment Validation
• Suggested questions are on the activity sheet
• Afterward, please share your reflections
Sharing
• What are the important points you will bring home from this workshop to your own assessment development processes?
• What questions did you have at the beginning of the workshop? Have they been answered?
• What will you do differently in your assessment processes?
• Were there any "ah-ha" moments for you in this workshop?
• What challenges do you expect to face?
• What questions do you still have?
• What are your next steps?
Feedback
Nikki Shechtman, nicole.shechtman@sri.com
Teresa Lara-Meloy, teresa.lara-meloy@sri.com
SRI International