Speaking and Listening: Sounds Straightforward, but What’s the Real Story?
• NCSA, June 22, 2016
• Wes Bruce – ELPA21 Technology Consultant
• Carsten Wilmes, Ph.D. – WIDA Director of Assessment
• Terri Schuster – Nebraska Title III Director
• Jen Paul – EL and Accessibility Assessment Consultant, Michigan Department of Education
Who Are the English Language Proficiency Consortia?
ELPA21
• Funded by a USED assessment grant
• Oregon is the lead state
• Brand-new assessment based on the ELP standards developed by WestEd
• Assessment operational in 2015-16
• Standard setting this summer
WIDA
• Originally funded by a 2002 USED grant
• Housed at the University of Wisconsin-Madison
• Directed by the WIDA Board of Directors (SEAs)
• New online ACCESS for ELLs 2.0 assessment operational in 2015-16 (in addition to paper)
• New standard setting in July/August of this year
Evolving the Assessment of Listening and Speaking
• English language proficiency tests have been assessing the “four” domains for quite a while.
• We are focused on how this next generation of EL tests addresses Listening and Speaking.
• Previously:
  • Listening was delivered via teacher-read or recorded prompts that students responded to
    • Standardization of the read-aloud
    • Errors in advancing to the correct prompt
  • Speaking was often administered 1-to-1, with the student responding and the teacher scoring the response with rubrics, then recording the student’s score
    • Time consuming
    • Reliability of scoring
    • Standardization
Question: Can Technology Improve the Assessment of Speaking and Listening?
• Provide a more standardized domain assessment, increasing reliability and validity without increasing costs
• Remove the demands of 1-to-1 delivery and of teacher training and scoring of Listening items
• ELPA21 and WIDA worked to build items with an “intuitive” interface that students with little exposure to technology can use
• All grades K-12 use the same interface (grades 1-12 for WIDA)
• All audio is professionally recorded (not text-to-speech)
• Students get one (WIDA) or two (ELPA21) opportunities to hear the prompts in Listening
• Students get one (WIDA) or two (ELPA21) opportunities to record in Speaking
Speaking Requires Additional Hardware
• Students must have headsets (headphones with a microphone)
  • A directional, adjustable microphone is best
  • Think about durability and sanitation
• The headset must be compatible with the testing device
  • Dual RCA plugs, single RCA, USB
  • Adapters (Y, USB to RCA)
• The headset must “fit” the user (ELPA21 tests Kindergarten students)
• Cost is an issue
  • Headsets are “only” needed for the maximum number of students who will test simultaneously at one site
  • Headsets can easily be organized into kits and shared between testing sites
ELPA21 “Controls”
Listening
• Separate control for the directions and the question
• Darker background when “playing”
Speaking
• Record, stop, and play controls
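As a rough illustration of the Speaking controls above, the sketch below shows how a browser-based record/stop/play widget could be built on the standard MediaRecorder API. This is a hypothetical example, not the actual ELPA21 or WIDA delivery code; the `playback` element ID, the `audio/webm` format, and the two-attempt limit are illustrative assumptions.

```typescript
// Hypothetical sketch: browser-side record / stop / play controls for a speaking
// item, using the standard MediaRecorder API. Element IDs and the two-attempt
// limit are illustrative assumptions, not actual ELPA21/WIDA code.

let recorder: MediaRecorder | null = null;
let chunks: Blob[] = [];
let attemptsLeft = 2; // e.g., ELPA21 allows two opportunities to record

async function startRecording(): Promise<void> {
  if (attemptsLeft <= 0) return; // no recording opportunities remaining
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  chunks = [];
  recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    // Hand the captured response to the "play" control so the student can review it
    const blob = new Blob(chunks, { type: "audio/webm" });
    const player = document.getElementById("playback") as HTMLAudioElement;
    player.src = URL.createObjectURL(blob);
    stream.getTracks().forEach((t) => t.stop()); // release the microphone
    attemptsLeft -= 1;
  };
  recorder.start();
}

function stopRecording(): void {
  if (recorder && recorder.state === "recording") {
    recorder.stop(); // fires ondataavailable, then onstop
  }
}
```

In an operational platform, the recorded response would be uploaded to the scoring center rather than kept in the browser, which is what enables the re-listening and centralized scoring described later in this deck.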
ELPA21 Speaking and Listening: Administration Considerations and Lessons Learned
English Language Proficiency (ELP) Assessments and English Language Arts (ELA) Assessments: What’s the Diff?
ELP Assessments
• Required to assess and report on all four language domains: Reading, Writing, Listening, and Speaking
• The construct is the learning of a new language
• The interactive nature of language makes assessing listening and speaking challenging
ELA Assessments
• Most assess reading skills and sometimes writing, rarely listening or speaking
• The construct is usually centered on reading comprehension, vocabulary, and writing skills
Listening and Speaking Evolution: ELPA21 and ELDA (Legacy Assessment)
ELPA21 (K-12)
• Listening
  • Students wear headsets and listen to audio prompts as they move independently through an online form
• Speaking
  • May be group administered
  • Students listen to prompts and speak into headsets, where their responses are recorded, as they move independently through an online form
  • Scored at a scoring center
Legacy Test (ELDA 3-12*)
• Listening
  • Students listen to CDs while working through a booklet
• Speaking
  • Individual administration
  • Students listen to CDs and look at pictures in booklets
  • Scored on the fly with rubrics by the test administrator
* K-2: inventory of skills
Benefits: Online Assessment of Listening and Speaking
• May be group administered (with a caveat for young students)
• Embedded audio eliminates the need for additional playback equipment
• Students may re-record responses
• Recorded responses allow for re-listening during scoring
• Eliminates the need for training local test administrators on rubrics
  • Trained scorers at a scoring center
  • Uniformity in training of scorers / more consistent ratings
Challenges: Online Assessment of Speaking and Listening
• Use of headsets
  • Can be costly and may not be purchased with Title III funds
  • May be shared, since the test does not need to be taken by everyone at the same time
  • Headsets used with other software programs may work
  • Need to fit the student’s head
  • Students need practice speaking into them
• Microphones pick up background noise, other students, and teacher talk
  • Teachers were sometimes heard “coaching” students too much
  • Consider adding physical space between students for the Speaking portion
• Recommend trying out several headsets before placing a large order
Challenges Continued… Assessing Young Students
• A low student-to-test-administrator ratio is recommended
• Test administrators should be very familiar with the Directions for Administration prior to test time
• Practice with the Interactive Demo (practice test) is critical
  • Especially the Speaking record/playback/re-record features
• Headsets need to fit
• Test administrators need to understand when it is appropriate to help with technology/navigation and what constitutes too much assistance
Preparation and Practice = Success
• Advance coordination between school/district IT folks and the vendor
  • Technical support
  • Ensure devices are supported
• Familiarity with directions and navigation by test administrators
• Understand the allowed accessibility features and how they may be applied in the platform (embedded and non-embedded)
• Plenty of practice with the interactive demo (practice test) and its features prior to testing