Measurement of Technology Skills, Attitudes, and Science Content in NSF/ITEST DAMSALS-2


Presentation Transcript


  1. Measurement of Technology Skills, Attitudes, and Science Content in NSF/ITEST DAMSALS-2, by Gerald Knezek & Rhonda Christensen, University of North Texas, USA, DAMSALS-2 Evaluators. Presented to the American Evaluation Association Annual Conference, Toronto, Canada, October 27, 2005.

  2. About DAMSALS-2 • Delta Agriculture Middle School Applied Life Science (DAMSALS) • Northeastern Louisiana (Mississippi Delta) • 1 of 13 projects funded by NSF in 2003 for 3 years • Primary goals: • Enhance IT skills in teachers & students • Improve teacher content knowledge • Increase student achievement • Feed the U.S. IT professional pipeline • Secondary goal: • Collect data to extract common findings across projects

  3. Evaluation of DAMSALS-2 • Institute for Integration of Technology into Teaching and Learning (IITTL) • Pre-post assessments for • Magnitude of change (effect size, ES) • Technology skills for teachers/students • Content knowledge for teachers/students • Classroom behaviors of teachers • Student achievement • Statewide and national comparison groups
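The magnitude-of-change figures reported throughout this evaluation are standardized effect sizes computed from pre/post scores. As a minimal sketch of how such an ES is typically obtained — assuming the common Cohen's d convention of dividing the mean gain by a pooled standard deviation, since the slides do not state the exact variant the evaluators used, and using purely hypothetical scores:

```python
import statistics
from math import sqrt

def effect_size(pre_scores, post_scores):
    """Pre/post effect size: mean gain divided by a pooled SD (Cohen's d style).
    Illustrative only; the DAMSALS-2 evaluators' exact formula (pooled vs.
    pre-test SD, correction factors, etc.) is not specified in these slides."""
    mean_gain = statistics.mean(post_scores) - statistics.mean(pre_scores)
    pooled_sd = sqrt((statistics.stdev(pre_scores) ** 2 +
                      statistics.stdev(post_scores) ** 2) / 2)
    return mean_gain / pooled_sd

# Hypothetical teacher technology-skills ratings at Time 1 and Time 3
pre  = [3.1, 2.4, 3.6, 2.7, 3.3, 2.9]
post = [3.5, 2.9, 3.8, 3.0, 3.6, 3.4]
print(f"ES = {effect_size(pre, post):.2f}")  # roughly 0.9 for these made-up scores
```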

  4. [Table of evaluation instruments: Instrument | What It Measures | When]

  5. [Table of evaluation instruments, continued: Instrument | What It Measures | When]

  6. Findings • Large teacher gains: • Technology skills ES = .87 (Time 1 to Time 3) • Curriculum integration skills ES = 1.05 (Time 1 to Time 3) • Content knowledge ES = 1.95 (pre/post institute) • Practice students at the summer institute were an asset • Teachers regressed somewhat in attitudes by the end of the school year (ES = -.21) • Student gains during the school year: • IT attitude ES = -.04 (time-of-year measurement factor) • Technology skills ES = .19 • Older students appear to gain in achievement: • 8th grade LEAP ES = +.23 in 2004-05 (state average declined) • 7th grade ITBS rose from 54 to 57 NPR in 2004-05 (48 to 49 statewide)
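To put these effect sizes in context, Cohen's widely used rule of thumb reads values of roughly 0.2, 0.5, and 0.8 as small, medium, and large effects. The snippet below applies those conventional thresholds to the teacher and student effect sizes reported above; the thresholds are a general convention, not one stated in the slides.

```python
def describe_es(es):
    """Label an effect size using Cohen's conventional benchmarks
    (about 0.2 = small, 0.5 = medium, 0.8 = large)."""
    magnitude = abs(es)
    if magnitude >= 0.8:
        size = "large"
    elif magnitude >= 0.5:
        size = "medium"
    elif magnitude >= 0.2:
        size = "small"
    else:
        size = "negligible"
    direction = "gain" if es >= 0 else "decline"
    return f"{size} {direction}"

findings = [
    ("Teacher technology skills", 0.87),
    ("Teacher curriculum integration skills", 1.05),
    ("Teacher content knowledge", 1.95),
    ("Teacher attitudes (end of school year)", -0.21),
    ("Student technology skills", 0.19),
]
for name, es in findings:
    print(f"{name}: ES = {es:+.2f} ({describe_es(es)})")
```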

  7. Challenges Regarding Teachers • Small n of teachers • Incoming skills varied • Attrition is an issue • Local support is uncontrollable • Gathering data is problematic • Measurement of science content knowledge

  8. Challenges Regarding Students • Project too short to expect achievement gain (longitudinal follow-up needed) • Measurement of achievement varies by grade level • Characteristics of classes (6th, 7th, 8th) vary within one school • Selection of an appropriate comparison group is problematic • Proving that positive impact is due to ITEST is difficult

  9. Challenges Across Projects • Agreement on Instruments • Forum to Exchange Results • Like PT3 collaborative exchanges • Local time/resources to support sharing

  10. For Further Information • About Project: • Patty’s name or URL here • About Instruments: • http://www.iittl.unt.edu • About Evaluation Issues: • Gknezek@gmail.com • Rhonda.Christensen@gmail.com
