Using the IDEA System Results from our Spring 2010 Pilot
Spring 2010 Pilot • All online, all used the diagnostic form • 5 faculty • 9 courses • 44% average response rate • Results (2 faculty received hard copies) went to Beth Pellicciotti and were sent confidentially to faculty via campus mail; an institutional group report (no names, just aggregate data) was also sent
Faculty Side • Fill out the Faculty Information Form ("FIF"), sent as a link (demographics, plus choose 3–5 essential objectives out of 12) • You are sent back a link that you cut and paste into Blackboard • Students are also sent the link via Purdue email • Students click on the link and fill out the evaluation • If they don't, they keep getting reminder emails until they do
Technology Side • Heather Zamojski was our technology person and has been involved in prior pilots • Easy to use with our present technology • Everyone involved in the pilot recommends its use
Report Overview • Page 1 – Big Picture • How did I do? • Page 2 – Learning Details • What did students learn? • Page 3 – Diagnostic • What can I do differently? • Page 4 – Statistical Detail • Any additional insights?
Adjusted Scores versus Raw • Raw: all data is unadjusted • Adjusted takes in 5 variables: • Student Work Habits (#43) • Student Motivation (#39) • Class Size (Enrollment, FIF) • Student Effort (multiple items) • Course Difficulty (multiple items) • With the adjusted score, faculty aren't penalized for unmotivated students, hard courses, or large courses
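To make the raw-versus-adjusted distinction concrete, here is a minimal, purely illustrative sketch of how a regression-style adjustment of this kind could be computed. This is not the IDEA Center's actual model; the function name, weights, grand mean, and example values are all hypothetical.

```python
# Hypothetical illustration of a regression-style score adjustment.
# NOT the IDEA Center's actual model; the weights, grand mean, and
# example values below are invented for illustration only.

def adjusted_score(raw, work_habits, motivation, enrollment, effort, difficulty):
    """Credit a raw course rating for context factors outside the
    instructor's control (low motivation, large or difficult courses)."""
    grand_mean = 4.0  # hypothetical average raw rating across all courses
    # Hypothetical weights for the five adjustment variables.
    predicted = (grand_mean
                 + 0.30 * (work_habits - 3.0)    # student work habits (#43)
                 + 0.40 * (motivation - 3.0)     # student motivation (#39)
                 - 0.005 * (enrollment - 30)     # class size (enrollment, FIF)
                 + 0.20 * (effort - 3.0)         # student effort (multiple items)
                 - 0.15 * (difficulty - 3.0))    # course difficulty (multiple items)
    # The instructor keeps the residual: how much better (or worse) the
    # students rated the course than its context alone would predict.
    return grand_mean + (raw - predicted)

# Example: a difficult, larger course with low student motivation is
# adjusted upward rather than penalized.
print(round(adjusted_score(raw=3.8, work_habits=2.6, motivation=2.5,
                           enrollment=60, effort=2.8, difficulty=4.2), 2))
```

With hypothetical inputs like these, a course rated 3.8 raw comes out higher (about 4.5) once its difficult, low-motivation context is taken into account, which is the intuition behind reporting adjusted scores alongside raw ones.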
Page 2: Progress on Relevant Objectives
Resources • Knowledge Center: www.theideacenter.org • POD-IDEA Center Notes • POD-IDEA Center Learning Notes • IDEA Papers