Using IDEA for Assessment, Program Review, and SACS University of Alabama Birmingham September 11, 2012 Shelley A. Chapman, PhD
Plan for this Session • Program Evaluation & Assessment of Student Learning • Group Summary Reports • Aggregate Data File • Benchmarking Reports • Accreditation Guides
What makes IDEA unique? • Focus on Student Learning • Focus on Instructor’s Purpose • Adjustments for Extraneous Influences • Validity and Reliability • Comparison Data • Flexibility
Student Learning Model: 2 Assumptions Assumption 1: The types of learning must reflect the instructor's purpose.
Student Diagnostic Form Assumption 2: Effectiveness is determined by students' progress on the objectives stressed by the instructor.
Diagnostic Report Overview • Page 1 – Big Picture • How did I do? • Page 2 – Learning Details • What did students learn? • Page 3 – Diagnostic • What can I do differently? • Page 4 – Statistical Detail • Any additional insights?
The Big Picture Note: If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.
Progress on Relevant Objectives The average of students' progress ratings across the instructor's selected objectives, e.g., (4.3 + 4.3 + 4.1 + 4.2 + 3.6) / 5 = 4.1
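The arithmetic behind this slide can be made explicit. Below is a minimal Python sketch, assuming a weighted average of mean progress ratings; IDEA's documentation describes double-weighting Essential objectives, but the slide's five example scores are shown equally weighted. Function and variable names are illustrative, not IDEA's.

```python
# Minimal sketch: Progress on Relevant Objectives as a weighted average
# of mean progress ratings. Weight 2 = Essential, 1 = Important (IDEA's
# described convention); the slide's example uses equal weights.

def progress_on_relevant_objectives(ratings):
    """ratings: list of (mean_progress_rating, weight) pairs."""
    total = sum(score * weight for score, weight in ratings)
    weight_sum = sum(weight for _, weight in ratings)
    return total / weight_sum

# The five example scores from the slide, equally weighted
example = [(4.3, 1), (4.3, 1), (4.1, 1), (4.2, 1), (3.6, 1)]
print(round(progress_on_relevant_objectives(example), 1))  # 4.1
```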
Summary Evaluation: Five-Point Scale Weighted 50% Progress on Relevant Objectives, 25% Excellent Teacher, 25% Excellent Course
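As a hedged illustration of that weighting (the 50/25/25 split shown above, applied to hypothetical ratings):

```python
# Sketch of the Summary Evaluation as the slide's 50/25/25 weighting:
# 50% Progress on Relevant Objectives, 25% Excellent Teacher,
# 25% Excellent Course. Input values below are hypothetical.

def summary_evaluation(progress, excellent_teacher, excellent_course):
    return 0.50 * progress + 0.25 * excellent_teacher + 0.25 * excellent_course

print(summary_evaluation(4.1, 4.4, 4.2))  # approx. 4.2 on the five-point scale
```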
The Group Summary Report How did we do? How might we improve?
Defining Group Summary Reports (GSRs) • Institutional • Departmental • Service/Introductory Courses • Major Field Courses • General Education Program
GSRs Help Address Questions • Longitudinal • Contextual • Curricular • Pedagogical • Student Learning-focused
Adding Questions Up to 20 questions can be added • Institutional • Departmental • Course-based • All of the above
Local Code Use this section of the FIF (Faculty Information Form) to code types of data.
Defining Group Summary Reports • Local Code • 8 possible fields • Example: Column one – Delivery Format • 1=Self-paced • 2=Lecture • 3=Studio • 4=Lab • 5=Seminar • 6=Online Example from Benedictine University
Example Using Local Code Assign Local Code • 1=Day, Tenured • 2=Evening, Tenured • 3=Day, Tenure Track • 4=Evening, Tenure Track • 5=Day, Adjunct • 6=Evening, Adjunct Request Reports • All Day Classes: Local Code=1, 3, & 5 • All Evening Classes: Local Code=2, 4, & 6 • Courses Taught by Adjuncts: Local Code=5 & 6
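A short sketch of how these codes translate into report groups; the record layout is invented for illustration, since local-code reports are actually requested through IDEA rather than computed locally.

```python
# Illustrative only: filtering a class list by the local-code scheme above.
# The dict layout is an assumption, not IDEA's actual data format.

classes = [
    {"course": "ENG 101", "local_code": 1},  # Day, Tenured
    {"course": "HIS 210", "local_code": 4},  # Evening, Tenure Track
    {"course": "BIO 305", "local_code": 6},  # Evening, Adjunct
]

day_classes     = [c for c in classes if c["local_code"] in (1, 3, 5)]
evening_classes = [c for c in classes if c["local_code"] in (2, 4, 6)]
adjunct_classes = [c for c in classes if c["local_code"] in (5, 6)]
```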
Description of Courses Included in this Report (Page 1 of GSR) • Number of Classes Included: Diagnostic Form 42, Short Form 27, Total 69 • Number of Excluded Classes: 0 • Response Rate: 2 classes below the 65% response rate; average response rate 85% • Class Size: average class size 20
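The response-rate figures on this page are straightforward to reproduce. A minimal sketch, assuming hypothetical per-class enrollment and response counts (field names are invented):

```python
# Sketch of the page-1 response-rate summary: count classes below the
# 65% threshold and compute the average rate. Data are hypothetical.

classes = [
    {"enrolled": 20, "responded": 18},   # 90%
    {"enrolled": 25, "responded": 15},   # 60% -- below the 65% threshold
    {"enrolled": 18, "responded": 16},   # ~89%
]

rates = [c["responded"] / c["enrolled"] for c in classes]
below_threshold = sum(rate < 0.65 for rate in rates)
average_rate = sum(rates) / len(rates)
print(below_threshold, f"{average_rate:.0%}")  # 1 80%
```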
UAB Spring 2012 Page 1 of GSR
Assessment of Learning • What are our faculty emphasizing? • How do students rate their learning? • How do our courses compare with others? • How do our students compare with others (self-rated characteristics)? • What efforts can we make for improvement? (How can we “close the loop”?)
Are we targeting “Core Competencies” in the Core Curriculum? IDEA Learning Objectives
What are We Emphasizing? Page 2
Do students' reports of learning meet our expectations? Objective 1: Gaining factual knowledge (terminology, classifications, methods, trends) Pages 5 and 6
How do students rate their learning? Part 1: Distribution of Converted Scores Compared to the IDEA Database Page 3
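Converted scores put classes on a common scale against the IDEA database. A minimal sketch, assuming the T-score convention IDEA reports describe (mean 50, standard deviation 10 relative to the comparison database); the database mean and SD below are made-up illustration values.

```python
# Sketch of a converted (T) score: 50 + 10 * (raw - mean) / sd,
# standardized against the comparison database. Parameters are invented.

def converted_score(raw, db_mean=4.0, db_sd=0.5):
    return 50 + 10 * (raw - db_mean) / db_sd

print(converted_score(4.3))  # 56.0 -- above the database average of 50
```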
Overall Progress Ratings (Courses) Part 2: Percent of Classes at or Above the IDEA Database Average Page 3
Overall Progress Ratings (Courses) Part 3: Percent of Classes at or Above This Institution's Average Page 4
Which teaching methods might we use to improve learning? Page 7 Teaching Methods and Styles
Relationship of Learning Objectives to Teaching Methods
How do students view course work demands? Page 8B Student Ratings of Course Characteristics
Aggregate Data File Allows you to • Use an Excel spreadsheet • Use with SAS or SPSS • Ask other types of questions • Display data in different ways
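A minimal pandas sketch of the kind of question the aggregate file supports; the file name and column names are assumptions, not IDEA's actual export layout.

```python
# Illustrative use of the aggregate data file in Python/pandas.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_excel("idea_aggregate.xlsx")  # or pd.read_csv for a CSV export

# e.g., mean progress rating by department and term
summary = df.groupby(["department", "term"])["progress_rating"].mean()
print(summary)
```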
Instructors' Reports on Course Emphases: Selected Pairings: Writing and Oral Communication
Instructors' Reports on Course Emphases: Selected Pairings: Critical Thinking and Writing
Highlights for Sample U • Remarkably similar profiles across terms • Overall response rates ranged from 66% to 80% • The 1st term in which administration was primarily online achieved a 75% response rate • The transition from paper to online (fall 2009 to fall 2010) does not show major differences in profiles • Sample U faculty focus on 4-5 outcomes as essential/important • Over the last 3 terms, a significant increase has been observed on several objectives: application of course material, oral and written communication skills, and analysis and critical thinking skills (objectives 3, 8, & 11, respectively)
Benchmarking Institutional and Discipline Reports
Benchmarking Reports • Comparison to • 6-10 Peers • Same Carnegie Classification • IDEA database
Comparison Groups * Peer group is based on 6-10 institutions identified by your institution
Benchmarking Reports • The student, rather than the class, is the unit of analysis • The percentage of positive ratings is reported rather than averages
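A short sketch of that convention: ratings are pooled across students, and a rating of 4 or 5 on the five-point scale is counted as positive (the threshold is an assumption consistent with "percent positive" reporting, and the data are invented).

```python
# Sketch of student-level "percent positive" reporting: each student
# rating is the unit of analysis; 4 or 5 counts as positive.

student_ratings = [5, 4, 3, 5, 2, 4, 4, 5, 3, 4]  # illustrative values

positive = sum(rating >= 4 for rating in student_ratings)
percent_positive = positive / len(student_ratings) * 100
print(f"{percent_positive:.0f}% positive")  # 70% positive
```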
Response Rates Student participation at Your University is similar to that of each comparison group: Your University = 79%, Peer = 77%, Carnegie = 79%, National = 75%