Going Beyond Grades In Assessing Student Learning Slides: www.zogotech.com/grants/ Dr. Blaine Bennett Michael Taft Michael Nguyen
Outline • About SWTJC, History • Data-based Decision Making?? • Data Collection / Analysis • Intervention: Math linking • Intervention: Intrusive Advising (T5) • Institutionalize Processes: Student Learning • Questions Speaking: Dr. Blaine Bennett
About SWTJC • Large service area • 11 Counties (16,769 square miles) • Rural • HSI ~ 6,000 Enrollment • 70+% FTIC need remediation • Achieving the Dream • Developmental Education • Gateway Courses • Culture of Evidence
Data-Based Decision Making Example: HSI-STEM
Background: SWTJC Student Data Warehouse Speaking: Dr. Blaine Bennett
Gateway Completion Analysis • Look at course completion rates, poke around • Compare time-to-degree for gateway • Look for gaps in sequence completion
Data-based Decision Making, e.g. Course Completion • Slice and dice, discover problems • Look at different variables: Ethnicity, Gender, Age, Location, Mode of Instruction
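A minimal sketch of the slice-and-dice query behind this kind of slide, assuming enrollment records are pulled from the warehouse into a pandas DataFrame (the column names and sample rows are illustrative, not SWTJC's actual schema):

import pandas as pd

# Illustrative enrollment records; real data would come from the student data warehouse
enrollments = pd.DataFrame({
    "course":    ["MATH 1314", "MATH 1314", "ENGL 1301", "MATH 1314"],
    "ethnicity": ["Hispanic", "White", "Hispanic", "Hispanic"],
    "gender":    ["F", "M", "F", "M"],
    "completed": [True, False, True, True],   # earned a passing grade
})

# Completion rate sliced by any demographic variable
for var in ["ethnicity", "gender"]:
    rates = enrollments.groupby(var)["completed"].mean()
    print(f"Completion rate by {var}:\n{rates}\n")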
Biggest Gatekeeper: College Algebra Correlation between gateway course completion and graduation: within 4 terms of completing college-level math (in green), 30% of students have graduated, almost double the rate for the reading and writing gateway courses!
Intervention? Data show: • College Algebra is the biggest gatekeeper • Students take too long between completing their last dev ed math course and completing College Algebra Need to close the gap. But how??
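One way to quantify that gap, sketched with invented term numbers (the field names and values are assumptions, not the actual warehouse layout):

import pandas as pd

# Hypothetical per-student term indexes (sequential term numbers)
records = pd.DataFrame({
    "student_id":           [101, 102, 103],
    "last_dev_math_term":   [3, 2, 4],     # term the last dev ed math course was completed
    "college_algebra_term": [5, 8, None],  # term College Algebra was completed (None = never)
})

records["gap_terms"] = records["college_algebra_term"] - records["last_dev_math_term"]
print("Median gap (terms):", records["gap_terms"].median())
print("Never completed College Algebra:", int(records["college_algebra_term"].isna().sum()))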
Intervention #1: Math Linking • Solution: Link the highest level of dev math and College Algebra into a single-semester format • Target math linking at the “best” dev ed students
Challenges Who are the best dev ed students?? John Doe’s Placement John Doe’s Class History Where is this student in the dev ed pipeline?? Too much data, not enough information
Subject Levels: Test Scores + Grades = Subject Levels Test History + Class History → Bottom Line • Incorporate test scores + grades into a single “level” • Easy to understand, analyze
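A rough sketch of the subject-level idea, folding a placement score and class history into one number; the cut scores and course-to-level mapping below are made up for illustration, not SWTJC's actual rubric:

def subject_level(test_score=None, highest_passed_course=None):
    """Collapse placement test score and class history into a single dev ed math level.
    The cut scores and course-to-level mapping here are illustrative only."""
    course_levels = {"MATH 0301": 1, "MATH 0302": 2, "MATH 0303": 3, "MATH 1314": 4}
    level_from_course = course_levels.get(highest_passed_course, 0)

    level_from_test = 0
    if test_score is not None:
        if test_score >= 350:
            level_from_test = 4   # college ready
        elif test_score >= 336:
            level_from_test = 3
        elif test_score >= 310:
            level_from_test = 2
        else:
            level_from_test = 1

    # Whichever source places the student higher wins
    return max(level_from_course, level_from_test)

print(subject_level(test_score=340, highest_passed_course="MATH 0302"))  # -> 3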
Intervention Now that we have an easy way to pull out the best dev ed students, we can put them in a linked class. Let’s see the results
Math Linking Assessment 90% of students in the linked class (blue) completed college-level algebra within 2 terms. For non-linked students, it took nearly 10 terms to reach even 40%.
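The curves behind a chart like this can be reproduced as cumulative completion rates by term; a sketch with invented terms-to-completion values that roughly mirror the slide's numbers:

# Terms needed to complete college-level algebra (None = never completed); invented values
linked     = [1, 1, 2, 2, 2, 1, 2, 3, 1, 2]
non_linked = [3, 9, None, None, None, 10, None, None, 8, None]

def cumulative_rate(terms_list, by_term):
    completed = sum(1 for t in terms_list if t is not None and t <= by_term)
    return completed / len(terms_list)

for t in (2, 10):
    print(f"By term {t}: linked {cumulative_rate(linked, t):.0%}, "
          f"non-linked {cumulative_rate(non_linked, t):.0%}")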
Math Linking Assessment The data clearly demonstrate that math linking is having the desired effect. We are looking at other linking options, including MATH 0302 to MATH 0303.
Intervention #2: Intrusive Advising Intervene with students Early and Often Query to identify specific sub-groups of students
Student Engagement • Noel-Levitz retention variables • Computed at-risk indicators • Downloaded ERP information Gives advisors / faculty easy access to data to make data-driven decisions on an individual student basis (note: all information here is randomized)
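A simplified sketch of how computed at-risk flags might be derived from downloaded ERP fields; the thresholds and field names are assumptions, not the Noel-Levitz or SWTJC definitions:

def at_risk_flags(student):
    """Return simple at-risk indicators for one student record (illustrative thresholds)."""
    flags = []
    if student.get("gpa", 4.0) < 2.0:
        flags.append("low GPA")
    if student.get("hours_attempted", 0) - student.get("hours_earned", 0) > 9:
        flags.append("many unearned hours")
    if student.get("absences", 0) > 6:
        flags.append("high absences")
    return flags

print(at_risk_flags({"gpa": 1.8, "hours_attempted": 24, "hours_earned": 12, "absences": 2}))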
Contact via Multiple Modes • Email • Mail • Phone • Face to Face (Classroom) Above: Identify students who may be at risk because they have significantly increased their course load. Click the email button to contact them and record those contacts in a central location.
Target specific student groups Identifying students who may be at risk: there are 474 students who are retaking a class in Spring 2007, have < 45 credits, and have a GPA < 2.5.
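The query described on this slide could be expressed roughly like this, assuming a student table with those three fields (names and sample rows are illustrative; with the real data the count was 474):

import pandas as pd

students = pd.DataFrame({
    "student_id":      [1, 2, 3, 4],
    "retaking_course": [True, True, False, True],
    "credits_earned":  [30, 60, 20, 44],
    "gpa":             [2.1, 2.4, 3.2, 2.0],
})

target = students[
    students["retaking_course"]
    & (students["credits_earned"] < 45)
    & (students["gpa"] < 2.5)
]
print(len(target), "students match")
print(target["student_id"].tolist())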
But Are They Learning?? Lots of interventions and improved completion and retention rates, but how can we prove students are learning?? How can we measure student learning?
Review Learning Indicators • Direct: Grades (GPA); Test Data; Placement; Completion (dev ed / gateway courses, core, certificate/degree); Subject Levels (normalized placement); Academic Status (Probation, Suspension, PTK, Near graduation, LHF); Outcome Assessment Data • Indirect (affects learning): Demographic & Personal Data; Attendance (99% of success is showing up); Student Activities / Support Services (Engagement: Advisement, Tutoring); Enrollment patterns; External factors (financial, job, family, etc.)
Outcome Assessment Project • Objective: assess student learning institutionally • Process: Faculty develop outcomes • For example: College Level Algebra: Solve equations/inequalities of various types with particular emphasis on linear and quadratic polynomials • Develop multiple-choice questions for each outcome • Establish a mastery level (e.g. 75%: 3 of 4 questions for the related outcome answered correctly) • Score assessments and store results in the DW
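A minimal sketch of the mastery-level scoring described above (75%, i.e. 3 of 4 questions per outcome answered correctly); the outcome names and answer keys are hypothetical:

def mastery_by_outcome(responses, answer_key, threshold=0.75):
    """responses / answer_key map outcome -> list of answers; returns mastery per outcome."""
    results = {}
    for outcome, key in answer_key.items():
        given = responses.get(outcome, [])
        correct = sum(1 for g, k in zip(given, key) if g == k)
        results[outcome] = (correct / len(key)) >= threshold
    return results

answer_key = {"solve_linear_eq": ["B", "C", "A", "D"],
              "solve_quadratic": ["A", "A", "D", "B"]}
responses  = {"solve_linear_eq": ["B", "C", "A", "A"],   # 3 of 4 -> mastered
              "solve_quadratic": ["A", "B", "C", "B"]}   # 2 of 4 -> not mastered
print(mastery_by_outcome(responses, answer_key))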
Data Flow Key points: Automatic downloads, no additional data entry
Prosper Answer Form Objective and subjective (e.g. writing samples) responses captured
Improved Programs & Learning (Institution) Grade distribution follows a normal bell curve. We would expect the post-test assessment results to line up with it, but they fall far behind.
Improved Programs & Learning (Faculty) For some faculty it’s pretty close, so there’s hope
Conclusion • Analyze – Intervene – Assess • Need multiple data sources to analyze beyond just grades (e.g. outcomes assessment, attendance, pre/post testing, NSC) • A data warehouse (such as ZogoTech) can help
Questions Southwest Texas Junior College Blaine Bennett blaine.bennett@swtjc.edu (830)591-7275 ZogoTech Michael Taft mtaft@zogotech.com (214) 774-4780 x801 Michael Nguyen mnguyen@zogotech.com (214) 774-4780 x803