
Measuring Instructor Effectiveness in Higher Education: A Mixed Methods Case Study


Presentation Transcript


  1. Measuring Instructor Effectiveness in Higher Education: A Mixed Methods Case Study Dr. Gregory T. Bradley Dr. Scott W. M. Burrus Dr. Melanie E. Shaw Dr. Kevin Stange

  2. Overview This presentation summarizes the methodology of a mixed methods case study conducted as part of phase two of the University of Michigan and University of Phoenix research partnership, which focuses on measuring instructor effectiveness, the factors that may influence it, and the relationship between effectiveness and student learning.

  3. Methodology As discussed by Yin (2014), this study will follow a mixed methods, multiple-case study approach. Multiple case studies provide the opportunity for what Yin calls “replication logic”: selecting cases that demonstrate how findings are theoretically predictable. This replication results in demonstrably stronger case studies and extends the study’s relevance beyond the situated context, providing greater transferability. The unit of analysis will be the individual faculty member.

  4. Study aims • Near-term: Better understand the instructor’s background and behaviors that influence effectiveness • Teaching experience and cross institutional activities • Instructor cost of living • Course engagement • Teaching effectiveness • Student learning outcomes • Long-term: Develop and evaluate policies that harness this understanding to improve student performance • Hiring, teaching assignment, other personnel policies • Incentives for course engagement

  5. Research Questions • What are the characteristics of effective instruction? • How do effective instructors engage in their courses? • How do faculty define effective instruction? • How do students define effective instruction? • How many courses (or students) can effective instructors teach at one time? • Is there a relationship between pay and effectiveness? • What best practices can be implemented to facilitate effective instruction? • What are the recommendations for instructor training, mentoring, and support?

  6. Data collection methods • Examination of Course Room data (Mixed) • Review of faculty-student engagement (e.g., discussion forums, assignments, other communications) • Artifact assessment for student learning outcome mastery • Academic metrics database for course room interactions • Faculty Survey (Quantitative, follow-up to Phase 1) • Background (work and teaching experience, education, etc.) • Work commitments outside of UPX • Philosophies and approaches to teaching

  7. Data collection methods, continued • Focus groups/interviews (Qualitative) • Faculty focus groups/interviews • Student interviews/focus groups • Faculty location (Quantitative, follow-up to Phase 1) • Does faculty location or geography influence faculty engagement? • How does cost of living relate to faculty engagement?

  8. Sampling For the mixed and qualitative component: The sample for this study comprises two course sequences, Math 208/209 and Accounting 290/291. The sampling strategy for this project is purposive. Adhering to principles of saturation, we will purposively sample a maximum of 12 Math 208/209 courses (six selected from the last year of the phase 1 sample from courses taught by faculty designated as “effective” and six selected from courses taught by faculty designated as “ineffective”) and a maximum of 12 Accounting 290/291 courses, or until we reach saturation. This approach is supported by Mason (2010) and Palinkas et al. (2015).

  9. Sampling, continued • For the quantitative, follow-up to phase 1 component, the sample will include those faculty involved in phase 1.

  10. Analysis and Reporting • Because this is a mixed methods, multiple-case study, we will follow a cross-case analysis to explore how cases (with faculty as the unit of analysis) compare and contrast, building thematic evidence to inform and answer the research questions. Quantitative analysis will also occur and will align with appropriate hypothesis testing. • Reporting will initially be internal to key university stakeholders and then external, including both peer-reviewed conferences and publications.

  11. Assignments • Roles: • Dr. Scott Burrus (UoP) and Dr. Kevin Stange (UM) are Co-Principal Investigators on the overall joint initiative for phase 2. Quantitative components focused on faculty cost of living are under the auspices of Kevin Stange. For the Mixed Methods Multiple Case Study, Scott Burrus will take the lead as Project Director, overseeing and project managing the overall project. He will also participate in all data collection, analysis, and reporting aspects of the project.

  12. Assignments, continued • Dr. Melanie Shaw is a lead research fellow/research associate on the mixed methods case study; she will take the lead on the faculty survey and will assist the other research associates on the other dimensions. • Dr. Meena Clowes is a lead research fellow/research associate on the mixed methods case study; she will take the lead on the examination of the course room dimension of the project and will assist the other research associates on the other dimensions. • Dr. Helen Zaikina-Montgomery is a lead research fellow/research associate on the mixed methods case study; she will lead the faculty and student focus group/interview component of the study and will assist the other research associates on the other dimensions. Note: Research Center Affiliates (e.g., SAS research faculty) will support various data collection activities.

  13. Timeline • June-August 2017 – Finalize design, instrumentation, sample selection, and data access. Obtain COR and IRB approval. Begin data collection. • August-October 2017 – Data collection across all dimensions. • October-November 2017 – Begin the analysis process to identify emerging themes. • November 2017-January 2018 – Continue initial analysis and determine if further analysis is warranted. • February-June 2018 – Draft initial reporting for internal and external audiences and consider external venues (e.g., AERA, DLA, OLC).

  14. Early Alert System (EAS) in Online Education: Student Demographic Profile Dr. Helen Zaikina-Montgomery Dr. Scott Burrus Dr. Meryl Epstein Dr. Elizabeth Young Dr. Roger Gung Dr. Pan Hu Ms. Sue Yuan

  15. References • Mason, M. (2010). Sample size and saturation in PhD studies using qualitative interviews. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 11(3). • Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N., & Hoagwood, K. (2015). Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health, 42(5), 533–544. • Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Thousand Oaks, CA: Sage.

  16. Early Alert System (EAS) in Online Education: Student Demographic Profile Helen Zaikina-Montgomery, Ph.D. Scott Burrus, Ph.D. Meryl Epstein, Ed.D. Elizabeth Young, Ed.D. Roger Gung, Ph.D.

  17. Overview and Purpose of Study • Examine the demographic characteristics of students who receive Early Alerts (EAs) in courses • Compare demographics of students who receive EAs to students who do not receive EAs • Existing research on course intervention strategies, such as EA, identifies a need to better understand the demographics of students who struggle in course work • Build statistical models to evaluate the impact of demographic characteristics on the likelihood of receiving Early Alerts (see the sketch below) • Further evaluate the impact of Early Alerts on students’ probability of passing the course [Ortagus, 2017]
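
The "statistical models" mentioned above are not specified in this transcript; given the coefficient, odds-ratio, and C-statistic slides that follow, a logistic regression is the most natural reading. The sketch below is a minimal illustration under that assumption; the file name and the columns received_ea, age, gender, and gpa are hypothetical placeholders, not the study's actual variables.

    # Minimal sketch (assumed specification): logistic regression of Early Alert
    # status on a few demographic predictors. All names are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    records = pd.read_csv("student_records.csv")  # hypothetical data extract

    # received_ea is 1 if the student received an Early Alert, 0 otherwise.
    model = smf.logit("received_ea ~ age + C(gender) + gpa", data=records).fit()

    print(model.summary())        # coefficients on the log-odds scale
    print(np.exp(model.params))   # exponentiated coefficients = odds ratios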

  18. EAS in the Present Study Context • Implemented in 2007 with the goal of increasing course completion by alerting the student’s Academic Counselor to contact the student and discuss concerns. • Updated January 2016 to automatically alert the student’s Academic Counselor (AC) if the student did not submit an assignment or participate in the online discussions in the course. • Current EAS Process

  19. Supporting Research Overview • Student retention and graduation rates are a topic of institutional concern and academic examination • Due to a difference in modality of delivery, online courses are structured differently than traditional on-ground or blended courses • Online courses require students to possess more intrinsic motivation and higher levels of organizational and self-management skills [Allen, Seaman, Poulin, & Taylor, 2016; Braxton, 2002; Eaton, 2011; McElroy & Lubich, 2013]

  20. Supporting Research Overview • The National Postsecondary Student Aid Study (NPSAS) shows that being married, being a parent, and being a full-time employee were positively associated with online course enrollment. • NPSAS data also showed that minority students were less likely to engage in online education than their non-minority peers. • Effective interventions for students in online courses need to be well matched to the online learning environment and to the demographic characteristics of those who are most likely to struggle in their course work. [Allen et al., 2016; Donnelly, 2010; McElroy & Lubich, 2013; NPSAS, 2012; Ortagus, 2017]

  21. Questions guiding this Research • What is the demographic profile of students who received an Early Alert, and how do they differ demographically from those who did not receive an EA? • What is the demographic profile of students who pass their courses, and can we measure the impact of EA on passing the course?

  22. Data Source • Collected from the university Office of Business Analytics and Operations Research. • Included records from students who were issued an early alert either by the course instructor or the learning management system (LMS) at some point during their course. • Included courses with start dates between November 24, 2015 and January 26, 2016. A total of 26,573 student records were accessed and used in the study, of which 2.4% (n = 640) were from students who received an Early Alert.

  23. Early Alert Profiling Results

  24. What is the demographic profile of students who received an Early Alert?

  25. How do students who received an Early Alert differ demographically from students who did not receive an Early Alert? • Female students were more likely to receive an EA than male students • Younger students and students who were single were more likely to have an early alert • Students who received an EA were less likely to pass the course in which the early alert was filed, three times more likely to withdraw from the course, and four times more likely to fail the course

  26. How do students who received an Early Alert differ demographically from students who did not receive an Early Alert? Chi-square tests applied to EAS status by categorical variables. * = values significant at the .05 level; ** = values significant at the .0001 level
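
A minimal sketch of the kind of chi-square test described on the slide above, using SciPy; the contingency table here is invented for illustration and does not reproduce the study's counts.

    # Chi-square test of independence between EA status and one categorical
    # demographic variable (e.g., gender). Counts below are illustrative only.
    from scipy.stats import chi2_contingency

    #          received EA   did not receive EA
    table = [[380, 12000],
             [260, 13933]]
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.4g}")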

  27. Early Alert Modeling

  28. Modeling Approach

  29. Early Alert Variables Exploration

  30. Early Alert Variables Exploration

  31. Early Alert Variables Coefficient

  32. Odds Ratio Note: The odds ratio is the odds of the positive outcome at one level of Xi relative to the odds of the positive outcome at the reference level of Xi
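
Assuming the odds ratios on this slide come from a logistic regression, as the coefficient and C-statistic slides suggest, the note corresponds to exponentiating the fitted coefficient: for a predictor Xi with coefficient \beta_i,

    OR_i = odds(Y = 1 | X_i = level k) / odds(Y = 1 | X_i = reference level) = e^{\beta_i}

so an odds ratio above 1 means that level of Xi is associated with higher odds of the outcome than the reference level, and a value below 1 means lower odds.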

  33. Early Alert Model Performance The early alert model (7 variables) has a C statistic of 0.736, which can be considered good model performance
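
The C statistic is equivalent to the area under the ROC curve computed from the model's predicted probabilities. A minimal sketch of that computation follows; the values are illustrative only, not the study's data.

    # C statistic (AUC) from observed outcomes and predicted probabilities.
    from sklearn.metrics import roc_auc_score

    y_true = [0, 0, 1, 0, 1, 1, 0, 1]                           # observed EA flags (illustrative)
    y_prob = [0.10, 0.25, 0.60, 0.35, 0.80, 0.55, 0.20, 0.70]   # model-predicted probabilities

    c_statistic = roc_auc_score(y_true, y_prob)
    print(f"C statistic (AUC) = {c_statistic:.3f}")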

  34. Course-Pass Modeling

  35. Course-Pass Variables Exploration

  36. Course-Pass Variables Exploration

  37. Course-Pass Variables Coefficient

  38. Odds Ratio Note: The odds ratio is the odds of the positive outcome at one level of Xi relative to the odds of the positive outcome at the reference level of Xi

  39. Course-Pass Model Performance The course-pass model (9 variables) has a C statistic of 0.798, which can be considered good model performance

  40. Conclusions & Future research Students who receive EA tend to be more academically “at risk” than non-EA students, and have 41% lower passing rate. Some of the expected factors (those that add more responsibility, such as being married and having dependents) were not associated with EA status Additional data and further analysis, including hierarchical linear modeling to account for potential nested effects should be conducted Universities can continue to develop interventions for “at risk” students keeping in mind factors from this study Students’ GPA, collection status, form of payment, income have strong impact on students’ receiving EA For a non-EA student, the odds of passing a course is 4.4 times as the odds of an EA student, due to the inherent risk

  41. Questions & Discussion

  42. Variables List

  43. Variables List (Cont’d)

  44. Modeling Approach

  45. Variable Rank Order Based on Information Value (IV)

  46. Assessing the Validity and Reliability of the SmarterMeasure Learning Readiness Indicator™ Dr. Gregory T. Bradley Dr. Scott W. M. Burrus Dr. Melanie E. Shaw Dr. Karen Ferguson

  47. Overarching Objectives of the Inquiry • To replicate the factor structure of the three factorable primary components and 18 subcomponents of the SmarterMeasure Learning Readiness Indicator™ (SMLRI) using an online student population (i.e., construct validity). • To measure the item reliability of the subcomponents using the same population of students.
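
Item reliability of the kind named in the second objective is conventionally summarized with Cronbach's alpha; the slide does not say which reliability statistic the study uses, so this is an assumption. The sketch below applies the standard formula to a hypothetical item-response matrix; the data are not SMLRI responses and the subscale is invented for illustration.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) score matrix."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    # Hypothetical responses: 5 students x 4 items on one subcomponent.
    responses = np.array([[4, 5, 4, 4],
                          [3, 3, 4, 3],
                          [5, 5, 5, 4],
                          [2, 3, 2, 3],
                          [4, 4, 5, 4]])
    print(round(cronbach_alpha(responses), 3))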

  48. Points of Guidance Relative to the Inquiry With more than two decades of growth in online offerings at institutions of higher education (Allen & Seaman, 2013), institutional leaders are challenged to identify strategies to retain students. Student retention rates are historically lower in online courses than in similar face-to-face courses (U.S. News and World Report, 2015). Several categories have been identified as critical to student retention, including readiness for online learning and student support to ensure success (Harrell, 2008). To ascertain online learning readiness and needed support services, tools must be developed and used effectively. One commonly used tool is the SmarterMeasure Learning Readiness Indicator™ (SMLRI), an online assessment of student readiness to engage in online or technology-rich learning environments (SmarterServices, LLC, 2016).
