Are Indigenous students adequately represented in unit feedback surveys at CDU?
Why is this important • Performance-linked funding policy environment. • University Base Funding • ISSP funding • 'Me Too' strategies are risky if the student profile or study experience is significantly different. • Evidence-based strategies for achieving student outcome targets. • Policy implementation timing means universities don't have time to conduct new research.
Why is this important • Most, if not all, Australian universities already conduct regular surveys of a student's teaching and learning experience. • Many universities already use these in the formulation and measurement of student success strategies. • Are they fit for this purpose?
How to answer these questions? • Analysis of MyView data at CDU. • Why CDU?
CDU • CDU has the highest access and participation rates for Indigenous students in HE in the country. • 6.8% of all domestic enrolments at CDU are from Indigenous students, compared to 1.6% across the rest of the sector. (Department of Education and Training, 2016b). • Sector trends in participation, retention and success evident in CDU’s student population. • Gap between Indigenous student outcomes and Non-Indigenous students found in National data also evident at CDU.
Are student unit feedback surveys at Charles Darwin University representative enough of Indigenous students to be confidently used in the formulation and measurement of programs designed to improve student outcomes?
Whole of University • Indigenous Student Specific • Specific Courses, sub-cohorts • For unsuccessful and not-retained Indigenous students
MyView Survey • Survey of a student's experience of teaching and learning in a unit of study. • Survey Population: All students in eligible units in a semester. • Survey Responses: Gathered 100% online, no strata management in collection.
Data Scope • MyView survey 2015 to 2017. • Summer Semester excluded. • Students who withdraw before census date not in survey population. • Enabling and Cross-Institutional / Non-Award students excluded.
Measuring survey representativeness Understanding representativeness of a cohort within a broader group • % of survey population vs % of survey responses. • The % change in responses required to achieve representativeness makes the size of the gap easier to interpret. Understanding reliability of cohort-specific data • Confidence intervals Identifying non-response bias
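The two representativeness measures on this slide can be sketched as follows. The figures are illustrative only (loosely based on the 6.8% Indigenous enrolment share mentioned earlier), not actual CDU MyView data, and the function name is my own:

```python
# Sketch: quantifying cohort representativeness in survey responses.
# All counts below are illustrative, not actual CDU MyView data.

def representativeness_gap(cohort_pop, total_pop, cohort_resp, total_resp):
    """Return the cohort's population share, its response share, and
    the extra responses needed for the response share to match the
    population share (holding non-cohort responses fixed)."""
    pop_share = cohort_pop / total_pop
    resp_share = cohort_resp / total_resp
    # Solve (cohort_resp + x) / (total_resp + x) == pop_share for x.
    other_resp = total_resp - cohort_resp
    required = pop_share * other_resp / (1 - pop_share)
    extra_needed = max(0.0, required - cohort_resp)
    return pop_share, resp_share, extra_needed

pop_share, resp_share, extra = representativeness_gap(
    cohort_pop=680, total_pop=10000, cohort_resp=120, total_resp=3000)
print(f"population share: {pop_share:.1%}")
print(f"response share:   {resp_share:.1%}")
print(f"extra responses needed: {extra:.0f}")
```

With these hypothetical numbers, a cohort that is 6.8% of the survey population but only 4.0% of responses would need roughly 90 additional responses to be proportionally represented.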
Calculating Confidence Intervals to MyView • Confidence interval is calculated on the mean of the observations for each question in the survey. • This was then summarised as a min, max and average confidence interval for each cohort.
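The calculation described above can be sketched as below. The question labels, rating values, and z-approximation are assumptions for illustration; the slide does not specify the exact formula used:

```python
# Sketch: 95% confidence interval (normal approximation) on the mean
# of each survey question's ratings, then summarised per cohort as
# min / max / average interval half-width. Data is illustrative.
import math
from statistics import mean, stdev

def ci_half_width(scores, z=1.96):
    """95% CI half-width for the mean of one question's ratings."""
    return z * stdev(scores) / math.sqrt(len(scores))

# Hypothetical 5-point ratings for one cohort, keyed by question.
cohort_responses = {
    "Q1_clear_expectations": [4, 5, 3, 4, 5, 4, 2, 5],
    "Q2_useful_feedback":    [3, 4, 4, 2, 5, 3, 4, 4],
}

widths = [ci_half_width(v) for v in cohort_responses.values()]
print(f"min CI: ±{min(widths):.2f}, max CI: ±{max(widths):.2f}, "
      f"avg CI: ±{mean(widths):.2f}")
```

A narrower average half-width means the cohort's mean ratings are estimated more reliably, which is the basis of the reliability findings later in the deck.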
Profile Attributes • Student, unit and course attributes added to survey data, as at time of survey distribution. • Student Attributes • Gender • Aboriginal and/or Torres Strait Islander (Indigenous) Status • Age • Socio-economic status of Home Postcode - using the SEIFA Education and Occupation index (ABS, 2006a) • Remoteness of Home Postcode - using the ASGS Remoteness Structure (ABS, 2006b) • Student Type (Domestic v International) • Unit Attributes • Unit mode of study • Unit teaching discipline • Unit Location • Course Attributes • Course Level - Using Australian Qualifications Framework (AQF) • Course Commencing Status
Student success • Student success measured using GPA calculated from all units included in the ESP in the semester. • Aggregate Pass/Fail average calculated.
Student retention • Student retention from year of inclusion in ESP into the following academic year was also calculated. • Used principle of Department of Education & Training Retention rate, but applied at an individual level. (Department of Education and Training, 2016a).
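The two measures on these slides, semester GPA with an aggregate pass/fail average, and individual-level retention into the following year, can be sketched as below. The grade codes, the 7-point GPA scale, and the pass threshold are assumptions, not confirmed CDU definitions:

```python
# Sketch: per-student success and retention flags, mirroring the two
# measures described above. The grade scale and pass threshold are
# assumed, not actual CDU definitions.

GRADE_POINTS = {"HD": 7, "D": 6, "C": 5, "P": 4, "F": 0}  # assumed scale

def semester_gpa(grades):
    """GPA across all of a student's units in the semester."""
    return sum(GRADE_POINTS[g] for g in grades) / len(grades)

def pass_average(grades, pass_point=4):
    """True when the semester GPA is at or above a pass."""
    return semester_gpa(grades) >= pass_point

def retained(enrolled_years, survey_year):
    """Retention rate principle applied at the individual level:
    the student reappears in the following academic year."""
    return (survey_year + 1) in enrolled_years

grades = ["P", "C", "F", "D"]
print(semester_gpa(grades), pass_average(grades),
      retained({2016, 2017}, 2016))
```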
Research Limitations • Single university. • Non-response bias not confirmed or defined. • Appropriateness of the tool for use in student success strategy development and measurement.
How represented are Indigenous students within the survey data, when it is being analysed at a whole of university level?
Indigenous Student Representation in ESP versus in Survey Responses, By Survey Collection Period
Average response rate adjustment required to achieve representativeness of student cohort in whole of university survey data.
How confident can we be, when focusing on analysis of just the Indigenous student cohort, that the Indigenous student survey responses are representative?
Average, Maximum and Minimum Confidence Interval by student Cohort
How representative is the Indigenous student survey data of the student and course enrolment profile of the Indigenous student population at CDU?
Survey response rate by age range, by target cohort, compared with overall response rate as benchmark
Survey response rate by remoteness area of student home address, by target cohort, compared with overall response rate as benchmark
Survey response rate by gender, by target cohort, compared with overall response rate as benchmark
Survey response rate by AQF level, by target cohort, compared with overall response rate as benchmark
Survey response rate by socio-economic status, by target cohort, compared with overall response rate as benchmark
Other Demographics Under-represented Fields of Education (FOEs) • Education and Psychology Disciplines
How represented are unsuccessful and/or not-retained Indigenous students within the survey data, when the whole Indigenous cohort is the focus of the analysis?
Survey response rate by GPA, by target cohort, compared with overall response rate as benchmark
Survey response rate by Pass / Fail average GPA, by target cohort, compared with overall response rate as benchmark
Survey response rate by retention status, by target cohort, compared with overall response rate as benchmark
Survey response rate for students who have not withdrawn and who have a pass average GPA, by target cohort, compared with overall response rate for successful and retained students as benchmark
How confident can we be, when focusing on analysis of just the unsuccessful and/or not-retained Indigenous student cohort, that the student survey responses are representative?
Confidence interval at 95% confidence level, for Indigenous student cohort, versus not-retained or not successful sub-cohorts, single collection period (1CP) v aggregate collection periods (4CP)
Key Findings • Indigenous students are under-represented in MyView. • The confidence intervals, even for a single survey period, were narrow enough to give statistical confidence in using the survey data to focus on analysis of Indigenous student responses. • And if we aggregate survey periods, we can also be statistically confident using the data to focus on analysis of unsuccessful or non-retained Indigenous students.
Key Findings • However, the under-representation of unsuccessful and/or non-retained students indicates the possible presence of non-response bias. • This means that even though the confidence intervals are narrow, the responses may not represent the views of the survey population.
Achieving better representation? • Changing the tool or the collection methodology significantly may reduce the utility of the tool in meeting its primary purpose.
Achieving better representation? • Stratified sample response management to better manage response rates for Indigenous (and International) student cohorts. • However, given unsuccessful and/or non-retained students account for all of the survey under-representation, and these students can’t be identified at the time of survey collection, stratified sample management may consume huge resources for very little return.
Achieving better representation? • Stratified sample response management to better manage sub-cohorts such as: • under 30 years of age; • from very remote, or outer regional remoteness areas; and • in education and psychology disciplines • Whole of University approach to increasing engagement from young students • Indigenous specific approach to increasing responses from remoteness areas and disciplines.
Achieving better representation? • Weighting responses. • Given the statistical confidence in the data, it would theoretically be possible to weight the existing data to achieve better data representativeness for analysis. • However, without understanding and defining the non-response bias from unsuccessful and/or non-retained students it would be misleading to do so.
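The weighting option raised above can be sketched as simple post-stratification: each stratum's responses are scaled so its weighted share matches its population share. The counts are illustrative, and, as the slide cautions, applying such weights before the non-response bias is defined would be misleading:

```python
# Sketch: post-stratification weights so each cohort's weighted
# response share matches its population share. Counts are
# illustrative, not actual CDU MyView figures.

def strata_weights(pop_counts, resp_counts):
    """Weight per stratum = population share / response share."""
    total_pop = sum(pop_counts.values())
    total_resp = sum(resp_counts.values())
    return {
        s: (pop_counts[s] / total_pop) / (resp_counts[s] / total_resp)
        for s in pop_counts
    }

pop = {"Indigenous": 680, "Non-Indigenous": 9320}
resp = {"Indigenous": 120, "Non-Indigenous": 2880}
weights = strata_weights(pop, resp)
print(weights)
```

With these hypothetical counts, under-represented Indigenous responses are up-weighted (weight > 1) and over-represented Non-Indigenous responses are slightly down-weighted, so the weighted totals reproduce the population shares.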
Understanding non-response bias • Future research could be undertaken to confirm if the under-representation of unsuccessful and/or not retained students is causing non-response bias, and to define the nature of this. • A range of approaches in the literature to do this: • Late response analysis • Interview cross analysis
Should the data be used for student success and retention strategy formulation or measurement? • No • Define non-response bias • Explore appropriateness of the questions