Accommodating twice the students … and more than usual who aren’t meeting expectations.
Bubble in The System: A look at risk planning, what went wrong, how things progressed regardless, and what the future might hold…
Presenters: Simon Winberg & Robyn Verinder
Department of Electrical Engineering, University of Cape Town
November 2009
Outline
• The Context
• Problems and what was done about them
• How they happened
• How we planned for these potential risks
• What was done in practice
• Will these problems go away?
• Tracking the bubble
• Reflections
• Conclusions
The Context
• 1st year Computing for Electrical Engineers course
• For students who did not do Computer Studies (CS) for matric, OR
• For students who did CS but achieved a low CS mark (below a C).
Students without CS who did well in math and science have the option to do 1st year computer courses in the CS department.
Focus of this study (diagram not reproduced): of the EE 1st year intake, this study focuses on EEE1003W Computing for Electrical Engineers, as distinct from CS100X Computer Science 1st year.
Class composition (2008): Computing for Electrical Engineers
• 87 registered (4 repeating)
• 84 wrote the final exam (3 dropped out or transferred)
Class composition (2009): Computing for Electrical Engineers
• 85 more students at the start of 2009; 68 more throughout 2009 than in 2008 (81% more students in 2009)
• Total 1st year intake for Electrical Engineering was 49% higher this year than the average of 2006-8
• 167 registered (3 repeating)
• 152 wrote the final exam (15 dropped out or transferred – 9% dropped vs. 3% in ‘08)
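(A minimal Python sketch of how the year-on-year figures above work out; only the registered/wrote counts quoted on these two class-composition slides are used, nothing else is assumed.)

    # Year-on-year comparison from the class-composition slides (2008 vs 2009)
    reg_2008, wrote_2008 = 87, 84    # registered / wrote final exam in 2008
    reg_2009, wrote_2009 = 167, 152  # registered / wrote final exam in 2009

    growth = (wrote_2009 - wrote_2008) / wrote_2008   # 68/84  ≈ 0.81 -> "81% more"
    drop_08 = (reg_2008 - wrote_2008) / reg_2008      # 3/87   ≈ 0.03 -> "3% dropped"
    drop_09 = (reg_2009 - wrote_2009) / reg_2009      # 15/167 ≈ 0.09 -> "9% dropped"

    print(f"{growth:.0%} more students wrote the exam in 2009")
    print(f"drop-out/transfer rate: {drop_08:.0%} (2008) vs {drop_09:.0%} (2009)")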
Problem 1: Large intake
• Larger than anticipated intake for EE: 1st year intake up by 49% from 2008
• Much of this 49% increase seems to have gone into the Computing for Electrical Engineers course
Enrolment per programme (chart not reproduced; yearly first-year totals of 162, 175 and 184 over 2006-8, and 260 in 2009)
• 2006-8: the split between programmes remained fairly consistent
• 2009: more mechatronics
Total EE first year intake by nationality (chart not reproduced; yearly totals of 162, 175 and 184 over 2006-8, and 262 in 2009)
Problem 1 consequences
• Finding a big enough lecture venue
• Lab space limitations & availability
• Difficulty finding enough good tutors
• Problems down the line…
Problems down the line…
• Lab expansions
  • Size of 2nd & 3rd year laboratories increased
  • Acquisition of more equipment
• Staffing
  • More contract staff (TAs, tutors, etc.)
  • More duplicated lectures
  • More double-period lectures
• Administrative headaches
  • Finding & focusing funding towards these ‘immediate’ needs
Problem 2: Fundamental Problem
• “A big bubble in the system”
• Significantly weaker students
• Possibly a once-off event
• A larger portion of students found the coursework more difficult than previously… although the level of the coursework has not changed much
Is it a once-off bubble?
• But is this a bubble, i.e. a once-off event, or is it the new norm?
• Will the proportion of weak students return to what it was before?
• Perhaps… this will be discussed later…
2010 Intake estimates: ‘admission probable’
• The latest estimated intake figures (table not reproduced here) have been calculated by the EBE faculty based on provisional results.
• The actual figures can only be confirmed once this year’s matric marks have been obtained and processed.
• It might just be a once-off bubble.
*Figures provided by UCT EBE Faculty (20 Nov 2009)
How did it happen? Oops!
• Change in matric subjects and examination
  • Removal of the higher grade option
  • Change to 2 papers for the math exam
  • Removal of the geometry paper
• Matric results appear to be skewed
  • No longer a good predictor of performance in engineering, which higher grade math used to be.
National Benchmark Tests (NBTs)
• National benchmark tests confirm this experience in electrical engineering:
  • Only 7% proficient in mathematics
  • 25% proficient in quantitative literacy
  • 47% proficient in academic literacy in English
Statistics from HEQC (2009). MacGregor, K. ‘South Africa: Shocking results from university tests’, World University News – Africa Edition, Issue 35, August 2009.
NBTs
• Should these tests still be used just to benchmark matric results, and not influence admission?
• OR could they be used to decide university admission?
2008 Risk planning related to Problem 1 (intake)
• Faculty provided a 2009 estimation:
  • Plan for a 20% increase, i.e. 100 students (but not an 80% increase!)
  • Based on total EE intake estimations for the past 5 years (giving an average 10% increase per year over 5 years)*
• Risk planning for Problem 1:
  • Ensuring sufficient budget for two additional tutors
  • More multiple choice / other techniques for faster marking
* Based on email correspondence, November 2008
2008 Risk planning related to Problem 2 (weaker students)
• Mostly based on 2008 reflections…
• Initial diagnostic assessment (IDA)
  • Held in the 1st week to identify students needing extra support
• Bi-weekly extra tuition sessions
  • TA to assist struggling students
• Hot seat during lab times
  • Chief tutor / tutors’ tutor helps students with more tricky theory-related questions, or assists other tutors.
What was done in practice…
Our risk planning was beneficial, but the problem was outside the envisaged boundaries.
(Image: www.drollthings.com/?p=2384)
Initial diagnostic assessment (IDA) results
• Students achieved the basic requirements of:
  • Computer literacy (use of MS Word, Google searches, etc.)
  • Basic computer skills (using MS Excel, etc.) for basic engineering-type problems
• But later showed difficulties in:
  • Mathematics: difficulty in more complex problem-solving tasks
  • Academic English proficiency: understanding problem descriptions; articulating solutions
What was done
• More tutors
• Lab assignments made smaller and given more time
  • 2008: 6 pracs; 2009: 8 pracs
• Project separated into two
  • 2008: 1 project; 2009: 2 projects
• Additional tests, to help improve the pass rate for the June and Nov exams
What was done … essentially a lot of WORK - chopping and changing aspects of the course …
Throughput
• Despite the challenges, the pass rate was not too far below previous years…
• Pass rate 2009: 82% (18% failed)
  • 141 wrote the exam (11 did not get a DP)
  • 125 of 152 students passed; 8 borderline
• Pass rate 2008: 86% (14% failed)
  • 82 wrote the final exam (2 did not get a DP)
  • 72 of 84 students passed; 2 borderline
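(A minimal Python sketch of how the pass-rate figures above are derived, assuming each rate is computed against the full cohort that stayed in the course: 152 students in 2009 and 84 in 2008, as quoted on this slide.)

    # Pass rates quoted on the Throughput slide
    def pass_rate(passed, cohort):
        """Fraction of the cohort that passed the course."""
        return passed / cohort

    print(f"2009: {pass_rate(125, 152):.0%} passed")  # ≈ 82%, so ≈ 18% failed
    print(f"2008: {pass_rate(72, 84):.0%} passed")    # ≈ 86%, so ≈ 14% failed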
Will the problems go away?
• ± 1,500,000 learners started school in 1997
• ± 500,000 learners (30%) wrote matric in 2008
• <100,000 (20%) achieved a university endorsement – this is ‘the bubble’ entering the university intake
• So, the problem is likely to remain for a while longer
Approximate HEQC figures (2009)
Students have basic computer ‘skills’, but lack the foundation knowledge (maths, advanced academic literacy) needed for programming-based solving of engineering problems.
(Diagram not reproduced, adapted from the European Science Foundation (2002): a broad layer of computer literacy, basic applications and basic academic English, but limited depth of knowledge in mathematics, problem-solving and high-level academic literacy.)
Tracking the bubble: how did the students progress?
• Which of the weak become strong?
• Methodology (a sketch follows the matrix slide below)
  • Utilizing an adapted form of the skills matrix presented by Nicholls (1995)
  • Matric results used to determine initial placement along the averaged math & science mark axis (left = higher mark)
  • Initial diagnostic assessment used to determine initial placement of computer skill
Knowledge and skills matrix
Representation schema adapted from Nicholls, J. (1995). "The MCC decision matrix", Journal of Management Decision 33(6): 4-5.
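(A minimal Python sketch of the initial placement described in the methodology above. The 60% matric cut-off and 50% IDA cut-off below are illustrative assumptions, not the actual thresholds used in the study.)

    # Hypothetical placement of one student into the 2x2 knowledge/skills matrix
    # (schema adapted from Nicholls, 1995). One axis: averaged matric math &
    # science mark; other axis: initial diagnostic assessment (IDA) of computer
    # skills. Cut-off values are assumptions for illustration only.
    def place_in_matrix(math_mark, science_mark, ida_mark,
                        knowledge_cutoff=60, skills_cutoff=50):
        knowledge_high = (math_mark + science_mark) / 2 >= knowledge_cutoff
        skills_high = ida_mark >= skills_cutoff
        if knowledge_high and skills_high:
            return "high knowledge / high computer skills"
        if knowledge_high:
            return "high knowledge / low computer skills"
        if skills_high:
            return "low knowledge / high computer skills"
        return "low knowledge / low computer skills"

    # Example: strong matric marks but a weak IDA result
    print(place_in_matrix(math_mark=72, science_mark=68, ida_mark=40))
    # -> "high knowledge / low computer skills"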
Knowledge and skills matrix: “Full steam ahead!”
Results based on the cohort of 2009 students who exhibited good math & science matric grades, but demonstrated only mediocre computer proficiency in computer test 1.
Knowledge and skills matrix: “Guide me!”
Results based on the cohort of 2009 students who exhibited good math & science matric grades, but demonstrated only mediocre computer proficiency in computer test 1.
Knowledge and skills matrix: “Full steam ahead!” / “Potential traps”
Results based on the cohort of 2009 students who exhibited comparatively lower math & science matric grades, but demonstrated good computer proficiency in computer test 1.
Knowledge and skills matrix: “SOS!”
Results based on the cohort of 2009 students who exhibited comparatively lower math & science matric grades, but demonstrated good computer proficiency in computer test 1.
Tracking the bubble: How did the weaker students’ positions within the skills matrix change?
Knowledge and Skills matrix – starting point and progression
(Sequence of animated matrix diagrams, not reproduced; legend: one icon = 10 students.)
• Starting point: 33 students with low math/science and low computer skills, 55 with low math/science but good computer skills, 60 with good math/science but weaker computer skills; 4 failed the IDA.
• Intermediate frames tracked the groups’ movement through the matrix over the year (with annotations such as “6 really good” and “51 passed”).
• Final point: 127 students shown in the strong quadrant, with 13, (5) and 7 remaining in the other quadrants; 5 failed the IDA.
Reflections
• Students with low math/science and low computer skills generally achieved notably less progression than their classmates
• Many reasons for this:
  • Lack of motivation, lack of confidence (e.g., not daring to ask for help)
  • Insufficient support structures… see:
    • Crooks, T. (1988). "The impact of classroom evaluation practices on students." Review of Educational Research 58(4): 438.
    • Zimmerman, Bandura, et al. (1992). "Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting." American Educational Research Journal 29(3): 663.
Reflections
• Students showing good math/science and lower computer skills generally performed successfully
  • Confirmed by many studies, e.g. Baron & Norman (1992)*
  • This cohort had the necessary foundational knowledge to succeed.
  • BUT: why were 7 of the 60 left behind?
* Baron, J. and M. F. Norman (1992). "SATs, Achievement Tests, and High-School Class Rank as Predictors of College Performance." Educational and Psychological Measurement 52: 1047-1047.
Reflections
• Students showing low math/science but good computer skills generally performed successfully… but some didn’t.
  • This cohort had to build the necessary foundational knowledge on which more advanced knowledge depended.
  • Starting with good computer skills, in the information age, is a likely facilitating factor for learning*
  • 13 of the 55 students failed
* Bergin, S. and R. Reilly (2005). "Programming: factors that influence success." ACM SIGCSE Bulletin 37(1): 411-415.
Reflections
• Students showing high math/science and good computer skills generally performed excellently.
  • This cohort had the advantage of the needed foundational knowledge, in addition to good computer skills to help them learn new material.
  • These students went from good to great!
  • These are the students we all hope for.
Reflections
• What could have caused 7 of the 60 students who did reasonably well in math & science to fail?
  • I would have expected good math and science marks to indicate that these students would quickly learn computer skills
  • Perhaps 7/60 (12%) isn’t a major concern
  • Something may be wrong with my approach
  • Could it be an effect of last year’s matric results?