Computer Science Education Research Conference (CSERC ‘11), 7th April 2011, Heerlen, The Netherlands • Sally Fincher • Computing Education Research
About me • I run the Computing Education Research Group in the Computing Laboratory at the University of Kent (have done since 1997) • So know about trying to do disciplinary-specific education research • I edit the journal Computer Science Education • So have seen what others think are appropriate outputs of disciplinary-specific education research • I was Secretary of the ACM Special Interest Group on Computer Science Education (SIGCSE) for 6 years • So have some sense of the scale of the interest in this area
Bootstrapping & friends • From 2002/3 I devised (together with Marian Petre from the Open University and Josh Tenenberg from the University of Washington, Tacoma) a series of workshops, aimed at helping people find a “way in” to CS Education research.
Bootstrapping & friends • We collected some of that material into a book • I’ll revisit some of that today
Topics and Areas • One of the ways we sliced things in 2004 was by topic, by what people were interested in researching. We listed 10 areas then: • Student Understanding • Animation, visualization & simulation • Teaching methods • Assessment • Educational technology • Transferring professional practice to the classroom • Incorporating new developments & technologies • Transferring from f2f to distance education • Recruitment and retention (incl. diversity & gender) • Construction of the discipline
Topics and Areas: example papers, each matched against the list above • Developing a Computer Science Curriculum in the South African Context • A Lab-based Approach for Introductory Computing that Emphasizes Collaboration • Plagiarism detection for Java: a tool comparison • Peer Production & Peer Support at the Free Technology Academy • Student discussion forums: What is in it for them? • Sciences, Computing, Informatics: who is the keeper of the Real Faith? • Game Based Learning for Computer Science Education • A Distributed Virtual Computer Security Lab with Central Authority • Some challenges for Computer Science Education in a Knowledge Society
Topics and Areas • Today, these don’t seem to be wrong. • They’re still the areas that motivate people to work in the field; they still give an answer to the question “Why are you interested in Computing Education Research?” • But they are not a complete answer. • They don’t speak to what people want as a result of a piece of Computing Education Research. • And they don’t begin to address the question “How do you do Computing Education Research?”
CSEd Research: Three lenses • Discipline • Classroom • Community
Discipline • The methods of CS education research are not the methods of CS. • You cannot study classrooms, people and their interactions using algorithms and proofs. • As CSEd researchers, we have to use other methods – other epistemologies, “ways of knowing” – from other disciplines • Education ... • Sociology ... • Psychology ... • Anthropology ... • Can be methodologically isolating
Cognitive Psychological tradition • Many early CS Ed investigations (1980s) were from the psychological tradition – conducted, for the most part, by psychologists. • Programming was a new and interesting way to investigate how people think, and it was conducive to laboratory-based investigation. • There was a very influential series of meetings and publications, Empirical Studies of Programmers.
Cognitive Psychological: what people want • The sort of positivist, quantitative knowledge that cognitive psychology creates is very attractive – and persistently so. • PPIG posting, 20th March 2011 • To clarify my test … I'm going to compare miniC (a minimal C implementation built on BYOB) vs regular C environments. I want to test if students that learned C by using miniC: - do less syntactic mistakes - remember to declare their variables more often - use sequence/loop/conditionals in a more consistent way
PPIG mailing list: 20th March 2011 • What I'm not sure is if they are using the correct instruments to get correct, unbiased data from their tests. … • After all, from these tests they should claim that their work is successful. Are they doing it in the right way? Shouldn't we have a common, clearly-understood test-bed on which this kind of experimentation should be performed? This doesn't mean that the test-bed should be unupdateable, but at least important part of it should. Otherwise our tests wouldn't be comparable as they should.
Epistemology • We can see in this plea the assumptions – and expectations – of the experimental sciences. • A fixed natural world, with knowable data • The notion of predictive theory • The expectation that my classroom, my students and their abilities are in some sense the same as yours, are directly comparable with yours.
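To make that quantitative, between-groups mindset concrete, here is a minimal sketch of the kind of analysis the PPIG poster describes: per-student counts of syntactic mistakes for a miniC group and a regular-C group, compared with a two-sample t-test. Everything in it is invented for illustration (the numbers, the group sizes, the variable names); it is not taken from the posting or from any real study.

```python
# Hypothetical sketch only: invented per-student counts of syntactic mistakes
# for a group taught with miniC and a group taught in a regular C environment.
from statistics import mean
from scipy import stats

minic_errors = [3, 5, 2, 4, 6, 3, 2, 5, 4, 3]        # invented miniC group
regular_c_errors = [7, 6, 9, 5, 8, 7, 10, 6, 7, 8]   # invented regular-C group

# Welch's two-sample t-test: do the groups differ in mean error count?
t_stat, p_value = stats.ttest_ind(minic_errors, regular_c_errors, equal_var=False)

print(f"miniC mean errors:     {mean(minic_errors):.1f}")
print(f"regular C mean errors: {mean(regular_c_errors):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Whether a test like this answers the question depends on exactly the assumptions listed above: that the data are knowable, that the comparison is fair, and that one classroom’s students are directly comparable with another’s.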
Cognitive Psychological, today • Allison Elliot Tew & Mark Guzdial • Developing a validated assessment of fundamental CS1 concepts. Proceedings of the 41st SIGCSE Technical Symposium on Computer Science Education, (Milwaukee, WI), 97-101, 2010 • The FCS1: A Language Independent Assessment of CS1 Knowledge. Proceedings of the 42nd SIGCSE Technical Symposium on Computer Science Education, (Dallas, TX), 2011
Other questions, other needs • I was trying to understand real learning in real classrooms, but I was using the conventional pencil-and-paper things and massaging the data with multivariate statistics. Collecting the data directly from students ... and then trying to make sense of it just led me into a huge shift in my thinking • I went away and started to think about it – opening up the whole business of how experience is interpreted. I don’t think before that I’d thought much about how people construct their own meaning for experiences and events (quotes reported in Fensham, pp. 42 & 45)
Sociological tradition • Sociology asks different questions, and uses different methods to answer them. It is (usually) more group-based than psychological investigations, and more interested in the nature of processes. Its theories (characteristic of the social sciences) are explanatory rather than predictive. Key sociological questions are: • What role does education play in the life chances of different groups? • How can we best explain why some groups systematically win and others lose? • Is education a means of liberating individuals, or is it a means of social control?
Sociological tradition • Lecia Jane Barker, Kathy Garvin-Doxas & Michele Jackson • Defensive Climate in the Computer Science Classroom Proceedings of the 33rd SIGCSE technical symposium on Computer science education (Covington, KY), 43-47, 2002 • A learning environment comprises “all of the physical surroundings, psychosocial or emotional conditions and social or cultural influences” present in a learning situation. • Over the course of an academic year they “ethnographically” observed 10 courses for a total of 254 hours.
Sociological tradition (ii) • Categories emerging from data analysis included 1) impersonal environment and guarded behaviour; and 2) the creation and maintenance of informal hierarchy, resulting in competitive behaviours. These communication patterns lead to a defensive climate characterised by competitiveness rather than cooperation, judgments about others, superiority, and neutrality rather than empathy.
Educational tradition • Although “education” as a field is itself a hybrid of disciplinary influences and methods ... • Marcia Linn & Michael Clancy • The Case for Case Studies, Communications of the ACM (CACM), Volume 35, Issue 3, March 1992
Educational tradition • Another way of working – teaming up. • “Marcia Linn showed up at my door ... she said, we want to research people learning to program and we've heard you teach a lot of those.”
Teaming up • “There was a vast set of things I got from this. I was a clueless teacher, trying things as a whim and flying by the seat of my pants. What I got from Marcia is that she could see the big picture and say “Oh yeah—what you're doing is—and here's how it is in math, physics and chemistry” ... then I could see how what they were doing in other disciplines related to what I was trying.”
Tested • Mixed methods – “balanced groups” (some with case studies, some without), small-scale observations, statistical conclusions. • “I still think that—I have not found a better way to address complex concepts but to use a case study to make concepts concrete”
Computer Science • Compared to other areas of disciplinary-specific education research, we—uniquely—have additional disciplinary influences ... • ... but they’re not academic influences • They come from the practice of the discipline. • And they come in two ways ... • Industrial practice • (e.g. pair programming) • Practice of our craft • (e.g. tool building – Alice, Greenfoot, Scratch)
Computing Education Discipline • Partly a matter of temperament—what methods and approaches are you comfortable with? • Partly a matter of epistemology—what questions do you want to ask and what evidence will satisfy you that they’ve been answered? • (Diagram: Psychology, Statistics, Anthropology)
CSEd Research: Three lenses • Discipline • Classroom • Community
Classroom • Most CS Education researchers are not motivated by generalised results – at least not at first. • Most are motivated to understand what happens in their classroom, how to describe the learning that takes place there and what to do to change/improve it. There are some obvious problems with this
Fincher (2001): A Fictitious Paper • A Study of Assessment of Programming Skills of First-Year CS Students – Sally Fincher, University of Kent, UK • A single author, presenting results for a single institution.
Fincher (2001): A Fictitious Paper • Explanations? • Sally can’t teach. • The students are British. • Sally teaches at an atypically poor institution. • If Sally changed: • From Pascal to C to C++ to Java to Python • Objects early ← → Procedural • Used closed labs ← → Used open labs • More assignments ← → Fewer assignments • < insert your favourite deck chair permutation for the “C.S. Titanic” >
Fincher (2001): A Fictitious Paper • McCracken, et al. (2001): 10 authors; data collected at 4 universities in 2 countries • They all can’t teach? • They can’t all be British! • They are all atypically poor institutions?
The “ITiCSE Working Group Model”: i • ITiCSE (Innovation and Technology in Computer Science Education) • ACM SIGCSE’s European conference • In its 16th year (27–29 June 2011, Darmstadt) • Has associated “working groups”
The “ITiCSE Working Group Model”: ii a) topics are proposed, and peer-reviewed b) one or more topics are selected for presentation c) the topic is posted with an invitation for others to join in the work specified d) the resulting group(s) work electronically before the conference, then work at the conference (and often for a day or more in advance) e) the group(s) write a paper detailing their results. This is peer-reviewed and, if accepted, published in the SIGCSE Bulletin f) the group disbands
The “ITiCSE Working Group Model”: iii Usually these groups produce a report that: • Distils collected resources and experiences on an issue of direct relevance to practising teachers. • For example: Resources, Tools, and Techniques for Problem Based Learning in Computing (1998) and A Road Map for Teaching Introductory Programming Using LEGO Mindstorms Robots (2003) • Addresses common problems that benefit from the application of collective intellectual and analytical effort. • For example: How shall we assess this? (2004) and Evaluation: turning technology from toy to tool (1996)
The “ITiCSE Working Group Model”: iv • McCracken (2001) was the first to adapt this to an empirical study • Subsequently replicated, notably: • 2004: The “Leeds Group” • A study of novice program comprehension • Data from 12 universities in 6 countries
The “ITiCSE Working Group Model”: v • Lightweight • (Relatively) modest commitment of time/effort
“Bootstrapping” model • Series of interventions to give practitioners a “way in” to CS Education research
The Bootstrapping Model • Year One: four-day workshop – “Input” (methods, presentations); work on their own studies; introduce the Experiment Kit • Intervening: execution of the Experiment Kit • Year Two: four-day workshop – analyse data in aggregate; write paper; work on their own studies
Experiment Kit Structure (Bootstrapping)
1. Question formulation
2. Protocol
   a. Data collection specification
   b. Human Subjects materials
   c. Background questionnaire
   d. Discriminator question
   e. Specification of set-up
   f. Experimenters’ script (including guidance on notes/diagramming)
   g. Participant design brief
   h. Design criteria elicitation stimuli set
   i. Design criteria elicitation recording sheet
3. Analysis protocol
4. Background
5. Literature