HEA STEM Conference, London, April 2012
Assessment of reflection, critical evaluation and leadership skills through participation in online discussions
Associate Professor Jacqui Taylor
Psychology Research Centre, School of Design, Engineering & Computing
Motivation for paper
• Increasing use of blogs, SNSs, VLEs and wikis to encourage academic discourse between students
• Online group discussion develops traditional and new skills
• Pearlman (2010): "students must develop C21st competencies if they are to be academically successful and prepared for future workplaces in a global knowledge economy ... we need to create learning environments that make learning these skills possible"
• However, such discussion is difficult to assess -> often unassessed / not used
• Assessment quantitatively and qualitatively changes participation and learning
Presentation Outline • Assessing f-to-f discourse • Assessing (evaluating) online discourse • Current assessment • Potential for using quantitative content analysis (QCA) and automated methods
1. Assessing f-to-f academic discourse between students
• Develops: understanding of different views; public speaking / debating skills
• Can assess individual presentation skills or contribution towards a group project, but the discourse itself is difficult to assess (real time, multiple parties); recorded discourse is time-consuming to mark
2. Assessing online academic discourse between students
• Recorded and serial (supports reflection), but there is little guidance for educators
• Vonderwell et al. (2009) highlight the paucity of practical advice and discuss processes that 'could' be assessed, e.g. learner autonomy, learning community and writing skills
• 'asynchronous online discussions facilitate a multidimensional process of assessment ... further research is needed to understand what assessment strategies or criteria enhance assessment and learning' (p. 309)
Empirical research – mainly evaluation
• Newman et al. (1994) compared critical thinking in f-to-f and online seminars (repeated measures): differences in both quality and quantity; more new ideas emerged f-to-f, but ideas raised online were rated as more important, justified or linked
• Heckman et al. (2002) compared 4 f-to-f and 4 online discussions: the online discussions generated high levels of cognitive activity, equal or superior to those identified in the f-to-f discussions
Factors to consider
• (i) Task type: Kanuka et al. (2007) found that different activities affected the quality of students' online messages and student engagement
• (iii) Individual differences: gender and personality related to preference, not performance; international and ALN students
• Others: group composition, moderation, synchronous / asynchronous ...
Assessment
• Gafni & Geri (2010): assessing online contributions increases participation and enhances the quality of academic discourse
• Swan et al. (2008): providing assessment criteria leads to deeper learning
• Two ways to assess contributions to online discussions (Hazari, 2004):
- analytic marking = assign marks to specified criteria
- holistic marking = assign marks to a whole unit of analysis without scoring individual criteria
• Unit of analysis: discussion, message, sentence, line
3. My assessment practice (based on Mason, 1991/5)
Measurement of online transcripts is based on the educational value exhibited. Useful questions:
• does the participant draw on their own experience?
• do they refer to, or extend, the course material?
• do they initiate new ideas for discussion?
• does the message build on previous messages?
Participation in three online discussions and leadership of one discussion
• Participation = 15% (x3)
- individually, quality of each message: clarity, timeliness / appropriate research, intellectual (evaluative) content
- holistically: reflection
• Leading = 55%
- individually: initial and concluding messages
- holistically: facilitation / motivation, reflection, coherent viewpoint
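As a minimal illustration only, the weighting above (three participation marks at 15% each plus one leading mark at 55%, totalling 100%) can be combined as a simple weighted sum. The function name and the example marks below are invented for the sketch; they are not part of the actual marking scheme.

```python
# Hypothetical sketch of the weighting described above: three
# discussions assessed at 15% each, plus leading one discussion
# at 55%. The example marks are invented for illustration.

PARTICIPATION_WEIGHT = 0.15   # per discussion, applied three times
LEADING_WEIGHT = 0.55

def overall_mark(participation_marks, leading_mark):
    """Combine three participation marks and one leading mark
    (each out of 100) into an overall percentage."""
    assert len(participation_marks) == 3
    return (sum(m * PARTICIPATION_WEIGHT for m in participation_marks)
            + leading_mark * LEADING_WEIGHT)

# Invented example: 60, 70 and 65 for participation; 80 for leading
print(overall_mark([60, 70, 65], 80))
```

The weights sum to 1.0 (0.15 x 3 + 0.55), so a student scoring full marks on every component receives 100%.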
(i) Reflection
• Students are encouraged to reflect on academic and credible media sources (BBC, weblinks to news)
• Recognise: links to personal experiences that illustrate important concepts and apply them to a wider context
• Example: news.bbc.co.uk/2/hi/uk_news/education/4250281.stm ("Dating websites can give people a 'surprisingly high' chance of long-term romance, suggests a study published on St Valentine's Day.")
(ii) Critical evaluation / extended research
Recognise:
• questioning / building on previous messages and research
• the extent and timeliness of resources used, e.g. published within the last two years and covering research not already covered in lectures
(iii) Leadership skills
• Handout: benefits and problems of virtual teams, plus tips
• Style = a balance between social and task-based
• Recognise attempts to encourage interactivity / motivating comments, e.g. referring to names (social presence)
• At the start: definitions, boundaries, time plan
• At the end: summarise research and views
• Weave research into responses and questions
(4) Potential for using Quantitative Content Analysis (QCA) and automation to assist assessment
Motivating factors:
• change to assessment strategy (now the sole assessment: from 30% to 100%)
• more detailed feedback to students
• student expectations and experience, e.g. student effort (min to max limits!)
• Most research is theoretically driven by the framework of Garrison et al. (2001): cognitive presence, social presence and teaching presence
• Rourke et al. (2001) reviewed 19 studies on potential use and methodological challenges, aimed primarily at educational technologists and researchers
• "QCA method to systematically and rigorously measure cognitive presence online - useful in guiding educators in adoption, design, implementation but less useful as an assessment tool"
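As a hedged sketch of what keyword-based QCA looks like in practice: the phase names below follow Garrison et al.'s (2001) practical inquiry model, but the indicator word lists and the function are invented placeholders, not the validated descriptors from that framework.

```python
# Hypothetical keyword-based quantitative content analysis pass.
# Phase names follow Garrison et al. (2001); the indicator lists
# below are INVENTED placeholders, not the validated coding scheme.

INDICATORS = {
    "triggering": ["why", "problem", "puzzled"],
    "exploration": ["perhaps", "maybe", "brainstorm"],
    "integration": ["therefore", "because", "building on"],
    "resolution": ["apply", "solution", "in conclusion"],
}

def code_message(message):
    """Count indicator hits per cognitive-presence phase in one message."""
    text = message.lower()
    return {phase: sum(text.count(word) for word in words)
            for phase, words in INDICATORS.items()}

example = "Perhaps, building on Ann's point, we could apply this in practice."
print(code_message(example))
```

Even this crude counting illustrates the limitation noted on the next slide: it flags pre-specified phrases, but cannot judge whether a message actually demonstrates critical thinking.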
Software could potentially assist assessment and feedback
• In the last 5 years there has been an explosion in the use of software to analyse text
• 100s of software products are available – see KDnuggets: Text Analysis, Text Mining and Information Retrieval http://www.kdnuggets.com/software/text.html
• Many are for use in specific fields, with field-specific features:
- politics (e.g. Hopkins & King, 2010)
- health (e.g. Kim, 2009)
- marketing and advertising
• Few packages target pedagogic use / the social sciences
…a long way
• Despite advances in automated CA methods in media analysis, most methods are only able to highlight / count instances of pre-specified words or phrases
• This is a long way from automated assessment of critical thinking!
• But new products are appearing, e.g. NodeXL (a free, open-source template) makes it easy to develop network graphs from data entered in Excel, to provide assessment feedback in a visual format
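NodeXL builds its graphs from edge lists entered in Excel. As a sketch of how a discussion reduces to exactly that kind of edge list (the messages, author names and field names below are all invented for illustration):

```python
# Hypothetical sketch: turn discussion messages into a weighted
# "who replied to whom" edge list, the kind of tabular data a tool
# such as NodeXL can visualise as a network graph.

from collections import Counter

# Invented example thread (each message records its author and
# the author it replies to; None marks the discussion opener).
messages = [
    {"author": "Amy", "reply_to": None},
    {"author": "Ben", "reply_to": "Amy"},
    {"author": "Cat", "reply_to": "Amy"},
    {"author": "Amy", "reply_to": "Ben"},
    {"author": "Ben", "reply_to": "Amy"},
]

def reply_edges(msgs):
    """Count directed (replier -> original author) edges."""
    return Counter((m["author"], m["reply_to"])
                   for m in msgs if m["reply_to"] is not None)

for (src, dst), weight in reply_edges(messages).items():
    print(f"{src} -> {dst}: {weight}")
```

Pasted into a spreadsheet as three columns (source, target, weight), such a table is enough to give students visual feedback on who is interacting with whom, and how often.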
Online resources
• Linguistic Inquiry and Word Count (LIWC) http://www.liwc.net/
• TextSTAT http://neon.niederlandistik.fu-berlin.de/en/textstat/
• QSR NVivo http://www.qsrinternational.com
• ATLAS.ti http://www.atlasti.com/
• Ranks NL http://www.ranks.nl/
• TAMS Analyzer (Text Analysis Mark-up System) http://tamsys.sourceforge.net/
Basic = word frequency lists and concordances
Advanced = powerful search possibilities (e.g. to identify regular expressions or phrases)
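The "basic" features named above (word frequency lists and concordances) are simple enough to sketch directly; the function names and the example message below are invented for illustration, not taken from any of the listed packages.

```python
# Minimal sketch of the two "basic" text-analysis features named
# above: a word frequency list and a keyword-in-context concordance.

import re
from collections import Counter

def word_frequencies(text):
    """Lowercased word frequency list."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

def concordance(text, keyword, width=30):
    """Return each occurrence of keyword with up to `width`
    characters of surrounding context (a simple KWIC listing)."""
    hits = []
    for m in re.finditer(re.escape(keyword), text, re.IGNORECASE):
        start = max(0, m.start() - width)
        end = min(len(text), m.end() + width)
        hits.append(text[start:end])
    return hits

# Invented example message for illustration:
msg = "Building on Sam's point, I think the study Sam cited is dated."
print(word_frequencies(msg).most_common(3))
print(concordance(msg, "Sam"))
```

The "advanced" search possibilities mentioned above correspond to passing richer regular expressions (rather than a literal keyword) to the same kind of search.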
Conclusions
• Assessment methods need to be modernised to reflect the changes in learning activities taking place with Web 2.0
• Consider the experiences / expectations of the current generation
• McGaw (2009): new 21st-century skills are developed through interactions using social media; assessment of these skills remains a challenging area for academics
• QCA has the potential to standardise assessment and save time in future
• Consider individual differences, e.g. students with additional learning needs (e.g. dyslexia) and those with English as a second language, assessed according to the relevant marking guidelines
Thank you! Please contact me for further info: jtaylor@bournemouth.ac.uk
Two adverts:
• Thursday 19 April 2012, London: BPS Division of Academics, Researchers and Teachers in Psychology Inaugural Conference
• Wednesday 13 June 2012, Bournemouth: HEA Workshop, Using the Internet for Learning, Teaching & Research in Psychology
Refs to my research in this area
• Mair, C. & Taylor, J. (2011). Critical reflection and discussion mediated by a VLE. INTED Conference, Valencia, 7-9 March.
• Taylor, J. (2010). Psychological and contextual issues in technology-enhanced learning: individual differences and the student emotional experience. In: Brown, E., ed. Education in the wild: contextual and location-based mobile learning in action. Nottingham: LSRI, pp. 32-35.
• Taylor, J., Pereira, C. & Jones, M. (2008). The relationship between preferred modal learning style and patterns of use and completion of an online project management training programme. 13th Learning Styles International Network Conference (ELSIN), Ghent, Belgium, pp. 401-408.
• Taylor, J. (2008). Teaching psychology to computing students. Psychology Teaching Review, 14(1), 21-29.
• Morgan, S. & Taylor, J. (2007). Computer based flow and learning. The 1st Applied Positive Psychology Conference, University of Warwick, April.
• Taylor, J. & House, B. (2003). A survey of computer use and communication in the homes of 1st, 2nd and 3rd year undergraduate students. In: Proceedings of the Home Oriented Informatics and Telematics (HOIT) Conference, LA.
• Taylor, J. (2002). A review of the use of asynchronous e-seminars in undergraduate education. In: Hazemi, R. and Hailes, S., eds. The digital university: Building a learning community. London: Springer, pp. 125-138.
• Taylor, J. (2001). Approaches to studying and the perception of e-seminar discussions. In: Proceedings of the 6th International Learning Styles Conference (ELSIN), Wales, pp. 347-359.