Building Electronic Support Environments for First-Year University Students Speakers (in order of appearance): • Philip Barker – Theoretical Background (7 min) • Oladeji Famakinwa – The EPSILON System (Design and Development) (7 min) • Paul van Schaik – System Evaluation (7 min)
Theoretical Background Philip Barker School of Computing
Designing Learning Spaces Three broad categories of learning environment to consider: • On-campus Systems • Online Learning • Ad hoc Spaces Various combinations of conceptual and pragmatic resource categories can be used to construct these. What combinations (or ‘blends’) should be used to construct optimal learning environments?
Designing Online Learning [Diagram: overlapping regions of E-Learning, Knowledge Management and Performance Support; pairwise combinations labelled a, b and c, with X at the full intersection]
Theoretical Perspective on EPSS • All people have both physical and cognitive limitations • The Power Law of Practice can be used to predict performance plateaus • These can be overcome through the design of appropriate performance improvement or augmentation aids • Most commonly these are simple tools, machines and intelligent devices
Performance Plateaus and Bands [Chart: skill level plotted against time, showing performance plateaus rising from a novice band towards an expert band]
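The plateau behaviour described above can be illustrated with a minimal sketch of the Power Law of Practice, under which task-completion time falls as a power function of the number of practice trials, T(n) = a · n^(−b). The constants a and b below are hypothetical, chosen only to show the shape of the curve, not taken from the talk.

```python
def predicted_time(n, a=10.0, b=0.4):
    """Predicted task time after n practice trials (arbitrary units).

    Power Law of Practice: T(n) = a * n**(-b). Hypothetical constants.
    """
    return a * n ** (-b)


# The improvement gained from each extra trial shrinks rapidly,
# which is exactly the performance plateau shown in the chart.
gains = [predicted_time(n) - predicted_time(n + 1) for n in (1, 10, 100)]
assert gains[0] > gains[1] > gains[2]
```

Performance support tools aim to lift users off such a plateau without requiring the long practice the curve implies.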
Objectives of Performance Support A performance support system can provide one or more of the following functions: • Improvement in on-the-job task performance • Provision of data, information or knowledge at a point of need • Skill and knowledge enhancement facilities
General Systems Theory We are currently interested in performance issues relating to users of academic libraries (our ‘systems’)
EPSS and Scaffolding [Diagram: a USER, with mental models, interacts through a system interface with an Electronic Performance Support System (EPSS) that provides scaffolding around the LIBRARY] • The system we have been building utilises five different types of digital object: • data objects • information objects • knowledge objects • learning objects • performance objects
The Epsilon System (Design and Development) Oladeji Famakinwa School of Computing
The Epsilon System (Design and Development) • Epsilon: Electronic Performance Support Systems for Libraries • First prototype • Assists students in improving their performance on typical library tasks, such as locating books and journals • Second prototype • Tutorial Module: imparting knowledge of library classification systems • Games Module: developing search-strategy skills for finding books
Evaluation and Data Collection • Non-first-year evaluation (N = 20) • Conducted using a paper-based workbook alongside the online Epsilon system • Participants constantly switched between filling in the workbook and using the computer • At the end of the evaluation, information from the workbook had to be converted into electronic format for analysis • First-year evaluation (N = 99) • Online workbook integrating the evaluation instructions and guide, an electronic questionnaire and the Epsilon system • Participants focused on following instructions, carrying out tasks and entering information on the computer • Information was already in electronic format, ready for analysis
System Evaluation Paul van Schaik School of Social Sciences and Law
Evaluation Studies • Previous evaluation • Non-first-year students • Staff • Current evaluation • First-year students
Evaluation with first-year students • Three-group, pre-test/post-test design to establish the effect of using the two EPSS components • Group 1 only studied the tutorial component • Group 2 only played the game component • Group 3 both tutorial and game • Outcome measures • Knowledge of the library classification system • Confidence in knowledge • System acceptance
Participants • First-year psychology students • Ninety-nine students: 32 in Group 1, 39 in Group 2 and 28 in Group 3 • Mean age = 21 (SD = 6.36) • Twenty-one male • Mean years of computer experience = 10.78 (SD = 3.67) • Mean computer use per week = 16.01 hrs (SD = 13.38) • Mean Web use per week = 8.28 hrs (SD = 8.15)
Materials and equipment • Computer-based experiment/evaluation • Study tutorial and/or play games components (depending on group) • All data collection through experimental software - no separately printed workbook • Instructions for using the tutorial component and playing the games • Pre-test and post-test • Pages for recording task performance on the games • Questionnaire: • Demographic information • System acceptance
Procedure • Participants worked through the computer-based experiment/evaluation • This included studying the tutorial and/or playing the games, and completing sections of the experimental software when appropriate
Knowledge (2) • Mean knowledge scores (pre-test and post-test) were around 60% • Analysis of covariance (ANCOVA) was used to establish the effect of EPSS component (tutorial, game, tutorial and game) on post-test knowledge after statistically controlling for pre-test knowledge • The covariate, pre-test knowledge, was significantly related to post-test knowledge, F (1, 95) = 43.11, p < .001, r = .57 • There was no effect of EPSS component on post-test knowledge after controlling for pre-test knowledge, F (2, 95) = 1.72, p > .05
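The ANCOVA described above can be sketched as follows, assuming the statsmodels library: post-test knowledge is regressed on EPSS component (group) with pre-test knowledge as the covariate. The data here are simulated purely for illustration, not the study's data; by construction, post-test scores depend on pre-test scores but not on group, mirroring the reported pattern.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 99  # sample size matching the study; data themselves are simulated
df = pd.DataFrame({
    "group": rng.choice(["tutorial", "game", "both"], size=n),
    "pre": rng.normal(60, 10, size=n),   # pre-test knowledge (%)
})
# Post-test related to pre-test, unrelated to group (simulated).
df["post"] = 0.6 * df["pre"] + rng.normal(24, 8, size=n)

# ANCOVA: group effect on post-test, controlling for pre-test.
model = smf.ols("post ~ C(group) + pre", data=df).fit()
table = anova_lm(model, typ=2)  # Type II sums of squares
print(table)
```

With simulated data of this form, the covariate term is highly significant while the group term is not, matching the pattern of the reported F tests.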
Confidence (2) • Mean confidence scores (pre-test and post-test) varied widely, from 71% to 88% • ANCOVA demonstrated that the covariate, pre-test confidence, was significantly related to post-test confidence, F (1, 95) = 69.36, p < .001, r = .63 • The effect of EPSS component on post-test confidence after controlling for pre-test confidence was significant, F (2, 95) = 7.28, p < .005, η2 = .07 (medium effect size) • This result reflects the relatively high increase in confidence in Groups 1 and 3 compared to Group 2, t1 (95) = 3.74, r = .41, p < .001, t3 (65) = 2.39, r = .28, p < .05 (Bonferroni correction applied)
Acceptance • Reliability analysis • Perceived usefulness • Tutorial: Cronbach’s alpha = .95 • Game: alpha = .95 • Intention to use • Tutorial: alpha = .67 • Game: alpha = .78 • Overall scores were calculated by averaging over items • 95% confidence intervals of the mean (range from -3 through +3) • Perceived usefulness • Tutorial: CI.95 = [0.34; 1.57], mean = .96 • Game: CI.95 = [0.24; 1.48], mean = .86 • Intention to use • Tutorial: CI.95 = [0.44; 1.38], mean = .91 • Game: CI.95 = [0.58; 1.50], mean = 1.04
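The reliability analysis above uses Cronbach's alpha, which can be computed directly from its standard definition: alpha = (k / (k − 1)) · (1 − Σ item variances / variance of the total score). The example ratings below are made up for illustration; they are not the questionnaire data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a 2-D array: rows = respondents, cols = items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical -3..+3 ratings on a three-item scale: respondents who
# answer the items consistently yield an alpha near 1.
consistent = [[3, 3, 3], [1, 1, 2], [-2, -2, -1], [2, 3, 2]]
print(round(cronbach_alpha(consistent), 2))
```

The high alphas reported for perceived usefulness (.95) indicate that the usefulness items behave like the consistent example above, whereas the lower alpha for intention to use (.67 for the tutorial) indicates weaker internal consistency.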
Comments from staff • "They enjoyed it, they really liked it, and it really challenged them." • "The students felt this was great and they want to do it again. Look at it again." Hence we provided links to the tutorial and gaming components. • Students have come back to say they have been able to find books in the library. • Too generic, particularly for first-year students: the exercise should be more specific to their library needs.
Follow-up evaluation (after 3 months) • N = 43 • Measured knowledge and confidence again
Follow-up evaluation (after 3 months) (2) • Pre-use versus follow-up • Confidence: t (140) = 3.22, r = .26, p < .01 • Knowledge: t (140) = 3.76, r = .30, p < .001 • Post-use versus follow-up • Confidence: t (140) = 2.36, r = .20, p < .05 • Knowledge: t (140) = 0.29, r = .02, p >> .05 • Correlations between correctness and confidence • Pre-use: r = .04, p >> .05 • Post-use: r = .19, p > .05 • Follow-up: r = .41, p < .05
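The effect sizes (r) reported alongside the t values above follow from the standard conversion r = √(t² / (t² + df)), which can be checked directly against the slide's figures:

```python
import math

def effect_size_r(t, df):
    """Effect size r recovered from a t statistic and its degrees of freedom."""
    return math.sqrt(t * t / (t * t + df))

# Reported pre-use versus follow-up comparisons, t(140):
print(round(effect_size_r(3.22, 140), 2))  # confidence -> 0.26, as reported
print(round(effect_size_r(3.76, 140), 2))  # knowledge  -> 0.30, as reported
```

The same conversion reproduces the r values for the post-use versus follow-up comparisons, so the slide's t statistics and effect sizes are internally consistent.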
Discussion • Results from the non-first-year evaluation were confirmed • However, the results on confidence were significant this time
General discussion • EPSS well received by student-users and staff • Users found both EPSS components useful • Tutorial component enhanced users’ confidence in their own knowledge
Future developments • EPSS for library classification • Include sound and speech audio • Dynamic adaptability of content: make the EPSS more suited to • specific libraries • academic disciplines • Integration with existing library facilities, services and VLEs • EPSS for students in ‘remote’ locations (including ad hoc spaces)
Evaluation Work and Publications • van Schaik, P., Barker, P. & Famakinwa, O. (2005). Electronic performance support for learning to use academic libraries. Learning and Teaching Conference 2005. University of Teesside. • Barker, P., van Schaik, P. & Famakinwa, O. (2005). Potential roles for electronic performance support systems in libraries. Proceedings of Computer-Based Learning in Science (CBLIS) 2005. University of Zilina, Slovakia. • Also, conference paper presentation at CBLIS 2005 (PGB). • Barker, P., van Schaik, P. & Famakinwa, O. (2006). Designing learning objects to enhance students’ performance. Presentation at the Learning and Teaching Conference. University of Teesside. • van Schaik, P., Barker, P. & Famakinwa, O. (2006a). Making a case for using electronic performance support systems in academic libraries. Journal of Interactive Learning Research. In press. • van Schaik, P., Barker, P. & Famakinwa, O. (2006b). Potential roles for performance support tools within library systems. The Electronic Library. In press.
Acknowledgements • The authors wish to express their gratitude to the University of Teesside and the Higher Education Academy for financial assistance to support this work • We are also indebted to the staff of the University of Teesside’s Learning and Information Services for their enthusiastic support; we would particularly like to mention Ian Butchart, Sue Myer and Barbara Hull • We are also grateful to Jan Anderson for allowing us to conduct the evaluation study with her students • We also thank Andrew Young for assistance with the coding of the computer games used in this study
External Links • Epsilon Modules • http://sssl-staffweb.tees.ac.uk/u0011128/epss/