Training Needs Analysis: Student Systems • Matthew Taylor • Training & Communications Manager • Student Systems
Timelines • Phase 1 (Oct): • Draft survey questions • Staff interviews • Student meetings • Analysis of Event booking statistics • Analysis of staff requesting new roles • Analysis of e:Vision role attribution • Final Draft Survey questionnaires • Phase 2 (Nov): • Issue surveys • Gather responses • TNA survey analysis & conclusions • Phase 3 (Dec) • Design training approach
Overall Web Analytics (April to Oct ‘13) 86,609 unique views, 113,753 repeat page views
Student Unique Views 62,818 unique views, 79,834 repeat page views
Student Web Analytics – Key Observations • The Student pages are heavily viewed • Many pages need review & enhancement, e.g. EUCLID Portal & Online PG Application • The top 5 student pages and their sub-pages account for approx. 72.5% of total unique views • 47.9% of these are Student/FAQ & Student/Support pages (a worked sketch of the aggregation follows below) • A third of all FAQ unique hits related to: • Completing the PG Online application • What happens after application submission? • The Support page views are split between the following subjects: • PG Online Application (26.5% of all support views) • The EUCLID portal (35.5% of all support views) • These areas require review and enhancement
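The section shares quoted above (e.g. 72.5% and 47.9%) come from aggregating unique page views by page group. A minimal sketch of that aggregation, assuming a hypothetical CSV export from Google Analytics with `page` and `unique_views` columns (the column names are illustrative, not the actual export format):

```python
import csv
from collections import defaultdict

def section_shares(csv_path):
    """Aggregate unique views by top-level section (e.g. student/faq, student/support)
    and return each section's percentage share of total unique views."""
    views = defaultdict(int)
    total = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            count = int(row["unique_views"])                            # hypothetical column name
            section = "/".join(row["page"].strip("/").split("/")[:2])   # hypothetical column name
            views[section] += count
            total += count
    return {s: round(100 * v / total, 1)
            for s, v in sorted(views.items(), key=lambda kv: -kv[1])}

# e.g. for section, share in section_shares("student_pages.csv").items():
#          print(f"{section}: {share}%")
```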
Staff Unique Views 23,528 unique views, 33,852 repeat page views
Staff Web Analytics – Key Observations • Curricula Management makes up 47.9% of all staff unique page views, of which 65.5% are CCAM pages • These pages need review to reduce their size & enhance navigation for staff use • There are a large number of SACS web pages with <10 unique views • These should be reviewed for archiving (a simple filter sketch follows below) • The Mini-Portfolio, PPMD, Reporting & Reference Material web pages are not flagged for SACS Google Analytics • These need to be flagged as soon as possible
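The "review for archiving" item above amounts to a filter over the analytics export. A minimal sketch, again assuming a hypothetical CSV with `page` and `unique_views` columns:

```python
import csv

def archive_candidates(csv_path, threshold=10):
    """Return (page, unique_views) pairs below the threshold, lowest traffic first,
    as candidates for the archiving review."""
    with open(csv_path, newline="") as f:
        rows = [(r["page"], int(r["unique_views"])) for r in csv.DictReader(f)]  # hypothetical columns
    return sorted((row for row in rows if row[1] < threshold), key=lambda row: row[1])

# e.g. print(archive_candidates("staff_pages.csv"))
```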
Web Analytics (All) – Key Observations • End-page content with anchored links to FAQs within the content is not useful for statistical analysis • This could be remedied by linking FAQs to unique page content; however, is this good web design? Would it inhibit the web experience? • Analysing the Google statistics is hindered by variations in URL capitalisation for the same web page. For example, student/support has one set of view statistics, while Student/support has a separate set • If this can be resolved at our end, it should be, to aid analysis; however, it may simply be end users typing the URL differently (see the sketch below) • Great care needs to be taken not to read too much into web analytics, especially when dealing with less than 1 full academic year of data!
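On the analysis side, the URL-capitalisation problem noted above can be worked around by lower-casing paths before aggregating view counts. A minimal sketch, assuming the same hypothetical export columns as earlier plus a `repeat_views` column:

```python
import csv
from collections import defaultdict

def merge_case_variants(csv_path):
    """Merge view counts for URL paths that differ only in capitalisation,
    so 'student/support' and 'Student/support' are counted as one page."""
    merged = defaultdict(lambda: {"unique_views": 0, "repeat_views": 0})
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            key = row["page"].strip().strip("/").lower()                # normalise case and slashes
            merged[key]["unique_views"] += int(row["unique_views"])     # hypothetical column names
            merged[key]["repeat_views"] += int(row["repeat_views"])
    return dict(merged)
```

This only cleans up the analysis; enforcing canonical lower-case URLs in the CMS or via redirects would be the longer-term fix, if the issue can be resolved at source as the slide suggests.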
Student Surveys • Student Focus Group (30 attendees, 3 sessions) • Web Comparison (UoE vs Stanford) • Web Usability • Learning Styles Analysis • Student Questionnaire (11/12/13 – 20/12/13) • Draft sign-off (Student Survey Unit/EUSA/School/College feedback) • Pilot (a handful of students) • School of Education • 1737 recipients • 2.5% response rate (43 students)
Student Forum Observations • Web Comparison • UoE vs. Stanford • UoE rated higher than Stanford for navigation • UoE site considered ‘Simple’ • UoE video tutorial considered ‘Long’ • Stanford video tutorial narration considered ‘Clearer’ • UoE video narration considered ‘too fast’ • Web Usability • Student Self Help pages • ‘Simple & Easy’, ‘Very Clear’ • Help Request Form • ‘hard to locate’, ‘not able to find’ • Learning Styles Analysis • Heard of SACS? 4/16 Yes • Heard of EUCLID? 16/16 Yes • Required help with EUCLID? 7/16 Yes • Online Application: 3 • EUCLID Portal: 4 • Blended learning? • Shorter videos, shorter text guidance • “the absolutely best thing about online help is the direct chatbox”
Student Survey Observations • Timing • National Student Survey (NSS) & Edinburgh Student Experience Survey (ESES) due to be issued in January • PG Survey in February • Ethics committee sign-off required for a survey that crosses schools (2-week turnaround) • Personal Tutors, Courses, Induction surveys (August-Sept) • Section 1: Your details • 81.4% Full-time, 14% Part-time, 4.7% Other • 81.4% Year 1 • 86% Mature students (over 21)
Student Survey Observations cont. • Section 2: Previous Experience • 23.3% had heard of Student, Admissions & Curricula (SACS) • 95.3% had heard of EUCLID (SITS e:Vision) • 58.1% did not recall an issue with the software • Of the 23.3% who had a training issue, 69.2% searched for online assistance with the issue • 44.4% found it difficult to find online guidance • 70% found the online guidance resolved the issue • 71.4% satisfied with the online help
Student Survey Observations cont. • Section 3: Preference for online help • Most effective first (i.e. ranked by Effective to Very Effective ratings): • User guide (text/screen shots) • Effective to Very Effective: 25 (59.5%), Not effective to Satisfactory: 11 (26.2%) • FAQs • Effective to Very Effective: 25 (58.1%), Not effective to Satisfactory: 13 (30.3%) • Forum • Effective to Very Effective: 19 (46.3%), Not effective to Satisfactory: 16 (39%) • Narrated Video • Effective to Very Effective: 18 (42.8%), Not effective to Satisfactory: 15 (35.7%) • Online Course • Effective to Very Effective: 16 (40%), Not effective to Satisfactory: 11 (27.5%) • Webinar • Effective to Very Effective: 15 (35.7%), Not effective to Satisfactory: 9 (21.5%) • Wiki • Effective to Very Effective: 13 (31.8%), Not effective to Satisfactory: 15 (36.6%)
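The banded counts and percentages above are simply Likert-style responses collapsed into two bands over the respondents who answered each item. A minimal sketch with illustrative (not actual) response data and a hypothetical answer scale:

```python
from collections import Counter

# Hypothetical 5-point scale for illustration; the survey's actual answer labels may differ.
POSITIVE = {"Effective", "Very effective"}
NEGATIVE_TO_SATISFACTORY = {"Not effective", "Not very effective", "Satisfactory"}

def band_ratings(responses):
    """Collapse the ratings for one help format into the two bands reported on the slide."""
    counts = Counter(responses)
    answered = sum(counts.values())
    positive = sum(counts[r] for r in POSITIVE)
    negative = sum(counts[r] for r in NEGATIVE_TO_SATISFACTORY)
    return {
        "Effective to Very Effective": (positive, round(100 * positive / answered, 1)),
        "Not effective to Satisfactory": (negative, round(100 * negative / answered, 1)),
    }

# e.g. band_ratings(["Effective"] * 25 + ["Satisfactory"] * 11 + ["No opinion"] * 6)
#      -> {'Effective to Very Effective': (25, 59.5), 'Not effective to Satisfactory': (11, 26.2)}
```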
Can you think of any ways to improve online help? • “Have people upload videos of their own issues, that others can comment on” • “Live operator to provide ‘chat’ help” • “Redesign your pages/portals to be more friendly & intuitive” • “Quick, easy & clear help on the main page is very helpful”
Staff Survey • Staff Interviews (Oct-Dec 13) • 2-3 nominated staff from each school & central service • 1 hour interview, with questionnaire • Staff Questionnaire (07/01/14 to 17/01/14) • Draft sign off (Student Survey Unit/HR/School/College feedback) • Pilot (training contacts: 14) • All staff using Student System software • Bristol Online Survey tool • 3010 recipients • 9.3% response rate (279 submitted)
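The response rates quoted for both questionnaires are simply responses over recipients; a quick check of the figures on these slides (43 of 1,737 students and 279 of 3,010 staff):

```python
def response_rate(responses, recipients):
    """Survey response rate as a percentage, rounded to one decimal place."""
    return round(100 * responses / recipients, 1)

print(response_rate(43, 1737))    # student questionnaire -> 2.5
print(response_rate(279, 3010))   # staff questionnaire   -> 9.3
```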
Staff Survey - Observations • Staff Interviews (Oct-Dec 13) • Key training contact sign-up • Peer review of training material • Identify silent partners • Encourage engagement • Section 1: Your Details • 37.3% Managers • 80% School/College, 20% Corporate Services (Registry, SRA, etc.) • Most staff use Student Administration, Reporting software (non-SITS) & Admissions
Staff Survey - Observations • Section 2: Your student system use • 62% use e:Vision daily, 26.5% weekly • 35.8% have used e:Vision for 1-3 years, 50.5% for more than 3 years
Staff Survey - Observations • Section 3: Experience & Preferences • 24.4% attended classroom training in the last year, 73.5% did not
Staff Survey – Observations cont. • Section 3: Experience & Preferences • In-office training: • 34.1% Yes • 63.1% No • Training provision preference: • 49.5% local • 9.3% central • 41.2% both • Induction training? • 67% No, 28% Yes, 5% Don’t know
Staff feedback • “Everyone uses systems in different ways - it is difficult to streamline the training” • “Training too basic with no follow-up courses” • “What is the SACS online user guide? Never heard of it.” • “If I need training, then the software is poorly designed” (Academic)
Quick wins • New student ‘Self Help’ web pages • Embedding web help into student e:Vision software (existing software) • Creating standards for Developers to embed web help in e:Vision (ongoing) • Staff peer testing of online training materials • Brief, FAQ based online tutorials developed for staff & students
New Timelines • Phase 1 (March): • Analysis of Event booking statistics • Analysis of staff requesting new roles • Analysis of e:Vision role attribution • Phase 2 (March): • TNA survey analysis & conclusions • Phase 3 (April) • Design training approach
What’s next? • Statistical Analysis • Publish findings • Consultation with the business on the proposed training approach • Training Design • Resource estimate • Develop materials • *Gamification of eLearning • *Learning Management System creation (Learning & Development Forum) • *Follow-up training needs analysis (6 months?) • Student/Staff Surveys • Web analytics, etc.