Killara ACP Student Voice Teacher Survey Results
OCTOBER 2013
Compiled by Victoria University Pre-Service Teachers
Introduction
• The Killara PS Strategic Plan details a desire for consistent approaches to student leadership and student voice.
• The Annual Implementation Plan suggests trial and implementation opportunities for the use of student voice in the classroom and across the school.
• Student voice activities are being implemented across the school, and appropriate data is being collected, collated and displayed to improve teaching & learning.
Research Methodology
Engagement: Scoping current practice
Quantitative: Teacher survey
Qualitative: Teacher interviews, student interviews, leadership interviews
Themes
• The Student Voice Survey Process
• Questions/Student Understanding
• Teacher Analysis
• Feedback Approaches
Self-Selected Teacher Survey: Quantitative Data
Q1. Did you find the student voice survey an effective tool?
Q2. Did you make any changes to your classroom as a result of the data collected from your students?
Q3. Do you think the survey process is easy for teachers to manage?
Q4. Do you think the questions in the survey are appropriately worded for students?
Q6. Do you think the students have an understanding of the questions being asked?
Q7. Would you be willing to implement the survey in your classroom each term?
Q8. Did you feel the PSTs involved in the process were helpful in completing the surveys?
Q9. Do you believe the students in your class were able to understand the questions asked?
Q10. Would you have liked to sit and ask the children why they answered what they did to get a better understanding?
Themes from Teacher, Student and Leadership Interviews: Qualitative Data
Limitations
The following limitations may have affected the accuracy and validity of the survey results:
• Language and comprehension
- Vocabulary was not age appropriate for all students; for example, "seldom" was difficult for students to understand even after an explanation was given.
- Repetitive questions made it difficult for students to differentiate the subtle differences between similar questions.
- Broad wording of questions, and some unclear terms, required lengthy explanation.
• Time
- Limited timeframe to complete the surveys.
- Limited time for preparation, which may have resulted in limited or inconsistent pre-survey briefings to students.
• Consistency of approach
- Inconsistent briefing of students prior to the surveys.
- Varied explanation of vocabulary and questions, which may have led to different student interpretations and misconceptions of the questions.
• Technology and software
- The software program was difficult to navigate and record data in (for example, scrolling back and forth between the questions and the recording box made it easy to lose one's place in the survey).
Ethical Considerations
Confidentiality of students: "Will my teacher know this was from me?" – Killara Primary student.
Observations from the Process: Case Notes
CASE NOTES
• Vicki
• Gianna
• Lauri
• Suzie
The survey process
• Too many questions
• Clearer instructions needed to find the file
• Drop-down key needed for student comments
• Standardised time needed for conducting the survey
• Independent person needed to conduct/guide the survey
• Questions too broad
• Some terms and words unclear or not age appropriate
• Needed too much explanation
• Distinction to be made between year levels
• Greater use of visual cues – smiley faces?
• Difficult to rely on the data (students' answers vary depending on many factors)
• Don't have students sitting next to each other – some were being influenced
• Conflicting ideas between teachers regarding how many times a year to complete the survey (though not within the 1/2 unit): the 5/6 unit says three times a year, the 1/2 unit says two
• Voting eggs – model the question explicitly with examples to prompt student responses
• Email/SurveyMonkey?
• Not a one-to-one system?
• Lengthy process
• Develop in teams/units for a more relevant and specific process, with more age-appropriate questions
Survey Questions
• Age-appropriate vocabulary
• Drop-down boxes to add additional comments/observations
• Repetitive questions – hard for students to differentiate subtle differences between questions
• Questions more relevant to year level
• Whole-class activity
• Student-friendly/more informal process
• Time consuming – having to explain each question
• Too many questions
• Very broad
• Greater use of visual cues for lower year levels
• Lesson prior to the survey to explain terms
• Get students to decide on the questions and answer options
Analysis of feedback
• Broken down into segments? – environment, teacher, motivation, learning, confidence
• Good to see trends with colours
• Gives teachers the chance to see their performance
• Results are inconclusive due to students not understanding the questions
• The teacher hasn't seen the results
• Different answers to the same questions by the same students over time – the answer depends on how the students feel that day
• Trends can be easily seen
• Inconclusive and inaccurate results make it difficult to analyse and draw conclusions – teachers can at least make inferences/observations
• Does it really show anything the teacher doesn't already know?
Feedback
• Wonder whether the data is authentic without the 'extra' assistance – extra assistance is needed to ensure students' comprehension
• Too much work for teachers
• Surveys often rushed, with little notice given to complete them
• Hard copy preferred so teachers can discuss with the class
• Questions too hard for Preps – e.g. "seldom"
• PSTs: "Why do students have to put their ID at the top of the survey?"
• Answers vary depending on how students feel that day or what has happened in the yard/classroom
• Grade 5/6 teachers had a discussion with Grade 6 female students about the results, which changed practices
Key Results from Quantitative & Qualitative Data
Key Results
What are the key results that come through the data collected?
• 50-50 whether it is a useful tool.
• 50-50 whether it actually changes teaching practice.
• Questions not appropriate / hard for students to understand.
• Teachers have low confidence in the results due to these understanding issues.
• Feedback to students is not consistent.
Recommendations from ACP: Research Recommendations
Recommendations
• Conduct a review into the range and number of questions specific to each Year Level, with an eye to student engagement, motivation and understanding.
• Foundation Level to investigate a standardised informal process for collecting data that includes explicit modelling of question context.
• Investigate appropriate response types (vocabulary/visual cues) for each Year Level.
• Investigate the development of explicit descriptions, role plays and videos to aid student understanding.
• Explore the use of alternative data collection software to improve the efficiency and effectiveness of collection and analysis, e.g. SurveyMonkey, Snapshots.
Recommendations
• Get students (student leadership, perhaps?) to word the questions and answer options at the beginning of the process
• Preps (or all years) – run the survey as a series of community circle discussions
• Suggestion box in class so students can voice opinions anonymously
• Yes/no answers with an option to explain
• Student leadership group to discuss and present to the whole school
• Cater for different learning styles
• Develop the survey as a team (i.e. year level teams)
• Buddy system so 5/6 students assist younger students with the survey
Using Snapshots as an option
Pros
• Whole-class focus
• Step-by-step guidance/procedure
• Option for confidentiality
• Student representatives involved and trained up to run the process
Cons
• Needs a testing phase to iron out any problems
• Can take a while and be technical to set up (time consuming)
• Teachers may need to be trained in the use of the devices
Suggestions
• Centre a couple of lessons around using the devices so students are comfortable and competent