Presentation Transcript


  1. Killara ACP Student Voice Teacher Survey Results, October 2013. Compiled by Victoria University Pre-Service Teachers.

  2. Introduction
  • The Killara PS Strategic Plan details a desire for consistent approaches to student leadership and student voice.
  • The Annual Implementation Plan suggests trial and implementation opportunities for the use of student voice in the classroom and across the school.
  • Student voice activities are being implemented across the school and appropriate data is being collected, collated and displayed to improve teaching and learning.

  3. Accessing Student Voice at Killara

  4. Research Methodology
  • Engagement: scoping current practice
  • Quantitative: teacher survey
  • Qualitative: teacher interviews, student interviews, leadership interviews

  5. Themes
  • The Student Voice Survey Process
  • Questions/Student Understanding
  • Teacher Analysis
  • Feedback Approaches

  6. Self-Selected Teacher Survey: Quantitative Data

  7. Q1. Did you find the student voice survey an effective tool?

  8. Q2. Did you make any changes to your classroom as a result of the data collected from your students?

  9. Q3. Do you think the survey process is easy for teachers to manage?

  10. Q4. Do you think the questions in the survey are appropriately worded for students?

  11. Q5. Are you happy with the answer options provided?

  12. Q6. Do you think the students have an understanding of the questions being asked?

  13. Q7. Would you be willing to implement the survey into your classroom each term?

  14. Q8. Did you feel the PSTs involved in the process were helpful in completing the surveys?

  15. Q9. Do you believe the students in your class were able to understand the questions asked?

  16. Q10. Would you have liked to sit and ask the children why they answered what they did to get a better understanding?

  17. Themes from Teacher, Student and Leadership Interviews: Qualitative Data

  18. Mind Map Here

  19. Limitations
  The following limitations may have affected the accuracy and validity of the survey results:
  • Language and comprehension
    • Vocabulary was not age-appropriate for all students. For example, "seldom" was difficult for students to understand even after an explanation was given.
    • Repetitive questions made it difficult for students to differentiate between subtly different questions.
    • Broad wording of questions, and some terms/words used were unclear, requiring lengthy explanation.
  • Time
    • Limited timeframe to complete surveys.
    • Limited time for preparation.
    • May have resulted in limited/inconsistent pre-survey briefings to students.
  • Consistency of approach
    • Inconsistent briefing to students prior to surveys.
    • Varied explanation of vocabulary and questions to students, which may have resulted in different student interpretations and misconceptions of questions.
  • Technology and software
    • The software program was difficult to navigate and record data in (for example, scrolling back and forth between questions and the recording box made it easy to lose one's place in the survey).

  20. Ethical Considerations Confidentiality of students: “Will my teacher know this was from me?” – Killara Primary Student.

  21. Observations from the Process: Case Notes

  22. Case Notes
  • Vicki
  • Gianna
  • Lauri
  • Suzie

  23. The Survey Process
  • Too many questions
  • Clearer instructions needed to find the file
  • Drop-down key needed for student comments
  • Standardised time needed for conducting the survey
  • Independent person needed to conduct/guide the survey
  • Questions too broad
  • Some terms and words unclear/not age-appropriate
  • Needed too much explanation
  • Distinction to be made between year levels
  • Greater use of visual cues (a smiley face?)
  • Difficult to rely on the data (students' answers vary depending on many factors)
  • Don't have students sitting next to each other; some were being influenced
  • Conflicting ideas between teachers regarding how many times a year to complete the survey (not within the 1/2 unit though): the 5/6 unit says three times a year, the 1/2 unit says two
  • Voting eggs: model the question explicitly with examples to get responses from students
  • Email/SurveyMonkey? Not a one-to-one system?
  • Lengthy process
  • Develop in teams/units for a more relevant and specific process and questioning, to produce more age-appropriate questions

  24. Survey Questions
  • Age-appropriate vocabulary
  • Drop-down boxes to add additional comments/observations
  • Repetitive questions
  • Hard for students to differentiate subtle differences between questions
  • Questions more relevant to the year level
  • Whole-class activity
  • Student-friendly/more informal process
  • Time consuming: having to explain each question
  • Too many questions
  • Very broad
  • Greater use of visual cues for lower year levels
  • Lesson prior to the survey to explain terms
  • Get students to decide on the questions and answer options

  25. Analysis of Feedback
  • Broken down into segments? (environment, teacher, motivation, learning, confidence)
  • Good to see trends with colours
  • Gives the teacher a chance to see their own performance
  • Results are inconclusive due to students not understanding the questions
  • Teacher hasn't seen the results
  • Different answers to the same questions by the same students over time; the answer depends on how the students feel that day
  • Trends can be easily seen
  • Inconclusive and inaccurate results make it difficult to analyse and draw conclusions, though teachers can at least make inferences/observations
  • Does it really show anything the teacher doesn't already know?

  26. Feedback
  • Wonder if the data is authentic without the 'extra' assistance; extra assistance is needed to ensure students' comprehension
  • Too much work for teachers
  • Surveys often rushed, with little notice given to complete them
  • Hard copy preferred so teachers can discuss with the class
  • Questions too hard for Preps (e.g. "seldom")
  • PSTs: "Why do students have to put their ID at the top of the survey?"
  • Answers vary depending on how students feel that day or what's happened in the yard/classroom
  • Grade 5/6 teachers had a discussion with Grade 6 female students about the results, which changed practices

  27. Key Results from the Quantitative and Qualitative Data

  28. Key Results
  What are the key results that come through the data collected?
  • 50-50 on whether it is a useful tool.
  • 50-50 on whether it actually changes teaching practice.
  • Questions not appropriate/hard for students to understand.
  • Teachers have low confidence in the results due to comprehension issues.
  • Feedback to students is not consistent.

  29. Recommendations from the ACP: Research Recommendations

  30. Recommendations
  • Conduct a review into the range and number of questions specific to each Year Level, with an eye to student engagement/motivation/understanding.
  • Foundation Level to investigate a standardised informal process for collecting data that includes explicit modelling of question context.
  • Investigate appropriate response types (vocabulary/visual cues) for each Year Level.
  • Investigate the development of explicit descriptions/role plays/videos to aid student understanding.
  • Explore the use of alternative data-collection software to improve the efficiency and effectiveness of collection and analysis, e.g. SurveyMonkey, Snapshots.

  31. Killara ACP Student Voice Teacher Survey Results, October 2013. Compiled by Victoria University Pre-Service Teachers.

  32. Recommendations
  • Get students (student leadership, perhaps?) to word the questions and answer options at the beginning of the process
  • Preps (or all years): run the survey as a series of community circle discussions
  • Suggestion box in class so students can voice opinions anonymously
  • Yes/no answers with an explanation option
  • Student leadership group to discuss the results and present to the whole school
  • Cater for different learning styles
  • Develop the survey as a team (i.e. year-level teams)
  • Buddy system so 5/6 students assist younger students with the survey

  33. Using Snapshots as an Option
  Pros
  • Whole-class focus
  • Step-by-step guidance/procedure
  • Option for confidentiality
  • Student reps involved and trained up to run the process
  Cons
  • A testing phase is needed to iron out any problems
  • It can take a while and be technical to set up (time consuming)
  • Teachers may need to be trained in the use of the devices
  Suggestions
  • Centre a couple of lessons around using the devices so students are comfortable and competent
