
Enhancing Teaching through Evidence and Data Analysis

Explore new ways of thinking about evidence and learn effective data processes. Discover the hierarchy of evidence, why rubrics are essential, and how to gather, analyse, and use data effectively.


Presentation Transcript


  1. EVIDENCE AND DATA FOR LEADERS

  2. WHAT IS IN THIS PRESENTATION • New ways to think about evidence • A data use strategy • Pre- and post-testing • Gathering evidence via rubrics • Analysing data

  3. NEW WAYS TO THINK ABOUT EVIDENCE

  4. DEVELOPMENTAL vs STANDARD MODEL
  Developmental model • Assessment is used to improve teaching • Teachers hold each other accountable based on their data and the teaching strategies they use • Teaching is targeted as much as possible – ideally individually, but even 3–5 levels is usually sufficient • Compares students to criteria and focuses on where students are ready to learn • Developmental thinking: assessment tells me where a student is in their development, and I teach them from there
  Standard model • Assessment occurs after instruction is complete • Teachers don’t question each other’s data or strategies • Teach the whole class at once, with a bit of help for the lower kids and a bit of extension for the top kids if possible • Compares students to norms and focuses on what students cannot do • Deficit thinking: students must be at a certain year-level norm, and I must correct all the deficits they have

  5. HIERARCHY OF EVIDENCE

  6. Clinical model of teaching

  7. A data use strategy

  8. A DATA PROCESS – PRE- AND POST-TESTING • Pre- and post-test students • Analyse the data • Use the data to: • improve courses • improve teaching • identify teacher effectiveness

  9. A DATA PROCESS – USING DATA FROM EXISTING ASSIGNMENTS • Get all teachers using skill-based rubrics • Get them recording the information on spreadsheets that they send to you • Collate this information • Analyse it to produce an empirical progression • Use the progression to: • improve courses • improve teaching • help students track their own growth • show students what improvement looks like
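The collation step in slide 9 can be scripted once every teacher saves their rubric spreadsheet in the same shape. Here is a minimal sketch in Python/pandas, assuming each teacher exports a CSV with one row per student, a teacher column, and one column per rubric criterion; the folder name, file pattern, and output name are illustrative assumptions, not part of the presentation.

```python
# Collate rubric spreadsheets from every teacher into one cohort table.
# Assumes each CSV has the same columns: student_id, teacher, plus one
# column per rubric criterion holding the level reached (1, 2, 3, ...).
import glob

import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("rubric_data/*.csv")]  # hypothetical folder
cohort = pd.concat(frames, ignore_index=True)
cohort.to_csv("year9_history_cohort.csv", index=False)  # hypothetical output name
print(cohort.head())
```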

  10. Pre- and post-testing

  11. Why pre- and post-test? • We should be aiming to get a year’s worth of progress for a year’s worth of input • How do we measure a year’s worth of progress? • An effect size of 0.4 or more
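The 0.4 benchmark is an effect size: the gain in the average score expressed in units of the spread of scores. A minimal worked sketch with made-up numbers, using one common formulation (gain in the mean divided by the average of the two standard deviations); treat it as an illustration of the idea rather than the presenter’s exact method.

```python
# Effect size for one class: gain in mean score divided by the average
# standard deviation of the two tests (one common formulation).
from statistics import mean, stdev

pre = [4, 5, 3, 6, 5, 4, 7, 5]    # made-up pre-test scores
post = [6, 7, 5, 8, 6, 6, 9, 7]   # made-up post-test scores

effect_size = (mean(post) - mean(pre)) / ((stdev(pre) + stdev(post)) / 2)
print(round(effect_size, 2))  # 0.4 or more suggests at least a year's growth
```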

  12. What to pre- and post-test? • Historical skills • Pick some from the curriculum • Our school has chosen four: • identifying cause and effect • using historical concepts • sequencing events • understanding historical perspectives

  13. Ask questions linked to each level

  14. Create an item bank

  15. How to administer pre- and post-tests • By hand (painful) • Microsoft Forms • Google Forms + an automatic scoring script like “Flubaroo”
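If responses end up in a spreadsheet export rather than being scored automatically, a short script can do the marking. A minimal sketch, assuming a CSV export with a student_id column and one column per question, scored against a hypothetical answer key; the file names, column names, and key are illustrative only.

```python
# Score a spreadsheet export of multiple-choice responses against an answer key.
import pandas as pd

answer_key = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C"}  # hypothetical key

responses = pd.read_csv("pretest_responses.csv")  # hypothetical form export
for question, correct in answer_key.items():
    responses[question + "_mark"] = (responses[question] == correct).astype(int)

responses["total"] = responses[[q + "_mark" for q in answer_key]].sum(axis=1)
responses[["student_id", "total"]].to_csv("pretest_scores.csv", index=False)
```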

  16. Gathering evidence via rubrics

  17. WHY USE RUBRICS? Students: • know how to get better • get higher-quality feedback on their performance • see assessment data used as information rather than as judgement. Parents: • know what their child can do, not how they compare • know the next thing their child is ready to learn • see more motivated students – especially those at the top and at the bottom

  18. WHY USE RUBRICS? Teachers: • more consistent judgements between teachers (easier moderation) • you don’t have to write as many comments • more detailed information for reporting • rewarding professional discussions between teachers • promotes development: linked to skills, not to what is “normal” • the teacher knows where a student is ready to learn and can target a teaching intervention to use with that student or group of students

  19. WRITING QUALITY CRITERIA Advice: • Use student work samples to describe the levels • Use learning taxonomies to help you come up with ideas, e.g. Bloom’s, Dreyfus, Krathwohl, SOLO. Good quality criteria should: • contain no counts of things right/wrong or pseudo-counts (some, many) • avoid ambiguous terms like “appropriate” and “suitable” • describe increasing quality, not procedural steps. Full guidelines: https://reliablerubrics.com/category/assessment-rubrics/what-is-a-rubric/guidelines/

  20. GATHERING DATA USING A RUBRIC (annotated rubric sample – levels 1–3 recorded against each criterion; image not reproduced in the transcript)

  21. RECORDING DATA ON A SPREADSHEET USING A RUBRIC
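Slide 21’s spreadsheet is not reproduced in the transcript, but the layout it implies is simple: one row per student, one column per rubric criterion, and the level reached in each cell. A minimal sketch of that layout, using the four skills from slide 12 as hypothetical column names and made-up students and levels.

```python
# One possible layout for recording rubric judgements: rows are students,
# columns are skill criteria, cells hold the level reached on the rubric.
import pandas as pd

rubric_sheet = pd.DataFrame(
    {
        "student_id": [101, 102, 103],      # made-up students
        "teacher": ["SMI", "SMI", "SMI"],   # made-up teacher code
        "cause_and_effect": [2, 3, 1],
        "historical_concepts": [2, 2, 1],
        "sequencing_events": [3, 3, 2],
        "historical_perspectives": [1, 2, 1],
    }
)
rubric_sheet.to_csv("year9_history_smith.csv", index=False)  # hypothetical file name
```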

  22. ANALYSING DATA

  23. ANALYSING PRE- AND POST-TEST DATA 1. Use a unique student identifier (ID number) 2. Use an Excel function to rule out all students that didn’t do both the pre-test and the post-test: https://exceljet.net/formula/find-missing-values 3. Calculate effect sizes for: • overall • teacher effect • year level effect https://vimeo.com/51258028
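The same three steps can also be scripted outside Excel. A minimal sketch in pandas, assuming hypothetical export files with student_id, teacher, year_level, and score columns: the inner merge on the student ID keeps only students who sat both tests, and the effect size uses the same gain-over-average-spread formulation as the earlier sketch.

```python
# Match pre and post tests by student id, keep only students who sat both,
# then compute effect sizes overall and by teacher / year level.
import pandas as pd

# Hypothetical exports: pre_test.csv has student_id, teacher, year_level, score;
# post_test.csv has student_id, score.
pre = pd.read_csv("pre_test.csv")
post = pd.read_csv("post_test.csv")

# An inner merge on the unique student id drops anyone missing either test.
both = pre.merge(post, on="student_id", suffixes=("_pre", "_post"))

def effect_size(group):
    """Gain in mean score divided by the average spread of the two tests."""
    spread = (group["score_pre"].std() + group["score_post"].std()) / 2
    return (group["score_post"].mean() - group["score_pre"].mean()) / spread

print("Overall:", round(effect_size(both), 2))
print(both.groupby("teacher")[["score_pre", "score_post"]].apply(effect_size))     # teacher effect
print(both.groupby("year_level")[["score_pre", "score_post"]].apply(effect_size))  # year level effect
```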

  24. WHAT CAN YOU DO WITH THIS DATA? • Which skills are taught well? • Figure out how they were taught and replicate that elsewhere • Which skills are taught badly? • Is this because of how they are taught or their place in the course? • Which teachers taught well? • Figure out what they do that is transferable and suggest others try that as well • Get other teachers to observe a teacher teaching a skill well • Film a teacher teaching a skill they teach well

  25. ANALYSING ASSIGNMENT DATA • Combine year level cohort into one large spreadsheet and complete a Guttman analysis on it • via the attached “Guttman Boss”

  26. ANALYSING ASSIGNMENT DATA • Combine year level cohort into one large spreadsheet and complete a Guttman analysis on it • via the attached “Guttman Boss” • Turn this into an empirically-derived learning progression
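“Guttman Boss” is the presenter’s attached tool, but the core of a Guttman analysis can be sketched independently: order students from strongest to weakest total and criteria from easiest to hardest, so the chart shows where each student’s achievement falls away. A minimal sketch, assuming the collated cohort file from the earlier collation example.

```python
# Build a simple Guttman chart: students ordered by total score (highest first),
# criteria ordered by how easy they were (highest column total first).
import pandas as pd

cohort = pd.read_csv("year9_history_cohort.csv")  # from the collation sketch
scores = cohort.set_index("student_id").drop(columns=["teacher"], errors="ignore")

# Order columns easiest-to-hardest and rows strongest-to-weakest.
item_order = scores.sum().sort_values(ascending=False).index
student_order = scores.sum(axis=1).sort_values(ascending=False).index
guttman = scores[item_order].loc[student_order]

guttman.to_csv("guttman_chart.csv")
print(guttman)
```

Reading across the reordered chart, the point where a student’s levels drop away suggests the next skill they are ready to learn, which is the basis of the empirically-derived learning progression the slide describes.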

  27. WHAT CAN YOU DO WITH THIS DATA? • Get students to track their own progress • Show students what improvement looks like • Target teaching of new skills at the right level • Design ability-based groupings and teaching material

  28. BABY STEPS

  29. Start persuading people • pre- and post-testing • online testing • using evidence to inform teaching • using assessment data formatively, not just summatively • skill-based rubrics • writing decent quality criteria in rubrics • using data

  30. Pre- and post-testing • measure just one thing? • begin with just one class / one year level / one subject? • see if other learning areas in your school do the same thing and learn from them? Rubrics • begin with just one assignment / class / year level / subject? • critique your existing ones by comparing them with the ARC guidelines

  31. Data use • Complete an empirical progression for one year level and show it off

  32. advice • Staff buy-in is extremely important • Leverage early adopters to help persuade the rest • Use Dale Carnegie’s strategies: • Try honestly to see things from the other’s point of view • Let them feel it is their idea • Praise the slightest improvement • Give the other person a fine reputation to live up to

  33. Support

  34. This presentation and associated files: https://lawlesslearning.com/free/evidencepd/ • benlawless8@gmail.com • “Assessment for Teaching”, Patrick Griffin (editor)
