
Data Interpretation II Workshop


Presentation Transcript


  1. Data Interpretation II Workshop – 2008 Writing Assessment for Learning

  2. Purposes for the Day – p. 2 • Deepen understanding about the writing assessment project results; • Initiate reflection and discussion among division-level staff members related to the writing assessment results; • Provide a range of tools and processes to support division-level staff in their work throughout the system related to school improvement; and, • Provide opportunity to discuss and plan around the data in the context of school improvement.

  3. Agenda • Opening • Assessment for Learning – Writing Assessment: Conceptual Framework; Comparators; The Reports • Here’s What: The Data • Processes to Support School/Division School Improvement: Changing Contexts; Building Capacity • So What?: Analysis of Data and Support Structures; Role of Central Office in Supporting School Improvement; Sustainability • Now What?: Using Goals to Inform Planning; Monitoring & Assessing Progress (Linking Goals and Assessment Data; Identifying Interrelationships; Evidence of Implementation) • Closure

  4. Magnetic Quotes • In various locations around the room are statements regarding data use and school improvement. • Take a moment to read each and then go to the sign with the statement that resonates most for you. • Create a pair or trio with your colleagues and discuss why you connect with the statement and what it means to you.

  5. Assessment for Learning is a Snapshot • Results from a large-scale assessment are a snapshot of student performance. • The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school. • The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level. Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2008)
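
The caution about group size has a simple statistical basis: the uncertainty in a group’s average score shrinks roughly with the square root of the number of students. The following minimal sketch in Python illustrates the idea; the standard deviation and group sizes are illustrative assumptions, not values from the assessment.

```python
import math

def standard_error(sd: float, n: int) -> float:
    """Standard error of a mean: uncertainty shrinks with sqrt(n)."""
    return sd / math.sqrt(n)

# Illustrative only: assume rubric scores with a standard deviation of 1.0.
sd = 1.0
for n, label in [(25, "one class"), (400, "a division"), (10000, "the province")]:
    print(f"{label} (n={n}): standard error = {standard_error(sd, n):.3f}")
# one class (n=25): standard error = 0.200
# a division (n=400): standard error = 0.050
# the province (n=10000): standard error = 0.010
```

Under these assumptions, a class-sized group carries roughly twenty times the sampling uncertainty of a provincial aggregate, which is why the slide urges caution at the school level.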

  6. Depth and Specificity of Knowledge – [Diagram: assessments arranged from Individual and Classroom, which give in-depth knowledge of specific students, through School, Division, Provincial, National and International, which give in-depth knowledge of systems but little knowledge of specific students.] From Saskatchewan Learning. (2006). Understanding the numbers.

  7. Provincial Writing Assessment: Conceptual Framework – p. 3 • Colourful Thoughts • As you read through the information on the Provincial Writing Assessment, use highlighters or sticky notes to track your thinking: Wow! (I agree with this.) Hmm! (I wonder . . .) Yikes! Adapted from Harvey, S. & Goudvis, A. Strategies that work, 2007.

  8. Comparators: Types of Referencing • Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in rubrics, this may be an area for further investigation, and for planning intervention to improve student writing. (Detailed rubrics, OTL rubrics and test items can be sourced at www.education.gov.sk.ca) • Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention. (Figures .2b, .3c, .4a, .6b, .7b and .8b.)

  9. Comparators: Types of Referencing • Experience or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (E.g., comparing these results to current school data or to the standards set by the panel.) • Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (E.g., the tables comparing the school, division and province.)

  10. Comparators: Types of Referencing • Longitudinal-referenced: Comparing how students perform relative to the performance of students in earlier years. Viewed across several years, assessment results and other evidence can identify trends and improvements. (This data will not appear until the next administration of this assessment.)

  11. Opportunity-to-Learn Elements as Reported by Students • Propensity to Learn: using resources to explore models, generate ideas and assist the writing process; motivation, attitude and confidence; participation, perseverance and completion; reflection • Knowledge and Use of Before, During and After Writing Strategies • Home Support for Writing and Learning: encouragement and interaction; access to resources and assistance

  12. Opportunity-to-Learn Elements as Reported by Teachers • Availability and Use of Resources: teacher as key resource; teacher as writer; use of curriculum; educational qualifications; professional development; time; student resources • Classroom Instruction and Learning: planning focuses on outcomes; expectations and criteria are clearly outlined; variety of assessment techniques; writing strategies explicitly taught and emphasized; adaptation

  13. Student Performance Outcome Results • Demonstration of the writing process: pre-writing; drafting; revision • Quality of writing product: messaging and content (focus; understanding and support; genre); organization and coherence (introduction, conclusion, coherence); language use (language and word choices; syntax and mechanics)

  14. Standards To help make meaningful longitudinal comparisons in future years, three main processes will be implemented. • Assessment items will be developed for each assessment cycle using a consistent table of specifications. • The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments. • Standards will be set for each of the assessment items, so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

  15. Opportunity-to-Learn and Performance Standards • In order to establish Opportunity-to-Learn and Performance standards for the 2008 Writing Assessment, three panels were convened (one from each assessed grade), consisting of teachers from a variety of settings and post-secondary academics including Education faculty. • The panelists studied each genre from the 2008 assessment in significant detail and established expectations for writing process, narrative products and expository products as well as opportunity to learn.

  16. Thresholds of Adequacy and Proficiency

  17. Thresholds of Adequacy and Proficiency – [Chart: a scale marking the Threshold of Adequacy at 1.87 and the Threshold of Proficiency at 3.92, with the Adequate band above the first threshold and Proficient & Beyond above the second.]

  18. Cut Scores On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels.
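
To make the two-band classification concrete, here is a minimal sketch in Python using the thresholds from the previous slide. Treating a student’s result as a mean rubric score, using inclusive boundaries, and labelling scores under the adequacy threshold "Below Adequate" are all assumptions made for illustration; the reports themselves express the cut scores as percentage correct.

```python
THRESHOLD_ADEQUACY = 1.87      # from the thresholds slide
THRESHOLD_PROFICIENCY = 3.92

def classify(mean_rubric_score: float) -> str:
    """Place a score into a performance band.

    Assumes a score at or above a threshold falls into that band;
    the assessment's actual boundary rules may differ.
    """
    if mean_rubric_score >= THRESHOLD_PROFICIENCY:
        return "Proficient & Beyond"
    if mean_rubric_score >= THRESHOLD_ADEQUACY:
        return "Adequate"
    return "Below Adequate"

print(classify(4.1))  # Proficient & Beyond
print(classify(2.5))  # Adequate
print(classify(1.2))  # Below Adequate
```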

  19. Four Major Categories of Data: Demographics • Local Data: descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc.; can be used to disaggregate other data by demographic variables. • AFL Opportunity-to-Learn Data: family/home support for student writing (encouragement and interaction; access to resources). Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.
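
Because the chief use of demographic data here is to disaggregate other measures, a short sketch of that operation may help. The column names and records below are hypothetical, not fields from the AFL data files; the sketch uses pandas.

```python
import pandas as pd

# Hypothetical student records: one row per student, with demographic
# variables and an outcome measure to disaggregate.
df = pd.DataFrame({
    "grade":  [5, 5, 8, 8, 11, 11],
    "gender": ["F", "M", "F", "M", "F", "M"],
    "writing_score": [3.9, 3.1, 4.2, 3.5, 4.0, 3.3],
})

# Disaggregate: mean writing score and student count for each subgroup.
summary = df.groupby(["grade", "gender"])["writing_score"].agg(["mean", "count"])
print(summary)
```

Small subgroup counts surface immediately in such a table, which ties back to the earlier caution about interpreting results for small groups of students.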

  20. Four Major Categories of Data: Student Learning • Local Data: describes outcomes in terms of standardized test results, grade averages, etc. • AFL Data: readiness-related Opportunity-to-Learn data (using resources to explore writing; student knowledge and use of writing strategies – before, during, after); student performance outcomes (Writing 5, 8, 11 – narrative and expository; writing process, writing product and the categories within). Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  21. Four Major Categories of Data: Perceptions • Local Data: provides information regarding what students, parents, staff and community think about school programs and processes. This data is important because people act in congruence with what they believe. • AFL Data: readiness-related Opportunity-to-Learn data (commitment to learn; using resources; motivation & attitude; confidence; participation; perseverance & completion; reflection; knowledge and use of writing strategies). Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  22. Four Major Categories of Data: School Processes • Local Data: what the system and teachers are doing to get the results they are getting; includes programs, assessments, instructional strategies and classroom practices. • AFL Data: classroom-related Opportunity-to-Learn data – instruction and learning (planning and reflection; expectations and assessment; focus on writing strategies; adaptations) and availability and use of resources (teacher; time; resources for students and teachers). Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

  23. Examining the Report • Take a few minutes to look through the entire AFL report. • Use the section on data in the chart in your handout package to guide your thinking and conversation. • Note three or four areas of strength and areas for improvement. [Diagram: DATA → Here’s What → So What? → Now What? → School Improvement]

  24. [Diagram: DATA → Here’s What → So What? → Now What? → School Improvement]

  25. A Changing Context – Hargreaves & Fink (2006) • Old Basics: literacy; numeracy; obedience; punctuality • New Basics: multiliteracy; creativity; communication; IT; teamwork; lifelong learning; adaptation & change; environmental responsibility

  26. Saskatchewan’s Changing Context • Old Paradigm: content-based curricula; teaching from activities; assessment sorts and selects; streaming; those who can, learn • Evolving Paradigm: balance between content and process; outcome-based learning; assessment supports learning; Adaptive Dimension; everyone learns

  27. Judith Warren Little In large numbers of schools, and for long periods of time, teachers are colleagues in name only. They work out of sight and hearing of one another, plan and prepare their lessons and materials alone, and struggle on their own to solve most of their instructional, curricular and management problems. Against this almost uniform backdrop of isolated work, some schools stand out for the professional relationship they foster among teachers. These schools, more than others, are organized to permit the sort of reflection . . . that has been largely absent from professional preparation and professional work in schools. For teachers in such schools, work involves colleagueship of a more substantial sort.

  28. Professional Learning Communities vs. Performance-Training Sects • Professional Learning Communities: transform knowledge; shared inquiry; evidence informed; situated certainty; local solutions; joint responsibility; continuous learning; communities of practice • Performance-Training Sects: transfer knowledge; imposed requirements; results driven; false certainty; standardized scripts; deference to authority; intensive training; sects of performance. Hargreaves, A. (2003). Teaching in the knowledge society: Education in the age of insecurity. New York, NY: Teachers College Press.

  29. Building Capacity for Success: Learning by Trial and Evidence • In the article, note the many examples of schools that have engaged in school improvement through a process of trial and evidence. • In all cases the focus was on the questions: • What will students learn? • How will teachers best support student learning? • What are indicators of success? • How will we measure success of the practices? • What can we do to support students not meeting expectations?

  30. Building Capacity for Success: Learning by Trial and Evidence • Characteristics of improving schools: • A focus on achievement. • Built-in monitoring and measuring. • Leadership. • Involvement of all partners. • Consideration of all students’ needs.

  31. Read an Example – Building Capacity for Success: Learning by Trial and Evidence • Find a partner; decide who is A and who is B. • Both partners read to the end of the questions on p. 7 of the text, then stop. A summarizes the reading; pairs craft examples or non-examples from their experiences. • Both partners continue reading to the end of p. 8. B summarizes the reading; pairs craft examples (or non-examples). • Read to the end of p. 9. A summarizes the reading; pairs craft examples or non-examples from their experiences.

  32. Reflection Questions • What key lessons can be taken from this reading and applied to your context? • In what ways are you gathering evidence of promising practices within your schools and school divisions? • In what ways do goals reflect your school or division’s core values and beliefs? • In what ways are you building community with your schools and school divisions? [Diagram: DATA → Here’s What → So What? → Now What? → School Improvement]

  33. [Diagram: DATA → Here’s What → So What? → Now What? → School Improvement]

  34. Double-Loop Exercise – p. 10 • As a table group, use the provided double-loop to clarify the connections between the professional structures supporting school improvement and the information you are getting from the AFL data. • In the top circle, write out the current structures (PLCs, catalyst teachers, PD) and initiatives (literacy) already in place in your division. • In the bottom circle, write down 3-5 significant indicators (strengths & areas for improvement) from the AFL data. • Draw arrows from the items in the bottom circle that are connected to or could be supported by items in the top circle. [Example diagram – top circle: PLCs; catalyst teachers; reading group; writing strategies. Bottom circle: majority of students scored at proficient in narrative writing; students aren’t reporting use of writing processes.] Lezotte, L. W. & McKee, K. M. (2006). Stepping up: Leading the charge to improve our schools. Okemos, MI: Effective Schools Products, Ltd.

  35. Double-Loop Exercise Once you have completed the diagrams, use the following questions to guide discussion at your table: • What current structures support areas where student performance was strong? • What current structures could meet the needs identified for improvement? • What new structures/initiatives may need to be considered?

  36. Please return at 12:50. [Cartoon caption: “I’d trade, but peanut butter sticks to my tongue stud.”]

  37. The Role of Central Office • Focus on Alignment • Equip staff with the knowledge and skills for aligning school improvement processes. • Provide multiple opportunities for staff to collaborate around current literature and best practices. • Model and encourage reflective practice; aligning improvement efforts requires time for reflection. Adapted from Mooney, N. J. & Mausbach, A. T. (2008). Align the design: A blueprint for school improvement. Alexandria, VA: ASCD.

  38. The Role of Central Office • Supporting Alignment Initiatives • Link data to the goals and strategies already in place. • Keep improvement goals front and center. • Engage staffs in discussion about improvement initiatives. Adapted from Mooney, N. J. & Mausbach, A. T. (2008). Align the design: A blueprint for school improvement. Alexandria, VA: ASCD.

  39. The Role of Central Office • Getting to Goal • Recognize and address alignment problems. • Support all schools in aligning improvement plans. • Sustain the plan until . . . Adapted from Mooney, N. J. & Mausbach, A. T. (2008). Align the design: A blueprint for school improvement. Alexandria, VA: ASCD.

  40. Sustainability • Sustainability is the capacity of a system to engage in the complexities of continuous improvement consistent with deep values of human purpose. (Fullan, 2004) • Sustainability does not simply mean whether something can last. It addresses how particular initiatives can be developed without compromising the development of others in the surrounding environment, now and in the future. (Hargreaves & Fink, 2000)

  41. Challenges to Sustainability – p. 11 • In your handout package is a template with five common challenges to sustainable collaborative work. • In groups of 3-6, brainstorm possible solutions for each challenge.

  42. Supporting Improvement • So what meaning are you making about this information about data and school improvement? [Diagram: DATA → Here’s What → So What? → Now What? → School Improvement]

  43. Goals to Inform Planning – p. 12 • What are your school or division goals? • What structures and supports do you have in place to support sustainable improvement towards those goals? • What kinds of data are you gathering to inform decision making and progress? [Diagram: DATA → Here’s What → So What? → Now What? → School Improvement]

  44. Progress Measure Areas – p. 13 • Goal Types: improvement goals; proficiency goals • Assessing Progress: student data (short-term, medium-term, long-term); evidence of implementation. From Boudett, City, & Murnane (2005) and Holcomb (2004).

  45. Improvement and Proficiency • Growth: improvement refers to students’ growth on a given assessment within a specified period of time. A student or group of students may experience great growth but still fall short of set proficiency goals. • Competence: proficiency refers to how many students will achieve a certain level of performance within a specified period of time. Proficiency goals don’t measure student growth – they measure how many students have reached a set standard or benchmark. Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
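
The distinction is easy to see in a worked example. This minimal sketch uses hypothetical scores and borrows the 3.92 proficiency threshold from earlier in the deck as the benchmark:

```python
# Hypothetical pre/post scores for five students on the same scale.
pre  = [1.5, 2.0, 2.8, 3.6, 4.0]
post = [2.4, 2.9, 3.5, 4.0, 4.1]
BENCHMARK = 3.92  # assumed proficiency benchmark for this sketch

# Improvement (growth): average change per student.
growth = sum(b - a for a, b in zip(pre, post)) / len(pre)

# Proficiency (competence): share of students at or above the benchmark.
proficient = sum(score >= BENCHMARK for score in post) / len(post)

print(f"average growth: {growth:+.2f}")             # +0.60
print(f"proficient: {proficient:.0%} of students")  # 40%
```

Here the weakest writers grow the most, yet the proficiency rate stays at 40%: strong progress on an improvement goal can coexist with a missed proficiency goal, which is why attending to both matters.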

  46. Improvement and Proficiency • Attending to both improvement and proficiency ensures that students grow academically and achieve degrees of competence in their studies. • Thinking of growth and competence compels us to consider in what ways all students will grow (weak, average, and gifted) and what levels of competence are desired for all. Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

  47. Goal Types • In what ways do your school division goals reflect improvement? • In what ways do your school division goals reflect proficiency?
