Facilitating Ongoing Data Collection and Collaborative Conversations to Foster/Focus on School Improvement
Presenters: Sylvia De La Pena and Betty Damiani
Data: We All Have It! Now What?
On Your Own: "Collect Your Thoughts"
Table Activity #1 - Talk It Up: "Share Your Thoughts"
Session Descriptor: This session will provide participants with data tools, strategies, processes, and guiding questions to facilitate data collection and data-driven discussions within the campus and LEA, shift thinking toward STAAR, and foster campus data literacy. The presenters will show how they have used data in their various PSP roles to focus the campus on the road to school improvement.
How Do We Keep the Data Conversation and Thinking Going?
Tip #1 - On the Road Checklist. Presenters will share how they:
• Facilitate data-driven dialogue with campus staff.
• Recognize the different levels of data literacy and provide support for continuous data literacy growth.
• Create templates and activities to collect a variety of data.
• Examine and modify current tools to address the needs of STAAR.
• Use guiding questions to facilitate data analysis, dialogue, reflection, and school process examination.
• Display and use data walls to spark dialogue and focus school improvement efforts.
• Value working with a PSP Learning Partner for continuous growth.
Tip #2 - Set Your GPS
Facilitation Skills + Guiding Questions + Tools = Ongoing Data Conversation and Collaboration
Fostering Data Literacy
Recognize the different levels of data literacy and provide support for continuous data literacy growth.
Data Location
• Teachers in case study schools generally were adept at finding information shown explicitly in a table or graph.
Data Comprehension
• Teachers in case study schools sometimes had difficulty responding to questions that required manipulating and comparing numbers in a complex data display (e.g., computing two percentages and comparing them).
• Some case study teachers' verbal descriptions of data suggested that they failed to distinguish a histogram from a bar graph or to consider the difference between cross-sectional and longitudinal data sets.
Data Interpretation
• Many case study teachers acknowledged that sample size affects the strength of the generalization that can be made from a data set and suggested that any individual student assessment administration may be affected by ephemeral factors (such as a student's illness).
• Case study teachers were more likely to examine score distributions and to think about the potential effect of extremely high or low scores on a group average when shown individual students' scores on a class roster than when looking at tables or graphs showing averages for a grade, school, or district. An implication of this finding is that teachers will need more support when they are expected to make sense of summaries of larger data sets as part of a grade-level, school, or district improvement team.
• Case study teachers' comments showed a limited understanding of such concepts as test validity, score reliability, and measurement error. Without understanding these concepts, teachers are susceptible to invalid inferences, such as assuming that any student who has scored above the proficiency cutoff on a benchmark test (even if just above the cutoff) will attain proficiency on the state accountability test.
Data Use for Instructional Decision Making
• Many case study teachers expressed a desire to see assessment results at the level of subscales (groups of test items) related to specific standards and at the level of individual items in order to tailor their instruction. After years of increased emphasis on accountability, these teachers appeared quite sensitive to the fact that students will do better on a test if they have received instruction on the covered content and have had their learning assessed in the same way (e.g., same item format) in the past.
• Many case study teachers talked about differentiating instruction on the basis of student assessment results. Teachers described grouping strategies, increased instructional time for individual students on topics they are weak on, and alternative instructional approaches.
Question Posing
• Many case study teachers struggled when trying to pose questions relevant to improving achievement that could be investigated using the data in a typical electronic system. They were more likely to frame questions around student demographic variables (e.g., "Did girls have higher reading achievement scores than boys?") than around school variables (e.g., "Do student achievement scores vary for different teachers?").
Excerpt from U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports, Washington, D.C., 2011.
Process of Transforming Data Into Knowledge
Adapted from Keeping Teachers in the Center: A Framework of Data-Driven Decision Making, Daniel Light, Education Development Center, Inc., Center for Children and Technology, USA, 2004.
Table Activity #2
Take 3 minutes to reflect on where along the continuum your campus is most comfortable and tends "to rest."
Data and Data Tools
Create templates and activities to collect a variety of data.
Proposed Road Trip Norms for Looking at Data - Why Is This Important?
Table Activity #3
• Generate a list of norms for looking at data.
• Discuss why this is important.
Proposed Norms for Looking at Data - Why Is This Important?
• Describe only what you see; just describe the data in front of you.
• Resist the urge to immediately work on solutions.
• Seek to understand differences.
• Ask questions when you don't understand.
• Surface the lenses and experiences you bring to the data.
• Surface assumptions and use data to challenge and support them.
What Data Might Be Considered at the Beginning of the 2011 School Year for STAAR Preparation?
TAKS or Pre-STAAR Beginning-of-Year Data Review
Types of Data
• 2011 TAKS Performance (Field Guide – Knezek)
  - Item Analysis (Student Expectations)
  - All Students
  - Commended
  - Economically Disadvantaged
  - LEP
  - Teacher-Level Data
• Pearson Reports
• PEIMS Report
• End-of-Year Grade Report
• Preliminary AYP Data
• Preliminary AEIS Data
• 2-Year TAKS Performance
Data Skills
• Location
• Comprehension
• Interpretation
Tip #5 - Be Prepared for the Unexpected: Ongoing Data Review
Watch: Student Groups and Student Expectations! Will this change with STAAR?
• Campus assessments
• District benchmarks
• Grades
• Failure reports
• Discipline data
• Attendance data
• Dropout data
Table Activity #4: Case Study
Use the Question Template to Facilitate Data Analysis
What is the story being told by the case study data? Why do you think the story played out this way? Will the campus be ready for STAAR? Why or why not?
Facilitate Data-Driven Dialogue
• Use guiding questions to facilitate data analysis, dialogue, reflection, and school process examination.
You don’t need an advanced degree in statistics and a room full of computers to start asking data-based questions about your school, and using what you learn to guide reform. - Victoria Bernhardt
Using TAKS/Benchmark Guiding Questions to Shift from TAKS to STAAR
Table Activity #5
1. Using the case study data, which guiding question(s) could be used to facilitate data-driven discussion?
2. How can some of these questions be changed into reflective questions?
Data Triangulation
Help campuses examine data through multiple lenses.
It is irresponsible for a school to mobilize, initiate, and act without any conscious way of determining whether such expenditure of time and energy is having a desirable effect. - Carl Glickman
Tip #6 - Know Your Road Trip Emergency Contacts and Resources
• Campus Support
• LEA Support
• ESC Support
• PSP Support
• SES
• Funding
Some Final Thoughts
• Data Walls: Display and use data to spark dialogue and focus school improvement efforts.
• Professional Reading: Foster the development of professional learning communities around data through professional articles and book studies.
• Collaborate: Value working with a PSP Learning Partner for continuous growth.
Location, Location, Location (It's More Than Just Real Estate)
• Data Room
• Data Wall
• War Room
• Situation Room
Thanks for sharing and participating with us this afternoon. We hope you take away an idea or two to help you better serve in your role as a PSP.
Your PSP Learning Partners, Betty and Sylvia
Presenter Contact Information
sdlp@satx.rr.com
bettyal.damiani@yahoo.com
References
• Allen, David, and Tina Blythe. The Facilitator's Book of Questions. New York: Teachers College Press, 2004.
• Bernhardt, Victoria. Data Analysis, 2nd ed. New York: Eye on Education, Inc., 2004.
• Cartoons. "Digesting the Data." This Week in Education, 2010-11. http://scholasticadministrator.typepad.com
• DuFour, Richard, et al. Learning by Doing: A Handbook for Professional Learning Communities at Work. Bloomington, IN: Solution Tree Press, 2006.
• Evidence Project Staff. A Collaborative Approach to Understanding and Improving Teaching and Learning. Cambridge, MA: Harvard Project Zero, 2001.
• Sagor, Richard. Guiding School Improvement with Action Research. Association for Supervision and Curriculum Development, March 2000: 3-7.
• School Improvement Resource Center. Principal Planning Guides: A Powerful Resource for Principals. http://www.sirctexas.net/resources_forms.htm
• Texas Education Agency. Accountability Monitoring Intervention Activities: Focused Data Analysis Template. http://www.tea.state.tx.us/index2
• U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports. Washington, D.C., 2011.
• Vashisht, Rashmi. "The Importance of Triangulation and Multiple Measures for Data Assessment." http://datacenter.spps.org/ Rashmi Vashisht.ppt
• Wellman, Bruce, and Laura Lipton. Data-Driven Dialogue: A Facilitator's Guide to Collaborative Inquiry. Connecticut: MiraVia, LLC, 2004.