
Just for the Kids: FWISD Data Workshop, Fort Worth, TX, April 13, 2006


Presentation Transcript


    2. NCEA Sponsoring Organizations. 1995: JFTK founded by Tom Luce to use student achievement data to raise academic standards and promote proven practices for all students. 2001: JFTK, ESC, and UT took JFTK School Reports to other states and began Best Practice work.

    3. Just for the Kids School Reports

    4. Broad Foundation: Best Practice research in 20 states. The Broad Prize for Urban Education, now in its 4th year, is the largest prize for public schools, awarded to the district making overall improvement and closing the achievement gap. Previous winners: Houston, Long Beach, Garden Grove. 2005 finalists: NYC, San Francisco, Norfolk, Boston, Aldine.

    5. The Broad Prize for Urban Education

    6. The Broad Prize for Urban Education

    7. 2005 TBEC JFTK Honor Roll Schools

    8. 2005 TBEC Honor Roll Schools

    9. 2005 JFTK Consistently Higher Performing Elementary Schools

    10. Graduation Rates

    11. Education Pipeline

    12. High student mobility is often cited in the United States—especially in urban schools—as a contributing factor to poor student achievement. Recognizing the potential problems of mobility, the Department of Defense schools standardized their curriculum in 1994 so that a student moving from Fort Knox, Kentucky to Aviano, Italy or Mannheim, Germany can expect to encounter the same learning standards and instructional resources in the classroom. This curricular alignment is cited as the reason these schools score consistently at the top of the scale on NAEP and have virtually no achievement gaps.
    Schools typically have one of two situations: teachers believe they have a responsibility to decide what is important for students to know and be able to do, independent of other teachers, the district curriculum, or the state testing program; or teachers understand standards and accountability and believe it is important to coordinate to accomplish goals. Learning is critically dependent on students' relevant prior knowledge.

    13. Why use the Commended standard?


    17. JFTK School Improvement Service

    18. Just for the Kids

    19. JFTK SIS Overview

    21. The dentist analogy was written by a superintendent! He didn't understand the basic premises of the standards and accountability movement.
    It is important to understand that gaps in achievement DO occur when children are tested on material that has not been taught in the classroom. Norm-referenced tests typically contain 30%+ material that has not been taught in the classroom. Any UNALIGNED test is going to show quite predictable gaps in knowledge. These gaps can in large part be explained by four factors outside the control of the school. Tests that rely to any extent on knowledge acquired outside the school will ALWAYS show gaps because of the variance in children's learning opportunities outside of school. It is this quite expected phenomenon that leads to the next giant leap to wrongheadedness: that the same gap is therefore predictive of, or evidence of, children's ability to learn IN SCHOOL.
    I've heard many good analogies to further illustrate the concept, and I thought I would share just one. Let's assume that you grew up in a family where a parent thought it was critical for you to be able to change a tire and taught you accordingly. Now let's assume that I grew up in a family that didn't own a car. The two of us show up on our first day of school and discover that an ability test will be administered. The test? We must change a tire. The results are quite predictable. However, the difference in results has absolutely nothing to do with my capacity to learn or to perform at the same proficiency as you. But I may need an extra session or two to be taught to change the tire. Most importantly, you must absolutely ensure I have this knowledge before proceeding to another skill that is based upon my ability to change a tire.

    22. Because in systems where children are tested over material that has been taught in the classroom, variance in achievement can be almost completely accounted for by factors within the control of the school system. Or, as Meredith Phillips, coauthor of The Black-White Test Score Gap, states: just because schools didn't cause the problem doesn't mean they can't solve it.
    An interesting note: James Popham of the University of California is one of my favorite authorities on testing. He just wrote an interesting article charging that some state tests, even though aligned to curricular objectives, can be as instructionally insensitive as the nationally standardized tests. His concern is that an overwhelming number of curricular objectives can only be slightly measured in limited testing time. Therefore, teachers end up guessing which of the multitude of content standards will actually be assessed in any given year. He strongly encourages assessment of a much more focused set of curricular aims, detailed and lucid descriptions of those aims for teachers, and much more instructionally useful reports.

    23. There is no debate. A systemic failure to teach all children the knowledge they need in order to understand what the next grade has to offer is the major source of avoidable injustice in our schools. (E.D. Hirsch)

    24. No silver bullet exists. Narrowing the achievement gap requires a comprehensive set of strategies that are interdependent and crafted to meet local needs. I would like to begin building the structure for understanding and studying this locally influenced, interdependent, comprehensive set of strategies. We must agree on some lens, visual tool, or organizer to be able to consider the unique characteristics of the practices of multiple districts that are closing the gap. I am going to refer to that tool as a framework from this time forward. Let's agree that we will first consider practices or behaviors—that is, the first layer of our framework requires that we talk only about observable behaviors—not attributes, beliefs, climate, or expectations—observable practices.
    We believe three questions need to be addressed when beginning to build this organizer. We need to know what practices separate high-performing districts from others. In addition, we need to know if the school level at which the practice is managed is related to student achievement. And finally, we need to know about the linchpins of the system.

    25. Consistently High-Performing Schools: nearly 500 schools in the past 3 years (HP and AP). Half of the schools are elementary schools. More Texas schools than any other state.

    28. Before we review the actual practices within each theme, however, we will establish a second dimension of organization for those practices: the school level. We know that three school levels exist in every school system—the district, the school, and the classroom. Within each theme, we are going to study the role of each of these levels as related to student achievement. As shown in this slide, we will see that the assignment of effective practices to specific levels may be as important as the practices themselves. Different levels of the system must be involved to differing degrees to reach maximum effectiveness in the specific theme area. Once again, the optimal level of involvement has been determined from the cumulative review of sustained high-performing school systems.
    This slide also helps us to understand why teachers often state they are overwhelmed with standards-based reform, or why teachers in average-to-low-performing districts tell us that they preach absence as the optimal district role. You see, if the district doesn't play the correct role, the practices required for student success don't change. And since the scores are much more personalized at the school and classroom level, it is the teacher and school staff who need to fill in for this absence. In an even worst-case scenario, they may have to respond to additional school and/or district actions unrelated to student success. So, while a school or students in a particular grade may demonstrate success without one or more of the practices present, or without the involvement of a particular school level, that success requires far greater stress on other practices and levels and is much less likely to be sustained long term.

    29. JFTK SIS Overview

    30. Scale Scores 101

    32. Why scale scores? Each TAKS test has a different level of difficulty. Knowing that a student answered 75% of the test correctly does not tell you whether they passed. To correct for this, TEA establishes the % correct required to pass each test individually, then places this % correct on a common scale across all TAKS tests.
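The mapping described above can be sketched numerically. This is a simplified linear illustration, not TEA's actual procedure (real TAKS scaling is derived from item response theory), and the common passing scale score of 2100 and the slope are assumptions made for the example:

```python
def to_scale_score(pct_correct, passing_pct, passing_scale=2100, pts_per_pct=10):
    """Place a raw percent-correct onto a common scale (illustrative linear
    map; the passing cut score of 2100 and the slope are assumed).
    Whatever percent correct a given test requires to pass, that cut
    always lands on the same scale score."""
    return passing_scale + (pct_correct - passing_pct) * pts_per_pct

# Two tests of different difficulty: one needs 60% correct to pass, the
# other 70%. The same raw 75% yields different scale scores on each.
easier = to_scale_score(75, passing_pct=60)   # 2250
harder = to_scale_score(75, passing_pct=70)   # 2150
```

This is why a raw 75% correct is uninformative on its own: on the harder test it sits much closer to the passing cut than on the easier one.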

    33. What do scale scores communicate? Knowing a student's scale score tells you whether they passed and/or were Commended on that test. Knowing a grade-level average scale score tells you whether, on average, the students in that grade passed and/or were Commended.

    34. How does TEA set the Passing standard on each exam? Passing standards are related to two issues: 1. the difficulty of the test questions; 2. the number of questions students must answer correctly in order to pass the test.

    35. What are the scale scores associated with Passing and Commended?

    36. What standard does NCEA use to compare schools? Opportunity Gap Analysis: Elementary School, % Commended (2400); Middle School, % Commended (2400) or % Proficient (2300); High School, % Proficient (2300). High-Performing Schools Analysis: for the elementary, middle, and high school analyses, schools are compared by the average scale score of their continuously enrolled students on each TAKS test.
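The comparison standards on this slide reduce to a small lookup. The cut scores (Commended 2400, Proficient 2300) come from the slide; the table name and helper function are illustrative only:

```python
# Standards from the slide above; the dict name and helper are illustrative.
OPPORTUNITY_GAP_STANDARDS = {
    "elementary": [("Commended", 2400)],
    "middle":     [("Commended", 2400), ("Proficient", 2300)],
    "high":       [("Proficient", 2300)],
}

def standards_met(scale_score, school_level):
    """List which comparison standards a scale score reaches
    for a given school level."""
    return [name for name, cut in OPPORTUNITY_GAP_STANDARDS[school_level]
            if scale_score >= cut]
```

For example, a middle-school scale score of 2350 meets the Proficient standard but not Commended, while the same score meets no standard in the elementary analysis, which uses Commended only.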

    37. JFTK School Reports: Gap Analysis

    46. JFTK School Reports: Opportunity Gap Bar Chart Activity

    50. JFTK School Reports: Top Comparable Schools Activity

    54. JFTK School Reports: Multi-Year Summary Activity

    55. JFTK School Reports: Consistency Analysis

    57. High-Performing Schools Website: consistent high performance using TAKS data from 2003, 2004, and 2005, for every grade and subject; controls for student demographics (% low income, % ELL, size of school, etc.) and for prior academic performance (middle and high schools).

    58. Best Practice: State Studies

    59. Best Practice: State Studies

    60. Who are the consistently high-performing schools?

    61. Best Practice: State Studies

    62. How does my school compare to the consistently high-performing schools?

    63. How close is my school to being a consistently high-performing school?

    65. How does my school compare to the consistently high-performing schools?

    66. Dan D. Rogers – 4th Grade Math, 2005

    67. Dan D. Rogers – 4th Grade Writing, 2005

    68. Explaining the high-performing criteria

    70. Individual performance ranks. Individual performance ranks compare similar schools on one particular TAKS test. We use your campus demographics and the demographics of all the schools in the state to calculate an expected performance level for your school on a particular TAKS test.

    71. Individual performance ranks: Requirement

    72. Individual performance ranks: Requirement

    73. Individual performance ranks: Calculation

    74. Individual performance ranks: Expectation - Distance - Rank

    77. In the actual high-performing analysis, the expected average scale score is calculated with consideration of additional demographic variables on your campus, not just the school's low-income level. The distance between your actual performance and this more complex expected value is calculated in the same manner as in the previous example.

    78. Your distance from our expectation is then ranked among the distances of all other schools in the same range of percent low income. There are four low-income ranges: schools with 0-25% low income; 25-50%; 50-75%; and 75-100%.

    79. The school most outperforming its expected performance receives a rank of 99, meaning it outperformed our expectation better than 99% of the schools being ranked. The school most underperforming its expected performance receives a rank of 0, meaning it outperformed our expectation better than 0% of the schools being ranked.
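The banding-and-ranking step on the last two slides can be sketched as follows. The function names are invented, and the rank formula (percent of ranked schools a campus outperformed, rounded down) is one plausible reading of the 0-to-99 scale described above; the top school's rank reaches 99 only when enough schools are being ranked.

```python
def low_income_band(pct):
    """Assign a campus to one of the four low-income ranges."""
    if pct <= 25:
        return "0-25%"
    if pct <= 50:
        return "25-50%"
    if pct <= 75:
        return "50-75%"
    return "75-100%"

def performance_ranks(distances):
    """Given {school: distance-from-expectation} for one band, return
    {school: rank}, where rank is the percent of ranked schools the
    campus outperformed (0 for the lowest, up to 99 for the highest)."""
    ordered = sorted(distances, key=distances.get)  # ascending by distance
    n = len(ordered)
    return {school: 100 * i // n for i, school in enumerate(ordered)}
```

Ranking within a band, rather than statewide, is what makes the comparison fair: a 75-100% low-income campus is measured only against campuses facing similar demographics.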

    81. Reviewing the individual performance rank requirement

    82. Overall performance rank: Requirement

    83. Overall performance rank: Calculation. To calculate the overall performance rank, we average together all of your school's distances from our expectation for a particular subject, across multiple years and grades of TAKS tests. We then rank this average subject distance against schools in the same low-income range, exactly as we did for a single distance with the individual performance rank.
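The averaging step above amounts to a simple mean over every grade/year distance in one subject; the numbers below are invented purely to show the arithmetic.

```python
from statistics import fmean

# Hypothetical distances from expectation for one subject (math) across
# grades 4-5 and years 2003-2005 (six grade/year cells; numbers invented).
math_distances = [12.0, -3.0, 8.0, 15.0, 0.0, 4.0]

# Overall subject distance: average all grade/year distances. This single
# number is then ranked within the school's low-income range, exactly as
# an individual distance would be.
overall = fmean(math_distances)   # 6.0
```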

    84. Opportunity gaps: Requirement

    85. AYP and participation rates: Requirement

    86. AYP and participation rates: Requirement

    87. JFTK Self-Audits

    88. JFTK Self-Audits

    89. JFTK Self-Audits

    90. JFTK Self-Audit

    91. JFTK Self-Audit

    92. The World is Flat – Thomas L. Friedman

    93. Disney Channel – Higglytown Heroes

    94. Scott Hyten, CEO, Wild Brain

    95. Standards and Innovation
