The Board & Student Achievement New Jersey School Boards Association March 2, 2013 presented by Dr. Tracey Severns
Introductions – Who am I? Background Check • Teacher • Vice Principal & Principal • Superintendent • Researcher • Presenter • Student • Chief Academic Officer for the NJDOE
Who are you? • What district do you represent? • Why are you here?
Defining Success What is your definition of a great school? Make it short and measurable.
Milton Chen’s definition is… The kids run in faster than they run out. (and so do the faculty!)
Consider this… I think that if we changed _____________________, our students’ scores would improve.
Without data, it’s just an opinion. Opinions may be your most important data!
What opinions do you suffer? • Special ed kids are better served in special ed classes. • Grouping students by ability improves student achievement. • Having one teacher, all day, is the best way to teach elementary school. • The students fail because they don’t care.
Leaders must use data to: • Evaluate progress and performance • Establish goals and mobilize efforts • Leverage resources • Inform practice • Guide decision-making • Measure, Monitor & Market results
Today, we’re going to • Examine the role of BOE members in using data to improve student achievement. • Learn to ask questions of the data.
Establish a baseline With regard to student achievement: • What data do you have? • What data do you use? • Who uses the data? • For what purpose are the data used? • What data do you need?
Identifying the Data Barriers • What gets in the way of using data in schools and school districts? • What are the obstacles?
Data Sources and Key Results • Student performance (classroom quizzes/tests, lab reports, projects, pre/post tests, GPA, performance assessments, standardized tests (norm-referenced and criterion-referenced), PSAT, SAT, ACT, AP, report card grades, portfolio pieces, writing assessments, promotion/graduation rates, discipline records, college acceptance, G&T, BSI, honors classes, advanced courses, honor/high honor roll, scholarships, awards, record at competitions/championships)
Data Sources and Key Results • Demographic data (enrollment and performance by race, gender, SES, ELL, special education, migrant) • Climate (exit/entrance interviews, surveys, attendance, extracurricular participation, passage of referendums/school budgets) • Resources (personnel, computers, connectivity, time, space, revenues, expenditures)
When working with data, use three reference points. • How are we doing compared to standard? (Proficiency) • How are we doing compared to ourselves? (Progress) • How are we doing compared to others? (Relative performance)
Performance Targets According to the ESEA Waiver: Targets are set in annual equal increments so that within six years the percentage of non-proficient students in the “all students” group and in each subgroup is reduced by half.
Huh? If 40% of “all students” are Proficient: • 100 – 40 = 60 (100%P – current %P = gap) • 60 / 2 = 30 (gap divided by 2 = target % increase in 6 yrs) • 30 / 6 = 5 (6-yr target divided by 6 = annual target % increase)
And so… • For this school, the expected performance rates would be: • Yr 1 45%P • Yr 2 50%P • Yr 3 55%P • Yr 4 60%P • Yr 5 65%P • Yr 6 70%P
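The target arithmetic above can be sketched in Python. This is an illustrative sketch of the waiver formula as the slides describe it, not official NJDOE code; the function name is our own.

```python
def esea_targets(current_pct_proficient, years=6):
    """Annual proficiency targets under the ESEA waiver formula:
    the non-proficient percentage is cut in half in equal annual
    increments over six years."""
    gap = 100 - current_pct_proficient        # percent non-proficient
    annual_increase = (gap / 2) / years       # equal annual increments
    return [current_pct_proficient + annual_increase * y
            for y in range(1, years + 1)]

# The slide's example: a school where 40% of "all students" are Proficient.
print(esea_targets(40))  # [45.0, 50.0, 55.0, 60.0, 65.0, 70.0]
```

Note how the formula yields gentler annual steps for schools that start closer to 100% proficiency, since the gap being halved is smaller.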
More on Performance Targets • Targets were based on 2010-2011 data. • This process was repeated for each subgroup with an n > 30. • High-performing subgroups can meet expectations by achieving 90%P (95%P in 2015).
Question Does this process affect every subgroup equally?
How are we doing compared to Standard in 5th grade language arts?
How are we doing compared to Standard and Ourselves in Language Arts?
How are we doing compared to Standard and Ourselves in Math?
How would you define comparable? • DFG • % FARMS • % ELL • % ELL at home • % Special needs • Student mobility • Teacher mobility • Class size • Cost per pupil • Total enrollment • Instructional hours • Student/Faculty ratio • Student/Admin ratio
Where do you stand? School Digger • www.schooldigger.com – ranks all NJ public elementary, middle and high schools by adding each school’s average ASK Math and LA scores. • Includes a 5 star system to designate schools in the top 10% of the ranking
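The ranking method the slide describes can be sketched as follows. This is a hypothetical illustration, not SchoolDigger's actual code; the school names and scores are invented.

```python
# Each entry: (school name, avg ASK Math score, avg ASK LA score) -- made up.
schools = [
    ("School A", 230.1, 215.4),
    ("School B", 245.0, 240.2),
    ("School C", 210.7, 205.9),
    ("School D", 255.3, 250.1),
    ("School E", 220.5, 225.8),
]

# Rank by the sum of the two averages, highest first.
ranked = sorted(schools, key=lambda s: s[1] + s[2], reverse=True)

# Schools in the top 10% of the ranking earn the 5-star designation.
cutoff = max(1, len(ranked) // 10)
five_star = {name for name, *_ in ranked[:cutoff]}

for name, math, la in ranked:
    stars = " *****" if name in five_star else ""
    print(f"{name}: {math + la:.1f}{stars}")
```

A board looking at such rankings should remember the comparability questions from the previous slide: a single summed score ignores DFG, poverty, mobility, and every other context factor.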
Coping with Education Statistics “There are three kinds of lies: lies, damned lies and statistics.” “Sometimes we accept statistics because we are not in a position to challenge them. Other times we accept them because we lack the time to ferret out the truth.” - Gerald Bracey
Simpson’s Paradox • Has nothing to do with Homer. • Beware of changes in groups over time when the aggregate data show one pattern and the disaggregated data show the opposite.
Consider this… SAT Scores 2005: Mean = 480. SAT Scores 2011: Mean = 478. At a BOE meeting, people demand to know, “Why are SAT scores dropping?” But are they?
Let’s examine the data
SAT Scores 2005: 500, 500, 500, 500, 500, 500, 500, 500, 400, 400 (Mean = 480)
SAT Scores 2011: 510, 510, 510, 510, 510, 510, 430, 430, 430, 430 (Mean = 478)
First, we need to understand that In 2005, the 500s represent scores of white students and 400s represent scores of black students. In 2011, the 510s represent scores of white students and 430s represent scores of black students.
What do you notice? White students’ scores went up 10 points. Black students’ scores went up 30 points. but In 2005, 80% were white, 20% were black. In 2011, 60% were white, 40% were black.
And so… • Although the SAT scores for both groups increased, the overall mean decreased because there was a higher percentage of minority students taking the test. • Thus, beware of shifts in subgroup proportion and performance over time.
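A quick Python sketch with the slides' hypothetical scores confirms the arithmetic: both subgroups' scores rise, yet the overall mean falls because the subgroup mix shifts.

```python
# 2005: 80% white (scores of 500), 20% black (scores of 400).
scores_2005 = [500] * 8 + [400] * 2
# 2011: 60% white (510, up 10 points), 40% black (430, up 30 points).
scores_2011 = [510] * 6 + [430] * 4

mean_2005 = sum(scores_2005) / len(scores_2005)
mean_2011 = sum(scores_2011) / len(scores_2011)

print(mean_2005)  # 480.0
print(mean_2011)  # 478.0 -- lower, even though every subgroup improved
```

The aggregate mean is a weighted average, so shifting weight toward a lower-scoring (but improving) subgroup can pull the overall number down.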
Simpson’s Paradox at work… SAT scores by ethnic group: • White: 519 (1995), 529 (2005), gain +10 • Black: 412, 433, +21 • Asian: 474, 511, +37 • Mexican: 438, 453, +15 • Puerto Rican: 437, 460, +23 • Am Indian: 471, 489, +18 • All Students: 504, 508, +4
Imagine this. • Your superintendent has just presented these results. • Write down what you are thinking.
Root Cause Analysis Why are we doing better? To what do we attribute the results?
Revealing the Root Cause Root cause analysis is the process of identifying the underlying cause, or causes, of positive or negative outcomes within a system. – Paul Preuss
In other words… Why did subgroups perform as they did? Possibilities include: • Organizational issues (time, availability of programs, personnel or support services) • Instructional/implementation issues (curriculum, instruction, assessment) • Environmental issues (external forces or factors that may have influenced results)
Data Analysis • What trends do you find in the data? • To what would you attribute the results? • What questions come to mind when you review the data? • What recommendations would you make to improve student performance?