Data Analysis This presentation is intended to accompany the Georgia School Council GuideBook. www.GeorgiaEducation.org
Why so many tests? • Assessments are a central part of the accountability system required by the federal No Child Left Behind Act and the Georgia A+ Reform Act of 2000. • Both laws require annual testing of all students in specific grades and subjects. See p. 2.15 of the Georgia School Council GuideBook for a chart of all required assessments in Georgia.
Purpose of Assessments • To identify those students performing below grade level, at grade level, and above grade level in order to tailor instruction to individual student needs. • To provide teachers with information to guide instruction. • To assist schools and school systems in setting priorities.
Two Kinds of Assessments Criterion-referenced tests (CRTs) measure how well the student has learned a particular curriculum. • A group of experts decides how many questions must be answered correctly for a student to pass or to receive a score in a particular category (e.g., Pass, Pass Plus). The number of correct answers required is called the cut score. • CRCTs, end of course tests (EOCTs), and graduation tests are criterion-referenced assessments used in Georgia.
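A cut score can be thought of as the minimum number of correct answers required for each performance category. The sketch below uses made-up category names and cut scores purely for illustration; real cut scores are set by expert panels, not in code:

```python
# Hypothetical cut scores: minimum number of correct answers per category.
# These values are invented for illustration only.
CUT_SCORES = {"Did Not Meet": 0, "Meets": 30, "Exceeds": 42}

def classify(correct_answers: int) -> str:
    """Return the highest category whose cut score the raw score reaches."""
    category = "Did Not Meet"
    for name, cut in sorted(CUT_SCORES.items(), key=lambda kv: kv[1]):
        if correct_answers >= cut:
            category = name
    return category

print(classify(35))  # Meets
print(classify(45))  # Exceeds
```

The point of the sketch is that a criterion-referenced score depends only on the student's answers and the fixed cut scores, never on how other test-takers performed.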
Two Kinds of Assessments Norm-referenced tests (NRTs) measure how well students perform compared to all the other students taking the test. • Scores are reported in percentiles from 1 to 99. • The 50th percentile is the median: half of the students did better; half did worse. • The Iowa Test of Basic Skills (ITBS) and the Stanford 9 are the two most commonly used NRTs in Georgia.
Compare and Contrast • Georgia’s criterion-referenced tests show how well students have mastered the curriculum adopted by the State Board of Education. • Norm-referenced tests, such as the Iowa Test of Basic Skills (ITBS), compare our students’ mastery of general knowledge to that of other students in the nation.
Discuss • Some criticize the requirement for annual assessments, saying it leads to teachers “teaching to the test.” Do you agree? • Does your answer depend on whether a CRT or an NRT is used? • If assessments were not used, how would student achievement be measured and monitored?
The National Test: NAEP • The National Assessment of Educational Progress (NAEP) is sometimes referred to as the nation’s report card. • Only certain students in certain grades in certain schools take the test. • Results are provided only at the state level. • Federal law requires states to participate in NAEP reading and math assessments in grades 4 and 8 every other year.
NAEP Is A Check and Balance • Each state adopts its own criterion-referenced test and sets the standards to be met. No Child Left Behind requires NAEP to be given to ensure states do not use easy tests or low standards. • Each state uses its own tests and standards to determine whether or not it meets the goals (Adequate Yearly Progress) of NCLB. • Comparing the percentage of students who meet the standards on the state test to the percentage of those who meet the NAEP standards can reveal the rigor of the state test.
NAEP vs. CRCT Results • NAEP scores are listed as Advanced, Proficient, Basic, and Below Basic. The expectation is that students will be Proficient or Advanced. • CRCT scores are Exceeding the Standard, Meeting the Standard, and Did Not Meet the Standard. • Because NAEP has four score categories and the CRCT has three, the two are difficult to compare directly. The fairest way is to compare the percent Below Basic on NAEP with the percent Not Meeting the Standard on the CRCT.
Comparing the percent of students who scored “Below Basic” on NAEP and “Does Not Meet Standard” on CRCT in 2003
NAEP vs. CRCT Results The closer the results are, the more likely it is that the two tests have a similar level of difficulty and similar cut scores. • 4th Grade Math has very similar results on the two tests. • 8th Grade Math is not quite as close. • The largest disparity is in 4th Grade Reading. • 8th Grade Reading also has a large difference in the results. It appears that the NAEP Reading tests are more rigorous than the Georgia CRCT in Reading.
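The comparison logic the slides describe can be sketched in a few lines of Python. The figures below are hypothetical placeholders, not the actual 2003 results; the point is the method, i.e. comparing the lowest category on each test and reading a large gap as a sign that the state test may be less rigorous:

```python
# Hypothetical figures (NOT the real 2003 data): percent of students scoring
# "Below Basic" on NAEP vs. "Does Not Meet the Standard" on the CRCT.
results = {
    "Grade 4 Math":    (28, 25),
    "Grade 8 Math":    (41, 32),
    "Grade 4 Reading": (40, 13),
    "Grade 8 Reading": (26, 17),
}

for test, (naep, crct) in results.items():
    gap = naep - crct  # positive gap: more students fail NAEP than the state test
    verdict = "similar difficulty" if abs(gap) <= 5 else "state test likely less rigorous"
    print(f"{test}: {gap:+d} point gap ({verdict})")
```

The 5-point threshold is arbitrary; the slides make the same judgment qualitatively ("very similar", "not quite as close", "largest disparity").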
Where To Find Georgia Assessment Data • Office of Student Achievement: www.gaosa.org • Georgia Department of Education: www.gadoe.org • Georgia School Council Institute: www.GeorgiaEducation.org • Georgia Public Policy Foundation: www.gppf.org
Where To Find National Data • Standard &amp; Poor’s: www.schoolmatters.org • Education Trust: www.edtrust.org • NAEP: http://nces.ed.gov/nationsreportcard/
Resources • Pp. 2.3–2.10 in the Georgia School Council GuideBook are a step-by-step guide to analyzing test scores. • The GuideBook and this presentation are based on the Georgia School Council Institute’s website; test scores are found in the Center for School Performance section at www.GeorgiaEducation.org.
Purpose of Data Analysis • Are all students learning what we expect them to know? • Which students are not succeeding? • How do we improve the achievement of all students? “That which is not measured cannot be improved.”
Three Levels of Test Data • There are three levels of test data available to the public: • State Level • System Level • School Level • Individual schools have class level and student level results.
Analyzing Test Data Begin with state level data.
Begin with State Level Data • Understanding the state statistics helps put your school and system data into perspective. • Learning the terminology helps you identify what is relevant. • First, look at the Profile Report to see the demographics of the state and the changes that are occurring.
State Level Profile Report
When you are provided data using percentages, always be clear whether you are looking at a change in percent or a change in percentage points.
Understanding the Trends When changes in percentages are listed, it is a change in percentage points, not the percent change itself. • There was a 3 percentage point increase in the share of Hispanic students (4% in 2000 and 7% in 2004). • Those 48,366 additional Hispanic students represent an 87 percent increase from 2000 to 2004. • 7% of the students in 2004: 1,486,125 x .07 = 104,029 • 4% of the students in 2000: 1,391,579 x .04 = 55,663 • Percent increase: 48,366 / 55,663 ≈ 87%
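The distinction can be checked in a few lines of Python, using the enrollment figures from the slide:

```python
# Statewide enrollment figures from the slide, 2000 vs. 2004.
students_2000, students_2004 = 1_391_579, 1_486_125
hispanic_2000 = round(students_2000 * 0.04)   # 4% of 2000 enrollment: 55,663
hispanic_2004 = round(students_2004 * 0.07)   # 7% of 2004 enrollment: 104,029

point_change = 7 - 4                                             # 3 percentage points
pct_increase = (hispanic_2004 - hispanic_2000) / hispanic_2000   # growth relative to 2000

print(f"Change: {point_change} percentage points, "
      f"but a {pct_increase:.0%} increase in students")
```

A 3-point change in share and an 87% increase in head count describe the same data; which number gets quoted dramatically changes the impression it leaves.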
Understanding the Trends • Percentage point changes tell only part of the story. • What are the demographics of the students moving into the schools? • If Georgia gained 94,546 students in five years and 48,366 were Hispanic, then 51% (48,366 / 94,546) of the new students were Hispanic. • It is as important to know what the population trends are in your school as it is to know the demographic percentages.
Discuss • The number of Hispanic students has grown tremendously, but the percent of students in the Limited English Proficient (LEP) program has remained steady. What conclusions can you draw? • What impact on student achievement could changing demographics have?
Pop Quiz • In 2000, 43% of Georgia’s 1,391,579 students were eligible for free or reduced lunch (FRL). • In 2004, 46% of the 1,486,125 students were eligible for FRL. • How many more students were eligible in 2004? • What is the percent increase in those eligible?
Answers • In 2000: 598,379 students were eligible for FRL (1,391,579 x .43 = 598,379). • In 2004: 683,618 were eligible (1,486,125 x .46 = 683,618). • 85,239 more students were eligible for FRL (683,618 - 598,379 = 85,239). • That is a 14% increase in the number of students eligible ((85,239 / 598,379) x 100 ≈ 14%).
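The same quiz arithmetic, written out in Python with the enrollment figures from the slides:

```python
# Statewide enrollment figures from the slides.
total_2000, total_2004 = 1_391_579, 1_486_125
frl_2000 = round(total_2000 * 0.43)   # 43% eligible in 2000: 598,379
frl_2004 = round(total_2004 * 0.46)   # 46% eligible in 2004: 683,618

additional = frl_2004 - frl_2000      # 85,239 more eligible students
pct_increase = additional / frl_2000  # growth relative to the 2000 count

print(f"{additional:,} more students eligible, a {pct_increase:.0%} increase")
```

Note that a percent increase always divides the change by the starting value; dividing one year's count by the other gives a ratio, not an increase.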
Analyzing Test Data Next, look at test scores for all the students at the state level.
State Level Test Scores Report
What do the numbers say? • The percent Exceeding the Standard should be going up, and the percent Not Meeting the Standard should be going down. • If the percent Meeting the Standard drops, which category is increasing? Are more students moving to a higher level or a lower level? • Are there changes in the number of students tested? • Has improvement been greater in one subject? In one particular grade? Is there a reason?
Keep in Mind • Trend information is more important than comparing one year to another. • The same group of students is not being compared. • One year’s results alone do not indicate a trend.
Look for Achievement Gaps • After looking at scores for all students, look at the scores for subgroups of students. • This is called disaggregating the data. • Federal and state law require the disaggregation and reporting of scores by ethnicity, gender, socioeconomic status, disability and migrant status.
Look for Achievement Gaps On the Test Scores Report, click on a subject under “Achievement Gap Analysis” to see the current year’s scores of each subgroup in graph form.
Look for Achievement Gaps On the Test Scores Report, click on the box labeled “View Scores by Group” and select a subgroup. You will see the disaggregated data by year. This allows you to see the trends in the scores of the subgroup.
Analyzing Achievement Gaps • Compare the scores of the different groups of students. Differences in scores reveal the achievement gaps. • Which group has the highest proportion of students performing below grade level? • Are some groups doing better than others? • Are the differences the same in every subject? In every grade?
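A gap analysis of this kind boils down to comparing each subgroup's result against the highest-performing group. The subgroup names and percentages below are hypothetical:

```python
# Hypothetical disaggregated results: percent Meeting or Exceeding by subgroup.
scores = {"All Students": 74, "Subgroup A": 82, "Subgroup B": 61, "Subgroup C": 70}

top_group = max(scores, key=scores.get)   # highest-performing subgroup
for group, pct in scores.items():
    gap = scores[top_group] - pct         # achievement gap in percentage points
    print(f"{group}: {pct}% M/E, gap of {gap} points vs. {top_group}")
```

Running the same comparison per subject and per grade answers the questions above: the gaps are rarely uniform.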
Exercise • Jones Elementary is excited because the percent of fourth grade students Exceeding the Standard in reading has increased by 16 percentage points in the last four years. • Some members of the school council are concerned because the percent of students who Did Not Meet the Standard in reading did not decrease this year. • Is this grounds for concern? • What should the school council look at to answer this question? • Is additional data needed?
What did you decide? • Results over time (4 years) are more meaningful than one year’s results. • Look at the number of students tested overall and in each subgroup. • If more students were tested, it is not necessarily grounds for concern that there was no change in the percent not meeting the standard. • If some subgroups improved, that is a positive change. If some subgroups lost ground, this may be something for the school council to keep in mind as they look at future data. Think of it as a caution light rather than a red flag.
Analyzing test scores is more than just comparing numbers. It is comparing numbers in a way that puts them in perspective and gives them meaning.
System and School Analysis • Do the same kind of analysis for the school system and your school. • Start with the Profile Report. • Look for changes in the student population. • Look at Test Score Reports for all students. • In each subject area, check the change in the percent Exceeding and the percent Not Meeting the standard. • Has the number of students being tested changed? • Compare scores to the system and state. • Check for achievement gaps.
Comparing Schools or Systems The unique part of GeorgiaEducation.org is the ability to compare test scores of schools and systems that are demographically similar. The “Similar Systems” and “Similar Schools” Reports will give you this information.
Comparing Schools You will first see a listing of your school, the state, and schools with similar demographics.
Comparing Test Scores Click “Test Scores Comparisons” to see the test scores of all the schools.
Comparing Test Scores • A disaggregation box is available. • The list can be sorted by clicking on “M/E” (Meets and Exceeds) to see the schools ranked by the highest percentage of students meeting and exceeding the standard. Clicking on “DNM” (Did Not Meet) puts the lowest achieving at the top. • Each subject can be sorted by achievement.
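The sorting behavior described above can be sketched in Python; the school names and percentages are made up for illustration:

```python
# Hypothetical comparison-report rows: percent Meets/Exceeds (M/E) and Did Not Meet (DNM).
schools = [
    {"school": "Jones Elementary", "ME": 78, "DNM": 22},
    {"school": "Smith Elementary", "ME": 91, "DNM": 9},
    {"school": "Brown Elementary", "ME": 64, "DNM": 36},
]

# Clicking "M/E" ranks the highest-achieving schools first:
by_me = sorted(schools, key=lambda s: s["ME"], reverse=True)
# Clicking "DNM" puts the lowest-achieving schools at the top:
by_dnm = sorted(schools, key=lambda s: s["DNM"], reverse=True)

print([s["school"] for s in by_me])
print([s["school"] for s in by_dnm])
```

Both sorts rank on a single column in descending order; they simply surface opposite ends of the achievement distribution.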
Comparing in Graph Form If you prefer to look at a graph, click on a subject area below “Target schools by subject.”
Comparing in Graph Form The graph will include your school and the five most similar schools.
The SAT Debate
SAT and ACT Scores: What Do They Tell Us? • Both tests are used for college admissions, but they test different skills. • The SAT is more of a critical thinking and problem solving test designed to measure a student’s potential to learn. • The ACT is a more curriculum-based test designed to measure what a student has learned.
SAT Scores • The SAT was designed to predict how well any given student would perform in his or her freshman year of college. • Because the SAT is taken by students in all 50 states, SAT scores are used by the media to rank the quality of public education in the 50 states. • Within the state, SAT scores are used to rank high schools. • Georgia’s low ranking is often attributed to the high percent of students taking the test. Is that a valid argument? • What happens if only states with similar demographics and similar participation rates are compared?
Comparison of 2004 Georgia SAT Scores to States with Similar Participation and Demographics
Are our students prepared for the SAT? • If the purpose of the SAT is to determine college readiness, students taking it should be on the college prep track. Is the number of students who receive a college prep diploma similar to the number who take the SAT? • If the percent of students eligible for a HOPE scholarship is used to estimate grade point average, what does that indicate about the preparedness of Georgia’s students?