Reporting college and career readiness results to the public
DQC Public Reporting Task Force | January 9, 2014
Objectives
• Today’s webinar is designed to address several questions:
• What are college- and career-ready (CCR) indicators, and to what extent do states report them to the public?
• What considerations should states weigh when reporting CCR indicators?
• What are the trends across states, and what issues are emerging?
CCR indicators fall along a continuum of readiness
Source: Adapted from Measures that Matter: Making College and Career Readiness the Mission of High Schools, Achieve and the Education Trust, 2008
States that use multiple CCR indicators in a variety of ways signal a commitment to readiness
Only one state, Florida, reports all categories of CCR indicators to the public
Data Source: Achieve, Closing the Expectations Gap 2013, www.achieve.org/ClosingtheExpectationsGap2013
Achieve has published several resources to provide guidance to states
Source: www.achieve.org/public-reporting
Some guidance for calculating CCR indicators
• The way states calculate CCR indicators matters for results.
• Indicators should be criterion-referenced where possible (e.g., “percent of students meeting the CCR benchmark” rather than an average score) to better capture changes in readiness.
• Denominators should include all students, preferably the full graduating cohort (e.g., the 2012-13 graduating cohort rather than only the students who took an assessment). This improves the stability of the indicator and gives a fuller picture of readiness across the school; a sketch of the calculation appears after this list.
• This may mean that states will need to work with data providers to refine the way they receive data.
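To make the denominator guidance concrete, here is a minimal Python sketch contrasting the two calculations. The record fields (took_assessment, met_ccr_benchmark) and the sample figures are hypothetical, not drawn from any state’s data system.

```python
# A minimal sketch of the denominator guidance above. The record fields
# are hypothetical -- real state data systems will use their own schemas.

cohort = [
    # 2012-13 graduating cohort: some students never sat for the assessment.
    {"student_id": 1, "took_assessment": True,  "met_ccr_benchmark": True},
    {"student_id": 2, "took_assessment": True,  "met_ccr_benchmark": False},
    {"student_id": 3, "took_assessment": True,  "met_ccr_benchmark": True},
    {"student_id": 4, "took_assessment": False, "met_ccr_benchmark": False},
    {"student_id": 5, "took_assessment": False, "met_ccr_benchmark": False},
]

met = sum(1 for s in cohort if s["met_ccr_benchmark"])

# Test-takers-only denominator: overstates readiness when many students
# never take the assessment.
takers = sum(1 for s in cohort if s["took_assessment"])
print(f"Of test takers: {met / takers:.0%} met the benchmark")    # 67%

# Full-cohort denominator (recommended above): every student in the
# graduating cohort counts, whether or not they tested.
print(f"Of full cohort: {met / len(cohort):.0%} met the benchmark")  # 40%
```

The gap between the two figures (67% versus 40% here) is exactly why the denominator choice matters: the test-takers-only rate can rise simply because fewer students test.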
EXAMPLE: North Carolina reports the percent of all 11th grade students meeting ACT benchmarks
Source: North Carolina ACT and WorkKeys Data Sets, http://www.dpi.state.nc.us/docs/accountability/reporting/act-results1213.pdf
Reporting techniques can build understanding and raise the sense of urgency
• States can use a number of strong techniques:
• Reporting the number of students as well as percentages (see the sketch after this list)
• Building in comparisons: vertical comparisons such as school to district to state; horizontal comparisons such as school rankings or showing where a school’s performance lies along a spectrum; or trends over time
• Highlighting disparities among student groups
• Some data and functionality may need to live online (along a spectrum of static to interactive reports), while other data can translate to a paper report that might be given to parents
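The sketch below illustrates two of these techniques together: pairing counts with percentages, and building in a vertical school-to-district-to-state comparison. All figures are invented for illustration.

```python
# A minimal sketch of two reporting techniques: counts alongside
# percentages, and vertical (school -> district -> state) comparison.
# The figures are invented, not real report-card data.

rows = [
    # (level, students meeting benchmark, total students in cohort)
    ("School",      45,    120),
    ("District",   610,   1175),
    ("State",    52300,  85700),
]

for level, met, total in rows:
    # "45 of 120 (38%)" is more concrete than "38%" alone, especially
    # for small schools where percentages can swing widely year to year.
    print(f"{level:<9} {met:>6,} of {total:>7,} students ({met / total:.0%})")
```

Showing the raw counts also guards against over-reading small-school results, where a handful of students can move the percentage by several points.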
EXAMPLE: Illinois reports the percent of students meeting ACT benchmarks with vertical comparisons to the district and state
Source: http://illinoisreportcard.com
EXAMPLE: Indiana compares CCR outcomes across student groups
Source: Indiana COMPASS reports, http://compass.doe.in.gov/dashboard/graduates.aspx?type=state
EXAMPLE: Indiana compares school to state and district performance and trends
Source: Indiana COMPASS reports, http://compass.doe.in.gov/dashboard/collegereadiness.aspx?type=state
EXAMPLE: Michigan displays remediation data over time – and by student subgroup
Source: Michigan School Data, https://www.mischooldata.org/DistrictSchoolProfiles/PostsecondaryOutcomes/IheEnrollmentByHighSchool.aspx
EXAMPLE: Maryland includes both the percent and number of students graduating with CCR courses of study
Source: 2013 Maryland Report Card, http://www.mdreportcard.org/HighSchoolCompletionOther.aspx?PV=38:12:30:0338:3:N:0:13:1:2:1:1:1:2:3
EXAMPLE: Massachusetts DART shows the number and percent of students graduating with MassCore requirements over time
Source: Massachusetts DART system, http://www.doe.mass.edu/apa/dart/
EXAMPLE: Texas uses student numbers to explain graduation rates
Source: Texas 2012 Campus Graduation Summary, http://ritter.tea.state.tx.us/acctres/completion/script/2012/campus.html
EXAMPLE: Australia’s MySchool shows student performance along a spectrum of similar schools’ results
Source: Australian Curriculum, Assessment, and Reporting Authority, http://www.myschool.edu.au/
States can also use other techniques to better present the data in context
• Adding “judgments” can enhance understanding of performance patterns:
• Traffic-lighting: color-coding results into categories such as red, yellow, and green (a sketch follows this list)
• Presenting performance data against goals and benchmarks
• Ratings or classifications: these may include those used in the state accountability system, or may be defined separately for measures used only in the report card
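Here is a minimal sketch of the traffic-lighting idea: mapping a school’s percent-meeting-benchmark against cut points. The 60/80 thresholds are invented for illustration; a state would set its own cut points, typically tied to its accountability targets.

```python
# A minimal sketch of "traffic-lighting" a CCR indicator. The cut
# points (60 and 80) are hypothetical, not from any state's system.

def traffic_light(percent_meeting: float,
                  yellow_cut: float = 60.0,
                  green_cut: float = 80.0) -> str:
    """Classify a percent-meeting-benchmark result as red, yellow, or green."""
    if percent_meeting >= green_cut:
        return "green"
    if percent_meeting >= yellow_cut:
        return "yellow"
    return "red"

# Invented example schools, to show how the categories read at a glance.
for school, pct in [("School A", 84.2), ("School B", 66.0), ("School C", 41.5)]:
    print(f"{school}: {pct:.1f}% meeting benchmark -> {traffic_light(pct)}")
```

The value of the color categories is speed of interpretation: a parent scanning a report card can spot a red result without first learning what a “good” benchmark rate is.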
EXAMPLE: Kentucky shows actual scores against performance targets
Source: 2013 Kentucky School Report Cards, http://applications.education.ky.gov/src/DeliveryTargetGraph.aspx
There are a few trends across states
• Far better visibility and functionality
• Enhanced engagement with stakeholders, including focus groups
• Greater influence of accountability systems on public reporting than in previous years: district and school report cards are becoming the primary way SEAs report data to the public
• Less top-level reporting of student subgroup results
• More states are using “combined” indicators
• More transitions labeled “coming in 2015”
Emerging issues
• How will states leverage reporting from new assessments aligned to CCR standards to answer critical questions from parents, policymakers, and the public?
• How will states collaborate across agencies and sectors to get the right data to the right people at the right time?
• How might states use public reporting as a strategy to meet goals for students?