ND Community Call: Data Dashboards, Part 2
February 19, 2013
Dashboards vs. Report Cards
What's a dashboard?
• A navigation system that graphically represents current program performance, highlighting key areas of strength and weakness, and that can predict or forewarn, at a glance, when programs are not on track to meet program performance goals
• Supports decision-making
Essential Steps
1. Define program priorities
2. Explore existing data
3. Map current and potential data sources
4. Select performance indicators
5. Set performance targets and threshold criteria
6. Conceptually group indicators
7. Design the dashboard interface
8. Develop the dashboard
9. Implement the dashboard
Step 4: Select Performance Indicators
Things to Consider:
• Good dashboards need good data; good data are accessible, clean, timely, comprehensible, and actionable (a quick way to check these qualities is sketched below)
• Types of indicators (e.g., inputs, outputs, leading, lagging, student level, teacher level, classroom level, school/facility level, district level)
• The inclusion of leading indicators that correlate with lagging indicators
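To make the "good data" criteria concrete, here is a minimal sketch of automated quality checks, assuming a pandas DataFrame of facility-level records with hypothetical columns `facility_id`, `report_date`, and the indicator of interest (the column names and the 90-day freshness window are illustrative, not from the slides):

```python
import pandas as pd

def check_indicator_quality(df: pd.DataFrame, indicator: str, max_age_days: int = 90) -> dict:
    """Basic 'good data' checks for one indicator column.

    Hypothetical schema: df has 'facility_id', 'report_date', and the indicator
    column; adjust the names and thresholds to your own data sources.
    """
    report_dates = pd.to_datetime(df["report_date"])
    return {
        # clean: share of non-missing values
        "share_non_missing": df[indicator].notna().mean(),
        # timely: most recent report is no older than max_age_days
        "is_timely": (pd.Timestamp.today() - report_dates.max()).days <= max_age_days,
        # accessible/actionable: how many facilities are actually represented
        "facilities_reporting": df["facility_id"].nunique(),
    }

# Usage (hypothetical data):
# quality = check_indicator_quality(records, "course_completion_rate")
```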
Leading and Lagging Indicators
Leading indicators are outputs and short-term outcomes:
• Demonstrate signs of growth or change in a given direction, suggesting early wins and areas of improvement
• Provide an early read on progress toward long-term outcomes
• Measure conditions that are prerequisite to the desired outcomes (i.e., predict lagging indicators)
Lagging indicators are long-term or desired outcomes:
• Measure the success and consequences of activities that have already occurred
• Measure achievement of the desired outcomes
A simple way to test a leading/lagging pairing against your own data is sketched below.
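A leading indicator is only useful if it actually moves ahead of the lagging outcome, so it is worth checking the relationship on historical data. A minimal sketch, assuming a hypothetical facility-by-quarter panel with columns `facility_id`, `quarter`, a leading column such as `course_completion_rate`, and a lagging column such as `cte_certificates_earned`:

```python
import pandas as pd

def leading_lagging_correlation(df: pd.DataFrame, leading: str, lagging: str,
                                shift_periods: int = 1) -> float:
    """Correlate this period's leading indicator with a later period's lagging
    indicator within each facility. Column names here are hypothetical."""
    df = df.sort_values(["facility_id", "quarter"]).copy()
    # Align each facility's lagging outcome with the leading value from shift_periods earlier
    df["lagging_next"] = df.groupby("facility_id")[lagging].shift(-shift_periods)
    return df[leading].corr(df["lagging_next"])

# Usage (hypothetical data): a clear positive value suggests course completion
# gives an early read on certificates earned
# r = leading_lagging_correlation(panel, "course_completion_rate", "cte_certificates_earned")
```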
Leading and Lagging Indicators
On the next slide, identify the leading indicators and their corresponding lagging indicators.
Leading and Lagging Indicators
• Teacher turnover rate
• Number of youth who earn a CTE certificate
• Course completion rate
• Number of youth who begin a technical trade while in aftercare
• Number of disciplinary incidents
• Hours of professional development
Step 4: Select Performance Indicators
What kind of indicator is each of the following, and why? Any caveats?
• Graduation rate
• Enrollment rate
• GED enrollment rate
• Number of CTE certificates awarded
• Number of CTE certificates earned
• Recidivism rate
• Types of CTE courses offered
• Number of CTE courses offered
• Per pupil spending
• Number of youth served
• Percentage of HQT by FTE
• Bed count
• High school transcript
• Average SAT/ACT score
• Course completion rate
Step 5: Set Performance Targets and Threshold Criteria
Things to Consider:
• In terms of your priorities, where do you want your subgrantees and facilities to be in one year? Two years? Three years?
• What performance benchmarks might you set to measure their progress along the way?
• How will you know when to target a subgrantee or facility for technical assistance? At what point might you sound the alarm? (One way to encode such thresholds is sketched below.)
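One way to encode targets and threshold criteria so a dashboard can flag subgrantees automatically is a simple classification rule. This is a sketch with made-up targets and an illustrative 90%-of-target alert floor; the actual numbers should come from your own priorities:

```python
import pandas as pd

# Hypothetical targets and alert threshold; replace with your own criteria
TARGETS = {"graduation_rate": 0.80, "course_completion_rate": 0.85}
ALERT_FLOOR = 0.90   # sound the alarm below 90% of the target

def performance_status(value: float, target: float) -> str:
    """Classify one facility's indicator value against its target."""
    if value >= target:
        return "on track"
    if value >= target * ALERT_FLOOR:
        return "watch"
    return "alert: consider targeted technical assistance"

# Usage on a hypothetical DataFrame of current facility values:
# df["grad_status"] = df["graduation_rate"].apply(
#     lambda v: performance_status(v, TARGETS["graduation_rate"])
# )
```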
Step 6: Conceptually Group Indicators
Things to Consider:
• How might you categorize your selected indicators in a way that makes it easier for you to identify subgrantees/facilities that are not meeting your performance targets? (One simple encoding of a grouping is sketched below.)
• Demographics?
• Outcomes (academic vs. transition vs. behavioral)?
• Facility features and characteristics?
• Staffing?
• Priorities?
• Common administrative challenges?
• Common program implementation problems?
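In practice, a conceptual grouping can be captured as a simple mapping from category to indicator columns, which the dashboard then uses to lay out its panels. The categories and assignments below are illustrative only:

```python
# Illustrative grouping of indicators into dashboard categories;
# adjust the categories and assignments to your own priorities
INDICATOR_GROUPS = {
    "academic_outcomes": ["graduation_rate", "course_completion_rate", "average_sat_act_score"],
    "transition_outcomes": ["cte_certificates_earned", "ged_enrollment_rate"],
    "behavioral_outcomes": ["recidivism_rate", "disciplinary_incidents"],
    "facility_characteristics": ["bed_count", "number_of_youth_served", "per_pupil_spending"],
    "staffing": ["teacher_turnover_rate", "pct_hqt_by_fte", "professional_development_hours"],
}

def indicators_for(category: str) -> list[str]:
    """Return the indicator columns that belong on a given dashboard panel."""
    return INDICATOR_GROUPS.get(category, [])
```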
Step 6: Conceptually Group Indicators
How might you group the following indicators?
• Graduation rate
• Enrollment rate
• GED enrollment rate
• Number of CTE certificates awarded
• Number of CTE certificates earned
• Recidivism rate
• Course completion rate
• Average SAT/ACT score
• Types of CTE courses offered
• Number of CTE courses offered
• Per pupil spending
• Number of youth served
• Percentage of HQT by FTE
• Bed count
• High school transcript
Step 7: Design the Dashboard Interface
Things to Consider:
• The KISS ("keep it simple, Sally") principle applies
• Display high-level information that the user can understand
• No extraneous or irrelevant details
• No meaningless color coding, variety, or decorative elements
• Data without context is trivia: what data are essential to tell the story visually (i.e., without narration or analysis)?
Step 7: Design the Dashboard Interface
Things to Consider:
• Choose the right display
• Tabular (spreadsheet), graphical, or some combination?
• Bar chart, pie chart, gauge, map, or time series graph?
• Highlight important data at a glance
• Emphasize important data by its position on the dashboard
• Emphasize important data through visual attributes such as color intensity, size, and line width
• All dashboard data should be visible on a single screen without scrolling (a single-screen layout is sketched below)
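As a concrete illustration of a single-screen layout, here is a minimal matplotlib sketch with a few panels and a target line for context. The column names (`facility`, `graduation_rate`, `course_completion_rate`, `quarter`, `recidivism_rate`) and the 0.80 target are hypothetical; the point is the layout, not the specific charts:

```python
import matplotlib.pyplot as plt
import pandas as pd

def draw_dashboard(current: pd.DataFrame, trend: pd.DataFrame) -> None:
    """Lay out a simple 2x2, single-screen dashboard.

    Hypothetical inputs:
      current: one row per facility with 'facility', 'graduation_rate',
               'course_completion_rate'
      trend:   one row per quarter with 'quarter', 'recidivism_rate'
    """
    fig, axes = plt.subplots(2, 2, figsize=(12, 7))
    fig.suptitle("Program Performance Dashboard", fontsize=14, fontweight="bold")

    # Panel 1: graduation rate by facility, with a dashed target line for context
    axes[0, 0].bar(current["facility"], current["graduation_rate"])
    axes[0, 0].axhline(0.80, linestyle="--", label="target")
    axes[0, 0].set_ylim(0, 1)
    axes[0, 0].set_title("Graduation rate")
    axes[0, 0].legend()

    # Panel 2: course completion rate, same scale so the panels are comparable
    axes[0, 1].bar(current["facility"], current["course_completion_rate"])
    axes[0, 1].set_ylim(0, 1)
    axes[0, 1].set_title("Course completion rate")

    # Panel 3: recidivism over time (time series)
    axes[1, 0].plot(trend["quarter"], trend["recidivism_rate"], marker="o")
    axes[1, 0].set_title("Recidivism rate by quarter")

    # Panel 4: reserved for a short summary of alerts rather than decoration
    axes[1, 1].axis("off")
    axes[1, 1].text(0.5, 0.5, "Alerts / summary", ha="center", va="center")

    fig.tight_layout()
    plt.show()
```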
Dashboard Interfaces
For Discussion:
• What do you like and not like about these dashboard interfaces?
Next Steps
Homework:
• One-on-one follow-up call to discuss homework and finalize indicators, threshold criteria, and conceptual groupings
• Collect and submit sample data (scrubbed of any personally identifiable information) associated with these indicators
Next Steps
For Discussion:
• What resources do you have available to support the development and implementation of a dashboard?
• Human?
• Financial?
• Technical/technological?
• For our next call, would a hands-on tutorial on Excel and/or another decision support tool like Tableau be helpful?