1. Performance-based Standards (PbS) for Youth Correction and Detention Facilities:
A System for Continuous Improvement
NDTAC Webinar
Thurs, July 21, 2005
2. PbS: Measuring performance to improve Ned Loughran, Executive Director, Council of Juvenile Correctional Administrators (CJCA)
CJCA
National non-profit organization dedicated to improvement of youth correctional services, incorporated in 1994
Unites nation's youth correctional CEOs to promote leadership for juvenile justice
Projects: New Directors Seminar, MacArthur Foundation Model Systems Project, OJJDP Mental Health Model
Committed to expanding the adoption of PbS as a best practice to improve conditions of confinement
3. Presentation Overview How PbS was developed
The scope of PbS: Standards and Outcomes
How PbS works
Examples
www.pbstandards.org for more information
4. History of Juvenile Justice First court 100 years ago
Goal: to treat children differently from adults, recognizing differences in development and capabilities
Pendulum swing: punishment, rehabilitation
Most recently: transfer laws, building new facilities
Influx of youths with mental health problems, rise in female offender population
5. The cycle of juvenile institutions Since the opening of the first facility in 1846, the institution has been the program of choice for juvenile offenders
Institutions have cyclical lives: initial calm; overcrowding due to crackdown on crime; deterioration and violence; media event; government investigation; Blue Ribbon Commission; reforms and back to calm.
PbS is a cycle that breaks the cycle.
6. Before PbS We've counted facilities and youths:
Census of Juveniles in Residential Placement counts facilities and number of youths within the facilities annually
Most recent report: 2,980 facilities (1,197 public, 1,774 private and 9 tribal) holding 104,413 residents (82%) who met all the inclusion criteria for the census:
Younger than 21.
Charged with an offense or court-adjudicated for an offense.
In residential placement because of that offense.
7. We didn't measure performance What went on behind razor-wire fences was ignored, unknown and largely avoided; public perception was formed by the press, horrific incidents and the "super predators" label
Existing standards and accreditation were:
Process- and policy-based; didn't measure performance
Pass / fail
Three-year cycle
Not about improvement
Data usually meant someone was about to sue!
Recidivism is not an accurate measure of effectiveness
8. 1994 Conditions of Confinement Study Studied about 1,000 secure facilities and found substantial and widespread deficiencies:
High rates of youths and staff getting hurt
High rates of suicidal behavior
Few timely or professional health screenings
High levels of staff turnover
Adherence to existing standards did not mean a better facility
9. 1995: PbS launched to address conditions-of-confinement (COC) problems OJJDP selected the Council of Juvenile Correctional Administrators (CJCA) to direct and develop the project
Guiding principles, the first key to sustaining success:
Set standards at highest level of performance, not minimums
Facilities should be places we'd feel comfortable sending our own children
Gradually transfer knowledge, skills and ownership of PbS from project to field
Field driven to be meaningful and useful: feedback
Start with all stakeholders, related agencies at the table
Create meaningful and user-friendly performance measures
10. Development Process Inclusive, comprehensive and a loop for feedback
Advisory Board of representatives from 10 major organizations set the framework, goals and standards
A working group for each functional area (Safety, Order, Security, Health/Mental Health, Programming, Justice and Reintegration), made up of experts, practitioners and researchers, developed outcome measures to indicate performance related to the standards
Pilot, revise, field test feedback loop
11. Struggle: Performance outcomes vs. process indicators Difficult to avoid process creep
You are what you count
Count only what you report
Report everything back to users
Some things cannot be translated into numbers; you need people
Processes, policies are a foundation but do not demonstrate effectiveness
Outcomes are measures; outcomes don't measure, they are the consequence of activities
12. Performance outcomes - examples Rates of injuries as indication of level of safety
Percentage of youths improving math and reading scores from pre-test to post-test indicating effectiveness of education program
Number of instances youths are placed in isolation and the average duration, describing the behavior management system and sense of order
Interviews with youths and staff about their perception of safety, reported as the percentages who report fear, describe the quality of life for youths and staff (a computation sketch follows this list)
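These examples boil down to simple rates and percentages. The sketch below is a minimal Python illustration; the counts, the per-100-person-days normalization and the function names are assumptions for illustration, not the official PbS formulas or denominators.

```python
# Minimal sketch of the kinds of outcome measures listed above.
# All inputs and the per-100-person-days normalization are
# illustrative assumptions, not the official PbS definitions.

def injury_rate(injury_count: int, person_days: int) -> float:
    """Injuries per 100 person-days of confinement (assumed denominator)."""
    return 100.0 * injury_count / person_days

def percent_improved(pre_scores: list[float], post_scores: list[float]) -> float:
    """Share of youths whose post-test score exceeds their pre-test score."""
    improved = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)
    return 100.0 * improved / len(pre_scores)

def percent_reporting_fear(responses: list[bool]) -> float:
    """Share of survey respondents (youths or staff) who report fearing for their safety."""
    return 100.0 * sum(responses) / len(responses)

# Made-up numbers, purely to show the arithmetic:
print(injury_rate(6, 2700))                                 # ~0.22 injuries per 100 person-days
print(percent_improved([210, 230, 250], [225, 228, 260]))   # ~66.7% of youths improved
print(percent_reporting_fear([True, False, False, False]))  # 25.0% report fear for their safety
```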
13. Keys to successful development of national standards and performance outcomes: Include all major stakeholders from the beginning: guiding principles established and adopted from the outset; keep advisors informed.
Listen to the users/field: build a structure to collect continual feedback from the field and use it to make revisions; when they see you're listening, they are encouraged to adopt.
Take advantage of technology: worth any early resistance, fear and training investment.
Provide in-person technical assistance as much as possible: PbS site consultants and two full-time staff answer calls and emails; no one implementing is left alone or stuck.
Work to meet the needs of users/the field, not just drop standards on them and leave.
14. The innovation of PbS: For the first time, staff, managers and directors know from data what is going on in facilities and how they perform compared with national standards and other facilities
PbS provides the facilities and agencies with a picture of their performance in the context of improvement, with:
Tools: easy-to-read bar graph reports to identify the good and the not-so-good;
A roadmap of practices and ways to improve; and
Monitoring over time to demonstrate accountability and effectiveness through changing performance outcomes
15. Project Elements A set of seven goals and 27 standards to assess:
Safety
Order
Security
Health and mental health
Programming
Justice
Reintegration
Performance toward meeting each standard is measured using one or more outcome measures, which are compared over time and to other participating facilities.
16. Example: Safety Goal: To engage in management practices that promote the safety and well-being of staff and youths.
Standards:
Protect youth and staff from intentional and accidental injuries
Promote management practices and behavior that minimize harm resulting from the use of restraints, isolation and environmental risks;
Protect youth and staff from fear.
Outcome Measures: Number of injuries to youths; number of injuries to youths by other youths; incidents of suicidal behavior with and without injury by youths; percent of youths and staff reporting that they fear for their safety.
17. Outcome report: Injuries to youths by other youths
18. Example: Order Goal: To establish clear expectations of behavior and an accompanying system of accountability for youths and staff that promote mutual respect, self discipline and order.
Standards:
Maximize responsible behavior by youth and staff and conformance to facility rules;
Minimize the facilitys use of restrictive and coercive means of responding to disorder;
Maximize opportunities for youths to participate in activities and programs.
Outcome Measures: Incidents of youth misconduct; use of physical restraint; use of mechanical restraint; use of isolation or room confinement; and average duration of isolation or room confinement.
19. Outcome report: Physical restraint use
20. Measuring performance PbS outcome measures adhere to the definition of performance outcomes: rates, frequencies and numbers that show change in status, occurrence or prevalence
PbS outcomes are measured twice a year to reflect change from one data collection period to the next
PbS reports performance to users in several ways: performance over time, performance in comparison to the field average of other participating facilities, performance on outcomes targeted for improvement and performance on outcomes critical to safe and effective operations (a comparison sketch follows)
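To picture what such a report conveys, the sketch below compares one outcome measure for a single facility across two data collection periods and against a field average. All values, period labels and the layout are hypothetical; actual PbS reports are generated from the collected site data.

```python
# Illustrative comparison of one outcome measure over time and against the
# field average of participating facilities. All values are made up.

facility = {"Apr 2004": 4.1, "Oct 2004": 3.2}        # e.g., an injury rate
field_average = {"Apr 2004": 3.8, "Oct 2004": 3.6}

for period, value in facility.items():
    avg = field_average[period]
    status = "below" if value < avg else "at or above"
    print(f"{period}: facility {value:.1f} vs. field average {avg:.1f} ({status} the field average)")

first, last = list(facility)[0], list(facility)[-1]
print(f"Change from {first} to {last}: {facility[last] - facility[first]:+.1f}")
```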
21. Demonstrating performance: The PbS continuous improvement cycle:
22. Data Collection Administrative Form
1 per site, 46 questions
Incident Reports
All incident reports for data collection period
Youth Record
30 random youth records, 93 questions
Youth Climate Survey
30 random youths, 38 questions
Staff Climate Survey
30 random staff, 38 questions
Youth Exit Interview
All youths released since last data collection, 24 questions
Ongoing data entry (a sampling sketch follows)
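To make the sampling plan above concrete, here is a minimal sketch that records the instruments and sample sizes and draws a random sample of records. The dictionary layout, function name and roster are illustrative assumptions; actual PbS data entry is done through the project's own tools.

```python
import random

# The instruments and sample sizes listed above, captured as a simple structure.
# Question counts come from the slide; the layout itself is illustrative.
INSTRUMENTS = {
    "Administrative Form":  {"sample": "1 per site", "questions": 46},
    "Incident Reports":     {"sample": "all reports for the collection period", "questions": None},
    "Youth Record":         {"sample": "30 random records", "questions": 93},
    "Youth Climate Survey": {"sample": "30 random youths", "questions": 38},
    "Staff Climate Survey": {"sample": "30 random staff", "questions": 38},
    "Youth Exit Interview": {"sample": "all youths released since last collection", "questions": 24},
}

def draw_sample(ids: list[str], n: int = 30) -> list[str]:
    """Draw up to n records at random; smaller facilities simply use everyone."""
    return list(ids) if len(ids) <= n else random.sample(ids, n)

# Example: pick 30 youths for the climate survey from a hypothetical roster.
roster = [f"youth-{i:03d}" for i in range(1, 121)]
print(len(draw_sample(roster)), draw_sample(roster)[:3])
```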
23. Site Reports Divided Between Areas:
Safety, Order, Security, Health/ Mental Health, Programming, Reintegration
Corrections:
105 outcomes
Detention:
59 outcomes
Field Averages
24. Components of a Site's FIP (Facility Improvement Plan)
FIP Status
Targeted Outcome Measures
What is the problem?
Action Steps
Progress Notes
Ongoing review of Facility Improvement Plans (a representation sketch follows)
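The components above map naturally onto a simple record. The dataclass below is a hypothetical way to represent a FIP for illustration only; it is not the schema used by the PbS software.

```python
from dataclasses import dataclass, field

@dataclass
class FacilityImprovementPlan:
    """Hypothetical representation of the FIP components listed above."""
    status: str                          # FIP status, e.g. "draft" or "active"
    targeted_outcomes: list[str]         # outcome measures targeted for improvement
    problem_statement: str               # "What is the problem?"
    action_steps: list[str]              # concrete steps the facility will take
    progress_notes: list[str] = field(default_factory=list)
    last_reviewed: str = ""              # date of the most recent ongoing review

plan = FacilityImprovementPlan(
    status="active",
    targeted_outcomes=["Order 9: average duration of isolation"],
    problem_statement="Isolation used as routine behavior management",
    action_steps=["Rewrite isolation regulations", "Retrain staff on de-escalation"],
)
plan.progress_notes.append("Average duration down from the prior collection period")
print(plan.status, plan.targeted_outcomes)
```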
25. Level System
26. Critical Outcome Measures 31 outcome measures distributed over the areas of Safety, Security, Order, and Health and Mental Health
Deal with issues of staff and youth injuries, suicidal behavior, abuse, neglect, restraints, assaults, fear for safety, confinement, contraband, health and mental health screenings
Available as a report for each site
27. Example: Overuse of isolation in NJ New juvenile agency based on adult model
New director looking to improve performance
The PbS report gave him the data and information on what needed to be improved
Average time in isolation was off the chart
Decided to institute change, worked with unions, rewrote regulations, designed staff training
Outcome measure: average time spent in isolation
Changes over time:
257 hours October 2002 to
29.2 hours in October 2004; working to eliminate isolation (see the calculation sketch below)
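As a quick check on the size of that change, the drop from 257 hours to 29.2 hours is roughly an 89% reduction; the short calculation below shows the arithmetic.

```python
# Relative reduction in average time spent in isolation, Oct 2002 to Oct 2004.
before, after = 257.0, 29.2            # hours, from the figures above
reduction = (before - after) / before
print(f"{reduction:.1%} reduction")    # -> 88.6% reduction
```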
28. Order 9: Average Duration of Confinement: Corrections compared to Detention Field Average
29. Order 9: Average Duration of Confinement: New Jersey Training School
30. Order 9: Average Duration of Confinement: New Jersey Juvenile Medium Secure Facility
31. Order 9: Average Duration of Confinement: New Jersey Female Secure and Intake Facility
32. PbS Website and email address www.pbstandards.org
help@pbstandards.org