Performance Measurement and the OJJDP Data Collection Tool presented at the OJJDP National Grantee Orientation April 6–7, 2010
CSR’s DCTAT & Performance Measurement Team • Agnes Cholewa • Ashley Hayward • Mary Leonard • Elizabeth Logan • Ursula Murdaugh • Monica Robbers • Matt Watson
Outline
• Requirements
• Performance Measurement
• Data Collection
• Reporting Performance Measurement Data to OJJDP
Requirements
Projects are required to:
• Collect and report performance measurement data
• Participate in an OJJDP DCTAT training session
• Submit a report on these data to OJJDP semiannually
Performance Measures
• Concerned with collecting information to determine whether a program achieved its goals and objectives
• Information from performance measurement is used to improve the operation of the program
• Inputs, outputs, and outcomes are collected and reported
Performance Measurement vs. Evaluation

Feature    | Performance Measurement | Evaluation
Question   | How much?               | What does it mean?
Example    | Game score              | Game analysis
Offers     | A tally                 | Causality
Timeframe  | Continuous (ongoing)    | Interval (discrete)
Cost       | Less expensive          | More expensive

Performance measurement is necessary, but not sufficient, for evaluation.
Performance Measurement and Data Collection
• Performance measures and data collection are building blocks of evaluation
• They provide hard proof of what your program is doing, how, when, and why
• Documentation supports sustainability efforts
Specifically, performance measurement:
• Strengthens accountability
• Enhances decision-making (helps governments and communities determine effective resource use)
• Improves customer service
• Supports strategic planning and goal setting
Federal Initiatives on Performance Measurement
• Government Performance and Results Act (GPRA, 1993): shift from accountability for process to accountability for results; programs must show effectiveness to justify funding
• Federal agency rating of programs
• President's agenda: "Transparency and accountability a priority"
• Several State-level efforts also in place
Funding and Information Flows
[Diagram: funding flows from Congress and OMB to OJJDP, to grantees, and on to communities; performance information flows back in the opposite direction.]
History of Performance Measurement at OJJDP: A Brief History
• 2002 – JABG performance measures developed
• 2004 – DCTAT opened for JABG data reporting
• 2004/2005 – PART reports to Congress for JABG and Title V juvenile justice programs included quantitative performance data
• 2006 – DCTAT opened for Title V and Formula Grants data reporting, and for TYP, EUDL, BG, T-JADG, and discretionary grant data reporting
Office of Juvenile Justice and Delinquency Prevention
Mission/Purpose:
• Authorizing legislation is the Juvenile Justice and Delinquency Prevention Act of 2002
• Focus is on helping States and localities to respond to juvenile risk behavior and delinquency
• Primary function of the agency is to provide program grant funding and support research and technical assistance/training
• Long-term goal is prevention and reduction in juvenile crime and victimization
Diversity of Programs
• Formula and Block Grants for States
• Tribal Youth Programs
• Discretionary Competitive Programs
• Enforcing Underage Drinking Laws (Block and Discretionary Grants)
• Victimization Grants (AMBER Alert, Internet safety)
• Congressional Earmark Grants
OJJDP Funding
OJJDP generally funds 4 types of programs/projects:
• Direct-Service Prevention
• Direct-Service Intervention
• System Improvement
• Research and Development
Development of Core Measures for OJJDP Programs
• A small number of measures that directly link to OJJDP's core mission
• Comparability within and across programs
• A focus on quality services and youth outcomes
Evidence-Based Programs*
27% of discretionary grantees are implementing one or more evidence-based programs (July–December 2009 reporting period).
*Definition: Programs and practices that have been shown, through rigorous evaluation and replication, to be effective at preventing or reducing juvenile delinquency or victimization, or related risk factors. Evidence-based programs or practices can come from many valid sources (e.g., Blueprints for Violence Prevention, OJJDP's Model Programs Guide). Evidence-based practices may also include practices adopted by agencies, organizations, or staff that are generally recognized as "best practice" based on the research literature and/or the degree to which the practice is based on a clear, well-articulated theory or conceptual framework for delinquency or victimization prevention and/or intervention.
OJJDP’s “Behavior” Measure Options
Percentage of program youth who exhibit a desired change in the targeted behavior (several options; select the most relevant behavior; a calculation sketch follows after this list):
• Substance use
• Social competence
• School attendance
• GPA
• GED
• High school completion
• Job skills
• Employment status
• Family relationships
• Family functioning
• Antisocial behavior
• Gang-related activities
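Each behavior option above reduces to the same arithmetic: the number of program youth who showed the desired change divided by the number of youth for whom that behavior was tracked during the reporting period. The sketch below illustrates that calculation only; the record fields (tracked, improved) are hypothetical stand-ins for whatever your own intake and exit forms capture, not part of the DCTAT.

```python
# Minimal sketch of a "behavior" measure: the percentage of program youth who
# exhibit a desired change in the targeted behavior during a reporting period.
# The record layout ("tracked", "improved") is hypothetical -- substitute the
# fields your own data collection forms actually use.

youth_records = [
    {"id": "A-001", "behavior": "school attendance", "tracked": True, "improved": True},
    {"id": "A-002", "behavior": "school attendance", "tracked": True, "improved": False},
    {"id": "A-003", "behavior": "school attendance", "tracked": True, "improved": True},
]

tracked = [r for r in youth_records if r["tracked"]]
improved = [r for r in tracked if r["improved"]]

# Guard against division by zero if no youth were tracked this period.
percent = 100.0 * len(improved) / len(tracked) if tracked else 0.0
print(f"{len(improved)} of {len(tracked)} tracked youth improved ({percent:.1f}%)")
```

The same pattern applies to any of the behaviors listed above; only the underlying data source changes.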
Other Data Results From the July – December 2009 Reporting Period • Number of Youth Served: 109,656 • Number of Youth Who Offend or Reoffend: 644 • Funds Used For: • Direct Service Prevention: $3,253,214 • Direct Service Intervention: $1,719,168 • System Improvement: $1,163,804 • Research and Development: $336,584
OJJDP’s Performance Measures Website http://ojjdp.ncjrs.org/grantees/pm/
Data Collection
• Need up-front planning
• Need a sense of what you are trying to accomplish
• What data will you collect and why?
• What data sources are available and which will you use?
• How will you use the data beyond just reporting it to OJJDP?
Purpose of Data Collection
• An ongoing process that keeps the project focused
• Provides the information needed to report on performance measures
• Data and data collection are the building blocks of performance evaluation
• Use data collection to enhance your ability to monitor and evaluate your program
Data Collection Standards
• Program documentation
  – Clearly describe and document performance measures
  – Keep logic model and performance measure documentation together as part of the history of your program
• Formal agreements for data collection
  – Make sure that written agreements are clear
• Collect valid and reliable data
• Report accurate data
Data Collection Standards (cont.)
• Analyze data
  – Quantitative data (e.g., from surveys) and qualitative data (e.g., from interviews) should be appropriately and systematically analyzed (a brief summary sketch follows below)
  – Obtain training and technical assistance for this if necessary
• Justify conclusions
  – Justify the conclusions you draw from your data
• Protect the rights of program participants
  – Design and conduct data collection to protect the rights and welfare of all participants
  – Obtain training and technical assistance for this if necessary
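As one illustration of systematic analysis of quantitative data, the sketch below summarizes a single survey item using only Python's standard library. The item, the 1–5 scale, and the response values are hypothetical; real programs will typically use a statistics package and handle qualitative coding as a separate step.

```python
# Minimal sketch: summarizing one quantitative survey item (a hypothetical 1-5
# scale) before reporting it. Replace the list with your own collected data.
from statistics import mean, median, stdev

responses = [4, 5, 3, 4, 2, 5, 4, 3]  # hypothetical item scores

print(f"n       = {len(responses)}")
print(f"mean    = {mean(responses):.2f}")
print(f"median  = {median(responses)}")
print(f"std dev = {stdev(responses):.2f}")
```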
Keeping Track of Data
• Use a data collection planning tool
• Identify a staff member to coordinate and monitor data collection
• Assemble data collection checklists
• Develop forms and instruments
• Develop procedures or policies for collecting needed data; you must collect accurate data in a systematic manner
• Develop a codebook to define the data you collect (a minimal example follows below)
• Policies and data collection codebooks can help keep the program on track even with staff turnover
• Pilot-test your procedures!
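A codebook does not require special software; a small, shared file that names each variable, its allowed codes, and its source form is often enough. The sketch below uses a Python dictionary purely for illustration; every variable name and code shown is a hypothetical example and should mirror your own instruments.

```python
# Minimal sketch of a data collection codebook: each entry documents a variable's
# meaning, allowed codes, and source form so data stay consistent through staff
# turnover. All names and codes here are hypothetical examples.

CODEBOOK = {
    "youth_id": {
        "description": "Unique identifier assigned at intake",
        "type": "string",
        "source": "Intake form",
    },
    "attendance_change": {
        "description": "Change in school attendance over the reporting period",
        "type": "categorical",
        "values": {1: "Improved", 2: "No change", 3: "Worsened"},
        "source": "School records release",
    },
}


def check_record(record):
    """Return a list of problems found when checking a record against the codebook."""
    problems = []
    for field, spec in CODEBOOK.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif "values" in spec and record[field] not in spec["values"]:
            problems.append(f"invalid code for {field}: {record[field]!r}")
    return problems


print(check_record({"youth_id": "A-001", "attendance_change": 1}))  # -> []
print(check_record({"youth_id": "A-002", "attendance_change": 9}))  # -> invalid code
```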
Plan for Performance Measurement in Ongoing Program Assessment
To assess your program, include plans for:
• Analysis/synthesis – how performance measurement data will be analyzed and summarized
• Interpretation – how the program will interpret what the data mean
• Dissemination – which program stakeholders will receive the results of the performance measurement
• Recommendations – how the group will identify recommendations based on the results of the performance measurement
The Data Collection and Technical Assistance Tool (DCTAT)
The OJJDP Data Collection Tool (DCTAT) is a resource for your program:
• Lists data submission deadlines
• Includes a training PowerPoint on how to use the DCTAT
• Lists webinar-based training schedules and the phone number and e-mail for technical assistance
• Links to performance measure (indicator) grids
• Generates reports
• Generates documentation for your program
  – Include with semiannual CAPRs
  – For use in your program
• Changes and improvements to the DCTAT are ongoing
The DCTAT
Steps to complete reporting in the DCTAT:
• Log in
• Profile (review, complete, or revise)
• Select a reporting period
• Step 1: Enter award information (includes target population information)
• Step 2: Select program categories
• Step 3: Select performance indicators
• Step 4: Enter data
• Step 5: Create a report to submit to OJJDP
• Complete the user feedback form
DCTAT Sign-in Screen
• This screen contains information and resources for your program.
• The Grantee will be provided with a user ID and password by the System Administrator.
• Grantee (Grantor) is defined as the primary recipient of funds from OJJDP.
• Website address: http://www.ojjdp-dctat.org
Profile Screen
• If you are a first-time user, the system will take you to this screen first.
• The Profile screen contains information received via a download from GMS.
• Please keep this page up to date so that you receive important e-mails from the DCTAT.
• Most screens in the DCTAT have help desk contact information.
Grant Program Selection Screen
• If you are a returning user, the system will take you to this screen first.
• On this screen you select the reporting period for which you need to enter data, or view data entered previously.
• If you are not sure, please call the DCTAT help desk.
Designation Screen
• On this screen you tell the DCTAT how you, as the Grantee, administer your funds. There are 2 methods:
1) The Grantee spends funds and/or awards funds to subaward recipients (subgrantees); or
2) The Grantee solely uses all funds.
• NOTE: Subgrantees are secondary recipients of funds from the Grantor (not from OJJDP); secondary awards are made from the primary award received from OJJDP.
Grantee Status Summary Screen This screen provides the status of performance measures data entry at the grantee level
Grantee Status Summary Screen with Subgrantees
• This screen provides the status of performance measures data entry at the grantee level and the subgrantee level (if applicable).
• The system has red buttons that lead you to the next action or step. “Follow the red buttons!”
Step 1: Award Information Screen (1 of 3)
This is a view of the first data entry screen. It asks general information questions about your award or subaward.
Step 1: Award Information Screen (2 of 3)
Target Population Information Continued
• Tell OJJDP about the population that is served/funded by your award. This may differ between the grantee level and the subgrantee level.
• Programs that directly provide services/programs to youth are asked to define the population by race/ethnicity, justice involvement, gender, age, and geographic location of the population served by the federal award.
• Grants that use funds for “system improvement” type projects should select the option “Youth population not directly served”.
Step 1: Award Information Screen (3 of 3)
Target Population Information Continued
The “other” category is for additional factors that may describe the population you are serving. Are these additional factors that were proposed when you applied for funding?
Step 2: Program Category Selection Screen
The next step is to select program categories. Remember that activities funded by your award are organized into these 4 categories:
• Prevention – The youth has not had any involvement in the juvenile justice (JJ) system but may have risk factors for involvement.
• Intervention – The youth has had some involvement in the JJ system, and you would like to intervene to prevent further involvement.
• System Improvement – A program or project may involve hiring staff, staff training, new policies/procedures, or MIS development/enhancement.
• Research and Development – A project is research or evaluation focused, is related to a juvenile justice program or population, or involves development of materials that will be considered for use with a juvenile justice population or program.
Step 3: Indicator Selection Screen (Indicators Can Be Mandatory or Optional)
• The next step is to select indicators (performance measures) that represent your grant-funded activities.
• The indicators are presented as mandatory (those OJJDP requires you to report to support its “core” measures), followed by optional indicators.
• The optional indicators are additional measures; you are encouraged to select as many as apply to your grant-funded activities. These data may help you maintain and manage your program activities.
Step 4: Data Entry Screen
• This screen provides you with all of the mandatory and optional measures that were selected for data reporting.
• If a mandatory measure does not relate to your grant-funded activities, enter zero.
• If you do not have data this reporting period for a selected optional or mandatory measure, just enter zero.
• In the comments section of the Performance Data Report, you can explain the zero values that were reported.
Step 5: Reports Menu
There is 1 mandatory report type. Once all data entry has been completed, you are ready to create the mandatory report, which must be submitted to OJJDP through the Grants Management System (GMS).
• Performance Data Report – Aggregates your data; submit this report to OJJDP through GMS.
Step 5: Reports Menu (cont.)
In addition to the mandatory report, the DCTAT provides other reports for your use:
1. Performance Data Summary Report – Provides a comparison of a grantee’s aggregated data to an aggregate of national data by federal program.
2. Subaward Detail Data Report – Contains performance measurement data for all active awards at the grantee and/or subgrantee level for the reporting period.
3. Performance Data Report by Subgrantee – An aggregate data report by subgrantee by federal award (only displays when applicable).
4. Close Out Report – Provides, in aggregate form, data reported across all reporting periods during the life of the award. It should be submitted as part of the close-out package when the close-out process has been initiated in GMS.
User Feedback Form Wait - before you go! Let us know about your experience using the DCTAT and how you would like to use your data!
Please Remember!
• Report accurate data!
• Prepare your data before entering the tool
• Follow the red buttons to get to the next step
• When data entry is complete, select “Mark data as complete and create final Performance Data report”
• Export the Performance Data Report (PDF or Word format) and save it to your computer
• After saving to your computer, be SURE to upload this document to GMS as an attachment to get credit for reporting
Contact Information Website • To access the DCTAT website, please go to: http://www.ojjdp-dctat.org Technical Assistance • E-mail: ojjdp-dctat@csrincorporated.com • Toll-free: 1 (866) 487-0512