Quality Performance Report (QPR) FY 2013: What Do You Need to Know?
Agenda
• Welcome – Shannon Rudisill, Office of Child Care
• Overview of the Quality Performance Report – Dawn Ramsburg, Office of Child Care
• Guidance for Completing the FY2013 QPR – Dawn Ramsburg, Office of Child Care
• Overview of the ACF-118 QPR Module – Rosa Williams, National Center on Data and Technology
• Overview of the Data Toolkit – Sarah Friese, Child Trends
• Questions and Wrap-Up – Office of Child Care
What is the QPR?
• The first information collected on quality activities
• Reports on State/Territory progress in improving the quality of child care
• Links back to the goals identified in the 2012-2013 CCDF Plan (Annual Snapshot, Appendix 1 of the Plan)
• Submitted electronically through the ACF-118 e-submission website
• Due to ACF by December 31, 2013
• Reflects the period October 1, 2012 through September 30, 2013 (FY2013)
What is the QPR?
• Collects data on the four components of quality from Part 3 of the CCDF Plan:
  • Health and safety and licensing
  • Early learning guidelines
  • Program quality improvement activities
  • Professional development systems and workforce initiatives
What is in the Guidance?
• Definitions
• Explanations of what data should be reported
• Clarification on how to report if you do not have data
• What narrative you could include
• N/A is an acceptable response to data questions (use the Describe box to add context)
What is in the Guidance?
• OCC wants to know what data you have available
• OCC recognizes that this data may be incomplete or imperfect
  • May be data for a subset of programs or providers
• OCC anticipates that Lead Agencies may not have information and data available to respond to all questions
Guidance for FY2013
• Questions are the same as Appendix 1 of the FY2012-2013 CCDF Plan (not FY2014-2015)
• Reporting on FY2013 only (October 1, 2012 through September 30, 2013)
• Second annual report for the FY2012-2013 Plan period
• Some clarifications added based on Q&A during the FY2012 reporting period
Electronic-Submission Site
• Allows grantees to electronically submit their QPR and Federal staff to review and accept the report
• Password-protected electronic site
• Obtain a username and password from NCDT
• URL for the site: https://extranet.acf.hhs.gov/stplan/STPLAN_Login.jsp
Features and Functions of the Site
Same as the ACF-118 Plan Submission Site:
• Navigation index
• Easy data entry and edit functions: text boxes with a formatting toolbar, radio buttons, check boxes, and a spell-check feature
• Embedded guidance for each question
Two Ways to Access the QPR Main Menu on the ACF-118 Submission Site
Option 1 – Accessing the QPR from the Main Menu Click the Quality Performance Report link to access the QPR Main Menu.
Option 2 – Accessing the QPR from the Table of Contents Click the link for the QPR from the Table of Contents page.
QPR Main Menu for State/Territory Users
Select the year from the drop-down list and click the QPR button to access the report.
QPR Table of Contents page
The user can generate an error report from the TOC page.
QPR Error Report The Error Report will point out any incomplete sections of the QPR report.
Data Entry Screen w/ Describe Box The data entry functions are the same for the QPR as they are for the ACF-118. Grantees must enter text in each describe box, even if it is to explain the use of N/A.
Original Goals from ACF-118 Plan The State/Territory’s original goals will be displayed for each of the goals’ questions.
Goal Text Boxes
• Active goals that were part of the original Plan must be copied and pasted into a Goal text box
• Active goals that were not part of the original Plan must be entered into a Goal text box
• All active goals should be entered into a text box
• Select the Click to Enter Another Goal button to add the next goal
Editing Goals
• If a goal is deleted, it cannot be retrieved
• Click the Edit Goal button to open the Goal text box in another window
• The user must refresh the display after editing a goal
Quality Levels The site allows the user to add the appropriate number of quality levels for their program.
Submit/Certify Process
• All questions must be answered
• Only the Super User can submit/certify
• Click the Submit/Certify QPR button to begin the process
Federal Review and Acceptance
• Federal staff will review the QPR
• If the report needs revision, the RO will send it back to the grantee (emails go to the Super User and RO/CO staff)
• If revision is not necessary, the RO will "accept" the QPR (emails go to the Super User and RO/CO staff)
Using the INQUIRE Data Toolkit for Support on the Quality Performance Report Sarah Friese
The Quality Initiatives Research and Evaluation Consortium (INQUIRE)
• A consortium of primarily researchers and evaluators supported by OPRE through a contract with Child Trends
• Purpose of INQUIRE:
  • Support high-quality, policy-relevant research and evaluation
  • Provide guidance to policymakers on evaluation strategies, new research, interpretation of research results, and implications of new research for practice
In this webinar we will…
• Provide an overview of how data quality and management practices help meet the reporting requirements of the Quality Performance Report (QPR)
• Describe the INQUIRE Data Toolkit and how it can be used to support reporting, monitoring, and evaluation
• Review specific examples from the QPR to demonstrate how data elements can be linked to specific reporting questions
Data Lifecycle's Application to the QPR*
*Based on the Data Documentation Initiative's Lifecycle
Data Quality Best Practices
• Planning: use unique identifier codes
  • QPR example: A3.2.6 What percentage of CCDF subsidized children were served in a program participating in the state or territory's quality improvement system during the last fiscal year?
  • Requires unique IDs at the child and site level
• Collection: collect data using common data standards (data elements, data definitions)
  • QPR example: A1.2.2 What percentage of programs received monitoring visits, and at what frequency, for each provider category during the last fiscal year?
  • Use of common data standards ensures your definitions of "programs" and "providers" align with QPR reporting requirements
Data Quality Best Practices, cont.
• Processing: minimize overwriting of historical data
  • QPR example: A3.2.4 How many programs moved up or down within the QRIS or achieved another quality threshold established by the State/Territory over the last fiscal year?
  • Preservation of historical data allows for analyses that capture change over time
Data Quality Best Practices, cont.
• Management: maintain up-to-date codebooks detailing the fields included in any data set
  • QPR example: A3.2.1 How many programs received targeted technical assistance in the following areas (health and safety, infant/toddler care, etc.) during the last fiscal year?
  • Detailed codebooks ensure you're collecting the necessary data, in the correct format
• Distribution: ensure data collected in different systems can be connected
Other Data Considerations
• Data mapping
• Data integration
• Data governance
Data Mapping to Identify Gaps in Data Needed for Reporting Requirements
• A2.2.1 How many children are served in programs implementing the ELGs?
• Data elements needed:
  • Classroom Implementing ELG
  • Site Implementing ELG
  • Classroom/Group ID
  • Program Site ID
  • Child ID
[Diagram: mapping across the PD Registry (practitioners), QRIS (children, sites, organizations), and Licensing (sites) systems, with a data gap identified]
Data Integration to Meet Reporting Requirements
• QPR A3.2.6 What percentage of CCDF subsidized children were served in a program participating in the state or territory's quality improvement system during the last fiscal year?
• Data elements needed:
  • Program Site ID
  • Child ID
  • QRIS Participation History
  • Financial Support Types – Child Care Development
  • QI Participation History
[Diagram: Human Services CCDF subsidy data (children, sites) linked by Child ID and Site ID to QRIS participation data (children, sites, organizations)]
Data Governance to Promote Data Sharing
A3.2.3 What is the participation rate (number and percentage) in the State/Territory QRIS or other quality improvement system for programs over the last fiscal year?
• Data elements needed:
  • Program Site ID
  • QRIS Participation History
  • QI Participation History
[Diagram: QRIS Participation or QI Participation combined with data from licensing (sites) to create a new variable]
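As a minimal sketch of the governance idea above, the licensing roster can supply the denominator while QRIS and QI participation flags from separate systems are combined per Program Site ID into one participation variable. The record layouts, field names, and site IDs here are hypothetical, not a prescribed schema.

```python
# Hypothetical sketch: combining QRIS and other QI participation flags
# from separate systems into a single per-site variable for A3.2.3.
qris_records = {"S001": True, "S002": False}       # Site ID -> currently in QRIS
qi_records = {"S002": True, "S003": False}         # Site ID -> in another QI initiative
licensed_sites = ["S001", "S002", "S003", "S004"]  # roster from the licensing system

def participates(site_id):
    """A site counts if it is in the QRIS or any other QI initiative."""
    return qris_records.get(site_id, False) or qi_records.get(site_id, False)

participating = [s for s in licensed_sites if participates(s)]
rate = len(participating) / len(licensed_sites)  # participation rate
```

Keeping the combination logic in one agreed-upon function is one way data-sharing agreements between the licensing and QRIS systems can be made concrete.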
INQUIRE Data Toolkit: Background
• The INQUIRE Data Elements Workgroup began in the spring of 2012
• Purpose:
  • Develop a set of data elements that can guide data collection efforts
  • Link the data elements to questions that inform monitoring, reporting, performance management, and evaluation
  • Provide guidance on data governance and data integrity
• Process: as tools were developed, efforts were made to align them with other national data collection and reporting efforts
INQUIRE Data Toolkit Components
1. Linkages questions
  • Provide guidance for analyzing basic policy, monitoring, and evaluation questions that states may ask about their early childhood systems
  • Can be used to address questions unique to a specific state as determined by the state's policy priorities
INQUIRE Data Toolkit Components, cont.
2. QPR questions
  • QPR questions are listed as sub-analyses under the Linkages questions
  • QPR questions in the Toolkit are from the 2012 version
  • QPR analysis recommendations are formatted similarly to the Linkages questions
INQUIRE Data Toolkit Components, cont.
3. Data dictionary
  • A list of data elements organized by level (child, family, practitioner, classroom, group, site, organization, system)
  • Element name, definition, variable options, variable type
  • Reporting requirements: QPR, RTT-ELC, PIR, ACF-801
  • Alignment with CEDS and the National Survey of Early Care and Education is noted
  • Data elements in the Linkages and QPR questions are hyperlinked to the data dictionary
  • Also available as a spreadsheet
INQUIRE Data Toolkit Components, cont.
4. Web-based look-up tool
  • A tool for customizing data elements to your state's needs and priorities
  • Will be available on Research Connections
QPR Analyses
A3.2.6 What percentage of CCDF subsidized children were served in a program participating in the state or territory's quality improvement system during the last fiscal year?
Children should be connected to program sites through a linkage between Child ID and Program Site ID. To determine how many children receiving CCDF subsidies are served in program sites participating in the quality improvement system, first use the data element Financial Support Type and select children with a "Yes" response for the field "Child Care Development Fund". Next, select program sites indicating "Currently participates in the QRIS" for the data element QRIS Participation History. Total the number of children who are both served at a program site participating in the quality improvement system and receiving CCDF subsidies. To calculate a percentage, divide this number by the total number of children receiving CCDF subsidies. To determine how many children are participating in quality improvement initiatives other than the QRIS, perform the same operation but use the data element Quality Improvement Participation. If a state has multiple QRISs and QI initiatives, perform the same calculation for each and sum the results for the total number of children with a CCDF subsidy who are served by all of the initiatives.
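The steps above can be sketched in a few lines of Python. The record layouts, IDs, and field names are illustrative assumptions, not the QPR's own schema; the logic is what matters: link children to sites, filter on the subsidy flag, then filter on site participation.

```python
# Illustrative sketch of the A3.2.6 calculation.
children = [  # each record links Child ID to Program Site ID
    {"child_id": "C1", "site_id": "S1", "ccdf_subsidy": True},
    {"child_id": "C2", "site_id": "S2", "ccdf_subsidy": True},
    {"child_id": "C3", "site_id": "S1", "ccdf_subsidy": False},
    {"child_id": "C4", "site_id": "S3", "ccdf_subsidy": True},
]
# QRIS Participation History reduced to a current-participation set of sites
qris_sites = {"S1", "S2"}

subsidized = [c for c in children if c["ccdf_subsidy"]]
subsidized_in_qris = [c for c in subsidized if c["site_id"] in qris_sites]
pct = 100 * len(subsidized_in_qris) / len(subsidized)  # percentage of CCDF children
```

For QI initiatives other than the QRIS, the same filter would be run against a participation set built from the Quality Improvement Participation element instead.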
A4.2.3 How many teachers/caregivers received credit-based training and/or education as defined by the state/territory during the last fiscal year?
First, split practitioners into three groups based on the setting in which they work: family child care, center-based, or license-exempt. Use the data element Type of Setting and assign one code ("1") to practitioners working in settings indicated with a "Yes" for the category "Family Child Care", another code ("2") to practitioners at settings that are "Center-based (including a school setting)", and a third code ("3") to practitioners in "License-exempt" settings. For each of these groups, complete the calculation described below.
Sum the number of practitioners with a Training Completion Date in the last federal fiscal year. Training records should be linked such that each completion date is tied to additional information about that training, such as whether it was a training approved by the state. Limit the count to completion dates that are also flagged as a State Approved Training and total the number of trainings. Perform this calculation for each of the setting types.
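A hedged sketch of that calculation follows: count practitioners with a state-approved training completed inside the federal fiscal year, grouped by setting type. Field names, dates, and IDs are assumptions made up for the example.

```python
# Illustrative A4.2.3 sketch: trainings linked to practitioners, filtered
# by federal fiscal year and state approval, then grouped by setting.
from datetime import date

FY_START, FY_END = date(2012, 10, 1), date(2013, 9, 30)  # FY2013

practitioners = [
    {"id": "P1", "setting": "Family Child Care"},
    {"id": "P2", "setting": "Center-based"},
    {"id": "P3", "setting": "License-exempt"},
]
trainings = [  # each completion date tied to its approval status
    {"practitioner_id": "P1", "completed": date(2013, 3, 1), "state_approved": True},
    {"practitioner_id": "P2", "completed": date(2013, 5, 9), "state_approved": False},
    {"practitioner_id": "P3", "completed": date(2011, 1, 2), "state_approved": True},
]

def qualifies(t):
    """Completed within the fiscal year AND flagged as State Approved Training."""
    return FY_START <= t["completed"] <= FY_END and t["state_approved"]

qualifying_ids = {t["practitioner_id"] for t in trainings if qualifies(t)}
counts = {}
for p in practitioners:
    if p["id"] in qualifying_ids:
        counts[p["setting"]] = counts.get(p["setting"], 0) + 1
```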
A1.2.5 How many previously license-exempt providers were brought under the licensing system during the last fiscal year?
Using the data element Program Site Licensing Status, select program sites that previously indicated they were "Exempt." Then select the subgroup of these program sites that currently indicate they are "Licensed" to get the total number of program sites that moved from exempt to licensed.
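This is straightforward when status history is preserved per site rather than overwritten. A minimal sketch, with invented site IDs and a simple history list per site:

```python
# Illustrative A1.2.5 sketch: Program Site ID -> Licensing Status values,
# ordered oldest to newest (history preserved, not overwritten).
status_history = {
    "S1": ["Exempt", "Licensed"],
    "S2": ["Licensed", "Licensed"],
    "S3": ["Exempt", "Exempt"],
}

newly_licensed = [
    site for site, history in status_history.items()
    if len(history) >= 2 and history[-2] == "Exempt" and history[-1] == "Licensed"
]
total_moved = len(newly_licensed)  # count of sites that moved from exempt to licensed
```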
A3.2.4 How many programs moved up or down within the QRIS or achieved another quality threshold established by the State/Territory over the last fiscal year?
Determining whether average quality ratings of program sites (Program Site ID) increase from year to year involves first determining the change in each individual program site's QRIS Score. First, isolate the program sites participating in the QRIS using the data element QRIS Participation History and select the category "Program site currently participates in the QRIS." Then select the time period for which you would like to calculate a change in score. Rating scores should never be overwritten, so there should be a QRIS Score for every time a program site was rated. If program sites are not rated every year, scores will have to be collapsed across time periods so that the most recent score can be compared with the previous one regardless of what year the rating actually took place.
To calculate the change in scores, subtract the previous QRIS Score from the current QRIS Score. Negative values indicate a decrease in rating, while positive values indicate an increase. Divide the number of program sites that increased by the total number of rated sites for the percentage of sites that improved their quality scores. Average the changes across all program sites to determine the change in scores over time for all sites. Additional elements, such as Early Childhood Setting, can be used to examine changes in QRIS Scores for sub-groups of program sites.
If the state does not have a QRIS but collects some other measure of quality, such as the CLASS, use the CLASS Average Score for each classroom and average across all classrooms at the program site to determine the level of quality. Subtract the previous program-level CLASS score from the current one to see whether scores have improved over the time period between observations. Positive values indicate improvements in observed quality.
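The change-in-rating arithmetic can be sketched as below, assuming every rating event is preserved per site as an ordered list of scores (the site IDs and score values are hypothetical):

```python
# Illustrative A3.2.4 sketch: change = current QRIS Score - previous QRIS Score.
ratings = {  # Program Site ID -> QRIS Scores ordered oldest to newest
    "S1": [2, 3],
    "S2": [4, 3],
    "S3": [3, 3],
}

changes = {site: scores[-1] - scores[-2]          # positive = moved up
           for site, scores in ratings.items()
           if len(scores) >= 2}
moved_up = sum(1 for d in changes.values() if d > 0)
moved_down = sum(1 for d in changes.values() if d < 0)
pct_up = 100 * moved_up / len(changes)            # percentage of rated sites that improved
avg_change = sum(changes.values()) / len(changes) # average change across all rated sites
```

The same subtraction applies to a site-level CLASS average when a state has no QRIS; only the score source changes.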