
What did Connecticut do?

Learn how Connecticut distributed surveys to eligible families and used a census approach to ensure representativeness. Explore the analysis of the survey results and considerations for APR reporting.


Presentation Transcript


  1. What did Connecticut do?

  2. The Basics • A stakeholder group met to give the lead agency suggestions. • We chose the NCSEAM survey and distributed it to families eligible on a specific date (just like 618 Table 1: a point in time). • Service coordinators hand-delivered surveys in sealed envelopes, along with a stamped return envelope. • Surveys (each printed with a unique identifier) were returned directly to the lead agency and scanned in.

  3. The Basics • In Year One, we also mailed surveys to recently exited families. • The response rate for hand-delivered surveys was much higher (26% vs. 19%), so… • In Year Two we only sent surveys to families currently participating in the program, but… • Return rates were low! So we did a follow-up mailing ($$$$). • We ended up with a response rate of 41%.

  4. The Basics • THIS IS A CENSUS APPROACH to distributing the surveys. • We gave surveys to ALL ELIGIBLE families who had been in Birth to Three at least 6 months. • After Year One, OSEP asked Connecticut to clarify what we had done to assure representativeness. • READY?

  5. This is basically your handout – don’t try to read it.

  6. The Details • We interpreted the “population of children with disabilities in the early intervention program” to mean our official 618 child count, which is the only official demographic data reported to OSEP. • We called this our Target Group.

  7. The Details • Connecticut reported the percentages by both Race/Ethnicity and Gender for the… • Target Group (FFY05 618 child count - Table 1) • Census (children whose families were sent a survey) • Respondent Pool (children whose families returned a completed survey) and… SURPRISE! The respondent pool wasn’t representative!
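
  How that surprise shows up in the numbers: the sketch below lines up the Race/Ethnicity percentages for the three groups side by side. It is written in Python with pandas rather than whatever tooling Connecticut actually used for this step, and the file names and the race_ethnicity column are hypothetical stand-ins for the child-level data.

```python
import pandas as pd

def pct_by(series: pd.Series) -> pd.Series:
    """Percentage breakdown of one categorical variable."""
    return (series.value_counts(normalize=True) * 100).round(1)

# Hypothetical child-level files; names and columns are illustrative only.
target = pd.read_csv("ffy05_618_child_count.csv")   # Target Group (618 Table 1)
census = pd.read_csv("surveys_sent.csv")            # Census: all families surveyed
respondents = pd.read_csv("surveys_returned.csv")   # Respondent Pool

# Side-by-side percentages; a mismatch in the last column is the "surprise."
comparison = pd.DataFrame({
    "Target Group": pct_by(target["race_ethnicity"]),
    "Census": pct_by(census["race_ethnicity"]),
    "Respondent Pool": pct_by(respondents["race_ethnicity"]),
}).fillna(0)
print(comparison)
```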

  8. The Details

  9. Random and Representative • So we used SPSS to randomly select cases in order to create groups that matched the 618 percentages: • By Race/Ethnicity • By Gender • A crosstab of both • (We also did “Region” for our ICC.)
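
  The slide names SPSS; as a rough equivalent, here is a hedged sketch of one way to draw such a matched random subsample in Python with pandas. The stratum column, the target shares, and the sizing rule (take the largest total N every stratum can still support) are assumptions for illustration, not Connecticut's documented procedure.

```python
import math
import pandas as pd

def match_to_target(respondents: pd.DataFrame, target_share: dict,
                    stratum: str, seed: int = 0) -> pd.DataFrame:
    """Randomly subsample respondents so each stratum's share of the
    result matches the target (618 child count) proportions."""
    counts = respondents[stratum].value_counts()
    # Largest total N such that every stratum has enough respondents.
    n_total = min(math.floor(counts.get(s, 0) / p)
                  for s, p in target_share.items() if p > 0)
    parts = [respondents[respondents[stratum] == s]
                 .sample(n=round(n_total * p), random_state=seed)
             for s, p in target_share.items() if p > 0]
    return pd.concat(parts)

# Hypothetical 618 proportions; a crosstab match would use a combined
# race-by-gender column as the stratum instead.
respondents = pd.read_csv("surveys_returned.csv")   # as in the sketch above
target_share = {"White": 0.55, "Hispanic": 0.25, "Black": 0.12, "Other": 0.08}
representative = match_to_target(respondents, target_share, "race_ethnicity")
```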

  10. Race/Ethnicity

  11. Gender

  12. Race/Ethnicity X Gender

  13. Analysis • We analyzed the results for each of the randomly selected representative groups. • Then we reported EVERYTHING…
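
  To make that analysis step concrete, the sketch below computes an indicator-style percentage (families scoring at or above a cut score) for one representative group. The score column, the sample values, and the cut score are all placeholders; the NCSEAM survey's actual scoring and standards are not reproduced here.

```python
import pandas as pd

def indicator_pct(group: pd.DataFrame, score_col: str, cutoff: float) -> float:
    """Percent of families in a group at or above an indicator's cut score."""
    return round((group[score_col] >= cutoff).mean() * 100, 1)

# Tiny hypothetical group; real input would be each representative
# subsample drawn in the previous sketch.
rep_group = pd.DataFrame({"help_child_score": [610, 480, 700, 655, 590]})

# Placeholder cut score -- NOT the NCSEAM standard, which isn't given here.
print("4C:", indicator_pct(rep_group, "help_child_score", 550), "%")
```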

  14. Analysis 4A – Know My Rights (77%)

  15. Analysis 4B – Communicate About My Child (75%)

  16. Analysis 4C – Help Me Help My Child (88%)

  17. Remaining Questions • During the “Week of Clarity,” the preliminary APR response table read that CT: • Didn’t include the N’s, and • Didn’t meet its target for Indicator 4C • (They used the results from our respondent pool.) • So we revised our APR to: • Include the N’s, and • Only report results that came from representative data

  18. Original APR 4C – Help Me Help My Child (88%)

  19. REVISED APR • The respondent pool was N = 875; from it, the following representative groups were selected.

  20. Remaining Questions • The final APR response table read that CT: • Didn’t report the total N for the Respondent Pool • Still didn’t meet its target for Indicator 4C • Because we are required to assure representativeness, which results should be reported to and used by OSEP?

  21. Remaining Questions • Dr. Paula LaLinda used Connecticut’s data for her dissertation. • She analyzed many more variables, including language spoken in the home and insurance type (public or commercial). • This was more illustrative.

  22. Remaining Questions • States may be using their family outcome data in ways that are a priority for stakeholders but not required for the APRs. • What are the minimum requirements for the APRs as related to reporting representativeness?
