Getting Better All the Time: Assuring the Quality of COSF Data
Andy Gomm: New Mexico Part C
Jane Atuk: Alaska Part C
Lisa Backer: Minnesota Part C & 619
New Mexico
Andy Gomm, Part C Coordinator
ECO Implementation in NM
• Training provided to 34 provider agencies at their sites
• ECO manual developed and distributed
• Technical assistance made available through FIT staff and the University of NM Early Childhood Network
• Rolled out region by region (5 regions)
ECO quality assurance in NM
• ECO Quality Assurance form developed
• ECO lead staff with the Family Infant Toddler (FIT) Program initially reviewed all ECO forms
• Review expanded to 4 FIT staff
• Total ECO forms reviewed to date: approximately 1,300
ECO quality assurance in NM (cont.)
• Each provider agency received specific feedback regarding rating selection and supporting documentation.
• Once it was determined that an agency was completing ECO forms to a high standard, it could be "graduated."
• Once graduated, FIT staff request that agency's ECO forms only on an "as needed" basis.
Additional ECO quality assurance
• Providers receive a summary of the ECO quality assurance conducted
• Data entered in the new online data system provides additional opportunities to review accuracy
• Database reports provide the ability to review whether ECO scores have been entered (a sketch of such a completeness check follows)
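The presentation does not describe NM's actual database, so the following is only a minimal sketch of what an "have ECO scores been entered?" report could look like, using an in-memory SQLite table; the table and column names (eco_records, child_id, agency, entry_score, exit_score) are illustrative assumptions.

```python
# Hypothetical completeness report: which records are missing ECO scores?
# Schema is an illustrative assumption, not NM's real database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE eco_records (
    child_id TEXT, agency TEXT, entry_score INTEGER, exit_score INTEGER)""")
conn.executemany(
    "INSERT INTO eco_records VALUES (?, ?, ?, ?)",
    [("001", "Region 1", 5, None),   # exit score not yet entered
     ("002", "Region 2", 4, 6)],     # complete record
)

# Flag any record with a missing entry or exit score.
missing = conn.execute(
    "SELECT child_id, agency FROM eco_records "
    "WHERE entry_score IS NULL OR exit_score IS NULL").fetchall()
for child_id, agency in missing:
    print(f"Child {child_id} ({agency}): ECO score(s) not entered")
```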
ECO Quality Assurance Form
The NM ECO review form asks:
• Are all areas of the ECO form completed?
• Were a minimum of three sources of information (approved assessment tool, clinical observation, and parent input) used to generate the rating?
• Does the supporting evidence really support the ECO rating?
• Is the ECO rating consistent with the child's eligibility category?
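These four questions amount to a pass/fail checklist, and the first two are mechanical enough to pre-screen automatically before a human review. A minimal sketch follows; the EcoForm structure and field names are assumptions for illustration, since the actual review is done by FIT staff reading the forms.

```python
# Hypothetical pre-screen of the NM review checklist. The last two checks
# still require reviewer judgment; here they are recorded as booleans.
from dataclasses import dataclass

REQUIRED_SOURCES = {"assessment tool", "clinical observation", "parent input"}

@dataclass
class EcoForm:
    sections_completed: bool           # all areas of the form filled in?
    sources: set                       # information sources documented
    evidence_supports_rating: bool     # reviewer judgment
    rating_matches_eligibility: bool   # reviewer judgment

def review(form: EcoForm) -> list:
    """Return a list of QA findings; an empty list means the form passes."""
    findings = []
    if not form.sections_completed:
        findings.append("Not all areas of the ECO form are completed.")
    missing = REQUIRED_SOURCES - form.sources
    if missing:
        findings.append(f"Missing required sources: {sorted(missing)}")
    if not form.evidence_supports_rating:
        findings.append("Supporting evidence does not support the rating.")
    if not form.rating_matches_eligibility:
        findings.append("Rating is inconsistent with the eligibility category.")
    return findings

# Example: a form missing clinical observation as a documented source.
print(review(EcoForm(True, {"assessment tool", "parent input"}, True, True)))
```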
Lessons Learned
• After initial training, all sites needed an additional, almost identical training once they began implementation.
• TA needs to be available promptly.
• Pre-printing the sources of information on the supporting evidence section ensured that documentation was present from all three required sources.
Lessons Learned (cont.)
Regarding feedback on the ECO form:
• Feedback needs to be prompt.
• Feedback needs to go directly to the Service Coordinators completing the form, not just to their EC Coordinator (manager).
• Positive feedback works! If a particular SC at an agency was doing a great job with the ECO form, we recommended that the SC mentor others at that agency and that his or her ECO forms be used as examples of what we want.
Next Steps
• Develop online training, available 24/7
• Promote QA by provider managers
• Review online ECO reports, e.g. review data reports for patterns in scores
• Include the ECO process (incl. the ECO Manual) in Service Coordination training
Minnesota
Lisa Backer, ECSE Specialist
Basic Realities
• Education lead / birth mandate state
• "Local control" is valued
• Teams must use multiple sources of information, including at least one criterion-referenced or curriculum-based measure cross-walked by ECO
• Parent input must be documented on the COSF
Basic Realities (cont.)
• Single target group of stakeholders and professionals for training on child outcomes reporting across Parts C and B
• The rating at exit from Part C becomes the entrance rating for Part B
• Minnesota Automated Reporting Student System (MARSS) created in the late 1980s
• No "real time" data; data are collected by LEAs throughout the year and reported to MDE each fall and at end of year
Quality Assurance Efforts
• Stakeholder Responsibility Table
• Training & TA
• Data Awareness
• Self-Study
Stakeholder Roles/Responsibilities: Key Areas
• Knowledge of typical child development
• Ongoing assessment
• Knowledge and use of the COSF & process
• Annual reporting of data
• Ensuring validity
• Family outcomes
Training & TA: "Get Started"
• 55 face-to-face trainings during Year 1
• Data retreat for early childhood program administrators (ECSE, Head Start, Pre-K) to promote professional investment in data
• One-time additional appropriation of funds for tool purchase and training
Training & TA: "Get Better"
• 7 regional trainings in Year 2
• Surveyed LEA programs and provided training on the most popular assessment tools: HELP, AEPS, BDI-2, Brigance, Creative Curriculum
• Web-Ex training under development for implementation during Fall 2008
• Validation self-study
Data Quality & Awareness
• Simple logic check on submitted ratings
• Mean, median, and standard deviation calculated on entry and exit data sets for each LEA for each outcome (see the sketch below)
• Progress data calculated and made available for each LEA on a password-protected site
• Does district data tell the right story?
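The presentation does not show how these statistics are computed, so here is a minimal sketch of the logic check and the per-LEA, per-outcome summaries, assuming a flat extract with hypothetical columns (lea, outcome, timepoint, rating) and COS ratings on the 1-7 scale.

```python
# Hypothetical summary-statistics pass over a COS ratings extract.
import pandas as pd

records = pd.DataFrame({
    "lea":       ["A", "A", "A", "B", "B", "B"],
    "outcome":   [1, 1, 1, 1, 1, 1],
    "timepoint": ["entry", "entry", "exit", "entry", "exit", "exit"],
    "rating":    [3, 4, 5, 2, 6, 5],   # COS ratings on the 1-7 scale
})

# Simple logic check: every rating must fall on the 1-7 scale.
out_of_range = records[~records["rating"].between(1, 7)]
if not out_of_range.empty:
    print("Out-of-range ratings:\n", out_of_range)

# Mean, median, and standard deviation for each LEA, outcome, and timepoint.
summary = (records
           .groupby(["lea", "outcome", "timepoint"])["rating"]
           .agg(["mean", "median", "std", "count"]))
print(summary)
```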
Self-Study
• Self-study tool under development, covering:
  • Procedural requirements
  • Sources of information
  • Assignment of ratings
• Statewide training on use of the tool, 10/2/08
Lessons Learned & Next Steps
Lessons:
• Getting started was easy. Getting better takes more work.
Next steps:
• Vigilant monitoring of all data submissions
• Evaluate local use of the self-study tool
Alaska
Jane Atuk, Early Intervention Specialist
Early Intervention/Infant Learning Program
COSF implementation in Alaska
• COSF piloted at 7 regional sites, Feb-Dec 2006
• Training provided to all providers at a statewide workshop, Feb 2007
• Statewide implementation of the COSF began March 1, 2007
• DVD training modules provided to each regional program in Nov 2007, now accessible online for ongoing local training
Quality assurance in Alaska
• Technical assistance provided by state staff by phone and at regional sites
• COSF database reports reviewed at least quarterly, with feedback to local providers
• Provider survey conducted July 2008
Survey Notes
• 92 ILP providers received the survey link by email (Survey Monkey)
• 67 responded, a 73% overall response rate
• The number of responses on items varies because:
  • Subsets of respondents received some questions based on answers to other questions (skip logic)
  • Respondents could choose not to answer some questions
COSF training & information
90% of respondents answered an item about how they received COSF training/information. Of these (n = 60):
• 70% attended an in-person statewide event
• 42% used the COSF training notebook
• 37% consulted with trained ILP providers
• 30% consulted with state-level staff
• 18% used DVD training modules*
• 7% used the Internet to access information
*DVD training modules became available only after the statewide training events occurred.
Overall Proficiency with COSF (n = 66)
• 28: "I know how to do it, but I need some more practice and assistance."
• 24: "I am confident I know how to do it, and I do it well."
• 12: "I understand to a point, but I need more training."
• 2: "I do not know how to do this yet."
78% felt they could do the COSF process, with varying confidence, without further training.
Sources of Information
The most typical resources used to inform COSF rating decisions (n = 64):
• 61: ILP provider observations
• 54: Parents/foster parents/legal guardians
• 54: Assessment results/test scores
• 44: Specialists (OT/PT, speech/language, etc.)
• 11: Other family members/relatives
• 9: Childcare providers
Note: Respondents were asked to "check any that apply."
Gathering Information
The most typical methods used to gather information for COSF ratings (n = 64):
• 63: Meeting with people in person
• 18: Meeting with people over phone or teleconference
• 8: Communicating back and forth with people by email
• 6: Videotaping interviews, assessments, observations
Note: Respondents were asked to "check all that apply."
Decision-Making Tools
Were crosswalks helpful?
• very much: 4 | yes: 8 | somewhat: 9 | no: 3 | don't know if using: 21 | not using: 21
Was the decision tree helpful?
• very much: 24 | yes: 24 | some: 6 | no: 2 | don't know: 3 | not using: 3
Were instructions for completing the COSF helpful?
• yes: 36 | no: 8 | don't know: 6 | not using: 14
Determining COSF Ratings
Most commonly:
• 33% consulted with another provider
• 24% consulted with families
• 21% determined ratings on their own
• 18% used a team process
Note: 3 respondents (4%) did not answer this question.
It would seem that providers most often did not use an "ideal" team approach.
Determining COSF Ratings (cont.)
However, 63% (42) had used a team approach at times. Of these 42 providers:
• 64% felt the team approach enhanced the decision-making process
• 62% felt it contributed information that would otherwise not be available
• 95% felt it was relatively easy to reach consensus
Level of Parental Involvement
Typical parental involvement in the COSF process on teams (n = 42):
• 69% contributed information, but were not usually present during team meetings
• 26% usually were present and participated
• 5% usually were not involved at all
Anchor Assessment Tools (n = 63)
• 25: Battelle Developmental Inventory (BDI)
• 19: Early Learning Accomplishments Profile (ELAP, 2002)
• 17: Sewell Early Education Developmental Profile (SEED)
• 16: Early Learning Intervention Dev. Profile ("the Michigan")
• 16: Hawaii Early Learning Profile (HELP, 2004)
• 3: Assessment, Evaluation, & Programming System (AEPS)
• 3: Bayley Scales of Infant & Toddler Development, 3rd ed. (Bayley-III)
• 3: Carolina Curriculum for Infants & Toddlers (CCITSN-3)
Note: Respondents were asked to "check any that apply."
Anchor Assessment Tools (cont.)
45 providers indicated they had training specific to the assessment tools, received from:
• 91% local EI/ILP agency
• 27% assessment authors/publishers
• 20% university course
• 16% professional conference
• 13% state or regional workshop
• 11% private consultant or contracted trainer
• 7% another organization
Anchor Assessment Tools: recentness of training (n = 45)
• 24% within the last year
• 31% within the last two years
• 18% within the last five years
• 27% more than five years ago
In all, 73% had received training within the last five years.
43 of 61 respondents (70%) indicated someone else in their program has training/education specific to the anchor tools used.
Added Comments
20 providers (30%) added a comment to the survey:
• 5 were clarifications of answers given
• 6 expressed objections to using the COSF
• 3 expressed difficulty with the COSF process
• 2 indicated confusion with the COSF process
• 3 were suggestions
• 1 was about the survey itself
16% of respondents made what could be considered negative comments.
Lessons Learned & Next Steps
• Train early and often
• Regular feedback is essential
• Providers appreciate being asked to give feedback on the process
• Survey results will help focus future training and technical assistance
• Continue to elicit feedback from providers