
Measuring Our Success


Presentation Transcript


  1. Measuring Our Success Kristen Sanderson, MPH, CHES Program Coordinator, Safe Kids Georgia

  2. Objectives • Describe our quality improvement process • Define metrics and decide what to measure • Improve the data collection process • Share challenges and successes • Discuss program evaluation strategies • Describe our assessment process • Outline next steps

  3. Our Objective • Define a set of meaningful metrics that reflect our progress in reducing the incidence and severity of child injuries and deaths in the six focus areas.

  4. Challenges • Measuring behavior change • Many factors influence safety and the incidence of injury, e.g., laws, law enforcement, community development, economics, cultural norms • Measuring not just one program, but many programs and overall effectiveness

  5. Quality Improvement and Process Evaluation

  6. Monitoring vs. Evaluation • Before we could begin measuring our impact (evaluation), we needed to collect more accurate data on our activities and programs (monitoring) • Monitoring • Monitor the implementation process • Ongoing measurement of performance • Regular tracking of resources, activities, and outputs • Are things working well? • Evaluation • Assess specific program outcomes and impact • What is the result? • e.g., behavior change

  7. First Step • Analyzed current method of coalition reporting • Pros: • Collecting program activity information including # of events by focus area, # of materials/equipment distributed, people reached • High response rate • Cons: • Annual reporting = inaccurate, untimely data • Some questions unclear = different interpretations and unreliable information • Current methods were not accurate, timely, or effective

  8. Our Tasks • Improve Quality of Program Data • Establish metrics • Revise reporting forms • Improve data collection process • Assess Our Activity and Impact • Quarterly and Annual reports – monitoring • Periodically assess program impact and identify areas for improvement

  9. Establishing our Metrics • Determine what data we want to and can collect • Questions we asked ourselves: • What do we want to measure? • What data do we want to track over time? • What can we feasibly collect from our coordinators? • What information will be useful for the coordinators? • What did we decide to measure/track? • Injury statistics by county • County demographics • Zip code (location) of events • Event details
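
To make these choices concrete, here is a minimal sketch of what an event-level record covering these fields might look like. The field names, and the six focus-area labels in particular, are illustrative assumptions; the presentation names six focus areas but does not list them, and this is not Safe Kids Georgia's actual reporting schema.

    from dataclasses import dataclass
    from datetime import date

    # Placeholder focus-area labels (assumed, not taken from the slides)
    FOCUS_AREAS = {
        "child_passenger_safety", "poisoning", "fire_burns",
        "drowning", "falls", "pedestrian_safety",
    }

    @dataclass
    class EventRecord:
        event_date: date            # captured close to the time of the event
        county: str                 # links to county injury stats and demographics
        zip_code: str               # location of the event
        focus_area: str             # one of the six focus areas
        people_reached: int
        materials_distributed: int  # materials/equipment handed out

        def __post_init__(self):
            # Validate at entry time to catch misinterpretations early
            if self.focus_area not in FOCUS_AREAS:
                raise ValueError(f"unknown focus area: {self.focus_area}")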

  10. Improving our Data Collection Process • Goal 1) Collect data as close to the time of the event or program activity as possible • Improve accuracy and reliability of the reported data • Provide more timely feedback for program adjustments • Goal 2) Make data collection as easy and fast as possible • Reduce the # of non-reporting coalitions • Recognize the diversity of coalition members’ roles (e.g., part-time vs. full-time) • Solution: Annual reporting → Quarterly reporting
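
As a toy illustration of the annual-to-quarterly shift, the sketch below rolls event-level records (such as the hypothetical EventRecord above) up into per-quarter, per-focus-area totals. It is an assumed aggregation, not the actual quarterly report form.

    def quarter_of(d):
        """Label a date with its calendar quarter, e.g. '2012Q2'."""
        return f"{d.year}Q{(d.month - 1) // 3 + 1}"

    def quarterly_summary(records):
        """Tally events, people reached, and materials by (quarter, focus area)."""
        summary = {}
        for r in records:
            key = (quarter_of(r.event_date), r.focus_area)
            row = summary.setdefault(key, {"events": 0, "reached": 0, "materials": 0})
            row["events"] += 1
            row["reached"] += r.people_reached
            row["materials"] += r.materials_distributed
        return summary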

  11. Quarterly Report

  12. Quarterly Report

  13. Quarterly Report - Inventory

  14. Challenges • Resistance to change • History of no repercussions for not reporting • Estimating numbers • No incentives or disincentives • Varying skill levels and knowledge of Excel • Buy-in • Benefits of reporting never communicated • Time-intensive

  15. Overcoming Challenges & Getting Buy-In • Discussions at meetings, one-on-one site visits • Technical assistance and feedback • Follow-up: reminders, phone calls • “What do I get out of it?” • Accurate numbers for the lead agency, funders, sponsors • Quarterly/Annual reports showcasing those numbers • Not having to remember all activity at the end of the year • Reporting needed to keep 501(c)(3) status • User-friendly report

  16. Quarterly Report • Pilot testing • Full dissemination in 2nd quarter 2012 • Lots of feedback, many revisions! • Group revisions • Only released updated versions quarterly • Final changes for 2013 • Process took over a year

  17. Roadblocks • Coalitions had multiple reports • CPAT • Annual Survey • Funder requests • SKW Grants • Technical difficulties • Many issues with the new form required lots of revisions • Balancing multiple stakeholders’ expectations • Board committees • Lead agencies • Coordinators

  18. Early Successes • Increase in the number of coalitions reporting each quarter • More accurate and timely data • Standardized process

  19. Recommendations • 1. Involve coordinators and all other stakeholders throughout the process • What does everyone want to get out of this? What are the benefits for everyone? Challenges? • Talk with coordinators who report and who don’t report – why aren’t they reporting? • 2. Provide training or a meeting before dissemination of the new tool • Training/explanation beforehand will minimize inaccuracies and misinterpretation

  20. Recommendations • 3. Obtain buy-in from coordinators early on • Talk with them one-on-one about the benefits for them and how to help them • Know what other reports they already have • 4. Pilot test and revise • 5. Expect a slow process – patience is needed!

  21. Measuring Outcomes and Assessing our Impact

  22. Program Evaluation - Initial Focus on 2 Areas • Child Passenger Safety • Largest program • Has the most evidence supporting risk mitigation approaches • Questions to answer: • Do we need to do more, and how much more? (unserved population) • Can we show a reasonable difference in impact between SK and non-SK coalitions? • Behavior changes? • Poisoning • Recently trended upward in incidence • Least evaluated area, least implemented program • Questions to answer: • What needs to be done? What works?

  23. Poisoning Prevention • Program Development • Develop program based on literature review, existing evidence-based programs, coordinator interviews • Develop capacity • Provide resources to coordinators • Create sustainability • Evaluation • Evaluation of instructor training • # of trained Poison Prevention Instructors (capacity) • # of Poison Prevention educational events (increase in education) • Pre- and post-test forms (change in knowledge, behavioral intentions) • 3-month follow-up (behavior change)
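
For the pre- and post-test piece, a minimal sketch of the knowledge-change calculation, assuming paired scores for the same participants in the same order (the function name and scoring scale are hypothetical):

    def knowledge_change(pre_scores, post_scores):
        """Mean paired gain from pre-test to post-test.

        Assumes both lists cover the same participants in the same order.
        """
        if len(pre_scores) != len(post_scores):
            raise ValueError("pre and post scores must be paired")
        gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
        return sum(gains) / len(gains)

    # e.g. knowledge_change([6, 7, 5], [8, 9, 7]) -> 2.0 points gained on average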

  24. Child Passenger Safety • # of CPSTs in coalition counties (capacity) • Training (capability) • # of certification classes • # of recertification classes • Car seat checklist forms • Monitoring misuse over time
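
One way misuse could be monitored over time from the checklist forms is sketched below. It assumes each form yields a quarter label, the number of seats checked, and the number with observed misuse; that layout is an assumption, not the actual checklist.

    def misuse_rate_by_quarter(checks):
        """checks: iterable of (quarter, seats_checked, seats_with_misuse) tuples.

        Returns the observed misuse rate per quarter, or None for quarters
        with no seats checked.
        """
        totals = {}
        for quarter, checked, misused in checks:
            t = totals.setdefault(quarter, [0, 0])
            t[0] += checked
            t[1] += misused
        return {q: (m / c if c else None) for q, (c, m) in totals.items()}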

  25. Child Passenger Safety • Follow-up evaluation pilot project • 3-month follow-up after car seat checks • Demonstrate knowledge retention and behavior change • Partnership with Georgia Department of Public Health • Impact of booster seat legislation (July 2011) • Challenges • Lots of data from multiple sources • Narrowing down the information • Linking injury data to our activities

  26. Assessment • Injury and Death Statistics • For all 6 of the focus areas • Compare to prior 2-3 years • Program Evaluation & Recommendations • Program-specific outcome evaluation • Summary of how SKG “moved the needle” • Narrative success stories and lessons learned • Coalition activity and assessment of effectiveness (paid vs. volunteer) • Changes to programs and coalition building • Additional resources needed (staff and $) • Program Statistics • Activity levels by program, by coalition • Location of activities • Coalition building activities and interaction with other agencies • Program funding & grants • Staff resources devoted per program
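
A simple way to implement the "compare to prior 2-3 years" step is sketched below, assuming injury or death counts keyed by focus area and year; this is illustrative only, not SKG's actual assessment method.

    def vs_prior_years(counts, focus_area, year, window=3):
        """Percent change of `year` vs. the mean of up to `window` prior years.

        counts: dict keyed by (focus_area, year) -> injury or death count.
        """
        if (focus_area, year) not in counts:
            return None
        baseline = [counts[(focus_area, y)]
                    for y in range(year - window, year)
                    if (focus_area, y) in counts]
        if not baseline:
            return None  # no prior data to compare against
        base = sum(baseline) / len(baseline)
        return 100.0 * (counts[(focus_area, year)] - base) / base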

  27. Next Steps • Monitor program activity • Continue to increase the # of coalitions reporting • Create a database for Program Activity Data • Create and distribute quarterly and annual reports • Develop goals and objectives • Track our progress • Identify injury prevention program priorities • Outcome evaluation • Program-specific evaluation
