
Evaluation of the AQHI Program in Canada (San Diego, 2011): Challenges and Advantages

In this presentation, given in San Diego in 2011, Sharon Jeffers (Environment Canada) and Kamila Tomcik (Health Canada) describe the evaluation of Canada’s Air Quality Health Index (AQHI) program: its funding cycles, the definitions of formative, summative, and developmental evaluation, the initial and evolving challenges, the advantages and disadvantages of their bottom-up approach, and the management of a fluid evaluation process. They emphasize the importance of collaboration, persistence, and capacity building.


Presentation Transcript


  1. Evaluation of the Air Quality Health Index Program in Canada – San Diego, 2011. Sharon Jeffers, Environment Canada (EC); Kamila Tomcik, Health Canada (HC)

  2. Background • The AQHI is the first multi-pollutant, health-risk-based air quality index in the world (a sketch of the underlying calculation follows below) • Multiple (really, really multiple) partners and stakeholders for both development and implementation (the whole is greater than the sum of the parts) • Piloted and implemented in different jurisdictions at different times and in different ways.
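For context, here is a minimal sketch of what the index actually computes, based on the published 2008 AQHI formulation (Stieb et al.): it sums the excess health risk associated with 3-hour average concentrations of NO2 and O3 (in ppb) and PM2.5 (in µg/m³), then scales the result onto a 1-to-10+ reporting scale. The function name and rounding below are our own simplification for illustration, not the operational implementation:

  from math import exp

  def aqhi(no2_ppb, o3_ppb, pm25_ugm3):
      # Excess-risk term for each pollutant, using 3-hour average
      # concentrations; coefficients are from the 2008 formulation.
      excess_risk = (
          (exp(0.000871 * no2_ppb) - 1)
          + (exp(0.000537 * o3_ppb) - 1)
          + (exp(0.000487 * pm25_ugm3) - 1)
      )
      # Scale the summed risk onto the reporting scale and round;
      # reported values of 1-3 are "low risk", 4-6 "moderate",
      # 7-10 "high", and anything above 10 is shown as "10+".
      index = (10.0 / 10.4) * 100.0 * excess_risk
      return max(1, round(index))

  # Example: moderate pollution levels give an AQHI of about 4.
  print(aqhi(no2_ppb=20.0, o3_ppb=35.0, pm25_ugm3=12.0))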

  3. Funding and Evaluation • Our funding comes in four- to five-year cycles, with a Treasury Board-required evaluation at the end of each cycle • But in reality, we have undergone some kind of evaluation almost every year since 2001

  4. Definitions • Formative Evaluation – used when a program is under development or being formed; the focus is on the implementation process • Summative Evaluation – done when a program is “mature” and you want to see measurable results (show me the numbers).

  5. One more… Developmental Evaluation • New field • Google Michael Quinn Patton • Deals with social change programs that do not lend themselves well to summative evaluation (complex programs, as distinct from simple or merely complicated ones)

  6. Getting Started • We started a bottom-up process in response to a poorly organised top-down process • The logic model framework and indicators that were being proposed weren’t representative of what the program was supposed to accomplish or of how it was being implemented

  7. Getting Started • We did not want to be responsible for reporting on, and being evaluated against, indicators that were disconnected from the reality of the program

  8. What we did… • Put together a working group composed of both program and evaluation staff from both federal agencies (EC and HC) • Developed our own program logic model and performance indicators • These were then approved by senior management in both departments and fed back up through the system • This wasn’t as easy as it sounds here….

  9. Initial Challenges • Getting the right people sitting around the table • Proving the value of what we were doing to some program managers and senior management • Multiple players with multiple priorities – we had to show how we fit in with higher-level priorities, often with no notice

  10. Initial Challenges • Changing players at all levels – just when you got to know someone, they were gone, and each time we lost visibility with the people who need to know we are there • The older AQI and the AQHI co-exist in several jurisdictions, so most of the baseline data are for the AQI and are not necessarily transferable (this also creates confounding and public confusion between the two programs)

  11. Evolving Challenges • Maintaining commitment • Still have to deal with changing players • Getting relevant, AQHI-specific data and then making sense of it (getting at “show me the numbers”) • Actually measuring some of our indicators – data from partners are not standardized

  12. Evolving Challenges • Validation of the program logic • Recommendations are not always followed up on • Getting at behaviour change (developmental evaluation) – attribution of any measured change to the program • AQI and AQHI still co-exist in many jurisdictions – we still need AQHI-specific baseline data, and we need them now

  13. Addressing the Challenges • Persistence – hang in there • Demonstrate early the value of what you are doing – i.e., “what’s in it for management?” • The entire process seemed overwhelming, so we broke it down into small bites, starting with developing the program logic model • Got training in evaluation • Collaboration between AQHI program staff with evaluation experience and evaluation specialists

  14. Advantages of our approach • Gives you some influence in the process, rather than having one imposed from above – in our case, it turned out to be a lot of influence • More opportunity to intervene effectively to prevent confusion from arising • Helped de-mystify the evaluation process for many program staff and managers • Builds capacity for the long term

  15. Disadvantages • It wasn’t easy, and it still isn’t easy • Not all the partners are equally engaged, which slows things down • Raises expectations – it will be much harder once we undergo a summative evaluation • May be hard to replicate – it depended on unique opportunities (e.g., available program staff with evaluation expertise)

  16. Results • Formative evaluation results were mainly positive and gave useful recommendations • The program performance measurement framework and indicators provide a focus for smaller partner agencies – better alignment of individual programs and projects with national program goals • Helped identify key data gaps – now we can work to fill those gaps • Helped make the case for continued funding for the next five years

  17. Fluid process • Have to stay on top of things • Staff change • Departmental priorities change • Funding changes • Organisational structures change • Some of the assumptions made for the program logic prove to be wrong

  18. What’s next • The next round of funding is coming up • Submission to Treasury Board (TB) for the next five years • TB will require an evaluation before the end of the five years • It is highly likely that this will be a summative evaluation

  19. What’s next • Establishing a performance management committee • Membership is open to all partners and stakeholders • To have performance measurement results used in decision making, future planning, data and information dissemination, knowledge transfer, and communication functions • To work together to address gaps in both program logic and data collection

  20. Au revoir… If you want more details about what we did, feel free to contact us: Sharon.Jeffers@ec.gc.ca (514-283-8621) Kamila.Tomcik@hc-sc.gc.ca (902-426-9449)
