
Counterfactual impact evaluation


Presentation Transcript


  1. Counterfactual impact evaluation What it tells us… and what it doesn't Daniel Mouqué DG Regional Policy, European Commission

  2. Reminder: Counterfactual = comparison

  3. In practice, comparison group

  4. This imposes conditions… • A similar intervention applied over a large "n" (law of large numbers)
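
A minimal sketch of what "counterfactual = comparison group" means in practice, with invented figures not taken from any of the studies cited: the estimated impact is simply the gap in average outcomes between supported units and similar non-supported units.

```python
# Minimal sketch of a comparison-group estimate (all figures are invented).
# Impact is read as the difference in mean outcomes between supported firms
# and a comparison group of similar, non-supported firms.

supported = [112, 98, 105, 120, 101]    # e.g. employment after support
comparison = [100, 95, 99, 108, 97]     # similar firms without support

def mean(values):
    return sum(values) / len(values)

estimated_impact = mean(supported) - mean(comparison)
print(f"Estimated average impact: {estimated_impact:+.1f}")
```

With only a handful of heterogeneous projects the same difference mostly reflects noise, which is why the slide stresses a similar intervention over a large "n".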

  5. … which only hold for certain measures • Interventions which target individuals or enterprises • Not infrastructures (exception: the impact of infrastructures on individuals) • Perhaps for area-based initiatives (provided similar goals/means)

  6. There are also data needs • Good data on the intervention (nature, scale, dates) • Good data on target indicators (before and after, including for non-beneficiaries) • The ability to link 1 and 2
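
To make the third bullet concrete, a hedged illustration of linking intervention records to target indicators, using pandas and invented firm identifiers (no real scheme data):

```python
import pandas as pd

# Hypothetical intervention records: which firm got support, how much, and when.
support = pd.DataFrame({
    "firm_id": [1, 2, 3],
    "grant_eur": [50_000, 120_000, 80_000],
    "year_granted": [2009, 2009, 2010],
})

# Hypothetical target indicators, before and after, for beneficiaries AND non-beneficiaries.
indicators = pd.DataFrame({
    "firm_id": [1, 2, 3, 4, 5],
    "employment_2008": [10, 40, 25, 12, 30],
    "employment_2012": [14, 45, 28, 12, 31],
})

# The crucial step: linking the two sources on a common identifier.
linked = indicators.merge(support, on="firm_id", how="left")
linked["supported"] = linked["grant_eur"].notna()
linked["emp_change"] = linked["employment_2012"] - linked["employment_2008"]

# Average employment change, supported vs non-supported firms.
print(linked.groupby("supported")["emp_change"].mean())
```

The left join keeps non-beneficiaries in the table, which is exactly what building a comparison group requires.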

  7. Lessons learned from enterprise support studies • DG Regional Policy has been doing and encouraging such studies since 2008 • What are we learning? And what would we like to know? In terms of: • Investment, capital constraints and other market failures (and how these vary by firm and support size) • Impact of support on the enterprise (productivity, innovation, employment)

  8. What do we learn… … about investment, capital constraints and other market failures?

  9. Impact on investment in Eastern Germany (GEFRA 2010)

  10. E. Germany not an isolated example

  11. Small is beautiful 1 – the firms

  12. Small is beautiful 2 – the support • ASVAPP (2012): even controlling for firm size, smaller grants were more effective (cost per job €79,000 for the smallest grants, rising to €489,000 for the largest) • ASVAPP (2012): an outright grant to SMEs had a similar effect to a soft loan of the same size • Czarnitzki et al. (2011): the presence or absence of a grant was the crucial factor - the smallest grants had almost the same innovation impact as the largest • Comparing across studies: schemes with smaller support amounts tended to have better results (e.g. RSA, UK)

  13. Business advice can be cost-effective • Better survival rates 2-4 years later in North Jutland • €7,500 per net firm; €1,500 per net job (Rotger and Gørtz, 2009)
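
The cost-per-outcome figures on this and the previous slide follow from dividing programme spend by the counterfactually estimated net effect. The sketch below uses invented spend and effect totals, chosen only so that the resulting ratios reproduce the €1,500 per net job and €7,500 per net firm quoted above:

```python
# Cost per net outcome = programme expenditure / counterfactually estimated net effect.
# Spend and effect totals below are invented; only the resulting ratios match the slide.

programme_spend_eur = 3_000_000
net_additional_jobs = 2_000      # jobs beyond what the comparison group implies
net_surviving_firms = 400        # extra surviving firms vs. the counterfactual

print(f"Cost per net job:  €{programme_spend_eur / net_additional_jobs:,.0f}")
print(f"Cost per net firm: €{programme_spend_eur / net_surviving_firms:,.0f}")
```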

  14. What do we learn? • Capital is rationed for SMEs, but only partially • Grants help – they do not substitute for private money • This argument applies to small enterprises and (probably) to medium-sized but not large firms • Less support and/or financial instruments would still work • Capital constraints are not the only market failure: success of advice => information failures more serious, at least for the smallest and newest firms?

  15. What would we like to know? • The mechanism for capital constraints? Knowing this would help with… • Targeting by firm? And which firms are too big for support? • More effective solutions than direct financial support? (e.g. change the capital market) • What is the optimal level and form of support? • What information failures? • What is good soft support (incl. business advice)? • How to target/tailor by context and firm? => Need more CFs and other types of evaluation

  16. What do we learn… … about impacts on the firm? Productivity, innovation & jobs

  17. Broader more often than deeper

  18. A closer look at some exceptions • CEBR (2010) in DK: innovation consortia increased profitability by 12% vs controls over a 10-year period (adding up to €260,000 in extra profits per firm) • Czarnitzki (2007): R&D subsidies in Germany had a significant effect on research and innovation where the firm also benefitted from networking • Czarnitzki (2007): in Finland both financial R&D support and networking were effective, and additive

  19. CIS indicators, Germany (Czarnitzki, 2011)

  20. But innovation is not a panacea • GEFRA (2010): the investment impact of R&D grants was lower than that of modernisation grants (leverage 0.9-1.0 vs 1.4-1.5). Are the innovation benefits worth the loss in impact? • De Blasio, Fantino & Pellegrini (2009): no additional impact from the investment scheme; its less tangible nature => more possibilities for deadweight
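
Reading "leverage" here as the counterfactually estimated additional investment induced per euro of grant (an assumption about the definition used in the GEFRA study, which may differ in detail), the comparison on this slide amounts to:

```python
# Leverage read as: additional investment induced per euro of grant.
# (This reading of the GEFRA figures is an assumption; definitions vary across studies.)

def leverage(extra_investment_eur: float, grant_eur: float) -> float:
    return extra_investment_eur / grant_eur

# Hypothetical scheme-level totals per €1m of grant:
print(f"Modernisation grants: {leverage(1_450_000, 1_000_000):.2f}")  # in the 1.4-1.5 range
print(f"R&D grants:           {leverage(950_000, 1_000_000):.2f}")    # in the 0.9-1.0 range
```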

  21. Jobs created, but fewer than monitoring data suggest
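
One hedged way to read the gap flagged on this slide: monitoring systems count gross jobs in supported firms, while counterfactual studies estimate the net jobs that would not have appeared anyway. A sketch with invented numbers:

```python
# Monitoring typically reports gross jobs in supported firms; a counterfactual study
# nets out what comparable non-supported firms achieved anyway. Figures are invented.

gross_jobs_monitored = 1_000
net_jobs_estimated = 600

share_attributable = net_jobs_estimated / gross_jobs_monitored
print(f"Share of monitored jobs attributable to support: {share_attributable:.0%}")
```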

  22. Job quality good • ASVAPP (2012): average firm salary and productivity were the same or slightly higher • Trzciński (2011): jobs created in SMEs received similar pay rises to those in the control group – and the jobs were maintained five years after support

  23. What do we learn? • Relatively easy to make firms proportionately bigger (e.g. with grants) • More difficult to make firms more innovative/productive (is soft support better?) • Measures with less tangible targets, e.g. innovation, can be abused (maybe we knew this already?)

  24. What is left unanswered? And how would we answer this? • Is soft support really the key to innovation? • What is the mechanism for productivity and innovation? What types of innovation are influenceable, how to target by firm, etc.? • What constitutes a "smart" support package? What soft support, and what mix with financial support? • How to avoid abuse of innovation and networking measures? => Need more CFs and other types of evaluation

  25. In conclusion …

  26. In summary • New lessons from CFs about impacts (partial capital constraints for SMEs, a scaling-up effect, the importance of information failure) • More to learn about impacts from CFs (e.g. soft support, financial instruments) • Need other evaluation methods to open the "black box" of mechanisms (targeting the most effective solutions, the best investments) • Some factors too intangible for a quantified approach? (innovation)

  27. Beware the man whose only tool is a hammer… • … for every problem comes to resemble a nail • - Abraham Maslow

  28. For further information InfoRegio: ec.europa.eu/inforegio Impact evaluation centre: http://ec.europa.eu/regional_policy/information/evaluations/guidance_en.cfm#2
