
  1. Counterfactual impact evaluation of cohesion policy: Examples of innovation support from two regions • D. Czarnitzki (A), Cindy Lopes Bento (A, C), Thorsten Doherr (B) • A) K.U.Leuven • B) ZEW, Mannheim • C) CEPS/INSTEAD, Luxembourg • Warsaw, 12 December 2011

  2. Introduction • Task in this research project: • Explore to what extent publicly available beneficiary data of European Cohesion Policy can be used for a quantitative study of policy impacts at the firm level • Focus: innovation activities of firms • Requirements: • Linking beneficiary data to firm-level information • Amadeus database • Patent database • Other resources containing data about innovation activities at the firm level • Obtain a control group of non-funded firms

  3. Challenge • Beneficiary lists typically include only • Recipient name • Project title • Year of funding approval • Approved amount of funding • Recipient names have to be searched in other databases using text-field search algorithms (a sketch of such a search follows this list) • Potential hits have to be checked manually • For each study, two text-field searches are necessary: • Recipients have to be identified in the Amadeus database • Identified recipients and the control group (obtained from Amadeus) have to be searched in the patent database or another related data source containing information on innovation activities
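
The slides do not name a specific matching algorithm; the following is a minimal sketch of this kind of text-field search, using Python's standard difflib. The firm names, normalization rules, and similarity threshold are all illustrative assumptions, and every candidate hit above the threshold would still be checked manually, as noted above.

```python
from difflib import SequenceMatcher

def normalize(name):
    """Crude normalization: lower-case, strip common legal-form suffixes
    (the suffix list is illustrative, not exhaustive), collapse spaces."""
    name = name.lower()
    for suffix in (" s.r.o.", " a.s.", " gmbh", " ag", " ltd"):
        name = name.replace(suffix, "")
    return " ".join(name.split())

def candidate_matches(recipient, firm_names, threshold=0.8):
    """Return firm names similar enough to the recipient name to warrant
    a manual check, ranked by similarity score."""
    target = normalize(recipient)
    hits = [(firm, SequenceMatcher(None, target, normalize(firm)).ratio())
            for firm in firm_names]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: -h[1])

# One beneficiary name searched against hypothetical Amadeus entries
amadeus = ["SKODA AUTO a.s.", "Skoda Transportation a.s.", "CEZ, a.s."]
print(candidate_matches("Škoda Auto", amadeus))
```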

  4. Example of recipient data

  5. Countries examined • Of the 11 countries/regions investigated: • Poland, Slovakia, Slovenia, Flanders, Wales and London were eliminated because of the small number of projects in these regions/countries. • Spain was eliminated because the data available to us did not include firm names. • French data were tested, but it was impossible to determine the treatment date. • Only the Czech Republic (CZ) and Germany (DE) were retained.

  6. Country case I: Czech Republic • Recipient data: • 26,075 projects • Time period: 2006–2011 • Total amount of € 10,747,210,000 • Avg. amount per project: € 412,265 • How many (different) firms? • Unknown! • The name list is searched in Amadeus (firm database) • The search yields 14,609 different firms

  7. Country case I: Czech Republic • Data: • Total # of firms contained in Amadeus: 14,609 • Total # of firms retained: 12,887 (1,433 treated firms, 11,454 control firms) • Methodology: • “Difference-in-differences” • Compare annual patent applications per firm in the pre-treatment (1997–2003) and treatment (2008/09) phases (see the sketch below)
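
A minimal sketch of the difference-in-differences comparison on a toy firm-period panel; the column names and all numbers below are made up for illustration, not taken from the Czech data.

```python
import pandas as pd

# Toy panel: one row per firm and period; `post` = 0 for the pre-treatment
# window (1997-2003) and 1 for the treatment phase (2008/09).
panel = pd.DataFrame({
    "firm_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],
    "post":    [0, 1, 0, 1, 0, 1, 0, 1],
    "patents": [2.0, 1.7, 1.0, 0.9, 2.0, 0.7, 1.5, 0.6],  # annual applications
})

# Mean annual patent applications in each treated x period cell
cell = panel.groupby(["treated", "post"])["patents"].mean()

# DiD estimate: change among treated minus change among controls
did = (cell[1, 1] - cell[1, 0]) - (cell[0, 1] - cell[0, 0])
print(f"Difference-in-differences estimate: {did:+.2f}")
```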

  8. Country case I: Czech Republic • Patenting fell by 63% among controls, but by only 14% among treated firms • Highly statistically significant (chi² = 12.07, p < 0.01; illustrated below) • This understates the impact, since some firms are still in their pre-treatment phase (data lag) • Patenting here is really a proxy for a wider range of innovative activities • The next example has data for a wide range of innovation activities…
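
The slide does not show how the chi² statistic was computed; one common way to obtain such a statistic is a 2×2 contingency test of “patenting declined vs. did not” by treatment status. The counts below are purely illustrative and do not reproduce the reported chi² = 12.07.

```python
from scipy.stats import chi2_contingency

# Purely illustrative 2x2 table: rows = treated / control firms,
# columns = patenting declined / did not decline. Not the project's data.
observed = [[14, 86],    # treated firms   (hypothetical counts)
            [63, 37]]    # control firms   (hypothetical counts)

chi2, p, dof, _ = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f} (dof = {dof}), p = {p:.2g}")
```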

  9. Country case II: Germany • Recipient data: • 47,616 projects (of which 33,201 in Eastern Germany) • Time period: 2006–2011 • Total amount of € 9,060,653,000 • Avg. amount per project in Eastern (Western) Germany: € 92,400 (€ 415,923) • The name list is searched in the “Mannheim Innovation Panel” • “Outcome” variables: • R&D investment (R&D intensity = R&D / Sales) • R&D employment divided by total employment • Total innovation investment / Sales • Investment in physical assets (relative to capital stock) • Innovation types

  10. Link to the Mannheim Innovation Panel • MIP = German part of the CIS (Community Innovation Survey) • Annual survey; asks about 5,000 firms about their innovation activities • We can link 5,606 different grants to the MIP. These correspond to 1,904 different firms. • Restricting the time period to 2007–2010, we “lose” firms that are in the MIP but not in the relevant years. • After removing observations with missing values in the variables of interest, we can use a final sample of 623 supported firms. • Control group: 21,226 observations • Estimator: nearest neighbor matching (see the sketch below)
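
The slides only name the estimator; the following is a minimal sketch of nearest-neighbor matching on an estimated propensity score, with synthetic data. The covariates, matching metric, and any caliper used in the actual study are not given on the slides, so everything here is an illustrative assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)

# Synthetic firm data: two pre-treatment covariates (e.g. log size, log age),
# treatment assignment correlated with them, and an outcome such as R&D
# intensity. Everything here is simulated for illustration.
n = 2000
X = rng.normal(size=(n, 2))
treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))      # selection on X
y = 0.05 * treated + 0.1 * X[:, 0] + rng.normal(0, 0.2, n)

# 1) Propensity score: P(treated | X)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2) For each treated firm, pick the control with the closest score
controls = np.flatnonzero(~treated)
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched = controls[idx.ravel()]

# 3) Average treated-vs-matched-control outcome difference (the ATT)
att = (y[treated] - y[matched]).mean()
print(f"Matched ATT estimate: {att:+.3f} (true simulated effect: +0.05)")
```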

  11. Germany

  12. Germany • [1] Due to missing values, the number of observations is 16,748 for the non-subsidized firms and 488 for the subsidized firms for R&D employment. • [2] Due to missing values, the number of observations is 9,291 for the non-subsidized firms and 330 for the subsidized firms for innovation intensity.

  13. Germany

  14. Germany • Robustness tests: • Restrict the sample to innovating companies • as the purpose of a project is not systematically innovation support (innovation vs. something else) • Main results reported earlier hold, BUT: • Once we control for subsidies received from the German Federal Government, all positive effects decrease somewhat in magnitude, and their statistical significance also decreases slightly. • Cohesion Fund recipients are also more likely to receive other subsidies!

  15. Germany • Robustness test: does the size of the grant matter?

  16. Lessons learned • Reporting standards should be improved. Otherwise a quantitative evaluation lacks credibility or produces no results because of noisy data. • What should be reported at a minimum? • Funding start and end dates in addition to the amount • Type of recipient (firm vs. other) • Purpose of the grant • Recipient name AND location • All in database-compatible formats • And… • If possible, historical data should be stored centrally, e.g. by the EC. • A longer time lag between programme completion and its evaluation should be allowed.

  17. Q&A: Discussion • Contact: Prof. Dr. Dirk Czarnitzki • K.U.Leuven, Dept. of Managerial Economics, Strategy and Innovation • Phone: +32 16 326 906 • Fax: +32 16 326 732 • E-Mail: dirk.czarnitzki@econ.kuleuven.be
