
Ongoing evaluation and interactive research for cluster development, innovation and growth



  1. Ongoing evaluation and interactive research for cluster development, innovation and growth Göran Brulin, senior analyst and professor, Swedish Agency for Economic and Regional Growth

  2. Lisbon Agenda and Europe 2020: According to the Europe 2020 strategy, an economy based on knowledge and innovation should be developed. The EU should promote a resource-efficient, greener and more competitive economy that leads to growth. It should foster a high-employment economy that helps deliver social and territorial cohesion.

  3. Fifth report on economic, social and territorial cohesion: Investing in Europe!
  • Europe 2020 emphasises the need for more innovation. For example, only one region in ten has reached the Europe 2020 target of investing 3 % of GDP in R&D (currently 1.9 %; 3.75 % in Sweden).
  • Close coordination is needed between Cohesion Policy and other EU policies, as well as national and regional policies!
  • WHAT ABOUT THE SO-CALLED SWEDISH PARADOX?

  4. Fifth report on economic, social and territorial cohesion: Investing in Europe!
  • Higher-quality, better-functioning monitoring and evaluation systems are crucial for moving towards a more strategic and results-oriented approach to cohesion policy.
  • Measurable targets and outcome indicators. Indicators must be clearly interpretable, statistically validated, truly responsive and directly linked to policy intervention, and promptly collected and publicised.
  • Ex-ante evaluations should focus on improving programme design so that the tools and incentives for achieving objectives and targets can be monitored and evaluated during implementation.
  • Plans for ongoing evaluation of each programme would become an obligation, since they facilitate transparency at EU level, foster evaluation strategies and improve the overall quality of evaluations. Evaluations could also be envisaged once a certain amount of the funds has been certified to the Commission.

  5. Investments in innovation and clusters call for changes in how programmes and projects are evaluated! There is a lack of learning approaches in evaluations! Evaluations are conducted just for the sake of evaluation! Evaluations are ritual and symbolic activities rather than processes for critical and constructive knowledge formation!

  6. According to Alexander Reid, Technopolis:
  • In the 2007–13 programming period, investment in clusters and ‘R&D and innovation’ represents a six-fold increase compared to the previous period: €86 billion!
  • Structural Fund management authorities will increasingly be required to appraise the performance of innovation measures!
  • Few mid-term evaluations (MTE) for the 2000–2006 period provided real insights that contributed to significantly improved management of clusters and RTDI interventions.

  7. • Innovation is risky and unpredictable: which particular activity/intervention will work or prove useful, who will benefit, when exactly, and under which circumstances.
  • Firms rarely innovate in isolation: they rely on a system of networks and cooperation with customers/users; open innovation, user-led innovation, etc.
  • This makes appraisal and direct cause-effect analysis of measures very difficult:
  - time lag between intervention and impact (long for innovation projects), attribution problems and the project fallacy
  - skew: only a few publicly funded innovation projects from a portfolio of projects will produce significant economic impact

  8. • Innovation measures should improve innovation performance/behaviour in firms in a lasting way:
  • Longer-term view: innovation surveys and econometric analysis of the impact of innovation on competitiveness, etc.
  • Medium-term view: observed changes in co-operation patterns, modes of innovation, innovation expenditure, etc.
  • Short-term view: direct results of the projects financed (people trained, new technologies adopted, spin-offs created, etc.).
  Evaluate and shape learning processes about the preconditions for innovation and clusters!

  9. The main task in the new generation of evaluation is to organise reflective learning processes for continuous improvement! Mid-term evaluation came too late and was too expensive, with minor impact on the management of the programmes. Quantitative rather than qualitative focus. Indicator-oriented evaluations mean a heavy focus on activities in the projects (at the expense of the overall objectives). How convincing are the core indicators? Limited learning in the projects, in the programme context and between stakeholders in the national framework (labour market policy and regional growth policy actors).

  10. Learning spiral: ”Public debate” for sustainable regional growth and jobs! Ongoing evaluation for continuous improvement in programmes and projects. Dynamic learning processes within and between regions to trigger multipliers! Learning networks between projects, programmes and management authorities. ”Evaluation capacity building”.

  11. Actions taken in Sweden during the programming period 2007–2013: Ongoing evaluation of the eight ERDF programmes. Ongoing evaluation of large projects. Joint evaluation by the ESF and ERDF of the implementing organisations. Joint university course and reader in Learning Through Ongoing Evaluation at seven universities.

  12. Programme-level evaluation:
  - Project portfolio composition seems to be in line with the priorities and measures in the eight programmes. Although the programmes were developed during the boom of 2007, they seem to function well in the present economic situation.
  - The intended focus on innovation this programming period has resulted in a three-fold increase in investment in R&D and innovation (RDI) in cooperation with universities and university colleges!
  - ERDF funding supports the idea of the ”entrepreneurial university”, with knowledge ”spillovers” between academia and its surroundings as well as academic entrepreneurship.

  13. Billions to RD&I and clusters in the Swedish ERDF programmes
  • More than SEK 3 billion has so far been granted by the ERDF programmes; including co-financing, more than SEK 6 billion (EUR 63 billion in the EU as a whole!). This corresponds to approximately 40 % of the total effort in the programmes.
  • The major project owners are universities, but there are also many others.

  14. Findings from the ongoing evaluation of innovation and clusters:
  1) Projects appear to be more innovative if the project owners are actors other than the scientific community.
  2) There is a need for clearer plans that lead to commercialization (resources are too often used for traditional research outputs such as patents that are not carried forward to commercialization, peer-reviewed articles and so forth).
  3) Cooperation with business should be increased in projects as well as in monitoring committees and partnerships!
  4) Projects should be initiated in consultation with experts, with a focus on the degree of innovativeness.
  5) There is a need for regional innovation strategies that put clusters in a regional growth context.

  15. 6) There should be an ambition to constantly look at how ERDF projects can interact with other national programmes, venture capital fund projects and other parts of the EU programmes and funds; for example, how can Structural Fund projects be carried forward into a Framework Programme project?
  7) 7th Framework Programme projects, too, seldom lead to sharp innovations, but rather to ”intermediate knowledge outputs” (Technopolis evaluation). Too often the participants are traditional companies rather than innovative ”young” businesses.
  8) Ongoing evaluation of clusters should be done with a clear focus on innovation, commercialization and business participation.
  9) The learning that takes place between clusters and other actors should have a much stronger focus on innovation, commercialization and growth.
  10) Simplification: according to the business community there is too much detailed control rather than a focus on the overall objectives of doing business.

  16. Twelve venture capital fund projects:
  - A truly experimental activity that will be evaluated during implementation, i.e. five years of ongoing evaluation.
  - Do the funds act as good venture capitalists, given their mission, planning and regional conditions? Will they contribute to new structures that increase venture capital for SMEs?
  - Will such funds have an impact on regional growth, on venture capital markets and at EU level?
  - Ongoing evaluation will be conducted in cooperation with the Swedish Agency for Growth Policy Analysis (responsible for international overview, research overviews and ex-post evaluation).

  17. Structural Fund programmes could do better at supporting innovation, job creation and growth if they work as ”venture capitalists”, with ongoing evaluation of projects and learning processes within and between the projects. In that way, both experience from processes and knowledge about ”products” and methods can be gathered. Through participation in public debate, insights are spread and regional development is energized.

  18. Ongoing evaluation has led to different management authority (MA) actions:
  - setting up arenas for learning, knowledge formation and the transfer of best practice.
  - training and education of management authority staff, often in cooperation with regional development actors (education of evaluators).
  - dissemination of results and generated knowledge to stakeholders and regional structural partnerships.
