
Rationales and evolution of public 'knowledge policies' in the context of their evaluation

Explore the evolution of public knowledge policies and their evaluation in the context of socio-economic development and innovation. Discover the different rationales behind research and innovation policies and their impact on the innovation system.


Presentation Transcript


  1. Rationales and evolution of public 'knowledge policies' in the context of their evaluation
  Professor Stefan Kuhlmann, University of Twente, The Netherlands
  Seminário Internacional CGEE “AVALIAÇÃO DE POLÍTICAS DE CIÊNCIA, TECNOLOGIA E INOVAÇÃO” – Diálogo entre Experiências Internacionais e Brasileiras [Evaluation of Science, Technology and Innovation Policies – a Dialogue between International and Brazilian Experiences]
  Rio de Janeiro, 3-5 December 2007

  2. 'Knowledge policy' issues
  • Growing dependence of socio-economic development and welfare on a sustainable 'knowledge base', in particular science, technology, innovation – and education
  • Internationalisation of industrial research, not caring about national borders
  • International search for creative, highly skilled people; mobility of top researchers
  • Knowledge production: (re-)discovery of a "mode 1 – mode 2 transition" (M. Gibbons et al. 1994); advanced technology and innovation: "fusion" of heterogeneous trajectories (Kodama 1995)
  • Emergent, generic S/T (e.g. nano) with many promises: 'new actors' matter – NGOs, ELSA, 'users'
  • Public research and innovation policy: push for efficiency, evidence and evaluation; 'bureaucratised' semi-industrial R&D (Ziman 2001, 82)
  • Search for 'intelligent' policy designs: 'systemic' policy instruments; concern about 'policy mix' and 'governance'
  • Multi-level arenas and governance [EU: large number of policy actors at national, regional, and transnational levels; re-shuffling of the institutional research landscape]

  3. Overview: 7 steps
  1. 'Innovation system' concept; evolution of policy instruments and rationales
  2. Detour: 'One size does not fit all'
  3. Weaknesses of present research and innovation policy
  4. Stakeholders' diverging interests, policy arena, governance
  5. Role of 'Strategic Intelligence'
  6. Options and limitations of 'impact' evaluation
  7. Evaluation methodologies

  4. ‘Innovation system’ concept; evolution of policy instruments and rationales

  5. The innovation system model: a heuristic tool
  [Figure: the innovation system model, showing the education and research system (higher education and research, public sector research, professional education and training), the industrial system (large companies, mature SMEs, new technology-based firms) and the political system (government, governance, R&I policies), linked by intermediaries (research institutes, brokers) and co-evolving with demand (consumers' final demand, producers' intermediate demand); all embedded in framework conditions (financial environment; taxation and incentives; propensity to innovation and entrepreneurship; mobility) and infrastructure (IPR and information, innovation and business support, banking and venture capital, standards and norms). The diagram indicates the potential reach of public policies. Source: Kuhlmann & Arnold 2001]

  6. Innovation system (IS) – a heuristic
  • IS = an analytical heuristic (NOT a normative perspective)!
  • IS = 'biotopes' of all those institutions that are engaged in scientific research and the accumulation and diffusion of knowledge, that educate and train the working population, develop technology, produce innovative products and processes, and regulate and distribute them.
  • IS extend over schools, universities, research institutions (education and science system), industrial enterprises (economic system), the politico-administrative and intermediary authorities (political system), as well as the formal and informal networks of the actors of these institutions.
  • One can conceptualise national, regional, sectoral, and technological IS. Each IS is different.
  • Sustainable innovation systems develop their special profiles and strengths only slowly, over decades or centuries. They are based on evolving exchange relationships among the institutions of science and technology, industry and the political system (= co-evolution).

  7. Research and innovation policy instruments

  8. Rationales of research and innovation policies
  • Mission rationale: e.g. defence; energy production and conservation; medicine and public health; space; agriculture; S/T for "public goods" as an aim of public investments (OECD 1995), e.g. "sustainable development"
  • Market failure rationale: externalities; indivisibilities; risks
  • Co-operative policy rationale: e.g. the "Specific Programmes" under the EU Framework Programmes, or the "Verbundforschungsförderung" (co-operative R&D between public sector institutes, universities, and industry) in Germany
  • "System failures" rationale: need for structural change in the innovation system, e.g. government initiatives aiming at overcoming sclerotic institutions and procedures, for instance in the academic research system
  • Counter-rationale, "government failures": e.g. institutional inertia; lack of reliable information (on efficiency and impacts of policies etc.); lack of continuity and long-term perspective; red-tape procedures; rivalry of bureaucracies
  Note: actual policymaking only seldom follows such rationales!

  9. IS 'success factors' and political support
  [Figure. Source: Hekkert et al. 2006]

  10. 'Innovation system' concept; evolution of policy instruments and rationales
      Detour: 'One size does not fit all'

  11. Detour (1): 'One size does not fit all' – the PRIME ERA Dynamics concept (work in progress)
  • NSI too static (building on specialisation of 'national' or 'sectoral' systems) – difficulty coping with complex dynamics in knowledge production and global socio-economic change
  • Core hypothesis: different 'search regimes' in knowledge production correspond to different institutional settings and policies = evolving 'configurations'
  • Implications for policy design: different knowledge dynamics appearing in different 'configurations' will evolve with different policy mixes
  • Aim of project: identification of a limited set of 'ideal-type' knowledge configurations and the characteristics of related institutions and policy mixes – helping to design policy concepts in a prospective manner (foresight)

  12. Detour (2): Knowledge dynamics and institutional "configurations"
  Knowledge configurations correspond to:
  • specific characteristics of different knowledge dynamics
  • techno-industrial dynamics: market characteristics, sectors; user behaviour and expectations
  • institutional landscapes
  • regulation
  • character of policy arenas and agency
  • public policy initiatives and traditions
  • historical path dependency [degree of 'Europeanisation']
  • the involved actors, their ambition, strategy and power

  13. Detour (3): Knowledge dynamics and institutional "configurations"
  Knowledge dynamics have three main 'aspects'*:
  • Growth = capacity to survive and/or prosper within the same institutional and organisational setting. Indicators: publications, patents, exports
  • Convergence = modalities of knowledge flows, in particular opposing 'individual' vs. 'distributed' knowledge, and the collaboration patterns
  • Complementarities:
    – technical complementarities = role of large shared infrastructures or equipment (critical infrastructures)
    – cognitive complementarities = collaboration patterns (bilateral vs. multilateral, e.g. networks and clusters); critical mass, competences to be assembled to develop a relevant 'research production unit'
    – institutional complementarities = heterogeneous collaboration for efficient productive settings (e.g. the strong relationship between clinicians and biologists in biotechnology); frequency of industry-university collaborations
  (* Building on Bonaccorsi 2005ff.; see also the rich body of literature in innovation economics, sociology of science, and science and technology studies (STS).)

  14. Detour (4): 'Anecdotal' examples of knowledge dynamics
  [Figure. Source: Laredo 2006]

  15. 'Innovation system' concept; evolution of policy instruments and rationales
      Detour: 'One size does not fit all'
      Weaknesses of present research and innovation policy

  16. Weaknesses of research and innovation policy (in the EU and elsewhere)
  • Lack of governance mechanisms allowing for more 'systemic' orchestration between diverse knowledge- and innovation-related policy domains (see LEG 2007)
  • High degree of departmentalisation and sectoralisation of the political administration, and low inter-departmental exchange and co-operation
  • Heterogeneous, unlinked arenas: often 'corporatist negotiation deadlocks'
  • Failing attempts at restructuring responsibilities in government because of institutional inertia
  • Dominance of the 'linear model' of innovation in policy approaches
  • 'Innovation policy' run as a very specific, narrow field, focusing on the introduction of new technologies in SMEs, IPR or VC issues, etc.

  17. Need for 'systemic' policy instruments
  • Re-shaping of innovation and research systems: facilitating the construction (Neue Kombinationen) and deconstruction of subsystems; preventing lock-in; supporting prime movers; ensuring that all relevant actors are involved
  • Stimulating demand articulation, strategy and vision development
  • Building cross-linking platforms and "new spaces" for learning and experimenting
  • Providing and exploiting an infrastructure for distributed "strategic intelligence" (building on technology assessment, foresight, evaluation, benchmarking, etc.)

  18. 'Innovation system' concept; evolution of policy instruments and rationales
      Detour: 'One size does not fit all'
      Weaknesses of present research and innovation policy
      Stakeholders' diverging interests, policy arena, governance

  19. Public research and innovation policy actors' arena – a heuristic
  [Figure: an arena of actors grouped around 'Strategic Intelligence' – national research centres, contract research institutes, multinational companies, universities, research councils, SME associations, consumer groups, industrial associations, the national research ministry, environment groups, the national parliament, other national ministries, the EU Commission, regional governments]
  • Organised actors: differing interests, values, and power; bounded rationality
  • Competition for impact and resources
  • No dominant player?
  • Contested policies
  • Search for (some) alignment and policy learning – otherwise 'exit'
  • 'Enlightenment' through 'Strategic Intelligence'

  20. 'Innovation system' concept; evolution of policy instruments and rationales
      Detour: 'One size does not fit all'
      Weaknesses of present research and innovation policy
      Stakeholders' diverging interests, policy arena, governance
      Role of 'Strategic Intelligence'

  21. Policy learning
  • Concept of bounded rationality in policy learning processes: providing insights into the limited but relevant impact on the quality of decision-making for "rationalisation"-based policies (see Braun/Benninghoff, referring to Heclo, Hall, Olsen/Peters)
  • Concept of single- and double-loop learning (Argyris/Schön 1978): first-order (single-loop) learning helps "to keep organizational performance within the range set by organizational norms. The norms themselves [...] remain unchanged"; second-order (double-loop) learning concerns "incompatible organizational norms by setting new priorities and weightings of norms, or by restructuring the norms themselves together with associated strategies and assumptions".

  22. Strategic intelligence and policy learning
  Strategic intelligence is ...
  • ... a set of sources of information – often distributed and heterogeneous
  • ... explorative/empirical as well as analytical (theoretical, heuristic, methodological) tools
  • well-known strategic intelligence tools are evaluation studies, performance measurement, benchmarking initiatives, foresight exercises, and technology assessment (TA)
  • employed to produce "multi-perspective" insight into the actual or potential costs and effects of public or private policy and management, to be 'injected' into and 'digested' in political arenas
  • facilitating policy learning

  23. Analysing the dynamics of research and innovation systems
  [Figure, organised along three dimensions:]
  • Analyses & methods: evaluation (ex post, monitoring, ex ante); Delphi, scenarios, TA; policy analysis; institutional analysis; statistical-econometric analyses; network analysis
  • Objects: sectors and technologies (retrospectively, prospectively); actors (companies, science, policymakers); innovation processes (micro, meso, macro)
  • Indicators: corporate data; sectoral techno-economic performance; bibliometrics; regulatory data (e.g. norms, standards)

  24. ‘Forum’ and ‘Strategic Intelligence’

  25. 'Innovation system' concept; evolution of policy instruments and rationales
      Detour: 'One size does not fit all'
      Weaknesses of present research and innovation policy
      Stakeholders' diverging interests, policy arena, governance
      Role of 'Strategic Intelligence'
      Options and limitations of 'impact' evaluation

  26. R&D policy 'evaluation' may be defined as ...
  "... methodology-based analysis and assessment of the appropriateness of S/T policy assumptions and targets, of the related measures and their impacts, and of the goal attainment." (cf. Kuhlmann/Holland 1995a, 199; Kuhlmann/Meyer-Krahmer 1995, 3pp)

  27. Typical R&D evaluation questions (Arnold/Guy 1997, 72)
  • Appropriateness: Was it the right thing to do?
  • Economy: Has it worked out cheaper than we expected?
  • Effectiveness: Has it lived up to the expectations?
  • Efficiency: What's the return on investment (ROI)?
  • Efficacy: How does the ROI compare with expectations?
  • Process efficiency: Is it working well?
  • Quality: How good are the outputs?
  • Impact: What has happened as a result of it?
  • Additionality: What has happened over and above what would have happened anyway?
  • Displacement: What hasn't happened which would have happened in its absence?
  • Process improvement: How can we do it better?
  • Strategy: What should we do next?
  (A toy calculation of the efficiency and efficacy questions follows below.)
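  The efficiency and efficacy questions reduce to simple arithmetic once a benefit figure has been attributed to the programme. The following Python sketch is an editor's toy illustration, not part of the original deck; all figures (programme cost, attributed benefit, expected ROI) are invented.

```python
# Toy arithmetic for the "efficiency" and "efficacy" questions above.
# All figures are invented for illustration.

programme_cost = 10.0      # EUR m of public funding
attributed_benefit = 14.5  # EUR m of benefits credibly attributed to it
expected_roi = 0.30        # the return the programme plan promised

roi = (attributed_benefit - programme_cost) / programme_cost
print(f"efficiency (ROI): {roi:.0%}")                   # -> 45%
print(f"efficacy vs. plan: {roi - expected_roi:+.0%}")  # -> +15%

# As the rest of the deck argues, the hard part is not this arithmetic but
# attributing the 'benefit' figure to the programme in the first place.
```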

  28. Impact dimensions of public R&D spending

  29. Scope and limitations of impact measurement of public R&D
  [Figure: effects radiating out from a funded project within a programme or institution – new product or procedure, change of behaviour, improved know-how base – through the company or institute, to the sector or region and the national economy, interacting with market dynamics and other policies]

  30. Searching for impacts of policy ...
  Evaluation → identification of the impact of (public) action → scientific, technological, economic, societal, political, ... → past/future, direct/indirect, intended/non-intended, ...
  Condition: a model of input/output relations, of cause and effect, of actors and beneficiaries ...
  "Impact" → a rational construction of more or less complexity
  (A minimal sketch of such a model follows below.)
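  The "condition" named here (an explicit model of input/output and cause/effect relations) is what evaluation practice often writes down as a programme logic model. The Python sketch below is an editor's hypothetical illustration; the chain and every entry in it are invented.

```python
# A toy representation of the evaluation "condition" above: an explicit
# model of how inputs are assumed to lead to effects. Every entry here is
# hypothetical; a real model is negotiated with the programme's actors.
logic_model = {
    "inputs":   ["programme funding", "firms' own R&D effort"],
    "outputs":  ["prototypes", "publications", "patents"],
    "outcomes": ["new products", "changed R&D behaviour"],
    "impacts":  ["sector productivity", "employment"],
}

# Each adjacent pair in the chain is a cause/effect assumption that the
# evaluation has to test rather than take for granted.
stages = list(logic_model)
for upstream, downstream in zip(stages, stages[1:]):
    print(f"assumed causal link: {upstream} -> {downstream}")
```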

  31. Summative and formative evaluation
  • Summative evaluation: systematic, indicator-based, mainly ex post (or interim) measurement and assessment of the performance of programmes (including projects), to assess the programme design, implementation management and the leverage of funding, and to learn for future approaches
  • Formative evaluation: systematic consulting, moderating and assessing activities seeking to assist policymakers, programme managers and programme participants throughout the whole life cycle of funding programmes, to make all actors involved learn and (re-)adjust and thus contribute to the overall success (and/or improvement, and/or termination) of programmes and funded structures, and to learn for future approaches

  32. 'Innovation system' concept; evolution of policy instruments and rationales
      Detour: 'One size does not fit all'
      Weaknesses of present research and innovation policy
      Stakeholders' diverging interests, policy arena, governance
      Role of 'Strategic Intelligence'
      Options and limitations of 'impact' evaluation
      Evaluation methodologies

  33. Evolution of research systems and evaluation practices
  Example: the German research system

  34. R&D world until the 1970s

  35. R&D world from the 1980s onward

  36. Variety of R&D evaluation methods: a "metrix"?
  • Evaluation data: official statistics (R&D, patents); bibliometrics; questionnaire-based surveys; interviews; case studies
  • Evaluation mission: ex ante (based on foresight?) – strategic options; monitoring, real time – management, fine-tuning; ex post – learning, legitimisation; summative; formative
  • Evaluative methodology: peer review, peer panels; input/output, cost/benefit, before/after (descriptive statistics, econometrics); comparison groups; benchmarking; network analysis; foresight; technology assessment
  Specify the mix depending on the policy issue – there is no general "metrix"!
  See also the "Evaluation Toolbox": http://epub.jrc.es/docs/EUR-20382-EN.pdf

  37. Evaluation methods, quantitative and qualitative
  Quantitative: statistical data analysis
  • Innovation surveys: basic data describing the innovation process, using descriptive statistics
  • Benchmarking: comparisons based on a relevant set of indicators across entities
  Quantitative: modelling methodologies
  • Macroeconomic modelling and simulation: broader socio-economic impact of policy interventions
  • Microeconometric modelling: effects of policy intervention at the level of individuals or firms
  • Productivity analysis: impact of R&D on productivity growth at different levels of data aggregation
  • Comparison group approach: effect on participants, using sophisticated statistical techniques (a minimal sketch follows below)
  Qualitative and semi-quantitative methodologies
  • Interviews and case studies: direct observation of naturally occurring events to investigate behaviours in their indigenous social setting
  • Cost-benefit analysis: economic efficiency, appraising economic and social effects
  • Expert panels/peer review: scientific output, relying on the perception of peer scientists
  • Network analysis: structure of cooperation relationships and their consequences for individuals and their social connections into networks
  • Foresight/technology assessment: identification of potential mismatches in the strategic efficiency of projects and programmes
  Source: Fahrenkrog et al., RTD Evaluation Toolbox, http://epub.jrc.es/evaluationtoolbox/start.swf
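  To make the comparison group approach concrete: the sketch below computes a difference-in-differences estimate of input additionality from invented R&D spending figures for funded and non-funded firms. It is an editor's illustration under strong simplifying assumptions (tiny N, no selection correction), not a method prescribed by the Toolbox.

```python
# A minimal sketch of the comparison-group approach: compare the change in
# R&D spending of programme participants with the change in a group of
# similar non-participants. All figures are invented for illustration.

funded_before  = [2.0, 3.1, 2.5]   # EUR m R&D per firm, year before funding
funded_after   = [3.4, 4.2, 3.9]   # ... year after funding
control_before = [2.1, 2.9, 2.6]   # comparable non-funded firms
control_after  = [2.4, 3.2, 2.8]

def mean(xs):
    return sum(xs) / len(xs)

# Difference-in-differences: the change among the funded minus the change
# among the controls approximates what happened "over and above" what
# would have happened anyway (additionality).
did = ((mean(funded_after) - mean(funded_before))
       - (mean(control_after) - mean(control_before)))

print(f"estimated input additionality: {did:+.2f} EUR m per firm")
# Real studies must also correct for selection into the programme
# (matching, regression); this toy example deliberately ignores that.
```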

  38. A professional evaluator should:
  • focus on the decision context
  • understand the stakes
  • use relevant methods (and use them well)
  • understand the nature of the object (here: R&D and its institutions)
  • understand the evolving context (here: the knowledge, research & innovation system)

  39. Specifics of R&D evaluation (1)
  • R&D is open-ended (success can be re-defined after the fact)
  • The connection between R&D and its effects is non-linear and indirect
  • Even in application-oriented research it may take 10 years or more for impact to be realized (and attribution then becomes tenuous) – but evaluation is required earlier ...

  40. Specifics of R&D evaluation (2)
  • R&D functions within a larger whole, thus its productivity and effects depend on what happens there → no simple RoI (return on investment) approaches
  • The Knowledge, Research and Innovation System (KRIS) links with internationally defined scientific fields and domains of application

  41. Specifics of R&D evaluation (3)
  • Content & domain are important
  • Distantiated measures (e.g. numbers of publications, patents) do not capture enough
  • Judgment of domain experts is necessary (but has limitations as well, cf. peer review)
  • Balancing the two is the responsibility of the evaluator!

  42. Specifics of R&D evaluation (4)
  There is not one evaluative 'truth':
  • Actual R&D policies relate to a variety (and competition) of targets, envisaged effects and underlying rationales and assumptions → a variety of promotion instruments, overlapping and competing ...
  • Involved actors (policy, industry, science) pursue heterogeneous, partly conflicting interests, assumptions and expectations → success criteria differ
  • Ever more R&D policy interventions aim at multiple purposes and heterogeneous actors (e.g. the set of "socio-economic" targets and clients) → increased complexity and intertwining of input-output-outcome relationships

  43. Specifics of R&D evaluation (5)
  Standard (quantitative) analyses are often not applicable:
  • limited number of cases (sometimes N=1, and the evaluator has to mobilize experience of, and insight into, similar cases)
  • skewed distributions (of productivity, of impact, of uptake in innovation) – so one cannot rely on sampling (a small simulation below illustrates why)
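  The point about skewed distributions can be shown in a few lines of simulation. This is an editor's sketch with an invented lognormal impact distribution; the parameters are arbitrary, chosen only to show how a small random sample misrepresents a heavy-tailed population.

```python
# Why sampling fails under skew: simulate 1,000 projects whose "impact"
# follows a heavy-tailed (lognormal) distribution, then evaluate only a
# random sample of 20. Distribution and parameters are invented.
import random

random.seed(42)
population = [random.lognormvariate(0.0, 2.0) for _ in range(1000)]

true_mean   = sum(population) / len(population)
sample      = random.sample(population, 20)
sample_mean = sum(sample) / len(sample)

top10_share = sum(sorted(population)[-10:]) / sum(population)
print(f"population mean impact: {true_mean:.2f}")
print(f"mean in a 20-project sample: {sample_mean:.2f}")
print(f"share of all impact held by the top 10 projects: {top10_share:.0%}")
# With a heavy tail, a small sample usually misses the few projects that
# carry most of the impact, so its mean badly misestimates the population.
```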

  44. Ursache und Wirkung. – Vor der Wirkung glaubt man an andere Ursachen als nach der Wirkung.
  Friedrich Nietzsche, Die fröhliche Wissenschaft, aphorism 217, 1882
  Cause and impact. – Before the impact one believes in different causes than after the impact.

  45. Further info and contact
  Prof. Dr. Stefan Kuhlmann, University of Twente
  Chair Foundations of Science, Technology and Society
  School of Management and Governance, Institute of Governance Studies (IGS)
  Enschede, The Netherlands
  e-mail: s.kuhlmann@utwente.nl
  Thanks for your attention!
  Smits, R. / Kuhlmann, S. (2004): The rise of systemic instruments in innovation policy. In: Int. J. Foresight and Innovation Policy (IJFIP), Vol. 1, Nos. 1/2, 2004, 4-32
  Shapira, Ph. / Kuhlmann, S. (eds.) (2003): Learning from Science and Technology Policy Evaluation: Experiences from the United States and Europe. Cheltenham (E. Elgar)
  IPTS (ed.) (2002): RTD Evaluation Tool Box – Assessing the Socio-Economic Impact of RTD Policies. Brussels/Luxembourg (European Commission, IPTS Technical Report Series, EUR 20382 EN)
  International R&D Evaluation Course (4 days) at the University of Twente, see: http://www.mb.utwente.nl/stehps/news/rd_2007.doc/
