Nordic project on the development of Public Innovation Metrics
Carter Bloch
Brainstorming session on measuring innovation in Education, June 11 2009, Paris
Measuring Public Innovation: Toward a common statistical approach
København – Århus – www.damvad.dk
Background for project – funding support
• Project initiated by: Danish Ministry of Science, Technology and Innovation
• Other Nordic contributors:
  • Nordic Innovation Centre (NICe)
  • Research Council of Norway
  • Innovation Norway
  • VINNOVA
  • SALAR (Swedish Assoc. of municipalities and regions)
  • Finnish Ministry of Enterprise and Employment
Participants
• Denmark:
  • DAMVAD (Carter Bloch, Torben Vad, Mark Riis, Lydia Jørgensen)
  • CFA (Peter S. Mortensen, Ebbe Graversen)
  • Statistics Denmark (Jens Brodersen)
  • Danish Agency for Science, Technology and Innovation (Thomas Alslev Christensen, Jesper Rasch and Hanne Frosch)
• Norway:
  • NIFU-STEP (Johan Hauknes, Stig Slipersæter)
  • Statistics Norway (Frank Foyn, Lars Wilhelmsen)
• Finland: Statistics Finland (Mikael Åkerblom)
• Sweden: Statistics Sweden (Roger Björkbacka, Per Annerstedt)
• Iceland: RANNIS (Thorvald Finnbjørnsson)
• Collaboration with the UK (NESTA; DIUS)
• Hope to collaborate with other countries within the NESTI Task Force
• Eurostat pilot studies (Denmark and Finland have applied)
Background for project
• National interest in public sector innovation in the Nordic (and other) countries:
  • Demographic changes necessitate innovation
  • Competition with private service providers
  • Better quality services for citizens
• However, there is a lack of systematic data on public sector innovation
• This hinders efforts to better understand and to promote public sector innovation
Main objectives
• Develop a framework and questionnaire for collecting internationally comparable data on innovation in the public sector:
  • Conceptual framework
  • Survey methodology
  • Studies of user needs
  • Respondent interviews and testing
Main objectives (continued)
• Primary focus: a 'generic' survey instrument that can be applied across government levels and public sector activities
• Goal: include the main public sector activities, all three levels of government, and front-line service delivery institutions (hospital wings, schools, etc.)
• Examine the option of additional sector-specific modules
• Examine how innovation data can be used together with output data (often from other sources)
• Project website: www.mepin.eu
Timeline for project work
• Started in November 2008
• Nov-Feb 2009: Background work
• March-Aug 2009:
  • Meetings with user groups
  • Respondent interviews and cognitive testing
  • Conceptual framework/indicators and survey methodology
  • Pilot questionnaire
• Fall 2009:
  • Small-scale testing of the questionnaire
  • Deliverables on the first stage of work
  • Workshop for preliminary results (November)
• 2010:
  • Pilot test studies
Implications for Indicators – Overview
• Innovations – definition and types
  • Oslo Manual as starting point
  • Unsuccessful innovations
• Innovation outputs (qualitative only)
• Innovation inputs (Oslo Manual as starting point, though quantitative measures are likely more difficult for the public sector)
• The innovation process:
  • Innovation capability (what do organisations do to structure and promote their innovation activities, and how able and ready are they?)
  • Linkages (by type of partner; more than just cooperation)
  • Drivers and barriers to innovation (actors and factors)
• 'Cross-cutting themes' to be covered: ICT, HR, procurement
Implications for Indicators/pilot questionnaire
• Innovations – definition and types:
  • The Oslo Manual's product-process-organisational typology seems like a suitable starting point; however, a number of questions remain on whether these types can be distinguished
  • Marketing innovations?
  • Perception by many that changes mandated by policy directives, rules or cuts are not automatically innovations
  • Ask for examples – disseminate examples
• Unsuccessful innovations:
  • Innovation projects that have been abandoned; implemented innovations that have failed
  • Ask for examples
  • Give reasons why the innovation was not implemented or why it failed
  • Impacts of these unsuccessful innovation projects (learning effects vs. a no-more-experiments attitude)
Innovation inputs
• Oslo Manual as a starting point (activities yes/no; resources)
• Questioned: the validity/reliability of, and interest in, quantitative estimates
• How to collect? An Oslo Manual-style 'approach'? (Illustrative sketch below.)
  • Estimates based primarily on budgets/accounts of innovation projects?
  • Or only looser, rough approximations?
• Separation between R&D and non-R&D (in-house; extramural)? Respondents may be seeing the definition of R&D for the very first time…
• Additional questions related to procurement here?
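As a purely illustrative sketch (not part of the MEPIN questionnaire), the snippet below shows how budget-based estimates of innovation expenditure might be aggregated and split between in-house R&D, extramural R&D and non-R&D activities; the project names, category labels and amounts are invented for the example.

# Hypothetical budget lines for one institution's innovation projects.
# Categories and amounts are invented; they are not MEPIN definitions or data.
budget_lines = [
    {"project": "e-services portal", "category": "in-house R&D", "amount": 250_000},
    {"project": "e-services portal", "category": "non-R&D", "amount": 400_000},
    {"project": "care-pathway redesign", "category": "extramural R&D", "amount": 120_000},
    {"project": "care-pathway redesign", "category": "non-R&D", "amount": 80_000},
]

def innovation_expenditure(lines):
    """Sum estimated innovation expenditure by category and in total."""
    totals = {}
    for line in lines:
        totals[line["category"]] = totals.get(line["category"], 0) + line["amount"]
    totals["total"] = sum(line["amount"] for line in lines)
    return totals

print(innovation_expenditure(budget_lines))
# {'in-house R&D': 250000, 'non-R&D': 480000, 'extramural R&D': 120000, 'total': 850000}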
The Innovation Process
• We ask about innovation inputs and outputs; the other questions essentially collect data on the innovation process:
  • Innovation capability (what do organisations do to structure and promote their innovation activities, and how able and ready are they?)
  • Linkages
  • Drivers and barriers (actors and factors)
Innovation capability – some potential example 'questions'
• Innovation strategy
• Specific goals and targets for innovation activities
• Development department
• Activities organised in innovation projects
• Individuals charged with supporting the development and implementation of innovative ideas
• Procedures for reviewing/assessing innovative ideas for further development and implementation
• Regular evaluation of innovation strategy and innovation processes
• Systematic procedure for gathering external knowledge
• Part of staff work time explicitly devoted to innovation
• Innovation-related training/courses for management and staff
• Staff incentives for generating innovative ideas
(A simple way of summarising yes/no answers to items like these is sketched below.)
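Purely as an illustration of how yes/no answers to capability items like those above are sometimes condensed into a simple count-based indicator (the scoring rule and item labels here are assumptions for the example, not part of the project's design):

# Illustrative only: count how many capability items an organisation reports having.
capability_items = [
    "innovation strategy",
    "specific goals/targets for innovation",
    "development department",
    "staff time explicitly devoted to innovation",
    "staff incentives for innovative ideas",
]

def capability_score(answers):
    """Return the number of items answered 'yes'; answers maps item -> bool."""
    return sum(1 for item in capability_items if answers.get(item, False))

example = {
    "innovation strategy": True,
    "development department": False,
    "staff time explicitly devoted to innovation": True,
}
print(capability_score(example))  # 2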
Linkages
Interest in linkages appears to go beyond "Cooperation: yes/no". This suggests asking a small set of interaction-related questions by type of partner (potential examples):
• Businesses:
  • Innovation cooperation
  • Collaboration in the provision of services
  • Outsourcing
  • Use of external innovation specialists
• Users:
  • Innovation cooperation
  • Analysing user needs
  • Meetings/hearings with users
  • Gathering information on users through daily operations
• Other public institutions:
  • Innovation cooperation with public research institutions
  • Innovation cooperation with other public institutions
Drivers and Barriers
• Can this be formulated as one question, i.e. one where the respondent marks whether each impact is positive (a driver) or negative (a barrier)? (An illustrative coding sketch follows below.)
• Split the drivers/barriers into actors and factors?
• Long lists of potential drivers/barriers are available from earlier ad-hoc surveys.
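A minimal, hypothetical sketch of how answers to such a combined item could be coded (+1 for a driver, -1 for a barrier, 0 for no effect) and then tabulated separately for actors and factors; the item labels and the actor/factor split below are invented for illustration.

# One respondent's hypothetical answers to a combined driver/barrier question:
# +1 = positive impact (driver), -1 = negative impact (barrier), 0 = no effect.
responses = {
    "management support": +1,      # actor
    "staff engagement": +1,        # actor
    "political leadership": -1,    # actor
    "budget constraints": -1,      # factor
    "new technology": +1,          # factor
    "legal requirements": 0,       # factor
}
actors = {"management support", "staff engagement", "political leadership"}

def split_counts(resp, actor_items):
    """Count drivers and barriers separately for actors and factors."""
    out = {"actors": {"drivers": 0, "barriers": 0},
           "factors": {"drivers": 0, "barriers": 0}}
    for item, score in resp.items():
        group = "actors" if item in actor_items else "factors"
        if score > 0:
            out[group]["drivers"] += 1
        elif score < 0:
            out[group]["barriers"] += 1
    return out

print(split_counts(responses, actors))
# {'actors': {'drivers': 2, 'barriers': 1}, 'factors': {'drivers': 1, 'barriers': 1}}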
Other areas that can potentially be treated as cross-cutting themes
• ICT
• HR
• Procurement practices