Evaluating Government R&D Programs with Quantitative Measures: What Works Best
AEA 2006 Annual Conference, November 2, 2006

Session Chair: Bruce McWilliams
First Presenter: Mary Beth Zimmerman
Second Presenter: Brian Zuckerman
Third Presenter: Bruce McWilliams
Discussant: George Teather

Cooperative State Research, Education, and Extension Service
http://www.csrees.usda.gov
Why Quantitative Measures?
• The "holy grail" of evaluation metrics
• When metrics have cardinal differences:
  - "Apples to apples" comparisons can be made
  - Statistics can be generated
  - Correlation or causal chains can be supported with statistical evidence (see the sketch below)
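To make the point concrete, here is a minimal sketch, not drawn from the presentation, of what cardinal metrics buy you. All figures are invented: hypothetical funding levels and publication counts for a fictitious eight-program portfolio, used to form a meaningful per-dollar ratio and a correlation coefficient.

```python
# Hypothetical illustration: cardinal metrics permit apples-to-apples
# comparison and statistical evidence. All figures below are invented.
from statistics import correlation  # Python 3.10+

# Invented annual funding ($M) and publication counts for a
# fictitious portfolio of eight R&D programs.
funding = [2.1, 4.8, 3.5, 7.2, 1.0, 5.9, 6.4, 2.7]
publications = [14, 33, 22, 51, 6, 40, 44, 18]

# Because both measures are cardinal, the ratio "publications per
# million dollars" is meaningful, and programs can be ranked on it.
efficiency = [p / f for p, f in zip(publications, funding)]

# A correlation coefficient supplies statistical evidence for a
# funding-output relationship (it does not by itself show causation).
r = correlation(funding, publications)
print(f"Pearson r = {r:.3f}")
print("Publications per $M:", [round(e, 1) for e in efficiency])
```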
Government R&D: A Unique Evaluation Context
• Government R&D characteristics
• Federal management initiatives
• Institutional characteristics & incentives
Characteristics of Government R&D
• Outcomes are typically "public goods"
  - Disciplinary or fundamental knowledge
  - Non-rivalrous in use and/or non-excludable
  - Insufficient appropriability & market failure
• Serendipitous discovery can occur
• Significant uncertainties in the flows of expected costs & benefits in an R&D pipeline (see the sketch below)
• Wide range of motives and incentives
  - Quest for understanding vs. quest for usefulness
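One common way to quantify the cost-and-benefit uncertainty noted above is Monte Carlo simulation of a staged pipeline. The sketch below is illustrative only and is not from the presentation; the stage success probabilities, costs, durations, discount rate, and payoff distribution are all assumed values.

```python
# Illustrative Monte Carlo sketch of an uncertain R&D pipeline.
# Every parameter here (probabilities, costs, durations, discount
# rate, payoff distribution) is an invented assumption.
import random

random.seed(0)
DISCOUNT = 0.05                # assumed annual discount rate
STAGE_P = [0.6, 0.5, 0.4]      # assumed success probability per stage
STAGE_COST = [1.0, 3.0, 8.0]   # assumed cost per stage, $M
STAGE_YEARS = 2                # assumed duration of each stage

def one_trial() -> float:
    """Net present value ($M) of one simulated project."""
    npv, year = 0.0, 0
    for p, cost in zip(STAGE_P, STAGE_COST):
        npv -= cost / (1 + DISCOUNT) ** year   # stage cost sunk up front
        year += STAGE_YEARS
        if random.random() > p:                # stage fails: no benefits
            return npv
    payoff = random.lognormvariate(3.0, 1.0)   # uncertain benefit, $M
    return npv + payoff / (1 + DISCOUNT) ** year

trials = [one_trial() for _ in range(10_000)]
mean_npv = sum(trials) / len(trials)
p_loss = sum(t < 0 for t in trials) / len(trials)
print(f"Expected NPV: ${mean_npv:.1f}M; probability of net loss: {p_loss:.0%}")
```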
Recent Federal Management Initiatives & Legislation
• GPRA (1993): Government Performance and Results Act
  - Agencies must submit strategic plans to OMB and Congress every three years
  - Agencies must submit an annual performance plan to Congress along with the budget request
  - The performance (& accountability) plan must be based on performance elements established in the previous year's plan
Recent Federal Management Initiatives & Legislation
• President's Management Agenda (2001)
  - Budget and Performance Integration (BPI)
  - OMB's Program Assessment Rating Tool (PART)
Institutional Characteristics
• Very different!
• Scale & budget (capabilities differ)
• Intramural vs. extramural (monitoring differs)
• Focus: exogenously vs. endogenously defined (planning horizons differ; measures differ)
• Other differences
Institutional Incentives
• Competition over budgets
• Potentially overlapping objectives
• Differing incentives to:
  - drop underperforming research activities
  - transition technologies out of the public sector
• Others
Implications
So what impact do these characteristics & incentives have on:
• Achievable top-level aggregate comparability?
• Managerial decision rules?
• Performance improvements?
• Budget and Performance Integration?
Questions
What is the best way to quantitatively evaluate R&D programs?
• Should anything be done differently to evaluate:
  - basic knowledge generation;
  - the development of applied knowledge; or
  - the implementation of research knowledge?
• Does the customer of R&D outcomes make a difference?
• Do an organization's characteristics (size, structure, ...) affect a quantitative evaluation?
• Is there, or should there be, a core evaluation policy that spans all federal agencies?