This presentation discusses the findings and lessons learned from a seven-year study on the research environment, presented at the American Evaluation Association/Canadian Evaluation Society Joint Conference in Toronto in 2005. The study aimed to assess organizational effectiveness and improve research productivity.
Lessons From Seven Years of Study of the Research Environment

Presented at the American Evaluation Association/Canadian Evaluation Society Joint Conference, Toronto, Canada, October 28, 2005

Gretchen B. Jordan
Sandia National Laboratories
gbjorda@sandia.gov

Work presented here was completed for the U.S. DOE Office of Science by Sandia National Laboratories, Albuquerque, New Mexico, USA under Contract DE-AC04-94AL8500. Sandia is operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation. Opinions expressed are solely those of the author.
Outline
• Motivation and overview of the DOE project
• Conceptual framework
• Overview of the research environment survey
• Lessons learned from the survey
• Future research
G. Jordan 10/17/2005
Motivation for assessing organizational effectiveness
• Desire to define strategies to improve research effectiveness
• Concerns that the research environment is deteriorating
• Add to the very slim body of study to date on the management of science
• Organize thinking about differences in R&D, organizations, and circumstances
• Examine multiple levels and the linkages among levels, such as differences in R&D within a portfolio of projects
• Have a reasonable response to public demand for demonstrating accomplishments
• Legislative and administrative requirements (GPRA, PART)
• Need for a leading indicator, and one that balances the emphasis on outcome assessment
DOE project overview
• Research by Sandia National Laboratories in collaboration with Dr. Jerald Hage and the Center for Innovation, University of Maryland
• Sponsored by the U.S. Department of Energy (DOE) Office of Basic Energy Sciences to define innovative measures of operational and scientific performance
• Concentration has been on:
• Understanding and developing theory relating the research environment to the broader management of R&D
• Tools to assess, and thereby improve, the key factors in the research environment that foster excellence and impact
Evolution of the project
• 19 focus groups (DOE, industrial, university) and an extensive literature review
• Defined attributes and organized them within the Competing Values Framework (Cameron, Quinn, et al.), then extended and validated that framework for R&D
• Developed, tested, and refined a survey to capture employee perceptions of their research environment
• Linked the survey to the nature of the work
• Analyzed and presented data in ways that encourage action plans
• Have used the survey in conjunction with case studies to determine the impact of specific management interventions
• Beginning to link survey findings with data on performance, solving the challenges associated with this
• Beginning to develop management and measurement models for multiple levels (project, R&D organization, sector)
Assessing the research/work environment focuses on an important part of S&T evaluation

[Figure: Logic model of S&T organizational effectiveness. Inputs (funds, people and their characteristics, knowledge base/competencies) support the purpose/mission (basic or applied, research or development), which yields products and services (new knowledge, trained students) that meet customer needs, producing outcomes (public goods, economic advantage, an informed public, an informed phase of debate, innovation), all within an external environment (economic, technical, political/legal, social/demographic).]
Our framework describes tensions inherent in management interventions to improve innovativeness, and informs management

[Figure: Strategy Map for Improving New Product Innovation (DRAFT 04/20/05).
• Vision: Effectively innovate and better harness innovation for sustainable service to the mission
• Customer Perspective: Serve needs and solve problems for existing and new customers
• Organizational Mission & Financial Perspective: Existing and new customers value our innovativeness enough to invest in it
• Internal Processes Perspective: Flexible, minimum possible bureaucracy that maintains required controls but does not impede innovation; a culture that appropriately encourages risk taking, exploration, internal and external scanning, cooperation, and collaboration; strategic planning decisions, with progress measurement, that balance continuity with disinvestment
• Innovation Cycle (readiness levels): Concepts proved and progressing to development for key technologies; technologies maturing toward customer needs; new and improved products and product platforms developed with the required functionality
• Organizational Learning & Growth: Barriers and bottlenecks to innovation are identified and coordinated initiatives are in place to remove them
• S&T Learning & Growth: A stable yet dynamic S&T base to support the mission (ideas, people, facilities, knowledge/skills)]
Conceptual framework behind the survey
• Contingency theory of organizational effectiveness says performance is highest when an organization's structure matches its strategy (for a given set of external circumstances)
• Not all R&D has the same strategy (e.g., amount of risk or scope, reasons for taking on larger amounts), so there are tensions inherent in management
• These can be captured with two dimensions of strategy and two related dimensions of structure
• Dimensions of strategic choices:
• Where to be on the continuum from incremental to radical advance
• Where to be on the continuum from narrow to broad/systemic scope
• Related structural choices and tensions:
• Research autonomy vs. coordination
• Specialized vs. complex teams
• Organizational autonomy vs. inter-organizational ties
Two dimensions result in four Research Profiles

How you manage depends on your profile
• Organic or hierarchical structure
• Inter-organizational ties or not

[Figure: Quadrant diagram. One axis runs from incremental advance (specialized task, intra-organizational) to radical advance (complex task, inter-organizational); the other from narrow scope (small, autonomous projects) to broad scope of focus (large, coordinated programs). The four quadrants are labeled Be Sustainable (quality outcome), Be New, Be Better, and Be First.]

Profiles build on, and names are borrowed from, the Competing Values Framework (Cameron and Quinn, 1999, and DeGraff, 2002).
Related to the research strategy and structure profiles are management practices in four areas

[Figure: Quadrant diagram aligning RTD strategy profiles (desired strategy and outcomes: Be New, Be Sustainable, Be Better, Be First; incremental vs. radical advance; narrow vs. broad scope) with RTD structure/management profiles (structure and work environment). Management practices range from small/research autonomy (values individuals, teamwork, HR development, time to explore, intellectual integration, encourage change) through complex/external and specialized/internal (rich in resources, technical management, low-burden systems, vision and strategies, plan and execute, strategic relationships) to large/research coordination.]
Key attributes of the research environment were determined through…
• Information from 19 focus groups of scientists and managers at three DOE laboratories, one industry lab, and one university
• "What do you need in your research environment to do excellent work?"
• "What attracted you to the lab and what keeps you here?"
• Study of the current literature for links between excellence, quality outputs, and the work environment
• Included the R&D management, innovation, organizational development/innovation, and evaluation literatures
• Developed and tested survey questions
• PNNL EHSD Division in 1999, Ford Research Lab in 2000
• SNL: 3 Centers in 1998, 17 Centers in 2001 and 2003
• SNL and NOAA case studies in 2003-2004
• NMSU in 2005
DOE Research Environment Survey is a diagnostic tool looking at 36 attributes of management practices

Human Resource Development
• Value the Individual: Respect for People; Optimal Use of Skills; Management Integrity
• Build Teams and Teamwork: Teamwork & Collaboration; Internal Communication; Value-Added Management
• Commit to Employee Growth: Technical Career Advancement; Educational & Professional Development; Quality of Staff

Innovativeness
• Encourage Exploration, Risk Taking: Time to Think & Explore; Pursuit of New Ideas; Autonomy in Decision-Making
• Integrate Ideas, Internally & Externally: Internal Cross-Fertilization of Technical Ideas; External Collaborations & Interactions; Integrate Ideas & R&D Portfolio
• Encourage Change & Critical Thinking: Sense of Challenge & Enthusiasm; Commitment to Critical Thinking; ID New Projects and Opportunities

Support Systems
• Provide Capital, Knowledge Resources: Equipment & Physical Work Environment; Research Competencies/Knowledge Base; Salaries & Benefits
• Ensure Good Technical Management: Informed, Decisive Management; Rewards & Recognition; Internal Resource Allocation
• Insist on Efficient, Low Burden Systems: Laboratory Services; Laboratory Systems & Processes; Competitiveness/Overhead Rates

Setting Goals
• Clearly Define Goals & Strategies: Research Vision & Strategies; Sufficient, Stable Funding; Investing in Future Capabilities
• Plan and Execute Well: Project Planning & Execution; Project-Level Measures of Success; Laboratory-Wide Measures of Success
• Build Strategic Relationships: Relationship with Sponsors; Champion Foundational Research; Reputation for Excellence

Together these four areas make up Organizational Effectiveness.
Survey components

The 20-25 minute (often web-based) survey includes:
• Ratings of the status of the 36 attributes (and the preferred level for some of these)
• An overall rating of the environment and its trend
• Questions of specific interest to that organization
• Questions on the characteristics of respondents' projects
• Limited demographics (organizational unit, source of funds, job classification, career stage, years at the lab)
• Two open-ended questions: major barriers to progress and who can do something about them; other comments

The longer survey includes sub-parts on some attributes; the shorter survey asks only 18 of the 36 attributes.
Question format
• Each overall question is itself complex and is defined in a statement that combines notions from sub-parts of previous long surveys
• The response is "what percent of the time was this true during the past year" (a ratio scale), which permits benchmarks
• Not all attributes should or could be true 100 percent of the time
• There are trade-offs
• It will differ depending on the nature of the work and the circumstances

Example item:

2. People have time to think creatively and explore. (I have time to do my research, think creatively, and explore new approaches, all in a normal work week. I don't work on too many projects at once and am free from excessive organizational obligations.)
O 0 to 20%   O 21 to 40%   O 41 to 60%   O 61 to 80%   O 81 to 100%   O NA
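Because responses land in percent-of-time bins on a ratio scale, they can be converted to comparable numeric scores for benchmarking, for example by mapping each bin to its midpoint. The sketch below is an illustrative assumption about how such scoring might be done, not part of the survey instrument itself:

```python
# Map each response bin to a midpoint value so binned "percent of time
# this was true" answers can be averaged and benchmarked across groups.
# The bin-to-midpoint mapping is an illustrative assumption.
BIN_MIDPOINTS = {
    "0 to 20%": 10.0,
    "21 to 40%": 30.5,
    "41 to 60%": 50.5,
    "61 to 80%": 70.5,
    "81 to 100%": 90.5,
}

def mean_score(responses):
    """Average midpoint score for a list of bin labels, ignoring 'NA'."""
    scores = [BIN_MIDPOINTS[r] for r in responses if r != "NA"]
    return sum(scores) / len(scores) if scores else None

# Notional responses to Q2 ("time to think creatively")
q2_responses = ["21 to 40%", "41 to 60%", "21 to 40%", "NA", "61 to 80%"]
print(mean_score(q2_responses))  # average percent-of-time score for the group
```

The same scoring can be applied to the "preferred" responses, so that the gap between current and preferred scores is itself a comparable number across attributes and groups.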
Q2. Have Time to Think Creatively

[Figure: Paired bar chart comparing the "Now" and "Preferred" response distributions for Q2 ("People have time to think creatively and explore") across the five percent-of-time bins. The follow-up item 2b asks: "For your profile, for what percent of the time should people have time to think creatively and explore?"]

Researchers can suggest areas for improvement by indicating the time preferred. Managers can then assess the appropriateness of requests for change, knowing the nature of the work and the current circumstances.
Analysis by demographic groups helps tailor management actions (e.g., by job classification)
• One job classification or level may have issues another does not (Level 3 in this case)
• The most unsatisfied do not reveal their job classification
• Managers (Level 1) often rate the lab higher than staff

[Figure: Attribute ratings by job classification, flagged "Significantly Better" or "Significantly Worse"; *significant at .05. Data shown here are notional.]
Looking at differences across organizations provides benchmarks and stimulates action

[Figure: For Departments 1-9, bars show the percentage of responses rating overall satisfaction from Very Poor to Fair vs. Good to Outstanding, and the trend as Getting Worse, Staying the Same, or Improving. Key questions: Overall satisfaction? (Good or better); Getting better, same, or worse? (Getting better). Data shown here are notional.]
Analyzing differences across time can be useful, especially when tied to management or external changes in that period

[Figure: ANOVA table comparing 2001 and 2003 survey results. Data shown here are notional.]
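A year-over-year comparison like the one in the ANOVA table boils down to a one-way ANOVA on an attribute's scores grouped by survey year. The sketch below hand-rolls the F statistic on notional data purely to show the computation; in practice a statistics package (e.g. SciPy's `f_oneway`) would be used:

```python
# One-way ANOVA F statistic for comparing an attribute's mean rating
# across survey years (e.g., 2001 vs. 2003). All data here are notional.
def one_way_anova_f(groups):
    """Return the F statistic for k groups of numeric observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (k - 1 degrees of freedom)
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    # Within-group sum of squares (n - k degrees of freedom)
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Notional percent-of-time scores for one attribute in two survey years
ratings_2001 = [55.0, 60.0, 50.0, 65.0]
ratings_2003 = [70.0, 75.0, 68.0, 72.0]
print(round(one_way_anova_f([ratings_2001, ratings_2003]), 2))
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that the between-year difference in means is unlikely to be sampling noise, which is what makes it worth tying back to management or external changes in that period.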
What is important to R&D workers?

Note: Does not include data from 2003 forward.
The survey combined with case study project data shows significant differences across project characteristics
• Nature of the work: basic science vs. technological tasks
• Complexity of labor: 6+ departments vs. fewer
• Size: small (< $1M) vs. large projects

For example, as expected, we observe:
• A steady decline in group means as we move from small/less complex to large/complex projects
• Small/complex projects rate external collaboration lower than the other groups
• Large/complex projects rate investment in future capabilities lower
Example Balanced Scorecard and Strategy Map (DRAFT 07/25/2004)

[Figure: Strategy map for good operating satellite service for the nation and global community, with indicators for different Research Profiles.
• Mission Perspective: Data coverage, reliability, and predictability for ecosystems, oceans, climate and weather, commerce, and infrastructure; accessible and cost-effective data products and services; able to assimilate large amounts of data from diverse sources and technologies; seen as a good use of taxpayer dollars; sound, state-of-the-art science; sustained data quality in changing conditions.
• Customer Perspective (R&D products and services provide value to users): Advance scientific understanding and capabilities; develop new R&D products and services; improve existing products and services; transfer, calibrate, and maintain products and services; responsive to user needs and dynamic conditions. Indicators: # publications and citations; peer review on amount of learning; # products by type and milestones; technical progress (sum of multiple attributes of performance); # products transferred; level of quality sustained; customer satisfaction; cost-avoidance indicator.
• Managerial/Operational Perspective (managerial excellence): Select a relevant set of R&D projects; execute R&D projects well; robust flow of knowledge among researchers and users; secure funds and handle them responsibly. Indicators: project mix in the portfolio; grants awarded; cycle time by type; quality and innovativeness (includes health of the research environment); funds in by type; $ cost share and leverage; leadership positions; workshops and web hits.
• Organizational Learning Perspective (a motivated, prepared, well-equipped, and networked workforce): Excellent climate for research (existing DOE survey); quality people, equipment, and computers (IT capabilities); good business practices (measures to be determined).]
Areas of current and future research
• Continue to define and validate theory for evaluating the research environment within various contexts, moving to the portfolio level
• Continue to refine the research environment survey
• Improve the nature-of-work questions
• Add data on external influences, both technical and non-technical
• Add comparisons across multiple organizations and fields
• Develop a menu of sub-part questions that drill down in some areas, such as mechanisms for cross-fertilization of ideas
• Analyze performance data to link specific performance with the presence of specific attributes
• Peer review findings
• Real-time progress measures linked to the use of that progress
• Continue to build a database of ratings of attributes by nature of work, level of performance of the group, and external circumstances
Selected References

Jordan, Gretchen, "What is Important to R&D Workers," Research Technology Management, Vol. 48, No. 3, May-June 2005.

Jordan, Gretchen, Jerry Hage, Jonathon Mote, and Bradford Hepler, "An Exploration of Differences Among R&D Projects and Implications for R&D Managers," R&D Management Journal, forthcoming November 2005.

Jordan, Gretchen, "Factors Influencing Advances in Science and Technology: Variation Due to Diversity in Research Profiles," chapter in Innovation, Science, and Macro Institutional Change: Knowledge Dynamics, Jerald Hage and Marius Meeus, editors, Oxford University Press, forthcoming in 2006.

Jordan, Gretchen, L. Devon Streit, and J. Stephen Binkley, "Assessing and Improving the Effectiveness of National Research Laboratories," IEEE Transactions on Engineering Management, 50, no. 2 (2003): 228-235.

Jordan, G. B., and L. D. Streit, "Recognizing the Competing Values in Science and Technology Organizations: Implications for Evaluation," in Learning From Science and Technology Policy Evaluation, Philip Shapira and Stefan Kuhlmann, editors, Edward Elgar, Cheltenham, UK and Northampton, Mass., 2003.

Please contact me if you have questions, suggestions, or opportunities to collaborate. Email gbjorda@sandia.gov, phone 202-314-3040.