Status: Governance Structure for SIS Producibility Initiative
Richard Turner
SIS Producibility Workshop, 6-7 September 2006
Outline
• Background
• Governance
• Examples
• Current thoughts
• Next steps
Background
• The SIS Producibility initiative needs an organizational structure to:
  • Increase stakeholder confidence
  • Maintain initiative continuity
  • Measure progress
  • Adapt to change
  • Assure transition
  • Manage overall initiative activities
Governance Roles/Responsibilities - 1
• Strategic management and vision
  • Guides the 7-year program and aligns work
  • Creates and maintains roadmap and goals
  • Maintains strategic coordination
  • Measures program progress
• Tactical program management
  • Allocates resources
  • Monitors, controls, and evaluates contracted work
  • Validates results
  • Drives transition
Governance Roles/Responsibilities - 2
• Tactical research management
  • Guides research
  • Allocates specific research resources
  • Designates specific research tasks
  • Manages IP rights
  • Coordinates publication and knowledge management
  • Prepares for transition
• Funding
  • Solicits and manages funding
  • Establishes contracting mechanisms
Summary of governance examples - 1
• Versatile Affordable Advanced Turbine Engines (VAATE)
  • National program (Services, DARPA, NASA, DOE)
  • Directed by AFRL, managed by industry members
  • Detailed, hierarchical technology roadmap
  • Government-contracted research
• Microelectronics Advanced Research Corp. (MARCO)
  • Executed through DARPA, directed by a consortium
  • Pools funding from industry and government sources
  • Consortium selects, contracts with, and monitors self-organizing, self-managing Focused Research Centers
Summary of governance examples - 2
• Advanced Research and Technology for Embedded Intelligence and Systems (ARTEMIS)
  • European Union Joint Technology Initiative
  • Strategic Research Agenda and steering group
  • Supports various networks and conferences
  • Funding approach still being researched
• IT for European Advancement 2 (ITEA-2)
  • Blue Book for strategic alignment
  • National funding and management
  • Projects nominated to receive ITEA-2 endorsement to help gain national funding
Summary of governance examples - 3
• High Dependability Computing Program (HDCP)
  • NASA initiative loosely led by CMU
  • Research into new approaches for developing mission-critical systems
  • Development of testbeds was a major part of the initiative
  • A large number of research universities and affiliates participated
  • Several testbeds were developed before funding was dropped in 2005
Observations from examples
• VAATE has good roadmap depth and evaluation methods but no specific academic involvement
• MARCO's independent consortium and research centers provide for mixed funding sources and research partnering
• The ARTEMIS roadmap is appropriate to much of our interest
• ITEA-2's funding strategy doesn't provide "government buy-in for alignment," but its project evaluation process is good
• HDCP's experience with testbeds was extremely valuable; lack of feedback from customer to researcher and lack of coordination among researchers were issues
Trade Spaces (options not exclusive)
• Funding/Resources: Pooled; In-kind/IRAD; Company-specific/designated; Existing/separately funded activities
• Resource allocation/selection: Collaborative decisions; Sponsor-directed decisions; "Seal of approval"
• Funding vehicles: Groups (FRCs); Individual companies; Grants; Contracts
• Intellectual Property: Open source; Shared by all participants; Managed/negotiated
• Strategy and vision: Formal research agenda; Notional, evolving roadmap; Detailed, milestone-based; Steering group of stakeholders
• Management: Single-office responsibility; Tiered structure; Multiple independent but coordinated offices
• Transition: Independent activity external to development management; Embedded in research and management; Hybrid
• Validation/Test Track coordination: Part of management; Independent assessments; Yearly report cards; Shared responsibility; Managed, experiment-driven testbeds
Criteria for organization evaluation
• Efficiency of operations (low overhead)
• Strong, continuing relationship among sponsors, researchers, and users (feedback/cooperation assured)
• Strong, supportive management structure (intentional leadership)
• Effectiveness for transition
• Stability and longevity (in the face of sponsor changes)
• Flexibility and adaptability (in the face of technology changes)
• Multiple funding/resource streams ($, in-kind, IRAD)
• Minimal IP problems
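The deck does not specify how these criteria would be applied; as one notional approach, the sketch below scores each candidate structure (the VAATE-like, MARCO-like, and ITEA-like arrangements shown on the next slide) against the criteria and combines the scores with weights. All weights and 1-5 scores are hypothetical placeholders for illustration, not values from the presentation.

```python
# Notional weighted-scoring matrix for comparing candidate governance
# structures against the evaluation criteria above. Weights and scores
# are hypothetical placeholders, not values from the presentation.

CRITERIA = [
    ("Efficiency of operations", 0.15),
    ("Sponsor/researcher/user relationship", 0.15),
    ("Supportive management structure", 0.10),
    ("Effectiveness for transition", 0.15),
    ("Stability and longevity", 0.15),
    ("Flexibility and adaptability", 0.10),
    ("Multiple funding/resource streams", 0.10),
    ("Minimal IP problems", 0.10),
]

# Hypothetical 1-5 scores, one per criterion, in the order listed above.
CANDIDATES = {
    "VAATE-like": [4, 3, 4, 3, 4, 2, 2, 3],
    "MARCO-like": [3, 4, 3, 4, 4, 4, 5, 3],
    "ITEA-like":  [3, 3, 2, 3, 3, 4, 3, 4],
}

def weighted_score(scores):
    """Weighted sum of per-criterion scores."""
    return sum(weight * s for (_, weight), s in zip(CRITERIA, scores))

# Rank the candidate structures by total weighted score.
for name in sorted(CANDIDATES, key=lambda n: weighted_score(CANDIDATES[n]),
                   reverse=True):
    print(f"{name}: {weighted_score(CANDIDATES[name]):.2f}")
```

Adjusting the weights is a cheap way for sponsors to test how sensitive the ranking is to their individual priorities before committing to a structure.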
Some notional governance structures
[Diagram: three notional arrangements - ITEA-like, MARCO-like, and VAATE-like - each relating researchers, sponsors, a central office, and guidance/management differently.]
Possible players
• Center for Empirically-Based System/Software Engineering (CeBASE: UMd/Fraunhofer, USC, Nebraska, Miss. St.)
• DARPA
• ESCHER
• NDIA
• Software Engineering Institute
• Service Labs/SW Centers
• Software Test Track
• Systems and Software Consortium
Conceptual MARCO-Based Approach - Overview
[Diagram: a Steering Group of government and industry stakeholders provides guidance, prioritization, and feedback to the SIS Producibility Consortium, which pools funding and in-kind resources. A Management Group contracts with FRCs and receives resource reporting and technical reports/interim results; the SEI provides technical feedback and broad coordination. FRC results flow to the Test Track for validation activities, and successful approaches are tried out in programs for productization, transition, deployment, adoption, and use, with feedback returning to the researchers.]
Conceptual MARCO-Based Approach - FRCs
[Same diagram as the Overview, highlighting the FRCs]
• Self-organizing, self-managing research groups; the lead organization may be a university, industry partner, or government lab
• Research activities and research management
• Resource allocation and reporting
• Cross-FRC peer evaluation
• Publication and communication support
Conceptual MARCO-Based Approach - Management Group
[Same diagram as the Overview, highlighting the Management Group]
• An independent group to oversee the following; possibly competed:
  • Program management
  • Secretariat/facilitation
  • Contracting
  • RFP generation
  • Proposal evaluation
  • Participation monitoring
  • Congressional interface
  • Strategic progress monitoring
Conceptual MARCO-Based Approach - SEI
[Same diagram as the Overview, highlighting the SEI]
• Technical monitoring
• Research coordination
• Communications
• Technology transfer
• Research repository
• Validation support
Conceptual MARCO-Based Approach - Test Track
[Same diagram as the Overview, highlighting the Test Track]
• Validation experiments
• Hands-on availability for acquisition programs
• Empirical support: an independent organization to design and conduct experiments; possibly competed
Conceptual MARCO-Based Approach - Transition
[Same diagram as the Overview, highlighting the transition path]
• Test Track (competed?)
• Productization, transition, deployment, adoption, and use (shared responsibility?)
• TBD
Next Steps
• Get feedback from you on the concept presented
  • Political issues
  • Corporate issues
  • Academic issues
  • Other ideas, comments, (constructive) criticisms
  • Can your sponsors buy into something like this?
• Identify and develop other possible options
• Establish costs, players, etc.
• Evaluate pros and cons