CI Release 3 Life Cycle Objectives Review – Preliminary Review Board Report Out
Regarding the R3 LCO Board Report • The Report Out represents initial impressions. • The Board will add, edit, or delete comments in the final report.
Review Board Charge • Assess the progress of development efforts and plans during the R3 Inception Phase. • Is R3 Inception progress sufficient to proceed to the Elaboration Phase? • Develop findings, recommendations, and suggestions for CI and the OOI PMO.
Summary • Thanks to the CI Team for the opportunity to participate in the development. • Impressed with and appreciate the: • Well-prepared and coordinated presentations • Impressive amount of work; dedicated team • Well-planned agenda • Excellent logistics – thank you
Evaluation Criteria – Entry • Was the R3 LCO documentation presented complete and sufficient to evaluate the progress of the R3 Inception phase? Findings: • Plenty of good information; a challenge to navigate it. • Appears sufficient (could have used more time to review). Recommendations: • Request a list of requirements (especially the critical ones) not addressed in current use cases. • Important to map the critical requirements to use cases by LCO.
LCO Artifact Set • Management Set: System Life Cycle Plan; Risk Register; Elaboration Execution Plan • Requirements Set: Use cases (mature) – not mature at this point; Candidate system and subsystem requirements
LCO Artifact Set • Design Set: Architecture specification (candidate) • Technology list (baselined) – What requirements drive the technology choices? There is concern with resource investments in technologies not linked to a critical requirement. • Prototype reports – documentation was minimal, and it was not clear what was learned from the exercise. Was a risk mitigated?
Evaluation Criteria – Exit #1 • Are the use cases understood and do they provide a complete description of the release scope? Does the prioritization of use cases look appropriate? Findings: • It was a challenge to link the presentation material. • Little/no input from MIOs into UC formulation or ranking (an action item). Any requirement not met/dropped could impact MIOs. • Action – PMO to facilitate MIO involvement in UC development/refinement/prioritization. Recommendation: • List critical requirements not addressed by existing UC.
Evaluation Criteria – Exit #2 • Are the candidate requirements understood and do they cover the critical use cases? Findings: • The Review Board has not gone through all the requirements in detail to be able to answer this. We are confident the CI team understands the requirements in DOORS. Recommendations: • CI should move forward with Elaboration. • The Marine IOs agree to look at the requirements and use cases to support prioritization/ranking of the use cases.
General Recommendation • Preface each presentation with the evaluation criteria it addresses.
Evaluation Criteria – Exit #3 • Are the critical risks identified and have they been mitigated through exploratory prototypes? Findings: • It was difficult to see the link between the prototypes and the risks. Recommendations: • The Board would like to see a clearer representation of the links between specific risk(s) and the choices of prototype efforts (e.g., a brief description of risk X, a description of the mitigations/prototype, and the resulting impact to risk X and R3).
Evaluation Criteria – Exit #4 • Are the candidate architectures viable as demonstrated through analysis and prototyping? Findings: • The Board feels the candidate architectures are viable. There remains concern that not all critical prototype efforts are complete. Recommendation: • Focus effort on critical prototypes with an eye to cost-benefit analysis.
Evaluation Criteria – Exit #5 • Is the Elaboration Execution Plan credible? Based on artifacts/presentations/discussions, are board members confident that the elaboration plan will be successful? Finding: • The plan for completing R2-related tasks while entering R3 Elaboration is not clear. Will there be an impact on the progress of R3? Recommendations: • Develop a plan and schedule for getting Marine IO involvement in use case maturation. • Use cases need to be stabilized in the near term.
Observations about this review • Recommend better representation from IOs with expertise in software architecture. • All artifacts must be posted 1 full week in advance; background material and available artifacts (if mostly done) could be posted 2 weeks in advance to allow reviewers to go through all artifacts. • Update the document map to reflect the artifacts actually posted (i.e., a clearer mapping of document names). • The broader OOI team would benefit from attending these reviews (in person or via WebEx). • A clarification.
Board Questions at end of Day 1 • In general, what level of maturity and review do you expect the use cases to have achieved by LCO? Based on R1 and R2 experiences, is this sufficient? Have we achieved that level of maturity? (for John G.) • What level of maturity and review do you need the use cases to have achieved to support architecture development in Elaboration? Based on R1 and R2 experiences, is this sufficient? Have we achieved that level of maturity? (for Michael M.) • Who are the groups of users canvassed by CIUX (just ballpark numbers and types [scientists, operators, data managers, etc.]) for input on UI screens/functions to see if the design reflects their input? Have you done beta testing with those people? • Are the existing use cases sufficient to address the requirements, and how was this determined? How were the use cases ranked (i.e., what criteria were used)?
Breakout Session Assignments • #1 Tuesday, 14:45-16:00 – Group 1: Data Management (Andrea, Steve, Sue); Group 2: Analysis and Synthesis (Art, Doug, Ed, Jon) • #2 Wednesday, 08:30-10:00 – Group 1: Common Operating Infrastructure (), Common Execution Infrastructure; Group 2: Sensing and Acquisition () • #3 Wednesday, 10:15-11:45 – Group 1: Marine Integration, Sensor Sets, Dataset Agents (Sue); Group 2: Planning and Prosecution ()
R3 Review Board Members • Sue Banahan* – OOI Program Office (PMO) • Ed Chapman – PMO, OOI Chief Systems Engineer • Mark Fornwall – USGS/Ocean Biogeogr. Info. Syst. • Jonathan Fram – OSU, Endurance Array • Steve Gaul – CG, Systems Architect & CI Interface • Doug Luther – U Hawaii, RSN Project Scientist • Andrea McCurdy – PMO, Assoc. Project Manager EPE • Art Miller – SIO Climate Sciences, Oceanogr./Sr. Lecturer (*chair)
Release 3 • R3 will deliver On-Demand Measurement Processing • Incorporates: • R1 Data Distribution Network • R2 Managed Instrument Network • Adds end-to-end control of how data are processed, supports more advanced workflows for instrument providers and data product consumers, and provides on-demand measurements supporting event-driven, opportunistic observations. • First release of the Integrated Observatory Network (ION) intended for exposure to the general public.
Level 1 Science Themes • Ocean-Atmosphere Exchange • Climate Variability, Ocean Circulation, and Ecosystems • Turbulent Mixing and Biophysical Interactions • Coastal Ocean Dynamics and Ecosystems • Fluid-Rock Interactions and the Subseafloor Biosphere • Plate-scale, Ocean Geodynamics