Software Review Process and Level of FAA Involvement Software and Complex Electronic Hardware Standardization Conference Brenda S. Ocker, Software Technical Specialist July 27, 2005
Presentation Overview • Background • Overview of the Software Review Process • Four Types of Software Reviews • Preparing, Conducting, and Documenting the Software Review • Determining the Level of FAA Involvement • Summary
Background • RTCA/DO-178B, Section 9.2 • “Certification authority reviews may take place at the applicant’s facilities or the applicant’s suppliers’ facilities.” • RTCA/DO-178B, Section 10.3 • “The certification authority may review at its discretion the software life cycle processes and their outputs during the software life cycle as discussed in subsection 9.2.”
Background • Order 8110.49, Chapter 2 • Provides guidelines for performing software reviews • Documents the review approach, which is detailed in FAA Job Aid “Conducting Software Reviews Prior to Certification”
Background • Order 8110.49, Chapter 3 • Provides guidelines for determining the Level of FAA Involvement (LOFI) in a software project • When the FAA should be involved • To what extent the FAA should be involved • In which areas the FAA should focus its involvement
Software Review Process • Assess LOFI → Determine Number, Type, and Timing of Software Reviews → Conduct Software Reviews (Use Job Aid & DO-178B)
Objectives of Software Review • Address technical issues in a timely manner • Physically examine compliance data • Verify adherence to plans and procedures • Monitor designees
Review Process & Life Cycles • Begin early in the software life cycle • Integrated throughout the software life cycle • Regular communication between applicant and certification authority
Four Types of Software Reviews • Planning • Development • Verification • Final Certification
Software Planning Review • Stage of Involvement (SOI) # 1 • Determine if plans and standards provide an acceptable means for satisfying objectives of RTCA/DO-178B
Software Planning Review • Conducted when initial software planning process is complete • Plans and standards have been internally reviewed • Plans and standards have been reviewed by SQA • Plans and standards are approved and under configuration control
Software Planning Review SOI # 1 • Covers the Planning phase of the software life cycle (Planning, Requirements, Design, Code/Integration, Integration/Test) • Data reviewed: PSAC, SDP, SVP, SCMP, SQAP, Standards, Tool Qual. Plans, Verif. data, SQA data, SCM data • DO-178B Tables A-1, A-8, A-9, A-10
Software Development Review • Stage of Involvement (SOI) # 2 • Determine if software development is in accordance with approved plans and standards
Software Development Review • Conducted when at least 50% of development data is complete • High-level requirements are documented, reviewed, and traceable to system requirements • Software architecture is defined and reviews and analyses are complete • Low-level requirements are documented, reviewed, and traceable to high-level requirements • Source code implements low-level requirements, is traceable to low-level requirements, and has been reviewed
Software Development Review SOI # 2 • Covers the Requirements, Design, and Code/Integration phases • Requirements data: Software Req. Data (High-Level Reqs., Derived High-Level Reqs.), Req. Standards, Verif. data, SQA data, SCM data • Design data: Design Description (Architecture, Low-Level Reqs., Derived Low-Level Reqs.), Design Standards, Verif. data, SQA data, SCM data • Code/Integration data: Source Code, Executable Object Code, Code Standards, Verif. data, SQA data, SCM data • DO-178B Tables A-2, A-3, A-4, A-5, A-8, A-9, A-10
Software Verification Review • Stage of Involvement (SOI) # 3 • Determine if software verification is in accordance with approved plans • Ensure requirements, design, code, and integration are appropriately verified • Ensure verification process will achieve: • Requirements-based test coverage • Appropriate level of structural coverage
Software Verification Review • Conducted when at least 50% of verification and testing data is complete • Development data is complete, reviewed, and under configuration control • Test cases and procedures are documented, reviewed, and under configuration control • Test cases and procedures have been executed (formally or informally) • Test results are documented • Testing environment is documented and controlled
Software Verification Review SOI # 3 • Covers the Integration/Test phase • Data reviewed: Test cases and procedures, Test results, Verif. data, SQA data, SCM data • DO-178B Tables A-2, A-6, A-7, A-8, A-9, A-10
Final Certification Review • Stage of Involvement (SOI) # 4 • Determine compliance of final product with objectives of RTCA/DO-178B • Verify that all software related problem reports, action items, and certification issues have been addressed
Final Certification Review • Conducted when final software build is complete and ready for formal system certification approval • Software Conformity Review has been conducted • Software Accomplishment Summary and Software Configuration Index are complete • All other software life cycle data are complete and under configuration control
Final Certification Review SOI # 4 • Covers all life cycle phases: Planning, Requirements, Design, Code/Integration, Integration/Test • Planning data: PSAC, SDP, SVP, SCMP, SQAP, Standards, Verif. data, SQA data, SCM data • Requirements data: Software Req. Data (High-Level Reqs., Derived High-Level Reqs.), Verif. data, SQA data, SCM data • Design data: Design Description (Architecture, Low-Level Reqs., Derived Low-Level Reqs.), Verif. data, SQA data, SCM data • Code/Integration data: Source Code, Executable Object Code, Verif. data, SQA data, SCM data • Integration/Test data: Test Cases and Procedures, Test Results, Verif. data, SQA data, SCM data • Cert liaison • ALL DO-178B tables (A-1 through A-10)
Review Process • Prepare for the review • Notify the applicant • Conduct the review • Document the results • Brief the applicant • Follow-up activities
Preparing for the Review • Assemble review team • Software engineering • Systems engineering • SCM • SQA • Prepare for review • Draft agenda • Coordinate with applicant
Notify the Applicant • Provide written notification • Purpose of review • Date and duration • Review team • DO-178B objectives to be addressed and data to be reviewed • Data to be submitted before review • Request appropriate personnel be available to answer questions during review • Recommend applicant perform self-assessment before review
Conducting the Review • Certification authority entry briefing • Applicant briefing • Conduct review • Reading • Interviewing • Sampling • Witnessing • Certification authority exit briefing
Document the Results • List of all data items reviewed • Explanation of Findings and Observations • Finding: Identification of a failure to show compliance with one or more of the DO-178B objectives • Observation: Identification of potential software life cycle improvement • Identify any other certification issues and action items
Follow-up • Provide written report to applicant • Follow-up on findings, observations, certification issues, and action items • Schedule additional reviews as needed
Level of FAA Involvement • When the FAA should be involved • To what extent the FAA should be involved • In which areas the FAA should focus its involvement
Level of FAA Involvement • Number, type, and depth of reviews will vary depending on: • Software Level • Product attributes (e.g. size, complexity) • New technology or novel design • Experience with DO-178B • Experience with Certification • Designee Support • Other special considerations
Determining the LOFI • Applicant and Certification Authority work together to determine LOFI at the beginning of the project • Three Levels of Involvement • High • Medium • Low
Determining the LOFI • Software level is the starting point: • Level A = Medium or High • Level B = Medium or High • Level C = Low or Medium • Level D = Low • Reference Table 3-1 in Order 8110.49
Determining the LOFI • Need to consider other relevant criteria for levels A, B, and C • Software Certification Experience • Demonstrated Software Development Capability • Software Service History • Current System and Software Application • Designee Capabilities • Reference Table 3-2 in Order 8110.49
Figure 3-2: Other Relevant Criteria • Software Certification Experience • Experience with civil aircraft or engine certification • Experience with RTCA/DO-178B • Experience with RTCA/DO-178 or RTCA/DO-178A • Experience with other software standards (other than RTCA/DO-178[ ])
Figure 3-2: Other Relevant Criteria • Demonstrated Software Development Capability • Ability to consistently produce RTCA/DO-178B software products • Cooperation, openness, and resource commitments • Ability to manage software development and subcontractors • Capability assessments (for example, Software Engineering Institute Capability Maturity Model, ISO 9000-3) • Development team's average relevant software development experience
Figure 3-2: Other Relevant Criteria • Software Service History • Incidents of software-related problems (as a percentage of affected products) • Company management’s support of DERs • Company software quality assurance organization and configuration management process • Company stability and commitment to safety • Success of past company certification efforts
Figure 3-2: Other Relevant Criteria • Current System and Software Application • Complexity of the system architecture, functions, and interfaces. • Complexity and size of the software and safety features • Novelty of design and use of new technology • Software development and verification environment • Use of alternative methods or additional considerations
Figure 3-2: Other Relevant Criteria • Designee Capabilities • Experience of DER(s) with RTCA/DO-178B • Designee authority, autonomy, and independence • Designee cooperation, openness, and issue resolution effectiveness • Relevance of assigned DERs' experience • Designees' current workload • Experience of DER(s) with other software standards (other than RTCA/DO-178[ ])
Figure 3-2: Other Relevant Criteria • Minimum and maximum values for each criterion • “Experience with RTCA/DO-178B”: No projects = 0; 2-4 projects = 5; More than 5 projects = 10 • “Experience with other software standards”: No projects = 0; 4-6 projects = 2; More than 7 projects = 4 • Scale is weighted: “Experience with RTCA/DO-178B” is weighted higher than “Experience with other software standards”
Figure 3-3: LOFI Determination (Total Score Result, TSR, from Figure 3-2)
• TSR < 80: Level A = HIGH, Level B = HIGH, Level C = MEDIUM, Level D = LOW
• 80 < TSR < 130: Level A = HIGH, Level B = MEDIUM, Level C = MEDIUM, Level D = LOW
• 130 < TSR: Level A = MEDIUM, Level B = MEDIUM, Level C = LOW, Level D = LOW
NOTE 1: If the TSR is close to the TSR boundary values (that is, 80 or 130), use the software level and engineering judgment to determine the most appropriate LOFI.
NOTE 2: If any criterion in Figure 3-2 is not applicable, the assessor may use the average value or adjust the Figure 3-3 boundaries.
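Read as a lookup, the scoring works like this: each Figure 3-2 criterion contributes a weighted point value, the values are summed into a TSR, and the TSR band plus the software level give the LOFI. The sketch below is illustrative only, assuming the two sample point values quoted above; the authoritative criteria, weights, and worksheet are in Order 8110.49, and boundary TSRs (80 or 130) call for engineering judgment per NOTE 1.

```python
# Illustrative sketch of the Figure 3-2/3-3 LOFI determination, NOT the
# official worksheet: only two sample criterion values are scored, and
# boundary TSRs (80, 130) are folded into the middle band here, whereas
# the order leaves them to engineering judgment (NOTE 1).

def lofi_from_tsr(tsr: int, software_level: str) -> str:
    """Map a Total Score Result and software level (A-D) to a LOFI per Figure 3-3."""
    if software_level == "D":
        return "LOW"  # Level D is always low FAA involvement
    bands = {
        "low":  ("HIGH", "HIGH", "MEDIUM"),    # TSR < 80  (least experience)
        "mid":  ("HIGH", "MEDIUM", "MEDIUM"),  # 80 <= TSR <= 130
        "high": ("MEDIUM", "MEDIUM", "LOW"),   # TSR > 130 (most experience)
    }
    band = "low" if tsr < 80 else ("mid" if tsr <= 130 else "high")
    return bands[band]["ABC".index(software_level)]

# Sum per-criterion scores into a TSR, then look up the LOFI.
# Only the two sample values from Figure 3-2 are scored here; a real
# assessment scores every criterion in the figure.
scores = {
    "Experience with RTCA/DO-178B (3 projects)": 5,
    "Experience with other software standards (5 projects)": 2,
}
tsr = sum(scores.values())      # 7 -> well below the 80 boundary
print(lofi_from_tsr(tsr, "B"))  # prints "HIGH": low score means more FAA involvement
```

Note the inverse relationship: a higher TSR reflects more applicant experience and capability, so it lowers the FAA's level of involvement.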
Determining the LOFI • Projects with issues that require new FAA policy typically require more FAA involvement • Level A and B = High • Level C and D = Medium
Determining the LOFI • Document specifics of FAA involvement using Appendix 1 in Order 8110.49 • Number and type of FAA on-site reviews • Number and type of FAA desk reviews • Data to be submitted to the FAA • Delegation to DERs
Appendix 1: LOFI Worksheet • Plan Based on LOFI Assessment: (for example, number of FAA on-site reviews, number of FAA desk reviews, data to be submitted to the FAA, and delegation to DERs) • Mid-Project Adjustments: (based on project improvements or problems) • Actual Project Results: (for example, number of FAA on-site reviews, number of FAA desk reviews, data submitted to the FAA, and delegation to DERs)
Example: High LOFI • Minimal delegation • Recommend approval of all data • Software specialists involved • FAA involvement throughout software life cycle • At least 2 on-site reviews • Submittal of all software plans • Submittal of SAS, SCI, Verification Results, and Compliance Matrix
Example: Medium LOFI • Moderate delegation • Recommend Approval of PSAC and SAS • Approve SCI and other data • FAA involvement at beginning and end of software life cycle • Software specialists may be involved • At least 1 on-site review • Submittal of PSAC, SCI, and SAS • Potential submittal of other software plans and Compliance Matrix
Example: Low LOFI • Maximum delegation • DER recommends approval of PSAC and SAS • DER approves all other software data • Minimal FAA involvement • No on-site reviews • Few desk-top reviews • Software specialists not involved • Submittal of PSAC, SCI, and SAS
Additional Information • Reference the FAA Software Job Aid “Conducting Software Reviews Prior to Certification” • View the “Using the Software Job-Aid to Conduct Software Reviews” video • Available through the FAA Web site: http://www.faa.gov/certification/aircraft/ • Link to “Aircraft Certification Software” on lower right hand side of page
Summary • Overview of the Software Review Process • Four Types of Software Reviews • Planning • Development • Verification • Final • Preparing, Conducting, and Documenting the Software Review
Summary • Determine the LOFI at the beginning of the project • Order 8110.49, Figures 3-1, 3-2, and 3-3 • Document the LOFI assessment • Order 8110.49, Appendix 1 • Make adjustments to the LOFI during the project, if needed • Reference the examples • Order 8110.49, Appendices 2, 3, and 4