BE SECBS – IRSN assessment

Context
- application of the IRSN methodology to the reference case study: safety assessment of the MADTEB limitation functions

Scope of the assessment
- assessment of the application software only; system software and networks are out of the scope of the exercise

Scope of this presentation
- concentrates on the IRSN work (the subject of the benchmark)
- contains no FANP confidential material
Assessment tasks
- Assessment of the process
- Assessment of the product:
  - requirement specification
  - system specification (design)
  - generated code
  - code verification
  - system integration
  - validation
- Synthesis and recommendations
Assessment of the process

Input documents
- Quality Assurance Plan for the Benchmark Exercise
- Verification and Validation Plan

Goals
- to assess the definition and coherence of the life-cycle phases, of their inputs and outputs, of the verification process, and of the criteria for ending one phase and beginning the next
- to assess the process against IEC 60880 and the French Basic Safety Rule requirements, e.g.:
  - an explicit set of phases (requirements, design, verification, …)
  - formalization of each phase and of the documents produced
  - independence of the verification team
  - …

Means: critical document review
Assessment of the product: requirements

Goal
- completeness, clarity, coherence and precision of the requirements regarding functional and temporal behavior, accuracy, tolerance to hardware and software faults, and interfaces with other systems and users
- however, gaining independent knowledge of the plant needs is not possible in this limited exercise

Means: critical document review

Input document: Requirement Specification for the Benchmark Exercise (two versions; the assessment of V1 resulted in questions to FANP)
Assessment of the product: design (1)

("system specification" in FANP terminology)

- Starting from the existing platform, two design levels are actually performed and assessed:
  - architectural design
  - application software design
- System software is out of the scope of the project:
  - it should be assessed in an actual case (design and engineering process)
  - the system properties supporting the application behavior should be demonstrated
Assessment of the product: design (2)

Goal: to assess how the architecture satisfies the requirements
1. are the properties and interfaces of the existing hardware and software that are necessary to the safety demonstration clearly and precisely written?
2. is the set of requirements of the application software exhaustively written?
3. assess the demonstration, based on 1 and 2, that the application software interacts adequately with the existing platform

Goal: to assess the completeness, clarity and precision of the application software design documentation
- the documentation should demonstrate how the SPACE diagrams implement the application software requirements (behavior, interfaces, fault tolerance, …)
- it should also be demonstrated that the application software design does not include any non-required feature

Means: critical document review
- System Specification for the Benchmark Exercise
- Detailed Function Diagrams of the Four-Train Configuration
Assessment of the product: generated code

Goal
- clarity and justification of the coding choices made, both those built into SPACE and those left to the user of SPACE
- correctness of the generated code
- clarity, testability, maintainability and portability (as the target hardware and the compiler may change)

Means
- critical document review
- building of the object code (to check the completeness of the available files)
- code quality analysis using QAC
- semantic analysis to search for run-time errors using PolySpace Verifier
Assessment of the verification

Goal: to assess the code verification performed by the manufacturer
- the manufacturer's verification plan should demonstrate the relevance, clarity and adequate level of detail of the choices made regarding the test bench and the coverage criteria
- it should include the test scenarios, including the acceptance criteria
- it should make it possible for an independent team to conclude objectively whether or not each module performs, and interacts with the other modules, as required
- IRSN finally assesses the results of the verification and the discrepancy reports
Assessment of the validation (1)

Goal: to assess the validation performed by the manufacturer
- the manufacturer's validation plan should document the test scenarios:
  - adequate coverage of the ranges of input signals, of the computed variables, and of the interactions between redundant units
  - predefined expected outputs should be included
- the tests must also demonstrate the accuracy, the response time, and the fault tolerance properties of the system
- this plan should be developed with the required level of independence
- IRSN finally assesses the results of the validation and the discrepancy reports
Assessment of the validation (2)

Means
- critical document review
- test coverage evaluation using GATeL (test generation) and CLAIRE (execution):
  - apply generic criteria to the software to build a list of potential tests (categories)
  - check whether each category is empty or not (step 1)
  - run (by simulation) the manufacturer's tests to check whether each non-empty category is covered or not (step 2)
  - produce a test scenario (inputs, expected outputs) for each non-empty, non-covered category (step 3)
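The bookkeeping behind these steps can be sketched as follows. This is a minimal illustration of the category/coverage idea only, not the actual GATeL/CLAIRE tooling; the category names, predicates and test records are hypothetical.

```python
# Hypothetical sketch: mark each test category as covered or not by the
# manufacturer's test set (steps 1-3 of the coverage evaluation).
def coverage_report(categories, manufacturer_tests):
    """categories: {name: predicate(test) -> bool}.
    A category is 'covered' if at least one manufacturer test falls in it;
    otherwise a test scenario must be produced for it (step 3)."""
    report = {}
    for name, belongs in categories.items():
        hits = [t for t in manufacturer_tests if belongs(t)]
        report[name] = "covered" if hits else "missing test"
    return report

# Illustrative categories for a pulse module (cf. next slide):
categories = {
    "single isolated pulse": lambda t: t["pulses"] == 1 and not t["starts_high"],
    "input starts at 1": lambda t: t["starts_high"],
    "double pulse while output set": lambda t: t["pulses"] >= 2,
}
tests = [
    {"pulses": 1, "starts_high": False},
    {"pulses": 2, "starts_high": False},
]
print(coverage_report(categories, tests))
```

Here the second category has no witness in the manufacturer's tests, so a dedicated scenario with its expected outputs would be generated for it.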
Assessment of the validation (3)

First coverage criterion: local behaviors of modules
Principle: define categories for each type of module
Examples:
- &, or: truth table, or part of the truth table
- pulse:
  - Cat 1: single isolated pulse (nominal solicitation)
  - Cat 2: input starts at 1 (non-obvious starting condition)
  - Cat 3: double input pulse while the output is set (distinguishes between retriggerable and non-retriggerable modules)
- flip-flop:
  - Cat 1: …
  - Cat 2: …
  - Cat 3: reset while set is at 1 (priority of R over S)
- …
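As an illustration of such a category, the flip-flop case "reset while set is at 1" can be exercised on a toy model. The model below is a generic reset-dominant RS flip-flop written for this sketch, not the actual SPACE function block.

```python
# Illustrative reset-dominant RS flip-flop (not the SPACE module):
# category 3 checks that R has priority over S.
def rs_flipflop(s, r, q_prev):
    """One evaluation cycle; R wins when both inputs are 1."""
    if r:
        return False
    if s:
        return True
    return q_prev

# Cat 3 scenario: apply reset while set is held at 1 -> output must drop.
q = rs_flipflop(s=True, r=False, q_prev=False)  # set the output
assert q is True
q = rs_flipflop(s=True, r=True, q_prev=q)       # reset while set is at 1
assert q is False                               # priority of R over S
```

A set-dominant implementation would fail this scenario, which is exactly why the category is worth a dedicated test.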
Assessment of the validation (4)

Second coverage criterion: elementary logical triggering conditions
Principle:
- select one binary output
- one category = one of the involved inputs triggers the output, the other inputs being unchanged
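This criterion can be sketched by enumeration: for each input of a chosen binary output, look for a context in which flipping that input alone raises the output. The diagram function `trip` below is a made-up example, not taken from the MADTEB diagrams.

```python
from itertools import product

def trip(a, b, c):
    """Illustrative binary output: trips on a OR (b AND c)."""
    return a or (b and c)

def elementary_categories(output_fn, n_inputs):
    """One category per input i: input i alone goes 0 -> 1 and triggers
    the output, the other inputs being unchanged. Returns one witness
    context per input for which such a trigger exists."""
    cats = []
    for i in range(n_inputs):
        for base in product([False, True], repeat=n_inputs):
            if base[i]:
                continue  # input i must start at 0
            flipped = list(base)
            flipped[i] = True
            if not output_fn(*base) and output_fn(*flipped):
                cats.append((i, base))
                break  # one witness per input is enough for this sketch
    return cats

for i, base in elementary_categories(trip, 3):
    print(f"input {i} triggers the output from context {base}")
```

For this example each of the three inputs has such a triggering context, so the criterion yields three categories to cover.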
Assessment of test coverage using GATeL/CLAIRE

Step 1: establishing the coverage matrix (GATeL)
- inputs: coverage criteria, functional diagrams
- determination of the categories (Lustre program + constraint solver)
- filtering out unreachable categories

Step 2: filling the coverage matrix (CLAIRE)
- inputs: test scenarios, source or binary code (if available)
- running the tests on an instrumented model of the application program

Step 3: generating the "missing" tests (GATeL, as in step 1)
- output: a test scenario (inputs, expected outputs) per uncovered category
Conclusion
- synthesis of the assessments and of the findings
- recommendation to the Safety Authority:
  - to accept the system or not
  - possibly, to ask for additional verification and validation