How to be Confident that a Tool Works and How to Document it Scientifically Dr. James R. Lyle National Institute of Standards and Technology
Plan of Talk • The Problem • Overview of V&V • Overview of Conformance Testing • An Example: CFTT
The Problem • A forensics lab might use several software tools • How does the lab establish that the tools work correctly?
Some Possible Approaches • Ad Hoc Testing • Formal Verification & Validation • Conformance Testing
Ad Hoc Testing is not Enough • Just running a few test cases is not enough • Test case selection must be justified • Test environment and procedures must be documented • Test results must be documented
Goals of V&V V&V is a process that provides an objective assessment of products and processes throughout the software life cycle, confirming that they are: • Correct • Complete • Consistent • Testable
Definitions of V & V (informal) • Verification: Does the software solve the problem correctly? • Validation: Does the software solve the correct problem?
Definitions of Verification • The process of evaluating a system to determine whether the products of a given development phase satisfy the conditions imposed at the start of the phase. IEEE 610 • Confirmation by examination and provision of objective evidence that the specified requirements have been fulfilled. ISO 8402:1994/IEEE 1012
Definitions of Validation • The process of evaluating a system during or at the end of the development process to determine whether it satisfies specified requirements. IEEE 610 • Confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled. ISO 8402:1994/IEEE 1012
V&V Processes in a Project • Management • Acquisition • Supply • Development • Operation • Maintenance
Development V&V Activities • Concept – delineation of a solution • Requirements – what should be done • Design – how to do it • Implementation – create the code • Test – try it out (systematically) • Installation & checkout – deploy
Wait a sec – what do we want? • We want to assess a COTS tool for a computer forensics lab • We don’t need: concept, design or implementation • We just want to do testing • But to test we need requirements • Should look at V&V of requirements and tests
Requirements V&V Tasks • Traceability Analysis • Software Requirements Evaluation • Interface Analysis • Criticality Analysis • System V&V Test Plan • Acceptance V&V Test Plan • Configuration Management Assessment • Hazard Analysis • Risk Analysis
Requirements Evaluation • Correctness • Consistency • Completeness • Accuracy • Readability • Testability
Requirements Correctness • Requirements within constraints and assumptions of the system • Requirements comply with standards, regulations, policies, physical laws • Validate data usage and format
Requirements Consistency • Terms and concepts are consistent • Function interactions and assumptions are consistent • Internal consistency
Requirements Completeness • All functions specified • Interfaces: hardware, software & user • Performance criteria • Configuration data
Requirements Accuracy • Precision (e.g., truncation & rounding) • Modeled physical phenomena conform to system accuracy and physical laws
Requirements Readability • Legible, understandable and unambiguous to the target audience • All acronyms, mnemonics, abbreviations, terms and symbols are defined
Requirements Testability • There must be objective acceptance criteria for validating the requirements
Testing V&V Tasks • Traceability Analysis • Test Procedure Generation • Test Case Execution & Report • Hazard Analysis • Risk Analysis
Test Case Traceability • Each test case must be derived from one or more requirements • Constructing a requirements to test cases matrix is helpful
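A requirements-to-test-cases matrix can also be checked mechanically. A minimal sketch in Python, with hypothetical requirement and test case IDs:

```python
# Minimal requirements-to-test-cases traceability check.
# The requirement and test case IDs below are illustrative, not CFTT's.
requirements = ["SWB-RM-01", "SWB-RM-02", "SWB-RM-03"]
test_cases = {
    "TC-01": ["SWB-RM-01"],
    "TC-02": ["SWB-RM-01", "SWB-RM-02"],
    "TC-03": [],  # traces to no requirement -- flagged below
}

# Every test case must be derived from at least one requirement ...
untraced_cases = [tc for tc, reqs in test_cases.items() if not reqs]

# ... and every requirement should be covered by at least one test case.
covered = {r for reqs in test_cases.values() for r in reqs}
uncovered_reqs = [r for r in requirements if r not in covered]

print("Test cases with no requirement:", untraced_cases)
print("Requirements with no test case:", uncovered_reqs)
```

Running the check surfaces both gaps that a traceability matrix is meant to expose: orphan test cases and untested requirements.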
Test Case Procedure Generation • Each test case must have a documented procedure for test execution
Test Case Execution • Run each test case • Document results • Anomaly identification and documentation procedure
Hazard Analysis • Hazard: a source of potential harm or damage (to people, property or the environment) • Hazard Analysis: identification and characterization of hazards
Risk Analysis • Risk: combination of the frequency (or probability) and the consequence of a given hazard. • Risk Analysis: systematic use of available information to identify hazards and to estimate the risk to individuals, populations, property or the environment.
V & V of COTS Summary • Create set of tool Requirements • Review • Create Test Cases & Test Environment • Review • Execute Test Cases • Write Report
What Is Conformance Testing? • Used to check an implementation against a standard or specification • ISO Guide 2 defines conformance as the fulfillment by a product, process or service of specified requirements. • Requirements are specified in a standard or specification.
Components for Testing • Specification (less formal) or standard (ISO, IEEE, ANSI) • Conformance Test Suite • What about validation and certification? • Need: Testing Lab, Certification Authority and dispute resolution
(Requirements) Specification • Tool User Manual? No – too specific • Spec should cover core functionality required for correct operation of similar tools. • Should apply to most similar tools • OK if some features omitted – cover omitted features in another spec • Use V&V techniques to review spec
Conformance Test Suite • Test cases & test case documentation • Derived from the specification (using V&V techniques: traceable, complete, consistent, etc) • Description of test purpose • Pass/fail criteria • Trace back to requirements in spec
Testing Methodology • Source code is not available (a good thing) • Apply black box testing theory to create tests • Test both legal and illegal inputs • Tools may provide optional features – some test cases may be executed only if a tool provides an optional feature
Who Runs the Tests? • Testing Lab, Vendor, Tool User • Lab may be accredited or just recognized as qualified to run the tests • Lab produces a test report
The Test Report • Detail pass/fail for each test case • Complete description of Tool Under Test • Name of test lab • Date of test • Name and Version of test suite • Unambiguous statement of results
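The report fields listed above map naturally onto a small record type. A sketch (the field names and summary format are illustrative, not CFTT's actual report layout):

```python
from dataclasses import dataclass, field

@dataclass
class TestReport:
    """One conformance test report: tool identity, lab, suite, results."""
    tool_name: str
    tool_version: str
    test_lab: str
    test_date: str
    suite_name: str
    suite_version: str
    results: dict = field(default_factory=dict)  # test case id -> "pass"/"fail"

    def summary(self) -> str:
        """Unambiguous statement of results: name every failed case."""
        failed = [tc for tc, r in self.results.items() if r != "pass"]
        if failed:
            return "FAIL: " + ", ".join(failed)
        return "PASS: all %d test cases passed" % len(self.results)
```

Keeping every field mandatory in the constructor enforces the slide's point: a report without the tool version, lab name, or suite version is incomplete by construction.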
Certification Authority • Reviews (validates) test results • Explicit criteria for issuing a certificate • Issues certificate (or brand) for validated product • Another definition of Validation: the process necessary to perform conformance testing in accordance with a prescribed procedure and official test suite. • Certification: acknowledgement that a validation was completed and that the criteria established for issuing certificates were met
NIST CFTT Products • Forensics tool function specifications • Forensics tool function test methodologies • Test support software • Forensics tool function test reports
Specification Development • CFTT sponsors identify tool function • Focus group (LE & NIST) to identify requirements • NIST drafts specification for external review • NIST develops test methodology, test harness
Tool Testing Process • Sponsors identify tool to test • Write test plan (identify test cases to run) • Run Tests • Write Test Report
Examples From CFTT • Writing Requirements • Support Software Documentation • Test Cases
Outline of a Specification • Introduction • Scope • Technical Background • Requirements: what should the tool do • Test Assertions: If/then statements that are testable • Test Methodology • Test Cases: combinations of test assertions • Traceability Matrices
SW Write Blocker Requirements • Informal requirement: No change allowed to a drive that contains evidence • Also: Must be able to read the entire drive • Formal: (1) The tool shall block any commands to a protected disk in the write, configuration, or miscellaneous categories. • (2) The tool shall not block any commands to a protected disk in the read, control or information categories
Apply V&V to Spec for … • Correctness • Consistency • Completeness • Accuracy • Readability • Testability
Requirements Completeness • E.g., look at combinations of parameters: read/write command vs. protected/unprotected disk • This exposes a missing case, so we need another requirement: (3) The tool shall not block any commands to an unprotected disk.
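Requirements (1)–(3) together form a complete decision rule, which can be written out as a few lines of Python (the category names follow the spec's command grouping; the function itself is an illustration, not real blocker code):

```python
# Command categories from the SWB spec's grouping of disk commands.
BLOCKED_CATEGORIES = {"write", "configuration", "miscellaneous"}
ALLOWED_CATEGORIES = {"read", "control", "information"}

def should_block(category: str, disk_protected: bool) -> bool:
    """Decide whether a software write blocker should block a command.

    Requirement (3): never block anything sent to an unprotected disk.
    Requirements (1) and (2): on a protected disk, block write/
    configuration/miscellaneous commands and pass everything else.
    """
    if not disk_protected:
        return False                      # requirement (3)
    return category in BLOCKED_CATEGORIES  # requirements (1) and (2)
```

Enumerating the parameter combinations this way is exactly the completeness check the slide describes: every (category, protected) pair falls under one of the three requirements.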
Requirements Testability • SWB test methodology: replace the interrupt 13h (BIOS disk service) routine with one that counts the number of times each I/O function is called, but does not execute any commands. • Any blocked command has a count of 0
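The counting check at the end of a test run can be sketched as follows; the command names and counts here are illustrative stand-ins for the actual int 13h function codes:

```python
# After a test run, the stub interrupt handler has recorded how many
# times each I/O function was issued to the protected drive.
# These counts are example data, not output from a real run.
counts = {"read": 12, "write": 0, "verify": 3, "format": 0}

# Commands the blocker is required to stop on a protected disk.
blocked = {"write", "format"}

# A command was successfully blocked iff its count is zero.
violations = [cmd for cmd in sorted(blocked) if counts[cmd] != 0]
assert not violations, f"blocker let through: {violations}"
```

This is what makes the requirement testable: "blocked" is reduced to an objective, countable criterion rather than a judgment call.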
A Testable Requirement? • Documentation shall be correct insofar as the mandatory and any implemented optional requirements are concerned.
Evolution of an Imaging Req • If a source disk is imaged to a destination disk of the same geometry then the disks compare equal. • If a duplicate copy is created directly from a source disk of the same geometry, then the disks must compare equal. • The tool shall create a bit-stream duplicate of the original. • If there are no errors accessing the source media, then the tool shall create a bit-stream duplicate of the original.
Imaging Test Methodology • Init source to known state • SHA-1 for source • Init destination to known state • Run tool • Compare source to destination • Rehash source
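Assuming the source and destination are accessible as files (device nodes or image files), the hash-and-compare steps above can be sketched in Python; the paths and the tool invocation are placeholders:

```python
import hashlib

def sha1_of(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-1 of a disk image or device node, read in chunks."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def check_duplicate(source: str, destination: str) -> bool:
    """Hash source, run the imaging tool (not shown), hash the
    destination, then rehash the source to confirm it was unchanged."""
    before = sha1_of(source)
    # ... run the imaging tool under test here ...
    dest_hash = sha1_of(destination)
    after = sha1_of(source)  # rehash: the source must be unchanged
    return before == after == dest_hash
```

The final rehash matters: it catches a tool that produces a correct copy but also writes to the source, which a simple source-vs-destination compare would miss.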
Test Case Traceability • Every test case must be traceable and unambiguous. • This case is neither: TEST CASE DI-167: Create an image from a BIOS-IDE source disk to a BIOS-IDE destination disk, where the source contains a FAT32 partition, the source disk is smaller than the destination, and the source disk contains a deleted partition. EXPECTED RESULTS: source compares qualified-equal to destination; deleted partition is recovered
Test Logging • Log everything, automatically if practical • Hardware, Software, Versions • Time/date • Operator