The Software Development Life Cycle: An Overview
Presented by Maxwell Drew and Dan Kaiser, Southwest State University Computer Science Program
Last Time
• Brief review of the testing process
• Dynamic Testing Methods
• Static Testing Methods
• Deployment in MSF
• Deployment in RUP
Session 8: Security and Evaluation
• General Systems Engineering Concepts
• Information Systems Security Engineering Process
• Relation of ISSE Process to other Processes
• Product, Process & Resource Evaluation
• Course Evaluations
Information Systems Security Engineering
• General Systems Engineering Concepts
• Information Systems Security Engineering Process
• Relation of ISSE Process to other Processes
Discover Needs
• Mission/Business Description
• Policy Consideration
• Mission Needs Statement (MNS)
• Concept of Operations (CONOPS)
Define System Functionality
• Objectives - Measures of Effectiveness (MoE)
• System Context/Environment
• Requirements - Requirements Traceability Matrix (RTM)
• Functional Analysis
Define System
• Functional Allocation - Configuration Management (CM)
• Preliminary Design - Baseline Configuration
• Detailed Design - Configuration Items (CI)
Implement System
• Procurement
• Build
• Test
Assess Effectiveness
• Interoperability
• Availability
• Training
• Human/Machine Interface
• Cost
ISSE Activities
• Describing information protection needs
• Generating information protection requirements based on needs early in the systems engineering process
• Satisfying the requirements at an acceptable level of information protection risk
• Building a functional information protection architecture based on requirements
• Allocating information protection functions to a physical and logical architecture
• Designing the system to implement the information protection architecture
• Balancing information protection risk management and other ISSE considerations within the overall system context of cost, schedule, and operational suitability and effectiveness
ISSE Activities - Continued
• Participating in trade-off studies with other information protection and system engineering disciplines
• Integrating the ISSE process with the systems engineering and acquisition processes
• Testing the system to verify information protection design and validate information protection requirements
• Supporting the customers after deployment and tailoring the overall process to their needs
Discover Information Protection Needs
[Figure 3-2: Impact of Mission, Threats, and Policies on Information Protection Requirements]
Mission Information Protection Needs
• What kind of information records are being viewed, updated, deleted, initiated, or processed (classified, financial, proprietary, personal private, etc.)?
• Who or what is authorized to view, update, delete, initiate, or process information records?
• How do authorized users use the information to perform their duties?
• What tools (paper, hardware, software, firmware, and procedures) are authorized users using to perform their duties?
• How important is it to know with certainty that a particular individual sent or received a message or file?
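The first two questions amount to cataloguing record types and the operations each role may perform on them. Below is a minimal sketch of capturing those answers as an access matrix; the roles, record types, and operations are illustrative assumptions, not taken from the IATF material.

# Minimal sketch: capturing "who may do what to which records" as an access matrix.
# The roles, record types, and operations here are illustrative assumptions only.

OPERATIONS = {"view", "update", "delete", "initiate", "process"}

# role -> record type -> allowed operations
ACCESS_MATRIX = {
    "clerk":            {"financial": {"view", "initiate"},
                         "personal_private": {"view"}},
    "auditor":          {"financial": {"view", "process"}},
    "security_officer": {"classified": {"view", "update", "delete"}},
}

def is_authorized(role: str, record_type: str, operation: str) -> bool:
    """Return True if the role may perform the operation on the record type."""
    assert operation in OPERATIONS, f"unknown operation: {operation}"
    return operation in ACCESS_MATRIX.get(role, {}).get(record_type, set())

if __name__ == "__main__":
    print(is_authorized("clerk", "financial", "initiate"))   # True
    print(is_authorized("clerk", "classified", "view"))      # False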
Threats to Information Management
• Types of Information
• Legitimate users and uses of information
• Threat agent considerations
  - Capability
  - Intent
  - Willingness
  - Motivation
  - Damage to mission
Information Protection Policy Considerations
• Why protection is needed
• What protection is needed
• How protection is achieved (not considered at this stage)
Information Protection Policy Issues
• The resources/assets the organization has determined are critical or need protection
• The roles and responsibilities of individuals who will need to interface with those assets (as part of their operational mission needs definition)
• The appropriate ways (authorizations) in which authorized individuals may use those assets (security requirements)
Define Information Protection System
• Information Protection Objectives - MoE
• System Context/Environment
• Information Protection Requirements - RTM (see the sketch below)
• Functional Analysis
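An RTM records, for each requirement, where it is satisfied in the design and how it will be verified. A minimal sketch of such a matrix follows; the requirement IDs, design elements, and test cases are illustrative assumptions.

# Minimal sketch of a requirements traceability matrix (RTM): each requirement
# maps to the design elements that satisfy it and the tests that verify it.
# IDs and names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class TraceEntry:
    requirement: str
    design_elements: list[str] = field(default_factory=list)
    test_cases: list[str] = field(default_factory=list)

rtm = [
    TraceEntry("IP-001: Encrypt records in transit",
               design_elements=["TLS gateway"], test_cases=["TC-014"]),
    TraceEntry("IP-002: Authenticate all users",
               design_elements=["LDAP login module"], test_cases=["TC-021", "TC-022"]),
    TraceEntry("IP-003: Audit record deletions"),  # not yet allocated
]

# A quick coverage check: requirements with no design element or no test are gaps.
for entry in rtm:
    if not entry.design_elements or not entry.test_cases:
        print(f"GAP: {entry.requirement}")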
Information Protection Objectives Should Explain:
• The mission objectives supported by the information protection objective
• The mission-related threat driving the information protection objective
• The consequences of not implementing the objective
• Information protection guidance or policy supporting the objective
Design Information Protection System
• Functional Allocation
• Preliminary Information Protection Design
• Detailed Information Protection Design
Preliminary Information Protection Design Activities
• Reviewing and refining Discover Needs and Define System activities' work products, especially definition of the CI-level and interface specifications
• Surveying existing solutions for a match to CI-level requirements
• Examining rationales for proposed PDR-level (of abstraction) solutions
• Verifying that CI specifications meet higher-level information protection requirements
• Supporting the certification and accreditation processes
• Supporting information protection operations development and life-cycle management decisions
• Participating in the system engineering process
Detailed Information Protection Design Activities
• Reviewing and refining previous Preliminary Design work products
• Supporting system- and CI-level design by providing input on feasible information protection solutions and/or review of detailed design materials
• Examining technical rationales for CDR-level solutions
• Supporting, generating, and verifying information protection test and evaluation requirements and procedures
• Tracking and applying information protection assurance mechanisms
• Verifying that CI designs meet higher-level information protection requirements
• Completing most inputs to the life-cycle security support approach, including providing information protection inputs to training and emergency training materials
• Reviewing and updating information protection risk and threat projections as well as any changes to the requirements set
• Supporting the certification and accreditation processes
• Participating in the system engineering process
Implement Information Protection System
• Procurement
• Build
• Test
Implement Information Protection System: General Activities
• Updates to the system information protection threat assessment, as projected, to the system's operational existence
• Verification of system information protection requirements and constraints against implemented information protection solutions, and associated system verification and validation mechanisms and findings
• Tracking of, or participation in, application of information protection assurance mechanisms related to system implementation and testing practices
Implement Information Protection System: General Activities (cont.)
• Further inputs to and review of evolving system operational procedure and life-cycle support plans, including, for example, Communication Security (COMSEC) key distribution or releasability control issues within logistics support, and information protection relevant elements within system operational and maintenance training materials
• A formal information protection assessment in preparation for the Security Verification Review
• Inputs to Certification and Accreditation (C&A) process activities as required
• Participation in the collective, multidisciplinary examination of all system issues
Build Information Protection System
• Physical Integrity: Have the components used in production been properly safeguarded against tampering?
• Personnel Integrity: Are the people assigned to construct or assemble the system knowledgeable in proper assembly procedures, and are they cleared to the proper level necessary to ensure system trustworthiness?
Test Information Protection System Activities
• Reviewing and refining Design Information Protection System work products
• Verifying system- and CI-level information protection requirements and constraints against implemented solutions and associated system verification and validation mechanisms and findings
• Tracking and applying information protection assurance mechanisms related to system implementation and testing practices
• Providing inputs to and review of the evolving life-cycle security support plans, including logistics, maintenance, and training
• Continuing risk management activities
• Supporting the certification and accreditation processes
• Participating in the systems engineering process
Assess Effectiveness
• Interoperability: Does the system protect information correctly across external interfaces?
• Availability: Is the system available to users to protect information and information assets?
• Training: What degree of instruction is required for users to be qualified to operate and maintain the information protection system?
• Human/Machine Interface: Does the human/machine interface contribute to users making mistakes or compromising information protection mechanisms?
• Cost: Is it financially feasible to construct and/or maintain the information protection system?
Relation to Other Processes
• System Acquisition Process
• Risk Management Process
• DITSCAP
• Common Criteria International Standard
Risk Management Process
[Figure 3-5: Risk Management Process]
Risk Plane
[Figure 3-7: Risk Plane]
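The risk plane positions each risk by its likelihood of occurrence and the severity of its impact; the figure itself is not reproduced here. Below is a minimal, illustrative sketch of that idea, assuming a simple qualitative 1-5 scale on both axes; the scales, thresholds, and example risks are assumptions, not taken from the IATF.

# Minimal sketch of a risk plane: each risk is placed by likelihood and impact.
# The 1-5 scales, region thresholds, and sample risks are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (catastrophic)

    def exposure(self) -> int:
        return self.likelihood * self.impact

    def region(self) -> str:
        """Rough region of the risk plane (thresholds are arbitrary)."""
        score = self.exposure()
        if score >= 15:
            return "mitigate"
        if score >= 8:
            return "monitor"
        return "accept"

risks = [
    Risk("Insider misuse of financial records", likelihood=2, impact=4),
    Risk("Malware on maintenance laptop", likelihood=4, impact=3),
    Risk("Physical theft of backup media", likelihood=1, impact=5),
]

for r in sorted(risks, key=Risk.exposure, reverse=True):
    print(f"{r.name:40s} L={r.likelihood} I={r.impact} -> {r.region()}")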
Security Concepts & Relationships
[Figure 3-9: Security Concepts and Relationships in the Common Criteria]
Use of Evaluation Results
[Figure 3-12: Uses of Evaluation Results]
Evaluation
• General Techniques
• Evaluating the Product
• Evaluating the Process
• Evaluating Resources
Categories of Evaluation
• Feature analysis: rate and rank attributes
• Survey: document relationships
• Case study: sample from variables
• Formal experiment: sample over variables
Example Feature Analysis

Feature                  Tool 1: t-OO-l   Tool 2: ObjecTool   Tool 3: EasyDesign   Importance
Good user interface            4                 5                    4                 3
Object-oriented design         5                 5                    5                 5
Consistency checking           5                 3                    1                 3
Use cases                      5                 4                    4                 2
Runs on Unix                   4                 4                    5                 5
Score                         82                77                   73

Table 12.1. Design tool ratings (see the scoring sketch below)
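The Score row is the importance-weighted sum of each tool's ratings (e.g., t-OO-l: 4*3 + 5*5 + 5*3 + 5*2 + 4*5 = 82). A minimal sketch of that computation, using the ratings from Table 12.1:

# Weighted feature scoring as in Table 12.1: score = sum(importance * rating).
importance = {"Good user interface": 3, "Object-oriented design": 5,
              "Consistency checking": 3, "Use cases": 2, "Runs on Unix": 5}

ratings = {
    "t-OO-l":     {"Good user interface": 4, "Object-oriented design": 5,
                   "Consistency checking": 5, "Use cases": 5, "Runs on Unix": 4},
    "ObjecTool":  {"Good user interface": 5, "Object-oriented design": 5,
                   "Consistency checking": 3, "Use cases": 4, "Runs on Unix": 4},
    "EasyDesign": {"Good user interface": 4, "Object-oriented design": 5,
                   "Consistency checking": 1, "Use cases": 4, "Runs on Unix": 5},
}

for tool, tool_ratings in ratings.items():
    score = sum(importance[feature] * rating for feature, rating in tool_ratings.items())
    print(f"{tool}: {score}")   # t-OO-l: 82, ObjecTool: 77, EasyDesign: 73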
Case Study Types
• Sister projects: each is typical and has similar values for the independent variables
• Baseline: compare single project to organizational norm
• Random selection: partition single project into parts
Formal Experiment
• Controls variables
• Uses methods to reduce bias and eliminate confounding factors
• Often replicated
• Instances are representative: sample over the variables (whereas a case study samples from the variables)
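As an illustration of what controlling variables and reducing bias can look like in analysis, here is a minimal sketch of a randomization (permutation) test comparing two randomly assigned treatment groups; the data values and resample count are illustrative assumptions, not from the slides.

# Minimal sketch: randomization test for a two-treatment formal experiment.
# Subjects are randomly assigned to treatments; we ask whether the observed
# difference in mean outcome could plausibly arise by chance alone.
# The data values and resample count are illustrative assumptions.
import random

treatment_a = [12.1, 9.8, 11.4, 10.9, 12.5]   # e.g., defect-fix times with tool A
treatment_b = [13.0, 12.7, 11.9, 14.2, 13.4]  # e.g., defect-fix times with tool B

observed = abs(sum(treatment_a) / len(treatment_a) - sum(treatment_b) / len(treatment_b))

pooled = treatment_a + treatment_b
n_a = len(treatment_a)
trials = 1000
extreme = 0
random.seed(0)
for _ in range(trials):
    random.shuffle(pooled)
    a, b = pooled[:n_a], pooled[n_a:]
    diff = abs(sum(a) / len(a) - sum(b) / len(b))
    if diff >= observed:
        extreme += 1

print(f"observed difference: {observed:.2f}")
print(f"approximate p-value: {extreme / trials:.3f}")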
Evaluation Steps
• Setting the hypothesis: the tentative supposition that we think explains the behavior we want to explore
• Maintaining control over variables: decide what affects our hypothesis
• Making investigation meaningful: determine the degree to which results can be generalized
Common Evaluation Pitfalls
1. Confounding: Another factor is causing the effect.
2. Cause or effect? The factor could be a result, not a cause, of the treatment.
3. Chance: There is always a small possibility that your result happened by chance.
4. Homogeneity: You can find no link because all subjects had the same level of the factor.
5. Misclassification: You can find no link because you cannot accurately classify each subject's level of the factor.
6. Bias: Selection procedures or administration of the study inadvertently bias the result.
7. Too short: The short-term effects are different from the long-term ones.
8. Wrong amount: The factor would have had an effect, but not in the amount used in the study.
9. Wrong situation: The factor has the desired effect, but not in the situation studied.

Table 12.2. Common pitfalls in evaluation. Adapted with permission from (Liebman 1994)
Assessment vs. Prediction
• An assessment system examines an existing entity by characterizing it numerically
• A prediction system predicts a characteristic of a future entity; it involves a model with associated prediction procedures
  - deterministic prediction (we always get the same output for a given input)
  - stochastic prediction (output varies probabilistically)
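A minimal sketch of the deterministic/stochastic distinction, assuming a toy effort-prediction model; the formula and the noise term are illustrative assumptions, not from the slides.

# Minimal sketch: deterministic vs. stochastic prediction.
# The effort model (effort = 2.5 * size^1.05) and the noise term are
# illustrative assumptions, not a real calibrated model.
import random

def predict_effort_deterministic(size_kloc: float) -> float:
    """Same input always yields the same output."""
    return 2.5 * size_kloc ** 1.05

def predict_effort_stochastic(size_kloc: float) -> float:
    """Output varies probabilistically around the deterministic value."""
    noise = random.gauss(mu=1.0, sigma=0.15)  # multiplicative uncertainty
    return predict_effort_deterministic(size_kloc) * noise

print(predict_effort_deterministic(40.0))  # identical on every call
print(predict_effort_stochastic(40.0))     # differs from call to call
print(predict_effort_stochastic(40.0))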
Product Quality Models
• Boehm's Model
• ISO 9126 Model