CC in the Real World

This article discusses the real-world issues faced by practitioners when implementing the Common Criteria (CC). It highlights challenges such as difficulty in matching agencies' needs with evaluated products, vendors' lack of awareness of the evaluation process, a reduction in the number of product validators, and a lack of performance measures. The article also explores broader issues with the CC, including timeliness, cost, the narrow scope of certification, limited mutual recognition, and comprehensibility.

Presentation Transcript


  1. CC in the Real World Fiona Pattinson (Image credit: NASA / The Visible Earth.) When technologies based on the state of the art in a rapidly maturing discipline are put into practice, some “real world” issues surface for the practitioners to cope with.

  2. In the U.S. GAO report (CC consumers): • March 2006 • Difficulty in matching agencies’ needs with the availability of NIAP-evaluated products • Vendors’ lack of awareness regarding the evaluation process • A reduction in the number of validators to certify products • A lack of performance measures and difficulty in documenting the effectiveness of the NIAP process http://www.gao.gov/new.items/d06392.pdf

  3. Vendor Issues • May 2007 GCN: “Symantec: Common Criteria is bad for you” • CC is developed as a closed standard • Does not address the areas where vulnerabilities actually occur • Starts late in the development process • It is difficult to produce a complete ST at the beginning of a project • Assurance vs. convenience balance http://www.gcn.com/online/vol1_no1/44205-1.html

  4. General Issues with CC… • Timeliness • Takes too long • Cost • Costs too much • Validity/scope of certification • Too narrow • Mutual recognition • Limited • Composition • Was not addressed • Comprehensibility • Too hard to cope with David Brewer: “Is the Common Criteria the only way?”: https://www.ipa.go.jp/event/iccc2005/pdf/B2-10.pdf

  5. “The product I want to buy is not on the list.” • Over the last few years: • Greater emphasis on understanding the risks presented by a greater number and variety of IT products • Emphasis on certification as a solution to support risk-based decisions • Scalability • Can the CC schemes handle the needed volume?

  6. Rise in number of evaluations [chart: ITSEC evaluations, from “Is the Common Criteria the only way?”, David Brewer, 2005]

  7. “I don’t know where to start with CC.” • This *is* a FAQ from new vendors & teams • Most national schemes publish good beginner info • So does the CC Portal • Many labs provide basic information (and consultancy!) • “I’ve heard that CC is expensive, takes forever, needs lots of specialised documents, is hard work… etc.” ‘Does Common Criteria need marketing?’, ICCC 2005, Tokyo.

  8. “It costs HOW much?!” • Effects • Some claim it is used as a trade control measure • Prices smaller companies out of the market • Compare • FIPS 140-2: Smaller scope, smaller cost • Consultation and testing components • ISMS: Consultation and auditing costs

  9. “Our products are released every 6 months, but we were told that our CC evaluation would take 9-18 months.” • This may be true for a first-time evaluation • There’s a learning curve for product teams using CC • As a product and dev team’s “evaluation career” progresses, it is possible for certification to coincide with the end of the development cycle to within a few weeks

  10. Example: Linux

  11. How can I make CC easier next time? Improving the development process with assurance provision in mind • CC is not a direct process or product improvement standard • If an organization has assurance provision as part of a product-line strategy, then security process and product improvement are vital • Informal, or formal (SSE-CMM / ISO/IEC 21827, CMMI, TickIT, even ISO/IEC 9001 and 90003) • Processes may even address the needs of particular standards • Secure system design • Design to provide assurance

  12. “But you didn’t evaluate anything I use or that’s remotely useful.” • Unrealistic boundaries are specified in order to: • Reduce evaluation effort (time/cost) • Avoid problem areas • CC is deliberately flexible, and does not prevent ST writers from defining unrealistic boundaries • The specification of inappropriate boundaries brings CC into disrepute, as consumers are left with claims that are meaningless for their objective of assessing risk

  13. In practice this can be alleviated by: • Scheme policies • Specification and adoption of appropriate PPs • Ethical laboratories and consultants advising their customers appropriately • Compare with: • Specifications, e.g. FIPS 140-2 • The boundary is more rigorously defined but is still sometimes misused • ISMS (& QMS) • (At least for QMS) the specification of scope has been misused and caused disrepute.

  14. “It doesn’t evaluate a whole product” • Security functionality only • No assurance given for non-security functionality • Some product vulnerabilities arise from the TOE environment and are “dismissed” by broad assumptions • This is technically correct • Hard for real-world users to understand • Does it help assess risk more accurately in the real world? • Somewhat true also for FIPS 140-2, although CMVP certificates and security policies tend to reference the whole product

  15. “CC is inflexible; it can’t be used for my product!” • CC is very flexible • SFR extensions • PPs • Interpretations • Ongoing standards development • It is difficult to produce a complete ST at the beginning of a project • “Before and during the evaluation, the ST specifies “what is to be evaluated”. In this role, the ST serves as a basis for agreement between the developer and the evaluator on the exact security properties of the TOE and the exact scope of the evaluation…. After the evaluation, the ST specifies “what was evaluated…” (CC Part 1, A.3.1) • Compare • FIPS 140-2 is more restrictive; there are several products for which it cannot be used, or can be used only with great difficulty • ISMS can be used for any organization
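For orientation, a minimal sketch of the mandatory contents of a Security Target as defined in CC v3.1 Part 1 (mirroring the ASE families):
  • ST introduction (ST reference, TOE reference, TOE overview, TOE description)
  • Conformance claims (CC version, Part 2 / Part 3 conformance, any PP claims)
  • Security problem definition (threats, organisational security policies, assumptions)
  • Security objectives (for the TOE and for its operational environment)
  • Extended components definition (components not drawn from CC Parts 2 and 3)
  • Security requirements (SFRs and SARs)
  • TOE summary specification (how the TOE meets the SFRs)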

  16. “It’s hard work” Yes it is! • The more assurance end users need, the harder the work is • Internal effort is related to the evaluation strategy, and can be reduced • Process improvements

  17. “It assumes a waterfall model for development” • No it doesn’t! The CC paradigm is flexible and fits all development models. • Spiral methods, pair programming, iterative and incremental, agile, open source development, rapid prototyping etc. can be and have been used. • The evaluation too can be built into the processes of these models and use similar paradigms • Examples: • The successful evaluation of the open source development paradigm in various Linux evals, with evaluation artifacts and evidence provided back to the community. • An iterative evaluation methodology to track and pace iterative product development.

  18. “It requires too much documentation”

  19. It is a formalized methodology, and documentation that would not otherwise be produced is required in order to provide high assurance • E.g. the ST, correspondence analyses • Other documentation is often produced “just for CC” that the lab should already be able to find in a regular, mature product development • E.g. design documentation, process documentation, vulnerability assessment
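As a rough sketch (the exact components required depend on the chosen EAL), the CC v3.1 Part 3 assurance classes indicate the kinds of evidence a lab expects to see:
  • ASE (Security Target evaluation): the ST itself
  • ADV (Development): functional specification, TOE design and, at higher EALs, the implementation representation
  • AGD (Guidance documents): preparative and operational user guidance
  • ALC (Life-cycle support): configuration management, delivery, development security, flaw remediation
  • ATE (Tests): test coverage and depth analyses, functional tests, independent testing
  • AVA (Vulnerability assessment): vulnerability analysis and penetration testing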

  20. “Documentation is never the cause of a security failure, so why emphasise that?!” • Documentation is an evaluation tool. It shows: • The design is sound: a product can be (and often is) designed poorly. Design flaws => vulnerabilities • The design is implemented to the level of abstraction specified • The system was designed to implement the security requirements • Specialised documentation is an important part of providing assurance: especially boundaries • PPs and STs for CC • Security Policy, Finite State Model (FSM) for FIPS 140-2 • Scope and Statement of Applicability (SOA) for ISMS • Labs must be prepared to read and use existing documentation in any format.

  21. “We want ONE evaluation that’s accepted everywhere” • Mutual Recognition • CCRA and SOG-IS are examples • ISO/IEC 15408 • Failed for smartcards • Failing because of the need for higher levels of assurance (medium robustness PPs), which are not covered under current MRAs

  22. Addressing these issues using the CC paradigm • Developing useful Protection Profiles • Using a “staged” evaluation strategy, so that initial evaluation of a product at a lower evaluation level builds a good platform for later evaluation at a higher level • Pursuing ongoing communication between consumers, vendors, evaluation labs, and schemes, so that emerging consumer requirements on mature product lines are clearly understood by all parties in the evaluation process • Conducting development, Common Criteria consulting, and Common Criteria evaluation efforts in parallel, so that certification of new products is timely (atsec example: Red Hat Linux evaluation project) http://www.atsec.com/downloads/pdf/efficient_cc_evaluations.pdf
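A purely illustrative reading of the “staged” strategy (the levels below are hypothetical, not taken from the slides): a first release is evaluated at a lower level such as EAL2 to establish the ST, the TOE boundary, and the evidence base; the next release reuses and extends that evidence at EAL3, perhaps augmented with flaw remediation (ALC_FLR); once evaluation runs in parallel with development, later releases can target EAL4 or higher with certification landing within weeks of the release date.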

  23. Conclusion • Real-world issues occur with every certification framework • Both the standard and the certification scheme have to be sound. • Some specialised documentation is necessary, but this should be minimal. • The definition of clear boundaries is vital. • Balance “certification as a tax” against “real-life improvement and benefits of certification” • Communication is vital • Mutual Recognition arrangements • Including training for all stakeholders (overseers, labs, vendors, consumers, sponsors)

  24. Thank You Fiona Pattinson atsec Information Security Corporation Austin, Texas, USA Phone: +1 512 615 7382 Email: fiona@atsec.com www.atsec.com

  25. References & Bibliography • Olthoff, K. G. 1999, 'A cursory examination of market forces driving the Common Criteria', in New Security Paradigms Workshop, ACM Press, pp. 61-66. • Gordon, L. A. & Loeb, M. P. 2001, Economic Aspects of Information Security, Rainbow Technologies. • Mahon, G. 2002, 'Developers backward engineering their products to meet the Common Criteria', in International Common Criteria Conference (ICCC), Ottawa. • Merkow, M. 2003, Security Assurance Through the Common Criteria, New Riders Pub., Indianapolis, IN. • Apted, A. J., Carthigaser, M. & Lowe, C. 2002, 'Common problems with the Common Criteria', in International Common Criteria Conference, Ottawa. • Aizuddin, A., The Common Criteria ISO/IEC 15408 – The Insight, Some Thoughts, Questions and Issues. • Smith, R. E., Trends in Government Endorsed Security Product Evaluations, Secure Computing Corporation. • 'The Common Criteria -- good, bad or indifferent?' 1997, Information Security Technical Report, vol. 2, no. 1, pp. 5-6. • 'Security evaluation and certification: the future of a national scheme' 1997, Information Security Technical Report, vol. 2, no. 1, p. 6. • Brewer, D. 2005, 'Is the Common Criteria the only way?', International Common Criteria Conference, Tokyo. • Oldegård, M. & Farnes, M.-L. 2005, 'Does Common Criteria need marketing?', International Common Criteria Conference, Tokyo. • Mueller, S., 4/10/2006, 'Efficient Common Criteria Evaluations': http://www.atsec.com/downloads/pdf/efficient_cc_evaluations.pdf • Katzke, S. 2004, The Common Criteria (CC) Years (1993-2008): Looking Back and Ahead. • GAO 2006, Information Assurance: National Partnership Offers Benefits, but Faces Considerable Challenges, United States Government Accountability Office. • Jackson, J. 2007, 'Symantec: Common Criteria is bad for you', Government Computer News, 4 May 2007: http://www.gcn.com/online/vol1_no1/44205-1.html • CCMB-2006-09-01, 'Common Criteria for Information Technology Security Evaluation Version 3.1: Part 1: Introduction and general model', The Common Criteria Management Board.
