Common Vulnerabilities and Exposures
Steve Christey and Margie Zuk
June 22, 2000
CVE Context: The Vulnerability Life Cycle (start here at Discovery)
• Discovery: mailing lists, newsgroups, hacker sites
• Incident Handling: incident response teams, incident reports
• Analysis: academic study, advisories
• Detection: intrusion detection systems
• Collection: databases, newsletters
• Protection: vulnerability assessment tools
A Roadblock to Information Sharing: Same Problem, Different Names
The Implications • Difficult to correlate data across multiple organizations and tools • E.g. IDS and assessment tools • E.g. security tools and fix information • Incident information • Difficult to conduct a detailed comparison of tools or databases • Vulnerabilities are counted differently • Which is more comprehensive?
Common Vulnerabilities and Exposures (CVE):One Common Language • Lists all publicly known security problems • Assigns unique identifier to each problem • Remains independent of multiple perspectives • Is publicly open and shareable • Community-wide effort via the CVE Editorial Board
Addressing Common Misconceptions of CVE • Not a full-fledged vulnerability database • Simplicity avoids competition, limits debate • Intended for use by vulnerability database maintainers • Not a taxonomy or classification scheme • Focuses on vulnerabilities instead of attacks • Does not cover activities such as port mapping • Not just “vulnerabilities” in the classical sense • Definitions of “vulnerability” vary greatly • “Exposure” covers a broader notion of “vulnerability” • Competing vendors are working together to adopt CVE
CVE Editorial Board • Members from 25 different organizations including researchers, tool vendors, response teams, and end users • Mostly technical representatives • Review and approve CVE entries • Discuss issues related to CVE maintenance • Monthly meetings (face-to-face or phone) • Publicly viewable mailing list archives
Active Editorial Board Members (as of June 19, 2000)
• Response Teams: Ken Armstrong (CanCERT), Bill Fithen (CERT Coordination Center), Scott Lawler (DOD-CERT)
• Tool Vendors: David Balenson (NAI), Andy Balinsky (Cisco), Scott Blake (BindView), Andre Frech (ISS), Kent Landfield (info-ops.com), Jim Magdych (NAI), David Mann (BindView), Craig Ozancin (AXENT), Paul E. Proctor (CyberSafe), Mike Prosser (Symantec), Marcus Ranum (NFR), Steve Schall (Intrusion.com), Tom Stracener (Hiverworld), Bill Wall (Harris), Kevin Ziese (Cisco)
• Academic/Educational: Matt Bishop (UC Davis Computer Security Lab), Pascal Meunier (Purdue University CERIAS), Alan Paller (SANS Institute), Gene Spafford (Purdue University CERIAS)
• Network Security: Eric Cole (Vista IT), Kelly Cooper (GTE Internetworking)
• Information Providers: Russ Cooper (NTBugtraq), Elias Levy (Bugtraq, Security Focus), Ron Nguyen (Ernst and Young), Ken Williams (eSecurityOnline.com)
• OS Vendors: David LeBlanc (Microsoft), Casper Dik (Sun)
• Other Security Analysts: Steve Northcutt (SANS), Adam Shostack (Zero-Knowledge Systems), Stuart Staniford (Silicon Defense)
• MITRE: Dave Baker, Steve Christey, Bill Hill
Using CVE from Advisories to IDSes
• Popular attacks: CVE-1, CVE-2, CVE-3, CVE-4
• Which tools test for these problems? Tool 1 checks CVE-1, CVE-2, CVE-3; Tool 2 checks CVE-3, CVE-4
• Do my systems have these problems? Does my IDS have the signatures? The IDS covers CVE-1, CVE-3, CVE-4
• Gap analysis: I can't detect exploits of CVE-2; how well does Tool 1 check for it?
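The coverage questions above reduce to simple set operations once every product reports the same CVE names. A minimal sketch in Python, using the slide's placeholder identifiers CVE-1 through CVE-4 rather than real entries:

```python
# Illustrative coverage data keyed by CVE name (placeholder names from the
# slide, not real products or real CVE entries).
popular_attacks = {"CVE-1", "CVE-2", "CVE-3", "CVE-4"}
tool_1 = {"CVE-1", "CVE-2", "CVE-3"}          # assessment tool 1
tool_2 = {"CVE-3", "CVE-4"}                   # assessment tool 2
ids_signatures = {"CVE-1", "CVE-3", "CVE-4"}  # IDS signature set

# Which popular attacks does each tool test for? (set intersection)
print(sorted(popular_attacks & tool_1))       # ['CVE-1', 'CVE-2', 'CVE-3']
print(sorted(popular_attacks & tool_2))       # ['CVE-3', 'CVE-4']

# Does my IDS have signatures for all popular attacks? (set difference)
missing = popular_attacks - ids_signatures
print(sorted(missing))                        # ['CVE-2']: can't detect exploits of CVE-2
```

Without a shared naming scheme, each of these comparisons would require a hand-built mapping between every pair of products.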
Using CVE from Attacks to Incident Recovery
• I detected an attack on CVE-3. Did my assessment say my system has the problem?
• YES: clean up, close the hole, and report the incident, using public databases and advisories keyed by the same CVE names
• NO: don't send an alarm. But if the attack succeeded anyway, tell your assessment vendor, then go to YES
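The recovery flow above can be sketched as a small decision function. This is an illustrative reading of the slide, not a real incident-handling API; the function name and returned action strings are invented for the example:

```python
def handle_detected_attack(cve, assessment_findings, attack_succeeded):
    """Hypothetical decision flow: route an IDS alert on a CVE name
    through the assessment results for the same system."""
    # If the assessment already flagged this CVE, the alarm is credible.
    if cve in assessment_findings:
        return ["clean up", "close the hole", "report the incident"]
    # Assessment said the hole wasn't there, but the attack worked anyway:
    # a coverage or mapping gap worth reporting to the assessment vendor.
    if attack_succeeded:
        return ["tell your assessment vendor",
                "clean up", "close the hole", "report the incident"]
    # Otherwise the system isn't vulnerable: suppress the alarm.
    return ["don't send an alarm"]
```

The point of the sketch is that the IDS alert and the assessment report only correlate at all because both sides speak in CVE names.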
CVE Compatibility • Ensures that a tool or database can “speak CVE” and correlate data with other CVE-compatible products • Requirements • Find items by CVE name (CVE searchable) • Include CVE name in output for each item (CVE output) • Provide MITRE with database items that are not in CVE yet • Good faith effort to keep mappings accurate • 25 organizations have declared their intentions to make their products CVE compatible
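The two core requirements (CVE searchable, CVE output) can be sketched against a toy in-memory database. The entry descriptions are paraphrased from public CVE data and the function names are invented for illustration:

```python
# Toy vulnerability database keyed by CVE name (descriptions paraphrased).
vuln_db = {
    "CVE-1999-0067": "phf CGI program allows remote command execution",
    "CVE-1999-0016": "Land denial of service via spoofed SYN packet",
}

def find_by_cve(name):
    """CVE searchable: locate an item directly by its CVE name."""
    return vuln_db.get(name)

def report(name):
    """CVE output: include the CVE name with each reported item."""
    return f"{name}: {vuln_db[name]}"

print(report("CVE-1999-0067"))
```

Any two products that satisfy both requirements can correlate their results with nothing more than a string match on the name.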
Organizations Working Toward CVE Compatibility (* = CVE already being used in a product)
• Advanced Research Corp. • Alliance Qualité Logiciel • AXENT • BindView • CERIAS/Purdue University • CERT • Cisco • CyberSafe • Cyrano • Ernst and Young • Harris Corp. • Hiverworld, Inc. • Intrusion.com • Internet Security Systems • Nessus Project • Network Associates • Network Security Wizards • NIST • NTBugtraq • SANS Institute • Security Focus - Bugtraq • Symantec (L-3) • UC Davis • White Hats • World Wide Digital Security
Adding New Entries to CVE • Board member submits raw information to MITRE • Submissions are grouped, refined, and proposed back to the Board as candidates • Form: CAN-YYYY-NNNN • Strong likelihood of becoming CVE-YYYY-NNNN • Not a guarantee • Delicate balance between timeliness and accuracy • Board reviews and votes on candidates • Accept, modify, recast, reject, reviewing • If approved, the candidate becomes a CVE entry • Entry is included in a subsequent CVE version • Published on CVE web site • Entries may later be modified or deprecated
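The candidate naming scheme can be sketched with a small validator and a promotion helper. This assumes the four-digit sequence numbers in use at the time, and the helper names are invented for illustration:

```python
import re

# CAN-YYYY-NNNN for candidates, CVE-YYYY-NNNN for accepted entries
# (four-digit sequence numbers, as used in 2000).
NAME_RE = re.compile(r"^(CAN|CVE)-(\d{4})-(\d{4})$")

def is_candidate(name):
    """True for a well-formed candidate (CAN-...) name."""
    m = NAME_RE.match(name)
    return bool(m) and m.group(1) == "CAN"

def promote(name):
    """When the Board accepts a candidate, only the prefix changes:
    CAN-2000-0003 becomes CVE-2000-0003."""
    if not is_candidate(name):
        raise ValueError(f"not a candidate: {name}")
    return "CVE" + name[3:]
```

Keeping the year and sequence number stable across promotion is what makes a candidate number cited in an advisory a "strong likelihood" pointer to the eventual entry.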
Stages of Security Information in CVE
• Submissions: raw information obtained from MITRE, Board members, and other data feeds; combined and refined; placed in clusters
• Candidates: proposed to the Editorial Board; accepted or rejected; a back-map tells submitters which candidates were assigned to their submissions
• Entries: added to the CVE list (submissions and candidates removed from the "pool"); published in an official CVE version
• Example back-map: CAN-2000-0001 → CVE-2000-0001; CAN-2000-0002 → <REJECTED>; CAN-2000-0003 → CVE-2000-0003
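The back-map can be sketched as a pair of lookup tables, using the candidate outcomes shown on the slide; the submission labels are hypothetical:

```python
# Which candidate was assigned to each submission (submission labels invented).
backmap = {
    "submission-A": "CAN-2000-0001",
    "submission-B": "CAN-2000-0002",
    "submission-C": "CAN-2000-0003",
}

# What the Editorial Board decided for each candidate (from the slide).
outcomes = {
    "CAN-2000-0001": "CVE-2000-0001",  # accepted, now an entry
    "CAN-2000-0002": "<REJECTED>",     # rejected by the Board
    "CAN-2000-0003": "CVE-2000-0003",  # accepted, now an entry
}

def final_status(submission):
    """What a submitter learns: which candidate their item became, and its fate."""
    can = backmap[submission]
    return can, outcomes[can]
```

This is what lets a database maintainer who sent raw items to MITRE reconcile their own records with the published CVE list.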
Content Decisions • Explicit guidelines for content of CVE entries • Ensure consistency within CVE • Provide “lessons learned” for researchers • Three basic types • Inclusion • What goes into CVE? What doesn’t, and why? • Example: weak encryption, bugs in beta code • Level of Abstraction • Example: default passwords, or multiple bugs in the same application - one or many entries? • Format • Example: what goes into a CVE description? • Challenge: what to do with incomplete information?
The CVE Strategy
1. Inject candidate numbers into advisories
2. Establish CVE at the product level in order to...
3. ...enable CVE to permeate the policy level
Over time, CVE moves from unreviewed discovery sources (Bugtraq, mailing lists, hacker sites) to reviewed advisories (CERT, CIAC, vendor advisories), then into products (scanners, intrusion detection, vulnerability databases), and finally into policy (methodologies, purchasing requirements, education).
CVE Status as of June 2000 • Latest CVE version: 20000602 • 700 entries • 722 additional candidates being reviewed by Editorial Board • Received vulnerability databases from 10 organizations • Will help create more legacy candidates • Processing 10,000 submissions (database items) • May produce over 1000 additional candidates? • Candidate numbers used in 5 security advisories • CVE names included in Top Internet Security Threats list • Editorial Board discussing content decisions • Affects ~300 candidates
Future Directions • Add more entries to CVE • Goal: 1000 entries by September 2000 • Add entries for all 1999, 2000 advisories • Add more candidates • Goals: 1000 new candidates by September 2000 • Cover 80% of items in each participating tool or database • Use CVE identifiers in advisories, newsletters, etc. • Use CVE in deeper product analysis • Work with users and vendors to establish CVE as a de facto standard
For More Information http://cve.mitre.org