Cyber Security is a Mess: Is There a Way Out?


Presentation Transcript


  1. EPFL Workshop on Cyber Risk and Information Security, June 3, 2014. Cyber Security is a Mess: Is There a Way Out?

  2. Defeating Malice is our Job • Malice is dynamic and adaptive, reacting to changes in the practitioner’s product • We need to be proactive, not retroactive • Security products are like fruit; they don’t last long

  3. Engineering Practices -- Inadequate • The Internet was not built to address known risks • Insufficient Emphasis on Mutual Suspicion and other Security Primitives • Risks and Costs Were Passed to End Users – You and Me!

  4. Two “Bubbles”: Fiscal and Trust • Fiscal Bubble – Credit Derivative Collapse (We are still in a LONG recovery!) • Trust Bubble Collapse (Coming Soon? Stuxnet and WikiLeaks make the situation worse) • BOTH have components that are widely used, but little understood by users and not fully analyzed! • This is a Recipe for Disaster!

  5. Human Trust Depends On: • Identity • Role • Capabilities • Intent • It is limited, does not scale readily, is easily revoked, and once revoked, is not easily recovered

  6. Cyber Trust Depends On: • Identity • Usually, not much else • It is Transitive (Spreads Easily and Widely), and Hard to Revoke

  7. 650+ CAs Trusted by Browsers
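
To make the scale of this concrete, here is a minimal Python sketch that counts the root CAs the local platform trusts by default. It uses only the standard-library ssl module, and the number it prints reflects the operating system's trust store rather than any particular browser's bundle.

```python
# Minimal sketch: count the root CAs the local platform trusts by default.
# The count varies by OS and vendor; browsers may ship their own, larger stores.
import ssl

ctx = ssl.create_default_context()   # loads the system's default CA bundle
roots = ctx.get_ca_certs()           # metadata for each trusted root certificate

print(f"Root CAs trusted by default here: {len(roots)}")
for cert in roots[:3]:               # peek at a few issuers
    print(cert.get("issuer"))
```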

  8. A REALLY MAJOR PROBLEM!! • MISMATCH between Human Trust and Cyber Trust leads to Cognitive Dissonance! • NOT UNDERSTOOD by most People (even Techies) • Leads people astray, creates RISKS • As with Fiscal Bubble, a Recipe for Disaster!

  9. Pharmacy Example: • In the 1980s, 3-10 medical professionals knew your medications; an example of Human Trust. • Today, easily many hundreds have access to that data through drug databases; an example of Cyber Trust. • This additional exposure of your data WAS NOT NECESSARY, but was merely the result of coding expediency (and a lack of well-thought-out requirements)!

  10. Technical Debt: • “Shipping first time code is like going into debt. A little debt speeds development as long as it is paid back promptly with a rewrite… The danger occurs when the debt is not repaid. Every minute spent on not-quite-right code counts as interest on that debt.” Ward Cunningham

  11. Refining Cunningham’s Concept: • I use the phrase “Technical Debt” to cover two types of debt: • Conceptual errors in the design of a product, and • Implementation errors in the product as built.

  12. Pharmacy Example Revisited: • The pharmacy example has both types of technical debt. • The design allows too many pharmacists to have easy access, and • The implementation probably contains cyber vulnerabilities that will let in non-pharmacists (hackers), compounding your risks!

  13. Fixing Errors: • Conceptual errors are best solved early with applied brain-power during design time; “smarts”, not dollars or time, is most important. • Implementation errors are typically found in deployed products and can only be solved with time and money (usually lots of both).

  14. Prediction: • We COULD (not necessarily WILL) feel pain (a major Cyber Security Breach with lasting National Impact) in as little as a matter of weeks to a few months.

  15. Current State of Cyber Security: • Your Cyber Systems function and serve you NOT due to the EXPERTISE of your Security Staff, but due to the SUFFERANCE of your opponents.

  16. Saydjari Congressional Testimony • Given three years of preparation, $500M and 30 days to actually execute an attack, • an adversary can destabilize the U.S. and depress the economy with attacks on critical infrastructure, • thus reducing our ability to project military power, depleting our will to fight, and creating panic and distrust in the government. (paraphrased)

  17. Steps to Doing it RIGHT: • Primacy: Security Team needs to be involved at the beginning of a new system, and with all changes to it. • Robust Control: Sensitive systems require robust control, so that only authorized parties, operating with valid permissions, can control the systems, subject to full audit and review of their individual actions at need. • Mutual Suspicion: Use between peer processes, AND between subordinates and their controllers. Software can monitor hardware, and hardware can monitor software.
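
As a hedged illustration of the “robust control” point, the Python sketch below gates a sensitive action behind an explicit permission check and writes an audit record for every attempt. The permission table, user names, and action names are hypothetical.

```python
# Sketch: only authorized parties may control a sensitive action,
# and every attempt (allowed or not) is recorded for later audit.
import logging
from functools import wraps

logging.basicConfig(level=logging.INFO, format="%(asctime)s AUDIT %(message)s")
audit = logging.getLogger("audit")

# Hypothetical permission table: who may perform which sensitive actions.
PERMISSIONS = {"alice": {"open_valve"}, "bob": {"read_telemetry"}}

def requires(permission):
    """Gate a sensitive action behind a permission check plus an audit record."""
    def decorator(func):
        @wraps(func)
        def wrapper(user, *args, **kwargs):
            allowed = permission in PERMISSIONS.get(user, set())
            audit.info("user=%s action=%s allowed=%s", user, permission, allowed)
            if not allowed:
                raise PermissionError(f"{user} may not {permission}")
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@requires("open_valve")
def open_valve(user, valve_id):
    return f"valve {valve_id} opened by {user}"

print(open_valve("alice", 7))   # permitted, and the attempt is audited
# open_valve("bob", 7)          # would raise PermissionError, also audited
```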

  18. Steps to Doing it RIGHT: • Least Privilege: No entity should have more privileges than needed. In particular, no single person should be in complete control of all systems. • Stark Sub-Setting: Install the minimum number of the simplest components needed to do the job. Install nothing else. • Isolation Barriers: Hypervisors running on microkernel OSes reduce the attack surface and isolate components from each other.
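
One way to picture “least privilege” and the no-single-controller rule is as a capability table that can be checked mechanically. The Python sketch below does exactly that; the roles and capability names are invented for illustration.

```python
# Hypothetical capability universe and role table.
ALL_CAPS = {"deploy", "approve", "audit", "rotate_keys"}

ROLES = {
    "operator":  {"deploy"},
    "approver":  {"approve"},
    "auditor":   {"audit"},
    "keymaster": {"rotate_keys"},
}

def check_least_privilege(roles):
    """Reject unknown capabilities and any single role that holds every capability."""
    for name, caps in roles.items():
        unknown = caps - ALL_CAPS
        assert not unknown, f"{name} granted unknown capability: {unknown}"
        assert caps != ALL_CAPS, f"{name} holds every capability: violates least privilege"
    return True

print(check_least_privilege(ROLES))   # True: no single role is all-powerful
```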

  19. Steps to Doing it RIGHT: • Partitions and Blinded Interfaces: Design the overall system, then partition; contract so that no single contractor knows enough to mount an attack. • De-link Complex Systems: Minimize 'tight coupling' of systems, i.e. reduce interdependency and reduce information flows.

  20. Steps to Doing it RIGHT: • Are Software Code Sequences “Reasonable”?: Do A, B, C, D, E might be fine, whereas Do E, B, C, D, A could be a disaster • CAPIs: Use high-level “Cryptographic Application Programming Interfaces”. They reduce the attack surface.
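
A sketch of the sequence-checking idea, assuming steps must occur in the declared order A through E: a small whitelist of allowed transitions accepts the first ordering on the slide and rejects the second.

```python
# Sketch: whitelist of allowed next steps; None marks "nothing run yet".
ALLOWED_NEXT = {None: {"A"}, "A": {"B"}, "B": {"C"}, "C": {"D"}, "D": {"E"}, "E": set()}

def is_reasonable(sequence):
    """Accept only sequences that follow the declared A -> B -> C -> D -> E order."""
    prev = None
    for step in sequence:
        if step not in ALLOWED_NEXT.get(prev, set()):
            return False
        prev = step
    return True

print(is_reasonable(["A", "B", "C", "D", "E"]))   # True  -- fine
print(is_reasonable(["E", "B", "C", "D", "A"]))   # False -- flagged as unreasonable
```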

  21. Steps to Doing it RIGHT: • UNFETTERED Red Team efforts: Don’t cripple the effort and don’t constrain them; tell them what you know is weak, so they don’t duplicate that effort. But let them look; they will find OTHER things you weren’t aware of. • Physical, Cyber, and Human Security: These technical systems interact, but they may be managed by separate organizations that do not. Those separate management organizations MUST ensure that any THREAT or SECURITY analysis they perform addresses the vulnerabilities that can arise when these security components interact, or harm will ensue.

  22. Quantum Computing Threat: • Quantum Computing is a SEVERE threat to the PKC algorithms that support key portions of Internet connectivity and security. (http://www.merkle.com/) • We need 5 years to find and vet quantum-computing-resistant algorithms and another 10-15 years to fully deploy them world-wide. • Quantum computers capable of mounting the attacks may well be available before the current algorithms can be replaced with resistant ones. • I think the odds now favor the quantum computers, NOT the Internet!
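
The timing argument on this slide can be written as simple arithmetic (a framing often credited to Michele Mosca): if vetting plus deployment takes longer than the time until a capable quantum computer arrives, traffic protected today is already at risk. The sketch below uses the slide's own 5-year and 10-15-year figures plus an assumed arrival estimate that is purely illustrative.

```python
# Back-of-the-envelope sketch of the timing argument. Vetting and deployment
# figures come from the slide; the quantum arrival estimate is an assumption.
vet_years    = 5      # find and vet quantum-resistant algorithms
deploy_years = 15     # worldwide deployment (slide: 10-15 years)
quantum_eta  = 15     # assumed years until a cryptographically relevant quantum computer

migration = vet_years + deploy_years
if migration > quantum_eta:
    print(f"Migration needs ~{migration} years but the threat may arrive in "
          f"~{quantum_eta}: data protected by today's algorithms is already at risk.")
```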

  23. Changing our Toxic Environment: II • LEAN FORWARD! ANTICIPATE, PLAN AHEAD. It is not about expediency or being “First to Market”; it is about being “Good Enough to Market”! • Face it; the Internet is too important today for both citizens and corporations; whether you like it or not, it has become a de facto “Utility”. Get over it. Utilities get regulated: either they regulate themselves adequately, or the government steps in. • So far, it may still be your choice, but not for long.

  24. Changing our Toxic Environment: III • “Compliance Regimes” may be necessary, but they cannot be sufficient. • By definition, they focus on past problems, not present or emerging problems. • You may not be able to predict the nature of the next attack, but you can provide defined data structures, points to monitor, and choke functions that let you detect that SOMETHING unusual is going on and permit real-time mitigating actions that limit the damage, if not avoid it entirely. • It is simply no longer credible to say, “Not My Problem”!
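
As a rough sketch of a “choke function”, the Python fragment below forces every request through one monitored point that tracks request volume over a sliding window and trips when something unusual shows up; the window size and threshold are hypothetical.

```python
# Sketch: a single monitored choke point every request must pass through.
import time
from collections import deque

WINDOW_SECONDS = 60
MAX_REQUESTS = 100        # hypothetical ceiling for normal traffic in the window

recent = deque()          # timestamps of requests seen at the choke point

def choke_point(handler, request):
    """Pass the request through only if recent volume looks normal."""
    now = time.time()
    recent.append(now)
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()
    if len(recent) > MAX_REQUESTS:
        # Something unusual: act in real time to limit damage rather than
        # discovering the problem in a later compliance review.
        raise RuntimeError("choke point tripped: unusual request volume")
    return handler(request)

# Usage: route every sensitive operation through the choke point.
print(choke_point(lambda r: f"processed {r}", "request-1"))
```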

  25. Changing our Toxic Environment: IV • Let’s compare NSA support to the Military in the 1970s – 1980s with commercial security vendors’ support… (OVER SIMPLIFIED!) • COMMERCIAL Vendor: Want message security? We have DES radios; OK? Good; done deal. • NSA: Want message security? You realize that if opponents cannot read it, they will jam it. Do you want anti-jam as well? Good – But if they cannot jam you, they will direction-find (DF) you and send rockets… Would you like Low Probability of Detection (LPD) as well? Good…

  26. Good Resources: • Brian D. Snow: 40 years of experience in Crypto/Cyber/Systems Security • Additional support material at: http://www.acsac.org/2005/papers/Snow.pdf and http://www.csl.sri.com/neumann/chats4.pdf • Also see: IEEE Security & Privacy, May/June 2005, pp. 65-67, and Communications of the ACM, Aug. 2009, Vol. 52, No. 8, pp. 30-32

  27. Contact Information • Brian D. Snow • BrianSnow@comcast.net • 301 854-3255

  28. Questions?
