Software Security Without The Source Code By Matt Hargett <matt @ use.net>
Introduction • Matt Hargett • Security QA Engineer for 7 years • NAI/McAfee, TurboLinux, Cenzic • Discovered many critical vulnerabilities • Created the binary code analysis product BugScan
Overview • Why we need to measure software security • Kinds of security policy testing • Whitebox approaches • Blackbox approaches • Effectiveness against real-world exploits • What you can do with this information
Why we need to measure • Brand and reputation damage • Proprietary information leakage • Unplanned disruption • Violation of privacy policy • Espionage • Terrorism
How do we measure • Whitebox • Manual code inspection • Source code static analysis • Binary static analysis • Runtime analysis • Blackbox • Web Applications • Network protocols • APIs
Blackbox Network Testing • Sniff, fuzz, replay • Sniff network traffic • Systematically fuzz the relevant data • Remove delimiters, bitwalk, ... • {admin.command\0::\43\89\42} • Replay fuzzed packets to server • Fuzzing via Proxy • Route traffic through a proxy • Proxy fuzzes data systematically • Fuzzed data gets passed on • Repeat client operation • Protocol-specific fuzzing • Make special client for specific protocol(s)
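To make sniff/fuzz/replay concrete, here is a minimal sketch in Python. The captured payload, target address, and fault heuristic are placeholders for illustration only; a real harness would sniff the payload off the wire and use a proper crash monitor on the server side.

```python
# Minimal sniff/fuzz/replay sketch. CAPTURED and TARGET are hypothetical;
# record a real client request and point at the real server instead.
import socket

CAPTURED = b"admin.command\x00::\x43\x89\x42"   # payload recorded from a client
TARGET = ("192.0.2.10", 4444)                    # placeholder server address

def bitwalk(data):
    """Yield one mutation per bit: the payload with that single bit flipped."""
    for i in range(len(data) * 8):
        mutated = bytearray(data)
        mutated[i // 8] ^= 1 << (i % 8)
        yield bytes(mutated)

def strip_delimiters(data, delimiters=b"\x00:"):
    """Yield variants with each delimiter byte removed entirely."""
    for d in delimiters:
        yield data.replace(bytes([d]), b"")

for case in list(strip_delimiters(CAPTURED)) + list(bitwalk(CAPTURED)):
    try:
        with socket.create_connection(TARGET, timeout=2) as s:
            s.sendall(case)
            s.recv(4096)            # a hang or reset here hints at a fault
    except (socket.timeout, ConnectionError) as err:
        print(f"possible fault with input {case!r}: {err}")
```

As the next slide notes, this style of replay is not stateful and breaks down as soon as the protocol is encrypted.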
Blackbox Network Testing: In the Real World • Sniff, fuzz, replay • Not stateful • Doesn't work with encryption • Only fuzzes client-side data • Fuzzing via Proxy • Lets the real client/server handle state • Doesn't work with encryption • Fuzzes client and server data • Protocol-specific fuzzing • Handles real client/server state • Does encryption itself • Can get great code coverage • Fuzzes client and server data
Blackbox Network Testing: General Jeers • Hard to detect when you've triggered a problem • Hard to measure code coverage • Slow process • Expensive to scale
Blackbox Web Testing • Sniff, fuzz, replay • Auto-crawl or manual clicks • Sniff browser requests • Systematically fuzz the relevant data • Insert SQL Injection, Command Injection, XSS, ... • POST /foo.cgi?name=bob&pass=... • Fuzzing via Proxy • Optionally crawl to generate requests • Send requests through a proxy • Proxy fuzzes requests systematically • Repeat browser operation • GUI Automation tools • Automate real browser interaction • Put bad data into form and cookie fields
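The request-level fuzzing above can be sketched in a few lines of Python using the requests library. The URL, form fields, payload list, and detection heuristics below are hypothetical; a proxy-based tool performs the same substitutions on live browser traffic instead of a recorded request.

```python
# Minimal web request fuzz sketch: replay a recorded POST with each form
# field swapped for an attack payload. All names here are placeholders.
import requests

URL = "http://target.example/foo.cgi"
BASELINE = {"name": "bob", "pass": "secret"}     # recorded form fields

PAYLOADS = [
    "' OR '1'='1",                  # SQL injection probe
    "; cat /etc/passwd",            # command injection probe
    "<script>alert(1)</script>",    # reflected XSS probe
]

for field in BASELINE:
    for payload in PAYLOADS:
        data = dict(BASELINE, **{field: payload})
        resp = requests.post(URL, data=data, timeout=5)
        # Crude detection: server errors, reflected payloads, or SQL error text.
        if resp.status_code >= 500 or payload in resp.text or "SQL syntax" in resp.text:
            print(f"suspicious response for {field}={payload!r} "
                  f"(status {resp.status_code})")
```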
Blackbox Web Testing: In the Real World • Sniff, fuzz, replay • Server-side state mishaps • Little to no JavaScript support • No Flash, Java, or web service support • Only finds very low-hanging fruit • Fuzzing via Proxy • Server-side state mishaps • Must have browser automation anyway • Operations must be self-contained • GUI Automation tools • Maintenance of stored tests • Tests must be self-contained • Can be part of standard QA
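A minimal sketch of the "part of standard QA" point, using Selenium as one example of a GUI automation tool (the deck does not name a specific tool). The page URL, field names, and the error-page heuristic are hypothetical; the idea is that the same browser-driving harness a QA team already maintains can also push hostile input through the real browser, JavaScript and cookies included.

```python
# Minimal GUI-automation fuzz sketch (Selenium). All names are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    driver.get("http://target.example/login")
    driver.find_element(By.NAME, "username").send_keys("' OR '1'='1")
    driver.find_element(By.NAME, "password").send_keys("x")
    driver.find_element(By.NAME, "login").click()
    # Crude check: a raw database error leaking into the page is a red flag.
    assert "SQL" not in driver.page_source, "possible SQL injection"
finally:
    driver.quit()
```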
Blackbox Web Testing: General Jeers • Hard to detect when you've triggered a problem • Hard to measure code coverage • Slow process • Expensive to scale • Getting past CAPTCHAs
Whitebox Testing Without Source • Manual review • Call-based static analysis • Pointer and control/data flow static analysis
Whitebox Testing Without Source: In the Real World • Manual review • Stepping through the binary instruction by instruction • Tedious, time-consuming, error-prone, rare skill • Pair reverse engineering, unit testing • Rare skill, unit-test exploits may not be real-world • Call-based analysis • 42,897 strcpy calls detected!!!!#%* • HttpResponse.GetValue() before Statement.Execute() • Dispose called inside Dispose • Pointer and control/data flow analysis • Must be inter-function to be useful • Must track global/static data • Inter-module tracking also important
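To illustrate why call-based analysis drowns reviewers in noise, here is a rough Python sketch of that level of tool: it greps objdump disassembly for calls to dangerous libc functions. The binary path is a placeholder, and the regex assumes typical x86-64 objdump output. It finds every call site but says nothing about whether attacker-controlled data can reach one, which is exactly what pointer and control/data-flow tracking adds.

```python
# Call-counting "analysis" sketch: flag every call to a banned libc routine
# in a stripped binary's disassembly. ./target_binary is a placeholder.
import re
import subprocess

BANNED = {"strcpy", "strcat", "sprintf", "gets"}

disasm = subprocess.run(
    ["objdump", "-d", "./target_binary"],
    capture_output=True, text=True, check=True,
).stdout

# Matches lines like:  4005d6: e8 95 fe ff ff   callq  400470 <strcpy@plt>
call_site = re.compile(r"^\s*([0-9a-f]+):.*\bcall\w*\s+\S+\s+<(\w+)@plt>", re.M)

for addr, callee in call_site.findall(disasm):
    if callee in BANNED:
        print(f"0x{addr}: call to {callee} "
              f"(unknown whether tainted data can reach it)")
```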
Whitebox Testing Without Source: General Jeers • People • Hire manual reviewers with a proven track record of real-world exploitable bugs patched by a vendor • Tools • Difficult to use • Poor quality • False positives
Whitebox Testing With Source: General Jeers • Worthless when source is not available • People • Hire manual reviewers with a proven track record of real-world exploitable bugs patched by a vendor • Tools • There is no free unit test lunch • Demand vendors demonstrate novel real-world exploitable bugs their tool finds OOTB • Demand vendors demonstrate previously known real-world exploitable bugs their tool finds OOTB • Demand third-party, vendor-neutral benchmarks • No good visualization/exploration tools for manual reviewers
Whitebox Testing: General Jeers, Source or Not • Code is code, period • Most tools are a bad joke • False positive rates above 10% on a large codebase (100+ KLOC) mean a useless tool • High-priority report items should be real-world exploitable 95% of the time • Custom signatures shouldn't require extra cost, permission, or license • Tools • Demand vendors demonstrate novel real-world exploitable bugs their tool finds OOTB • Demand vendors demonstrate previously known real-world exploitable bugs their tool finds OOTB • Demand third-party, vendor-neutral benchmarks • No good visualization tools for manual reviewers • Zealotry about a given approach
Recommendations • Take a holistic approach • Blackbox and whitebox • Use multiple vendor tools to cross-check • Source and binary • Runtime and static • Use protocol-specific fuzzers • Ask the vendor for code coverage numbers on open source implementation(s) of said protocol(s) • Use UI automation tools for web apps • Any good tool will require tuning
Taking Action • Contact the vendor • If the vendor cannot supply a fix in 30 days • Escalate the issue • Find a new vendor • Vendors will string you along • If open source, fix the problem yourself
Thank You matt @ use.net Questions?