Prelude
• Diebold’s electronic voting system source code was discovered and subsequently leaked because it was sitting on a publicly accessible Diebold web server.
• Although it is unclear exactly how it was discovered, this type of information leakage is what this pre-proposal covers.
Project Overview and Goals
• This presentation is an overview of preliminary findings on discovering and retrieving private information on the World Wide Web.
• The goal of this project is to identify the best methods of attack and defense for information leakage.
• The result of this research will be a set of design guidelines to mitigate identified vulnerabilities in web sites.
Obscurity Leads to Exploitation
• Just because a file is not linked does not mean that it is not there (a probe sketch follows this slide).
• Many “dynamic” sites do not employ “real” security measures to protect files.
  • e.g., default Drupal and Joomla installations.
• These sites generally use “fake” paths, or a php/asp/cgi/pl/py script that hands you the file directly.
  • Drupal’s pathauto module is an example of “fake” paths.
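To make the “not linked but still there” point concrete, here is a minimal Perl sketch that issues direct requests for a few files stock installations are known to leave readable. The target URL and file list are hypothetical examples, not taken from the slides.

    #!/usr/bin/perl
    # Sketch: probe for default files that a stock CMS install leaves
    # world-readable even though no page links to them.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua   = LWP::UserAgent->new(timeout => 10);
    my $base = 'http://example.com';    # hypothetical target

    # Illustrative default paths; e.g., stock Drupal ships CHANGELOG.txt.
    for my $path (qw(CHANGELOG.txt INSTALL.txt robots.txt)) {
        my $res = $ua->head("$base/$path");
        printf "%s  %s/%s\n", $res->code, $base, $path;   # 200 = present
    }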
Current Tools
• Nikto
  • Checks for known files of popular systems for the purpose of vulnerability assessment.
  • Makes no attempt to find private files.
• NavNet
  • A “fusker”, geared towards getting free photos and videos.
  • Not very lightweight: .NET 2.0 + GUI (with a built-in web browser).
• DirBuster
  • A true brute-forcing application, closer to what we want.
  • Uses lists that are not very dynamic: literal names such as dsc00828 and dscn0279.
  • Not very lightweight: Java + GUI.
My Tool - wdivulge
• Written in Perl.
  • Allows rapid prototyping and development.
  • Multiplatform.
• Command-line tool.
• Threaded for maximum efficiency (see the sketch after this slide).
• Planned open-source release via the GPL (unless sponsored).
• Modular.
  • Possible plans for “glue” code to pass data off to other attack tools like sqlmap.
• Smarter detection (next slide).
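The slides do not include wdivulge’s source, but a minimal sketch of the threaded, command-line design described above might look like this, assuming LWP for HTTP and a plain wordlist file:

    #!/usr/bin/perl
    # Sketch only (not wdivulge's actual code): worker threads pull
    # candidate names from a shared queue and HEAD-request each one.
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;
    use LWP::UserAgent;

    my ($base, $wordlist) = @ARGV;      # e.g. http://example.com list.txt
    my $nworkers = 8;                   # thread count; tune per target
    my $queue    = Thread::Queue->new();

    open my $fh, '<', $wordlist or die "open $wordlist: $!";
    while (my $name = <$fh>) { chomp $name; $queue->enqueue($name); }
    close $fh;
    $queue->enqueue(undef) for 1 .. $nworkers;   # one stop marker per worker

    my @workers = map {
        threads->create(sub {
            my $ua = LWP::UserAgent->new(timeout => 5);
            while (defined(my $name = $queue->dequeue())) {
                print "FOUND $base/$name\n"
                    if $ua->head("$base/$name")->is_success;
            }
        });
    } 1 .. $nworkers;
    $_->join() for @workers;

HEAD requests keep each probe cheap: the server confirms whether a file exists without transferring its body.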
Smarter Detection
• Needed because brute forcing is really expensive.
  • The web is SLOW compared to local brute forcing.
• Uses customizable “smart lists”.
  • A single pattern such as “DSCN”[####]“.jpg” replaces thousands of literal entries like dscn0279 (expansion sketched below).
• Passing results back to the parent thread.
  • Not yet implemented.
  • Could speed things up or slow things down, depending on the target.
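The slides do not define the smart-list format, so the [####] placeholder below is invented for illustration; the sketch shows how one pattern can stand in for thousands of literal wordlist entries:

    #!/usr/bin/perl
    # Sketch: expand one "smart list" pattern into candidate names.
    # DSCN[####].jpg covers DSCN0000.jpg .. DSCN9999.jpg in a single
    # line instead of 10,000 wordlist entries. The [####] syntax is
    # hypothetical, not wdivulge's real format.
    use strict;
    use warnings;

    sub expand {
        my ($pattern) = @_;
        my ($pre, $hashes, $post) = $pattern =~ /^(.*?)\[(#+)\](.*)$/
            or return ($pattern);       # no placeholder: literal name
        my $width = length $hashes;
        return map { sprintf '%s%0*d%s', $pre, $width, $_, $post }
                   0 .. 10 ** $width - 1;
    }

    my @candidates = expand('DSCN[####].jpg');
    print scalar(@candidates), " candidates, first: $candidates[0]\n";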
Early Results
• Able to find many hidden files, depending on naming schemes.
  • Default names are the easiest.
  • Having access to at least one file helps.
• Complex names are not discovered yet.
  • e.g., “hey^s3cr3t%%$.txt”
• Attacks are only as good as the “smart lists”. Better lists are needed!
• Does not work against .htaccess-protected sites or binaries embedded in databases.
• Tested against an Apache server; not yet tested on IIS.
Defense
• .htaccess control seems to be the best in terms of performance (example directives follow this slide).
  • Password protection for basic sites.
  • “Deny from all” directories + Drupal’s Protected Download module (there are other modules like it).
• Storing binaries in the database.
  • Causes a performance hit, since every file must pass through the database.
  • Works well against this type of attack, but could leave you even more vulnerable if your site is open to attacks like SQL injection.
• Don’t put sensitive information on your web server.
  • The best defense, and also the least practical in many situations.
  • Internal web servers may also be vulnerable, depending on your network setup: Adrian Lamo gained access to many internal pages via poorly configured proxies.
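For concreteness, the “Deny from all” approach recommended above maps onto a few standard Apache directives. The snippet below uses Apache 2.2-era syntax (Apache 2.4 replaces the first pair with “Require all denied”), and the .htpasswd path is illustrative:

    # .htaccess in the private directory: refuse all direct requests,
    # then serve files only through a gatekeeper script such as
    # Drupal's Protected Download module.
    Order deny,allow
    Deny from all

    # Alternative: basic password protection for a simple site.
    # AuthType Basic
    # AuthName "Private area"
    # AuthUserFile /path/to/.htpasswd
    # Require valid-user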
Future
• Better lists will improve results.
  • Lists require considerable research to maximize efficiency.
  • There is often no “universal” list for a given file type, because naming standards are personalized.
• wdivulge is intended as a framework.
  • Support for other tools like sqlmap will result in better information discovery.
• Adding “smart directory” discovery is critical (a sketch follows this slide).
  • We cannot brute force if we do not know where to brute force.
  • Similar to “smart lists”, but based on the internal workings of various content management systems (CMS).
  • Scraping of public sites will be needed for non-public CMSes.
• Sponsorship?
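One possible shape for the planned “smart directory” step is sketched below. The directory names are well-known Drupal and Joomla defaults, but the probing logic itself is an assumption, not project code:

    #!/usr/bin/perl
    # Sketch: before brute forcing file names, probe the directories a
    # given CMS is known to use, so we know *where* to brute force.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my %cms_dirs = (
        drupal => [qw(sites/default/files misc modules)],
        joomla => [qw(images media administrator)],
    );

    my $base = shift @ARGV;             # e.g. http://example.com
    my $ua   = LWP::UserAgent->new(timeout => 5);

    for my $cms (sort keys %cms_dirs) {
        for my $dir (@{ $cms_dirs{$cms} }) {
            my $code = $ua->head("$base/$dir/")->code;
            # 200 or 403 both indicate the directory exists
            print "$cms? $base/$dir/ => $code\n"
                if $code == 200 || $code == 403;
        }
    }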
Q/A
• Questions?