Bob Bruen & Garth Bruen
“(Ab)Using ICANN’s Procedures as a Way to Minimize Spam”
Standard Approaches
• Filter & Block
• Identify Spammers
• Blacklist
• Criminal Prosecution
• Civil Litigation
• Challenge/Response
• Reputation Protection
Definition: Infrastructure (The Front End)
• ICANN
• Top Level Registrars
• Retail Registrars
• ISPs
• Policies and Procedures
• Resources & Capacity
Front End Problems
• Because of:
  • Weak procedures
  • Policies not followed
  • Inadequate resources
• Consequences are:
  • Target-rich environment
  • Spam platform
  • Enhanced botnets, malware, etc.
Whois Data Problem Report System (WDPRS)
• Whois data accuracy REQUIRED
• 15 days to fix a Whois record
• Created for exactly these complaints
• One-at-a-time complaints
• Designed for small numbers
Modern Complaint Process
• Match the spammers’ capability
• Employ large-scale operations
• Automate everything:
  • Processing spam submissions
  • Filing of complaints
  • Follow-ups
KnujOn
• Delivers massively scalable, automated spam handling
• Strict use of ICANN procedures once spam is detected
• Front-end spam prevention complements spam detection & elimination
What Is Different
• Not a honeypot – real people
• Spam collection spans years
• Targeting transaction sites
• Apply ICANN policy enforcement
• Scale of complaints filed
• ICANN Report 2006: ~45% was Project KnujOn
[Chart: KnujOn complaint volume filed through the ICANN WDPRS, 2006–2009, on a scale of 0 to 250,000 reports]
• 2008 volume anticipated to be 4 times that of 2007
KnujOn – Key Processes
• “Follow the money”
• User-submitted spam (FTP or email)
• Spam analyzed for the transaction site
• Whois data acquired & verified
• Automated complaint filed if not accurate
• Follow up
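A minimal sketch of what this pipeline could look like in Python. The helper names (extract_transaction_domains, lookup_whois, looks_accurate, process_submission) are illustrative assumptions, not KnujOn's actual, proprietary implementation:

```python
# Hypothetical sketch of the submission-to-complaint pipeline described above.
# Helper names and the accuracy heuristic are illustrative only.
import email
import re
import subprocess

URL_RE = re.compile(r"https?://([a-z0-9.-]+)", re.IGNORECASE)

def extract_transaction_domains(raw_message: bytes) -> set:
    """Pull candidate transaction-site domains out of a submitted spam message."""
    msg = email.message_from_bytes(raw_message)
    parts = []
    for part in msg.walk():
        payload = part.get_payload(decode=True)
        if payload:
            parts.append(payload.decode("utf-8", "ignore"))
    return {match.lower() for match in URL_RE.findall(" ".join(parts))}

def lookup_whois(domain: str) -> str:
    """Fetch the Whois record with the system whois client."""
    return subprocess.run(["whois", domain], capture_output=True, text=True).stdout

def looks_accurate(whois_record: str) -> bool:
    """Crude placeholder check for obviously missing or bogus registrant data."""
    markers = ("no registrant", "n/a", "xxx")
    record = whois_record.lower()
    return bool(record) and not any(marker in record for marker in markers)

def process_submission(raw_message: bytes) -> list:
    """Return the transaction-site domains that would trigger a WDPRS complaint."""
    to_file = []
    for domain in extract_transaction_domains(raw_message):
        if not looks_accurate(lookup_whois(domain)):
            to_file.append(domain)  # in practice: file the complaint, then follow up
    return to_file
```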
MetaData
• Large database
• We can correlate:
  • Scam sites & individuals
  • Sites & criminal groups
  • Groups, ISPs, Registrars
• Analyze trends
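As an illustration only, a toy correlation store of this kind might look like the sketch below; the table and column names are assumptions, not KnujOn's actual database design:

```python
# Toy correlation store: scam sites linked to registrars, ISPs, and criminal
# groups so trends can be queried. Schema names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE registrars      (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE isps            (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE criminal_groups (id INTEGER PRIMARY KEY, label TEXT);
CREATE TABLE scam_sites (
    id              INTEGER PRIMARY KEY,
    domain          TEXT UNIQUE,
    registrar_id    INTEGER REFERENCES registrars(id),
    isp_id          INTEGER REFERENCES isps(id),
    group_id        INTEGER REFERENCES criminal_groups(id),
    first_seen      TEXT,   -- ISO date of the first submission naming this site
    complaint_filed TEXT    -- ISO date a WDPRS complaint was filed, if any
);
""")

# Example trend query: which registrars have the most correlated scam sites?
rows = conn.execute("""
    SELECT r.name, COUNT(*) AS sites
    FROM scam_sites s JOIN registrars r ON r.id = s.registrar_id
    GROUP BY r.name ORDER BY sites DESC
""").fetchall()
```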
Scale Problem
• 50,000,000 registrations in 2007
• 50,000 complaints: the apparent limit
• Off by three orders of magnitude
• Shut down 55,000+ domains (proof of concept)
• 20,000–25,000 submissions/day
[Chart: 93% of complaints are concentrated at just 10 registrars vs. all other registrars]
• The “big” problem is actually small
Repairing the Infrastructure
• Evaluate registrar services
• Rate registrars
• Rate ISPs
• Challenge privacy protection
• Test Whois services
• Identify fake DNS servers (see the sketch below)
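One simple way to spot a suspicious nameserver (a sketch only, assuming the third-party dnspython package; this is not KnujOn's published method) is to resolve the same name through the suspect server and through a trusted resolver and compare the answers:

```python
# Compare a suspect nameserver's answers against a trusted public resolver.
# Divergence is only a signal (CDNs and geo-DNS also differ legitimately).
import dns.resolver  # third-party "dnspython" package

def answers(nameserver_ip: str, qname: str) -> set:
    """A-record answers for qname as returned by one specific nameserver."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver_ip]
    try:
        return {rr.to_text() for rr in resolver.resolve(qname, "A")}
    except Exception:
        return set()

def looks_suspicious(suspect_ns: str, qname: str, trusted_ns: str = "8.8.8.8") -> bool:
    """Flag a nameserver whose answers diverge from a trusted resolver's."""
    trusted = answers(trusted_ns, qname)
    return bool(trusted) and answers(suspect_ns, qname) != trusted
```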
Registrar Evaluation
• Number of complaints (filed & total)
• Acknowledgment / timeliness
• Action taken
• Rot days
• Engaged
Rot Days
• “Rot days” = suspension date minus complaint-filing date
• Should be shorter than:
  • Tasting days = 5 days (Add Grace Period)
  • Average spam-domain lifetime = 5 days (UCSD paper)
• Unfortunately, rot days are increasing
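Worked out literally, with illustrative dates (the 19-day result below is an example, not measured data):

```python
# "Rot days" as defined above: suspension date minus complaint-filing date.
from datetime import date

ADD_GRACE_PERIOD_DAYS = 5       # domain-tasting window
AVG_SPAM_DOMAIN_LIFETIME = 5    # average spam-domain lifetime (UCSD paper)

def rot_days(filed: date, suspended: date) -> int:
    return (suspended - filed).days

example = rot_days(filed=date(2008, 3, 1), suspended=date(2008, 3, 20))
print(example)                              # 19
print(example <= ADD_GRACE_PERIOD_DAYS)     # False: far slower than tasting
print(example <= AVG_SPAM_DOMAIN_LIFETIME)  # False: the domain has already done its damage
```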
Sample Registrar Rating: Caveats
• Only uses our filed complaints
• Relative ratings matter
• Small sample: n = 9 (of ~1,000 registrars)
• Better & worse registrars exist
• Only .com numbers
Example Rating Table (sorted by complaint rate; smaller is better)

Registrar    Total Domains   Complaints Filed   Complaint Rate
MONIKER         1,956,780          29,855            1.53%
directnfo       1,064,697           9,201            0.86%
ENOM            6,179,440          39,609            0.64%
BIZCN             223,728             815            0.36%
NETSOL          5,046,746          15,397            0.31%
Markmon           206,593             594            0.29%
TUCOWS          4,552,986           7,646            0.17%
nameking          788,110             713            0.09%
GODADDY        15,295,392          12,036            0.08%
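A minimal sketch that recomputes the rate column (complaints filed divided by total domains) and ranks the nine registrars from best to worst, smallest rate first, using the figures in the table above:

```python
# Recompute complaint rate = complaints filed / total .com domains and rank,
# smaller being better; figures are taken from the table above.
registrars = {
    "MONIKER":   (1_956_780, 29_855),
    "directnfo": (1_064_697,  9_201),
    "ENOM":      (6_179_440, 39_609),
    "BIZCN":       (223_728,    815),
    "NETSOL":    (5_046_746, 15_397),
    "Markmon":     (206_593,    594),
    "TUCOWS":    (4_552_986,  7_646),
    "nameking":    (788_110,    713),
    "GODADDY":  (15_295_392, 12_036),
}

ranked = sorted(
    ((name, filed / total) for name, (total, filed) in registrars.items()),
    key=lambda item: item[1],
)
for name, rate in ranked:
    print(f"{name:<10} {rate:.2%}")   # GODADDY 0.08% ... MONIKER 1.53%
```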
Goals
• Fix the WDPRS
• Enforce the rules
• Audit the registrars
• Terminate the bad registrars
Thank You
Bob Bruen: bob.bruen@coldrain.net, http://www.coldrain.net
Garth Bruen: garth.bruen@coldrain.net, http://www.knujon.com