Who Is Peeping at Your Passwords at Starbucks? To Catch an Evil Twin Access Point DSN 2010 Yimin Song, Texas A&M University Chao Yang, Texas A&M University Guofei Gu, Texas A&M University
Agenda • Introduction • Analysis • Algorithm • Evaluation • Conclusion
Agenda • Introduction • Wireless Network Review • Evil Twin Attack • Analysis • Algorithm • Evaluation • Conclusion
Wireless Network Review • Wireless terminology • AP – Access Point • SSID – Service Set Identifier • RSSI – Received Signal Strength Indication • 802.11 CSMA/CA • DIFS – Distributed Inter-Frame Spacing • SIFS – Short Inter-Frame Spacing • BF – Random Backoff Time (Diagram: a sender and receiver exchange a data frame and an ACK separated by SIFS, with DIFS and a random backoff before each transmission; two BSSs reach the Internet through a hub, switch, or router.)
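To make the CSMA/CA timing concrete, here is a minimal sketch of how long one DATA/ACK exchange takes on a single wireless hop, and how relaying through a second hop roughly doubles that air time. The DIFS/SIFS/slot/ACK values and the 11 Mbps rate are illustrative 802.11b-style assumptions, not parameters from the paper.

```python
import random

# Illustrative 802.11b-style timing parameters (microseconds); these values
# are assumptions for demonstration, not settings taken from the paper.
DIFS_US = 50.0
SIFS_US = 10.0
SLOT_US = 20.0
CW_MIN = 31            # minimum contention window, in backoff slots

def one_hop_exchange_us(data_bits, rate_mbps=11.0, ack_us=112.0):
    """Rough duration of one DATA/ACK exchange over a single wireless hop:
    DIFS + random backoff (BF) + DATA transmission + SIFS + ACK."""
    backoff = random.randint(0, CW_MIN) * SLOT_US
    data_tx = data_bits / rate_mbps          # bits / (Mbit/s) == microseconds
    return DIFS_US + backoff + data_tx + SIFS_US + ack_us

if __name__ == "__main__":
    one_hop = one_hop_exchange_us(1500 * 8)
    # An evil twin still relays frames through the legitimate AP, so the same
    # payload crosses the wireless medium twice.
    two_hop = one_hop_exchange_us(1500 * 8) + one_hop_exchange_us(1500 * 8)
    print(f"one-hop exchange ~{one_hop:.0f} us, two-hop relay ~{two_hop:.0f} us")
```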
Evil Twin Attack • A phishing Wi-Fi AP that looks like a legitimate one (with the same SSID). • Typically appears near free hotspots, such as airports, cafes, hotels, and libraries. • Hard to trace, since it can be launched and shut down suddenly or at random, and often lasts only a short time after achieving its goal.
Evil Twin Attack (cont.) • Related work • Monitor radio frequency airwaves and/or additional information gathered at routers/switches, then compare against a known authorized list. • Monitor traffic on the wired side to determine whether a machine uses a wired or wireless connection, then compare the result with an authorization list to decide whether the associated AP is a rogue one.
Agenda • Introduction • Analysis • Network Setting in This Model • Problem Description • Server IAT (Inter-packet Arrival Time) • Algorithm • Evaluation • Conclusion
Network Setting in This Model Table 1: Variables and settings in this model
Problem Description • An evil twin typically still relies on the good twin for Internet access, so the number of wireless hops a user traverses to reach the Internet increases from one to two. • Fig. 1: Illustration of the target problem in this paper • What statistics can effectively distinguish one-hop and two-hop wireless channels on the user side? • Are there dynamic factors in a real network environment that can affect such statistics? • How can we design efficient detection algorithms that take these influencing factors into account?
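To see why a per-packet timing statistic can separate the two cases, the sketch below simulates Server IATs at the client under a toy model in which each wireless hop contributes one contention-plus-transmission delay, so a two-hop (evil twin) path roughly doubles the wireless component. The exponential delay model and the 1 ms / 0.2 ms parameters are assumptions for illustration, not the measured distributions from the paper.

```python
import random
import statistics

def wireless_hop_delay_ms():
    """Toy model of one wireless hop's delay (contention + transmission);
    the exponential shape and 1 ms mean are assumptions, not measured data."""
    return random.expovariate(1.0)

def server_iat_ms(hops, wired_ms=0.2):
    """Server IAT seen by the client: a small wired component plus one
    wireless delay per hop between the client and the wired network."""
    return wired_ms + sum(wireless_hop_delay_ms() for _ in range(hops))

if __name__ == "__main__":
    one_hop = [server_iat_ms(1) for _ in range(5000)]   # normal AP
    two_hop = [server_iat_ms(2) for _ in range(5000)]   # evil twin relaying via the good twin
    print(f"mean Server IAT, one hop : {statistics.mean(one_hop):.2f} ms")
    print(f"mean Server IAT, two hops: {statistics.mean(two_hop):.2f} ms")
```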
Server IAT (cont.) • Fig. 2: Server IAT illustration in the normal AP scenario
Server IAT (cont.) Fig. 4: IAT distribution under RSSI=100% Fig. 5: IAT distribution under RSSI=50%
Agenda • Introduction • Analysis • Algorithm • TMM (Trained Mean Matching Algorithm) • HDT (Hop Differentiating Technique) • Improvement by Preprocessing • Evaluation • Conclusion
TMM • The Trained Mean Matching (TMM) algorithm requires the distribution of Server IAT as prior knowledge. • Given a sequence of observed Server IATs, if their mean has a higher likelihood of matching the trained mean of two-hop wireless channels, we conclude that the client uses two wireless hops to communicate with the remote server, indicating a likely evil twin attack; otherwise we conclude the channel is one-hop (a normal AP).
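A minimal sketch of the TMM decision described above, assuming the trained knowledge is summarized as a (mean, standard deviation) pair per hypothesis and that the sample mean is scored with a Gaussian likelihood; the function names and the numeric trained values are placeholders, not the paper's actual training results.

```python
import math
import statistics

def gaussian_loglik(x, mean, std):
    """Log-likelihood of x under a normal distribution N(mean, std**2)."""
    return -0.5 * math.log(2 * math.pi * std * std) - (x - mean) ** 2 / (2 * std * std)

def tmm_decide(observed_iats, trained_one_hop, trained_two_hop):
    """Trained Mean Matching sketch: compare the mean of the observed Server
    IATs against trained (mean, std) statistics for one-hop and two-hop
    wireless channels and return the better-matching hypothesis."""
    m = statistics.mean(observed_iats)
    ll_one = gaussian_loglik(m, *trained_one_hop)
    ll_two = gaussian_loglik(m, *trained_two_hop)
    return "evil twin (two wireless hops)" if ll_two > ll_one else "normal AP (one wireless hop)"

# Example with placeholder trained statistics (mean, std) in milliseconds.
print(tmm_decide([2.1, 2.4, 1.9, 2.6, 2.3],
                 trained_one_hop=(1.2, 0.3),
                 trained_two_hop=(2.3, 0.5)))
```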
HDT (cont.) Fig. 2: Server IAT illustration in the normal AP scenario Fig. 6: AP IAT illustration in the normal AP scenario
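HDT avoids training by comparing the two IAT types directly, via the Server-to-AP IAT ratio (the SAIR mentioned later in the Discussion). A minimal sketch of that idea follows, assuming SAIR is the ratio of Server IAT to AP IAT and using a placeholder threshold of 1.5 rather than the theoretically derived value and sequential test used in the paper.

```python
import statistics

def hdt_decide(server_iats, ap_iats, threshold=1.5):
    """Hop Differentiating Technique sketch: the Server-to-AP IAT ratio (SAIR)
    should stay near 1 when the client is one wireless hop from the wired
    network, and grow toward 2 when frames are relayed over a second wireless
    hop by an evil twin. The 1.5 threshold is a placeholder assumption."""
    sair = [s / a for s, a in zip(server_iats, ap_iats) if a > 0]
    return "evil twin suspected" if statistics.median(sair) > threshold else "normal AP"

# Hypothetical per-packet measurements in milliseconds.
print(hdt_decide(server_iats=[2.2, 2.5, 2.1, 2.4], ap_iats=[1.1, 1.2, 1.0, 1.3]))
```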
Agenda • Introduction • Analysis • Algorithm • Evaluation • Environment Setup • Datasets • Effectiveness • Cross Validation • Conclusion
Environment Setup Fig. 7: Environment for normal AP Fig. 8: Environment for evil twin AP
Datasets Table 2: RSSI ranges and corresponding levels Table 3: The percentage of filtered packets
Effectiveness Table 4: Detection rate for HDT and TMM Table 5: False positive rate for HDT and TMM
Effectiveness (cont.) Fig. 9: Cumulative probability of the number of decision rounds for HDT to output a correct result
Effectiveness (cont.) Table 6: Detection rate when the number of input data in one decision round is 50 Table 7: False positive rate when the number of input data in one decision round is 50 Table 7: False positive rate when the number of input data in one decision round is 100
Effectiveness (cont.) Fig. 10: Detection rate for multi-HDT using different numbers of input data in one decision round
Cross Validation Fig. 11: Detection rate for TMM under different RSSI ranges
Cross Validation (cont.) Fig. 12: Detection rate under different 802.11g networks
Cross Validation (cont.) Fig. 13: False positive rate under different 802.11g networks
Agenda • Introduction • Analysis • Algorithm • Evaluation • Discussion and Conclusion • Discussion • Conclusion
Discussion • What about more wired hops? • Several studies have shown that delays on the wired links are not comparable to those on the wireless link. • We can trade off by running more decision rounds. • Use a server that is only a few hops away. • Techniques similar to "traceroute" could estimate the wired transfer time, which could then be excluded/subtracted to minimize the noise from the wired side (see the sketch below).
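A minimal sketch of the wired-delay estimation idea in the last bullet: measure the round-trip time to the first wired hop (the hotspot gateway) and to the remote server, and treat the difference as the wired-side delay to subtract. It uses the TCP handshake as an RTT proxy instead of real traceroute probes; both addresses are placeholders, and the gateway is assumed to accept connections on the chosen port.

```python
import socket
import time

def tcp_rtt_ms(host, port=80, timeout=2.0):
    """Estimate the round-trip time to a host from the TCP three-way handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

# Placeholder addresses: "192.168.1.1" stands for the first wired hop behind
# the AP (the hotspot gateway), "example.com" for the remote server.
server_rtt = tcp_rtt_ms("example.com")
gateway_rtt = tcp_rtt_ms("192.168.1.1")

# The difference approximates the wired-side delay, which could be subtracted
# from Server IAT measurements to reduce noise introduced by the wired path.
print(f"approx. wired-side delay: {server_rtt - gateway_rtt:.1f} ms")
```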
Discussion (cont.) • Will an attacker increase the IAT to avoid detection? • Users don't like a slow connection. • Eq. 1: The attacker may delay packets to reduce the SAIR. • What if an evil twin AP connects to the wired network directly instead of relaying through the normal AP? • That is left as future work.
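Eq. 1 itself is not reproduced on this slide; as a rough sketch of the trade-off it points to, assume the attacker inserts an artificial per-packet delay d on the hop it controls so that the observed IAT ratio looks like a single wireless hop. Whatever reduction in SAIR the delay buys also inflates every packet's service time, so achievable throughput falls roughly as 1/(T0 + d), where T0 is the undelayed per-packet exchange time; masking a second hop requires d on the order of one wireless-hop time per packet, which is exactly the slow connection the slide argues users will not tolerate.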
Conclusion • We propose TMM and HDT to detect evil twin attacks; TMM requires training data, while HDT does not. • HDT is particularly attractive because it relies on no trained knowledge or parameters and is resilient to changes in the wireless environment.