Detecting Network Neutrality Violations with Causal Inference
Mukarram Bin Tariq, Murtaza Motiwala, Nick Feamster, Mostafa Ammar
Georgia Tech, http://gtnoise.net/nano/
The Network Neutrality Debate
• Users have little choice of access networks.
• ISPs want a "share" of the monetizable traffic that they carry for content providers.
Goal: Make ISP Behavior Transparent
[Figure. Source: Glasnost project]
Our goal: Transparency. Expose performance discrimination to users.
Existing Techniques are Too Specific • Detect specific discrimination methods and policies • Testing for TCP RST packets (Glasnost) • ToS-bits based de-prioritization (NetPolice) • Limitations • Brittle: discrimination methods may evolve • Evadable • ISP can whitelist certain servers, destinations, etc. • ISP can prioritize monitoring probes • Active probes may not reflect user performance • Monitoring is not continuous
Main Idea: Detect Discrimination From Passively Collected Data
This talk: Design, implementation, evaluation, and deployment of NANO
• Objective: Establish whether observed degradation in performance is caused by the ISP
• Method: Passively collect performance data and analyze the extent to which an ISP causes this degradation
Ideal: Directly Estimate Causal Effect
Causal Effect = E(Real Throughput using ISP) − E(Real Throughput not using ISP)
The first term is the performance with the ISP; the second is the baseline performance. Both are "ground truth" values for performance with and without the ISP (the "treatment variable").
Problem: We need both ground-truth values for the same client, and these values are typically not available.
Instead: Estimate Association from Observed Data
Association = E(Observed Throughput using ISP) − E(Observed Throughput not using ISP)
The first term is the observed performance with the ISP; the second is the observed baseline performance.
Problem: Association does not equal causal effect. How can we estimate the causal effect from the association?
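The association above can be computed directly from passively observed samples. A minimal sketch in Python, with hypothetical throughput values (not NANO measurement data):

```python
# Sketch: estimating association from passively observed throughput samples.
# All sample values below are hypothetical, not from the NANO deployment.

def association(with_isp, baseline):
    """E(observed throughput using ISP) - E(observed throughput not using it)."""
    return sum(with_isp) / len(with_isp) - sum(baseline) / len(baseline)

# Throughput samples (Mbps) for clients of the ISP vs. clients of other ISPs.
isp_throughput = [2.1, 1.8, 2.4, 1.9]
other_throughput = [3.0, 2.8, 3.2, 3.0]

print(association(isp_throughput, other_throughput))  # negative: the ISP looks worse
```

A negative association suggests degradation, but as the next slides argue, it cannot be read as a causal effect until confounders are accounted for.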
Association is Not Causal Effect
Why? Confounding variables can confuse inference.
• Suppose Comcast users observe lower BitTorrent throughput.
• Can we assume that Comcast is discriminating?
• No! Other factors ("confounders") may correlate with both the choice of ISP and the outcome variable.
[Causal diagram: client setup, time of day, location, and content influence both the choice of Comcast and BitTorrent throughput.]
Strawman: Random Treatment
A common approach in epidemiology: treat subjects randomly, irrespective of their initial health, and measure the association with the new outcome.
[Example: S = "sick", H = "healthy". The measured association α = 0.8 − 0.25 = 0.55 estimates the causal effect θ.]
The association converges to the causal effect if the confounding variables do not change during treatment.
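The 0.8 − 0.25 = 0.55 arithmetic can be reproduced with a toy calculation. The group compositions below are assumptions, chosen only so that the sick rates come out to 0.8 and 0.25:

```python
# Toy random-treatment example: 'S' = sick, 'H' = healthy.
# Group compositions are assumed; they are chosen so that the sick rates
# match the slide's 0.8 (untreated) and 0.25 (treated).

def sick_rate(group):
    return group.count("S") / len(group)

untreated = ["S", "S", "S", "S", "H"]  # 4 of 5 sick -> 0.8
treated = ["S", "H", "H", "H"]         # 1 of 4 sick -> 0.25

# Association between treatment and outcome; under random treatment with
# fixed confounders, this converges to the causal effect.
alpha = sick_rate(untreated) - sick_rate(treated)
print(alpha)  # ~0.55
```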
The Internet Does Not Permit Random Treatment
• Random treatment requires changing the client's ISP.
• Problems
• Cumbersome: nearly impossible to achieve for a large number of users
• Does not eliminate all confounding variables (e.g., change of equipment at the user's home network)
Alternate approach: Stratification
Stratification: Adjusting for Confounders
Step 1: Enumerate the confounders (e.g., client setup).
Step 2: Stratify along the values of each confounding variable and measure the association within each stratum.
[Example: in one setup stratum, the treated rate is 0.75 vs. a baseline of 0.20, giving a causal effect θ = 0.55; in another, 0.44 vs. 0.55 gives θ = −0.11.]
Within a stratum, association implies causation: there is no other explanation.
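Using the slide's example numbers, the per-stratum causal effects can be sketched as follows (the stratum names are hypothetical):

```python
# Sketch: compute the association separately within each stratum of the
# confounder (here, client setup). Stratum names are hypothetical; the
# treated/baseline rates reproduce the slide's example.

strata = {
    "setup_A": {"treated": 0.75, "baseline": 0.20},
    "setup_B": {"treated": 0.44, "baseline": 0.55},
}

effects = {name: s["treated"] - s["baseline"] for name, s in strata.items()}
for name, theta in effects.items():
    print(name, round(theta, 2))  # setup_A: 0.55, setup_B: -0.11
```

Holding the confounder fixed within each stratum is what lets the per-stratum association be read as a causal effect.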
Stratification on the Internet: Challenges What is baseline performance? What are the confounding variables? Which data to use, and how to collect it? How to infer the discrimination method?
What is the baseline performance? • Baseline: Service performance when ISP not used • Need some ISP for comparison • Approach: Average performance over other ISPs • Limitation: Other ISPs may also discriminate
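The "average performance over other ISPs" baseline can be sketched in a few lines; the data layout below is an assumption for illustration, not the NANO-Server schema:

```python
# Sketch: baseline performance for a service = average observed performance
# across all ISPs other than the one under test. Data layout is assumed.

def baseline(perf_by_isp, isp_under_test):
    others = [v for isp, samples in perf_by_isp.items()
              if isp != isp_under_test for v in samples]
    return sum(others) / len(others)

# Hypothetical throughput samples (Mbps) per ISP.
perf = {"isp_a": [1.0, 1.2], "isp_b": [3.0, 3.2], "isp_c": [2.8, 3.0]}
print(baseline(perf, "isp_a"))  # average over isp_b and isp_c: 3.0
```

As the slide notes, this baseline is only as trustworthy as the other ISPs: if they also discriminate, the baseline is biased.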
What are the confounding variables? • Client-side • Client setup: Network Setup, ISP contract • Application: Browser, BT Client, VoIP client • Resources: Memory, CPU, network utilization • Other: Location, number of users sharing home connection • Temporal • Diurnal cycles, transient failures
What data to use; how to collect it?
http://www.gtnoise.net/nano/
• NANO-Agent: client-side, passive collection
• Per-flow statistics: throughput, jitter, loss, RST packets
• Application associated with each flow
• Resource monitoring: CPU, memory, network utilization
• Performance statistics sent to NANO-Server for monitoring, stratification, and inference
Evaluation: Three Experiments Experiment 1: Simple Discrimination • HTTP Web service • Discriminating ISPs drop packets Experiment 2: Long Flow Discrimination • Two HTTP servers S1 and S2 • Discriminating ISPs throttle traffic for S1 or S2 if the transfer exceeds a certain threshold Experiment 3: BitTorrent Discrimination • Discriminating ISP maintains a list of preferred peers • Higher drop rate for BitTorrent traffic to non-preferred peers
Experiment Setup
• ~200 PlanetLab nodes as clients running NANO-Agent
• 5 access ISPs emulated in Emulab, connected to the Internet: 2 discriminating (D1, D2) and 3 neutral (N1, N2, N3)
• HTTP and BitTorrent discrimination: throttling and dropping policies implemented with a Click router
• Confounding variable: server location, with "near" servers (West-coast nodes) vs. "far" servers (the remaining PlanetLab nodes)
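The long-flow policy from Experiment 2 can be modeled abstractly. The actual setup used a Click router, so the threshold, drop rate, and function below are illustrative assumptions only:

```python
import random

# Toy model of a long-flow discrimination policy: once a flow's cumulative
# size passes a threshold, further packets are dropped with some probability.
# THRESHOLD_BYTES and DROP_RATE are illustrative, not the Click configuration.
THRESHOLD_BYTES = 13 * 1024 * 1024  # ~13 MB threshold
DROP_RATE = 0.1

def forward(flow_bytes_so_far, pkt_len, rng=random):
    """Return True if the packet is forwarded, False if it is dropped."""
    if flow_bytes_so_far + pkt_len > THRESHOLD_BYTES and rng.random() < DROP_RATE:
        return False
    return True

# Short flows pass untouched; long flows see roughly 10% loss.
print(forward(0, 1500))  # True
```

This kind of policy is exactly what makes probe-based detection brittle: a short active probe never crosses the threshold and sees no discrimination.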
Without Stratification, Detecting Discrimination is Difficult
Simple discrimination experiment: the overall throughput distribution in discriminating and non-discriminating ISPs is similar.
Stratification Identifies Discrimination
In all three experiments (simple, long-flow, and BitTorrent), the discriminating ISPs have a clearly identifiable causal effect on throughput, while the neutral ISPs are absolved.
Implementation and Deployment
http://gtnoise.net/nano/
[Dashboard panels: performance relative to other users, DNS latency, traffic breakdown, throughput]
• Implementation: Linux version available; Windows and MacOS versions in progress
• Now: 27 users; thousands are needed for inference
• The performance dashboard may help attract users
Summary and Next Steps • Internet Service Providers discriminate against classes of users and application traffic today. • Need passive approach • ISP discrimination techniques can evolve, or may not be known to users. • Tradeoff: Must be able to enumerate confounders • NANO: Network Access Neutrality Observatory • Infers discrimination from passively collected data • Detection succeeds in controlled environments • Deployment in progress. Need more users. http://gtnoise.net/nano/
NANO Can Infer Discrimination Criteria
Evaluation: the ISP throttles the throughput of any flow larger than 13 MB, or about 10K packets.
Approach: infer the discrimination criterion from the collected flow features. The inferred rule:
cum_pkts <= 10103 -> not_discriminated
cum_pkts > 10103 -> discriminated
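Recovering such a threshold from labeled flows can be sketched with a one-feature decision stump; the function name and synthetic flow records below are assumptions for illustration:

```python
# Sketch: learn a single-threshold rule over cum_pkts from labeled flows.
# The flow data is synthetic: flows over ~10K packets were throttled.

def best_split(flows):
    """flows: list of (cum_pkts, discriminated) pairs. Return the cum_pkts
    split that best separates discriminated flows from clean ones."""
    candidates = sorted(p for p, _ in flows)
    best, best_err = None, len(flows) + 1
    for t in candidates:
        # Predict 'discriminated' iff cum_pkts > t; count misclassifications.
        err = sum((p > t) != d for p, d in flows)
        if err < best_err:
            best, best_err = t, err
    return best

flows = [(5000, False), (8000, False), (10103, False),
         (10500, True), (12000, True), (20000, True)]
print(best_split(flows))  # 10103, matching the rule cum_pkts <= 10103
```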
Why Association != Causal Effect?
• Positive correlation between health and treatment: can we say that Aspirin causes better health?
• Confounding variables correlate with both the cause and outcome variables and confuse the causal inference.
[Causal diagram: sleep, diet, age, and other drugs influence both Aspirin use and health.]
Causality: An Analogy from Health Epidemiology: study causal relationships between risk factors and health outcome NANO: infer causal relationship between ISP and service performance degradation
Without Stratification, Detecting Discrimination is Hard
In both the simple-discrimination and long-flow-discrimination experiments, the overall throughput distribution in discriminating and non-discriminating ISPs is similar. Server location is confounding.