
Class Discussion: “Analyzing the MAC-level Behavior of Wireless Networks in the Wild”

This discussion will critique a paper analyzing the MAC-level behavior of wireless networks and provide guidance on critiquing the main contributions, methodology, limitations, and potential improvements of the study. The session will delve into evaluating the significance of the paper and offering insights for researchers and builders. Attendees will explore the need for tools like WIT for measuring and enhancing wireless LAN performance. Background information on wireless measurement-driven analysis and the challenges faced will be discussed, along with the proposed solution of passive monitoring to gather accurate data. Join the discussion led by Jerry Sussman to understand the complexities of analyzing wireless networks and the implications for research and practice.



Presentation Transcript


  1. Class Discussion: “Analyzing the MAC-level Behavior of Wireless Networks in the Wild” Discussion Guided by Jerry Sussman

  2. Critique Guidance • Critique Instructions: • Critique the paper, not me! • All students should read the paper before class • Critique is due prior to the following week’s class • Discussion Leader MUST use slides to guide the discussion • Critiques should be organized / structured per the course website • This is a 300-level course… there should be a lot of discussion.

  3. Critique Guidance • (10%) State the problem the paper is trying to solve. • (20%) State the main contribution of the paper: solving a new problem, proposing a new algorithm, or presenting a new evaluation (analysis). If a new problem, why was the problem important? Is the problem still important today? Will the problem be important tomorrow? If a new algorithm or new evaluation (analysis), what are the improvements over previous algorithms or evaluations? How do they come up with the new algorithm or evaluation? • (15%) Summarize the (at most) 3 key main ideas (each in 1 sentence). • (30%) Critique the main contribution • Rate the significance of the paper on a scale of 5 (breakthrough), 4 (significant contribution), 3 (modest contribution), 2 (incremental contribution), 1 (no contribution or negative contribution). Explain your rating in a sentence or two. • Rate how convincing the methodology is: how do the authors justify the solution approach or evaluation? Do the authors use arguments, analyses, experiments, simulations, or a combination of them? Do the claims and conclusions follow from the arguments, analyses, or experiments? Are the assumptions realistic (at the time of the research)? Are the assumptions still valid today? Are the experiments well designed? Are there different experiments that would be more convincing? Are there other alternatives the authors should have considered? (And, of course, is the paper free of methodological errors?) • What is the most important limitation of the approach? • (15%) What lessons should researchers and builders take away from this work? What (if any) questions does this work leave open? • (10%) Propose your improvement on the same problem. • Note: the purpose of this template is to serve as a starting point, instead of a constraint. Use your judgment and creativity. Some advice available through the class resource link can be helpful.

  4. Agenda • Authors • Summary • Background • Wit • Theory Behind Wit • Implementation of Wit • Wit Evaluation • Inference versus Additional Monitors • Application in Live Environment • Conclusion

  5. Authors • Ratul Mahajan, Microsoft • Maya Rodrig, University of Washington • David Wetherall, University of Washington • John Zahorjan, University of Washington • Funding: NSF • Presented: SIGCOMM ’06, September 11-15, 2006, Pisa, Italy

  6. Summary First • Paper Documents WIT • Passive Wireless Analysis Tool • Analyzes MAC-Level behavior on Wireless Networks • Paper Assesses WIT Performance • Based on Real & Simulated Data • Authors tested WIT against live Wireless Network

  7. Why Is WIT Needed? • ???

  8. Why Is WIT Needed? • Understand how live networks communicate in different situations: • Highly loaded environment • Low load environments • Interfering wireless LANs, etc. • Critical to knowing how to improve performance of wireless LANs.

  9. Background • Measurement-driven analysis of live networks • Critical to understanding live performance of networks • Critical to improving performance • Measurement-driven refers to: • Part Measured / Collected data • Part ‘generated’ data

  10. Background • Wireless Measurement-Driven Analysis • At time of paper publication, lacking in: • Software collection/analysis tools • Performance data from wireless networks • Reasons: • Based on Simple Network Management Protocol (SNMP) logs from AP • AP logs • Low fidelity (i.e., coarse logs) of AP side • No data from client view • Packet traces from wired hosts next to AP • Traces omit wireless retransmissions

  11. Background • Unrealistic Solution • Instrument entire wireless network • Proven successful in controlled environments • Unrealistic and not a match for commercial application • Only Realistic Solution • Obtain trace via passive monitoring • 1 or more nodes declared “monitors” • Monitors placed in vicinity of wireless network • Record attributes of all transmissions • Trivial to deploy

  12. Background • Problems with “Passive Monitoring” • Data / Traces may be incomplete • Packets dropped due to weak signal • Packets dropped due to collisions • Difficult to know what packets are missing from a monitor • Monitor stations can’t determine if destination properly received packets • Important for determining reception probability

  13. Background • This paper is trying to: • Find a way to assemble an accurate trace of wireless environment for analysis • Use data from multiple monitoring stations • Determine missing packets • Re-create missing packets • Combine into single Trace file • Determine Network Performance • How often do clients retransmit their packets • Determine loss effects between two nodes • Effect of increased load on the network

  14. Background • Authors attempt to solve problem with WIT: • Paper presents WIT, a tool for Measurement-Driven Analysis. • WIT has 3 modules which solve key problems identified earlier

  15. Wit

  16. Why Is WIT Needed? • Quantify Wireless Network Performance • Estimate # of competing stations • Assist in diagnosing wireless network problems

  17. WIT Core Processing Steps • Merging procedure • Packet Reconstruction • Determination of Network Performance

  18. Merging procedure{1st Core Processing Step} • Combine incomplete traces from multiple, independent monitors • Provides a complete trace for follow-on steps • Based upon collected data • Not inferred or reconstructed

  19. Packet Reconstruction{Second Core Processing Step} • Reconstructs packets not captured by any monitor • Strong inference engine • Determines if packet received at destination • Again, provides more complete trace for follow-on step

  20. Determination of Network Performance{Third Core Processing Step} • WIT Calculates Network Performance • Input: Constructed trace • Output: • Typical simple network measurements • Packet reception probabilities • Estimates number of nodes contending for medium • Not previously achieved according to authors
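
A minimal sketch of how the reception-probability output might be tallied from an annotated trace, assuming each packet record carries hypothetical src, dst, and received fields (not the paper's actual schema):

```python
from collections import defaultdict

def reception_probability(trace):
    """Tally per-link reception probability from an annotated trace.

    Each packet is assumed to be a dict with 'src', 'dst', and a boolean
    'received' flag (filled in by the inference phase). Returns a map of
    (src, dst) -> fraction of that link's packets that were received.
    """
    sent = defaultdict(int)
    delivered = defaultdict(int)
    for pkt in trace:
        link = (pkt['src'], pkt['dst'])
        sent[link] += 1
        if pkt['received']:
            delivered[link] += 1
    return {link: delivered[link] / sent[link] for link in sent}
```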

  21. Passive Monitoring Pipeline

  22. WIT Evaluation • After development of WIT, authors faced with evaluation task • Used mix of real and simulated data • Used WIT at SIGCOMM 2004 conference • Multi-monitor traces captured • Uncovered MAC-layer characteristics of environment • Network was dominated by periods of low contention during which the medium was poorly utilized, even though APs were waiting to transmit packets • Suggests 802.11 MAC is tuned for high traffic levels that are uncommon on real networks • Authors claim this can’t be obtained by other methods

  23. Now for the Theory behind WIT phases {Implementation of phases will follow…}

  24. 3 Core Phases • Merging of Traces • Inferring Missing Information • Deriving Measurements / Performance

  25. 3 Core Phases • Merging of Traces • Inferring Missing Information • Deriving Measurements / Performance

  26. Merging of Traces

  27. Merging of Traces • Input: • Number of packet traces • 1 trace per monitor • Timestamps reflect each monitor’s local packet-receive time

  28. Merging of Traces • Output: • Merge into single, consistent timeline for all packets observed • Eliminate duplicates • Assign coherent timestamps to all packets independent of monitor • Timestamp accuracy to a few microseconds required. • Identify and Eliminate Duplicates

  29. Merging of Traces • Timing, the critical element • Only a few packets carry info guaranteed to be unique over a few milliseconds • Only way to distinguish duplicates is by time • Accurate timestamps are vital to creating the merged trace • Reference packets are the key

  30. Merging of Traces • Three Step Merging Process • Identify the reference packets common to both monitors • Beacons generated by APs as references • Contain unique source MAC address • Contain 64-bit value of local, microsecond resolution timer
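
A rough sketch of how shared reference beacons could be paired across two traces, assuming hypothetical packet fields type, src, beacon_timer, and ts; this is illustrative, not the authors' code:

```python
def match_references(trace_a, trace_b):
    """Pair up beacons captured by both monitors.

    A beacon's source MAC plus its 64-bit microsecond timer value is
    effectively unique over the capture, so equal (src, beacon_timer)
    identifies the same transmission in both traces. Returns a sorted
    list of (timestamp_in_a, timestamp_in_b) pairs.
    """
    beacons_a = {(p['src'], p['beacon_timer']): p['ts']
                 for p in trace_a if p['type'] == 'BEACON'}
    pairs = []
    for p in trace_b:
        if p['type'] != 'BEACON':
            continue
        key = (p['src'], p['beacon_timer'])
        if key in beacons_a:
            pairs.append((beacons_a[key], p['ts']))
    return sorted(pairs)
```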

  31. Merging of Traces • Three Step Merging Process • Use reference timestamps to translate the time coordinates • Pair up two reference timestamps across two traces • Time interval of secondary is altered to match baseline trace • Constant added to align the two traces between the two individual reference points • Resizing / alignment process adjusts for clock drift and alignment bias between two monitors
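
The translation step can be sketched as a piecewise-linear mapping between matched reference timestamps; a minimal illustration, assuming at least two (secondary_ts, baseline_ts) pairs are available:

```python
import bisect

def translate_time(t, ref_pairs):
    """Map a secondary-trace timestamp onto the baseline timeline.

    ref_pairs is a sorted list of (secondary_ts, baseline_ts) tuples taken
    from reference packets seen by both monitors. Rescaling the interval
    between adjacent references absorbs relative clock drift; the added
    offset aligns the two traces.
    """
    secondary_times = [s for s, _ in ref_pairs]
    i = bisect.bisect_right(secondary_times, t) - 1
    i = max(0, min(i, len(ref_pairs) - 2))        # clamp to a valid segment
    (s0, b0), (s1, b1) = ref_pairs[i], ref_pairs[i + 1]
    scale = (b1 - b0) / (s1 - s0)                 # corrects clock drift
    return b0 + (t - s0) * scale                  # constant aligns the traces
```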

  32. Merging of Traces • Three Step Merging Process • Identify and Remove duplicates • Identify by matching: • Packet Type • Same Source • Same Destination • Timestamp that is less than ½ of minimum time to transmit a packet Note: The code for this would be straightforward; however, I suspect much time was spent reviewing the data and proving that the code/scheme worked.
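
A minimal sketch of the duplicate test described above, with an illustrative (not authoritative) value for the minimum transmission time:

```python
MIN_TX_TIME_US = 28   # illustrative value for the shortest 802.11 transmission

def is_duplicate(a, b):
    """Decide whether two trace entries are observations of the same
    transmission: identical type, source, and destination, with translated
    timestamps closer than half the minimum transmission time (so they
    cannot be two distinct packets)."""
    return (a['type'] == b['type']
            and a['src'] == b['src']
            and a['dst'] == b['dst']
            and abs(a['ts'] - b['ts']) < MIN_TX_TIME_US / 2)
```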

  33. Merging of Traces • Waterfall Merging Process • Merge two traces • Then merge third trace to baseline trace • Approach is not the most time-efficient • Approach provides improved precision: • New reference points continually added • Easier to find set of shared reference points as more monitor traces are merged
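
The waterfall order itself is just a loop that folds each monitor's trace into the growing baseline; a sketch reusing the helpers illustrated above (match_references, translate_time, is_duplicate):

```python
def waterfall_merge(traces):
    """Fold monitor traces into a single baseline, one trace at a time.

    Each pass inherits the reference points of everything merged so far,
    which eases alignment of the next monitor."""
    baseline = list(traces[0])
    for trace in traces[1:]:
        refs = match_references(trace, baseline)   # (secondary_ts, baseline_ts)
        aligned = [dict(pkt, ts=translate_time(pkt['ts'], refs)) for pkt in trace]
        # keep only packets not already observed in the baseline
        fresh = [p for p in aligned
                 if not any(is_duplicate(p, q) for q in baseline)]
        baseline = sorted(baseline + fresh, key=lambda p: p['ts'])
    return baseline
```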

  34. 3 Core Phases • Merging of Traces • Inferring Missing Information • Deriving Measurements / Performance

  35. Inferring Missing Information

  36. Inferring Missing Information • Two Fundamental Purposes: • Infer missing packets from collected & merged data • Estimate whether packets were received by their destination • Authors claim this is new

  37. Inferring Missing Information • Key Technique: • A transmitted packet implies useful information about the packets its sender must have received • Example: • An AP sends an ASSOCIATION RESPONSE only if it recently received an ASSOCIATION REQUEST. • If the merged trace contains the response but no request, then we know the request was sent and received • Also, the sender and destination of the missing request are known from the response packet.
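
A toy illustration of this inference rule, using the same hypothetical packet representation as the earlier sketches; a response without a matching request lets us synthesize the missing request and mark it as received:

```python
def infer_missing_assoc_requests(merged_trace):
    """If an ASSOCIATION RESPONSE appears with no matching request in the
    merged trace, synthesize the missing request: its sender is the
    response's destination (the client), its destination is the response's
    source (the AP), and it must have been received for the AP to reply."""
    seen_requests = set()
    inferred = []
    for pkt in merged_trace:
        if pkt['type'] == 'ASSOC_REQUEST':
            seen_requests.add((pkt['src'], pkt['dst']))
        elif pkt['type'] == 'ASSOC_RESPONSE':
            request_pair = (pkt['dst'], pkt['src'])   # client -> AP
            if request_pair not in seen_requests:
                inferred.append({'type': 'ASSOC_REQUEST',
                                 'src': pkt['dst'], 'dst': pkt['src'],
                                 'ts': None,           # exact time unknown
                                 'received': True,     # the AP must have got it
                                 'inferred': True})
    return inferred
```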

  38. Inferring Missing Information • Processing the merged trace • Scan and process each packet • Classify each packet type • Generate markers • e.g., end of an ongoing conversation • Formal Language Approach (FSM) • Infer Packet Reception • Infer Missing Packets • Construct Packets as Required

  39. 3 Core Phases • Merging of Traces • Inferring Missing Information • Deriving Measurements/Performance

  40. Deriving Measurements / Performance

  41. Deriving Measurements / Performance • Merged Trace Can Be Mined: • Many ways to study detailed behavior • Packet reception probability • Estimate number of stations competing for the medium per snapshot in time • Requires access to ‘state’: • Randomly selected backoff values • DATA & DATAretry packets
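
As a very rough proxy for the contention estimate (Wit's actual estimator reasons about backoff state and DATA/DATAretry packets and is considerably more careful), one could count distinct senders per short time window:

```python
from collections import defaultdict

def contenders_per_window(trace, window_us=100_000):
    """Crude proxy: count distinct stations sending DATA or DATA-retry
    packets in each fixed time window. Wit's real estimator instead reasons
    about per-station backoff state, which is considerably more accurate."""
    windows = defaultdict(set)
    for pkt in trace:
        if pkt['type'] in ('DATA', 'DATA_RETRY'):
            windows[pkt['ts'] // window_us].add(pkt['src'])
    return {w: len(stations) for w, stations in sorted(windows.items())}
```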

  42. Now for the Implementation of WIT

  43. WIT Implementation • WIT Implemented in 3 Components • halfWit • nitWit • dimWit • halfWit, nitWit, & dimWit correspond to the three pipeline phases discussed earlier

  44. WIT Implementation • halfWit • Merge phase • 1st Insert all traces into database • Database used to merge data as defined earlier • Database also used to pass final merged trace to nitWit • Uses merge-sort methodology • Traces handled like queues
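
The queue-style merge can be sketched as a heap-based k-way merge over per-monitor traces that have already been translated to the common timeline (hypothetical structure, not the actual halfWit code):

```python
import heapq

def merge_sorted_traces(traces):
    """K-way merge of per-monitor traces that are already sorted on the
    common timeline. Each trace is consumed like a queue: the packet with
    the smallest timestamp is always taken next. Duplicate elimination
    would then run over this merged stream."""
    return list(heapq.merge(*traces, key=lambda pkt: pkt['ts']))
```

For example, merge_sorted_traces([trace_a, trace_b, trace_c]) would interleave three monitors' packets by timestamp in a single pass.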

  45. WIT Implementation • nitWit • Inference phase • nitWit takes the output of halfWit • Determines and recreates missing packets • Annotates captured and inferred packets • Critical annotation for each packet is whether it was received • Retry packet fields are tracked Note: The original implementation did not ‘merge’ captured and inferred packets because of exact-timing uncertainty. This differs from the theory write-up section.

  46. WIT Implementation • dimWit • Derived Measures Component • dimWit takes the output of nitWit • Produces summary network information • Produces number of contenders in the network • Implemented to analyze tens of millions of packets in a few minutes.

  47. Wit Evaluation

  48. Wit Evaluation • Purpose of Evaluation: • Understand how well each phase works • Key questions to be evaluated: • Quality of time synchronization? • Quality of merged product? • Accuracy of inferences? • Fraction of missing packets inferred? • Number of Contenders – accuracy? • Analyze improvement from more monitors or more inference?

  49. Wit Evaluation • Reality of this type of evaluation: • Comparing against ground truth is unrealistic • Too much detail • Unrealistic to create an absolutely controlled environment • Reduced to simulation as primary validation method

  50. Wit Evaluation • Simulated Environment: • 2 Access Points (APs) • 40 clients randomly distributed on a grid • Packet Simulator • Reception probability based on: • Signal strength • Transmission rate • Existing packets in environment • Random bit errors
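
An illustrative sketch of the kind of per-packet reception decision such a simulator makes; the thresholds and error model here are made up, not the authors' parameters:

```python
import random

def packet_received(snr_db, rate_mbps, collided, n_bits, ber=1e-5):
    """Toy reception decision: fail outright on a collision, require a
    (made-up) minimum SNR for the chosen rate, then apply independent
    random bit errors across the packet."""
    if collided:
        return False
    min_snr = {1: 4, 11: 10, 54: 25}.get(rate_mbps, 10)   # illustrative thresholds
    if snr_db < min_snr:
        return False
    return random.random() < (1 - ber) ** n_bits          # all bits must survive
```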
