Methodology for Characterizing Performance of IEEE 802.11 Nodes in Multi-hop Environments

This paper presents a methodology to analyze the performance of IEEE 802.11 nodes in multi-hop environments. It addresses possible inaccuracies in wireless hardware that can distort measurements, proposes a novel metric to account for errors in detecting other transmissions, and provides a measurement methodology to obtain the metric value. The study emphasizes the importance of correctly characterizing hardware behavior before drawing conclusions based on measurement observations.

Presentation Transcript


  1. “Methodology to characterize the performance of IEEE 802.11 nodes to be deployed in multi-hop environments” • Marc Portoles Comeras, Andrey Krendzel, Josep Mangues-Bafalluy • Centre Tecnològic de Telecomunicacions de Catalunya (CTTC) • April 20, 2007

  2. Outline • Introduction • A case study: what fails? • Accounting for errors in detecting other transmissions: the metric ε • Measuring ε • Experimental results • Using the value in practice • Conclusions

  3. Introduction • Motivation: • A large body of theoretical work on wireless mesh networking has still not been experimentally tested • When going experimental, however, unexpected issues are prone to arise • Previous work has shown that good theory can be too promptly discarded when the hardware being used is not correctly calibrated • The objective here is to analyze possible inaccuracies in the wireless hardware that may distort measurements in experimental wireless mesh networking

  4. A case study: what fails? • Saturation throughput study: 802.11 leads to asymmetry • Theoretical model* prediction: B receives a 4-6% time share • Experimental results: B is active more than 20% of the time! • What is failing? • Is the model accurate? • Does the hardware behave as expected? * Claude Chaudet et al., “Study of the Impact of Asymmetry and Carrier Sense Mechanism in IEEE 802.11 Multi-hops Networks through a Basic Case”, in Proceedings of MSWiM 2004, Venice, Italy

  5. Accounting for errors in detecting other transmissions • An active 802.11 node can be in any of three states: • active: sending data • idle: not detecting activity on the medium • stall: detecting the medium busy or receiving data • The maximum amount of data that a backlogged station can send follows from the time spent in each state (a sketch of this bound follows below)
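The bound referred to on this slide appears as an equation image in the original presentation and is not reproduced in the transcript. A minimal sketch of how it can be written, assuming C denotes the rate at which the station sends while active, T an observation interval, and T_accu_stall the stall time accumulated over that interval (this notation is chosen here for readability, not taken verbatim from the slides):

\[ T \;=\; T_{\text{active}} + T_{\text{idle}} + T_{\text{accu\_stall}} \]
\[ D_{\max} \;\le\; C \, T_{\text{active}} \;\le\; C \,\bigl(T - T_{\text{accu\_stall}}\bigr) \]

For a backlogged station the idle time reduces to backoff overhead, so the dominant limitation on the data it can send is the time it spends stalled by other transmissions.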

  6. Accounting for errors in detecting other transmissions • Let us define a metric ε = 1 - Pr{node senses the medium idle while it is actually being used} • Maximum data that a non-ideal station sends (sketched below) • If T_accu_stall could be computed exactly, ε could be estimated from the amount of data the station actually sends
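Both expressions on this slide are also equation images in the original. A sketch consistent with the surrounding text, assuming ε denotes the metric defined above and that a non-ideal station only defers during the fraction ε of the time an ideal station would stall:

\[ \varepsilon \;=\; 1 - \Pr\{\text{station senses the medium idle} \mid \text{the medium is in use}\} \]
\[ D_{\max}^{\text{non-ideal}} \;\approx\; C \,\bigl(T - \varepsilon \, T_{\text{accu\_stall}}\bigr) \]

An ideal station (ε = 1) recovers the bound of the previous slide, while a station that never detects other transmissions (ε = 0) behaves as if the medium were always free. If T_accu_stall can be computed exactly, ε is the only unknown left in the bound, which is what the measurement scenario on the next slide exploits.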

  7. Measuring ε • A scenario to measure the metric ε • Requirement: T_accu_stall must be easy to compute • Flow 1 and Flow 2 characteristics: independent, identical and multicast • In this case, T_accu_stall = T_active • ε can be obtained by measuring the maximum transmission rate that Flow 2 achieves (see the sketch below)
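The estimator itself is not shown in the transcript; the following is one hedged way to make the slide's statement operational. The assumptions are introduced here and are not taken from the slides: Flow 1 is transmitted by a neighbouring station, Flow 2 by the saturated node under test (NUT), both flows are multicast (no ACKs or retransmissions), C is the rate the NUT achieves on an idle channel, and α = T_active,1 / T is the fraction of time Flow 1 keeps the medium busy:

\[ R_{2} \;\approx\; C \,(1 - \varepsilon \, \alpha) \quad\Longrightarrow\quad \varepsilon \;\approx\; \frac{C - R_{2}}{C \, \alpha} \]

Since C and α can be measured separately (Flow 2 alone on an idle channel, and Flow 1 alone, respectively), measuring the maximum rate R_2 that Flow 2 achieves while Flow 1 is active yields an estimate of ε.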

  8. Measuring ε • Workload can affect the measurement • If the NUT is not able to handle the data rates it is required to (Flow 1 + Flow 2), the measurement of the metric ε is not reliable • We need a method to detect whether the NUT is losing data due to excessive workload

  9. Measuring ε • Workload can affect the measurement • A validation curve to detect excessive workload (a sketch of such a check follows below)
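The validation curve itself is shown as a figure in the original slides and is not reproduced here. Below is a minimal, hypothetical sketch of the kind of check it suggests: sweep the offered load and flag the point where the rate actually handled by the NUT stops tracking the offered load, which indicates the node is dropping data because of excessive workload. Function and variable names are illustrative, not taken from the paper.

def find_workload_limit(offered_loads, handled_rates, tolerance=0.05):
    """Return the highest offered load for which the NUT still handles the
    traffic it is given, i.e. the load range where the measurement of the
    metric is not distorted by excessive workload.

    offered_loads -- offered load values, sorted ascending (e.g. in Mb/s)
    handled_rates -- rate actually processed by the NUT at each load
    tolerance     -- maximum relative shortfall tolerated before declaring
                     that the node is losing data
    """
    limit = None
    for offered, handled in zip(offered_loads, handled_rates):
        # On the validation curve the handled rate should track the offered
        # load; a persistent shortfall means the node is dropping packets.
        if handled >= offered * (1.0 - tolerance):
            limit = offered
        else:
            break
    return limit

# Hypothetical example: the curve flattens between 30 and 35 Mb/s.
loads = [5, 10, 15, 20, 25, 30, 35, 40]
handled = [5, 10, 15, 20, 25, 29.5, 31.0, 31.0]
print(find_workload_limit(loads, handled))  # -> 30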

  10. Experimental results • A correct measurement of the metric without workload interference

  11. Experimental results • An example of workload interference in the measurement

  12. Using the value in practice • Applying ε to the initial example • There is a high probability (11% in our experiments) that B does not correctly detect transmissions from A and C • This introduces a strong bias in the measurements • Possible solutions are: • Modify the model to include hardware inaccuracies • Choose alternative hardware, or correctly tune the hardware being used

  13. Conclusions • This paper: • Shows how wireless hardware solutions may fail to detect transmissions from other stations • Proposes a novel metric to account for the probability that a station fails to detect other transmissions • Proposes a measurement methodology to obtain the value of the metric • The study also draws attention to the importance of correctly characterizing hardware behavior before drawing conclusions from measurement observations

  14. Thanks for your kind attention! • Questions? • Marc Portoles Comeras, Centre Tecnològic de Telecomunicacions de Catalunya (CTTC) – Barcelona • Contact e-mail: marc.portoles@cttc.es
